ARLINGTON - The world's largest computing grid, with a major center at The University of Texas at Arlington, has passed its most comprehensive tests to date in anticipation of the restart of the world's most powerful particle accelerator, the Large Hadron Collider (LHC) at CERN in Geneva, Switzerland. CERN, the European Organization for Nuclear Research, is the world's leading laboratory for particle physics.
The successful dress rehearsal proves that the Worldwide LHC Computing Grid (WLCG) is ready to analyze and manage real data from the massive machine. The University of Texas at Arlington is a vital partner in the development and operation of the WLCG, and Physics Professor Kaushik De is the U.S. computing coordinator for the ATLAS experiment, one of four major experiments at the LHC.
The full-scale computing grid test, called the Scale Test of the Experimental Program 2009 (STEP09), demonstrated the grid's ability to efficiently move data collected from the LHC's intense collisions at CERN through a multi-layered management process that culminates at laboratories and universities around the world. When the LHC resumes operations this fall, the computing grid will handle more than 15 million gigabytes of data every year. By way of comparison, an average desktop computer holds about 40 gigabytes of data.
Although there have been several large-scale WLCG data-processing tests in the past, STEP09, which was completed on June 15, was the first to simultaneously test all of the key elements of the process.
"We have never attempted anything of this complexity, which tested every aspect of software, hardware and involved physicist users. We passed an important milestone - at the first attempt," said De, who helped to plan the STEP09 tests for ATLAS, and coordinated their successful completion in the U.S.
Dozens of scientists from UT Arlington participated in the tests. De prepared simulated data, similar to those expected at the LHC, in the weeks before the test. Physicists from UT Arlington managed operations and participated in analyzing the data. The success of the tests in the United States exceeded all expectations.
"Unlike previous challenges, which were dedicated testing periods, STEP09 was a production activity that closely matches the types of workload that we can expect during LHC data taking. It was a demonstration not only of the readiness of experiments, sites and services but also the operations, support procedures and infrastructures," said CERN's Ian Bird, leader of the WLCG project.
Once LHC data have been collected at CERN, dedicated optical fiber networks distribute the data to 11 major "Tier-1" computer centers in Europe, North America and Asia, including those at DOE's Brookhaven National Laboratory in New York and Fermi National Accelerator Laboratory in Illinois. From these, data are dispatched to more than 140 "Tier-2" centers around the world, including UT Arlington, one of 12 in the United States. It will be at the Tier-2 and Tier-3 centers that physicists will analyze data from the LHC experiments, leading to new discoveries.
"In order to really prove our readiness at close-to-real-life circumstances, we have to carry out data replication, data reprocessing, data analysis and event simulation all at the same time and all at the expected scale for data taking," said Michael Ernst, director of Brookhaven National Laboratory's Tier-1 Computing Center. "That's what made STEP09 unique."
The result was "wildly successful," Ernst said, adding that the U.S. distributed computing facility for the ATLAS experiment completed 150,000 analysis jobs at an efficiency rate of 94 percent.
Note to editors: Grid computing and Large Hadron Collider images are available at http://www.uslhc.us/Images.
The University of Texas at Arlington is an Equal Opportunity and Affirmative Action employer.