Monday, June 25, 2007

Petascale computing for Particle Physics

Google Tech Talks has a video presentation on data collection for the detectors being built for the new Large Hadron Collider particle accelerator. The ATLAS detector has unusual computing requirements because of the sheer volume of data it produces (around a petabyte per second). Much of the processing involves very quickly rejecting most collision events (such as low-energy collisions) and then performing more detailed analysis on what remains; a toy sketch of that kind of staged filtering follows the abstract below. The abstract:
The Large Hadron Collider (LHC), scheduled to begin operation in Summer 2008, will collide protons at energies not accessible since the time of the early Universe. The study of the reactions produced at the LHC has the potential to revolutionize our understanding of the most fundamental forces in nature. The ATLAS experiment, currently being installed at the LHC, is designed to detect collisions at the LHC, to collect the relevant data and to provide a unified framework for the reconstruction and analysis of these data. This talk will review the goals of the ATLAS program and will describe the software and computing challenges associated with analyzing these data. Among the relevant issues are the need to develop and maintain a unified analysis framework for use by more than 1000 scientists and the need for distributed access to large (petabyte) scale data samples, including a significant metadata component.
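
To make the staged-filtering idea concrete, here is a minimal toy sketch in Python of a two-stage event filter: a cheap first pass throws away most events, and only the survivors get the expensive analysis. The thresholds, event fields, and function names are all made up for illustration and are not taken from ATLAS or the talk.

```python
import random

# Hypothetical two-stage trigger sketch. A fast, cheap cut rejects most
# toy "events", and only the survivors go through the slow analysis step.
# All values here are assumptions for illustration, not ATLAS parameters.

ENERGY_CUT_GEV = 25.0  # assumed first-level threshold (hypothetical)

def make_event():
    """Generate a toy collision event with a random total energy."""
    return {"energy_gev": random.expovariate(1 / 10.0)}

def level1_accept(event):
    """Fast, cheap cut: keep only events above the energy threshold."""
    return event["energy_gev"] > ENERGY_CUT_GEV

def full_reconstruction(event):
    """Placeholder for the slow, detailed analysis of surviving events."""
    return {"energy_gev": event["energy_gev"], "flagged": True}

if __name__ == "__main__":
    events = (make_event() for _ in range(1_000_000))
    kept = [full_reconstruction(e) for e in events if level1_accept(e)]
    print(f"kept {len(kept)} of 1,000,000 toy events")
```

The design point the talk emphasizes is the same one this sketch caricatures: the first stage has to be cheap enough to run on every event at the full data rate, so that the expensive reconstruction only ever sees a tiny surviving fraction.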
