Managing the Extreme Dataflow of LHC Experiments
The Large Hadron Collider (LHC) will extend our knowledge of particles and their interactions to unprecedented energies. Physicists hope to obtain answers to questions ranging from the origin of mass to the nature of dark matter. The high interaction rates and the large number of detector channels result in an enormous amount of data being produced every second. New developments in data acquisition have been necessary to transport and filter these data with complex trigger systems. The selected data will then be recorded, resulting in datasets reaching several Petabytes. The resource requirements for analyzing these data also pose new challenges for offline analysis. A global computing infrastructure, the grid, has been developed to interconnect computing centres around the world, providing these resources and enabling analysis of the data. This article reviews these new computing challenges for the LHC, covering both the online and the offline aspects.
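To illustrate the scale of the problem, the sketch below works through an order-of-magnitude estimate of the data rates that motivate online triggering and Petabyte-scale offline storage. The bunch-crossing rate, event size, recordable output rate, and effective running time are assumed typical values for illustration only, not figures quoted by the article.

```python
# Illustrative back-of-the-envelope estimate of why online filtering is needed.
# All numbers below are assumed order-of-magnitude values, not taken from the article.

BUNCH_CROSSING_RATE_HZ = 40e6        # ~40 MHz collision rate at the LHC (assumed)
RAW_EVENT_SIZE_BYTES = 1.0e6         # ~1 MB per recorded event (assumed)
RECORDABLE_RATE_HZ = 300             # ~a few hundred Hz written to permanent storage (assumed)
SECONDS_OF_DATA_TAKING_PER_YEAR = 1.0e7  # assumed effective running time per year

raw_throughput = BUNCH_CROSSING_RATE_HZ * RAW_EVENT_SIZE_BYTES    # bytes/s before filtering
stored_throughput = RECORDABLE_RATE_HZ * RAW_EVENT_SIZE_BYTES     # bytes/s after the trigger
rejection_factor = BUNCH_CROSSING_RATE_HZ / RECORDABLE_RATE_HZ
yearly_dataset = stored_throughput * SECONDS_OF_DATA_TAKING_PER_YEAR

print(f"Raw data rate before triggering: {raw_throughput / 1e12:.0f} TB/s")
print(f"Data rate written to storage:    {stored_throughput / 1e6:.0f} MB/s")
print(f"Trigger rejects roughly 1 in {rejection_factor:,.0f} events")
print(f"Raw dataset per year:            {yearly_dataset / 1e15:.0f} PB")
```

Under these assumptions the trigger system must reduce the data volume by roughly five orders of magnitude, and even the filtered stream accumulates to several Petabytes per year, which is consistent with the dataset sizes mentioned in the abstract.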
Keywords:
high-energy physics;
Large Hadron Collider;
data acquisition;
high-level trigger;
grid computing;
data analysis