CMS: The Computing Project

Acknowledgements

The CMS Computing Group gratefully acknowledges the contributions of technical staff throughout the world who have been involved in the design, operation and analysis of the computing challenges that have led us to this TDR. The CMS computing technical design has been developed in cooperation with our colleagues in the Worldwide LHC Computing Grid, together with the computing teams of ALICE, ATLAS and LHCb. We thank them all for their collaboration, and for their assistance with, and operation of, the data challenges that form the underpinning of this report.

We would like to thank Neil Geddes, who has diligently followed the drafts of this TDR and contributed many important suggestions. We also thank the external reviewers of our Computing Model Paper: Neil Geddes, Tony Cass and John Harvey. We thank our CMS internal reviewers, Jim Branson, Bob Clare, Lorenzo Foa and Gail Hanson, for all their help and constructive comments. For their perpetual good humour in the face of our unreasonable requests and deadlines, we thank the CMS Secretariat. Special thanks to Sergio Cittolin for his artistic interpretation of the CMS Computing Model shown on the cover page. We also wish to thank our collaborators on CMS, and especially the CMS management, for their continuous support and encouragement.

Executive Summary

This document provides a top-level description of the organisation of the CMS Offline Computing systems. This organisation relies upon a number of cooperating pieces:

• A tier-organised structure of computing resources, based on a Tier-0 centre at CERN and a small number of Tier-1 centres connected by high-speed networks (sketched after this summary).
• A relatively large number of Tier-2 analysis centres where physics analysis will be performed.
• A comprehensive and performant software framework designed for high-energy event streams.
• Workload management tools to coordinate work at the centres, and data management tools to ensure the efficient use of computing resources and the integrity of the data, including adequate auditing and safekeeping of raw and reconstructed data, calibration data, and job parameters.
• A comprehensive project management plan, so that the various project deliverables are tracked and potential bottlenecks are detected and eliminated.

These pieces are discussed in this document in terms of the CMS Data Model and the Physics Analysis Models. The workload management is considered in the context of integration into the LHC Computing Grid. Chapter 2 provides the motivation behind the top-level baseline CMS Computing Model and describes it …
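As a minimal illustration of the tiered structure listed above, the following Python sketch models the flow of raw and reconstructed data from the Tier-0 centre through the Tier-1 centres to a Tier-2 analysis centre. It is a toy model only, not part of the TDR design: the class names (Dataset, Centre), site names (T1_A, T2_X) and sizes are hypothetical, and the actual CMS data and workload management tools are described in the body of this report.

```python
# Toy model of the tiered CMS data flow described in the Executive Summary.
# All names and numbers are illustrative, not taken from the TDR.
from dataclasses import dataclass, field


@dataclass
class Dataset:
    name: str
    kind: str       # e.g. "RAW" or "RECO" (raw and reconstructed data)
    size_tb: float  # nominal size in terabytes


@dataclass
class Centre:
    name: str
    tier: int                              # 0, 1 or 2
    holdings: list = field(default_factory=list)

    def receive(self, dataset: Dataset) -> None:
        # Custodial copy at Tier-0/Tier-1; cached analysis copy at Tier-2.
        self.holdings.append(dataset)


def distribute(tier0: Centre, tier1s: list, raw: Dataset, reco: Dataset) -> None:
    """Toy prompt processing: the Tier-0 keeps the RAW data and fans the
    reconstructed data out over the network to the Tier-1 centres."""
    tier0.receive(raw)
    tier0.receive(reco)
    for t1 in tier1s:
        t1.receive(reco)


# Example: one Tier-0 at CERN, two hypothetical Tier-1s, one Tier-2.
cern = Centre("CERN", tier=0)
tier1s = [Centre("T1_A", tier=1), Centre("T1_B", tier=1)]
distribute(cern, tier1s,
           Dataset("run1-raw", "RAW", 100.0),
           Dataset("run1-reco", "RECO", 25.0))

t2 = Centre("T2_X", tier=2)
t2.receive(tier1s[0].holdings[0])  # a Tier-2 caches data served by a Tier-1
print([d.name for d in t2.holdings])
```

The point of the sketch is the division of roles: the Tier-0 holds the archival copies, the Tier-1 centres serve reconstructed data over high-speed networks, and the Tier-2 centres hold only the analysis copies they need.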
