Time-Critical Database Conditions Data-Handling for the CMS Experiment
Automatic, synchronous and, of course, reliable population of the conditions database is critical for the correct operation of the online selection as well as of the offline reconstruction and data analysis. We describe here the system put in place in the CMS experiment to automate the central population of the database and to make condition data promptly available, both online for the high-level trigger and offline for reconstruction. The data are "dropped" by the users into a dedicated service, which synchronizes them and takes care of writing them into the online database. They are then automatically streamed to the offline database and are thus immediately accessible offline worldwide. This mechanism was used intensively during the 2008 and 2009 operation, with cosmic-ray challenges and the first LHC collision data, and many improvements have been made since. The experience of these first years of operation is discussed in detail.
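The drop-box workflow described above can be sketched in miniature. The sketch below is purely illustrative and is not the CMS implementation: all names (`ConditionPayload`, `DropBoxService`, the in-memory dictionaries standing in for the online and offline databases) are assumptions chosen for clarity. It models the essential flow: a user drops a payload with an interval of validity, the service validates it and writes it to the online store, and the data are immediately replicated to the offline store.

```python
# Hypothetical sketch of the condition drop-box workflow: users deposit
# payloads, the service writes them to the "online" store and immediately
# streams them to the "offline" replica. Names are illustrative only.
from dataclasses import dataclass


@dataclass(frozen=True)
class ConditionPayload:
    tag: str     # condition tag, e.g. a calibration set name
    since: int   # first run for which the payload is valid (IOV start)
    data: bytes  # serialized condition data


class DropBoxService:
    def __init__(self) -> None:
        self.online = {}   # tag -> sorted list of (since, data): "online" DB
        self.offline = {}  # replica standing in for the offline DB

    def drop(self, payload: ConditionPayload) -> None:
        """Accept a user payload, write it online, then stream it offline."""
        iovs = self.online.setdefault(payload.tag, [])
        # reject duplicate intervals of validity for the same tag
        if any(since == payload.since for since, _ in iovs):
            raise ValueError(f"duplicate IOV {payload.since} for {payload.tag}")
        iovs.append((payload.since, payload.data))
        iovs.sort(key=lambda iov: iov[0])
        # immediate streaming to the offline replica
        self.offline[payload.tag] = list(iovs)

    @staticmethod
    def lookup(store: dict, tag: str, run: int) -> bytes:
        """Return the payload valid for `run` (largest `since` <= run)."""
        candidates = [data for since, data in store.get(tag, []) if since <= run]
        if not candidates:
            raise KeyError(f"no payload for {tag} at run {run}")
        return candidates[-1]
```

A short usage example: after dropping two payloads for the same tag with `since` values 1 and 100, a lookup at run 50 resolves to the first payload in both the online and the offline store, mimicking how a given run picks up the condition data whose interval of validity covers it.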