Data Quality Monitoring Framework for the ATLAS Experiment at the LHC

Data quality monitoring (DQM) is an integral part of the data-taking process of high-energy physics (HEP) experiments. DQM involves automated analysis of monitoring data through user-defined algorithms and relaying a summary of the analysis results to the shift personnel while data are being processed. In the online environment, DQM provides the shifter with current run information that can be used to identify and overcome problems early on. During offline reconstruction, DQM performs more complex analyses of physics quantities, and the results are used to assess the quality of the reconstructed data. The ATLAS data quality monitoring framework (DQMF) is a distributed software system providing DQM functionality in the online environment. The DQMF achieves a scalable architecture by distributing the execution of the analysis algorithms over a configurable number of DQMF agents running on different nodes connected over the network. The core part of the DQMF is designed to depend only on software common to the online and offline environments (such as ROOT), so the same framework can be used in both. This paper describes the main requirements, the architectural design, and the implementation of the DQMF.
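To make the notion of a user-defined analysis algorithm concrete, the following is a minimal C++ sketch, not the actual DQMF interface: a hypothetical algorithm base class and an example check that inspects a ROOT histogram and maps it to a three-state flag of the kind relayed to the shift personnel. The class names, thresholds, and result enumeration are illustrative assumptions; only the ROOT TH1 calls are real.

```cpp
// Hypothetical sketch (not the actual dqm_core API): a user-defined
// DQ algorithm that inspects a ROOT histogram and returns a flag.
#include "TH1.h"

// Assumed three-state summary result (green/yellow/red style).
enum class DQFlag { Good, Warning, Bad };

// Assumed base class for user-defined algorithms; the real framework
// defines its own interface, configuration, and result objects.
struct DQAlgorithm {
  virtual ~DQAlgorithm() = default;
  virtual DQFlag execute(const TH1& histogram) const = 0;
};

// Example algorithm: flag a histogram whose mean drifts outside
// illustrative thresholds (values chosen arbitrarily here).
struct MeanWithinRange : DQAlgorithm {
  double warnLow = 0.9, warnHigh = 1.1;  // assumed warning band
  double errLow  = 0.8, errHigh  = 1.2;  // assumed error band

  DQFlag execute(const TH1& h) const override {
    const double mean = h.GetMean();
    if (mean < errLow  || mean > errHigh)  return DQFlag::Bad;
    if (mean < warnLow || mean > warnHigh) return DQFlag::Warning;
    return DQFlag::Good;
  }
};
```

In this sketch, each DQMF agent would instantiate such algorithms from configuration and apply them to the monitoring histograms it is assigned, which is one way the framework's distributed, configurable execution could be realized.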