CWG9: Data Quality Monitoring, Quality Assurance and Visualization
B. von Haller, CERN, 11.03.2014
CWG 9 in a nutshell
▶ Started in May 2013 along with O2
▶ Group working on
  ▶ the Data Quality Monitoring
  ▶ the Quality Assurance
  ▶ the Visualization
▶ For Run 2 and Run 3
CWG9 Members
▶ Renu Bala
▶ Francesca Bellini
▶ Mihaela Gheata
▶ Lukasz Kamil Graczykowski
▶ Malgorzata Anna Janik
▶ Andreas Morsch
▶ Mihai Niculescu
▶ Jeremi Niedziela
▶ Ankita Sharma
▶ Maciej Pawel Szymanski
▶ Barthélémy Von Haller
Definitions (1): Data Quality Monitoring
▶ Online feedback on the quality of data
  ▶ Make sure to record high-quality data
  ▶ Identify and solve problems early
▶ Data Quality Monitoring (DQM) involves
  ▶ Online gathering of data
  ▶ Analysis by user-defined algorithms
  ▶ Production of monitoring objects such as histograms
  ▶ Assessment of the quality of the data based on these objects
  ▶ Storage of monitoring data
  ▶ Visualization (+ human assessment of quality)
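The DQM cycle above (gather data, run a user-defined algorithm, produce a monitoring object, assess its quality) can be sketched as follows. This is a minimal illustration in plain Python, not the AMORE or O2 API; the class, the event fields and the overflow-bin check are all hypothetical.

```python
from collections import Counter

# Hypothetical sketch of one DQM cycle. MonitorTask and the event layout
# are illustrative names, not the actual AMORE/O2 interfaces.

class MonitorTask:
    """Gathers data, fills a monitoring object, assesses its quality."""

    def __init__(self, name, max_bad_fraction=0.05):
        self.name = name
        self.histogram = Counter()          # monitoring object: a simple binned count
        self.max_bad_fraction = max_bad_fraction

    def process(self, event):
        # user-defined analysis step: bin one value taken from the event
        self.histogram[event["adc"] // 10] += 1

    def assess(self):
        # automatic quality check based on the monitoring object:
        # flag the data if too many entries land in the (assumed) overflow bin
        total = sum(self.histogram.values())
        bad = self.histogram[25]
        return "GOOD" if total and bad / total <= self.max_bad_fraction else "BAD"

task = MonitorTask("TPC_adc")
for adc in [42, 57, 61, 250, 48]:           # mock event stream
    task.process({"adc": adc})
print(task.assess())                        # prints "BAD": 1 of 5 entries in overflow
```

The human-assessment step of the cycle would then look at the stored histogram and flag, which is why the pipeline ends with storage and visualization rather than with the automatic verdict.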
Definitions (2): Quality Assurance
▶ Offline evaluation of the quality of data
  ▶ During and after reconstruction and calibration
  ▶ Make sure to analyze high-quality data
  ▶ Identify high-quality runs
▶ QA involves
  ▶ Analysis by user-defined algorithms
  ▶ Production of monitoring objects such as histograms
  ▶ Assessment of the quality of the data based on these objects
  ▶ Storage of monitoring data
  ▶ Visualization (+ human assessment of quality)
Definitions (3): Visualization
▶ Visualization of DQM/QA results
▶ Visualization of data
  ▶ Event Display
▶ More details in the coming presentations!
Detailed description of CWG9
▶ Group focusing on data Quality Assurance, online and offline, and on the visualization of its results and of the data itself
▶ Run 3
  ▶ Study how to monitor large amounts of data efficiently without interfering with data taking
  ▶ Discuss QA output and results, incremental QA, and procedures to formalize whether the results are acceptable
  ▶ Determine the needs, and design the software, to access, visualize and interpret the results
  ▶ In addition, define and develop the software to visualize data, raw and reconstructed
▶ Run 2
  ▶ Production data-taking period: maintain and improve existing software
  ▶ Opportunity to test concepts and software for Run 3
Pipeline: Dataflow → Monitoring Object Generation → Automatic Quality Assessment → Storage → Visualization
Past and current tasks
▶ Bring everyone aboard [done]
  ▶ Make people aware of others' work in the field
  ▶ Give everyone a picture of the current situation
▶ Run 2: gain stability and support
  ▶ Event Display review and meetings, Warsaw involvement
  ▶ Core refactoring
  ▶ New features
  ▶ Knowledge transfer
  ▶ DQM/QA review and preparation
  ▶ Proposal for the online reconstruction and calibration
▶ Run 3
  ▶ System requirements and system functionalities document [done]
  ▶ Detector needs survey
  ▶ Definition of the future architecture and design
  ▶ Prototypes and feasibility tests
  ▶ Technical Design Report writing
TDR and feasibility tests
▶ April 2014: first draft
▶ August 2014: submission
▶ CWG9 contributes to
  ▶ Chapter 4, System architecture: Quality control and assessment
    ▶ Table 1: detector needs
    ▶ Explain the DQM/QA architecture and the choices made (Figure 1: DQM/QA architecture)
    ▶ Explain the Event Display architecture and the choices made (Figure 2: Event Display architecture)
  ▶ Chapter 5, Technology survey, evaluations and prototypes
    1. Mergers architecture and feasibility tests with 0MQ
    2. Results of storage tests (e.g. DB technologies)
    3. [Web GUI architecture (ROOT JS lib + DABC)]
    4. Event Display design as tested (cf. Run 2)
  ▶ Chapter 6, System Design
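Of the feasibility items above, the 0MQ merger test is the most algorithmic: many nodes each produce a partial monitoring object, and a merger combines them into one global object. The transport-free core of that step can be sketched as below; representing a histogram as a bin-to-count mapping is an assumption for illustration (the real objects would be ROOT histograms shipped over ZeroMQ).

```python
from collections import Counter

# Sketch of the merge step a 0MQ-based merger would perform on monitoring
# objects arriving from many nodes; the ZeroMQ transport itself is omitted.

def merge_histograms(partials):
    """Sum per-node partial histograms into one global monitoring object."""
    merged = Counter()
    for hist in partials:
        merged.update(hist)                 # bin-wise addition of counts
    return merged

node_a = Counter({0: 10, 1: 4})
node_b = Counter({1: 6, 2: 3})
print(dict(merge_histograms([node_a, node_b])))  # {0: 10, 1: 10, 2: 3}
```

Because bin-wise addition is associative, mergers like this can be arranged in layers, which is what makes the architecture worth a dedicated feasibility test at O2 data rates.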
Longer term work
▶ Run 2
  ▶ Maintenance and support (organisation?)
    ▶ DQM/QA
    ▶ Event Display
▶ Run 3
  ▶ Prototypes
  ▶ Implementation
  ▶ Interaction with users
▶ Bring inventive new ideas or approaches
  ▶ E.g. investigation of complex automatic checks
▶ Room for more people to join!
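As one example of the "complex automatic checks" mentioned above, a check can compare the shape of a monitored histogram against a reference run instead of applying fixed per-bin thresholds. The sketch below is purely illustrative; the distance metric and the tolerance value are assumptions, not a CWG9 design choice.

```python
# Hypothetical "complex" automatic check: compare normalized histogram
# shapes against a reference run, so the check is insensitive to statistics.

def normalize(hist):
    total = sum(hist)
    return [x / total for x in hist] if total else hist

def shape_distance(hist, reference):
    """Sum of absolute bin-by-bin differences of the normalized shapes."""
    return sum(abs(a - b) for a, b in zip(normalize(hist), normalize(reference)))

reference = [100, 200, 100]
good_run  = [52, 101, 49]      # same shape as the reference, lower statistics
bad_run   = [10, 10, 180]      # distorted shape

tolerance = 0.2                # illustrative value; would be tuned per detector
print(shape_distance(good_run, reference) < tolerance)  # True
print(shape_distance(bad_run, reference) < tolerance)   # False
```

A fixed-threshold check would have failed the good run simply for having fewer entries; comparing normalized shapes is what makes such checks usable across runs of different length.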
Conclusion
▶ CWG9 is an important and dynamic working group in the O2 project
▶ WUT is a key player in CWG9
  ▶ In terms of people and responsibility
  ▶ Crucial for the future of visualization in ALICE
▶ We are grateful for your involvement so far and for the increasing responsibilities you are taking on in the project!
Backups
O2 Technical Design Report
▶ Schedule
  ▶ October '13
    ▶ Define table of contents
    ▶ Establish editorial board
  ▶ December '13
    ▶ System Requirement Document
    ▶ High-level dataflow model
    ▶ Computing platform benchmarks
    ▶ Networking benchmark
  ▶ June '14
    ▶ Software framework architecture
  ▶ September '14
    ▶ TDR
CWG9 TDR Timeline (proposal)
▶ January '14
  ▶ Define list of tables and figures
  ▶ Draft the architecture of the system
  ▶ Launch exhaustive subsystems survey
  ▶ Milestone: submit list of tables and figures to TDR EC
▶ February '14
  ▶ Draft tables and figures
  ▶ Skeleton of sections 4.2.4 and 5.6
  ▶ Milestone: submit skeleton to TDR EC
▶ March '14
  ▶ Finalize tables and figures, including subsystems input
  ▶ Iterate on text using input of TDR EC
  ▶ Milestone: submit text and final tables and figures to TDR EC
▶ April '14
  ▶ Finalize text
  ▶ Milestone: submit final text to TDR EC
▶ May '14
  ▶ Iterate over our sections using CWGs input
  ▶ Review work of other CWGs (especially what concerns us!)
TDR skeleton, tables & figures: Chapter 4, System architecture
▶ 4.2 Data processing and computing model
  ▶ 4.2.4 DQM and QA ("Quality control and assessment")
    ▶ Table 1: detector needs
    ▶ Explain the architecture (Figure 1: architecture)
    ▶ Explain the choices (Figure 2: Event Display architecture)
TDR skeleton, tables & figures: Chapter 5, Technology survey, evaluations and prototypes
▶ 5.6 DQM and QA ("Quality control and assessment")
  ▶ Technologies and design choices available for the key points of our system
    ▶ Storage
    ▶ Worldwide access to results
    ▶ Event Display
  ▶ Feasibility tests & prototypes
    ▶ Table 1: results of storage tests (e.g. DB technologies)
    ▶ Figure 1: Web GUI architecture (ROOT JS lib + DABC)
    ▶ Figure 2: Event Display design as tested (cf. Run 2)
Subsystems survey
▶ What "tasks" (call them agents or algorithms if you prefer) will your subsystem need in Run 3?
▶ For each of these tasks or groups of tasks, tell us:
  ▶ Whether it already exists today and, if so, what its performance is
  ▶ What the expected performance of such a task is in Run 3
  ▶ How many plots are expected to be produced (for the shifter and for the experts)
  ▶ The percentage of events needed to carry out the task online (minimum, optimal)
  ▶ What the input is, i.e. at which stage it will run
  ▶ How fast the response has to be taken into account in the data flow
  ▶ Whether the DQM/QA results have to become persistent, and for how long
  ▶ What "Calibration QA" means to you
Survey status
(Status tracked per subsystem: acknowledged, contact person, under discussion, answered, cleared.)
▶ ACO, CPV, EMC, MCH, PHOS, PMD, TOF, TPC, TRD, ZDC, Trigger, DAQ, MFT, MID, FIT, ITS2
▶ FMD: absent from Run 3
▶ HMP: covered during meeting
▶ MTR: see MID
▶ SDD, SPD, SSD: see ITS2
▶ T0, V0: see FIT
Current and short term work
▶ Run 2: prepare a proposal for QA in relation with the online reconstruction, calibration and monitoring (QA tasks, validation, bridge to DQM, …)
  ▶ Online calibration
    ▶ Mainly for the TPC
    ▶ Many open questions on requirements, general architecture, implementation, CDB
  ▶ Working on a proposal to meet the requirements while minimizing the work
    ▶ Use the analysis QA train within the HLT for reconstruction monitoring
    ▶ Use the analysis QA train within the HLT for calibration monitoring
    ▶ Use AMORE for raw data monitoring
    ▶ Use the AMORE infrastructure for storage and visualization
  ▶ Review and preparation
    ▶ Detector "interviews"
    ▶ (DATE Monitoring update)
▶ Run 3: prepare the requirements of the future system following CWG1 input
  ▶ For the TDR (2014): define and write the requirements, general architecture and features of the QA-DQM-Viz for Run 3
▶ Event Display
  ▶ Decentralized model under implementation (implemented by the end of 2013)
    ▶ Better stability
    ▶ Split GUI and reconstruction
    ▶ Possibility to switch between offline and HLT reconstruction
    ▶ Bookmarks (for users and for PR)
  ▶ Involvement of the Warsaw group
[Backup diagram: Run 2 "PHYSICS Run QC (sort of)" dataflow. Elements shown: raw data on Castor, reconstruction, selection/filters, calibration, ESDs, calibration parameters, and databases (HCDB fed by DCS/GRP, OCDB), with reading/writing/producing links between data, DBs and procedures.]