Presentation transcript:

The GNAM monitoring system and the OHP histogram presenter for ATLAS
14th IEEE-NPSS Real Time Conference 2005, June 4-10, Stockholm, Sweden. Presented by P. F. Zema, June 9th 2005.
P. Adragna 1,2, G. Crosetti 3, M. Della Pietra 4, A. Dotti 1, R. Ferrari 5, G. Gaudio 5, C. Roda 1, D. Salvatore 3, F. Sarri 1, W. Vandelli 5, P. F. Zema 3
1 University and INFN Pisa, Italy; 2 now at University of Siena, Italy; 3 University and INFN Cosenza, Italy; 4 University and INFN Napoli, Italy; 5 University and INFN Pavia, Italy

Layout
● ATLAS and ATLAS DAQ
● Online monitoring application: GNAM
● Online Histogram Presenter: OHP
● From CTB to commissioning

ATLAS (A Toroidal LHC ApparatuS)

DAQ & monitoring in ATLAS

ATLAS monitoring facilities
GNAM and OHP use the facilities provided by the ATLAS Trigger and Data Acquisition (TDAQ) group for monitoring purposes:
● Information Service (IS), for archiving and sharing information (beam conditions, running parameters, ...);
● Online Histogramming Service (OHS), based on IS: a transient repository for histograms;
● Event Monitoring Service (EMS): data samplers available at any dataflow level.
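To make the producer/consumer decoupling concrete, here is a minimal C++ sketch of the publish/retrieve pattern behind the OHS: producers publish histograms by name and consumers look them up by name, without either side knowing about the other. The OnlineHistogramService class and its methods are hypothetical stand-ins for illustration, not the real TDAQ OH API.

    // Minimal sketch of the publish/retrieve pattern behind the OHS.
    // NOTE: OnlineHistogramService, publish() and retrieve() are hypothetical
    // stand-ins; the real ATLAS OH library has a different API.
    #include <iostream>
    #include <map>
    #include <string>
    #include <vector>

    struct Histogram {                 // toy histogram: title + bin contents
        std::string title;
        std::vector<double> bins;
    };

    class OnlineHistogramService {     // transient repository, as on the slide
        std::map<std::string, Histogram> repo_;
    public:
        void publish(const std::string& name, const Histogram& h) { repo_[name] = h; }
        const Histogram* retrieve(const std::string& name) const {
            auto it = repo_.find(name);
            return it == repo_.end() ? nullptr : &it->second;
        }
    };

    int main() {
        OnlineHistogramService ohs;
        // Producer side (GNAM): fill and publish, knowing nothing about displays.
        ohs.publish("MDT/tdc", {"TDC spectrum", {1, 4, 9, 4, 1}});
        // Consumer side (OHP): retrieve by name, knowing nothing about producers.
        if (const Histogram* h = ohs.retrieve("MDT/tdc"))
            std::cout << h->title << " has " << h->bins.size() << " bins\n";
    }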

GNAM structure design
● Detector-independent architecture: rely on the fragment structure, not on its content.
● GNAM must be modular: common actions (interaction with the DAQ, data unpacking, histogram management) are separated from detector-specific ones (data decoding, histogram filling).
● GNAM must be able to run automatically, controlled by the TDAQ system.
● Histogram production and display must be independent of one another.
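A minimal C++ sketch of the common-core/plugin split described above, assuming a hypothetical GnamPlugin interface; the hook names are illustrative, not the actual GNAM entry points:

    // Sketch of the common-core / detector-plugin separation.
    // NOTE: GnamPlugin, bookHistograms() and decodeAndFill() are hypothetical
    // names that only illustrate the split, not the actual GNAM interfaces.
    #include <iostream>
    #include <vector>

    struct Fragment { std::vector<unsigned> words; };   // opaque detector payload

    class GnamPlugin {                                  // detector-specific part
    public:
        virtual ~GnamPlugin() = default;
        virtual void bookHistograms() = 0;              // called at configure time
        virtual void decodeAndFill(const Fragment&) = 0; // called once per event
    };

    class MdtPlugin : public GnamPlugin {               // example detector plugin
    public:
        void bookHistograms() override { std::cout << "book MDT histograms\n"; }
        void decodeAndFill(const Fragment& f) override {
            std::cout << "decode " << f.words.size() << " MDT words\n";
        }
    };

    // Common core: DAQ interaction, unpacking, histogram management.
    void runCore(GnamPlugin& plugin, const std::vector<Fragment>& events) {
        plugin.bookHistograms();
        for (const Fragment& f : events) plugin.decodeAndFill(f);
    }

    int main() {
        MdtPlugin mdt;
        std::vector<Fragment> events = {{{0xCAFEu, 0xBEEFu}}, {{0x1234u}}};
        runCore(mdt, events);
    }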

GNAM + OHP monitoring chain


Finite State Machine
● ATLAS TDAQ is organized as a Finite State Machine (FSM).
● In order to run synchronously with the DAQ, GNAM has been implemented as an FSM, too.
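A compact C++ sketch of such a run-control FSM; the state and transition names are indicative only, not the exact TDAQ ones:

    // Minimal run-control FSM sketch: GNAM mirrors the TDAQ state machine so
    // it can be driven through the same transitions as the rest of the DAQ.
    // NOTE: state and transition names are indicative, not the exact TDAQ ones.
    #include <iostream>
    #include <string>

    enum class State { Initial, Configured, Running };

    class GnamFsm {
        State state_ = State::Initial;
    public:
        void configure() {            // load plugins, book histograms
            if (state_ == State::Initial) state_ = State::Configured;
        }
        void start() {                // begin sampling events
            if (state_ == State::Configured) state_ = State::Running;
        }
        void stop() {                 // publish final histograms
            if (state_ == State::Running) state_ = State::Configured;
        }
        std::string name() const {
            switch (state_) {
                case State::Initial:    return "Initial";
                case State::Configured: return "Configured";
                default:                return "Running";
            }
        }
    };

    int main() {
        GnamFsm fsm;
        fsm.configure(); fsm.start();
        std::cout << "state: " << fsm.name() << "\n";   // state: Running
        fsm.stop();
        std::cout << "state: " << fsm.name() << "\n";   // state: Configured
    }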

Further GNAM features
● GNAM can also be controlled interactively, e.g. run outside the main TDAQ partition.
● file_sampler: a separate application that can read any raw data file and emulate a standard sampler, e.g. to run without any real TDAQ data source. Especially useful for development and debugging.
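The following sketch shows the idea behind a file_sampler-like tool: read events from a file and hand them to the same callback a live sampler would feed. The record format used here (a 32-bit word count followed by the payload) is a toy layout for illustration, not the real ATLAS bytestream format.

    // Sketch of a file_sampler-like tool: read events from a raw file and
    // deliver them through the same callback a live sampler would use.
    // NOTE: the record format (32-bit size word + payload) is a toy layout.
    #include <cstdint>
    #include <fstream>
    #include <functional>
    #include <iostream>
    #include <vector>

    using EventCallback = std::function<void(const std::vector<uint32_t>&)>;

    bool sampleFromFile(const std::string& path, const EventCallback& cb) {
        std::ifstream in(path, std::ios::binary);
        if (!in) return false;
        uint32_t nwords = 0;
        while (in.read(reinterpret_cast<char*>(&nwords), sizeof nwords)) {
            std::vector<uint32_t> event(nwords);
            if (!in.read(reinterpret_cast<char*>(event.data()),
                         nwords * sizeof(uint32_t)))
                break;                              // truncated last record
            cb(event);                              // emulate a live sampler
        }
        return true;
    }

    int main(int argc, char** argv) {
        if (argc < 2) { std::cerr << "usage: file_sampler <rawfile>\n"; return 1; }
        sampleFromFile(argv[1], [](const std::vector<uint32_t>& ev) {
            std::cout << "sampled event of " << ev.size() << " words\n";
        });
    }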

Online Histogram Presenter
An interactive presenter developed as a separate application to display the histograms published in the OHS.
Requirements for OHP:
● both browse all the histograms produced and display configured sets of histograms;
● send commands (reset, rebin, update) to GNAM;
● perform operations on histograms (fitting, zooming, ...);
● display also histograms produced by non-GNAM applications;
● manage reference histograms.
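A small C++ sketch of the presenter-to-producer command path listed above; the Command type and the dispatch mechanism are hypothetical, since in the real system such requests travel through the TDAQ online services:

    // Sketch of the presenter-to-producer command path: OHP issues
    // reset/rebin/update requests that a GNAM-side dispatcher acts on.
    // NOTE: CommandMessage and handleCommand() are hypothetical stand-ins.
    #include <iostream>
    #include <string>

    enum class Command { Reset, Rebin, Update };

    struct CommandMessage {
        Command cmd;
        std::string histogram;   // which published histogram it refers to
        int parameter;           // e.g. new bin count for Rebin
    };

    void handleCommand(const CommandMessage& m) {   // GNAM side
        switch (m.cmd) {
            case Command::Reset:
                std::cout << "reset " << m.histogram << "\n"; break;
            case Command::Rebin:
                std::cout << "rebin " << m.histogram << " to "
                          << m.parameter << " bins\n"; break;
            case Command::Update:
                std::cout << "republish " << m.histogram << "\n"; break;
        }
    }

    int main() {   // OHP side: user clicks "rebin" on a displayed histogram
        handleCommand({Command::Rebin, "MDT/tdc", 64});
    }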

OHP as a browser for OHS

OHP: example of configured tab

Combined Test Beam 2004
ATLAS CTB 04: May -> Nov 2004. An entire slice of ATLAS exposed to the beam.

GNAM and CTB 2004
GNAM and OHP were used throughout the test beam to inspect the detector status, find faulty states, check calibrations, ... By the end of the CTB, GNAM and OHP managed more than 1000 histograms, produced by six processes running on two low-cost PCs.

ATLAS commissioning
In a few months, all ATLAS detectors will be undergoing the final commissioning phase, with a heavy load of pre- and post-installation tests. Some detectors are already using GNAM and OHP during test runs, and others have expressed interest: Pixel, TRT, LArg, Tile, TGC, MDT, RPC.

From CTB to commissioning
Commissioning is more demanding than the CTB, so GNAM and OHP need to be upgraded:
● GNAM: use databases to trace and handle all the different elements and conditions (cabling maps, broken/noisy channels) and to store monitoring results (histograms, ...).
● OHP is being rewritten, especially its connection with the OHS, to gain scalability and reduce network traffic.
● Discussion has started on automatic error recognition and alarm generation.

Conclusions
● GNAM and OHP are fully integrated in the ATLAS TDAQ.
● GNAM and OHP were extensively used at the CTB and proved to be light and extensible.
● GNAM and OHP are being used and enhanced for ATLAS commissioning.
● Experience gained in the commissioning phase will be used to upgrade both GNAM and OHP for ATLAS.

Backup slides
More info at:

Data encapsulation in ATLAS
Apart from the ROB, any fragment type usually contains more than one sub-fragment.
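A simplified C++ sketch of this nesting, with stand-in types rather than the real ATLAS event-format classes: every level except the ROB holds sub-fragments, so monitoring code walks the tree and decodes only at the ROB leaves.

    // Sketch of the nested fragment layout: every level except the ROB holds
    // sub-fragments, so the walk descends until it reaches the ROB leaves.
    // NOTE: the types are simplified stand-ins, not the ATLAS event-format classes.
    #include <iostream>
    #include <string>
    #include <vector>

    struct Fragment {
        std::string level;                 // "FullEvent", "SubDetector", "ROS", "ROB"
        std::vector<Fragment> children;    // empty only for ROB fragments
    };

    void walk(const Fragment& f, int depth = 0) {
        std::cout << std::string(2 * depth, ' ') << f.level << "\n";
        if (f.children.empty()) return;    // ROB: hand the payload to the decoder here
        for (const Fragment& child : f.children) walk(child, depth + 1);
    }

    int main() {
        Fragment event{
            "FullEvent",
            {
                {"SubDetector", {{"ROS", {{"ROB", {}}, {"ROB", {}}}}}},
                {"SubDetector", {{"ROS", {{"ROB", {}}}}}},
            }};
        walk(event);
    }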

ATLAS monitoring schema (slide by S. Kolos)

GNAM plugin entrypoints
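One common way to realize plugin entry points is run-time loading through the POSIX dynamic loader; the sketch below assumes that scheme. The entry-point names (gnam_book, gnam_fill) are hypothetical illustrations; only dlopen/dlsym/dlclose are real POSIX calls.

    // Sketch of a core process picking up a detector plugin at run time via
    // the POSIX dynamic loader, as a plugin-entrypoint scheme implies.
    // NOTE: the entry-point names gnam_book/gnam_fill are hypothetical.
    #include <dlfcn.h>
    #include <iostream>

    using BookFn = void (*)();
    using FillFn = void (*)(const unsigned* data, unsigned nwords);

    int main(int argc, char** argv) {
        if (argc < 2) { std::cerr << "usage: loader <plugin.so>\n"; return 1; }

        void* lib = dlopen(argv[1], RTLD_NOW);       // load the detector plugin
        if (!lib) { std::cerr << dlerror() << "\n"; return 1; }

        // Look up the agreed-upon entry points exported by the plugin.
        auto book = reinterpret_cast<BookFn>(dlsym(lib, "gnam_book"));
        auto fill = reinterpret_cast<FillFn>(dlsym(lib, "gnam_fill"));
        if (!book || !fill) { std::cerr << "missing entry point\n"; return 1; }

        book();                                      // booking phase
        unsigned event[] = {0xCAFEu, 0xBEEFu};
        fill(event, 2);                              // per-event decode + fill

        dlclose(lib);
    }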