Level-1 Calo Monitoring


Level-1 Calo Monitoring

- Level-1 architecture
- Data sources and monitoring rationale
- Monitoring tasks:
  1. Detailed rate histograms
  2. Internal consistency
  3. Calorimeter comparison
  4. CTP comparison
  5. Module status
- Archiving

Stephen Hillier, University of Birmingham, 7 July 2005

Level-1 Architecture and Monitoring Data Sources

[Block diagram of the real-time data path, from calorimeter signals to the CTP:]
- Preprocessor: 124 modules (receives the calorimeter signals, produces digitized energies)
- Cluster Processor: 56 modules
- Jet/Energy Processor: 32 modules
- Merging: 8 modules (CP) and 4 modules (JEP), sending merged results to the CTP

Readout, off the real-time path:
- Readout Drivers (RODs) for DAQ data: 14 modules
- RODs for Region-of-Interest (RoI) data: 6 modules

Level-1 Architecture and Monitoring Data Sources (annotated)

[The same block diagram, overlaid with the five monitoring tasks and where each taps the system:]
1. Rate histograms (at the Preprocessor)
2. Internal consistency (across the digital processing and readout chain)
3. Calorimeter comparison (against the incoming calorimeter signals)
4. CTP comparison (on the merged results sent to the CTP)
5. Module status (read from every module)
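For orientation, a minimal Python sketch follows (not ATLAS online software; every name in it is invented) that simply tabulates the module counts quoted on the diagram and the data source behind each monitoring task:

```python
# Illustrative only: module counts as quoted on the diagram, and the
# data source each of the five monitoring tasks reads.
MODULE_COUNTS = {
    "Preprocessor (PPM)": 124,
    "Cluster Processor": 56,
    "Jet/Energy Processor": 32,
    "Merging (CP)": 8,
    "Merging (JEP)": 4,
    "ROD (DAQ data)": 14,
    "ROD (RoI data)": 6,
}

MONITORING_SOURCES = {
    "1. Rate histograms": "PPM hardware histograms, read over VME",
    "2. Internal consistency": "DAQ and RoI readout fragments (built events)",
    "3. Calorimeter comparison": "L1Calo plus LAr/Tile fragments (built events)",
    "4. CTP comparison": "merged results sent to the CTP vs CTP readout",
    "5. Module status": "VME status registers of every module",
}

if __name__ == "__main__":
    print("modules in the diagram:", sum(MODULE_COUNTS.values()))
    for task, source in MONITORING_SOURCES.items():
        print(f"{task}: {source}")
```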

Detailed Rate Histograms

- The PPM sees the 'raw' trigger towers
- Hardware histogramming capability: rates, hit maps, bunch-crossing (BC) number dependence
- By far the largest monitoring data source (bar the event stream)
- Proposed analysis route (sketched in code below):
  - VME read to the SBC (one per PPM crate, x8, in USA15)
  - Publish to a 'gatherer' in a local PC?
  - Export to the surface at regular intervals?
  - Update time O(1 minute)
  - Monitoring farm?
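A minimal sketch of the proposed flow, assuming hypothetical stand-ins: read_ppm_histograms() for the SBC's VME read of the PPM hardware histograms, and a Gatherer class for the publish step on the local PC. Neither is a real L1Calo interface, and the histogram sizes are placeholders:

```python
import time

UPDATE_PERIOD_S = 60  # "update time O(1 minute)"

def read_ppm_histograms(crate: int) -> dict:
    """Placeholder for the VME read, on the crate SBC, of one PPM crate's
    hardware histograms (rates, hit maps, BC-number dependence).
    Sizes here are illustrative, not the real histogram dimensions."""
    return {"rates": [0] * 16, "hitmap": [0] * 16, "bcn": [0] * 16}

class Gatherer:
    """Placeholder for the 'gatherer' on a local PC that sums the
    per-crate contributions before export to the surface."""
    def __init__(self):
        self.summed = {}

    def publish(self, histos: dict) -> None:
        for name, bins in histos.items():
            acc = self.summed.setdefault(name, [0] * len(bins))
            for i, v in enumerate(bins):
                acc[i] += v

def monitoring_cycle(gatherer: Gatherer, n_crates: int = 8) -> None:
    # One VME read per PPM crate (the "x8 VME read" on the slide)
    for crate in range(n_crates):
        gatherer.publish(read_ppm_histograms(crate))
    # Here the summed histograms would be exported upstairs, e.g. to a
    # monitoring farm; deliberately left abstract in this sketch.

if __name__ == "__main__":
    g = Gatherer()
    while True:
        monitoring_cycle(g)
        time.sleep(UPDATE_PERIOD_S)
```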

Internal Consistency

- L1Calo data is routed via 6 ROSes: 3 for the PPM, 1 for the CP system, 1 for the JEP system, 1 for RoI data
- Full internal consistency checks require event building (SFI, EF, SFO)
- Event rate: no hard requirement; as many events as possible
- Results: hopefully very small (error flags, location of errors, MRS messages?); ideally one flag saying NO PROBLEMS!
- Some data monitoring is performed at the ROS itself: CP, JEP and RoI hit maps (no need for a gatherer); a relatively small amount of data
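The digital checks could look like the following toy sketch: re-derive a processor result from the PPM towers in software and compare it with what the hardware read out. The data layout and threshold logic below are invented for illustration; the real checks run over fully built events from all six ROSes:

```python
def simulate_cp_hits(towers: list, threshold: int) -> int:
    """Toy stand-in for the CP algorithm: count towers over threshold.
    The real cross-check would re-run the actual trigger logic."""
    return sum(1 for et in towers if et > threshold)

def check_event(ppm_towers: list, cp_hits_readout: int,
                threshold: int = 5) -> list:
    """Return a list of error descriptions; empty means NO PROBLEMS."""
    errors = []
    expected = simulate_cp_hits(ppm_towers, threshold)
    if expected != cp_hits_readout:
        errors.append(
            f"CP hit mismatch: simulated {expected}, "
            f"read out {cp_hits_readout}"
        )
    return errors

if __name__ == "__main__":
    # One toy event: PPM towers plus the hit count the hardware reported.
    errors = check_event([0, 3, 7, 12, 1], cp_hits_readout=2)
    # Ideally a single flag saying NO PROBLEMS; otherwise flag and locate.
    print("NO PROBLEMS" if not errors else "\n".join(errors))
```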

Calorimeter Comparison

- Requirements similar to internal consistency
- Needs Liquid Argon and Tile data, and therefore requires event building
- Event rate: as many as possible
- Results: a number (?) of histograms, errors, warnings; probably smallish (but not as small as the digital comparisons)

Calibration notes:
- We do not intend to use physics runs for online calibration
- Calibration requires dedicated runs, which will probably use the monitoring tools, but not at high rate
- Fine calibration will probably eventually be done offline
- But monitoring will be used to verify the current calibrations, e.g. checking reference histograms for correlations
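One concrete form of reference-histogram checking is a 2D correlation of trigger-tower ET against the summed cell ET of the matching calorimeter region. A minimal sketch, assuming the matching of the two readouts is done upstream and using toy values:

```python
import numpy as np

def fill_correlation(l1_et, calo_et, n_bins: int = 20,
                     et_max: float = 100.0):
    """2D histogram of L1 trigger-tower ET vs matched calorimeter ET."""
    edges = np.linspace(0.0, et_max, n_bins + 1)
    hist, _, _ = np.histogram2d(l1_et, calo_et, bins=[edges, edges])
    return hist

if __name__ == "__main__":
    # Toy matched towers: a well-calibrated system populates the diagonal;
    # off-diagonal entries would flag a calibration or timing problem.
    l1_et = [10.0, 22.0, 35.0, 51.0]
    calo_et = [11.5, 20.0, 36.0, 49.0]
    h = fill_correlation(l1_et, calo_et)
    print("entries filled:", int(h.sum()))
```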

CTP Comparison

- Requirements similar again
- Needs CTP data, but the data size is trivial: L1Calo sends only ~100 bits to the CTP
- But it is important that they are correct!
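Because the payload is only ~100 bits, the comparison itself is simple: XOR the word L1Calo believes it sent against the word the CTP recorded and report any differing bit positions. A minimal sketch with illustrative values:

```python
N_CTP_BITS = 100  # "~100 bits" quoted on the slide

def compare_ctp_words(sent: int, received: int) -> list:
    """Return the positions of any bits that disagree."""
    diff = sent ^ received
    return [i for i in range(N_CTP_BITS) if (diff >> i) & 1]

if __name__ == "__main__":
    sent = 0b1011_0001      # word L1Calo believes it sent (toy value)
    received = 0b1010_0001  # word the CTP recorded: bit 4 has flipped
    bad = compare_ctp_words(sent, received)
    print("OK" if not bad else f"mismatch in bits {bad}")
```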

Module Status

- A relatively small number of parameters is read by VME from every module: link status, parity errors, overflows, logic faults
- Not (in general) DCS-type information
- Typically <100 words per module, but remember the system has ~300 modules
- Error flags should be alerted immediately! Publish frequency 'high': O(1 second)
- Need a good GUI in the control room: immediate alerts, quick fault identification
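A minimal sketch of the proposed polling loop: once per second, read the status words of every module over VME and alert on any set error flag. read_status_words() and the flag bit assignments are hypothetical placeholders, not real L1Calo register layouts:

```python
STATUS_FLAGS = {0: "link error", 1: "parity error",
                2: "overflow", 3: "logic fault"}  # bit layout invented

def read_status_words(module_id: int) -> int:
    """Placeholder for the VME read of a module's status block
    (<100 words per module; collapsed to one word in this toy)."""
    return 0  # all clear

def poll_once(module_ids) -> list:
    """One polling pass over the whole system; repeated with O(1 s) period."""
    alerts = []
    for mid in module_ids:
        status = read_status_words(mid)
        for bit, meaning in STATUS_FLAGS.items():
            if (status >> bit) & 1:
                # should reach the control-room GUI immediately
                alerts.append(f"module {mid}: {meaning}")
    return alerts

if __name__ == "__main__":
    alerts = poll_once(range(300))  # "system has ~300 modules"
    print(alerts if alerts else "all modules OK")
```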

Data Archiving

Ideal: save all data from every run; keep everything forever, immediately accessible.

More realistic proposal:
- Run summary histograms: at most 10 histos; keep forever, quick access
- Detailed summary histograms: O(100) histos; keep forever, slow access
- All data and histograms: keep O(1 week), quick access
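The proposal maps naturally onto a small retention-policy table; the sketch below just restates the three tiers as a config structure (field names invented):

```python
# Field order: (what, how many, retention, access). Names invented.
ARCHIVING_POLICY = [
    ("run summary histograms",      "at most 10", "forever",   "quick"),
    ("detailed summary histograms", "O(100)",     "forever",   "slow"),
    ("all data and histograms",     "everything", "O(1 week)", "quick"),
]

for what, count, keep, access in ARCHIVING_POLICY:
    print(f"{what}: {count}; keep {keep}; {access} access")
```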