TDAQ commissioning and status Stephen Hillier, on behalf of TDAQ


TDAQ commissioning and status
Stephen Hillier, on behalf of TDAQ
February 12th, 2008

Outline:
- System Overview
- Level-1 Trigger
- HLT infrastructure
- DAQ infrastructure
- DAQ software
- Commissioning Runs

[Photo: the ATLAS Control Room during a TDAQ technical run. Everything under control!]

Trigger/DAQ architecture
- Detector front-end electronics (detector responsibility) feed the detector RODs
- Level-1 Trigger: custom pipelined hardware
- Region of Interest Builder: custom hardware
- Readout System: custom-built buffers in a ROS PC farm
- High-Level Trigger: large PC farm, high data bandwidth on a dedicated 'data' network
- Event Building: more PC farms on the 'data' network
- DAQ software: control, configuration, monitoring (control network)
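To make the staged dataflow concrete, here is a minimal sketch (Python, purely illustrative) that propagates round rates through the trigger levels. The 40 MHz bunch-crossing rate and ~75 kHz Level-1 accept rate are approximate ATLAS design figures assumed for the example, while the ~3 kHz event-building rate, 1.5 MByte event size and ~300 MByte/s output come from the DAQ slides later in this talk:

```python
# Illustrative sketch only: approximate design figures, not measurements from this talk.
STAGES = [
    # (stage name, output rate in Hz)
    ("Bunch crossings",             40_000_000),
    ("Level-1 (custom hardware)",       75_000),
    ("Level-2 / event building",         3_000),
    ("Event Filter -> storage",            200),
]

EVENT_SIZE_MB = 1.5  # nominal built-event size quoted later in the talk

def print_dataflow(stages, event_size_mb):
    """Print the accept rate and rejection factor at each trigger level."""
    previous_rate = None
    for name, rate in stages:
        rejection = f"  (rejection x{previous_rate / rate:,.0f})" if previous_rate else ""
        print(f"{name:28s} {rate:>12,} Hz{rejection}")
        previous_rate = rate
    # Only events accepted by Level-2 are fully built and carry the full event size.
    print(f"Event-building bandwidth ~ {stages[2][1] * event_size_mb / 1000:.1f} GB/s")
    print(f"Output to storage        ~ {stages[3][1] * event_size_mb:.0f} MB/s")

if __name__ == "__main__":
    print_dataflow(STAGES, EVENT_SIZE_MB)
```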

Level-1 Trigger System
- Three major systems:
  - Calorimeter Trigger
  - Muon Trigger
  - Central Trigger Processor (CTP)
- Other triggers and signals are also integrated by the CTP:
  - Minimum bias
  - Luminosity triggers
  - Beam pick-up
- The CTP distributes all timing information

Central Trigger Processor (CTP)
- Core CTP available since 2006; installation now complete
- New trigger inputs added as they come online
- Essential for detector commissioning; continuous heavy demand, hence the introduction of...
- Local Trigger Processor Interfaces (LTPi)
  - Allow flexible subdetector local triggering across multiple LTPs
  - 30 modules produced, 5 tested
  - Installation architecture planned for the 'calorimeter loops'
  - Discussion of ID and muon usage required
  - Documentation and software available

Level-1 Calorimeter Trigger
- Installation in USA15 completed in December 2007
  - Almost 300 complex custom modules, plus many simpler ones, spread over 10 racks
- Over 3000 cables laid:
  - ~800 thick analogue electrical
  - ~2000 thin digital electrical
  - ~400 optical fibres
- Commissioning with the calorimeters well underway
  - In M3, M4, M5 and dedicated 'calorimeter' weeks
- Jet triggers already used; electron/gamma/tau now ready

Level-1 Muon Trigger
- On-detector electronics essentially complete, but power supply delivery is slow
- Off-detector electronics (RODs, sector logic, etc.) complete by end of March
- CTP interface components available
- Cables and fibres almost finished

Level-1 Trigger Commissioning
- Calorimeter trigger signals need thorough testing before access disappears (probably end of April); about half tested so far
- Muon trigger commissioning currently done sector by sector, following the availability of gas and power supplies
- Timing needs to be addressed: all triggers are required to have the same timing with respect to the (as yet non-existent) bunch crossing
  - The complete system (calo and muon) is rarely, or rather never, available
  - Timing workshop, 11th March
[Plots: HAD correlation between Trigger Tower (Level-1 Calo) and Tile energy; correlation between hits in RPC and MDT]

Region of Interest Builder
- Installed and working for a long time
- Takes 8 inputs: 1 CTP, 1 muon trigger, 6 calorimeter trigger
- Region of Interest outputs tested in commissioning runs

High Level Trigger Infrastructure
- HLT nodes consist of:
  - Level-2 supervisors
  - Level-2 processing nodes
  - Event Filter nodes
- Assignment of roles is flexible, at the granularity of a rack (e.g. in M5: 64 Level-2, 96 EF)
- Currently 5 racks of PCs: >150 nodes, 1U rack-mounted dual quad-core
  - Regularly used in commissioning runs
- More to come in the next three months: expansion to 830 nodes, 35% of the foreseen system, installed rapidly at 3 racks per week (see the rough sizing check below)
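The numbers quoted on this slide can be cross-checked with simple arithmetic; the sketch below (Python, using only figures from this slide) estimates the nodes per rack, the full foreseen farm size and the installation time for the planned expansion:

```python
# Back-of-the-envelope check of the HLT farm figures quoted on this slide.
current_nodes    = 150    # ">150 nodes" in the current racks
current_racks    = 5
planned_nodes    = 830    # expansion target, quoted as 35% of the foreseen system
planned_fraction = 0.35
racks_per_week   = 3      # quoted installation rate

nodes_per_rack = current_nodes / current_racks           # ~30 nodes per rack of 1U PCs
full_system    = planned_nodes / planned_fraction        # ~2400 nodes foreseen in total
extra_racks    = (planned_nodes - current_nodes) / nodes_per_rack
weeks_needed   = extra_racks / racks_per_week            # ~8 weeks, i.e. well within
                                                         # the "next three months"

print(f"~{nodes_per_rack:.0f} nodes/rack, foreseen full system ~{full_system:.0f} nodes")
print(f"~{extra_racks:.0f} additional racks, ~{weeks_needed:.0f} weeks at {racks_per_week} racks/week")
```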

High Level Trigger Algorithms
- HLT algorithm integration tested in 'technical runs' (week-long and 24-hour), and also in the Mx weeks
- Performance and timing studies
- See the talk by Frank Winklmeier (this afternoon)

Network Infrastructure
- Two vital, fast networks:
  - Data network: dedicated data transport
  - Control network: responsive configuration and monitoring
- Network expansion in January 2008: added a second control and data core
  - Data core not yet connected to the ROSes

DAQ infrastructure
[Diagram: dataflow view of the TDAQ infrastructure. Underground, the detectors feed Readout Drivers, which send data over dedicated optical links into the Readout Buffers of the Readout System nodes. On the surface, the Data Flow Manager and Sub-Farm Inputs sit on the data network, and the Sub-Farm Outputs write to permanent storage.]

Readout System (ROS)
- Detector Readout Drivers feed the Readout Buffers (ROBINs) via 1592 readout links
  - All originally foreseen links installed; a few more needed for the forward detectors
- ROBINs located in 149 ROS PCs, plus 4 'hot' spares
  - All installed and tested in detector commissioning and technical runs
- Fast network interfaces to the data network

Sub-Farm Input and Output
- Sub-Farm Inputs perform event building, only on Level-2 selected events, and prepare events for the Event Filter
  - 32 PCs available and used
  - Event building at ~3 kHz (event size 1.5 MByte)
- Sub-Farm Outputs write Event-Filter-selected events to disk, in separate streams
  - 6 available (the final number)
  - 300 MByte/s to the computer centre
(A rough bandwidth check follows below.)
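These figures hang together; the short sketch below (Python, using only numbers from this slide) works out the implied aggregate and per-node bandwidths and the rate written to storage:

```python
# Rough bandwidth check of the event-building and output figures on this slide.
eb_rate_hz     = 3_000     # event-building rate (Level-2 accepted events)
event_size_mb  = 1.5       # nominal event size in MByte
n_sfi          = 32        # Sub-Farm Input PCs
sfo_output_mbs = 300       # quoted output rate to the computer centre, MByte/s
n_sfo          = 6         # Sub-Farm Output PCs (final number)

eb_total_mbs    = eb_rate_hz * event_size_mb      # ~4500 MByte/s aggregate building
eb_per_sfi_mbs  = eb_total_mbs / n_sfi            # ~140 MByte/s per SFI, of the order
                                                  # of a single Gbit link (~125 MByte/s)
storage_rate_hz = sfo_output_mbs / event_size_mb  # ~200 Hz written to disk
sfo_per_node    = sfo_output_mbs / n_sfo          # ~50 MByte/s per SFO

print(f"Event building: {eb_total_mbs / 1000:.1f} GByte/s total, {eb_per_sfi_mbs:.0f} MByte/s per SFI")
print(f"Storage: ~{storage_rate_hz:.0f} Hz, {sfo_per_node:.0f} MByte/s per SFO")
```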

Plus other necessary pieces
- Online services
- ATLAS control room PCs
- Monitoring farm: 32 PCs
- File servers, gateways, web servers
- Infrastructure all tested during many runs

TDAQ software
- Provides common tools for:
  - Control
  - Configuration
  - Configuration and Conditions databases
  - Monitoring
- Current version: tdaq-01-08-04
  - Validation via the technical run, 4th-11th February
  - Synchronized with HLT-13.2.0

Monitoring and Data Quality
- An extensive framework already exists
  - Many older and more mature tools: status displays, histogram producers, gatherers, displays, data quality
- Newer tools are also available and being improved/extended:
  - Monitoring Data Archive (MDA): histograms and information automatically stored in CASTOR
  - Operational Monitoring Display (OMD): configurable display of any published information, with time evolution and statistical analysis (see the sketch after this slide)
  - Trigger Presenter (TriP): single point of display of trigger quantities at all levels
- Remote online monitoring is becoming a reality: the Web Monitoring Interface has been commissioned
  - Run Status: http://pcatdwww.cern.ch/atlas-point1/wmi/Run%20Status_wmi/index.html
  - Data Quality: http://pcatdwww.cern.ch/atlas-point1/wmi/Data%20Quality%20Monitoring_wmi/index.html
- Online remote access to P1 during data taking is being assessed
  - See http://indico.cern.ch/conferenceDisplay.py?confId=23593
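As a generic illustration of what "time evolution and statistical analysis" of a published quantity looks like, here is a minimal sketch in plain Python. It is not the actual TDAQ or OMD API; the monitored quantity name and the sample values are invented for the example:

```python
# Generic illustration of OMD-style time evolution / statistics for a published
# quantity. This is NOT the real TDAQ monitoring API, just a minimal sketch.
import statistics
import time
from collections import deque

class TimeSeriesMonitor:
    """Keep a rolling, timestamped history of one published value."""

    def __init__(self, name, max_points=1000):
        self.name = name
        self.history = deque(maxlen=max_points)   # (timestamp, value) pairs

    def publish(self, value):
        self.history.append((time.time(), value))

    def summary(self):
        values = [v for _, v in self.history]
        return {
            "name": self.name,
            "n": len(values),
            "mean": statistics.fmean(values),
            "stdev": statistics.pstdev(values),
            "last": values[-1],
        }

# Example: a hypothetical Level-1 accept-rate counter sampled a few times.
l1_rate = TimeSeriesMonitor("L1_accept_rate_Hz")
for sample in (74_800, 75_100, 74_950):
    l1_rate.publish(sample)
print(l1_rate.summary())
```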

Trigger Presenter
[Screenshot of the Trigger Presenter display]

Technical Run, 4th-11th February
- Concentration on stress tests
  - Stress the data flow at all levels
  - Heavy use of the new Run Control implementation
- Achievements
  - Stable running (several hours without intervention)
  - Event sizes from 1 MB to 10 MB
  - Event building and data writing ran at Gbit link limits
  - Controlled ~1500 applications over 350 nodes; successful, though some tweaks needed in error handling
  - Playback of M4, M5 and Monte Carlo data to test all algorithm slices; feedback of problems useful for M6 preparation
  - Generated 10^31 luminosity data set used to test the HLT menu for initial running

Technical Run Display
[Monitoring display from the technical run: 1.5 hours of running time, stable trigger rate, a memory leak spotted]

Summary
- The vast majority of the hardware is now installed, and tested in many detector and technical runs
  - Most of the requirements for initial luminosity are met
  - A large program of HLT node installation is planned
- Commissioning weeks progressively more successful
  - Mx weeks: more detectors, more stability
  - Technical runs more ambitious
- Work to do
  - Ongoing software development, improvement and testing
  - Timing of the Level-1 trigger system
  - More experience with realistic triggers and rates, at all trigger levels
  - Learn how to operate in 24/7 mode