Commissioning of the ATLAS High Level Trigger with Single Beam and Cosmic Rays
RT2009 Conference, Beijing (China), 12-May-2009
Cristobal Padilla (IFAE-Barcelona/CERN) on behalf of the ATLAS Collaboration

Outline:
1. Review of HLT architecture
2. Tools needed for commissioning
3. Operational conditions for single-beam and cosmic rays
4. HLT results from single-beam and cosmic ray running
5. Summary

Trigger/DAQ Architecture

[Diagram: the three-level ATLAS Trigger/DAQ architecture]
- LVL1: hardware trigger on calorimeter and muon trigger chamber data; 40 MHz input, accept rate 75 kHz, 2.5 μs latency. Front-end pipelines are read out through Read-Out Drivers (RODs) and Read-Out Links into Read-Out Buffers (ROBs) within the Read-Out Sub-systems (ROSs), at ~120 GB/s.
- LVL2: the RoI Builder (ROIB), L2 Supervisor (L2SV) and L2 Processing Units (L2P) connected by the L2 network (L2N); requests only RoI data (1-2% of the event, ~2+4 GB/s); ~10 ms latency, accept rate ~2 kHz.
- Event Building: the Dataflow Manager (DFM) and Sub-Farm Inputs (SFIs) assemble accepted events over the Event Building network (EBN) at ~4 GB/s.
- Event Filter (EF): Event Filter Processors (EFP) on the Event Filter network (EFN); ~1 s latency, accept rate ~200 Hz, giving ~300 MB/s to the Sub-Farm Outputs (SFOs).
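As a back-of-envelope check, the per-level rejection factors implied by these nominal rates can be computed directly. A minimal, purely illustrative snippet:

```python
# Rejection factors implied by the nominal design rates quoted above.
rates = [("bunch crossings", 40e6), ("LVL1", 75e3), ("LVL2", 2e3), ("EF", 200.0)]
for (prev_name, prev), (name, rate) in zip(rates, rates[1:]):
    print(f"{name}: {rate:>10.0f} Hz  (rejection ~x{prev / rate:.0f} vs {prev_name})")
# LVL1 ~x533, LVL2 ~x38, EF ~x10; overall 40 MHz -> ~200 Hz, a factor ~200,000.
```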

HLT Hardware Installation
- Currently 824 nodes in 27 racks
- 8 cores/node (2× Intel "Harpertown" quad-core CPUs)
- 2 GB memory/core
- Each node can run either L2 or EF
- Example: nodes monitored with a tool based on Nagios

HLT Structure
The basic ingredient of the HLT is the HLT Steering, which controls the flow of code execution inside the HLT processing units.
- Algorithms for feature extraction (FEX) and for applying requirements (HYPO)
- Configurable by parameters
- Results (Features) are cached by the HLT Steering
- Sequences: FEX and HYPO algorithms producing TriggerElements (e.g. L2_mu10)
- Chains: ordered lists of defined numbers of TriggerElements
- The Steering aborts a chain as soon as a given step fails (early reject)
- Menu: the collection of chains (plus pass-through and prescale settings)
- Written in Python or XML, recorded in a database
A sketch of this pattern is given below.

[Diagram: an e/γ chain. An EM RoI seeds the L2 calorimeter step (TrigEMCluster: "Cluster?"), then L2 tracking (TrigInDetTracks: "Track?", "Match?"); the EF then runs calorimeter reconstruction (CaloCluster), EF tracking (EFTrack) and e/γ reconstruction (egamma: "e OK? γ OK?"). Each step is a FEX producing a Feature plus a HYPO testing it.]
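A minimal sketch of the steering pattern just described, with invented class and chain names and toy thresholds (the real configuration lives in Python/XML and a database, as noted above):

```python
# Toy model of HLT steering: chains are ordered sequences of FEX+HYPO steps.
# Features are cached so chains sharing a FEX run it once; a chain is aborted
# at the first failing HYPO (early reject).

class Sequence:
    def __init__(self, name, fex, hypo):
        self.name = name   # TriggerElement it produces, e.g. "L2_mu10"
        self.fex = fex     # callable: roi -> feature
        self.hypo = hypo   # callable: feature -> bool

class Chain:
    def __init__(self, name, sequences, pass_through=False):
        self.name = name
        self.sequences = sequences
        self.pass_through = pass_through

    def execute(self, roi, cache):
        for seq in self.sequences:
            if seq.name not in cache:            # feature caching
                cache[seq.name] = seq.fex(roi)
            if not seq.hypo(cache[seq.name]) and not self.pass_through:
                return False                     # early reject
        return True

# Toy menu: one muon chain with a single 10 GeV threshold step.
menu = [Chain("L2_mu10",
              [Sequence("MufastTE",
                        fex=lambda roi: roi["pt"],      # toy FEX
                        hypo=lambda pt: pt > 10.0)])]   # toy HYPO

cache = {}
accepted = [c.name for c in menu if c.execute({"pt": 12.0}, cache)]
print(accepted)   # -> ['L2_mu10']
```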

Trigger Tool
- A tool for shifters, experts and offline users
- Offline users can easily check the configuration used in a run
- Trigger shifters can modify prescale and pass-through settings
- Experts can modify further aspects of the trigger configuration

Online Monitoring
- Trigger Presenter
  - Provides rate information and farm status
  - Displays detailed trigger rates (and their history) at any step of the HLT selection
- Algorithm online monitoring and Data Quality
  - Algorithms produce histograms for shifters and experts
  - Statistics from all nodes are gathered and centralized (see the sketch below)
  - Automatic checks are also performed and displayed

[Plots: automatic Data Quality checks; rate monitoring]
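A hedged illustration of the gathering step: per-node histograms with identical binning are summed centrally and a toy automatic check is applied. The names and the check logic are invented, not the actual ATLAS monitoring code.

```python
import numpy as np

def gather(node_histograms):
    """Sum identically binned histograms collected from all HLT nodes."""
    return np.sum(node_histograms, axis=0)

def auto_check(hist, min_entries=100):
    """Toy data-quality check: flag histograms with too few entries."""
    return "OK" if hist.sum() >= min_entries else "WARNING: low statistics"

nodes = [np.random.poisson(5.0, size=50) for _ in range(8)]  # 8 toy HLT nodes
print(auto_check(gather(nodes)))  # ~8*50*5 = 2000 entries -> "OK"
```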

Offline Monitoring
- Tier-0: designed to reconstruct all ATLAS events (200 Hz) within ~24 hours (1600 cores, 2 GB/core, CERN batch workers)
  - Allows review of the saved trigger quantities (used extensively) and comparison with offline-reconstructed objects
- CAF (CERN Analysis Facility): 400 cores, of which 64 are for trigger use
  - Designed to rerun ~10% of collected events for calibration and commissioning
  - Checks the HLT decision: runs on the minimum-bias stream and on events taken in pass-through mode to compare online and offline results (see the sketch below)
  - Handles the debug stream: events with HLT crashes, errors and timeouts
- Deployment of new code for the HLT farm: a separate patch branch of the trigger code with its own "nightlies", tested with real data
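Schematically, the online/offline cross-check on pass-through events could look like the sketch below; all field and function names are hypothetical.

```python
def compare_decisions(events, rerun_trigger):
    """Return IDs of events where the rerun (offline) HLT decision differs
    from the decision recorded online. Field names are illustrative."""
    return [evt["event_id"] for evt in events
            if rerun_trigger(evt) != evt["online_decision"]]

# Toy usage with a trivial "trigger" that accepts events above 10 GeV:
events = [{"event_id": 1, "pt": 12.0, "online_decision": True},
          {"event_id": 2, "pt": 8.0, "online_decision": True}]   # mismatch
print(compare_decisions(events, lambda e: e["pt"] > 10.0))       # -> [2]
```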

First Experience with LHC Beam
- Reliability and stability were the main goals for the first beam
  - A simple configuration based on the L1 decision only
  - Crucial to have the beam pickups (BPTX) and minimum-bias scintillators (MBTS) well timed in
- Protect the detectors
  - Pixel detector off
  - SemiConductor Tracker (SCT) at low bias voltage
  - Muon system at reduced high voltage
- HLT infrastructure used for tagging events and routing them to streams (see the sketch below)
  - Algorithms not run except when needed for streaming tasks
  - Fewer than 1k events on which the HLT could later be run offline
  - Only those with an RoI in the calorimeter or muon system in time with BPTX or MBTS

[Diagram: LHC beam instrumentation near ATLAS. Minimum-bias trigger scintillators: 32 sectors on the LAr cryostat; BPTX at 175 m; tertiary collimators at 140 m, producing beam-splash events when closed; LHC beam loss monitor]
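The L1-based tagging and streaming might be sketched as follows; the stream names and L1 item names here are illustrative only.

```python
def assign_streams(l1_items):
    """Route an event to output streams from its fired L1 items (toy map)."""
    mapping = {"L1_BPTX": "BPTX", "L1_MBTS": "MinBias", "L1_MU": "CosmicMuons"}
    return {stream for item, stream in mapping.items() if item in l1_items}

print(assign_streams({"L1_BPTX", "L1_MBTS"}))   # -> {'BPTX', 'MinBias'}
```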

Commissioning with Cosmics
- Real cosmic event rate: ~100 Hz below ground, of which ~O(15 Hz) cross the Inner Detector

[Plots: muon impact points extrapolated to the surface (x vs z, in cm), as measured by the muon trigger chambers (RPC), showing the ATLAS shafts (a calorimeter trigger is also available); simulated cosmic flux in the ATLAS cavern; a real cosmic event]

A Nice Cosmic Muon Through the Whole Detector

[Event display: a cosmic muon crossing the full detector]

Issues with Cosmic Event Data
- No beam clock
  - The muon trigger chambers provide the timing
  - Phase issues in the read-out/calibration of the precision muon chambers (MDT), transition radiation tracker (TRT), etc.
- No beam / no interaction point (IP)
  - Tracks are distributed over d0 and z0
  - The dedicated L2 algorithms for fast muon reconstruction (in the MDTs) and the fast tracking algorithms in the inner detector assume particles pointing towards the beam line
- Muons in the HLT
  - The r-z view could not be fully reconstructed at L2, because the algorithms are designed for pointing tracks and data access requests are made in trigger towers pointing to the IP
  - Possible to relax the pointing requirements to study rejection/efficiency
  - Timing issues cause a percent-level loss
- Tracking in the HLT
  - Significant modifications were needed to collect the tracks required for inner-detector alignment

[Diagram: RPC trigger setup in the muon spectrometer, with the pivot plane and the low-pT and high-pT confirm planes]

Commissioning with Cosmics
- A huge number of cosmic ray triggers were recorded, both in total and with tracks also in the smallest-volume detector, the Pixel detector [plots]
- Active use of the High Level Trigger to select tracks crossing the Pixel detector and classify those events into a special stream
- A good test of the trigger and analysis infrastructure

Cosmic Run: Use of a Physics Menu
- Despite the low expected statistics, a full physics menu ran in parallel with the cosmic chains: e/γ, jets/missing ET, τ, μ, minimum bias...
- RoIs with e/γ, τ, etc. signatures are not very common in cosmic events, so it is rarer for events to reach the end of the chains
- A few thousand events were collected
- Both L2 and EF algorithms were exercised successfully

[Plot: example from the e/γ FEX algorithms comparing L2 and EF, the shower shape in the 2nd EM sampling, R_η = E(3×7)/E(7×7); a worked example of this quantity is sketched below]
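A worked example of the shower-shape variable quoted above, R_η = E(3×7)/E(7×7): the ratio of energy in a 3×7 cell window to a 7×7 window (η × φ) of the second EM sampling, centred on the cluster. The cell grid below is synthetic.

```python
import numpy as np

def r_eta(cells):
    """cells: 7x7 array of cell energies (eta rows x phi columns),
    centred on the cluster maximum."""
    e7x7 = cells.sum()
    e3x7 = cells[2:5, :].sum()   # central 3 eta strips, all 7 phi columns
    return e3x7 / e7x7 if e7x7 > 0 else 0.0

# A compact electromagnetic shower gives R_eta close to 1.
cells = np.zeros((7, 7))
cells[3, 3] = 50.0               # shower core
cells[2:5, 2:5] += 2.0           # nearby leakage inside 3x7
cells[0, 3] = cells[6, 3] = 1.0  # small leakage outside the 3 eta strips
print(round(r_eta(cells), 3))    # -> 0.971
```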

L2: Calorimeter Algorithm
- Algorithm implemented in the calorimeter Read-Out Driver (ROD) DSP, with the result decoded at L2
- Poor efficiency (<< 1%) due to the lack of RoI pointing
- The expected back-to-back distribution is seen
- The energy deposition agrees with that of a minimum-ionizing particle (MIP)

[Plots: energy deposition and angular distributions of muon tracks in the Tile Calorimeter; ATLAS Preliminary, cosmic data and cosmic Monte Carlo]

Muon Event Filter
- The muon EF algorithm was exercised on cosmic data with both the solenoidal and toroidal fields on
- Angular resolutions (η, φ) with respect to offline: σ_η = 0.007, σ_φ = 17 mrad
- Tails in the distributions are consequences of different calibration constants and of the RoI-based strategy in the EF algorithm

[Plots: η and φ resolutions with respect to offline reconstruction]

L2 ID Tracking
Three L2 tracking algorithms:
- SiTrack: combinatorial search for track seeds in the innermost Si layers and their extension into tracks in the outer Si layers. Si algorithm with TRT extension.
- IDSCAN: uses histogramming techniques to find the z-position of the IP and to identify tracks originating from there (see the sketch below). Si algorithm with TRT extension.
- TRTSegFinder: TRT-only algorithm looking for segments in the TRT.

Goal: record as many ID tracks as possible, introduce no selection biases, and keep the rate at acceptable levels.
Secondary goal: to the extent possible, use the machinery, setup, algorithms, etc. that will be used for collisions.
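A hedged sketch of the histogramming idea behind IDSCAN: hit pairs are extrapolated to the beam line, the z intercepts are histogrammed, and the peak gives the z of the IP. This toy version uses straight lines in the r-z plane and invented numbers; the real algorithm is considerably more elaborate.

```python
import numpy as np

def z_vertex(hits, bins=200, zrange=(-200.0, 200.0)):
    """hits: list of (r, z) space points in mm; returns estimated z of IP."""
    z_intercepts = []
    for (r1, z1) in hits:
        for (r2, z2) in hits:
            if r2 <= r1:
                continue
            # Straight-line extrapolation of the hit pair to r = 0.
            z_intercepts.append(z1 - r1 * (z2 - z1) / (r2 - r1))
    counts, edges = np.histogram(z_intercepts, bins=bins, range=zrange)
    peak = np.argmax(counts)   # hit pairs from real tracks pile up at the IP
    return 0.5 * (edges[peak] + edges[peak + 1])

# Toy event: three tracks from z = 25 mm crossing Si layers at r = 50, 90, 120 mm.
rng = np.random.default_rng(1)
hits = [(r, 25.0 + r * slope + rng.normal(0.0, 0.1))
        for slope in (-0.4, 0.1, 0.6) for r in (50.0, 90.0, 120.0)]
print(round(z_vertex(hits), 1))   # close to 25.0
```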

L2 ID Tracking: Performance
- Trigger chain starting from all L1-accepted events, requiring an OR of any L2 tracking algorithm finding tracks
- Allowed collection of a good fraction of the cosmic muons passing through the inner detector, with no significant biases

Performance:
- Uniform event efficiency of >99% for "golden Si" tracks
- Fake rates of 0.01%-1%
- The algorithms are complementary

Rerun one month later: the HLT tracking worked out of the box, despite some changes in the detector configuration!

Summary
- The ATLAS HLT has been fully tested under actual data-taking conditions
  - Algorithms for L2 and EF
  - Configuration
  - Steering
  - Monitoring
- The HLT actively contributed to data taking
  - HLT infrastructure used for streaming in single-beam and cosmic ray operation
  - Vital use of L2 tracking to collect cosmic tracks for inner detector alignment
- HLT commissioning is progressing well, with ongoing work in several areas to be ready for LHC operation
  - Monitoring improvements
  - Speeding up the boot-up and configure transitions
  - Continued performance measurements with cosmic running

L2 Calo: HLT Feedback to the Detector
- Hot cells in one η region are seen both by the HLT monitoring and by the detector monitoring (plots normalized to the bin counts)
- Cross-checking is possible: the calorimeter trigger is functional and may help identify hot detector regions
- The hardware issues were addressed during the shutdown

[Plots: ATLAS preliminary; detector online monitoring (energy in MeV) compared with L2 Calo monitoring]