Commissioning of the ATLAS High Level Trigger


Commissioning of the ATLAS High Level Trigger John Baines

Overview of Talk
LHC parameters; the ATLAS trigger; UK & RAL involvement; commissioning: system tests, single beam, cosmics; successes & lessons learned; commissioning in 2009; summary.
Material taken from conference talks by: S. Farrington, C. Padilla, R. Hauser, F. Winklmeier, W. Wiedenmann, R. Goncalo, A. Ventura

The ATLAS Detector

LHC Parameters
Parameters at full luminosity (L = 10^34 cm^-2 s^-1):
- Bunch crossing interval: 25 ns (40 MHz)
- No. of overlapping events per crossing: ~23 => interaction rate ~1 GHz
- Average no. of particles: ~1400
- About 10^8 channels to read out => event size ~1.5 MByte (larger during special runs: >15 MByte)
- Tier0/Reconstruction/Grid/Storage: output limit about 200 Hz / 300 MByte/s
Example signal & background rates: 100 GeV Higgs ~0.1 Hz; SUSY <1 Hz; W ~500 kHz; Z ~80 kHz; background: inelastic ~1 GHz, jets >1 kHz

Trigger Architecture
Proton-proton collisions at 40 MHz bunch-crossing rate; Level-1 input from the Calorimeters and Muon system (or TRT fast-OR).
Level 1 (40 MHz → 75 kHz, latency 2.5 μs): hardware trigger (FPGA, ASIC); identifies Regions of Interest (RoIs) for the HLT.
Level 2 (75 kHz → 2-3 kHz, ~40 ms per event on a 2 GHz CPU): software trigger on commodity PCs; seeded by L1 RoIs; full detector granularity; requests data in the RoI from the Read Out Buffers.
Event Filter (2-3 kHz → 200 Hz, ~4 s per event on a 2 GHz CPU): software trigger on commodity PCs; seeded by L1 & L2; has access to the entire event.
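As a quick cross-check of these numbers, the per-level rejection factors and the output bandwidth can be recomputed from the rates quoted above (a minimal back-of-the-envelope sketch in Python; the rates are the approximate figures from the slide):

```python
# Back-of-the-envelope check of the trigger rate budget quoted above
# (per-level figures are the approximate numbers from the slide).
bunch_crossing_rate = 40e6   # Hz, LHC bunch crossing rate
l1_output_rate      = 75e3   # Hz, Level-1 accept rate
l2_output_rate      = 3e3    # Hz, Level-2 output (2-3 kHz)
ef_output_rate      = 200.0  # Hz, Event Filter output to storage
event_size_mb       = 1.5    # MByte per event

print("L1 rejection: x%.0f" % (bunch_crossing_rate / l1_output_rate))
print("L2 rejection: x%.0f" % (l1_output_rate / l2_output_rate))
print("EF rejection: x%.0f" % (l2_output_rate / ef_output_rate))
print("Overall rejection: x%.0f" % (bunch_crossing_rate / ef_output_rate))
print("Output bandwidth: %.0f MByte/s" % (ef_output_rate * event_size_mb))
```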

ATLAS Trigger & DataFlow (schematic; Level-2 latency ~40 ms, Event Filter ~4 s)

ATLAS UK HLT
Institutes: Manchester, Oxford, Royal Holloway, RAL, UCL
RAL: Fred Wickens, Monika Wielers, Dmitry Emeliyanov, Julie Kirk, Bill Scott, John Baines; student: Rudi Apolle
Activities: Trigger Selection Software, Inner Detector Trigger, Electron/photon Trigger, B-Physics Trigger, Trigger Release Coordination, Trigger Validation, Trigger Hardware & Farms

Level-1
Three sub-systems: L1 Calorimeter, L1 Muon, Central Trigger Processor (CTP).
Signature identification: e/γ, τ/h, jets, μ; multiplicities per pT threshold; isolation criteria; missing ET, total ET, jet ET.
CTP: receives and synchronizes the trigger information, generates the Level-1 trigger decision (L1A), delivers the L1A to the other sub-detectors, and sends the Regions of Interest to the Level-2 trigger.

The HLT Farm: ultimately 2300 processors (L2 + EF); now ~1600 processors.

Multi-core processors
Resource requirements are multiplied by the number of process instances per node:
- Memory: ~1-1.5 GByte per application
- File descriptors and network sockets
- Number of controlled applications: ~7k at present, ~20k in the final system
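To make the scaling concrete, a rough estimate of the per-node memory demand when one trigger application runs per core is shown below (a sketch; the 8-core node is an assumed example, not a number from the slide):

```python
# Rough estimate of per-node memory demand when one HLT application
# (trigger process) runs per core, using the figures quoted above.
cores_per_node = 8            # assumed multi-core node, for illustration only
mem_per_app_gb = (1.0, 1.5)   # GByte per application, from the slide

low, high = (cores_per_node * m for m in mem_per_app_gb)
print("Memory needed per node: %.0f-%.0f GByte" % (low, high))
```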

HLT Framework
Level-2: the HLT selection software runs in the Level-2 Processing Unit (L2PU); selection algorithms run in a worker thread.
Event Filter (3 kHz → 200 Hz): independent Processing Tasks (PT) run the selection software on Event Filter (EF) farm nodes.
The HLT Event Selection Software is based on the ATLAS Athena offline framework. The HLT framework interfaces the HLT event selection algorithms to the online system: it is driven by the run control and data flow software, the event loop is managed by the data flow software, and HLT algorithms can run unchanged in the trigger and offline environments.

HLT Selection Software
LVL2: reduces the rate from up to 75 kHz to 2-3 kHz in ~40 ms on average, using custom algorithms with some offline components.
EF: reduces the rate from 2-3 kHz to 200-300 Hz in ~4 s on average, running offline algorithms from HLT-specific wrappers.
HLT: processing in a Region of Interest, so only a few % of the event is processed; at LVL2, the data for that few % of the event are requested over the network.
Early rejection: stepwise processing to minimize the execution time spent on rejected events.

RoI-based, stepwise processing: e/γ example
A Level-1 Region of Interest (EM RoI) is found and its position in the EM calorimeter is passed to Level 2. Event rejection is possible at each step:
- L2 calorimeter: electromagnetic clusters ("cluster?")
- L2 tracking: Level 2 is seeded by Level 1, using fast reconstruction algorithms within the RoI ("track?", "match?")
- EF calorimeter and EF tracking: the Event Filter is seeded by Level 2, using offline reconstruction algorithms with refined alignment and calibration ("track?")
- e/γ reconstruction: final hypothesis ("e/γ OK?")
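The early-rejection logic sketched above can be illustrated with a toy chain (the step names and the RoI dictionary are hypothetical placeholders, not the real ATLAS steering interfaces): each step asks one question, and the chain stops at the first step that fails, so rejected events never pay for the later, more expensive steps.

```python
# Minimal sketch of RoI-based stepwise processing with early rejection.
# Step names and the RoI structure are hypothetical placeholders.

def l2_calo(roi):       return roi.get("em_cluster_et", 0.0) > 10.0   # "cluster?"
def l2_track(roi):      return roi.get("has_l2_track", False)         # "track?"
def l2_match(roi):      return roi.get("cluster_track_match", False)  # "match?"
def ef_calo(roi):       return roi.get("ef_cluster_et", 0.0) > 10.0
def ef_track(roi):      return roi.get("has_ef_track", False)
def eg_hypothesis(roi): return roi.get("eg_quality_ok", False)        # "e/gamma OK?"

STEPS = [l2_calo, l2_track, l2_match, ef_calo, ef_track, eg_hypothesis]

def run_chain(roi):
    """Run the chain step by step; reject the RoI at the first failed step."""
    for step in STEPS:
        if not step(roi):
            return False      # early rejection: later steps never run
    return True               # all steps passed -> chain accepts

# Example: an RoI failing the L2 cluster-track match is rejected before
# any Event Filter processing is done.
roi = {"em_cluster_et": 14.0, "has_l2_track": True, "cluster_track_match": False}
print(run_chain(roi))         # -> False
```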

Trigger Menus
The Trigger Menu defines chains of processing steps starting from a LVL1 RoI. The menu is specified in terms of signatures, e.g. mu6, e10, 2j40_xe30, etc. Chains can be prescaled at Level-1 or in the HLT. Signatures are assigned to inclusive data streams: egamma, jetTauEtmiss, muons, minbias, LAr and express. (Slide shows an example of electron signatures.)
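A menu of this kind can be pictured as a table of chains, each with a Level-1 seed, a prescale and a target stream. The sketch below is a toy representation (the field names, the L1 seed names and the prescale bookkeeping are illustrative, not the real menu configuration format); a prescale of N keeps only one in N otherwise-accepted events.

```python
# Toy representation of a trigger menu: each chain has an L1 seed,
# a prescale, and a target stream.  All field and seed names are
# illustrative placeholders, not the real menu configuration.
import itertools

MENU = [
    {"name": "e10",       "l1_seed": "L1_EM7",       "prescale": 1,  "stream": "egamma"},
    {"name": "mu6",       "l1_seed": "L1_MU6",       "prescale": 10, "stream": "muons"},
    {"name": "2j40_xe30", "l1_seed": "L1_2J20_XE20", "prescale": 1,  "stream": "jetTauEtmiss"},
]

_counters = {chain["name"]: itertools.count() for chain in MENU}

def accept(chain, passed_selection):
    """Apply the chain prescale: keep 1 in N of the events that pass."""
    if not passed_selection:
        return False
    return next(_counters[chain["name"]]) % chain["prescale"] == 0

# Example: the mu6 chain with prescale 10 keeps every 10th passing event.
kept = sum(accept(MENU[1], True) for _ in range(100))
print(kept)   # -> 10
```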

B-physics Triggers

Trigger Rates & Streams

Commissioning
System tests with simulated and previously recorded cosmic data: the data are downloaded to the Read Out Buffers, so the system can be tested with collision events and exercised at the maximum LVL1 rate.
Cosmic tests: individual detectors ("slice weeks") and combined runs => expose the algorithms to real detector noise, data errors, etc.
Beam: single beam, then collisions.

System Tests with simulated data

Single Beam (10:19, 10/9/2008)
Single-beam configuration: injection-energy protons circulating in the LHC. On collision with a collimator, a spray of particles entered the detector. (Slide shows online and offline event displays.)

Level-1 Commissioning in Single Beam
Each trigger component needs to be synchronised with the beam pick-up. (Plots show trigger timing relative to the beam pick-up in units of bunch crossings.)

Commissioning with Cosmics

Cosmic Event

Differences in Cosmic vs. Beam running
No beam clock: the muon trigger chambers provide the timing; phase issues in the read-out of the TRT (straw detector) and the Muon Drift Chambers.
No beam / no IP: tracks are distributed over d0 and z0, while the dedicated L2 algorithms for fast muon reconstruction (in the MDTs) and fast tracking in the inner detector are optimized for trajectories pointing towards the beam line.
Muons in the HLT: the r-z view could not be fully reconstructed at L2, because the algorithms are designed for pointing tracks and data access requests are in trigger towers pointing to the IP. It is possible to relax the pointing requirements to study rejection/efficiency; timing issues cause a percent-level loss.
Tracking: Level-2 algorithms are optimized for tracks from the Interaction Point.

Calorimeter in e/γ & τ Triggers: study of the performance of the clustering algorithm in the tau trigger.

e/γ: example plot from the e/γ FEX algorithms comparing L2 and EF: shower shape in the 2nd EM sampling, Rη = E(3×7)/E(7×7).

Muon Trigger (plots: fitted widths σ = 17 mrad and σ = 0.007)

Muons in the Tile Calorimeter: Δφ between the tile cluster and the ID track.

Commissioning the InDet trigger
We want to commission the LVL2 collision algorithms with cosmics, but the speed optimisation of the Level-2 algorithms means they are inefficient for tracks more than ~5 mm from the nominal beam position. Three strategies:
1. Use only the small fraction of events that pass close to the I.P.
2. Loosen the cuts in the pattern recognition (not possible for all algorithms).
3. Shift the space points (a sketch of this shift follows the next slide).

Commissioning Level-2 tracking
Add an initial step that applies a shift to all the space points, so that the track appears to come from the Interaction Point.
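A minimal sketch of this point-shifting idea is shown below (illustrative only, not the actual Level-2 code): fit a straight line to the transverse-plane space points, find its point of closest approach to the nominal beam line, and translate every point by that offset so the downstream, IP-optimised pattern recognition sees a pointing track.

```python
# Sketch: translate cosmic-track space points so the track extrapolates
# through the nominal beam line (x = y = 0).  Illustration of the
# "shift points" strategy, not the actual ATLAS L2 implementation.
import numpy as np

def shift_points_to_ip(points_xy):
    """points_xy: (N, 2) array of transverse-plane space points on one track.
    Returns the shifted points and the applied offset."""
    pts = np.asarray(points_xy, dtype=float)

    # Straight-line fit in the transverse plane: the track direction is the
    # leading principal component of the point cloud.
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    direction = vt[0]                                   # unit vector along the track

    # Point of closest approach of the fitted line to the origin (beam line):
    # remove from the centroid its component along the track direction.
    pca = centroid - np.dot(centroid, direction) * direction

    # Translating by -pca makes the line pass through the origin,
    # i.e. the track now "points" at the nominal interaction point.
    return pts - pca, -pca

# Example: a straight cosmic track offset by ~55 mm from the beam line.
track = np.array([[100.0, 60.0], [200.0, 65.0], [300.0, 70.0], [400.0, 75.0]])
shifted, offset = shift_points_to_ip(track)
print(offset)   # transverse shift applied to every point
```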

Level-2 ID Efficiency w.r.t. Tracks reconstructed offline
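The efficiency on this slide is defined with respect to offline tracks. A common way to compute such an efficiency (a sketch, not the actual analysis code) is to count the fraction of offline reference tracks that have a Level-2 track within a small ΔR:

```python
# Sketch of a trigger-efficiency calculation relative to offline tracks:
# an offline track counts as "found" if an L2 track lies within dr_max.
import math

def delta_r(t1, t2):
    """t1, t2 are (eta, phi) tuples."""
    deta = t1[0] - t2[0]
    dphi = abs(t1[1] - t2[1])
    if dphi > math.pi:
        dphi = 2.0 * math.pi - dphi
    return math.hypot(deta, dphi)

def l2_efficiency(offline_tracks, l2_tracks, dr_max=0.1):
    if not offline_tracks:
        return None
    matched = sum(
        1 for ref in offline_tracks
        if any(delta_r(ref, l2) < dr_max for l2 in l2_tracks)
    )
    return matched / len(offline_tracks)

# Example with toy (eta, phi) values:
offline = [(0.5, 1.2), (-1.1, 2.8), (0.0, -0.4)]
l2      = [(0.52, 1.19), (0.01, -0.41)]
print(l2_efficiency(offline, l2))   # -> 0.666...
```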

Cosmics for ID alignment: the HLT is used to select events passing through the ID, which are sent to the IDCosmic stream and used for offline alignment.

Commissioning with Cosmics: 216 million events, 453 TB of data, 400k files, several streams.

Data Streaming

Online Handling of Time-Out Events
Time-out events go to the DEBUG stream. These events are re-processed and streamed as if they had been processed online; the only difference is the file name. The files are registered in the corresponding offline DB and processed normally, producing ESD, AOD, etc., but they are kept separate and carry the "recovered" tag.
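The recovery path can be pictured as a small routing rule (a toy sketch; only the DEBUG stream and the "recovered" tag come from the slide, everything else is an illustrative stand-in): events that time out online are parked in DEBUG, then re-run through the selection without a time-out and written out with the recovered tag in the file name.

```python
# Toy sketch of the debug-stream recovery flow described above.
# The Result class and run_hlt() stand-in are illustrative only.
from collections import namedtuple

Result = namedtuple("Result", ["stream"])

def run_hlt(event, timeout_s):
    """Stand-in for the HLT selection: may raise TimeoutError online."""
    if timeout_s is not None and event.get("processing_time_s", 0) > timeout_s:
        raise TimeoutError
    return Result(stream=event.get("stream", "egamma"))

def route_online(event):
    """Online: events that time out in the HLT go to the DEBUG stream."""
    try:
        return run_hlt(event, timeout_s=5.0).stream
    except TimeoutError:
        return "DEBUG"

def recover_debug_event(event):
    """Re-process offline, stream as if online, tag the file as recovered."""
    result = run_hlt(event, timeout_s=None)   # no time-out limit offline
    return result.stream, "%s.recovered.data" % result.stream

slow_event = {"stream": "muons", "processing_time_s": 12.0}
print(route_online(slow_event))          # -> "DEBUG"
print(recover_debug_event(slow_event))   # -> ("muons", "muons.recovered.data")
```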

Successes & Lessons Learnt
Some highlights:
- Trigger ready for first beam: single-beam events triggered with LVL1, HLT streaming based on Level-1, HLT run offline on the CERN Analysis Farm.
- Trigger, including the HLT algorithms, exercised in cosmic running: ~2 months of running, 220 million events, including long runs of >2M events.
- Successfully streamed events, including the IDCosmic stream used for alignment.
- Exercised processing of events from the Debug stream.
- Exercised procedures for evaluating new menus and code fixes on the CAF prior to online deployment.
- Successfully exercised release management in data-taking conditions: deployed patch releases for P1 and the HLT.

Successes & Lessons Learned
Improvements for 2009 running:
- The ability to change LVL1 prescales during a run was invaluable => infrastructure put in place so that HLT prescales can also be updated during a run.
- A change of magnetic field required a menu change => algorithms are now able to configure the magnetic field automatically based on the magnet current.
- Problems calculating online Level-2 & EF trigger rates: the old system was too susceptible to problems collecting information from the farm nodes => improvements in the rate calculation and in the collection of information from the nodes.
- Removal of detectors from the readout caused errors in the HLT => events in the debug stream => allow algorithms to access a mask saying which detectors are in the run and modify the error response accordingly.
- Problems with noisy detectors => consolidate procedures for making noisy-detector masks available online.
- Improve monitoring, especially detector & trigger information displayed side-by-side.

Plans for 2009/10: luminosity ~2×10^32 cm^-2 s^-1; integrated luminosity ~200 pb^-1.

From Cosmics to Collisions
- Cosmics with combined L1 muon triggers
- First beam menu: cosmics + beam pick-up trigger; bunch groups commissioned (requires clock commissioning)
- High Level Trigger performs streaming; HLT algorithms run offline
- Add the HLT one piece at a time in tagging mode
- Switch on HLT rejection after the algorithms are validated online
- Full 10^31 menu

Conclusion
The trigger was successfully commissioned in single-beam and cosmic running in Autumn 2008. The data have been analysed to validate the trigger operation, and improvements have been made in the light of experience from these runs. Eagerly awaiting collisions!!

Backup Slide

High Level Trigger

Level 1 Cosmic Rates