1
Commissioning of the ATLAS High Level Trigger with Single Beam and Cosmic Rays
Alessandro Di Mattia, Michigan State University
On behalf of the ATLAS Collaboration
CHEP '09, March 21-27, 2009
2
Outline
Introduction to the LHC and the ATLAS Trigger/DAQ
The Event Selection Software
The Trigger Configuration
Online and Offline Monitoring
Experience with beam and cosmic rays
Main HLT achievements
3
The LHC challenge to ATLAS Trigger/DAQ
LHC: proton-proton collisions at E_CM = 14 TeV, L = 10^34 cm^-2 s^-1; 23 collisions per bunch crossing at 25 ns intervals. One year at L = 10^34 cm^-2 s^-1 gives ∫L dt ≈ 100 fb^-1.
Challenge to the ATLAS Trigger/DAQ:
– interaction rate ~10^9 Hz, while offline computing can handle O(10^2) Hz → an online rejection factor of O(10^7) is required;
– cross sections of physics processes vary over many orders of magnitude: inelastic 10^9 Hz; W → lν 10^2 Hz; tt̄ production 10 Hz; Higgs (100 GeV) 0.1 Hz; Higgs (600 GeV) 10^-2 Hz;
– ATLAS has O(10^8) read-out channels → average event size ~1.6 MB.
Figure: the ATLAS detector. Muon System barrel: trigger chambers (RPC), precision chambers (MDT). Muon System endcap: trigger chambers (TGC), precision chambers (MDT, CSC). Inner Detector: Transition Radiation Tracker (TRT), Silicon Detector (SCT), Pixel.
4
ATLAS: the Trigger/DAQ system
Level-1 (hardware, FPGA/ASIC), latency 2.5 μs, accept rate 75 kHz:
– analyzes coarse-granularity data from the CALO and MUON detectors;
– identifies the Regions of Interest (RoI) that seed Level-2.
Level-2 (software based), ~40 ms per event on ~500 farm nodes, accept rate ~3 kHz:
– accesses full-granularity data within the RoI (~2% of the total event size);
– uses algorithms optimized for fast rejection.
Event Filter (software based), ~4 s per event on ~1600 farm nodes, accept rate ~200 Hz:
– uses offline algorithms;
– has potential access to the full event;
– exploits the seed from Level-2.
Dataflow: 40 MHz bunch-crossing rate → L1 accept at 75 kHz → L2 accept at ~3 kHz → EF accept at ~200 Hz. Typical event size ~1.6 MB, up to 14 MB for CALO calibration.
Figure: TDAQ architecture (CTP, RoI Builder, L2 supervisor, Read Out System with ROBs, Event Builder with DFM and ~100 SFI farm nodes, SFO), with throughputs of 1 PB/s off the detector, 120 GB/s into the Read Out System, 3 GB/s at the Event Builder and 320 MB/s to mass storage; a back-of-the-envelope check of these figures is sketched below.
Details given by T. Pauly - The ATLAS Level-1 Central Trigger System in Operation
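A quick sanity check of the quoted throughput figures, as a minimal sketch: it multiplies the nominal accept rates by the average event size, using decimal units (1 GB = 1000 MB). The numbers come from the slide; the variable names are illustrative.

```python
# Back-of-the-envelope check of the TDAQ dataflow figures quoted above
# (nominal design numbers; decimal units, 1 GB = 1000 MB).

EVENT_SIZE_MB = 1.6     # average event size
L1_RATE_HZ = 75_000     # Level-1 accept rate
L2_RATE_HZ = 3_000      # Level-2 accept rate (approximate)
EF_RATE_HZ = 200        # Event Filter accept rate

ros_input_gb_s = L1_RATE_HZ * EVENT_SIZE_MB / 1000  # 120 GB/s into the ROS
builder_gb_s = L2_RATE_HZ * EVENT_SIZE_MB / 1000    # ~4.8 GB/s, quoted as ~3 GB/s
storage_mb_s = EF_RATE_HZ * EVENT_SIZE_MB           # 320 MB/s to mass storage

print(f"ROS input:     {ros_input_gb_s:.0f} GB/s")
print(f"Event Builder: {builder_gb_s:.1f} GB/s")
print(f"Storage:       {storage_mb_s:.0f} MB/s")
```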
5
ATLAS Trigger/DAQ: the resources
DAQ (ROS, EB, SFO): 100% of the final system available
– delivered twice the design event throughput with 5 SFOs.
Trigger farms (L2P, EFP): 35% of the final system available
– ~850 nodes in 27 racks;
– 8 cores per node (2 × Harpertown quad-core @ 2.5 GHz), 2 GB of memory per core;
– sufficient for the early data-taking period.
Homemade resource monitoring based on Nagios.
Flexible resource assignment to DAQ/HLT: the configuration allows the workload to be redistributed among Level-2 / Dataflow / Event Filter within a day, to cope with unexpected increases in data throughput (calo calibration runs, bad detector conditions, etc.).
Reconfiguration successfully exercised in the 2008 runs!
Details given by A. Zaytsev - System Administration of ATLAS TDAQ Computing Environment and by R. Sjoen - Monitoring Individual Traffic Flows in the ATLAS TDAQ Network
Details of the Dataflow given by W. Vandelli - ATLAS DataFlow Infrastructure: recent results from ATLAS cosmic and first-beam data-taking
6
The HLT Selection Software
Performance and functionality tested in technical runs and combined detector runs.
Figure: the HLT Selection Software sits in a common Level-2/EF framework (ATHENA/GAUDI) that reuses offline components, with offline algorithms used in the EF, on top of the HLT Data Flow Software.
Details given by W. Wiedenmann - The ATLAS Online High Level Trigger Framework: Experience reusing Offline Software Components in the ATLAS Trigger
7
The HLT Steering
The HLT steering manages the execution of the selection code (a toy sketch of the execution model is given below):
– algorithms are configurable by parameters;
– applies early rejection: the full chain is aborted as soon as a selection step fails;
– applies prescale and passthrough factors;
– caches the full history of TEs and FEX results and writes them into the HLT result:
  allows navigation through the steps of the trigger decision;
  avoids multiple executions of the same feature extraction;
  allows offline re-runs of the trigger selection with different Hypo cuts.
A collection of Chains implements the trigger menu
– written in Python or XML, recorded in the Trigger Configuration Database.
The steering was used to select events in the 2008 cosmic data taking.
Figure: an L2_MU20 chain for the Level-2 muon selection, seeded by a Muon RoI Trigger Element from LVL1 and leading to the EF selection. FEX algorithms (Muon Tracking, Inner Detector tracking) extract features of physics objects from the data (Muon Feature, Combined Feature); Hypo algorithms apply selections on the FEX results, confirming the Trigger Elements (TE) that mark the atomic selection steps (Sequences).
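As a rough illustration of that execution model — prescales, passthrough, early rejection and FEX caching — here is a toy Python sketch. All class, function and chain names are invented for illustration and do not reflect the actual ATLAS steering API.

```python
import random

class Chain:
    """Toy model of an HLT steering chain: an ordered list of
    (fex, hypo) callables plus prescale and passthrough settings."""
    def __init__(self, name, steps, prescale=1, passthrough=0.0):
        self.name = name
        self.steps = steps              # [(fex_fn, hypo_fn), ...]
        self.prescale = prescale        # examine only 1 event in N
        self.passthrough = passthrough  # fraction kept regardless of hypos

def run_chain(chain, roi, fex_cache):
    """Run one chain on a seeding RoI; fex_cache is shared per event."""
    if random.randrange(chain.prescale) != 0:     # prescale
        return False
    forced = random.random() < chain.passthrough  # passthrough decision
    te, accepted = roi, True
    for fex, hypo in chain.steps:
        # Cache FEX results so the same feature extraction never runs
        # twice, even when several chains share a sequence.
        key = (fex.__name__, id(te))
        if key not in fex_cache:
            fex_cache[key] = fex(te)
        feature = fex_cache[key]
        if not hypo(feature):
            accepted = False
            if not forced:   # early rejection: abort on the first failure
                return False
        te = feature         # the confirmed TE seeds the next step
    return accepted or forced

# Hypothetical usage:
# mu20 = Chain("L2_MU20", [(muon_tracking, pt_above_20)], prescale=1)
```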
8
Trigger Configuration
The trigger configuration comprises:
– the active trigger Chains, algorithm parameters, prescale factors and passthrough fractions.
Relational database (TriggerDB) with no duplication of objects:
– four database keys: LVL1 & HLT menu, L1 prescales, HLT prescales, bunch group;
– user interface (TriggerTool);
– menus can be read and written in XML format;
– menu consistency checks.
2-month cosmic commissioning: over 3k chains, 6k components (algorithms, tools, services), 5k parameters [counting all versions of all objects].
After a run, the trigger configuration becomes conditions data.
Run control can change the complete menu at any run stop/start, and prescales/passthroughs at any luminosity-block boundary.
A database proxy mechanism is in place to avoid direct connections from every application.
The database was exploited in the 2008 data taking, both online and offline. A sketch of the key-based lookup follows.
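A minimal sketch of what the key-based configuration lookup amounts to, assuming an invented in-memory layout; the real TriggerDB is a relational database with its own schema, accessed through the TriggerTool.

```python
# Illustrative in-memory model of the key-based configuration lookup
# (names invented; the real TriggerDB is a relational database).

class TriggerDB:
    def __init__(self):
        self.menus = {}          # menu key -> {chain name: [component ids]}
        self.components = {}     # component id -> (algorithm, parameters), stored once
        self.l1_prescales = {}   # L1 prescale key -> {item: prescale}
        self.hlt_prescales = {}  # HLT prescale key -> {chain: (prescale, passthrough)}

    def configure(self, menu_key, l1_psk, hlt_psk):
        """A run is fully specified by its configuration keys; components
        are shared between menu versions rather than duplicated."""
        chains = {name: [self.components[c] for c in comps]
                  for name, comps in self.menus[menu_key].items()}
        return chains, self.l1_prescales[l1_psk], self.hlt_prescales[hlt_psk]
```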
9
The Online Monitoring
Trigger Presenter (TriP):
– provides rate information and farm status by accessing the online Information System;
– displays instantaneous trigger rates per selection chain / level, and history plots;
– allows fast reaction to unexpected beam/detector conditions.
Algorithm online monitoring:
– algorithms produce histograms for shifters and experts;
– statistics from the Processing Units are collected by the Online Histogram Presenter (OHP);
– automatic checks are performed on a subset of histograms for data-quality assessment (a sketch follows this list).
Exercised by the shifter crews; also provided feedback to the detector communities during the early phases of the 2008 data taking.
Figure (OHP tool): distance between the LVL1 track and the muon hits in the precision chambers selected at Level-2; allows checking the time synchronization between the muon trigger and the muon precision detectors.
Details given by A. Corso Radu, Y. Ilchenko, P. Renkel, A. Dotti
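A minimal sketch of the kind of automatic check run on such a histogram, applied here to the LVL1-track-to-MDT-hit residuals mentioned above; the function name and thresholds are invented for illustration.

```python
# Minimal sketch of an automatic data-quality check on a monitoring
# histogram (function name and thresholds invented for illustration).
import numpy as np

def check_muon_residuals(residuals_mm, max_mean=5.0, max_rms=30.0):
    """Flag a timing or alignment problem if the distribution of distances
    between the LVL1 track and the MDT hits drifts or broadens."""
    mean = float(np.mean(residuals_mm))
    rms = float(np.std(residuals_mm))
    status = "OK" if abs(mean) < max_mean and rms < max_rms else "ALARM"
    return status, mean, rms
```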
10
The Offline Monitoring
Tier-0 [1600 cores, 2 GB/core, CERN batch workers]:
– designed to reconstruct all events from ATLAS (~200 Hz) within 1 day;
– allows review of saved trigger quantities (used extensively) and comparison with offline (some tools still in development).
CAF (CERN Analysis Facility) [400 cores, 64 for trigger]:
– designed to rerun ~10% of the collected events for calibration and commissioning;
– deployment of new code to the HLT farm: a separate patch branch of the trigger code with its own nightlies, tested with real data at the CAF;
– checks the HLT decision: runs on the minimum-bias stream and on events taken in passthrough mode, with deep monitoring of the algorithm functionality (see the sketch below);
– handles the debug stream: events with HLT errors and timeouts (~2.5% of the total events collected in the Sept-Oct run, but below 0.1% from October onwards; expected to be much lower for real collisions).
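A hedged sketch of what checking the HLT decision boils down to: re-run the selection offline on passthrough or minimum-bias events and flag any event whose re-run decision differs from the one recorded online. All names here are hypothetical.

```python
def compare_decisions(events, online_bits, rerun_trigger):
    """Hypothetical helper: re-run the trigger (same menu) on events kept
    without any HLT cut and list those where the offline re-run decision
    differs from the decision recorded online."""
    mismatches = []
    for event, online in zip(events, online_bits):
        if rerun_trigger(event) != online:
            mismatches.append(event)
    return mismatches
```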
11
First experience with LHC beam
Stability and reliability were the priority for the first beam: a simple trigger configuration (LVL1 decision only), with the LVL1 triggers (BPTX & MBTS) well timed-in.
The HLT was used only for tagging events and routing them to data streams.
The HLT was re-run offline on events having Muon and Calo RoIs in time with BPTX or MBTS:
– low statistics available (fewer than 1k events), due to the short operation and non-pointing tracks.
Operating conditions:
– Pixel off;
– Muon System and Silicon Detector at reduced HV;
– other detectors on.
Figure: beam instrumentation around ATLAS — Minimum Bias Trigger Scintillators (32 sectors on the LAr cryostat), BPTX beam pick-ups at 175 m, tertiary collimators at 140 m (beam-splash events when closed), LHC beam-loss monitor.
Details given by C. Ohm - The ATLAS beam pick-up based timing system and by T. Pauly - The ATLAS Level-1 Central Trigger System in Operation
12
Cosmic running and events collected
The HLT provided streaming and event selection for the detectors: track selection for the ID and Muon systems, needed for alignment and calibration. Fast turnaround was exercised to accommodate detector requirements.
Few cosmic runs before Sept. 2008; the first time the Pixel Detector was included in a global run. Events were mostly selected by the LVL1 Muon trigger.
13
LVL1 Muon algorithm
Searches for TGC (endcap) or RPC (barrel) hit patterns compatible with tracks coming from the Interaction Point, using coincidence windows.
Figure: barrel (RPC, MDT) and endcap (TGC, MDT) trigger layouts.
14
Issues with cosmic event data
No beam clock: timing provided by the trigger.
Muon Trigger Chambers:
– phase issues in the read-out/calibration of the trigger chambers and precision muon chambers (MDT), transition radiation tracker (TRT), etc.
Muon algorithms:
– the r-z view could not be fully reconstructed at L2, because the algorithms are designed for pointing tracks and data access happens in trigger towers pointing to the Interaction Point;
– possible to relax the pointing requirement to study efficiency / rejection.
Inner Detector tracking:
– significant modifications were needed to get the tracks required for Inner Detector alignment;
– no tracks from the Interaction Point: the selected tracks are distributed over d0, z0;
– track selection unbiased in the r-z view for most of the runs.
Figure: RPC trigger setup in the Muon Spectrometer (pivot plane, low-pT and high-pT confirm planes).
15
Cosmic run: use of the physics menu
Despite the low expected statistics, a full physics menu ran in parallel to the cosmic chains: eγ, jets/missing ET, τ, μ, minimum bias, ...
RoIs with eγ, τ, etc. signatures are not very common with cosmics, and it is rarer still for events to survive to the end of the chains; a few thousand events were collected.
Both L2 and EF algorithms were exercised successfully.
Example plot from the eγ FEX algorithms comparing L2 and EF: shower shape in the 2nd EM sampling, Rη = E(3×7)/E(7×7).
Details on the Tau trigger given by M. Dam - The ATLAS Tau Trigger
16
LVL2 Calo: HLT feedback to the detector
Hot cells in the η region around 0.475 were seen both by the HLT monitoring and by the detector monitoring (per partition, ½ of the EM η space); with the plots normalized to the counts in the 0.475 bin, cross-checks are possible.
The Calo trigger is functional and can help identify hot detector regions. The hardware issues were addressed during the shutdown. [ATLAS preliminary]
Details given by D. Damazio - ATLAS High Level Calorimeter Trigger Software Performance for Cosmic Ray Events
17
HLT muon: commissioning with cosmics
With high statistics available, several algorithms were exercised:
– at L2: μFast, μIso, TileRODMu;
– at EF: TrigMuonEF.
μFast has operated since the very beginning, serving data for the Muon Spectrometer calibration and online Data Quality (for details on remote MDT calibration see A. De Salvo - ATLAS MDT remote calibration centers).
Re-run on data for cross-checks:
– basic distributions (track position, calo noise / m.i.p. signal) against Monte Carlo predictions;
– track-finding efficiency studies at L2;
– studies of the muon system alignment at L2.
Figure: display of a cosmic event, run 90272.
Description of the muon trigger algorithms given by A. Ventura - The Muon High Level Trigger of the ATLAS experiment
18
LVL2 muon: μFast MDT cluster finding
Cluster-finding efficiency: 93% (design goal: 99%)
– 4.7% inefficiency due to missing MDT data (e.g. chamber EMS5A14 missing);
– 2% inefficiency due to bad MDT calibration (unphysical drift-space-from-time conversion);
– one noisy chamber.
Figure: MDT cluster residual (w.r.t. the TGC seed) vs. the number of TGC hits used as seed.
19
L2 muon: calorimeter algorithm
Algorithm implemented in the CALO ROD DSP.
Poor efficiency (<< 1%) due to the lack of pointing; the back-to-back distribution is seen.
The energy deposition agrees with that expected for a m.i.p.
Figure: run 91060 energy deposition and track distributions of muons in the Tile Calorimeter, compared with cosmic Monte Carlo. [ATLAS preliminary]
20
Muon Event Filter
The TrigMuonEF algorithm was exercised on cosmic data, with both solenoidal and toroidal fields on.
Angular resolutions with respect to offline: σ(η) = 0.007, σ(φ) = 17 mrad.
Figure: η and φ resolution distributions.
21
L2 ID Tracking
Three L2 tracking algorithms:
– SiTrack: combinatorial search for track seeds in the innermost Si layers, extended into tracks in the outer Si layers; Si algorithm with TRT extension.
– IDSCAN: uses histogramming techniques to find the z position of the IP and identify tracks originating from there (see the sketch after this list); Si algorithm with TRT extension.
– TRTSegFinder: TRT-only algorithm looking for segments in the TRT.
Goal: record as many ID tracks as possible, introduce no selection biases, keep the rate at an acceptable level.
Secondary goal: to the extent possible, use the machinery, setup, algorithms, etc. that are used for collisions.
Earliest possible tracking at L2 (the TRT can be read out at LVL1 for cosmics).
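A simplified sketch of the z-histogramming idea behind IDSCAN — a one-dimensional toy, not the actual implementation: pairs of Si space points are extrapolated to the beam line, and the peak of the resulting z histogram estimates the interaction-point position from which tracks are then sought.

```python
# Toy version of the IDSCAN z-vertex histogramming (illustrative only).
from itertools import combinations
import numpy as np

def find_z_vertex(hits, z_range=(-250.0, 250.0), nbins=100):
    """hits: list of (r, z) space points in mm. Each hit pair is
    extrapolated along a straight line to the beam axis (r = 0) and the
    crossing z is histogrammed; the histogram peak estimates the z of
    the interaction point."""
    z_at_beamline = []
    for (r1, z1), (r2, z2) in combinations(hits, 2):
        if abs(r2 - r1) > 1e-6:
            # straight-line extrapolation to r = 0
            z_at_beamline.append(z1 - r1 * (z2 - z1) / (r2 - r1))
    counts, edges = np.histogram(z_at_beamline, bins=nbins, range=z_range)
    i = int(np.argmax(counts))
    return 0.5 * (edges[i] + edges[i + 1])
```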
22
L2 ID Tracking: performance
Trigger chain starting from all L1-accepted events, with an OR of the L2 tracking algorithms finding tracks: allowed collection of a good fraction of the cosmic muons passing through the Inner Detector, with no significant biases.
Performance:
– uniform event efficiency of >99% for "golden Si" tracks;
– fake rates of 0.01%-1%;
– the algorithms are complementary.
Rerun one month later: the HLT tracking worked out of the box, despite some changes in the detector configuration!
Details on ID tracking given by M. Sutton - Commissioning the ATLAS Inner Detector Trigger
Details on the data stream given by B. Pinto - Alignment data streams for the ATLAS Inner Detector
23
Conclusions
The HLT system was fully exercised:
– all the HLT infrastructure (steering, monitoring, data streaming, L2 & EF algorithms) was shown to work under actual data-taking conditions;
– the physics menu ran in parallel to the cosmic slice.
The HLT performed event selection:
– L2 ID tracking provided the data for detector alignment;
– L2 muon served data for online detector Data Quality and Monitoring.
HLT operation was robust:
– HLT commissioning was performed while serving the subsystems;
– provided a good balance between stability and responsiveness to detector conditions / requests.
Much was understood, driving the commissioning work further.
24
Backup (not shown at CHEP)
25
HLT Timing
Timing studies were done in "technical runs", where MC events are injected into the HLT, using the 10^31 menu.
Figure: L2 and EF processing-time distributions.
26
First beam!!!
27
The TriggerTool
One tool for shifters, experts and offline users:
– offline users can easily get read-only access via Java Web Start;
– the trigger shifter can modify prescales and passthroughs;
– experts can modify all aspects of the trigger configuration.
28
Definitions
Angle of the muon track: defined in the r-z view w.r.t. the pointing direction (toward the IP), and in the x-y view w.r.t. the φ = 0 direction.
Figure: sketches of the angle definitions in the x-y and r-z views.
29
LVL2 muon: μFast cross-check on alignment
Cross-check of the alignment: match the angle measurements from the TGC and the MDT.
– With the Middle-station-only fit: poor match, due to noise and bad MDT calibration.
– With the Middle+Outer (MO/OM) fit: good match; performance limited by the alignment.
Figure: TGC-MDT match-angle distributions.
30
LVL2 muon: μFast track reconstruction
No magnetic field: straight tracks allow a cross-check of the pattern recognition and reconstruction. The algorithm assumes tracks pointing to the IP.
– 2 out of 3 segments: high efficiency on non-pointing tracks, but poor sagitta reconstruction.
– 3 out of 3 segments: low efficiency on non-pointing tracks, but good sagitta reconstruction; performance limited by the alignment.
Figure: MDT angle, i.e. the slope of the MDT fit (in the Middle station) w.r.t. the pointing direction.
31
ID Tracking: offline reconstructed tracks
A track with d0 ~ 18 mm would be rejected by the standard cut of the LVL2 ID tracking (efficient up to d0 = 1 mm).
32
L2 ID tracking: adjustments for cosmics
A simple independent pattern recognition for Si: start with hits in the outermost layers and define a cigar-shaped road; if the road contains enough hits, compute the impact parameter of the road and apply a shift in the x-y plane to all Si hits in the detector (a sketch follows).
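A rough sketch of this adaptation, with invented names and thresholds: two outer-layer hits define the road axis, hits within a fixed perpendicular distance count toward the road, and the x-y shift cancels the road's impact parameter so the standard IP-pointing tracking can run unchanged.

```python
# Toy version of the cosmic road-finding adaptation (names and
# thresholds invented; not the actual L2 implementation).
import numpy as np

def road_and_shift(outer_a, outer_b, all_hits, half_width=10.0, min_hits=8):
    """outer_a, outer_b: (x, y) hits in the outermost Si layers defining
    the road axis; all_hits: array of (x, y) Si hits in mm."""
    a, b = np.asarray(outer_a, float), np.asarray(outer_b, float)
    axis = (b - a) / np.linalg.norm(b - a)
    hits = np.asarray(all_hits, float)
    # perpendicular distance of every hit from the road axis
    rel = hits - a
    dist = np.abs(rel[:, 0] * axis[1] - rel[:, 1] * axis[0])
    if int((dist < half_width).sum()) < min_hits:
        return None  # not enough hits: no road
    # signed impact parameter of the road w.r.t. the origin
    d0 = a[0] * axis[1] - a[1] * axis[0]
    # shift all hits so the road passes through the nominal IP, letting
    # the standard IP-pointing track machinery run unchanged
    shift = d0 * np.array([axis[1], -axis[0]])
    return hits - shift
```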
33
The MDT
34
Assembling MDT+TGC for the Endcap
35
The HLT Event Selection Software
Integration into the online system proceeds through steps, checking at each step that the offline physics performance is reproduced:
– run a single-node online emulator (checks the RoI-based data access);
– run a multi-node partition (full check of the online infrastructure).
Performance and functionality tested in technical runs and combined detector runs.
Figure: the HLT Selection Software Framework (ATHENA/GAUDI), common to Level-2 and EF, reusing offline components.
36
The HLT Steering
The HLT steering manages the execution of the selection code:
– algorithms are configurable by parameters;
– applies early rejection: the full chain is aborted as soon as a selection step fails;
– applies prescale and passthrough factors;
– caches the full history of TEs and FEX results and writes them into the HLT result:
  allows navigation through the steps of the trigger decision;
  avoids multiple executions of the same feature extraction;
  allows offline re-runs of the trigger selection with different Hypo cuts.
A collection of Chains implements the trigger menu
– written in Python or XML, recorded in the Trigger Configuration Database.
The steering was used to select events in the 2008 cosmic data taking.
Figure: example of a Level-2 chain (L2_MU20), seeded by a Muon RoI Trigger Element from LVL1 and leading to the EF selection. FEX algorithms extract features of physics objects from the data (μFast MS reconstruction → Muon Feature; IDSCAN ID reconstruction; μComb combined reconstruction → Combined Feature; μIso isolation → Isolation Feature); Hypo algorithms (μFast Hypo, Comb Hypo, μIso Hypo) apply selections on the FEX results, confirming the Trigger Elements (TE) that mark the atomic selection steps (Sequences).
37
Cosmic events recorded
Cosmic events recorded as a function of the run number; the plot also shows the status of the magnetic field.