
1 ATLAS Trigger "sur le terrain" ("in the field"), Physique ATLAS France, Evian, 19-21 October 2009, C. Clément (Stockholm University)

2 Overview
 - ATLAS Trigger architecture
 - Rate budget at L1/L2/EF, recording rate, dead time
 - What can we do to prepare for LHC operation?
 - How to test L1 and the HLT
 - Trigger timing
 - Use cases for detector commissioning

3 Level 1 Trigger (L1Calo, L1Muon)
 - 40 MHz bunch-crossing input, L1 trigger rate < 100 kHz
 - On a Level-1 Accept, signals are sent to the front ends to push their data to the Read Out System
 - L1Calo: 7200 projective trigger towers of Δη x Δφ = 0.1 x 0.1 (EM and HAD); e/γ and τ/hadron triggers up to |η| ~ 2.5; jet, total-energy and ETmiss triggers over a larger |η| range
[Figure: L1 muon trigger stations (labelled A, B, C, D) and L1 calorimeter trigger towers]

4 Trigger/DAQ Architecture
[Figure: ATLAS trigger/DAQ data flow between UX15, USA15 and SDX1, with access shafts and elevators]
 - First-level trigger; Read-Out Drivers (RODs) feed the Read-Out Subsystems (ROSs, ~150 VME PCs) over 1600 Read-Out Links; Timing, Trigger and Control (TTC) distributed on dedicated links
 - Data of events accepted by the first-level trigger are pushed at <= 100 kHz, 1600 fragments of ~1 kByte each
 - The RoI Builder sends Regions of Interest to the second-level trigger (LVL2 farm, ~500 dual-CPU nodes) via the LVL2 Supervisor; the LVL2 output is stored in the pROS; the DataFlow Manager steers event data requests and delete commands
 - The Event Builder (SubFarm Inputs, SFIs) pulls partial events at <= 100 kHz and full events at ~3 kHz over Gigabit Ethernet (~100 and ~30 network switches)
 - The Event Filter (EF, ~1600 dual-CPU nodes) reduces the rate to ~200 Hz; SubFarm Outputs (SFOs) write to local storage and send the data to the CERN computer centre
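A back-of-the-envelope check of the throughput these figures imply, as a minimal sketch (plain arithmetic, not taken from the talk):

```python
# Approximate data volumes implied by the data-flow numbers above.
L1_RATE_HZ = 100e3        # L1 accept rate, <= 100 kHz
FRAGMENTS = 1600          # event fragments / read-out links
FRAGMENT_KB = 1.0         # ~1 kByte per fragment
EB_RATE_HZ = 3e3          # full event building rate, ~3 kHz
EF_RATE_HZ = 200          # output rate of the Event Filter

event_size_mb = FRAGMENTS * FRAGMENT_KB / 1024             # ~1.6 MB per event
into_ros_gb_s = L1_RATE_HZ * event_size_mb / 1024          # pushed into the ROSs
event_building_gb_s = EB_RATE_HZ * event_size_mb / 1024    # built by the SFIs
recording_mb_s = EF_RATE_HZ * event_size_mb                # written by the SFOs

print(f"event size       ~ {event_size_mb:.1f} MB")
print(f"into the ROSs    ~ {into_ros_gb_s:.0f} GB/s")
print(f"event building   ~ {event_building_gb_s:.1f} GB/s")
print(f"recording (SFOs) ~ {recording_mb_s:.0f} MB/s")
```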

5 Trigger/DAQ Architecture (continued): the same data-flow diagram as the previous slide, here with cosmic μ tracks extrapolated to the surface.

6 H  ZZ'  e e  Evian, 19-21 Octobre 2009 ATLAS Trigger... 6 L2 / Region of Interest

7 High Level Trigger Farm
 - High-Level-Trigger (HLT) farm: 850 PCs (2 x quad-core CPUs each) installed = 35% of the final system
 - Final size: ~500 PCs for L2 + 1800 PCs for the EF (multi-core technology); the processor purchase is deferred until Cost to Completion funds become available
 - Test a realistic physics load on the readout and HLT by preloading fake high-pT physics signals into the readout system

8 Rate Budgets
 - Maximum L1 rate: 75 kHz - 100 kHz (high luminosity)
 - Recording rate at the SFOs: ~400 MB/s
 - Safe operating point with some margin, accounting for stream overlap
 - Bandwidth Point 1 - Tier 0: < 1 GB/s
 - Tier 0 processing: steady state of 200 events/s and 300 MB/s, for an operational efficiency of 60%
 - => the ATLAS event size is a crucial parameter; design value ~1-1.5 MB (see the sketch below)
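A minimal sketch of this bookkeeping, assuming the 60% operational efficiency acts as a duty factor, so that Tier 0 sees the time-averaged rate while the SFOs and the Point 1 to Tier 0 link must sustain the instantaneous rate (my reading of the slide, not an ATLAS prescription):

```python
# Check a proposed (recording rate, event size) working point against the
# budgets quoted on this slide.
SFO_BUDGET_MB_S = 400      # recording bandwidth at the SFOs
P1_TIER0_MB_S = 1000       # Point 1 -> Tier 0 link, < 1 GB/s
TIER0_RATE_HZ = 200        # Tier 0 steady-state event rate
TIER0_MB_S = 300           # Tier 0 steady-state throughput
DUTY_FACTOR = 0.60         # assumed operational efficiency (duty factor)

def fits_budgets(record_rate_hz, event_size_mb):
    """True if a (rate, event size) working point respects every budget."""
    peak_mb_s = record_rate_hz * event_size_mb
    return (peak_mb_s <= SFO_BUDGET_MB_S and
            peak_mb_s <= P1_TIER0_MB_S and
            record_rate_hz * DUTY_FACTOR <= TIER0_RATE_HZ and
            peak_mb_s * DUTY_FACTOR <= TIER0_MB_S)

# With the design event size of 1-1.5 MB, a few hundred Hz is about the limit:
for event_size_mb in (1.0, 1.5):
    for rate_hz in (200, 300, 400):
        print(f"{event_size_mb} MB @ {rate_hz} Hz -> {fits_budgets(rate_hz, event_size_mb)}")
```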

9 Commissioning of the Trigger System

10 Commissioning of the Trigger System
 - Data flow at L1 rates up to 100 kHz
 - Stress tests in cosmics with physics triggers:
   - up to 500 Hz of 'physics' muon triggers
   - up to 100 kHz of random triggers, removed again by the HLT
   - also used high rates of calorimeter noise triggers
 - By default, run at L1 rates > 30 kHz; the limitation comes from calibration and commissioning needs
 - Also a stress test of the trigger shifters and experts

11 Stress Test of the High Level Trigger: preload simulated events (multijets, ttbar, black holes...) into the Read Out System.

12 Stress Test of the High Level Trigger (continued): with the same preloaded samples (multijets, ttbar, black holes...), check the performance for very 'busy' events and test the limits of the HLT.

13 Performance on Monte Carlo Data
 - Dedicated 24-hour test period almost every week
 - Run the 10^31 cm^-2 s^-1 menu (900 trigger chains), based on 14 TeV assumptions (now out of date)
 - Sustained L2 input rate of 90 kHz with an event size of 0.8 MB; L2 output 18 kHz, EF output 200 Hz (rejection factors sketched below)
 - The 10^31 menu is built on an expected L1 rate of ~10 kHz; the HLT can cope with roughly double that
 - The L1 trigger and the subsystems run daily at > 30 kHz
[Figure: Event Building rate in 2008]
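The quoted rates translate into simple rejection factors; a minimal arithmetic sketch (not from the talk):

```python
# Rejection factors implied by the 10^31 menu test quoted above.
l2_in_hz, l2_out_hz, ef_out_hz = 90e3, 18e3, 200

print(f"L2 rejection : {l2_in_hz / l2_out_hz:.0f}x")   # ~5x
print(f"EF rejection : {l2_out_hz / ef_out_hz:.0f}x")  # ~90x
print(f"HLT combined : {l2_in_hz / ef_out_hz:.0f}x")   # ~450x
```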

14 Running L1 and HLT in Cosmics
 - Example of a High Level Trigger menu used in cosmic runs (list of HLT chains)
[Figures: trigger readout vs precision readout; L2 muon identification with the calorimeter; ATLAS Preliminary, 2008 cosmic-ray data]

15 Running L1 and HLT in Cosmics (continued)
[Figures: shower shape at the L2 trigger; tracking efficiency at the L2 trigger; L2 muon tracking, barrel track sagitta from three fitted MDT segments]

16 Data Streams
 - Stream overlaps waste bandwidth (see the toy example below)
 - Overlaps are non-trivial to understand and not always easy to fix
 - They generate additional sources of errors and make it harder to keep an overview
 - Large differences in size between streams => a problem for Tier 0 with small and large files
 + ATLAS conveniently writes out different streams
 + Convenient for physics analysis
 + Monitoring quantities on certain streams or triggers are much easier to interpret
 + Express streams strengthen the monitoring capability
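A toy illustration of the overlap cost, not ATLAS code: an event routed to several inclusive streams is written out once per stream, so the overlap fraction translates directly into extra bandwidth.

```python
# Toy model: each event carries the set of streams it was routed to; the
# extra copies written because of overlaps are pure bandwidth overhead.
def overlap_overhead(events):
    """events: list of sets of stream names per recorded event.
    Returns the fraction of extra copies written due to stream overlap."""
    copies = sum(len(streams) for streams in events if streams)
    unique = sum(1 for streams in events if streams)
    return copies / unique - 1.0

# Hypothetical mix: 10% of events land in both the 'egamma' and 'muon' streams.
sample = [{"egamma"}] * 45 + [{"muon"}] * 45 + [{"egamma", "muon"}] * 10
print(f"extra bandwidth from overlap: {overlap_overhead(sample):.0%}")  # 10%
```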

17 L1 Latency / Timing
 - Delays can be applied at the CTP to equalise e.g. L1Calo with L1Mu
 - The required delay is not independent of the detector region and of the time of flight (ToF)
 - The Level-1 Trigger Accept is sent back to the front ends
[Figure: trigger timing with cosmics, before timing calibration]
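A minimal sketch of the equalisation idea with made-up latencies (the real values depend on cabling, detector region and time of flight): each L1 input gets a programmable delay, in 25 ns bunch-crossing units, so that all inputs to the CTP refer to the same bunch crossing.

```python
# Delay equalisation sketch: align every L1 input to the slowest one.
BC_NS = 25.0  # one LHC bunch crossing in nanoseconds

def equalising_delays(latencies_ns):
    """latencies_ns: measured latency per trigger input, in ns.
    Returns the delay (in bunch crossings) to program for each input."""
    slowest = max(latencies_ns.values())
    return {src: round((slowest - lat) / BC_NS)
            for src, lat in latencies_ns.items()}

# Illustrative, made-up latencies (not real ATLAS numbers):
measured = {"L1Calo": 1950.0, "L1Mu_barrel": 2000.0, "L1Mu_endcap": 2025.0}
print(equalising_delays(measured))
# {'L1Calo': 3, 'L1Mu_barrel': 1, 'L1Mu_endcap': 0}
```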

18 Timing and Synchronisation
 - Ensure the L1 trigger latency is uniform within:
   - each muon trigger sector: easier with collisions (but requires decent luminosity), tedious with lots of cosmics, yet worth it
   - each calorimeter TTC partition: pulser runs, charge and laser injection
 - Ensure the L1 latency is uniform across partitions (muon/calorimeter): cosmics, collisions
[Figures: Tile (HCAL) timing with single beam, pre-timed using the TileCal laser system; pixel module timing; SCT timing; ATLAS Preliminary 2008]

19 L1 Muon Timing
 - Challenging in cosmics: very large system, large times of flight, need for an efficient time reference; even with beam it would take O(weeks)
 - The TRT L1 trigger (commissioning only) provides a pure cosmic trigger; the TRT is small in size, ~3 ns timing spread
 - For events triggered by the TRT, monitor the timing of the muon triggers: time of the TRT trigger minus time of the RPC trigger, before and after calibration (actual plot still in review)
 - Typically trigger only on the TRT, ~10 Hz (barrel only)
 - Good match with the LAr calorimeter read out in 32 samples
 - High rate of ID tracks with small variation in latency => useful for ID alignment

20 L1 Muon Timing (continued)
 - Same TRT-based method as the previous slide; the plot under review looks good (see the monitoring sketch below)
 - Requires a large monitoring window at the central trigger processor, hence large dead time
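A minimal sketch of the monitoring idea, using toy data rather than ATLAS software: for TRT-triggered events, collect the TRT-RPC trigger time difference per RPC trigger tower and take the mean as that tower's timing offset.

```python
import random
from collections import defaultdict

def rpc_timing_offsets(events):
    """events: iterable of (rpc_tower, t_trt_ns, t_rpc_ns) per TRT-triggered event.
    Returns the mean t(TRT) - t(RPC) per tower, i.e. the offset each RPC
    tower needs to line up with the TRT time reference."""
    diffs = defaultdict(list)
    for tower, t_trt, t_rpc in events:
        diffs[tower].append(t_trt - t_rpc)
    return {tower: sum(d) / len(d) for tower, d in diffs.items()}

# Toy sample (hypothetical numbers): tower 12 fires ~15 ns early with respect
# to the TRT, tower 34 is roughly on time.
toy = [(12, 0.0, random.gauss(-15.0, 3.0)) for _ in range(500)] + \
      [(34, 0.0, random.gauss(0.0, 3.0)) for _ in range(500)]
print(rpc_timing_offsets(toy))   # ~{12: +15, 34: ~0}
```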

21 LAr Calorimeter in 32-Sample Read Out
 - The full LAr pulse shape provides the drift time T_drift and a check of the mechanical uniformity
 - Requires reading out 32 LAr samples: event size of ~12 MB
 - Requires a very long dead time, limiting the L1 rate; dead time and prescales must be reconfigured to keep the L1 rate down
 - Rate budget < 400 MB/s => maximum recording rate of 33 Hz; the Tier 0 limit of 300 MB/s => 25 Hz for periods longer than O(12 hours) (see the arithmetic below)
 - Limits the overall ATLAS recording rate to 15% of nominal
 - A calorimeter-centric cosmic running mode was defined, with both calorimeters in multi-sample read-out and Tier 0 processing configured accordingly
 - It is worth it, but it is scheduled in priority when the muon system cannot provide a trigger, or when a low trigger rate is required for some other reason (e.g. TRT-RPC timing)
[Figure: ECAL expected vs measured drift time]
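The quoted rate limits follow directly from the 32-sample event size; a one-line check:

```python
EVENT_SIZE_MB = 12        # ~12 MB per event with 32 LAr samples read out
SFO_BUDGET_MB_S = 400     # recording rate budget
TIER0_BUDGET_MB_S = 300   # sustainable Tier 0 throughput for long periods

print(f"max record rate     : {SFO_BUDGET_MB_S / EVENT_SIZE_MB:.0f} Hz")    # ~33 Hz
print(f"sustained at Tier 0 : {TIER0_BUDGET_MB_S / EVENT_SIZE_MB:.0f} Hz")  # 25 Hz
```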

22 Muon Alignment with Cosmics
 - Requires large samples of cosmic muons, not necessarily pointing
 - Toroid OFF, solenoid ON: pT measurement in the ID, straight tracks in the muon system
 - Example run, October 16th - October 19th: L1 rate dominated by random triggers (> 30 kHz, physics ~500 Hz), recording rate > 300 Hz, small calorimeter data size (physics mode), small dead time protection
[Figure: sagitta in the precision plane]

23 Other Triggers for Cosmics or Beam
 - BPTX (beam pick-up, beam intensity)
 - Beam Condition Monitors (BCM)
 - Forward detectors: MinBias trigger scintillators, LUCID at 17 m, Zero Degree Calorimeter at 140 m, ALFA at 240 m
 - Calibration triggers
 - Transition Radiation Tracker (commissioning)

24 Beam Condition Monitor (BCM)
 - Distinguishes collisions from background through a time-of-flight measurement (Side A minus Side C)
 - Measured for every bunch crossing (25 ns): requires a fast, radiation-hard detector and electronics
 - Crucial to establish the z-position of collisions
 - Timing in ATLAS's smallest detector (slide material: H. Pernegger / CERN)
[Figure: BCM timing, background vs interactions]

25 Operation: Bright and Dark Side
 - Daily changes among configurations:
   - calorimeters in multi-sample mode: low rate, high dead time protection
   - L1Mu timing or ID alignment: medium rate, medium dead time
   - muon alignment: high rate, low dead time, also used as a 'high rate test'
 - Everything can change when a new menu is deployed or the system is pushed; we are becoming more and more efficient at this, and it should become quieter with beam
 - Change of L1 prescales without stopping the run: a powerful tool, crucial to deal with variations in rate due to beam conditions
 - Change of HLT prescales: dangerous for analysers, but extremely useful for commissioning; allows some chains to be enabled at the end of a fill to test them on real data without a stop (see the prescale sketch below)
 - During the commissioning period, change for instance from a high-rate muon-centric ATLAS run to a low-rate calorimeter-centric run
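A toy sketch of what an L1 prescale does (illustration only, not the CTP implementation): with prescale N, only every N-th candidate of a given item is accepted, and swapping the prescale set between luminosity blocks changes the rate without stopping the run.

```python
class PrescaledItem:
    """Toy L1 item: with prescale N, only every N-th candidate is accepted."""

    def __init__(self, prescale):
        self.prescale = prescale
        self._counter = 0

    def set_prescale(self, prescale):
        """Change the prescale on the fly, as done between luminosity blocks."""
        self.prescale = prescale
        self._counter = 0

    def accept(self):
        """Return True for 1 out of every `prescale` candidate triggers."""
        self._counter += 1
        if self._counter >= self.prescale:
            self._counter = 0
            return True
        return False

item = PrescaledItem(prescale=1000)                  # e.g. a random trigger
accepted = sum(item.accept() for _ in range(100_000))
print(accepted)                                      # 100 accepts out of 100000
item.set_prescale(10)                                # end of fill: open up the rate
```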

26 Operation of ATLAS
 - Start data taking right after injection with muon, pixel, SCT (and FCAL) in standby; configure with the menu for the LHC fill; pixel preamps disabled
 - Special trigger prescales to limit the trigger rate during injection
 - When beams become stable (flag from the machine): pause triggers; ramp HV on pixel, SCT, muon (FCAL); re-enable the pixel preamps; change the prescale set; mark the luminosity block in COOL (data quality); resume triggers (see the sketch below)
 - If the beam becomes unstable: pause triggers
 - Procedures rehearsed with simulated RF ramps
 - In case of a detector ROD or front-end busy, ideally pause the run, fix the problem, resynchronise and resume triggers:
   - Step 1: pause and remove the faulty element (working for all but one ATLAS subsystem)
   - Step 2: reconfigure the faulty element, resynchronise and put it back into ATLAS data taking (in place for two subsystems so far)
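A schematic of the warm-start sequence written as pseudo-procedural Python; `run_control` and its methods are hypothetical stand-ins, not the real ATLAS run-control API.

```python
# Sketch of the stable-beams ("warm start") sequence described above.
def on_stable_beams(run_control):
    run_control.pause_triggers()
    for det in ("pixel", "sct", "muon", "fcal"):
        run_control.ramp_hv(det)                  # bring HV to nominal
    run_control.enable_pixel_preamps()
    run_control.apply_prescale_set("physics")     # replace the injection prescales
    run_control.mark_lumi_block("stable_beams")   # flag in COOL for data quality
    run_control.resume_triggers()

def on_beams_unstable(run_control):
    run_control.pause_triggers()                  # protect the inner detectors
```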

27 Conclusions
 - After two years of running ATLAS with cosmics, substantial improvements have been made to increase the data-taking efficiency
 - ATLAS is ready and waiting for beam, week 48...

28 Backup Slides

29 Dead Time
 - Purpose: protect the readout electronics from bursts of triggers
 - Simple dead time: after a trigger is received and issued to ATLAS, no further triggers are issued for a short, fixed interval; the system is then alive and waiting for the next trigger
 - Complex dead time ('leaky bucket'): think of the Central Trigger Processor as a bucket with a hole in the bottom; an arriving trigger is placed in the bucket, and if the bucket is full the trigger is discarded; triggers leave the bucket at a constant rate, set by the size of the hole (toy simulation below)
[Figure: bunch-crossing timeline illustrating simple and complex dead time]
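A toy simulation of the leaky-bucket behaviour described above, with illustrative parameters rather than the real CTP settings:

```python
def leaky_bucket(trigger_bcs, depth=8, drain_interval=10):
    """trigger_bcs: sorted bunch-crossing numbers carrying an L1 candidate.
    Up to `depth` triggers fit in the bucket; one trigger drains out every
    `drain_interval` bunch crossings.  Returns (accepted, rejected) lists."""
    accepted, rejected = [], []
    level, last_drain = 0, 0
    for bc in trigger_bcs:
        # Drain for the bunch crossings elapsed since the last drain tick.
        drained = (bc - last_drain) // drain_interval
        level = max(0, level - drained)
        last_drain += drained * drain_interval
        if level < depth:
            level += 1
            accepted.append(bc)
        else:
            rejected.append(bc)
    return accepted, rejected

# A burst of 20 candidates in consecutive bunch crossings: the first 8 fill
# the bucket, later ones are vetoed except where the bucket has drained.
accepted, rejected = leaky_bucket(list(range(20)))
print(len(accepted), len(rejected))   # 9 accepted, 11 rejected
```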

30 L1 Latency
 - ATLAS takes 40 million 'pictures' per second; upon a positive trigger decision, the right 'picture' must be selected and sent out
[Figures from "Status of ATLAS", 99th LHCC, 23 September 2009: timing in the TGC and TGC trigger; BCM timing (timing in ATLAS's smallest detector); Tile (HCAL) and ECAL timing with single beam (ATLAS Preliminary, 2008 single-beam data); L1Calo timing with beam on collimators; pixel module timing; SCT timing]

