Triggering In High Energy Physics
Gordon Watts, University of Washington, Seattle
NSS 2003, N14-1
Thanks to interactions.org for the pictures!

Outline
– Introduction
– The Collider Environment
– Trigger Design
  – Hardware Based Triggers
  – Software Based Triggers
  – Farms
– Conclusions
I am bound to miss some features of the triggering systems in various detectors; I have attempted to look at techniques and then point out examples. Thanks to the many people who helped me with this talk (too numerous to mention). The focus is mostly on ongoing discussions and presentations (RT'03).

Why Trigger?
– Bandwidth: the raw bandwidth of many detectors is GB/s; we can't archive that rate! A HEP experiment can write terabytes of data (disk & tape).
– CPU: reconstruction programs are becoming significant; farms are 1000s of computers (reconstruction farm). GRID to the rescue?
– Physics: is all the data going to be analyzed (L4)? Analysis speed: small samples can be studied repeatedly, faster.
Don't lose physics! Eliminate background as early as possible.

Why Trigger? II
It's obvious, but still stunning to consider. The typical chain: Detector (~7 MHz) → Level 1 (~10 kHz) → Level 2 (~1 kHz) → Level 3 (~50-70 Hz), with trigger information flowing at each level and full readout only at the end.

Level | Accept rate | TB/year    | Tape $ (raw only)
Raw   | 7 MHz       | 25,700,000 | $10,500M
L1    | 10 kHz      | 32,700     | $13M
L2    | 1 kHz       | 3,270      | $1.3M
L3    | 50 Hz       | 185        | $75,000

Assumes a 50% duty cycle for the accelerator (15,768,000 working seconds in a year), a 250 kB event size, and $0.40/GB; $ figures are for raw events only. Much worse at the LHC! (1999: $400k for L3!)

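The table is simple arithmetic; a minimal sketch reproducing it under the stated assumptions (15,768,000 live seconds, 250 kB events, $0.40/GB) — the output agrees with the table up to rounding and TB conventions:

```python
# Back-of-envelope tape economics for each trigger level, using the slide's
# assumptions: 50% accelerator duty cycle, 250 kB events, $0.40/GB raw tape.

LIVE_SECONDS = 15_768_000            # 0.5 * (seconds in a year)
EVENT_SIZE_GB = 250e3 / 1e9          # 250 kB per event
TAPE_DOLLARS_PER_GB = 0.40

accept_rates_hz = {"Raw": 7e6, "L1": 10e3, "L2": 1e3, "L3": 50}

for level, rate_hz in accept_rates_hz.items():
    gb_per_year = rate_hz * LIVE_SECONDS * EVENT_SIZE_GB
    print(f"{level:>3}: {gb_per_year / 1000:12,.0f} TB/year, "
          f"${gb_per_year * TAPE_DOLLARS_PER_GB:,.0f} for tape")
```
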
History - HEP
– Bubble chambers, cloud chambers, etc. (4π): the DAQ was a stereo photograph; the low-level trigger was the accelerator cycle; each expansion was photographed; the high-level trigger was human (scanners).
– Early fixed-target experiments: CAL (EM/HAD), scintillator, etc. No pipelines; hardware lookup tables, discriminators, ... Hardware L1 implemented using CAMAC; large deadtime possible during readout. (Cronin and Fitch, CP violation, '64.)
– Modern-day HEP: CAL (EM/HAD), scintillator, etc. Pipelines, hardware lookup tables, discriminators, ... Hardware L1 implemented in custom designs; <5% deadtime during readout. (AMY, LEP, CDF/DØ, BaBar, Belle, etc.)

Trigger Design
– Large-scale HEP experiments: high rate (40 MHz raw trigger rate); DAQ/trigger integration; trigger flexibility — move the decision out of hardware and into firmware or software as early as possible. Multilevel triggers, custom hardware, network topologies; hardware triggers feeding farm nodes.
– Small HEP experiments and medical applications: large data, lower frequency, fewer sources. The stereotype: a smaller CPU or cluster can handle the detector directly, with a specialized board.

HEP Accelerators

e+e-:
Accelerator    | Time between collisions (ns) | Luminosity (10^30 cm^-2 s^-1) | Energy (GeV)
CESR (CLEO)    | 14                           | 3,000                         | 9-12
KEKB (Belle)   | 2                            | 10,000                        | 8 x 3.5
PEP-II (BaBar) | 4.2                          | 3,000                         | 9 x 3.1

ep, pp, p̄p:
Accelerator              | Time between collisions (ns) | Luminosity (10^30 cm^-2 s^-1) | Energy (GeV)
HERA (H1, ZEUS)          | 96                           | 14                            | 920 x 30
Tevatron (DØ, CDF, BTeV) | 396                          | 200                           | 2,000
LHC (ATLAS, CMS, LHCb)   | 25                           | 10,000                        | 14,000

Source: PDB'02

HEP Detectors

Detector         | Channels  | Silicon in trigger? | Trigger rates                                              | Largest (non-)physics background
CLEO III         | 400,000   | Yes (L2)            | L1: 72 MHz; L2/L3: 1 kHz; tape: <100 Hz                    | electron pairs
Belle            | 150,000   | Not yet             | L1: 50 MHz; L2: 500 Hz; tape: 100 Hz                       | Bhabha
BaBar            | 150,000   | No                  | L1: 25 MHz; L3: 2 kHz; tape: 100 Hz                        | Bhabha
H1, ZEUS         | 500,000   | Yes (ZEUS)          | L1: 10 MHz; L2: 1 kHz; L3: 100 Hz; tape: 2-4 Hz            | beam-gas
HERA-B           | 600,000   | Yes (L2)            | L1: 10 MHz; L2: 50 kHz; L3: 500 Hz; L4: 50 Hz; tape: 2 Hz  | beam-wire scattering, inelastics
CDF, DØ (Run II) | 1,000,000 | Yes (L2)            | L1: 7 MHz; L2: 10-50 kHz; L3: 0.3-1 kHz; tape: 50 Hz       | QCD, pileup, multiple interactions
ATLAS, CMS       | 10^7      | High Level Trigger  | L1: 75 kHz; HLT: 100 Hz                                    | QCD, pileup, multiple interactions

High Rate Trigger/DAQ
– Data rates are ~100 MB/sec; beam crossing rates are MHz (ATLAS, BaBar, Belle, BTeV, CDF, CMS, DØ, LEP, etc.). Many times at the edge of technology when first designed, but off-the-shelf by the time they are running!
– Even GLAST (Gamma-ray Large Area Space Telescope) uses this architecture: in each of 25 towers, the Si tracker feeds L1 tracking and the CAL feeds L1 CAL hardware into the L1 decision; an L2 CPU in the tower preprocesses; full readout goes to an L3 CPU farm, then to Earth. Upon detection of a gamma-ray burst it notifies by internet within 5 seconds.
– Level 1: hardware based. Physics objects with little association; coarse detector data; deadtimeless.
– Level 2: hardware to preprocess data (some muon processors, silicon triggers); software to combine (matches, jet finders, etc.).
– Level 3: a commodity CPU farm, with complete event information available.

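To make the rate arithmetic of this pattern concrete, here is a toy cascade; the rejection factors are illustrative, chosen only to reproduce the ~7 MHz → 10 kHz → 1 kHz → 50 Hz chain quoted earlier:

```python
# Toy three-level trigger cascade. Real rejection factors are set by the
# physics menu and by bandwidth/CPU budgets, not hard-coded like this.

cascade = [
    ("L1", 700),   # hardware: coarse objects, deadtimeless
    ("L2", 10),    # preprocessors plus software combining
    ("L3", 20),    # commodity CPU farm, full event available
]

rate_hz = 7e6  # raw beam-crossing / interaction rate
for level, rejection in cascade:
    rate_hz /= rejection
    print(f"{level} accept rate: {rate_hz:,.0f} Hz")
```
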
Trends In Big Trigger
– Commodity electronics (CPU and networking): delay the decision as long as possible, so that more data is available and the decision platform is more sophisticated (bug fixing!). The DAQ typically sits before a farm trigger, and the DAQ is driving deeper into the trigger system (BTeV, ATLAS, CMS, ...).
– Commodity electronics enforces a common interface, where custom hardware brings a plethora of data formats; it allows for economies of scale.
– Hardware sophistication: ASICs, FPGAs; HLT algorithms in hardware (Ethernet in an FPGA); the HLT is getting more sophisticated.

The Age Of The Network (BTeV)
– Fast networking moves data from a local environment to a global one.
– Cheap DRAM allows for large buffering, and hence latency.
– Cheap commodity processors allow for powerful algorithms with large rejection.

DØ & CDF: a Typical BIG Experiment
– ~1 million readout channels; event size is 250 kB; 12.5 MB/sec written to tape; data for >2.5 years.
– Trigger: typical multi-level, with DØ (CDF) rates: Detector at ~2.5 MHz → Level 1 at ~5 (20) kHz → Level 2 at ~1 (0.3) kHz → Level 3 to tape at 50-100 Hz. Trigger information flows at each level; full readout at L3.
– L1: deadtimeless, pipelined, hardware. L2 & L3: can cause deadtime (variable-length algorithms). L2: hardware and software. L3: CPU farms.

Triggering Strategy: Similar but Different
– Approaches are driven by physics. CDF puts more emphasis on B physics: a jet of particles with a displaced track is hard to pick out of background and too sophisticated to do well at L1, so CDF pumps the extra data into L2.
– DØ cuts harder at L1; its L1 trigger is better at object correlations: CAL, tracks (and the connection between them), etc.

Tracking at the Tevatron
– Track finding is difficult: hit searches and loops must be implemented in hardware. FPGAs are used in the track triggers: the fiber tracker (scintillating fiber) at DØ, and the open-cell drift chamber at CDF.
– DØ finds tracks in p_T bins of 1.5, 3, 5, and 10 GeV/c, done by equations over the CFT layers and forward preshower: Layer 1 Fiber 5 & Layer 2 Fiber 6 & etc. Inefficiencies are just more equations. 80 sectors, 1-2 chips per sector, 16K equations per chip — one equation per possible track (a software sketch of this style follows).
– Firmware becomes part of the trigger: it needs versioning; it is fast and flexible (can be reprogrammed later); and it is painful for the trigger simulation!

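A minimal software sketch of "equation"-style track finding: each valid track is one precomputed hit pattern, and the trigger simply checks which patterns are fully satisfied by the fired fibers. The layer count and patterns below are invented; the real system burns ~16K such equations per FPGA, binned in p_T:

```python
# One "equation" per possible track: the fiber index expected in each layer.
# In the real system these are generated per pT bin from simulated tracks.
track_equations = {
    "pT>10 GeV":  [(5, 6, 6, 7), (5, 6, 7, 7)],
    "pT>1.5 GeV": [(5, 7, 9, 11), (5, 8, 10, 13)],
}

def find_tracks(fired_fibers):
    """fired_fibers: set of (layer, fiber) hits from one beam crossing."""
    found = []
    for pt_bin, patterns in track_equations.items():
        for pattern in patterns:
            # the "equation": Layer 0 Fiber p[0] & Layer 1 Fiber p[1] & ...
            if all((layer, fiber) in fired_fibers
                   for layer, fiber in enumerate(pattern)):
                found.append((pt_bin, pattern))
    return found

hits = {(0, 5), (1, 6), (2, 6), (3, 7), (2, 9)}
print(find_tracks(hits))   # -> [('pT>10 GeV', (5, 6, 6, 7))]
```
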
Drift Chamber Tracking
– DØ's fiber tracker gives instant position information: no drift time to account for.
– CDF divides hits into two classes: prompt (drift time < 33 ns) and non-prompt (drift time 33-100 ns). This timing information is used as further input to their FPGAs for greater discrimination. (This is not done on every wire.)
– BaBar also uses FPGAs and LUTs to do track finding, with a 2 ns beam crossing time!
Related: The Central Track Trigger System of the D0 Experiment (N36-57); The D0 Central Tracking Trigger (N36-55); New Trigger Processor Algorithm for a Tracking Detector in a Solenoidal Magnetic Field (N14-5).

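A small sketch of the two-class hit timing, using the 33 ns and 100 ns boundaries from the slide (wire names and drift times are invented):

```python
PROMPT_MAX_NS = 33.0
DRIFT_MAX_NS = 100.0

def hit_class(drift_ns):
    """Timing class fed to the track-finding FPGA; None means the hit falls
    outside the drift window (noise, or the wrong beam crossing)."""
    if drift_ns < PROMPT_MAX_NS:
        return "prompt"
    if drift_ns <= DRIFT_MAX_NS:
        return "non-prompt"
    return None

for wire, t_ns in [("w12", 8.0), ("w13", 57.0), ("w14", 140.0)]:
    print(wire, hit_class(t_ns))   # prompt, non-prompt, None
```
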
Trigger Manager
– Simple matching between detectors (track-cal), usually done here for the first time.
– Decision logic is always programmable, often part of the run configuration, with a human-readable trigger list.
– May manage more than one trigger level.
– Usually contains scalers, keeping track of trigger live-time for luminosity calculations.
[Diagram: Detectors A, B, and C feed per-detector trigger logic with latencies of 7, 4, and 9 beam crossings; after BX synchronization, the decision logic (prescales too) drives front-end crate notification, per-trigger scalers, and readout of data to the next level.]

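A sketch of the decision-logic block: programmable trigger terms with prescales, plus per-trigger scalers (exposed/fired/accepted) from which live-time and luminosity can later be computed. Names, conditions, and prescale values are all invented for illustration:

```python
class TriggerTerm:
    def __init__(self, name, condition, prescale=1):
        self.name, self.condition, self.prescale = name, condition, prescale
        self.scalers = {"exposed": 0, "fired": 0, "accepted": 0}
        self._fired_count = 0

    def evaluate(self, event):
        self.scalers["exposed"] += 1
        if not self.condition(event):
            return False
        self.scalers["fired"] += 1
        self._fired_count += 1
        if self._fired_count % self.prescale:   # keep 1 in `prescale` firings
            return False
        self.scalers["accepted"] += 1
        return True

trigger_list = [
    TriggerTerm("EM_HI", lambda e: e["em_et"] > 10.0),
    TriggerTerm("JET_LO", lambda e: e["jet_et"] > 5.0, prescale=100),
]

event = {"em_et": 12.5, "jet_et": 7.0}
decisions = [term.evaluate(event) for term in trigger_list]  # run every term
accept = any(decisions)   # global accept: readout goes to the next level
```
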
Level 2 Architecture
– A hybrid of commodity processor and custom hardware: CAL and Si preprocessors (custom hardware plus a control CPU) feed a global processor of similar design.
– A common platform based on an embedded DEC Alpha CPU and common communication lines; similar to L1 Global.
– Upgrades: the CDF Pulsar board; the DØ L2 boards.
Related: CDF Pulsar Project (N29-4); An Impact Parameter Trigger for the DØ Experiment (N29-5).

High Level Trigger
– Tasks: move the data from the ROCs, build the event, process the event in physics software, and deliver accepted events.
– Scale: ~100 read-out crates, ~100-250 MB/sec, and ~100 ms to 1 second per event of processing time.
– Both experiments use networking technology: ATM with concentrators and SCRAMNet for routing (CDF); Gb Ethernet with a central switch and Ethernet for routing (DØ). See the Online/DAQ sessions.

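To make the "build the event" step concrete, a toy event builder: fragments from many read-out crates are collected by event number, and an event is complete (ready for the physics filter) once every ROC has reported. The ~100-crate figure is from the slide; the fragment interface is invented:

```python
from collections import defaultdict

N_ROCS = 100

class EventBuilder:
    def __init__(self, n_rocs):
        self.n_rocs = n_rocs
        self.partial = defaultdict(dict)   # event_number -> {roc_id: payload}

    def add_fragment(self, event_number, roc_id, payload):
        frags = self.partial[event_number]
        frags[roc_id] = payload
        if len(frags) == self.n_rocs:      # all crates in: event is built
            return self.partial.pop(event_number)
        return None

builder = EventBuilder(N_ROCS)
for roc in range(N_ROCS):
    event = builder.add_fragment(42, roc, b"...")
print(event is not None)   # True: event 42 is fully assembled
```
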
CDF
– Data flows when the manager has a destination (the event-builder machines); one of the first experiments to use a switch in the DAQ, with dataflow over ATM.
– Traffic shaping: backpressure is provided by the manager.

DØ High Level Trigger
– Network-based DAQ.
Related: The Linux-based Single Board Computer for Front End Crates in the DZERO DAQ System (N36-71); Dataflow in the DZERO Level 3 Trigger/DAQ System (N36-70).

Fermilab Tevatron Planned Upgrades
– Peak luminosity matters for the trigger/DAQ. [Plot: accelerator draft plan of peak luminosities, ×10^30 cm^-2 s^-1, by fiscal year 2003-2010: ~4.5×10^31 today, rising toward ~1.6×10^32-2.8×10^32.]
– Tevatron trigger upgrades are under review, given decreased Tevatron luminosity expectations, but both experiments still believe they will be required.
– DØ: L1 CAL sliding window (the ATLAS algorithm); L1 CAL-track match; track stereo info used at L1; L2 CPUs; L2 Si trigger improvements; a planned CAL upgrade (13 racks of '88 vintage down to 3 of present day, with better functionality).
– CDF (all under review): COT readout speedup; L1 track trigger; L2 Si; L2 CPUs; L3 network DAQ upgrade.

ATLAS & CMS
– O(10) larger; almost no custom hardware after L1.
– L1: 75 kHz; HLT: 100 Hz. With a 1 MB event size, that is 75 GB/sec into the HLT.
– Both experiments use a full network DAQ after L1, but they solve the data-rate problem differently: a big switch (CMS) vs. regions of interest (ATLAS). The approach is otherwise similar.

CMS Trigger Architecture
– CMS must operate at 100 kHz (50 kHz at startup).
– CMS uses only muon and calorimeter information at L1; tracking is too complex and too large. Muon tracking is done, in both ASICs and FPGAs.
– The pipeline is on-detector (3.2 μs). (CMS eISO card pictured.)

ATLAS Level 1
– Similar design: 2 μs pipeline, CAL cluster finder.
– Level 2: a CPU farm (100-150 dual-CPU machines, ~1 ms/event) running offline-style tracking; a CDF-like XFT has been proposed for L2.

Regions of Interest (ROI)
– A bandwidth/physics compromise: a farm usually reads out the entire detector, while hardware often looks at a single detector. ROI sits in the middle.
– A farm CPU requests bits of the detector, using previous trigger info to decide what regions of the detector it is interested in (e.g., basic CAL L1 info → electron confirmation → full trigger in an L2 farm CPU). Once an event passes the ROI trigger, complete readout is requested and full triggering commences.
– Flexible, but not without problems: one must keep a very close watch on trigger programming, and farm decisions happen out of event order, so pipelines must be constructed appropriately.
– Used by ATLAS, HERA-B, BTeV, ...

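A hedged sketch of the ROI pattern, assuming a toy (eta, phi) seed geometry; the window size and callbacks are invented:

```python
def roi_window(seed, half_width=0.2):
    """Detector window around one L1 (eta, phi) seed."""
    eta, phi = seed
    return (eta - half_width, eta + half_width, phi - half_width, phi + half_width)

def l2_filter(l1_seeds, read_region, confirm):
    """l1_seeds: list of (eta, phi) positions from the previous trigger level.
    read_region: callback fetching data for one window (the expensive step).
    confirm: algorithm run on that partial data, e.g. electron confirmation."""
    for seed in l1_seeds:
        data = read_region(roi_window(seed))   # partial readout only
        if confirm(data):
            return True    # request complete readout; full triggering commences
    return False           # reject without ever moving the bulk of the event
```
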
CMS High Level Trigger
– No L2 trigger; 1 second of buffering in hardware (100K × 1 MB).
– 1000 HLT farm nodes; 700 ROCs and 64 event builders. Events are built in two stages across 8 switch planes. [Diagram: one of 8 planes, ROCs to EVBs, 8×8.]
– Myrinet will be used for the first plane; Gb Ethernet is under evaluation.

LHC Posters And Papers
– Full Crate Test and Production of the CMS Regional Calorimeter Trigger System (N29-3)
– The First-Level and High-Level Muon Triggers of the CMS Experiment at CERN (N14-2)
– The ATLAS Liquid Argon Calorimeters Readout System (N29-2)
– ATLAS Level-1 Calorimeter Trigger: Subsystem Tests of a Jet/Energy-sum Processor Module (N36-68)
– Test Beam Results from the ATLAS LVL1 Muon Barrel Trigger and RPC Readout Slice (N29-1)
– Beam Test of the ATLAS End-cap Muon Level1 Trigger System (N14-2)

BTeV
– 23 million pixels — in Level 1! 132 ns bunch spacing; 100 kB/event; the trigger sees the full rate: 800 GB/sec!! Physics goals: B_s mixing, rare decays.
– Level 1 algorithm:
1. Hit clustering in pixels (FPGAs)
2. Cluster linking in inner and outer layers (FPGAs)
3. Track finding in the B field (p_T > 0.25) (embedded CPU farm)
4. Hard-scatter vertex finding (embedded CPU farm)
5. DCA test for displaced tracks due to B decays (embedded CPU farm; a minimal sketch follows below)
– 8 planes of L1 + L2/L3 (round robin).

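Step 5 is a simple geometric cut; a minimal 2-D sketch, with straight-line tracks and an invented cut value:

```python
import math

def dca(track, vertex):
    """track: (x0, y0, dx, dy) — a point plus a direction; vertex: (vx, vy).
    Perpendicular distance of closest approach from the vertex to the track."""
    x0, y0, dx, dy = track
    vx, vy = vertex
    # cross product of (vertex - point) with the direction gives the distance
    return abs((vx - x0) * dy - (vy - y0) * dx) / math.hypot(dx, dy)

DCA_CUT_CM = 0.01   # invented threshold: tracks farther than this are "displaced"

def l1_displaced_track_count(tracks, vertex):
    """Count tracks inconsistent with the hard-scatter vertex (B candidates)."""
    return sum(dca(t, vertex) > DCA_CUT_CM for t in tracks)
```
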
BTeV Trigger Design
– 800 GB/sec over optical links into the L1 farms, with buffering for 300k interactions (300 ms).
– Global L1 accepts at ~78 kHz into the L2/L3 farms, which cut to ~3.8 kHz, ~200 MB/sec out.

BTeV HLT Algorithm
– A large number of high-end boxes (dual 4 GHz).
– Uses ROIs, similar to ATLAS, but L2/L3 are interleaved on the same box.
– The Trigger/DAQ TDR is just out; a tremendous amount of simulation work.
Related: Pre-prototype of the BTeV Trigger Level 1 Farm Processing Module (N36-66); Hash Sorter — Firmware Implementation and an Application for the Fermilab BTeV Level 1 Trigger System (N36-52); Real-Time Embedded System Support for the BTeV Level 1 Muon Trigger (N36-65); Failure Analysis in a Highly Parallel Processor for L1 Triggering (N36-61); Data Flow Analysis of a Highly Parallel Processor for a Level 1 Pixel Trigger (N29-7).

The Human Factor
– Systems are complex: 1000s of interacting computers, and complex software with 100s of thousands of lines of code.
– We get the steady-state behavior right. But what about the shifter who does a DAQ system reset 3 times in a row in a panic of confusion? The expert playing from home, anonymously messing up the DAQ system?
– Self-healing systems? Or the Microsoft problem (too smart for their own good)?

Others
Too much out there to be covered in this talk!
– ZEUS has added a CPU farm to its custom hardware L1/L2 trigger.
– LHCb uses a 3D torus network topology to move events through its farm quickly (1200 CPUs).
– FPGAs come with Ethernet controllers built in: the programmable NIC as a switch.

Other HEP Talks and Posters
– N29-6: BaBar Level 1 Drift Chamber Trigger Upgrade
– N14-4: The Trigger System of the COMPASS Experiment
– N36-59: A First Level Vertex Trigger for the Inner Proportional Chamber of the H1 Detector
– N36-60: The Trigger System for the New Silicon Vertex Belle Detector SVD 2.0
– N36-62: Rapid 3-D Track Reconstruction with the BaBar Trigger Upgrade
– N36-64: The ZEUS Global Tracking Trigger
– N36-69: Electronics for Pretrigger on Hadrons with High Transverse Momentum for HERA-B Experiment
– N36-56: A Pipeline Timing and Amplitude Digitizing Front-End and Trigger System

Air Shower Observatories
– Looking for ultra-high-energy cosmic rays (beyond the GZK cutoff).
– 1600 Auger detectors with 1.5 km between each; 4 communications concentrators in the array; PLCs, ASICs, PowerPCs.
– Low data rate (1200 bps) and low power (solar); a 10-second pipeline.
– Trigger: a central cell fires (20 Hz); look at the surrounding rings to decide.
Related: PLD First Level Surface Detector Trigger in the Pierre Auger Observatory (N14-6); The Trigger System of the ARGO-YBJ Experiment (N36-54).

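A toy version of the central-cell-plus-rings idea, on an invented square grid; the ≥2-neighbor condition is illustrative only, and the real Auger array condition differs:

```python
def ring(center, radius):
    """Cells at Chebyshev distance `radius` around `center` on a square grid."""
    cx, cy = center
    return {(cx + dx, cy + dy)
            for dx in range(-radius, radius + 1)
            for dy in range(-radius, radius + 1)
            if max(abs(dx), abs(dy)) == radius}

def array_trigger(fired, center):
    """Promote the 20 Hz single-cell trigger only if nearby rings confirm."""
    if center not in fired:
        return False
    ring1 = len(ring(center, 1) & fired)
    ring2 = len(ring(center, 2) & fired)
    return ring1 >= 2 or (ring1 >= 1 and ring2 >= 1)

fired_tanks = {(0, 0), (0, 1), (1, 1), (2, 0)}
print(array_trigger(fired_tanks, (0, 0)))   # True: two first-ring neighbors
```
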
Medical Applications
– Decay detection: there is no accelerator clock.
– Positron Emission Tomography (PET): the trigger is a scintillator coincidence (BGO scintillator).
– A reduced matching window increases resolution.

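A minimal sketch of coincidence triggering, assuming each detected photon arrives as a (detector, time) pair; the window value is illustrative, and shrinking it is the point of the next slides:

```python
WINDOW_NS = 12.0   # assumed matching window

def coincidences(hits):
    """Pair hits in different crystals whose times fall within the window."""
    hits = sorted(hits, key=lambda h: h[1])
    pairs = []
    for (d1, t1), (d2, t2) in zip(hits, hits[1:]):
        if d1 != d2 and (t2 - t1) <= WINDOW_NS:
            pairs.append(((d1, t1), (d2, t2)))
    return pairs

print(coincidences([("A", 100.0), ("B", 104.5), ("C", 400.0)]))
# -> [(('A', 100.0), ('B', 104.5))]; the hit at 400 ns is unmatched
```
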
PET BGO
– Signals have different strengths and different rise times.
– A simple signal threshold doesn't have good enough timing resolution!

PET Time/Amplitude
– Fit the signal to a BGO model; the fit and error matrix are done in an FPGA.
– This allows a significant reduction in window width (9.2 ns).
Related: Development of a High Count Rate Readout System Based on a Fast, Linear Transimpedance Amplifier for X-ray Imaging (N36-58); A New Front-End Electronics Design for Silicon Drift Detector (N36-63).

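A hedged sketch of the idea: instead of a fixed threshold, fit the digitized samples to a pulse model and use the fitted start time as the event time. The saturating-rise model, sample spacing, and noise level are invented stand-ins for the real BGO shape and digitizer; the fit's error matrix gives the timing uncertainty:

```python
import numpy as np
from scipy.optimize import curve_fit

def pulse(t, amplitude, t0, tau=40.0):
    # toy BGO-like model: zero before t0, then a saturating rise
    return amplitude * (1.0 - np.exp(-(t - t0) / tau)) * (t >= t0)

t = np.arange(0, 200, 5.0)                          # 5 ns sampling (assumed)
samples = pulse(t, 1.7, 23.0) + np.random.normal(0, 0.02, t.size)

popt, pcov = curve_fit(pulse, t, samples, p0=[1.0, 10.0])
amplitude_fit, t0_fit = popt
sigma_t0 = np.sqrt(pcov[1, 1])   # error matrix entry: timing uncertainty
print(f"t0 = {t0_fit:.1f} +/- {sigma_t0:.1f} ns")
```
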
Conclusions
– In HEP the network rules the day: it enables experiments to get localized data off the detector and into CPUs with a global view, and global trigger decisions have more discriminating power.
– Technology continues to make hardware look more like software: FPGAs increase in complexity, moving complex algorithms further up the trigger chain, following commodity hardware (CPUs, embedded processors, etc.).
– Where next? Most are adopting Ethernet or its near relatives, with a few important exceptions.
A great session with lots of good talks and posters. Enjoy!