Triggering In High Energy Physics — Gordon Watts, University of Washington, Seattle — NSS 2003, N14-1. Thanks to interactions.org for the pictures!



Review of Triggering in HEP — Gordon Watts, University of Washington, Seattle — NSS 2003, Oct 21

Outline: Introduction; The Collider Environment; Trigger Design (hardware-based triggers, software-based triggers, farms); Conclusions.

I am bound to miss some features of the triggering systems in various detectors; I have attempted to look at techniques and then point out examples. Thanks to the many people who helped me with this talk (too numerous to mention). The focus is mostly on ongoing discussions and presentations (RT'03).

Why Trigger?

Bandwidth – the raw bandwidth of many detectors is in the GB/s range; we can't archive that rate. A HEP experiment can write terabytes of data.
CPU – reconstruction programs are becoming significant; farms are 1000s of computers. GRID to the rescue?
Physics – Is all the data going to be analyzed (L4)? Analysis speed: small samples can be repeatedly studied, faster.
Don't lose physics! Eliminate background as early as possible (before disk & tape and the reconstruction farm).

Why Trigger? II

It's obvious — still, stunning to consider. Trigger chain: Detector (~7 MHz) → Level 1 (~10 kHz) → Level 2 (~1 kHz) → Level 3 (~50–70 Hz), with trigger information used at each level and full readout at Level 3.

Level | Accept rate | TB/year | Tape $ (raw events only)
Raw | 7 MHz | 25,700,000 | $10,500M
L1 | 10 kHz | ~32,000 | ≈$13M
L2 | 1 kHz | ~3,000 | ≈$1.3M
L3 | 50 Hz | 185 | $75,000

Assumes a 50% duty cycle for the accelerator (15,768,000 working seconds in a year), 250 kB event size, and $0.40/GB; $ figures are for raw events only. Much worse at the LHC! (1999: $400k for L3!)
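The slide's volume-and-cost arithmetic can be checked in a few lines. This is a sketch using only the slide's stated assumptions (250 kB events, 50% duty cycle, $0.40/GB); the results land close to, not exactly on, the slide's figures.

```python
# Back-of-envelope data volume and tape cost, using the slide's
# assumptions: 250 kB/event, 50% accelerator duty cycle, $0.40/GB.
SECONDS_PER_YEAR = 365 * 24 * 3600 // 2   # 50% duty cycle = 15,768,000 s
EVENT_SIZE_KB = 250
DOLLARS_PER_GB = 0.40

def tb_per_year(rate_hz):
    """Terabytes written per year at a given accept rate."""
    return rate_hz * EVENT_SIZE_KB * SECONDS_PER_YEAR / 1e9   # kB -> TB

def tape_cost(rate_hz):
    """Raw-event tape cost per year, in dollars."""
    return tb_per_year(rate_hz) * 1000 * DOLLARS_PER_GB       # TB -> GB

# A 50 Hz Level-3 accept rate is affordable; the raw crossing rate is not.
for level, rate in [("Raw", 7e6), ("L1", 1e4), ("L2", 1e3), ("L3", 50)]:
    print(f"{level:>3}: {tb_per_year(rate):>14,.0f} TB/yr  ${tape_cost(rate):>16,.0f}")
```

Running it shows why the trigger exists: the raw rate implies tens of millions of TB per year, the L3 rate a couple of hundred.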

History – HEP

Bubble chambers, cloud chambers, etc. (4π): the DAQ was a stereo photograph; the low-level trigger was the accelerator cycle; each expansion was photographed; the high-level trigger was human (scanners).

Early fixed-target experiments: CAL (EM/HAD), scintillator, etc. No pipelines; hardware lookup tables, discriminators, ... Hardware L1 implemented using CAMAC; large deadtime possible during readout. (Cronin and Fitch, CP violation, '64.)

Modern-day HEP: CAL (EM/HAD), scintillator, etc. Pipelines, hardware lookup tables, discriminators, ... Hardware L1 implemented in custom designs; <5% deadtime during readout. AMY, LEP, CDF/DØ, BaBar, Belle, etc.

Trigger Design

Large-scale HEP experiments: high rate (40 MHz raw trigger rate); DAQ/trigger integration; trigger flexibility – move the decision out of hardware and into firmware or software as early as possible. Multilevel triggers, custom hardware, network topologies (hardware trigger feeding farm nodes).

Small HEP experiments and medical applications (stereotyping!): large data, lower frequency, fewer sources. A smaller CPU/cluster can handle the detector directly with a specialized board.

HEP Accelerators

e+e−:
Accelerator | Time between collisions (ns) | Luminosity (10^30 cm^-2 s^-1) | Energy (GeV)
CESR (CLEO) | — | — | —
KEKB (Belle) | 2 | 10,000 | 8 x 3.5
PEP-II (BaBar) | 4.2 | 3,000 | 9 x 3.1

ep, pp̄, pp:
Accelerator | Time between collisions (ns) | Luminosity (10^30 cm^-2 s^-1) | Energy (GeV)
HERA (H1, ZEUS) | — | — | — x 30
Tevatron (DØ, CDF, BTeV) | — | — | 2,000
LHC (ATLAS, CMS, LHCb) | 25 | 10,000 | 14,000

Source: PDB'02

HEP Detectors

Detector | Channels | Silicon in trigger? | Trigger rates | Largest (non-)physics background
CLEO III | 400,000 | Yes (L2) | L1: 72 MHz; L2/L3: 1 kHz; tape: <100 Hz | electron pairs
Belle | 150,000 | Not yet | L1: 50 MHz; L2: 500 Hz; tape: 100 Hz | Bhabha & γγ
BaBar | 150,000 | No | L1: 25 MHz; L3: 2 kHz; tape: 100 Hz | Bhabha & γγ
H1, ZEUS | 500,000 | Yes (ZEUS) | L1: 10 MHz; L2: 1 kHz; L3: 100 Hz; tape: 2–4 Hz | beam–gas
HERA-B | 600,000 | Yes (L2) | L1: 10 MHz; L2: 50 kHz; L3: 500 Hz; L4: 50 Hz; tape: 2 Hz | beam–wire scattering, inelastics
CDF, DØ (Run II) | 1,000,000 | Yes (L2) | L1: 7 MHz; L2: — kHz; L3: 0.3–1 kHz; tape: 50 Hz | QCD, pileup, multiple interactions
ATLAS, CMS | 10^7 | High-level trigger | L1: 75 kHz; HLT: 100 Hz | QCD, pileup, multiple interactions

High-Rate Trigger/DAQ

Data rates are 100 MB/sec; beam-crossing rates are in the MHz range. Many times at the edge of technology when first designed — but off-the-shelf by the time they are running! (ATLAS, BaBar, Belle, BTeV, CDF, CMS, DØ, LEP, etc.)

Even GLAST (Gamma-ray Large Area Space Telescope) uses them: in each of 25 towers, the Si tracker and cal feed hardware L1 tracking and cal decisions, an L2 CPU sits in the tower, and an L3 CPU farm handles full readout to Earth. Upon detection of a gamma-ray burst, notification goes out by internet within 5 seconds.

Level 1: hardware based – physics objects with little association; coarse detector data; deadtimeless.
Level 2: hardware to preprocess data (some muon processors, silicon triggers); software to combine (matches, jet finders, etc.).
Level 3: a commodity CPU farm – complete event information available.
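The Level 1/2/3 structure described above can be sketched as a cascade of filters, each seeing only what the previous level accepted. All quantities, thresholds, and acceptance fractions below are invented for illustration.

```python
import random

# Toy three-level trigger: each level sees only events the previous
# level accepted and applies a (notionally) more expensive test.
# All event quantities and cut values are invented for illustration.
random.seed(1)

def l1(event):    # hardware: coarse, cheap, deadtimeless threshold
    return event["et"] > 10.0

def l2(event):    # preprocessors + simple matching
    return l1(event) and event["track_match"]

def l3(event):    # farm: full reconstruction (stubbed as one more cut)
    return l2(event) and event["vertex_quality"] > 0.9

events = [{"et": random.uniform(0, 20),
           "track_match": random.random() < 0.3,
           "vertex_quality": random.random()} for _ in range(100_000)]

n1, n2, n3 = (sum(f(e) for e in events) for f in (l1, l2, l3))
assert n1 >= n2 >= n3          # each level can only reject further
print(f"L1 accepts {n1}, L2 accepts {n2}, L3 writes {n3} to tape")
```

The point of the cascade is that the expensive test (l3 here) only ever runs on the small fraction of events the cheap tests let through.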

Trends in Big Triggers

Commodity electronics (CPU and networking): delay the decision as long as possible – more data available, and a more sophisticated decision platform (bug fixing!). The DAQ typically sits before a farm trigger, and the DAQ is driving deeper into the trigger system (BTeV, ATLAS, CMS, ...).

Commodity electronics enforces a common interface, where custom hardware brings a plethora of data formats; common interfaces allow for economies of scale.

Hardware sophistication: ASICs and FPGAs put HLT algorithms in hardware (Ethernet in an FPGA); the HLT is getting more sophisticated.

The Age of the Network

Fast networking moves data from a local environment to a global one. Cheap DRAM allows for large buffering, and hence latency. Cheap commodity processors allow for powerful algorithms with large rejection. (Example: BTeV.)

DØ & CDF – The Typical BIG Experiment

~1 million readout channels; event size is 250 kB; 12.5 MB/sec written to tape; data-taking for > 2.5 years.

Trigger – a typical multi-level chain, DØ (CDF) rates: Detector (~2.5 MHz) → Level 1 (~5 (20) kHz) → Level 2 (~1 (0.3) kHz) → Level 3 (~50 Hz to tape), with trigger information used at each level and full readout at Level 3.

L1: deadtimeless, pipelined, hardware. L2 & L3 can cause deadtime (variable-length algorithms). L2: hardware and software. L3: CPU farms.

Triggering Strategy

Similar, but with different approaches driven by physics. CDF puts more emphasis on B physics: a jet of particles with a displaced track is hard to pick out of background and too sophisticated to do well at L1, so CDF pumps the extra data into L2. DØ cuts harder at L1, with an L1 trigger that is better at object correlations – CAL, tracks (and their connection), etc.

Tracking at the Tevatron

Track finding is difficult – hit searches and loops must be implemented in hardware, so FPGAs are used in the track trigger: the fiber tracker (scintillating fiber) at DØ and the open-cell drift chamber at CDF. Tracks are found in pT bins of 1.5, 3, 5, and 10.

DØ does it by equations: "Layer 1 Fiber 5 & Layer 2 Fiber 6 & etc." – one equation per possible track; inefficiencies are just more equations. 80 sectors, 1–2 chips per sector, 16K equations per chip. (CFT layers, forward preshower, track.)

Firmware becomes part of the trigger: versioning; fast & flexible – it can be reprogrammed later; painful for the trigger simulation!
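The "one equation per possible track" scheme can be mimicked in software as a precomputed pattern set. This is a sketch only: the 3-layer geometry and fiber numbering below are invented, not the real CFT mapping.

```python
from itertools import product

# Lookup-table track finding in the spirit of the CFT trigger FPGAs:
# precompute every fiber combination a valid track can produce ("one
# equation per possible track") and reduce finding to set membership.
# The 3-layer geometry and fiber numbering here are invented.
EQUATIONS = set()
for f in range(32):                       # fiber index on layer 0
    for d1, d2 in product((0, 1), repeat=2):
        EQUATIONS.add((f, f + d1, f + d1 + d2))   # (layer0, layer1, layer2)

def find_tracks(hits_by_layer):
    """Return every precomputed pattern matched by the fired fibers."""
    l0, l1, l2 = (set(h) for h in hits_by_layer)
    return [eq for eq in EQUATIONS
            if eq[0] in l0 and eq[1] in l1 and eq[2] in l2]

# "Layer 1 Fiber 5 & Layer 2 Fiber 6 & ..." style match:
print(find_tracks([(5,), (6,), (6,)]))
```

An inefficient fiber is handled exactly as the slide says: by adding more patterns (equations) that allow the missing hit.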

Drift Chamber Tracking

DØ's fiber tracker gives instant position information – no drift time to account for. CDF divides hits into two classes: prompt hits (drift time < 33 ns) and non-prompt hits (longer drift times). This timing information is further input to their FPGAs for greater discrimination – though this is not done on every wire.

BaBar also uses FPGAs & LUTs to do track finding, with a 2 ns beam-crossing time!

Related: The Central Track Trigger System of the D0 Experiment; The D0 Central Tracking Trigger; New Trigger Processor Algorithm for a Tracking Detector in a Solenoidal Magnetic Field (N36-57, N36-55, N14-5).

Trigger Manager

Performs simple matching between detectors (track–cal, usually for the first time). The decision logic is always programmable – often part of the run configuration, with a human-readable trigger list – and may manage more than one trigger level. It usually contains scalers, which keep track of trigger live-time for luminosity calculations.

(Diagram: detectors A/B/C feed per-detector trigger logic; BX synchronization; decision logic (prescales too) with per-trigger scalers; readout data to the next level; front-end crate notification; latencies of 7, 4, and 9 BX.)
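A minimal sketch of the decision stage described above: programmable trigger lines with prescales, plus exposed/accepted scalers for live-time bookkeeping. The trigger names, thresholds, and prescale values are all invented.

```python
# Sketch of a trigger-manager decision stage: programmable trigger
# lines with prescales, plus scalers counting exposed and accepted
# events for live-time/luminosity bookkeeping.  Names, thresholds,
# and prescale values are invented.
class TriggerLine:
    def __init__(self, name, condition, prescale=1):
        self.name, self.condition, self.prescale = name, condition, prescale
        self.exposed = 0      # scaler: crossings this line examined
        self.accepted = 0     # scaler: crossings it fired on (post-prescale)

    def decide(self, event):
        self.exposed += 1
        if self.condition(event) and self.exposed % self.prescale == 0:
            self.accepted += 1
            return True
        return False

# A tiny human-readable "trigger list": a cheap high-rate line is
# heavily prescaled; the rare-signal line is not.
lines = [
    TriggerLine("MIN_BIAS",   lambda e: True,               prescale=1000),
    TriggerLine("HIGH_ET_EM", lambda e: e["em_et"] > 15.0),
]

for i in range(10_000):
    event = {"em_et": (i % 100) * 0.5}                  # toy detector quantity
    decisions = [line.decide(event) for line in lines]  # evaluate ALL lines
    global_accept = any(decisions)                      # OR feeds the next level

for line in lines:
    print(line.name, "exposed", line.exposed, "accepted", line.accepted)
```

Note that every line is evaluated for every crossing (no short-circuiting), so the scalers stay correct for luminosity accounting even when several lines would accept the same event.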

Level 2 Architecture

A hybrid of commodity processors & custom hardware: custom hardware preprocessors (CAL, Si) feed a global stage with control CPUs, structured similarly to L1 – a common platform based on an embedded DEC Alpha CPU and common communication lines for Run I. Upgrade: the CDF Pulsar board and the DØ L2β boards.

Related: CDF Pulsar Project (N29-4); An Impact Parameter Trigger for the DØ Experiment (N29-5).

High Level Trigger

Move the data from the ROCs; build the event; process the event in physics software; deliver accepted events. ~100 readout crates, MB/sec-scale flows, and ~100 ms – 1 second/event processing time.

Both experiments use networking technology: ATM with concentrators and SCRAMNet for routing (CDF); Gb Ethernet with a central switch and Ethernet for routing (DØ). See the Online/DAQ sessions.

CDF

Data flows when the manager has a destination (the event-builder machines) – one of the first experiments to use a switch in the DAQ. Dataflow is over ATM, with traffic shaping; backpressure is provided by the manager.

DØ High Level Trigger

A fully network-based DAQ. Related: The Linux-based Single Board Computer for Front End Crates in the DZERO DAQ System (N36-71); Dataflow in the DZERO Level 3 Trigger/DAQ System (N36-70).

Fermilab Tevatron Planned Upgrades

Peak luminosity matters for the trigger/DAQ. Tevatron trigger upgrades are under review following decreased Tevatron luminosity expectations, but both experiments still believe they will be required. Accelerator draft plan (peak luminosity vs. start of fiscal year, ×10^30 cm^-2 s^-1): today 4.5×10^31, rising to ~1.6×10^32 and eventually ~2.8×10^32.

DØ (all under review): L1 CAL sliding window (the ATLAS algorithm); L1 CAL–track match; track stereo info used at L1; L2 CPUs; L2 Si trigger improvements; a planned CAL upgrade.
CDF (all under review): COT readout speedup; L1 track trigger; L2 Si; L2 CPUs; L3 network DAQ upgrade – 13 racks of '88 vintage reduced to 3 racks of present-day hardware, with better functionality.

ATLAS & CMS

O(10) larger, with almost no custom hardware after L1. L1: 75 kHz; HLT: 100 Hz. Event size: 1 MB, so 75 GB/sec into the HLT. Both experiments use a full network DAQ after L1, but solve the data-rate problem differently: a big switch (CMS) versus regions of interest (ATLAS). The approach is otherwise similar.

CMS Trigger Architecture

CMS must operate at 100 kHz (50 kHz at startup). The L1 trigger uses only muon & calorimeter information – tracking is too complex & large. Both ASICs and FPGAs are used, and the pipeline is on-detector (3.2 μs). (CMS eISO card shown.)

ATLAS Level 1

A similar design: a 2 μs pipeline and a CAL cluster finder, feeding a CPU farm (dual CPU, ~1 ms/event) running offline-style code. A Level 2 based on a CDF-like XFT has been proposed.

Regions of Interest (ROI)

A bandwidth/physics compromise: a farm usually reads out the entire detector, while hardware often looks at a single detector. ROI sits in the middle – the farm CPU requests bits of the detector, using previous trigger info to decide which regions it is interested in. Once an event passes the ROI trigger, complete readout is requested and triggering continues.

Flexible, but not without problems: the trigger programming must be watched very closely, and farm decisions happen out of event order, so pipelines must be constructed appropriately. (ATLAS, HERA-B, BTeV, ...) (Diagram: L2 farm CPU uses basic cal L1 info, then electron confirmation, then the full trigger.)
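The ROI flow described above can be sketched as a staged readout: fetch only the L1-flagged fragments, confirm, and only then request the full event. The region names, toy "detector", and threshold below are all invented.

```python
# Sketch of region-of-interest readout: first pull only the detector
# fragments inside the L1-flagged regions; request the full event only
# if the refined test passes.  Region names, the toy "detector", and
# the threshold are all invented.
DETECTOR = {"eta0_phi0": {"cal_et": 3.0},
            "eta0_phi1": {"cal_et": 22.0},
            "eta1_phi0": {"cal_et": 1.0}}

def read_fragment(region):     # cheap: ship one region's data
    return DETECTOR[region]

def read_full_event():         # expensive: ship the whole detector
    return DETECTOR

def roi_trigger(l1_regions, threshold=20.0):
    # Step 1: fetch only the ROI fragments named by Level 1.
    fragments = {r: read_fragment(r) for r in l1_regions}
    # Step 2: refined (here: trivial) confirmation on the fragments.
    if any(f["cal_et"] > threshold for f in fragments.values()):
        # Step 3: only now request complete readout for the HLT.
        return read_full_event()
    return None     # rejected without moving most of the data

assert roi_trigger(["eta0_phi1"]) is not None   # hot region -> full readout
assert roi_trigger(["eta1_phi0"]) is None       # quiet region -> rejected
```

Most rejected events never cost more than their fragments' bandwidth, which is the whole point of the compromise.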

CMS High Level Trigger

No L2 trigger: 1 second of buffering in hardware (100k events × 1 MB), then ~1000 HLT farm nodes. ~700 ROCs and 64 event builders; events are built in two stages across 8 planes (8×8; each plane connects ROCs to EVBs). Myrinet will be used for the first plane; Gb Ethernet is under evaluation.

LHC Posters and Papers

Full Crate Test and Production of the CMS Regional Calorimeter Trigger System; The First-Level and High-Level Muon Triggers of the CMS Experiment at CERN; The ATLAS Liquid Argon Calorimeters Readout System; ATLAS Level-1 Calorimeter Trigger: Subsystem Tests of a Jet/Energy-sum Processor Module; Test Beam Results from the ATLAS LVL1 Muon Barrel Trigger and RPC Readout Slice; Beam Test of the ATLAS End-cap Muon Level1 Trigger System (N29-3, N14-2, N29-2, N36-68, N29-1, N14-2).

BTeV

23 million pixels – in Level 1! 132 ns bunch spacing, 100 kB/event, triggered at the full rate: 800 GB/sec!! Physics: Bs mixing, rare decays.

Level 1 (8 planes of L1 + L2/L3, round robin):
1. Hit clustering in pixels – FPGAs
2. Cluster linking in inner and outer layers – FPGAs
3. Track finding in the B field (pT > 0.25) – embedded CPU farm
4. Hard-scatter vertex finding – embedded CPU farm
5. DCA test for displaced tracks due to B decays – embedded CPU farm
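Step 5 of the list above reduces to a small geometric computation: the distance of closest approach (DCA) of each track to the primary vertex. A 2-D sketch, with invented track parameters and cut value:

```python
# Sketch of a DCA (distance of closest approach) test for displaced
# tracks from B decays, as in step 5 above.  Geometry simplified to
# 2-D; the track parameters and cut value are invented.
def dca(track_point, track_dir, vertex):
    """Perpendicular distance from `vertex` to the line through
    `track_point` along `track_dir` (assumed unit length)."""
    dx, dy = vertex[0] - track_point[0], vertex[1] - track_point[1]
    return abs(dx * track_dir[1] - dy * track_dir[0])   # |2-D cross product|

vtx = (0.0, 0.0)
prompt    = dca((0.0, 0.001), (1.0, 0.0), vtx)  # track from the vertex
displaced = dca((0.0, 0.500), (1.0, 0.0), vtx)  # large impact parameter

DCA_CUT = 0.1   # invented cut value
assert prompt < DCA_CUT < displaced
print(f"prompt DCA = {prompt}, displaced DCA = {displaced}")
```

Because the computation per track is this small, it fits naturally on the embedded CPU farm once tracks and the vertex are in hand.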

BTeV Trigger Design

800 GB/sec over optical links into the L1 farms, with a global L1 reducing the rate (78 kHz → 3.8 kHz into the L2/L3 farms), buffering for 300k interactions (~300 ms), and ~200 MB/sec written out.

BTeV HLT Algorithm

A large number of high-end boxes (dual 4 GHz). Uses ROIs, similar to ATLAS, but L2 and L3 are interleaved on the same box. The Trigger/DAQ TDR is just out – a tremendous amount of simulation work.

Related: Pre-prototype of the BTeV Trigger Level 1 Farm Processing Module; Hash Sorter – Firmware Implementation and an Application for the Fermilab BTeV Level 1 Trigger System; Real-Time Embedded System Support for the BTeV Level 1 Muon Trigger; Failure Analysis in a Highly Parallel Processor for L1 Triggering; Data Flow Analysis of a Highly Parallel Processor for a Level 1 Pixel Trigger (N36-66, N36-52, N36-65, N36-61, N29-7).

The Human Factor

Systems are complex: 1000s of interacting computers and complex software with hundreds of thousands of lines of code. We get the steady-state behavior right – but what about the shifter who does a DAQ system reset 3 times in a row in a panic of confusion? The (anonymous!) expert playing from home, messing up the DAQ system? Self-healing systems? Or the Microsoft problem – too smart for their own good?

Others

Too much out there to be covered in this talk! ZEUS has added a CPU farm to its custom hardware L1/L2 trigger. LHCb uses a 3-D torus network topology to move events through its farm quickly (1200 CPUs). FPGAs now come with Ethernet controllers built in – the programmable NIC as a switch.

Other HEP Talks and Posters

N29-6: BaBar Level 1 Drift Chamber Trigger Upgrade
N14-4: The Trigger System of the COMPASS Experiment
N36-59: A First Level Vertex Trigger for the Inner Proportional Chamber of the H1 Detector
N36-60: The Trigger System for the New Silicon Vertex Belle Detector SVD 2.0
N36-62: Rapid 3-D Track Reconstruction with the BaBar Trigger Upgrade
N36-64: The ZEUS Global Tracking Trigger
N36-69: Electronics for Pretrigger on Hadrons with High Transverse Momentum for the HERA-B Experiment
N36-56: A Pipeline Timing and Amplitude Digitizing Front-End and Trigger System

Air Shower Observatories

Looking for ultra-high-energy cosmic rays (beyond the GZK cutoff). Auger detectors sit 1.5 km apart, with 4 communications concentrators in the array; PLCs, ASICs, PowerPC. Low data rate (1200 bps), low power (solar). A central cell fires (20 Hz) and the surrounding rings are examined to decide; a 10-second pipeline.

Related: PLD First Level Surface Detector Trigger in the Pierre Auger Observatory (N14-6); The Trigger System of the ARGO-YBJ Experiment (N36-54).

Medical Applications

Decay detection – there is no accelerator clock. In Positron Emission Tomography (PET) the trigger is a scintillator coincidence (BGO scintillator); a reduced matching window increases resolution.

PET

BGO signals come with different strengths and different rise times, so a simple signal threshold doesn't have good enough timing resolution!

PET Time–Amplitude Fit

Fit the signal to a BGO model; the fit and error matrix are done in an FPGA. This allows a significant reduction in the window width (9.2 ns).

Related: Development of a High Count Rate Readout System Based on a Fast, Linear Transimpedance Amplifier for X-ray Imaging (N36-58); A New Front-End Electronics Design for Silicon Drift Detector (N36-63).
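The fit-for-timing idea can be sketched in software: fit the sampled waveform to a pulse-shape model and read off the start time, rather than timing a threshold crossing. The pulse model, sampling, and brute-force grid search below are invented stand-ins for the FPGA's linearized fit with an error matrix.

```python
import math

# Fit-for-timing sketch: a fixed threshold crosses at different times
# for different pulse amplitudes, so instead fit the samples to a
# pulse-shape model and read off the start time.  The model, sampling,
# and grid search are invented stand-ins for the FPGA's linearized fit.
def pulse(t, t0, amp, tau=300.0):
    """Toy scintillator pulse: instant rise at t0, exponential decay (ns)."""
    return amp * math.exp(-(t - t0) / tau) if t >= t0 else 0.0

# Noise-free samples of an unknown pulse (true t0 = 40 ns, amp = 0.7).
samples = [(t, pulse(t, t0=40.0, amp=0.7)) for t in range(0, 1000, 10)]

def fit_t0(samples):
    """Least-squares grid search over (t0, amp)."""
    best = None
    for t0 in range(100):
        for amp in (a / 10 for a in range(1, 21)):
            sse = sum((v - pulse(t, t0, amp)) ** 2 for t, v in samples)
            if best is None or sse < best[0]:
                best = (sse, t0, amp)
    return best[1], best[2]

t0, amp = fit_t0(samples)
print(f"fitted start time {t0} ns, amplitude {amp}")
```

Because the whole shape constrains the fit, the recovered start time is independent of the pulse amplitude – which is exactly what a bare threshold cannot deliver.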

Conclusions

In HEP the network rules the day: it enables experiments to get localized data off the detector and into CPUs with a global view – and global trigger decisions have more discriminating power.

Technology continues to make hardware look more like software: FPGAs increase in complexity, moving complex algorithms further up the trigger chain, following commodity hardware (CPUs, embedded processors, etc.).

Where next? Most are adopting Ethernet or its near relatives, with a few important exceptions.

A great session with lots of good talks and posters – enjoy!