Triggering In High Energy Physics Gordon Watts University of Washington, Seattle Instr99.


Review of Triggering in HEP
Gordon Watts, University of Washington, Seattle
Instr99, Nov 17

Outline
–Introduction
–The Collider Environment
–Trigger Design
–Hardware Based Triggers
–Software Based Triggers
–Farms
–Conclusions

I am bound to miss some features of the triggering systems in various detectors. I have attempted to look at techniques and then point out examples. Thanks to the many people who helped me with this talk (too numerous to mention).

Why Trigger
It's obvious. Still, stunning to consider.
[Diagram: Detector → Level 1 → Level 2 → Level 3; trigger information vs full readout paths; rates falling from ~7 MHz through ~10 kHz and ~1 kHz down to ~50-70 Hz]

Level | Accept rate | TB/year | $ tape (raw only)
Raw | 7 MHz | 25,700,000 | 52,700 m
L1 | 10 kHz | 32,… | … m
L2 | 1 kHz | 3,… | … m
L3 | 50 Hz | … | …,000

Assume a 50% duty cycle for the accelerator (15,768,000 working seconds in a year) and an event size of 250 kB. $ are for raw events only; ~$1 m per year including reco data.
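The data volumes in the table above follow directly from rate × event size × live time. A quick sketch of that arithmetic, using only the assumptions stated on the slide (250 kB/event, 15,768,000 live seconds per year); the raw-rate result lands in the same tens-of-millions-of-TB range as the slide's 25,700,000 TB figure:

```python
# Back-of-the-envelope data volumes per trigger level, using the slide's
# assumptions: 250 kB/event and a 50% accelerator duty cycle.

EVENT_SIZE_B = 250e3          # 250 kB per event
LIVE_SECONDS = 15_768_000     # 50% of a calendar year

def tb_per_year(rate_hz):
    """Data volume in TB/year written at a given accept rate."""
    return rate_hz * EVENT_SIZE_B * LIVE_SECONDS / 1e12

for label, rate in [("Raw", 7e6), ("L1", 10e3), ("L2", 1e3), ("L3", 50)]:
    print(f"{label}: {tb_per_year(rate):,.0f} TB/year")
```

The point of the slide survives the rounding: without a trigger, the raw stream is millions of terabytes per year; three trigger levels bring it down to a few hundred.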

History: Bubble Chambers
Bubble chambers, cloud chambers, etc. (4π):
–DAQ was a stereo photograph!
–Low level trigger was the accelerator cycle: each expansion was photographed.
–High level trigger was human (scanners).
–No trigger: only the most common processes were observed; slow repetition rate.
Some of the high repetition experiments (>40 Hz) had some attempt at triggering.
Note: emulsions are still used in detectors today.

History: Fixed Target
Triggers appeared early (e.g. Cronin & Fitch's 1964 CP violation experiment), but were limited:
–No pipelines
–Similar to today's Level 1 triggers
–Large deadtime possible during readout
–A good trigger required no further interactions after the first trigger (24 ns)
[Diagram (E557/E672 jet production): beam → target → spectrometer, EM cal, had cal, dE/dx scintillator; memory lookup and E_T sum trigger module; CAMAC I/O to PDP-11 and VAX 780]

History: Modern Day
Modern day detectors:
–~million channels
–Collision frequency in the MHz range
–Multilevel triggers: high end electronics capable of discarding common physics (background) in favor of more rare processes (like top quark production: 1 in a few billion)
–Large farms to reconstruct data
–Terabytes worth of analyzed data!
–100's of collaborators

Constraints
–Accelerator: bunch spacing, bunch train length, sustained duty cycle.
–Trigger complexity: what is required to make the decision? Simple jet finding, complex track finding, inter-detector correlations.
–Implementation: multi level; low level feature extraction (ASIC, PLA, etc.); farms (PC platforms); transfer technology (ATM, Gigabit, etc.); technology/capability trade off; upgradability.
–Detector environment: pileup, multiple interactions, beam-gas interactions, accelerator physics.
–Output: accept rate, archive ability, reconstruction capability.
–Trigger requirements: rejection, % deadtime, input/output rate, offline cost.
–Cost.
Trigger and DAQ go hand in hand. A Moore's Law for triggers.

Accelerators (source: PDB'98)

e+e-:
Accelerator | Time between collisions (ns) | Luminosity (10^30 cm^-2 s^-1) | Energy (GeV)
CESR (CLEO) | … | … | …
KEKB (Belle) | 2 | 10,000 | 8 x 3.5
PEP-II (BaBar) | 4.2 | 3,000 | 9 x 3.1
LEP (Aleph, Delphi, Opal, L3) | … | … | (103??)

ep, pp̄, pp:
Accelerator | Time between collisions (ns) | Luminosity (10^30 cm^-2 s^-1) | Energy (GeV)
HERA (H1, ZEUS) | … | … | … x 30
TeV (DØ, CDF) | 396 (132) | 200 | 2,000
LHC (Atlas, CMS) | 25 | 10,000 | 14,000

Detectors

Detector | Channels | Silicon part of trigger? | Input trigger rates | Largest (non-)physics background
CLEO | 400,000 | No | L1: 72 MHz; L2: 1 kHz; tape: <100 Hz | electron pairs
Belle | 150,000 | Not yet | L1: 50 MHz; L2: 500 Hz; tape: 100 Hz | Bhabha & γγ
BaBar | 150,000 | No | L1: 25 MHz; L3: 2 kHz; tape: 100 Hz | Bhabha & γγ
H1, ZEUS | 500,000 | No | L1: 10 MHz; L2: 1 kHz; L3: 100 Hz; tape: 2-4 Hz | beam-gas
HERA-B | 600,000 | Yes (L2) | L1: 10 MHz; L2: 50 kHz; L3: 500 Hz; L4: 50 Hz; tape: 2 Hz | beam-wire scattering, inelastics
Aleph, Opal, L3, Delphi | …k | No | L1: 45 kHz; tape: 15 Hz | beam-gas
CDF (Run 2), DØ (Run 2) | 1,000,000 | Yes (L2) | L1: 7 MHz; L2: … kHz; L3: 0.3-1 kHz; tape: 50 Hz | QCD, pileup, multiple interactions

Trigger Design
Backpressure from the bandwidth to archive (tape robot):
–Tape I/O lags behind many other computer components.
–A HEP experiment can write terabytes of data.
–Media can run over a million US$ a year (CDF, DØ).
–How much CPU power for post-trigger reconstruction, if it isn't done online?
Cannot save all raw data all the time, but saving raw data:
–Facilitates the study of trigger rates and efficiencies
–Enables one to repeat reconstruction
(BTeV proposes to write only reconstructed data.)
Don't prescale physics! Eliminate background as early as possible.

Typical Designs
The required rejection is orders of magnitude, and cannot be achieved at the beam crossing rate:
–Algorithms to attain the required rejection are too sophisticated.
–Accelerator backgrounds can also contribute to the problem (e+e- vs pp).
Multi-level trigger:
–Algorithms implemented in hardware: specialized, often matched to the geometry of the detector.
–Algorithms implemented in software: a farm.
–Or a mix.
Real-time flow: Detector → Level 1 (HW, coarse grained readout; DØ: 7 MHz in, 10 kHz out) → Level 2+ (SW/HW, full resolution readout; DØ: 50 Hz out) → to storage.

Multi Level Trigger
Level 1 is hardware based:
–Identifies jets, tracks, muons, electrons
–Operates on reduced or coarse detector data
Level 2 is often a composite:
–Hardware to preprocess data (some muon processors, silicon triggers)
–Software to combine (matches, jet finders, etc.)
Level 3 is a farm:
–General purpose CPUs
Almost everyone uses this scheme, varying the number and function of the levels. Even GLAST (the Gamma-ray Large Area Space Telescope) uses it: in each of 25 towers, the Si tracker feeds L1 cal and L1 tracking hardware and an L1 decision; an L2 CPU sits in the tower; full readout goes to an L3 CPU farm and then to Earth. Upon detection of a gamma-ray burst it notifies by internet within 5 seconds.

Typical Designs
Higher level trigger decisions are migrating to the lower levels:
–TeV and LHC detectors have very tough Level 1 rejection requirements.
–Correlations that used to be done at Level 2 or Level 3 are now done at Level 1.
Recent developments in electronics:
–The line between software and hardware is blurring.
–Complex algorithms in HW.
–Possible to have logic designs change after board layout.
Software migration is following functional migration. Good, since complex algorithms often have bugs!

Algorithms in Hardware
Cut out simple, high rate backgrounds:
–beam-gas (HERA): z vertex
–QCD (TeV): jet energy, track matching
Capabilities are limited:
–Feature extraction from single detectors: hadronic or EM jets, tracks, muon stubs
–Combined at the global trigger: EM object + track
Characteristics:
–High speed & deadtimeless
–Limited ability to modify the algorithm (but thresholds typically can be modified)
–Algorithms frequently matched to the detector (and readout) geometry: vertical design
Built from modern components:
–Custom (ASICs)
–Programmable logic arrays (FPGAs, etc.)

Belle Level 1 Block Diagram
Each detector has its own trigger device:
–Track finder, jet finder, etc.
These feed a global decision maker which has limited ability to make correlations:
–it usually does not get the raw data.
[Diagram: cal, central open-cell tracker, and muon detectors feed preprocessors (XFT, muon preproc, XTRP) and the L1 Cal, L1 Track, and L1 Muon triggers, which feed L1 Global; all driven by the accelerator clock]

Pipelines
The trigger cannot cause deadtime during the decision:
–Not only is a beam crossing (BX) too short to run the full algorithm,
–many simple algorithms take longer than a single BX.
To be deadtimeless there must be a pipeline to store everything in the front end:
–Switched Capacitor Arrays (SCAs)
–DRAM
–Shift registers
–Sample & hold shaping
–Old fashioned delay lines
Most detectors use a combination of techniques; some new ones, like BTeV, use a uniform technique.
[Diagram: data for BX n enters the pipelined trigger while decisions for BX 1 through BX 7 emerge at the trigger output]
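The pipeline idea can be sketched in a few lines: front-end data for every crossing enters a fixed-depth shift register while the trigger works, and the decision for a crossing arrives exactly when that crossing's data reaches the end of the pipe. This is a toy model, not any experiment's front end; the latency and accept pattern are invented.

```python
from collections import deque

# Toy front-end pipeline: raw data for every bunch crossing (BX) is pushed
# into a fixed-depth register while the trigger forms its decision.  When the
# decision for BX n arrives (LATENCY crossings later), the matching slot is
# the oldest entry and can be read out with no deadtime.

LATENCY = 4  # trigger decision latency in bunch crossings (illustrative)

pipeline = deque(maxlen=LATENCY)   # appending when full drops the oldest slot
readout = []

def new_crossing(bx, data, accept_for_oldest):
    """One BX tick: act on the decision for the oldest BX, store new data."""
    if len(pipeline) == pipeline.maxlen and accept_for_oldest:
        readout.append(pipeline[0])          # oldest entry leaves the pipe
    pipeline.append((bx, data))

for bx in range(10):
    # Stand-in for a real trigger: accept even-numbered crossings once their
    # decision is due.
    oldest_bx = bx - LATENCY
    new_crossing(bx, f"hits@{bx}", oldest_bx >= 0 and oldest_bx % 2 == 0)

print(readout)
```

Note that data keeps flowing in on every tick; nothing ever waits on the decision, which is the whole point of the pipeline.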

Track Finding
Hardware is well suited to simple questions:
–A calorimeter tower over threshold: gang together towers and do simple charge discrimination.
–Track finding is much more difficult: hit searches and loops must be implemented in hardware.
Track finding:
–Axial only: simple in that only the r-φ view needs to be looked at. Large logic arrays are the way to solve this problem.
–Stereo information: have to determine where layers intersect; cuts down on background. Shift registers can be used, or more complex techniques. Complexity increases with the number of intersections along a stereo element.

The Field Programmable Gate Array
Board on a chip:
–Revolutionized the way board design is done.
–Logic design is not frozen when the board is laid out, yet much faster than running a program in microcode or a microprocessor.
–Can add features at a later time by adding new logic equations.
–Basically a chip of unwired logical gates (flip-flops, counters, registers, etc.)
Several types:
–Simple Programmable Logic Devices (SPLD)
–Complex Programmable Logic Devices (CPLD)
–Field Programmable Gate Arrays (FPGA)
–Field Programmable InterConnect (FPIC)
The FPGA allows for the most logic per chip.
[Spectrum: transistor → logic chip → gate array → single task IC]

Tracking at the TeV
Use FPGAs in the track trigger:
–Fiber tracker (scintillating fiber): DØ
–Open cell drift chamber: CDF
DØ finds tracks in bins of pT: 1.5, 3, 5, and 10 GeV.
–Done by equations: (Layer 1, Fiber 5) & (Layer 2, Fiber 6) & etc. Inefficiencies are just more equations.
–Firmware becomes part of the trigger (versioning). Fast & flexible: can be reprogrammed later.
DØ CFT: 80 sectors, 1-2 chips per sector, 16K equations per chip, one equation per possible track.
[Diagram: CFT layers, forward preshower, track]
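The "one equation per possible track" scheme above can be sketched in software: each valid track is a fixed AND of (layer, fiber) hits, and a track is found when all of its terms fire. In the FPGA these are literal boolean equations evaluated in parallel; here each one is a set that must be contained in the fired channels. The patterns and pT labels are invented for illustration, not DØ's real equations.

```python
# Each "equation" lists the (layer, fiber) channels a real track would light
# up.  A track is found when every one of its channels fired.

TRACK_EQUATIONS = {
    "track_A_pt>10": frozenset({(1, 5), (2, 6), (3, 7), (4, 8)}),
    "track_B_pt>5":  frozenset({(1, 5), (2, 7), (3, 9), (4, 11)}),
}

def find_tracks(fired):
    """Return every track equation fully satisfied by the fired channels."""
    return [name for name, eq in TRACK_EQUATIONS.items() if eq <= fired]

hits = {(1, 5), (2, 6), (3, 7), (4, 8), (2, 7)}
print(find_tracks(hits))
```

An inefficient fiber is handled exactly as the slide says: add another equation with that channel left out, rather than changing the matching logic.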

Drift Chamber Tracking
DØ's fiber tracker gives instant position information:
–No drift time to account for.
CDF divides hits into two classes:
–Prompt hits: drift time < 33 ns
–Non-prompt hits: longer drift times
This timing information is used as further input to their FPGAs for greater discrimination.
–This is not done on every wire.
BaBar also uses FPGAs & LUTs to do track finding: 2 ns beam crossing time!

H1 Vertex Finder
Beam-gas is the largest cause of background at HERA:
–The vertex position is not near the interaction point.
Vertex trigger:
–Does tracking by finding rays using a custom CMOS gate array.
–Histograms the z position: an FPGA builds the histogram, and a high speed static RAM lookup finds the result.
[Diagram: FPC, COP, CIP → ray finder cards (256x) → histogram of z vertex → histogram evaluation (1x)]
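The histogramming step above is simple enough to sketch: each found ray contributes its z intercept to a coarse histogram, and a sharp peak near the interaction point signals a real collision, while beam-gas vertices spread along the beamline. The bin width, z range, and numbers below are illustrative, not H1's actual parameters.

```python
# Toy z-vertex histogram trigger: fill a coarse histogram of ray z intercepts
# and report the peak bin and the fraction of rays in it.

def z_vertex_peak(z_intercepts, bin_cm=5.0, z_range=100.0):
    """Histogram z intercepts; return (peak bin center, peak fraction)."""
    nbins = int(2 * z_range / bin_cm)
    counts = [0] * nbins
    for z in z_intercepts:
        if -z_range <= z < z_range:
            counts[int((z + z_range) / bin_cm)] += 1
    peak = max(range(nbins), key=counts.__getitem__)
    frac = counts[peak] / max(1, len(z_intercepts))
    return -z_range + (peak + 0.5) * bin_cm, frac

# A genuine vertex: most rays point back to ~ +2 cm; two beam-gas outliers.
center, frac = z_vertex_peak([2.1, 1.8, 2.4, 2.0, 1.9, 2.2, -40.0, 55.0])
print(center, frac)
```

A cut on the peak fraction (or on how far the peak sits from z = 0) then separates real interactions from beam-gas.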

Neural Networks
H1 has a NN trigger:
–Used to look for missed or odd physics and to reject certain backgrounds.
The neural net trigger is based on CNAPS chips:
–A simple massively parallel processor, suited to training and evaluation of many-node NNs. Up to 64 hidden nodes, 64 inputs.
–20 µs decision time.
–9 units, trained offline on physics (bb, cc, etc.) and the corresponding background: ~2000 signal events, ~2000 background events.
Preprocessor board:
–Formats data (FPGAs)
–Can implement simple algorithms (jet finding)
CDF implemented a tau trigger:
–Analog chip (Intel ETANN)
–Worked well enough to see a W peak.
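What such a trigger evaluates per event is just a small feedforward pass: trigger quantities in, one accept/reject value out. A minimal sketch of that evaluation; the weights here are invented (a real trigger's weights come from the offline training on signal and background samples mentioned above), and the two-input, two-hidden-node size is purely for readability.

```python
import math

# Minimal feedforward evaluation of the kind a NN trigger performs: inputs
# (e.g. calorimeter sums, track counts) through one hidden layer to a single
# accept/reject output.  All weights below are made up for illustration.

W_HIDDEN = [[0.8, -0.5], [-0.3, 0.9]]   # 2 hidden nodes x 2 inputs
B_HIDDEN = [0.1, -0.2]
W_OUT = [2.0, -2.0]
B_OUT = -0.5

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def nn_accept(inputs, threshold=0.5):
    hidden = [sigmoid(sum(w * x for w, x in zip(row, inputs)) + b)
              for row, b in zip(W_HIDDEN, B_HIDDEN)]
    out = sigmoid(sum(w * h for w, h in zip(W_OUT, hidden)) + B_OUT)
    return out > threshold

print(nn_accept([1.0, 0.0]), nn_accept([-1.0, 1.0]))
```

The hardware advantage of CNAPS is that all the multiply-accumulates above happen in parallel, which is how a 64-input net fits in a 20 µs decision budget.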

Trigger Manager
Often the first place there are inter-detector correlations:
–simple matching.
Decision logic is always programmable:
–Often part of the run configuration
–Human readable trigger list
May manage more than one trigger level. Usually contains scalers:
–Keeps track of live-time for luminosity calculations.
[Diagram: detectors A, B, C → per-detector trigger logic (7, 4, and 9 BX latencies) → BX synchronization → decision logic (prescales too) → FEC notification and per-trigger scalers; readout data to the next level]
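The scaler bookkeeping mentioned above amounts to a few counters per trigger: crossings while the trigger was live, crossings where it fired, and crossings actually read out. A sketch, with invented field names and toy deadtime/fire patterns:

```python
# Per-trigger scalers: the counters a trigger manager keeps so rates and
# live-time (for luminosity) can be computed offline.

class TriggerScaler:
    def __init__(self, name):
        self.name = name
        self.exposed = 0    # crossings while this trigger was live
        self.fired = 0      # live crossings where the condition was true
        self.accepted = 0   # fired crossings that were actually read out

    def count(self, live, fired, accepted):
        if live:
            self.exposed += 1
            if fired:
                self.fired += 1
                if accepted:
                    self.accepted += 1

    def live_fraction(self, total_crossings):
        return self.exposed / total_crossings

s = TriggerScaler("jet_20")
for bx in range(1000):
    live = bx % 10 != 0      # pretend 10% of crossings are dead
    fired = bx % 53 == 0     # pretend the condition fires roughly 1-in-53
    s.count(live, fired, accepted=fired)
print(s.exposed, s.fired, s.accepted)
```

Dividing the recorded event count by the live fraction is what lets the offline luminosity calculation correct for deadtime.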

Belle Inputs to Global
Belle's inputs into the global trigger are typical:
–# of tracks is encoded as bit numbers
–Correlation information is done the same way.
Some experiments have 256 or more inputs to L1 Global.

Decision Logic
Simple AND requirements:
–A jet AND a track AND a muon stub
More complex OR requirements:
–(A jet in Quad 1 with a track in Quad 1) OR (a jet in Quad 2 with a track in Quad 2), etc.
FPGAs and truth tables are well suited to this:
–Trigger requirements can be arbitrarily complex.
Prescales are adjustable.
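The AND/OR terms plus prescales above can be sketched as a small table of boolean predicates over the L1 input bits, each with a keep-1-in-N counter. The term names and definitions here are invented examples of the pattern, not any experiment's trigger list:

```python
# Programmable decision logic: each trigger term is a boolean function of the
# L1 input bits, optionally prescaled (keep 1 of every N fires).

TERMS = {
    # name: (predicate over the input-bit dict, prescale N)
    "jet_and_track": (lambda b: b["jet"] and b["track"], 1),
    "jet_only":      (lambda b: b["jet"], 4),   # keep every 4th fire
}
_counters = {name: 0 for name in TERMS}

def decide(bits):
    """Return the list of trigger terms that fire and survive prescaling."""
    passed = []
    for name, (pred, prescale) in TERMS.items():
        if pred(bits):
            _counters[name] += 1
            if _counters[name] % prescale == 0:
                passed.append(name)
    return passed

events = [{"jet": True, "track": False}] * 6 + [{"jet": True, "track": True}]
accepts = [decide(e) for e in events]
print(accepts)
```

In hardware the predicates are truth tables rather than lambdas, but the structure is the same: the event is kept if any term passes.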

Algorithms in Software
Sophisticated algorithms:
–Frequently separating background physics processes from the interesting processes.
–Some experiments run their offline reconstruction online (HERA-B, CDF), perhaps with 2D tracking instead of 3D tracking.
–Data from more than one detector is often used.
Two common implementations:
–DSPs tend to live at the lower levels of the trigger: stub finders in DØ's muon system; transputers in ZEUS' L2; DSPs in HERA-B's L2 & L3.
–CPU farms tend to be at the upper end.
Sometimes it is difficult to classify a system:
–Blurring of the line between software and hardware.

Hybrids: Level 2 at the TeV
Use a combination of both hardware and general purpose processors. CDF & DØ's Level 2 triggers:
–Each box is a VME crate.
–Crates contain specialized cards to land and reformat data from detector front-ends.
–Crates contain an ALPHA processor to run algorithms.
[DØ's Level 2: L1 CAL, L1 CTT, L1 Muon, and L1FPD feed the L2Cal, L2CFT, L2PS, L2STT, and L2 Muon preprocessors; L1FW (towers, tracks) and L2FW (combined objects: e, m, j) feed Global L2]

Hybrids: SVT
Both CDF & DØ will trigger on displaced tracks at Level 2:
–DØ's will be another specialized set of L2 crates, with input to Level 2 Global.
–Select events with more than 2 displaced tracks.
–Resolution is almost as good as offline.
–Refined track parameters from the track fitters (a CPU farm).
[CDF's SVT: SVX and COT feed hit finders and the XFT; associative memory matches hits to roads; hit buffers and track fitters send results to L2 Global]

Associative Memory
–High speed matching of 4 SVX clusters and an XFT track to a valid road (pT > 1.5 GeV/c).
–A reverse lookup: normally a valid road index would give back the XFT and SVX hit parameters. Here the XFT hit parameters and the four SVX layer cluster parameters go in, and a valid road number (or "invalid road") comes out.
–Operates similarly to a CAM.
–Each AM chip does 128 roads.
–Custom CMOS.
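The reverse-lookup idea can be sketched with a dictionary keyed by the coarse hit pattern: precomputed roads (hit combinations a real track could leave) are stored, and an event's hits either map to a road number or come back invalid. In the hardware every stored road is compared in parallel, CAM-style; the patterns and road numbers below are invented.

```python
# Associative-memory sketch: coarse hit pattern -> road number.  A dict plays
# the role of the CAM; a miss means "invalid road".

# road pattern: (SVX coarse cluster per layer 1-4, XFT coarse bin) -> road id
ROADS = {
    (12, 13, 14, 15, 3): 101,
    (12, 14, 16, 18, 5): 102,
}

def match_road(svx_clusters, xft_bin):
    """Return the road number for this hit combination, or None (invalid)."""
    return ROADS.get((*svx_clusters, xft_bin))

print(match_road((12, 13, 14, 15), 3))   # a stored road
print(match_road((12, 13, 14, 15), 7))   # no such road: rejected
```

Only combinations that survive this lookup go on to the track fitters, which is where the rate reduction comes from.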

DØ SVT
DØ's Silicon Track Trigger is similar in design to CDF's:
–Operates at a reduced speed: 10 kHz instead of 50 kHz.
–Uses FPGAs instead of associative memory.
–Track fitters also use DSPs.
[DØ's STT: SMT and CFT feed the cluster finders and a cluster filter; the L1 CFT supplies roads; track fitters send roads + associated SMT clusters to the L2 track preprocessor]

Farm Triggers
Industry has revolutionized how we do farm processing:
–Wide scale adoption by HEP, online and offline.
–One CPU, one event: not massively parallel.
Data must be fed to the farm:
–As many ways as there are experiments.
–Flow management & event building.
Farm control:
–A distributed system can have 100's of nodes that must be kept in sync.
[Flow: DAQ & HW triggers → trigger farm → to tape or further triggering]

Readout Control & Event Building
Events come from many Front End Crates (FECs):
–Many sources directed at the one CPU doing the work.
–Directed: the FECs know what each node needs and start sending it.
–Requested: the farm node requests the data it wants (ROI).
Event builder: final assembly of all FECs.
–CDF has separate machines (16).
–DØ does it in the trigger farm node.
–ATLAS, CMS, HERA-B do it bit-by-bit.

DØ
Data flows as soon as there is an L2 accept:
–The destination is unassigned.
–Data driven: each packet contains an event number.
Backpressure is accomplished via recirculation loops:
–When full, no new data can enter the system.
The event is built in the node.

CDF
Data flows when the manager has a destination:
–Event builder machines.
One of the first to use a switch in the DAQ:
–Dataflow over ATM, with traffic shaping.
Backpressure is provided by the manager.

Regions of Interest (ROI)
A bandwidth/physics compromise:
–A farm usually reads out the entire detector; HW triggers often look at a single detector. ROI sits in the middle.
–The farm CPU requests bits of the detector, using previous trigger info to decide which regions it is interested in.
–Once an event passes the ROI trigger, complete readout is requested and triggering commences.
Flexible, but not without problems:
–Pipelines in the front ends must be very deep.
–Farm decisions happen out-of-event-order, so pipelines must be constructed appropriately.
ATLAS, HERA-B, BTeV…
[Diagram: farm CPU uses basic cal L1 info → electron confirmation → full trigger]
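The ROI sequence above can be sketched as a two-step readout: pull only the flagged regions, run a cheap confirmation, and fetch the full event only on a pass. The region names, energies, and threshold below are stand-ins, not any experiment's values:

```python
# Region-of-interest readout sketch: partial readout first, full readout only
# if the cheap confirmation passes.

EVENT_DATA = {  # region -> EM cluster energy in GeV (stands in for fragments)
    "cal_q1": 22.0, "cal_q2": 3.0, "trk_q1": 1.0, "trk_q2": 0.5,
}

def roi_trigger(l1_regions, threshold=15.0):
    """Read only the flagged regions; confirm at least one hard cluster."""
    requested = {r: EVENT_DATA[r] for r in l1_regions}   # partial readout
    return any(e > threshold for e in requested.values())

full_event = None
if roi_trigger(["cal_q1", "trk_q1"]):      # L1 flagged quadrant 1
    full_event = EVENT_DATA                # only now read the whole detector

print(full_event is not None)
```

The bandwidth saving comes entirely from the first step: most rejected events never cause a full-detector readout at all.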

HERA-B
Looks for B0 → J/ψ K0S.
Level 1 is pipelined:
–Results are distributed to the L2 processors.
Level 2 processors request only the data they want:
–If the event passes, they request full readout.

High Speed Data Links
The data rate into the farm is high:
–Requires high bandwidth technology.
Commodity or custom technology:
–Commodity: ATM, Gigabit Ethernet, Fibre Channel.
–Custom: often built on off-the-shelf protocols (and driver chips), without the full blown protocol (like IP).
ATM:
–Wasn't adopted by the mass market; still expensive for what you get.
–Not obviously well suited to physics (cell size, etc.).
–Gigabit looks much more attractive now.

Control
The event farms can have 100's of nodes:
–Complex configurations are possible, especially during commissioning and calibration runs.
–Must be automated.
Farm steward:
–A resource manager.
–Communicates with the rest of the trigger: CORBA (the de facto standard), or TCP/IP (often with layers on top; isn't cross-platform).
Physics algorithms framework:
–Framework optimized for fast processing; algorithms become plug-in modules.
–Designed to run both online and offline.

Monitoring
–Continuous dump of L1 scalers and higher level pass/fail rates.
–Control room monitoring of those rates as a function of time.
–CPU node monitoring.
Web access:
–Many experiments put real-time data on the web: trigger rates, pictures of the latest event, DAQ status, etc.

Farm OS
Intel compatible CPUs have a lock:
–Commodity, cheap, very fast. Rumors of a 1 GHz AMD Athlon at COMDEX this week, with no nitrogen cooling??
–A number of OSes run on this platform: BeOS, Linux, Solaris, Windows NT.
–Linux seems to have won out: similar to HEP's mainframe tools (UNIX).
DØ is using Windows NT on its L3 farm.

Conclusions
Sophisticated triggers are moving into HW:
–Algorithms once too complex for hardware can now be put on a single chip.
–Software algorithms are becoming more complex to retain the same rejection rates.
Blurring of software and hardware:
–Programmable chips (FPGAs) are common in all levels of the trigger/DAQ.
Two rounds of experiments are coming online:
–Trigger farms with 100's of computers.
–The ATM switch concept will be tested.
We've followed the commodity market:
–PLAs, cheap CPUs.
–Where is it heading now? Embedded systems, small embedded systems with network connections.

Tevatron Near Future
The latest schedule has Run 2a starting March 1:
–Detectors rolled in and taking data; commissioning, calibration, understanding the new detectors & triggers…
Run 2b:
–Starts after 2-4 fb^-1.
–Replace parts of the Si.
–Upgrade the tracking trigger for the high luminosity environment?
CHEP 2001 will have some real Run 2 experience to report on!

Future
Triggers are driven by their environment. What accelerators are in the future?
–Near term: Linear Collider (LC), Muon Collider, Very Large Hadron Collider (VLHC).
–Long term: ??
Is the era of large colliders drawing to a close?

Review of Triggering in HEP Gordon Watts University of Washington, Seattle Instr99 Nov 17, Future Technology has driven accelerators to this day

Future
Silicon triggering:
–Pixels
–On-board triggering

Future Trends
HEP has stolen from the marketplace before:
–High speed networking
–PLAs
–Commodity CPUs
–Not always the best match.
Where is the market moving now?
–Small networked devices
–Wireless devices and very-local area networks
–Embedded systems
How will this affect triggering?
–Triggers part of detector based readout chips
–Fewer central processors at earlier stages of the trigger
–More complex trigger decisions at earlier times
–Less bandwidth out of the detector??

Track Finding In Drift Chambers
If the drift time is longer than a BX:
–Must wait several BXs until all the data arrives.
–In the meantime data may have arrived from an earlier BX.
–CDF's COT: drift time is xx BX long.
Use a shift register:
–One shift per BX.
–Apply masks to the shift register after a number of BXs.
–?? 16,000 masks?? Where was that??
[Diagram: central drift chamber wires and a track; shift registers advancing over BX 1, 2, 3; apply masks for BX 1]
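The shift-register scheme above can be sketched directly: each wire keeps a few bits of hit history, every register shifts once per BX, and after enough crossings the registers are compared against track masks (one mask per wire/arrival-time pattern). The three-wire mask and hit sequence below are invented for illustration:

```python
# Shift-register track finding when drift times span several bunch crossings:
# every BX all wire registers shift left and a hit sets bit 0.  After N_BX
# crossings, track masks are compared to the registers.

N_BX = 3  # how many crossings of history each register keeps

def shift(registers, hits_this_bx):
    """One BX tick: shift every wire's history, record this BX's hits."""
    width_mask = (1 << N_BX) - 1
    return [((reg << 1) & width_mask) | (1 if wire in hits_this_bx else 0)
            for wire, reg in enumerate(registers)]

# A mask for one candidate track: wire 0 fired 2 BXs ago, wire 1 fired last
# BX, wire 2 fired this BX (drift times decreasing across the cell).
TRACK_MASK = [0b100, 0b010, 0b001]

regs = [0, 0, 0]
for hits in [{0}, {1}, {2}]:          # hits arriving over three crossings
    regs = shift(regs, hits)

print(regs, regs == TRACK_MASK)
```

The appeal for hardware is that the shift and the mask comparison are both trivial gate-level operations, so thousands of masks (CDF quotes 16,000-equation scales for the fiber tracker) can be checked in parallel every crossing.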