Tracking for Triggering Purposes in ATLAS


Tracking for Triggering Purposes in ATLAS
John Baines, on behalf of the ATLAS Collaboration
Connecting the Dots Workshop, Vienna, 22-24 Feb 2016

Overview
- The ATLAS Detector
- The ATLAS Tracker & Trigger
- Trigger Tracking Challenges
- Solutions, Current & Future:
  - Custom fast algorithms
  - Hardware-based tracking: Fast TracKer (FTK)
  - Use of co-processors/accelerators
- Tracking for LHC Phase II
- Summary

The ATLAS Detector
Si Tracker: 12 silicon detector planes
- 4 Pixel layers (3 in Run-1)
- 4 double SCT layers (φ stereo angle 20 mrad)
- ~100M channels, ~200k hits/event
[Figure: ATLAS detector and SCT Barrel, y-z view]

Tracking for the Trigger (1)
- Muon: match Inner Detector and Muon Spectrometer track segments
- Electron: match track and calorimeter cluster
- Lepton isolation: look for reconstructed tracks within a cone around the lepton
- Hadronic tau decays: distinguish hadronic tau decays from the jet background, based on the distribution of tracks in the core and isolation cones

Tracking for the Trigger (2)
- Jets originating from b-quarks (b-jets): identify primary and secondary vertices
- B-physics triggers: invariant-mass cuts, e.g. identify J/ψ(μμ) or B(μμ)
- Pile-up suppression: identify the number of interactions (number of vertices) and which particles in a jet come from the primary vertex
[Figure: dimuon invariant-mass spectrum showing the J/ψ -> μ+μ- peak]

LHC & High-Luminosity LHC
- Run 2: addition of the innermost pixel layer (IBL); trigger upgrades; addition of the FTK (Fast TracKer: hardware-based tracking) in 2016
- Run 3: further trigger upgrades; concurrent processing on many-core CPUs (+ accelerators)
- Run 4 (HL-LHC): 7.5 x nominal luminosity; upgraded ATLAS detector incl. replacement of the Inner Tracker; upgraded trigger
[Figure: LHC/HL-LHC timeline; event display of a first W->eν candidate in 7 TeV collision data]

Trigger Requirements, Challenges & Solutions
Requirements:
- High efficiency; low fake rate
- Excellent track parameter resolution
Challenges:
- Event complexity: many superimposed collisions (pile-up): 45 (Run 1) to 69 (Run 3) to 200 (Run 4)
- High rate: 100 kHz in Runs 2 & 3, rising to 400 kHz (up to 1 MHz) in Run 4
- Short time: finite HLT farm size => ~300 ms/event for ALL reconstruction, i.e. ~factor 50 faster than offline
- Huge number of hit combinations
Solutions for the Trigger:
- Reconstruction in Regions of Interest (RoIs): reduced detector volume reconstructed; knowledge of the L1 trigger type enables optimised reconstruction (a minimal sketch of RoI hit selection follows below)
- Two-stage tracking:
  - Fast Tracking: initial loose trigger selection using reduced-resolution tracks
  - Precision Tracking: full-precision tracks for the final trigger selection
- Limit hit combinations: geometrical constraints using the RoI, beamspot and possibly primary-vertex information
- Hardware trigger: FTK (Run 2): pattern matching using custom ASICs
- Acceleration (future): exploring the use of GPGPUs
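
The sketch below illustrates the RoI idea in a few lines of C++. It is a minimal sketch, not ATLAS code; all type and function names are illustrative. Restricting the input hits to an eta-phi window is what reduces both the detector volume read out and the subsequent hit combinatorics.

```cpp
// Minimal sketch, not ATLAS code: restrict the tracking input to the hits
// inside an eta-phi Region of Interest. All names are illustrative.
#include <cmath>
#include <vector>

constexpr double kPi = 3.141592653589793;

struct SpacePoint { double eta, phi; };

// Wrap an angle difference into (-pi, pi].
double deltaPhi(double a, double b) {
    double d = std::fmod(a - b + kPi, 2.0 * kPi);
    if (d < 0) d += 2.0 * kPi;
    return d - kPi;
}

// Keep only the hits inside the RoI window; seeding then runs on this
// much smaller collection.
std::vector<SpacePoint> selectRoIHits(const std::vector<SpacePoint>& hits,
                                      double roiEta, double roiPhi,
                                      double halfEta, double halfPhi) {
    std::vector<SpacePoint> out;
    for (const SpacePoint& sp : hits)
        if (std::abs(sp.eta - roiEta) < halfEta &&
            std::abs(deltaPhi(sp.phi, roiPhi)) < halfPhi)
            out.push_back(sp);
    return out;
}
```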

Trigger in LHC Run 2 & 3
- Level 1 Trigger: hardware-based (FPGA); Calorimeter & Muon (+ specialized triggers); latency < 2.5 μs; 40 MHz collision rate -> 100 kHz output
- Fast TracKer (FTK): hardware-based tracking, commissioned during 2016; provides full-event track information to the HLT
- High Level Trigger (HLT): software-based, running on a CPU farm (~4200 CPUs, ~32k cores); precision Calorimeter and Muon information; precision tracking, mainly in RoIs, with improved resolution; time budget <t> ~300 ms/event; output ~1 kHz to Offline Reconstruction & Storage
Note: in Run-1 the HLT was split into Level-2 and Event Filter.
[Figure: dataflow schematic: Inner Detector, Calorimeter and Muon readout drivers -> event fragments -> built events -> HLT, with Muon and Calorimeter Regions of Interest from Level 1]

Run-1 Trigger Tracking
Two-step tracking: Fast Tracking followed by Precision Tracking.
1) Fast Tracking in Run-1 used two strategies:
- Histogramming method (ZFinder step sketched below):
  - ZFinder: identify the z-positions of vertices by histogramming the z of pairs/triplets of hits
  - HitFilter: identify groups of nearby hits by histogramming in η-φ (uses the z of the vertex for η)
  - Group Cleaner: identify the hits on tracks by histogramming in φ0, 1/pT
  - Clone Removal & Track Fitting: remove duplicate tracks sharing hits; determine the track parameters
- Combinatorial method: similar to the algorithm used in Run 2 (next slide); also used for the GPU prototype (see later)
2) Precision Tracking uses an online configuration of offline tools.
Comparison:
- Histogramming method: time scales linearly with occupancy; potential loss of efficiency if the vertex is not found
- Combinatorial method: does not rely on the primary vertex for efficiency (but vertex information can increase speed); more similar to the offline algorithm; more rapid scaling of execution time with occupancy
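
A minimal sketch of the ZFinder idea, assuming straight-line extrapolation of hit pairs to the beam line (not the ATLAS implementation; the histogram range and binning are illustrative):

```cpp
// Sketch of z-vertex finding by histogramming: extrapolate each pair of
// hits (taken from two different layers, so b.r > a.r) to r = 0 and fill
// a histogram in z; the maximum bin gives a vertex-z candidate.
#include <array>
#include <cstddef>
#include <vector>

struct Hit { double r, z; };  // radius and z of a space-point

double findVertexZ(const std::vector<Hit>& inner,
                   const std::vector<Hit>& outer) {
    constexpr double zMin = -200.0, zMax = 200.0;  // mm, illustrative
    constexpr std::size_t nBins = 400;
    std::array<int, nBins> hist{};

    for (const Hit& a : inner)
        for (const Hit& b : outer) {
            // Straight line through the two hits, evaluated at r = 0.
            double z0 = a.z - a.r * (b.z - a.z) / (b.r - a.r);
            if (z0 <= zMin || z0 >= zMax) continue;
            hist[static_cast<std::size_t>((z0 - zMin) / (zMax - zMin) * nBins)]++;
        }

    std::size_t best = 0;
    for (std::size_t i = 1; i < nBins; ++i)
        if (hist[i] > hist[best]) best = i;
    return zMin + (best + 0.5) * (zMax - zMin) / nBins;  // bin centre
}
```

The appeal of this approach is that the loop count grows with the product of layer occupancies rather than with the full track combinatorics, which is why the execution time scales roughly linearly with occupancy.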

Run 2 HLT Tracking
Fast Tracking:
- Seeding: form triplets of hits using bins in r and φ; conformal mapping to estimate the triplet parameters (sketched below); combinations limited using the RoI geometry
- Track following: extend the seed to all Si layers
- Fast Kalman track fit
Precision Tracking: online configuration of offline tools
- Resolve ambiguous hit assignments
- Remove duplicate tracks
- Precision track fit (global χ2 algorithm)
[Plots: impact-parameter resolution vs η; 1/pT resolution vs η]
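
A minimal sketch of the conformal-mapping step (illustrative, not the ATLAS code). A circle through the origin in (x, y) maps to a straight line under (u, v) = (x, y)/(x² + y²), so a triplet's curvature can be estimated from a two-point line in (u, v), assuming the innermost hit as the reference point:

```cpp
// Conformal mapping for triplet seeding: map the two outer hits relative to
// the innermost hit; fit the line v = m*u + b; recover the curvature from
// (m, b). Sign conventions and error handling are omitted for brevity.
#include <cmath>

struct Hit { double x, y; };
struct SeedParams { double curvature; double phi0; };

SeedParams estimateTriplet(const Hit& h1, const Hit& h2, const Hit& h3) {
    auto toUV = [&](const Hit& h, double& u, double& v) {
        double dx = h.x - h1.x, dy = h.y - h1.y;
        double r2 = dx * dx + dy * dy;
        u = dx / r2;
        v = dy / r2;
    };
    double u2, v2, u3, v3;
    toUV(h2, u2, v2);
    toUV(h3, u3, v3);

    // Line through the two mapped points (assumes u2 != u3).
    double m = (v3 - v2) / (u3 - u2);
    double b = v2 - m * u2;

    // For a circle of radius R through the reference point,
    // |b| = sqrt(1 + m^2) / (2R), hence:
    SeedParams p;
    p.curvature = 2.0 * std::fabs(b) / std::sqrt(1.0 + m * m);
    p.phi0 = std::atan2(h2.y - h1.y, h2.x - h1.x);  // crude direction estimate
    return p;
}
```

Because the map turns circle-fitting into line-fitting, the triplet parameters come out of a handful of arithmetic operations, which is what makes the seeding fast enough for the trigger.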

Fast TracKer (FTK)
- Gets SCT & Pixel data directly from the detector front-end
- Performs pattern matching & track fitting at up to the full Level-1 rate (100 kHz)
- Sends tracks to the HLT: perigee parameters and hits-on-track information (cluster centres)
The HLT has access to:
- FTK tracks in the whole event: primary-vertex reconstruction; initial track-based selections in the whole event or large RoIs, incl. hadronic tau decays, B decays including hadrons, lepton isolation and track-based jets
- HLT refit of FTK tracks in RoIs: selections requiring higher parameter precision, incl. primary & secondary vertices for b-jet tagging, impact parameter and invariant mass for B-physics
- HLT tracking in RoIs: single leptons (e, μ, leptonic τ decays); final selections requiring ultimate efficiency & precision

FTK System
High throughput and low latency (~100 μs) from:
- Parallelism: 64 independent towers (4 in η x 16 in φ)
- Hardware implementation: custom ASICs and FPGAs
- Pattern matching for track finding: 1 billion patterns
- Linearized track fitting exploiting conformal mapping
- Data reduction: grouping Pixels/Strips into Super Strips
- Two-stage tracking: 8 then 12 layers
Implementation:
- 8192 Associative Memory (AM) chips, each with 128k patterns; ~2000 FPGAs
- 8 core crates, each holding 16 Processing Units and covering 8 η-φ towers
- Processing Unit: Data Organiser, AM Board (64 AM chips), Track Fitter, Hit Warrior
- Data Formatter, 4 Second Stage Boards (SSB), interface to the HLT
AM chip: special content-addressable memory chip; AM06 implemented in 64 nm technology; large chip (168 mm2); performs 10^14 parallel comparisons at 16 bps

FTK First Stage
- Clusters formed from SCT and Pixel input, transformed to coarse-resolution Super Strips
- AM board performs pattern matching using 3 Pixel + 5 SCT layers (6+5 measurements)
- 1st-stage track fitting: input is the addresses of the patterns with hits; uses the full-resolution hit information; allows 1 missing hit; estimates the χ2 with a calculation linear in the local hit positions: $\chi^2 = \sum_i \left( \sum_j S_{ij}\,x_j + h_i \right)^2$, where the $x_j$ are the measured hit coordinates ($j$ runs over the measured coordinates), $S_{ij}$ and $h_i$ are pre-calculated constants, and $i$ runs over the components of $\chi^2$
- 1st-stage Hit Warrior: duplicate-track removal; compares tracks within the same road; if tracks share more than Nmax hits, keeps the higher-quality track (based on Nhits and χ2)
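
A minimal sketch of why the linearized fit is fast in hardware (array sizes are illustrative, not FTK firmware): each χ2 component is a fixed linear combination of the hit coordinates, so evaluating a candidate costs only multiply-accumulate operations.

```cpp
// Linearized chi^2 evaluation. With 11 coordinates and 5 helix parameters
// there would be 11 - 5 = 6 chi^2 components; both counts are illustrative.
#include <array>

constexpr int kNCoords = 11;  // 3 pixel layers x 2 + 5 SCT layers x 1
constexpr int kNChi2   = 6;   // chi^2 components

double linearizedChi2(const std::array<double, kNCoords>& x,
                      const double S[kNChi2][kNCoords],
                      const double h[kNChi2]) {
    double chi2 = 0.0;
    for (int i = 0; i < kNChi2; ++i) {
        double c = h[i];                                  // constant term
        for (int j = 0; j < kNCoords; ++j) c += S[i][j] * x[j];
        chi2 += c * c;                                    // sum of squares
    }
    return chi2;
}
```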

FTK Patterns & Constants
- Determined from a large sample of simulated single muons (~1.2 billion tracks)
- Patterns optimized, within the limits of the finite AM size, for high efficiency, low fake rate, minimum pT (1 GeV), and the d0 & z0 range
Maximizing efficiency:
- Allow 1 missing hit at each of stage 1 and stage 2
- Additional Wild Card option: treat a layer as if all Super Strips had a hit
- Variable-resolution patterns: effectively double the size of the Super Strip in a layer of a given pattern, employing the "Don't Care" feature when bit-matching the Super Strip address (sketched below)
- Variable resolution allows the optimum combination of large bins, to maintain efficiency while limiting the bank size, and small bins, to minimize the fake rate
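
A minimal sketch of the "Don't Care" idea (illustrative, not the AM chip logic): masked low bits of the Super Strip address are ignored in the comparison, which widens the effective bin for that layer of the pattern.

```cpp
#include <cstdint>

struct LayerPattern {
    std::uint32_t address;  // stored Super Strip address
    std::uint32_t dcMask;   // bits set to 1 are "don't care"
};

// A hit matches if all non-masked address bits agree.
bool matches(const LayerPattern& p, std::uint32_t hitAddress) {
    return ((p.address ^ hitAddress) & ~p.dcMask) == 0;
}

// Example: with dcMask = 0x1, addresses 0b1010 and 0b1011 both match the
// stored pattern, i.e. two neighbouring Super Strips act as one larger bin.
```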

FTK Second Stage
- Uses the 8-layer track parameters to find the 3 SCT stereo hits + the IBL hit (Extrapolator)
- 12-layer track fit: parameters & χ2, again linear in the hit coordinates: $p_i = \sum_l C_{il}\,x_l + q_i$, where the $p_i$ are the track parameters, the $x_l$ the hit coordinates, and $C_{il}$, $q_i$ pre-calculated constants
- All tracks input to the Hit Warrior to remove duplicates
- Fits 1 track in 1 ns (1 track every 5 ps for the full system)
Hardware: Data Formatter -> Extrapolator -> Track Fitter -> Hit Warrior, on the Processor Unit Auxiliary Board

FTK Status & Schedule
Status:
- Production of chips & boards is underway; first AM06 chips received and functioning well
- Integration with data acquisition in progress; dataflow to the FTK system exercised at the end of 2015 data-taking
- Triggers being developed & tested with simulated data => expected-performance measurements
Schedule:
- Full barrel commissioned mid-2016
- Full coverage (for <μ> = 40): for the start of 2017 data-taking
- Full-spec system (for <μ> = 69): 2018 (luminosity driven)

GPU Prototype
GPUs provide high levels of parallelisation => re-writing of code is required to exploit it: code blocks run on the multiprocessors, with threads organised in 1D or 2D blocks.
Two GPU prototypes have been developed:
- Complete Run-1 Fast Tracking (combinatorial algorithm)
- Run-2 HLT Tracking: currently the Seed-Maker is implemented on the GPU
The goal is to assess the potential benefit in terms of throughput per unit cost. Integration with the ATLAS software uses a client-server technology.

                 Nvidia C2050   Nvidia K80
Cores            448            2 x 2496
Multiprocessors  14             2 x 13
Cores/SMX        32             192

Clustering on GPU
Pixel clustering is implemented using a cellular automaton algorithm (sketched below):
- Each hit is assigned an initial tag
- Tags are iteratively replaced across adjacent hits
- The algorithm ends when no more tags change
A factor ~26 speed-up is obtained running on the GPU compared to 1 CPU core.
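
A minimal CPU sketch of the cellular-automaton clustering (illustrative; on the GPU one thread would handle one hit per iteration): every hit starts with its own tag, and in each generation every hit adopts the smallest tag among its neighbours until nothing changes.

```cpp
#include <cstdlib>
#include <vector>

struct Pixel { int col, row; };

// Two pixels are adjacent if they touch, including diagonally.
bool adjacent(const Pixel& a, const Pixel& b) {
    return std::abs(a.col - b.col) <= 1 && std::abs(a.row - b.row) <= 1;
}

std::vector<int> clusterTags(const std::vector<Pixel>& hits) {
    std::vector<int> tag(hits.size());
    for (std::size_t i = 0; i < tag.size(); ++i) tag[i] = static_cast<int>(i);

    bool changed = true;
    while (changed) {                       // one generation of the automaton
        changed = false;
        for (std::size_t i = 0; i < hits.size(); ++i)
            for (std::size_t j = 0; j < hits.size(); ++j)
                if (adjacent(hits[i], hits[j]) && tag[j] < tag[i]) {
                    tag[i] = tag[j];        // propagate the smaller tag
                    changed = true;
                }
    }
    return tag;  // hits sharing a final tag form one cluster
}
```

The algorithm maps naturally onto a GPU because each hit's update depends only on its neighbours, so all hits can be processed in parallel within a generation.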

Combinatorial Track-finding on GPU
- Doublet formation: a 2D thread block combines hits from 2 layers to form doublets
- Triplet making: a 2D thread block combines doublets with hits in a 3rd layer to form triplets
- Track extension: a 1D thread block extends triplets to the other layers
- Clone removal: identify tracks with different seeds sharing extension hits; process N(N-1)/2 track pairs, each GPU thread processing a number of tracks; uses global memory (a sketch of the pair comparison follows below)
[Figure: seed layers, logical layers and extension hits; Si denotes space-point i]
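
A minimal sketch of the clone-removal comparison (illustrative, sequential version; thresholds and quality measure are assumptions): every unordered pair of candidates is checked for shared hits, giving the N(N-1)/2 pairs quoted above, and the lower-quality member of an overlapping pair is flagged.

```cpp
#include <algorithm>
#include <iterator>
#include <vector>

struct TrackCandidate {
    std::vector<int> hitIds;  // sorted identifiers of the assigned hits
    double quality;           // e.g. based on number of hits, then chi^2
};

void flagClones(const std::vector<TrackCandidate>& tracks,
                std::vector<bool>& dead, int maxShared) {
    const std::size_t n = tracks.size();
    dead.assign(n, false);
    for (std::size_t i = 0; i < n; ++i)
        for (std::size_t j = i + 1; j < n; ++j) {       // N*(N-1)/2 pairs
            std::vector<int> shared;
            std::set_intersection(tracks[i].hitIds.begin(), tracks[i].hitIds.end(),
                                  tracks[j].hitIds.begin(), tracks[j].hitIds.end(),
                                  std::back_inserter(shared));
            if (static_cast<int>(shared.size()) > maxShared)
                // Keep the higher-quality track of the overlapping pair.
                dead[tracks[i].quality < tracks[j].quality ? i : j] = true;
        }
}
```

On the GPU the outer pair loop is distributed over threads, each handling a slice of the pairs, with the candidates held in global memory.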

GPU Performance & Outlook
Code speed-up:
- Average factor 12 speed-up for tracking on the GPU (Nvidia C2050) c.f. 1 CPU core (Intel Xeon E5620, 2.4 GHz)
System performance:
- Question: what is the increase in throughput when comparing a CPU system with a CPU+GPU system?
- Measurements in progress with updated hardware (Nvidia K80 GPU, Intel E5-2695V3 CPU) and updated software (Run-2 tracking algorithm)
- Initial measurements with the Seed-Maker suggest a factor of two increase in system throughput could be obtained by adding a GPU
- Work in progress to add GPU track following; the prototype also includes Calorimeter & Muon reconstruction

Tracking in the Phase II Trigger
Conditions: L = 7.5x10^34 cm^-2 s^-1, 200 pile-up events, ~10 times the current highest luminosity. Upgraded detector, including an upgraded trigger and a replacement tracker (ITK) with silicon strips and pixels and coverage extended to |η| < 4.
Trigger architecture:
- L0 (Muon, Calorimeter): 40 MHz collision rate -> 1 MHz
- L1 (Muon, Calorimeter, regional fast-tracking): -> 400 kHz
- Event Filter (Muon, Calorimeter, full-event fast-tracking, precision tracks in RoIs): -> 10 kHz to Offline Reconstruction & Storage
Tracking in the Phase II trigger:
- Hardware tracking in RoIs at up to 1 MHz, used to refine the L0 decision
- Hardware full-event tracking at ~100 kHz, providing primary-vertex information for pile-up suppression
- Precision tracking at the Event Filter, mainly in RoIs: near-offline-quality tracks to maximize selection efficiency
Note: L1 might be implemented in hardware or in software running at the Event Filter.
[Figure: schematic of a possible ITK layout]

Summary
The Trigger is a challenging environment for tracking: high rates, short times, and requirements of high efficiency and low fake rates.
Current solutions include:
- Partial event reconstruction: use of Regions of Interest
- Limiting combinations using the beam constraint and RoI information
- Multi-step reconstruction and selection: fast, lower-resolution tracking followed by slower, full-resolution tracking
- Hardware-based Fast Tracking based on Associative Memory and FPGAs
Future evolution:
- Evolution of the software framework to exploit concurrent execution of algorithms
- Evolution of algorithms to exploit internal parallelisation
- Possible use of accelerators such as GPGPUs: encouraging results obtained from a prototype
Phase-II is even more challenging and will exploit:
- Hardware-based tracking: full-event and regional
- CPU-based precision tracking in RoIs (with possible use of accelerators)
Holy Grail: full offline-quality tracking at L0 output rates (400 kHz - 1 MHz). We are looking for new ideas to bring us closer to this ideal.

Additional Material

ATLAS Trigger in Run 2 & 3

Trigger in LHC Run 1
- Level 1 Trigger: hardware-based (FPGA); latency < 2.5 μs; 40 MHz -> 70 kHz
- Level 2 Trigger: tracking mainly in Regions of Interest; tracking ~10 ms, total < 40 ms
- Event Building -> Event Filter: tracking ~1 s, total < 4 s; output ~600 Hz to Offline Reconstruction & Storage
[Figure: Run-1 dataflow schematic: Calo, Muon and Inner Detector readout drivers -> event fragments -> event building -> built events]

GPUs
Example of the complete L2 ID chain implemented on a GPU (Dmitry Emeliyanov). Conditions: tau RoI 0.6x0.6, ttbar events at 2x10^34.

Time per RoI (ms):
Step             C++ on 2.4 GHz CPU   CUDA on Tesla C2050   Speedup CPU/GPU
Data Prep.       27                   3                     9
Seeding          8.3                  1.6                   5
Seed ext.        156                  7.8                   20
Triplet merging  7.4                  3.4                   2
Clone removal    70                   6.2                   11
CPU<->GPU xfer   n/a                  0.1                   -
Total            268                  22                    12

Maximum speed-up: x26. Overall speed-up t(CPU)/t(GPU): 12.

The ATLAS Detector
[Figure: cut-away view of the ATLAS detector, y-z axes indicated]