The ATLAS Trigger System

The ATLAS Trigger System
Cristobal Padilla (IFAE-Barcelona/CERN), on behalf of the ATLAS Collaboration
RT2009 Conference, Beijing (China), 14 May 2009

Outline:
1. Global requirements for the ATLAS trigger system
2. Trigger/DAQ architecture
3. LVL1 system
4. HLT system
5. Some results from LHC running
6. Summary


ATLAS Detector
Length 45 m, diameter 24 m, weight 7000 t. (Image: ATLAS superimposed on the five floors of building 40.)

Global Requirements of Trigger Systems at ATLAS
- At full luminosity, the LHC will produce:
  - Signal: 100 GeV Higgs: 0.1 Hz; 600 GeV Higgs: 0.01 Hz; SUSY: below the Hz level; interesting b events: tenths of Hz
  - Background: inelastic: 10^9 Hz; jets with E_T > 100 GeV: kHz level; μ from π, K, D, B decays; b events: 250 kHz
- Environment at the LHC:
  - Number of overlapping events per 25 ns: 23
  - Number of particles in ATLAS per 25 ns: 1400
- Need rejection powers of order up to 10^13

Global Requirements of Trigger Systems at ATLAS
- ATLAS has O(10^8) channels; the event size is ~1.6 MB
- At the ~30 MHz LHC collision rate, that would imply a ~50 TB/s data throughput; the estimated cost of storage would be ~ MCHF
- Need to reduce the throughput to an affordable level of ~200 Hz
- Assuming reconstruction takes 1 s/event, ~10^7 CPU cores would be needed to complete a processing cycle in a year; the CPU needs must be reduced by three orders of magnitude
- Need a trigger based on custom electronics to reduce the rate input to the computer farms (which constitute the higher levels of the trigger), then a CPU farm to reduce the event rate to the affordable storage rate
- The ATLAS trigger system is required to work at higher bandwidths and trigger rates than previous similar experiments
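The back-of-envelope numbers above can be reproduced in a few lines (a sketch in Python; the event size, collision rate, and output rate are the values quoted on this slide):

```python
# Back-of-envelope trigger requirements from the slide's numbers.
EVENT_SIZE_MB = 1.6          # ~1.6 MB per event
COLLISION_RATE_HZ = 30e6     # ~30 MHz LHC collision rate
STORAGE_RATE_HZ = 200        # affordable output rate

raw_throughput_tb_s = COLLISION_RATE_HZ * EVENT_SIZE_MB / 1e6  # MB/s -> TB/s
rejection = COLLISION_RATE_HZ / STORAGE_RATE_HZ

print(f"raw throughput: ~{raw_throughput_tb_s:.0f} TB/s")  # ~48 TB/s (quoted as ~50)
print(f"overall rate reduction: {rejection:.1e}x")
```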

Trigger/DAQ Architecture
- LVL1 (custom hardware; FE pipelines, latency 2.5 μs): 40 MHz collision rate in, LVL1 accept rate 75 kHz; Read-Out Drivers (RODs) send data over Read-Out Links into the Read-Out Buffers (ROBs) of the Read-Out Sub-systems (ROS) at 120 GB/s
- LVL2 (RoI Builder, L2 Supervisor, L2 network, L2 Processing Units; ~10 ms/event): works on RoI data only (RoI data = 1-2% of the event, ~3 GB/s of RoI requests); LVL2 accept rate ~2 kHz
- Event Building (Event Builder network, Dataflow Manager, Sub-Farm Input): assembles accepted events at ~3 kHz, ~4 GB/s
- Event Filter (Event Filter network, Event Filter Processors; ~1 s/event): EF accept rate ~0.2 kHz; the Sub-Farm Output writes selected events to storage at ~300 MB/s (~200 Hz)
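The rate-reduction chain in the diagram can be summarized with a small sketch (Python; the rates are the ones quoted in the diagram, using the ~2 kHz LVL2 accept figure):

```python
# Per-level rejection factors for the ATLAS trigger chain.
levels = [
    ("LVL1 input",  40e6),   # 40 MHz bunch-crossing rate
    ("LVL1 accept", 75e3),   # 75 kHz
    ("LVL2 accept", 2e3),    # ~2 kHz
    ("EF accept",   200.0),  # ~0.2 kHz to storage
]
rejections = {}
for (hi, r_hi), (lo, r_lo) in zip(levels, levels[1:]):
    rejections[lo] = r_hi / r_lo
    print(f"{hi} -> {lo}: rejection ~{r_hi / r_lo:.0f}x")
```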

Region of Interest: Why?
- The Level-1 selection is dominated by local signatures, based on coarse granularity and with no access to inner tracking
- Important further rejection can be gained with local matching of full detector data
Region of Interest: Implementation
- The geographical addresses of interesting signatures identified by LVL1 (Regions of Interest) allow access to the local data of each relevant detector, sequentially
- Normally there is ONE RoI per event accepted by LVL1; the average number of RoIs per event is ~1.6 (the 4-RoI example shown is atypical)
- The resulting total amount of RoI data is minimal: a few % of the Level-1 throughput
- The ATLAS RoI-based Level-2 trigger therefore needs a ReadOut network about one order of magnitude smaller, at the cost of higher control traffic
- There is a simple correspondence region → ROB number(s) for each detector, so for each RoI the LVL2 processors quickly identify the list of ROBs with the corresponding data from each detector
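A quick sketch (Python) of why RoI-based readout shrinks the LVL2 network, using the slide's numbers (1.6 MB events, 75 kHz LVL1 accept rate, and the upper end of the 1-2% RoI data fraction):

```python
# Bandwidth needed by LVL2 with and without RoI-based readout.
EVENT_SIZE_B = 1.6e6       # ~1.6 MB per event
LVL1_ACCEPT_HZ = 75e3      # LVL1 accept rate
ROI_FRACTION = 0.02        # RoI data: 1-2% of the event (upper end used here)

full_readout_b_s = EVENT_SIZE_B * LVL1_ACCEPT_HZ    # if LVL2 pulled full events
roi_readout_b_s = full_readout_b_s * ROI_FRACTION   # pulling RoI data only

print(f"full events at the LVL1 rate: {full_readout_b_s/1e9:.0f} GB/s")  # 120 GB/s
print(f"RoI data only: {roi_readout_b_s/1e9:.1f} GB/s")                  # ~2.4 GB/s
```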

First Level Trigger: LVL1
- Three sub-systems: L1 Calorimeter, L1 Muon, and the Central Trigger Processor (CTP)
- Performs signature identification: e/γ, τ/h, jets, μ; multiplicities per p_T threshold; isolation criteria; missing E_T, total E_T, jet E_T
- CTP: receives and synchronizes the trigger information and sends the L1 decision to L2 and to the detectors
(Diagram: LAr and Tile feed the Preprocessor, then the Cluster Processor (e/γ, τ/h) and the Jet/Energy Processor (jets, E_T); RPC and TGC feed the Muon Barrel and Muon Endcap Triggers, then the Muon-CTP Interface; everything feeds the Central Trigger Processor, which connects to the RoIB, the L2 supervisor, and the detector readout)

LVL1 Calorimeter Architecture
- Pre-processors (PPrs):
  - Receive and sample signals from the calorimeters
  - Use coarse granularity (trigger towers)
  - Perform noise filtering
  - Perform the bunch-crossing identification (BCID) of the signals
  - Determine the final E_T value
- Jet/Energy and Cluster Processors:
  - Physics algorithms that search for and identify isolated leptons, taus, and jets
  - Compute E_T, total energy, and missing E_T of the clusters
- The LVL1 calorimeter system transmits signals to the CTP, the DAQ, and the RoI Builder

Installation and Commissioning of the LVL1 Calorimeter
- The system has been fully installed since the end of 2007
- Commissioning: the hardware/software has been tested, calibration procedures have been exercised with the calorimeters, and the system participates in data taking
(Photos: 1 of 8 pre-processor crates; 1 of 4 cluster processor crates; analogue input cables; 1 of 3 Read-Out Driver crates with the fibres installed)

LVL1 Muon
- The LVL1 muon system uses dedicated chambers providing very fast signals: Resistive Plate Chambers (RPC) in the barrel region and Thin Gap Chambers (TGC) in the end-cap region
- The muon momentum is estimated from the deviation of the track from one of assumed infinite momentum, by looking at hit coincidences inside so-called coincidence windows
- It is a fast, highly redundant system with a wide p_T threshold range, separated into systems for low and high p_T thresholds
- The main issues are safe identification (within 25 ns), which requires good time resolution, and precise time alignment of the signals that form the muon coincidences
- The system has been installed and, thanks to its highly programmable logic, has been widely used during the cosmic-ray runs
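The coincidence-window idea can be sketched as follows (Python; the coordinates and window width are invented for illustration, not actual chamber parameters): a hit in a second plane must lie within a window around the straight-line, infinite-momentum extrapolation of the first hit, so stiffer (higher-p_T) tracks pass narrower windows.

```python
def passes_window(pivot_hit: float, confirm_hit: float, half_width: float) -> bool:
    """True if the confirm-plane hit lies inside the coincidence window
    centred on the infinite-momentum extrapolation of the pivot hit."""
    return abs(confirm_hit - pivot_hit) <= half_width

# A stiff (high-pT) track deviates little from the straight line;
# a soft (low-pT) track bends more and falls outside the narrow window.
print(passes_window(1.00, 1.01, half_width=0.03))  # True
print(passes_window(1.00, 1.10, half_width=0.03))  # False
```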

Central Trigger Processor (CTP)
- Receives trigger information from the calorimeter and muon detectors (CTPIN)
- Synchronizes and aligns the inputs
- Forms the Level-1 trigger decision from the inputs, the trigger menu, and the prescale/pass-through set (CTPCore)
- Generates deadtime and handles busy signals and calibration requests from sub-detectors
- Passes the trigger decision and regions of interest to Level 2 (CTPOUT)
- Reports to the Trigger and Timing Control (TTC) system
- Receives timing signals from the LHC (CTPMI)
- Monitors trigger information bunch-by-bunch (CTPMON)
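A minimal sketch (Python) of the decision logic described above: menu items combine input multiplicities with thresholds, and a prescale keeps only one in every N raw accepts. The item names, conditions, and prescale values are invented for illustration, not the real ATLAS menu or CTP firmware.

```python
import itertools

class PrescaledItem:
    """One menu item: a condition on the trigger inputs plus a prescale."""
    def __init__(self, name, condition, prescale):
        self.name, self.condition, self.prescale = name, condition, prescale
        self._counter = itertools.count()

    def fires(self, inputs) -> bool:
        if not self.condition(inputs):
            return False
        # keep 1 out of every `prescale` raw accepts
        return next(self._counter) % self.prescale == 0

menu = [
    PrescaledItem("L1_MU6",  lambda x: x["mu6_multiplicity"] >= 1, prescale=1),
    PrescaledItem("L1_2EM5", lambda x: x["em5_multiplicity"] >= 2, prescale=10),
]

event = {"mu6_multiplicity": 0, "em5_multiplicity": 2}
accepted = [item.name for item in menu if item.fires(event)]
print(accepted)  # ['L1_2EM5'] -- the first raw accept survives its prescale
```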

Data Flow for the High Level Trigger (HLT)
(Diagram) The RoI Builder sends the LVL1 result to the LVL2 Supervisor, which assigns the event to an L2 Processing Unit (L2PU); the L2PU sends data requests to the ROS, receives the RoI event fragments, runs the LVL2 selection, and returns the LVL2 decision. On accept, Event Building (SFI) collects the full event fragments from the ROS; the Event Filter (EFD) runs the EF selection on the full event (including the LVL2 result), and the selected full event plus EF result is written out through the SFO. The configuration is read from the Database.

HLT Farms
- 850 PCs installed: a total of 27 XPU racks (1 rack = 31 PCs), about 35% of the final system (XPU = can be connected to either L2 or EF)
- Per node: 2 × Intel Harpertown quad-core CPUs at 2.5 GHz (8 cores); RAM: 2 GB/core, i.e. 16 GB
- Final size for the maximum L1 rate: ~500 PCs for L2 + ~1800 PCs for EF (multi-core technology)
- Final system: 17 L2 racks plus the EF racks, with 28 (of 79) racks as XPU

How the HLT Works
- Example: two isolated e.m. clusters, each with p_T > 20 GeV, are found by LVL1 (EM20i): a possible signature for Z → e+e-
- Goal: validate step by step, check intermediate signatures, and reject at the earliest possible moment
- Sequential call of algorithms, seeded by the Level-1 RoI:
  - Step 1: cluster shape → signature "ecand"
  - Step 2: track finding → signature "e"
  - Step 3: p_T > 30 GeV → signature "e30"
  - Step 4: isolation → signature "e30i"
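The step-wise validation with early reject can be sketched like this (Python; the hypothesis functions and candidate fields are illustrative stand-ins for the real algorithms):

```python
def run_chain(candidate, steps):
    """Run hypothesis steps in order; return (passed, last_signature).
    The chain stops at the first failing step (early reject), so the
    later, more expensive algorithms never run for rejected candidates."""
    signature = "EM20i"  # Level-1 seed
    for name, hypo in steps:
        if not hypo(candidate):
            return False, signature
        signature = name
    return True, signature

steps = [
    ("ecand", lambda c: c["cluster_shape_ok"]),  # Step 1: cluster shape
    ("e",     lambda c: c["has_track"]),         # Step 2: track finding
    ("e30",   lambda c: c["pt"] > 30.0),         # Step 3: pT > 30 GeV
    ("e30i",  lambda c: c["isolated"]),          # Step 4: isolation
]

good = {"cluster_shape_ok": True, "has_track": True, "pt": 42.0, "isolated": True}
bad  = {"cluster_shape_ok": True, "has_track": False, "pt": 42.0, "isolated": True}
print(run_chain(good, steps))  # (True, 'e30i')
print(run_chain(bad, steps))   # (False, 'ecand') -- rejected at track finding
```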

Algorithm Data Access
- The HLT algorithm passes a region list to the Region Selector, which returns the corresponding DetElem IDs; these map to ROB IDs
- The Byte Stream Converter requests the raw event data (organized by ROB) from the data source and fills the Transient Event Store with the corresponding DetElems, which the algorithm then accesses
- Realistic data access (in ByteStream format) is vital for realistic modelling of trigger performance
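The region → DetElem → ROB lookup can be sketched as follows (Python; the element positions, ID scheme, and window size are invented for illustration):

```python
DETELEM_TO_ROB = {            # illustrative DetElem-ID -> ROB-ID table
    "pix_00": 0x11, "pix_01": 0x11,
    "sct_00": 0x21, "sct_01": 0x22,
}
ELEMENT_POSITIONS = {         # illustrative DetElem centres in (eta, phi)
    "pix_00": (0.05, 1.0), "pix_01": (0.40, 1.1),
    "sct_00": (0.10, 1.0), "sct_01": (1.50, 2.0),
}

def region_select(eta, phi, half_eta=0.25, half_phi=0.25):
    """Return the DetElem IDs whose centre lies inside the RoI window."""
    return sorted(
        de for de, (e, p) in ELEMENT_POSITIONS.items()
        if abs(e - eta) <= half_eta and abs(p - phi) <= half_phi
    )

def robs_for_region(eta, phi):
    """Deduplicated list of ROB IDs to request for an RoI."""
    return sorted({DETELEM_TO_ROB[de] for de in region_select(eta, phi)})

print(region_select(0.1, 1.0))   # ['pix_00', 'sct_00']
print(robs_for_region(0.1, 1.0)) # [17, 33]
```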

Trigger Database
- Contains the trigger configuration: active trigger chains, algorithm parameters, prescale factors, pass-through fractions
- A relational database (TriggerDB) with no duplication of objects; four database keys: LVL1 & HLT menu, L1 prescales, HLT prescales, bunch number
- A user interface (TriggerTool) reads and writes menus in XML format and performs menu consistency checks
- Experience from the 2-month cosmic commissioning: over 3k chains, 6k components (algorithms, tools, services), 5k parameters (counting all versions of all objects)
- After a run, the trigger configuration becomes conditions data
- During a run, the complete menu can be changed at any run stop/start; prescales and pass-throughs can change at any luminosity-block boundary
- A database proxy mechanism is in place to avoid direct connections from every application
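The key-based configuration idea can be sketched like this (Python; the keys and stored contents are illustrative, not the TriggerDB schema): the same menu key combined with a different prescale key yields the same chains with different prescales, so prescale changes never duplicate menu objects.

```python
CONFIGS = {
    # (menu_key, l1_prescale_key, hlt_prescale_key) -> configuration snapshot
    (101, 5, 7): {"chains": ["L2_mu10", "EF_e15i"], "prescale": {"L2_mu10": 1}},
    (101, 5, 8): {"chains": ["L2_mu10", "EF_e15i"], "prescale": {"L2_mu10": 20}},
}

def load_config(menu_key, l1_ps_key, hlt_ps_key):
    """Resolve a full configuration from its keys (raises if unknown)."""
    return CONFIGS[(menu_key, l1_ps_key, hlt_ps_key)]

# Changing only the HLT prescale key reuses the same menu with new prescales:
print(load_config(101, 5, 7)["prescale"]["L2_mu10"])  # 1
print(load_config(101, 5, 8)["prescale"]["L2_mu10"])  # 20
```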

Excitement in the ATLAS Detector Control Room: the first LHC event on 10th September 2008

First Event in ATLAS
The very first beam-splash event from the LHC in ATLAS, at 10:19 on 10th September 2008 (online and offline displays)

First Circulating Beams in ATLAS
- During the circulating beams, the ATLAS trigger used the Minimum Bias Trigger Scintillators (MBTS) and the beam pick-up (BPTX)
- Beam becoming unstable: the MBTS, initially quiet, becomes more active after several runs; at the end, the beam pick-up no longer sees the beam while the MBTS still fires

Timing-in the Trigger with Single Beams
- The experiment timing is based on the beam-pickup ("BPTX") reference
- The first task of the LVL1 central trigger team on 10th September was to commission the beam pickups
- The times of arrival of the other triggers were then adjusted to match
- The plots show the evolution from September 10th to September 12th (note the change of scale!)
- Each LVL1 sub-system also needs to be timed in internally: L1-Calo, L1-RPC, L1-TGC, MBTS, etc.

Summary
- The challenging LHC data-taking environment determines the design of the ATLAS trigger architecture: data-reduction factors of more than five orders of magnitude and very fast, reliable algorithms are needed
- To select the events that contain interesting physics efficiently, the ATLAS trigger combines:
  - A first-level trigger based on custom electronics with fixed latency
  - Two software-based higher-level triggers
  - A Region-of-Interest mechanism: a smaller readout network, at the cost of more complex data-control traffic software
  - Wide use of offline-based software for event selection
- The complete system is ready for LHC operation in Fall 2009:
  - The hardware equipment is installed
  - Commissioning with cosmic rays has been widely exploited (see other talks)
  - The system was successfully exercised in the single-beam runs in 2008

Cosmic Event Triggered by the Jet and Tau Triggers

Cosmic Data

Event Display of a Cosmic Muon (Tile calorimeter, L1Calo hadronic layer, EM calorimeter)

Trigger/DAQ Architecture (backup slide: the Trigger/DAQ architecture diagram shown earlier)

HLT Hardware Installation
- Currently 824 nodes in 27 racks
- 8 cores/node (2 × Harpertown quad-core at 2.5 GHz)
- 2 GB memory/core
- Each node can run L2 or EF

HLT Structure
The basic ingredient of the HLT is the HLT Steering, which controls the flow of code execution inside the HLT processing units:
- Algorithms for feature extraction (FEX) and for applying requirements (HYPO), configurable by parameters; results (features) are cached by the HLT Steering
- Sequences: FEX and HYPO algorithms producing TriggerElements (e.g. L2_mu10)
- Chains: ordered lists of defined numbers of TriggerElements; the Steering aborts a chain as soon as a given step fails (early reject)
- Menu: the collection of chains (plus pass-throughs and prescales), written in Python or XML and recorded in the database
(Diagram: an EM RoI seeds L2 calorimetry (TrigEMCluster: "Cluster?"), L2 tracking (TrigInDetTracks: "Track?", "Match?"), EF calorimetry (CaloCluster), EF tracking (EFTrack), and e/γ reconstruction (egamma: "e OK?", "γ OK?"), with FEX and HYPO algorithms at each step)
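Feature caching by the Steering can be sketched as follows (Python; the class and algorithm names are invented): when two chains, say an electron and a photon chain, request the same FEX result on the same RoI, the algorithm runs only once.

```python
class SteeringCache:
    """Caches FEX results keyed by (algorithm, RoI), as the HLT Steering
    caches features so shared algorithms are not re-executed."""
    def __init__(self):
        self._features = {}
        self.fex_calls = 0

    def get_feature(self, fex_name, roi, fex):
        key = (fex_name, roi)
        if key not in self._features:
            self.fex_calls += 1          # the algorithm actually executes
            self._features[key] = fex(roi)
        return self._features[key]

cache = SteeringCache()
cluster_fex = lambda roi: {"et": 35.0, "roi": roi}

# Two chains share the same RoI and the same calorimeter FEX:
f1 = cache.get_feature("L2CaloFex", roi=(0.1, 1.2), fex=cluster_fex)
f2 = cache.get_feature("L2CaloFex", roi=(0.1, 1.2), fex=cluster_fex)
print(cache.fex_calls)  # 1 -- the second chain reused the cached result
print(f1 is f2)         # True
```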

Trigger Towers (TT)
- Analogue summation of calorimeter cells (LAr and Tile, semi-projective segmentation) into 3584 × 2 (EM + HAD) trigger towers
- Granularity in Δη × Δφ: 0.1 × 0.1 for |η| < 2.5, with coarser towers at larger |η| (trigger-towers map)

Pre-Processors – Energy Reconstruction
- Receivers (Rx): input signal conditioning to L1 (2.5 V ≙ 250 GeV); variable-gain amplifier (VGA); E → E_T conversion (hadronic layers only); local signal monitoring
- Sampling: 40 MHz flash ADC, 10 bits; 1 ADC count = 250 MeV; pedestal ≈ 40 ADC counts
- Bunch-crossing identification (BCID): finite-impulse-response (FIR) filter with coefficients a0...a4, followed by a peak finder (linear/saturated) that assigns E_T to the 'correct' bunch crossing
- E_T calibration look-up table (LUT): pedestal subtraction, noise suppression (threshold E_thres), ADC (10 b) → GeV (8 b) conversion
- Transmission to the processors & DAQ
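The FIR-filter + peak-finder BCID can be sketched like this (Python; the filter coefficients and pulse samples are illustrative, not calibrated values): the pedestal-subtracted samples are convolved with fixed coefficients, and the bunch crossing where the filtered value is a local maximum is assigned the energy.

```python
def fir(samples, coeffs):
    """FIR output per bunch crossing (zero-padded at the edges)."""
    n, k = len(samples), len(coeffs)
    half = k // 2
    out = []
    for i in range(n):
        acc = 0
        for j, c in enumerate(coeffs):
            idx = i + j - half
            if 0 <= idx < n:
                acc += c * samples[idx]
        out.append(acc)
    return out

def peak_bc(filtered):
    """Bunch crossings where the FIR output is a local maximum."""
    return [i for i in range(1, len(filtered) - 1)
            if filtered[i - 1] < filtered[i] >= filtered[i + 1]]

pedestal = 40
adc = [40, 42, 80, 150, 90, 45, 40]   # a pulse on top of the pedestal
coeffs = [1, 4, 9, 5, 1]              # illustrative a0..a4
filtered = fir([s - pedestal for s in adc], coeffs)
print(peak_bc(filtered))  # [3] -> the peak sample identifies the crossing
```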

Regions of Interest (RoI)
- The processor input is a matrix of tower energies; the algorithms look for physics signatures with a sliding window; RoIs are sent to the Level-2 trigger
- Cluster Processor criteria for an e/γ or τ/h candidate:
  - EM or hadronic cluster above the E_T threshold
  - Total E_T in the EM isolation ring below the EM isolation threshold
  - Total E_T in the hadronic isolation ring below the hadronic isolation threshold
  - Local E_T maximum compared to the neighbouring windows
  - e/γ only: hadronic core below the core isolation threshold
- Jet/Energy-sum Processor:
  - Jet candidates use coarser granularity: 0.2 × 0.2 jet elements, digital summation of EM + HAD
  - Sliding, overlapping windows (3 sizes)
  - Missing energy from ECAL + HCAL
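The sliding-window local-maximum search at the heart of these algorithms can be sketched as follows (Python; the grid values, window size, and tie-breaking are illustrative, and the real system uses a specific convention to avoid double counting): a window is an RoI candidate if its summed E_T exceeds the threshold and is a local maximum among the overlapping neighbouring windows.

```python
def window_sum(grid, i, j, w=2):
    """Summed ET of the w x w window whose top-left tower is (i, j)."""
    return sum(grid[a][b] for a in range(i, i + w) for b in range(j, j + w))

def find_rois(grid, threshold, w=2):
    """Windows above threshold that are local maxima vs their neighbours."""
    n, m = len(grid), len(grid[0])
    sums = {(i, j): window_sum(grid, i, j, w)
            for i in range(n - w + 1) for j in range(m - w + 1)}
    rois = []
    for (i, j), s in sums.items():
        if s <= threshold:
            continue
        neighbours = [sums[(a, b)] for a in range(i - 1, i + 2)
                      for b in range(j - 1, j + 2)
                      if (a, b) in sums and (a, b) != (i, j)]
        if all(s >= x for x in neighbours):   # local maximum (ties kept)
            rois.append((i, j))
    return rois

grid = [[0, 0, 0, 0],
        [0, 5, 9, 0],
        [0, 3, 2, 0],
        [0, 0, 0, 0]]
print(find_rois(grid, threshold=10))  # [(1, 1)] -- one RoI, not three
```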