The SLHC CMS L1 Pixel Trigger & Detector Layout Wu, Jinyuan Fermilab April 2006

Preference on Detector Layout
Pixel planes are expensive in terms of material, cost, data volume, power, cooling, etc. (the "C3" concerns: Cost, Cable, Cooling). If N layers of pixel detector planes are affordable, a normally spaced configuration like (b) is preferable at the data-analysis stage. Pattern recognition for (b) is more difficult, but from the BTeV work we know it is not as hard as we thought several years ago. [Figure: layout options (a) and (b).]

Brief History of Tracking
A long time ago, tracking was done by:
– finding 2-point candidates (doublets), and then
– finding the third point.
Before BTeV, it was known that a triplet can be found in one step. During BTeV, we learned how to do triplet finding in an FPGA quickly and cheaply (e.g., the Tiny Triplet Finder).

Circular Tracks from the Collision Point on Cylindrical Detectors
For a given hit on layer 3, the coincidence of a layer-2 hit and a layer-1 hit satisfying the coincidence map signifies a valid circular track. A track segment has 2 free parameters, i.e., it is defined by a triplet. The coincidence map is invariant under rotation. [Figure: coincidence map plotted in (φ1 − φ3) + 64 versus (φ2 − φ3) + 64.]
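As a side note (this derivation is not on the slide), the rotation invariance follows from the geometry of a circle of radius R passing through the beam line:

```latex
\phi(r) = \phi_0 \pm \arcsin\frac{r}{2R}
\;\;\Longrightarrow\;\;
\phi_1-\phi_3 = \pm\left(\arcsin\frac{r_1}{2R}-\arcsin\frac{r_3}{2R}\right),\quad
\phi_2-\phi_3 = \pm\left(\arcsin\frac{r_2}{2R}-\arcsin\frac{r_3}{2R}\right)
```

Both differences depend only on R (i.e., on pT) and the charge sign, not on φ0, so rotating all three hit patterns by the same amount leaves the coincidence map unchanged.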

Tiny Triplet Finder: Reuse Coincidence Logic via Shifting Hit Patterns
One set of coincidence logic is implemented. For an arbitrary hit on C3, rotate, i.e., shift, the hit patterns for C1 and C2 to search for a coincidence. [Figure: layers C1, C2, C3.]

Tiny Triplet Finder for Circular Tracks
[Block diagram: C1 and C2 hit coordinates scaled by R1/R3 and R2/R3, loaded into two bit-array shifters, combined in bit-wise coincidence logic; the triplet map output goes to a decoder.]
1. Fill the C1 and C2 bit arrays (n1 clock cycles).
2. Loop over the C3 hits, shift the bit arrays and check for coincidences (n3 clock cycles).
The scheme also works with more than 3 layers.
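A minimal software model of the shift-and-coincide step may help make the scheme concrete. It assumes 64 φ bins per layer, hit bins already pre-scaled (the R1/R3, R2/R3 blocks) so that the hits of a real track line up, a ±1-bin coincidence road, and made-up hit lists; it is a sketch of the idea, not the firmware:

```c
/* Toy model of the Tiny Triplet Finder shift-and-coincide step.
 * Assumptions (not from the slide): 64 phi bins per layer, hit bins
 * pre-scaled so that the three hits of a real track fall into the same
 * bin, and a +/-1 bin coincidence road.  Hit values are made up.      */
#include <stdint.h>
#include <stdio.h>

static uint64_t rotl64(uint64_t x, unsigned s) {   /* wrap-around in phi */
    s &= 63u;
    return s ? (x << s) | (x >> (64u - s)) : x;
}

int main(void) {
    const int hits1[] = {10, 33};   /* layer C1 hit bins (made up) */
    const int hits2[] = {10, 47};   /* layer C2 hit bins (made up) */
    const int hits3[] = {10, 55};   /* layer C3 hit bins (made up) */

    /* 1. Fill the C1 and C2 bit arrays (n1, n2 clock cycles in firmware). */
    uint64_t c1 = 0, c2 = 0;
    for (unsigned i = 0; i < 2; i++) c1 |= 1ULL << hits1[i];
    for (unsigned i = 0; i < 2; i++) c2 |= 1ULL << hits2[i];

    /* Fixed bit-wise coincidence logic: accept relative offsets -1, 0, +1. */
    const uint64_t road = (1ULL << 0) | (1ULL << 1) | (1ULL << 63);

    /* 2. Loop over C3 hits, shift both arrays by -phi3, then AND
     *    (n3 clock cycles in firmware, one seed per cycle).        */
    for (unsigned i = 0; i < 2; i++) {
        unsigned phi3 = (unsigned)hits3[i];
        uint64_t map = rotl64(c1, 64u - phi3) &
                       rotl64(c2, 64u - phi3) & road;
        if (map)   /* any surviving bit = triplet candidate; a decoder  */
            printf("triplet candidate seeded by C3 bin %u\n", phi3);
    }
    return 0;
}
```

In the FPGA the shift would be a barrel (logarithmic) shifter and the road the fixed bit-wise coincidence logic, so each C3 seed costs one clock cycle.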

Question: How can data from different layers be merged together?
The total data rate from the pixels at 10 cm is 3.125, 5 or 12 Gb/s/cm². Sending the full data over a large distance is difficult. (The good side of the stacked-layer ideas is the possibility of forming coincidences locally.) Difficult, yes, but there are several possibilities.

Possibility 1: Pre-trigger

From LHC to SLHC
The total L1 latency for the SLHC has been increased to 6.4 µs. The total L1 rate is kept the same (100 kHz). Consider a pre-trigger at 3.2 µs and use it to dump data from the pixels. Data-rate reduction: 1/80 or 1/40.
– Current LHC: BX = 40 MHz, L = 10^34; CMS L1: < 100 kHz, latency 3.2 µs.
– SLHC: BX = 80 MHz, L = 10^35; CMS pre-trigger? < 1 MHz at 3.2 µs; CMS L1: < 100 kHz, latency 6.4 µs.
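The quoted reduction is simply the ratio of the pre-trigger rate to the bunch-crossing rate:

```latex
\frac{1\ \mathrm{MHz}}{80\ \mathrm{MHz}} = \frac{1}{80},
\qquad
\frac{1\ \mathrm{MHz}}{40\ \mathrm{MHz}} = \frac{1}{40}.
```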

Sending Data to the Triplet Finder: The Pre-trigger
The ECAL (or any other subsystem) generates a coarse pre-trigger and sends it to global L1. The pre-trigger is distributed to all (or 1/2 or 1/4 of all) readout chips at 3.2 µs; the distribution lines are the original L1 trigger signal lines. The ROC output data and the tracker trigger generate the trigger primitives, and the L1 system makes the final global trigger decision (T1). Pre-triggered data stored in the tracker trigger during the second 3.2 µs are sent to HLT/DAQ. The ROC has a shorter pipeline in this operation mode. Worst case: two round trips; better if one round trip can be eliminated. [Timing diagram: ECAL pre-trigger and finer trigger, cable delays, L1 pre-trigger and L1 trigger distribution, ROC output and triplet trigger within 3.2 µs, then HLT/DAQ.]

Some Numbers
Assume the ECAL generates up to a 1 MHz pre-trigger with 3.2 µs latency. Use the hit rate of 4 hits/(1.28 cm)² per BX at R = 8 cm. Total data rate: 4 hits × 16 bits/hit × 1 MHz = 64 Mb/s. Assume each (1.28 cm)² ROC outputs 160 Mb/s over copper. [Diagram: same pre-trigger data path as the previous slide.]

R (cm) | hits/(1.28 cm)²/BX [Foudas] | data rate (Mb/s/ROC, 16 bits/hit) | # of 160 Mb/s Cu pairs/ROC | output capacity (Mb/s/ROC)
8      | 4                           | 64                                | 1 or 2                     | 160 or 320
…      | … [est.]                    | …                                 | …                          | …
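The arithmetic behind the table can be packaged as a small, purely illustrative helper; the hit rate, word size and link speed are the slide's assumptions, everything else is invented for illustration:

```c
/* Back-of-the-envelope ROC output bandwidth, reproducing the slide's
 * R = 8 cm example: 4 hits per (1.28 cm)^2 per pre-triggered crossing,
 * 16 bits per hit, 1 MHz pre-trigger rate, 160 Mb/s per copper pair.  */
#include <stdio.h>
#include <math.h>

int main(void) {
    const double hits_per_roc      = 4.0;     /* hits/(1.28 cm)^2/BX at R = 8 cm */
    const double bits_per_hit      = 16.0;
    const double pretrigger_hz     = 1.0e6;   /* <= 1 MHz ECAL pre-trigger */
    const double pair_capacity_bps = 160.0e6; /* one Cu pair */

    double rate_bps = hits_per_roc * bits_per_hit * pretrigger_hz;
    double pairs    = ceil(rate_bps / pair_capacity_bps);

    printf("ROC output rate : %.0f Mb/s\n", rate_bps / 1e6);   /* 64 Mb/s */
    printf("160 Mb/s pairs  : %.0f\n", pairs);                 /* 1 pair  */
    return 0;
}
```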

Possibility 1+: Pre-trigger + stacked layers for high-pT tracks

High-pT Doublet Finding, If Needed
The system supports both the ECAL pre-trigger mode and a high-pT doublet-finding mode. The ROCs at 300 mm and 295 mm communicate with each other, and high-pT doublets are found in the ROC. The doublets define the search windows on the 200 mm and 100 mm layers, and only hits inside the windows are enabled for readout. One set of stacked layers is needed, rather than 3. [Diagram: ECAL and ROCs at R = 300, 295, 200 and 100 mm feed the triplet finder and readout; the 50 mm ROC is readout only; results go to L1 and HLT/DAQ.]
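A sketch of the doublet search in software terms; the ±2-column window and the hit lists are invented for illustration, and only the idea (a narrow φ coincidence between the 295 mm and 300 mm ROCs selects high-pT seeds) comes from the slide:

```c
/* Sketch of high-pT doublet finding between the stacked layers at
 * R = 295 mm and R = 300 mm.  A small column offset between the two
 * hits means small curvature, i.e. high pT.  The +/-2 column window
 * and the hit lists are hypothetical.                                */
#include <stdio.h>
#include <stdlib.h>

#define PT_WINDOW 2   /* maximum accepted column offset (assumption) */

int main(void) {
    const int cols295[] = {120, 340, 701};   /* hit columns, inner layer */
    const int cols300[] = {121, 355, 699};   /* hit columns, outer layer */
    const int n295 = 3, n300 = 3;

    for (int i = 0; i < n300; i++) {
        for (int j = 0; j < n295; j++) {
            if (abs(cols300[i] - cols295[j]) <= PT_WINDOW) {
                /* High-pT doublet: in the real system this would open a
                 * search window on the 200 mm and 100 mm layers and
                 * enable the corresponding hits for readout.            */
                printf("high-pT doublet: col %d (300 mm) / col %d (295 mm)\n",
                       cols300[i], cols295[j]);
            }
        }
    }
    return 0;
}
```

The nested loop is only for clarity; the later slides describe how the same coincidence is formed with a single copy of the logic inside the ROC.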

Stacked Layers: 1 mm or 5 mm Separation?
– Pixel pitch: Δu in φ, Δv in z.
– Layer separation: (r2 − r1).
– Measurement errors: δφ = Δu/(r2 − r1), δλ = Δv/(r2 − r1).
– Power consumption: P = P0·A/(Δu·Δv).
– Therefore: P·δφ·δλ = P0·A/(r2 − r1)².
When the layer separation increases from 1 mm to 5 mm, P·δφ·δλ is reduced by a factor of 25.

Separation                     | 1 mm                    | 5 mm
Pixel pitch                    | 20 µm (φ) × 200 µm (z)  | 50 µm (φ) × 200 µm (z)
δφ                             | 20 mrad                 | 10 mrad
δλ                             | 200 mrad                | 40 mrad
Power (P0 = 10 µW/pixel)       | 2.5 kW/m²               | 1 kW/m²
Shared mech. support & cooling | ?                       | Yes
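For reference, the factor of 25 follows directly from the quadratic dependence on the layer separation:

```latex
P\,\delta\phi\,\delta\lambda = \frac{P_0 A}{(r_2-r_1)^2}
\quad\Longrightarrow\quad
\frac{(P\,\delta\phi\,\delta\lambda)_{1\,\mathrm{mm}}}
     {(P\,\delta\phi\,\delta\lambda)_{5\,\mathrm{mm}}}
 = \left(\frac{5\ \mathrm{mm}}{1\ \mathrm{mm}}\right)^{2} = 25.
```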

Straw-Man Stacked Layers (r-z View)
The two stacked layers share the same mechanical support and cooling layer. The ROCs in the two layers overlap each other in the z direction; hits from the 1/4 of the chip at each end are sent to the opposite chip for the coincidence. Question: is overlap needed in the φ direction as well? [Figure: sensor / readout chip / mechanical support, cooling, interconnection / readout chip / sensor stack, with seeding hits and the coincidence range indicated.]

Straw-Man Readout Chip: Back End
[Block diagram: column logic & zero suppression feed a pipeline with 6.4 µs, 3.2 µs and 1.0 µs taps and a high-pT segment correlation block; CS10/HDA and CS10A/HDB links go from/to the stacked-layer ROC; outputs DOUT, CS64 and CS32; T1orPT comes from L1.]

Pipeline and CS32/64
The hit data are stored in the pipeline. After 3.2 µs, when the pre-trigger arrives (signal T1orPT), the ROC sends data out for the triplet trigger. After 6.4 µs, when the L1 arrives, the ROC sends out the data of that BX. [Diagram: same pipeline block as the previous slide, with DOUT, CS64 and CS32 outputs and T1orPT from L1.]
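A toy model of the two readout taps; the depths (256 and 512 bunch crossings) follow from the 3.2 µs and 6.4 µs latencies at an 80 MHz crossing rate, while the record format and interface are invented for illustration:

```c
/* Toy model of the ROC pipeline with two taps: the pre-trigger (T1orPT)
 * reads the crossing stored 3.2 us (256 BX at 80 MHz) earlier for the
 * triplet trigger, the final L1 reads the crossing stored 6.4 us
 * (512 BX) earlier for readout.  Taps are read at the start of a
 * crossing, before that crossing's own data are written.              */
#include <stdint.h>
#include <stdio.h>

#define DEPTH       512   /* 6.4 us at 80 MHz */
#define PT_LATENCY  256   /* 3.2 us at 80 MHz */

typedef struct { uint16_t hits[4]; uint8_t nhits; } BxRecord;

static BxRecord pipeline[DEPTH];
static unsigned wr;                    /* write pointer, one step per BX */

void new_bx(const BxRecord *rec) {     /* store this crossing's hits     */
    pipeline[wr] = *rec;
    wr = (wr + 1) % DEPTH;
}

/* Pre-trigger tap: the crossing 256 BX before the current one. */
BxRecord read_pretrigger_tap(void) {
    return pipeline[(wr + DEPTH - PT_LATENCY) % DEPTH];
}

/* L1 tap: the crossing 512 BX before the current one (the oldest slot,
 * read just before it is overwritten by the current crossing).         */
BxRecord read_l1_tap(void) {
    return pipeline[wr];
}

int main(void) {
    for (unsigned bx = 0; bx < 1000; bx++) {
        if (bx == 300 + PT_LATENCY)   /* pre-trigger for BX 300, 3.2 us later */
            printf("pre-trigger tap: %u hit(s)\n",
                   (unsigned)read_pretrigger_tap().nhits);
        if (bx == 300 + DEPTH)        /* L1 accept for BX 300, 6.4 us later   */
            printf("L1 tap:          %u hit(s)\n",
                   (unsigned)read_l1_tap().nhits);

        BxRecord r = { {0, 0, 0, 0}, 0 };
        if (bx == 300) { r.hits[0] = 0x123; r.nhits = 1; }  /* a hit at BX 300 */
        new_bx(&r);
    }
    return 0;
}
```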

High-pT Correlation
The OR-AND coincidence logic accepts high-pT doublets. Setting the bit-enable register changes the pT cut and corrects for offsets in the pixel alignment. In the real implementation the OR gate is replaced by a priority encoder. [Figure: bit-enable register between Plane A and Plane B.]
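A sketch of the OR-AND coincidence with a bit-enable window; the 128-column ROC width and the function interface are assumptions, and only the principle (widening or shifting the enabled window loosens the pT cut and absorbs alignment offsets) comes from the slide:

```c
/* Sketch of the OR-AND coincidence: a Plane A hit is accepted if any
 * Plane B hit lies inside the enabled window around it.  The column
 * count and the window interface are hypothetical.                    */
#include <stdint.h>
#include <stdbool.h>
#include <stdio.h>

#define NCOL 128                       /* columns per ROC (assumption) */

/* Returns true if any Plane B hit lies inside the enabled window around
 * the Plane A column.  Plane A's own hit is the implicit AND input: the
 * function is only called for Plane A columns that actually fired.      */
bool high_pt_doublet(int colA, const uint8_t plane_b[NCOL],
                     int win_lo, int win_hi) {
    uint8_t or_of_window = 0;
    for (int d = win_lo; d <= win_hi; d++) {          /* the OR stage */
        int c = colA + d;
        if (c >= 0 && c < NCOL)
            or_of_window |= plane_b[c];
    }
    return or_of_window != 0;
}

int main(void) {
    uint8_t plane_b[NCOL] = {0};
    plane_b[66] = 1;                        /* a hit on the other layer */
    printf("col 65: %s\n", high_pt_doublet(65, plane_b, -2, 2) ? "doublet" : "no");
    printf("col 90: %s\n", high_pt_doublet(90, plane_b, -2, 2) ? "doublet" : "no");
    return 0;
}
```

Replacing the OR with a priority encoder, as the slide notes, additionally records which Plane B column fired; the next slide uses that as the track-angle output.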

1 Copy, Not 256 Copies, in the Real Implementation
Some designs may use N copies of the coincidence logic (N = 256 here); the design here uses 1 copy. Note that Plane A is local to the ROC while Plane B is in another ROC; the data from Plane B are the column coordinates of its hits. The priority-encoder output represents the track angle. [Figure: Plane A and Plane B connected through a logarithmic shifter and a priority encoder.]

About This Work
It is extremely interesting since the detector is still at the layout stage; there are not many chances to work at this stage in one's lifetime. Simulation, simulation, simulation. Time is tight (TDR around '07-'08).

The End Thanks

Analysis
Track reconstruction:
– impact parameter,
– transverse momentum.
Fake-track rejection. Compare configurations (a) and (c) when the silicon strip tracker data are also included. [Figure: layout options (a), (b) and (c).]