A demonstrator for a level-1 trigger system based on μTCA technology and 5 Gb/s optical links
Greg Iles; Rob Frazier, Dave Newbold (Bristol University); Costas Foudas*, Geoff Hall, Jad Marrouche, Andrew Rose (Imperial College)
20 September 2010
* Recently moved to the University of Ioannina

Slide 2: CMS Calorimeter Trigger
[Diagram: Hadron Calorimeter and Electromagnetic Calorimeter feeding the Regional Calo Trigger and the Global Calo Trigger, with data rates of 5-6 Tb/s and 0.3 Tb/s]
Main task: find jets and electrons from energy depositions inside the calorimeters, sort them in order of importance and send the 4 most significant to the global trigger (a toy illustration of this final sort follows below).
Potential benefits from an upgrade:
– The trigger is effectively an image processor. Better resolution?
– Extra physics quantities?
– Better reliability?
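To make the "sort and send the 4 most significant" step concrete, here is a minimal C++ sketch (the firmware itself is VHDL) that keeps the four highest-energy candidates; the Candidate struct and its fields are hypothetical stand-ins, not the real trigger object format.

    #include <algorithm>
    #include <cstdint>
    #include <vector>

    struct Candidate {           // hypothetical software stand-in for a trigger object
        uint8_t  energy;         // tower/cluster energy
        uint16_t eta, phi;       // coarse position indices
    };

    // Return the 4 most significant candidates, highest energy first.
    std::vector<Candidate> topFour(std::vector<Candidate> cands) {
        const std::size_t n = std::min<std::size_t>(4, cands.size());
        std::partial_sort(cands.begin(), cands.begin() + n, cands.end(),
                          [](const Candidate& a, const Candidate& b) { return a.energy > b.energy; });
        cands.resize(n);
        return cands;
    }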

Slide 3: Requirements for an upgrade
Must process 6 Tb/s. Not a problem in itself, just make it parallel, but...
– Need to build physics objects, which don't observe detector granularity: requires data sharing and data duplication.
– Need to sort physics objects: avoid a multi-stage sort to minimise latency, which restricts the number of "physics builders" due to fan-in constraints.
– Only have approximately 1 μs; each serialisation step is 100 ns - 200 ns (see the sketch below).
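A back-of-the-envelope check of how quickly serialisation eats into the budget, using only the figures quoted above (roughly 1 μs total, 100-200 ns per serialisation step); a minimal C++ sketch.

    #include <cstdio>

    int main() {
        const double budget_ns = 1000.0;              // ~1 us latency budget quoted on the slide
        const double per_hop_ns[] = {100.0, 200.0};   // quoted cost of one serialisation step
        for (double hop : per_hop_ns) {
            // How many serialisation hops would fit if the budget were spent on links alone.
            std::printf("at %.0f ns per hop: at most %.0f hops in %.0f ns\n",
                        hop, budget_ns / hop, budget_ns);
        }
        return 0;
    }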

Slide 4: Incoming Trigger Data
Data per tower, per bx:
– ECAL: 8 bits of energy + FGV (Fine Grain Veto)
– HCAL: 8 bits of energy + MIP (Minimum Ionising Particle) bit
[Diagram: eta-phi map (negative and positive eta, 18 divisions in phi) showing region number / eta index, constant-phi strips, regions and towers in the barrel and endcaps, and electron/tau-jet versus non-tau-jet areas]
A sketch of this per-tower packing follows below.
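A minimal C++ sketch of how one tower's data could be packed for software test patterns, following the per-tower content listed above (8-bit ECAL energy + fine-grain veto, 8-bit HCAL energy + MIP bit); the bit ordering and field layout are assumptions for illustration, not the on-wire format.

    #include <cstdint>

    // Hypothetical packing of one tower's data for a single bx:
    // [17]    MIP bit          (HCAL)
    // [16:9]  HCAL energy      (8 bits)
    // [8]     fine-grain veto  (ECAL)
    // [7:0]   ECAL energy      (8 bits)
    inline uint32_t packTower(uint8_t ecalEt, bool fgv, uint8_t hcalEt, bool mip) {
        return  static_cast<uint32_t>(ecalEt)
              | (static_cast<uint32_t>(fgv)    << 8)
              | (static_cast<uint32_t>(hcalEt) << 9)
              | (static_cast<uint32_t>(mip)    << 17);
    }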

Slide 5: Geometry & link capacity
Total number of links = … Gb/s; total BW = 5.2 Tb/s.
Number of regions: 22 (eta) x 18 (phi); 1 region = 4x4 towers.
1 link at 2.4 Gb/s (8B/10B):
– Equivalent to 4 towers
– 6 bytes per bx
– 12 bits per tower (currently 9 bits per tower are used)
[Diagram: strip of regions and towers along eta, spanning barrel (B), endcap (E), forward (F) and HF]
The per-link arithmetic is worked through below.
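The per-link figures above follow directly from the line rate; a minimal C++ sketch of the arithmetic (2.4 Gb/s line rate, 8B/10B encoding, 25 ns bunch-crossing period, 4 towers per link).

    #include <cstdio>

    int main() {
        const double line_rate_gbps  = 2.4;                         // raw link speed
        const double payload_gbps    = line_rate_gbps * 8.0 / 10.0; // 8B/10B overhead removed
        const double bx_ns           = 25.0;                        // LHC bunch-crossing period
        const double bits_per_bx     = payload_gbps * bx_ns;        // Gb/s x ns = bits
        const double towers_per_link = 4.0;

        std::printf("payload per bx : %.0f bits (= %.0f bytes)\n", bits_per_bx, bits_per_bx / 8.0);
        std::printf("bits per tower : %.0f\n", bits_per_bx / towers_per_link);
        return 0;
    }
    // Reproduces the slide's numbers: 48 bits = 6 bytes per bx, i.e. 12 bits per tower.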

Slide 6: Build a Level-1 Calo Trigger Demonstrator. Why?
– Real life is always far more complicated than the Gantt chart; e.g. local clock routing from the MGT to the fabric in the Virtex-II Pro took 6 months to fully understand in the last calorimeter project.
– Understand system issues; some things are only obvious in hindsight.
– Enable firmware and software to get underway; it always lags in any project...

Slide 7: Original Plan: Fine Processing (CMS Trigger Upgrade Workshop, 22 July 2009)
Split the problem into two stages:
– Fine processing (electrons)
– Coarse processing (jets)
[Diagram: the clustering process]

Slide 8: Plan B: Coarse Processing
– For jets, reduce the resolution by a factor of 2: perfectly adequate, and it nicely matches the HF resolution (a sketch of this coarse graining follows below).
[Diagram: the clustering process]
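A minimal C++ sketch of the factor-of-2 coarse graining described above, i.e. summing towers into 2x2 groups before jet finding; the grid dimensions and the 8-bit saturation are assumptions for illustration, not the firmware implementation (which is VHDL).

    #include <algorithm>
    #include <array>
    #include <cstdint>

    // Sum tower energies into 2x2 groups, halving the resolution in eta and phi.
    // NETA and NPHI must be even; the 8-bit saturation of the summed energy is assumed.
    template <std::size_t NETA, std::size_t NPHI>
    std::array<std::array<uint16_t, NPHI / 2>, NETA / 2>
    coarseGrain2x2(const std::array<std::array<uint8_t, NPHI>, NETA>& towers) {
        std::array<std::array<uint16_t, NPHI / 2>, NETA / 2> coarse{};
        for (std::size_t e = 0; e < NETA; e += 2) {
            for (std::size_t p = 0; p < NPHI; p += 2) {
                const unsigned sum = towers[e][p] + towers[e][p + 1] +
                                     towers[e + 1][p] + towers[e + 1][p + 1];
                coarse[e / 2][p / 2] = static_cast<uint16_t>(std::min(sum, 255u)); // saturate (assumed)
            }
        }
        return coarse;
    }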

Slide 9: Conventional versus Time Multiplexed
Conventional: process the entire detector on each bx. Deal with electrons/taus first; build jets from the earlier energy clustering, or take the tower data and coarse grain it to 2x2 towers.
– Pros: simple design.
– Cons: less flexible, but does it matter...
Time Multiplexed*: akin to a High Level Trigger processor farm.
– Process the entire detector over ≈9 bx.
– Switch between 10 systems operating in a round-robin method (sketched below); the latency impact of 9 bx is regained through fewer serialisation steps.
– Pros: jet + electron processing in a single FPGA; maximum processed area per boundary area; scalable (10 identical systems); redundant (the trigger keeps operating if a partition is lost, albeit at 90%); flexible (based on a single card).
– Cons: never been done before.
* First proposed at TWEPP last year by John Jones
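A minimal C++ sketch of the round-robin idea above: each bunch crossing's data goes, in its entirety, to one of 10 main-processor nodes, so every node sees the whole detector but only every 10th event. The node count comes from the slide; everything else is illustrative.

    #include <cstdint>
    #include <cstdio>

    constexpr unsigned kNodes = 10;   // 10 identical time-multiplexed processing nodes

    // Which MP node handles a given bunch crossing.
    unsigned nodeForBx(uint64_t bx) { return static_cast<unsigned>(bx % kNodes); }

    int main() {
        // Each node then has ~9 bx in which to receive and process its event.
        for (uint64_t bx = 0; bx < 12; ++bx)
            std::printf("bx %2llu -> MP node %u\n",
                        static_cast<unsigned long long>(bx), nodeForBx(bx));
        return 0;
    }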

Slide 10: [full-slide diagram]

Slide 11: Time Multiplex example
Each PP (PreProcessor) card spans 1 tower in eta and 72 towers in phi, and receives ECAL & HCAL data.
Each PP card has 10 output links, one for each MP processing node; cards on the eta boundary have 20 outputs (data duplicated).
PP = PreProcessor, MP = MainProcessor.
[Diagram: SLBs at 2.4 Gb/s and 4.8 Gb/s feeding x32 PP cards for CAL- and x32 for CAL+, with 3.2 Gb/s output links fanning out to the MP- and MP+ cards of node 1, node 2 and nodes 3 to 10]

Slide 12: Objective of the demonstrator system
R&D project into both the Conventional and the Time Multiplexed approaches:
– Develop common technology: 90% of the hardware, firmware and software is common to both.
Focusing on the Time Multiplexed Trigger at present:
– Gain a better understanding of the different approach.
– Belief that it offers significant benefits over the conventional trigger:
  Flexible: all trigger data available at tower resolution.
  Redundant: loss of a trigger partition has limited impact on CMS, i.e. the trigger rate is reduced by 10% with no loss of regional coverage.
  Scalable: can prototype with a single crate, i.e. just 10% of the final hardware.
– Uses a single uTCA card throughout the system.

Slide 13: Based around the MINI-T5
Double-width AMC card with a Virtex-5 TX150T or TX240T.
Optics:
– IN = 160 Gb/s (32x 5 Gb/s)
– OUT = 100 Gb/s (20x 5 Gb/s)
Parallel LVDS:
– Bidirectional 64 Gb/s (2x 40 pairs at 800 Mb/s)
AMC connector:
– 2x Ethernet, 1x SATA
– 4x FatPipe, 1x extended FatPipe
Can be used for the Standard or the Time Multiplexed architecture. (The bandwidth figures are tallied below.)
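A quick tally showing how the MINI-T5 I/O figures above add up; a minimal C++ sketch using only the numbers on the slide.

    #include <cstdio>

    int main() {
        const double optics_in_gbps  = 32 * 5.0;       // 32 receivers at 5 Gb/s
        const double optics_out_gbps = 20 * 5.0;       // 20 transmitters at 5 Gb/s
        const double lvds_gbps       = 2 * 40 * 0.800; // 2 x 40 pairs at 800 Mb/s

        std::printf("optical in  : %5.1f Gb/s\n", optics_in_gbps);   // 160 Gb/s
        std::printf("optical out : %5.1f Gb/s\n", optics_out_gbps);  // 100 Gb/s
        std::printf("LVDS        : %5.1f Gb/s\n", lvds_gbps);        //  64 Gb/s
        return 0;
    }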

Slide 14: Latency
Current latency at Point 5 (in bx):
– SLB Tx (estimate): 2
– SLB to RCT cable: 2
– RCT + GCT: 40
– GCT to GT: 3
– Total: 47 (re-added in the sketch below)
Modern serial links have many advantages, but latency isn't one of them. Latency measurements are based on the Virtex-5 GTX and GTP transceivers:
– No tricks, e.g. no elastic buffer bypass. Why? Tricks sometimes backfire!
– Measured 5.3 bx at 5 Gb/s, including the sync firmware. Be conservative: assume 6.0 bx.
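A minimal C++ sketch that simply re-adds the current latency budget listed above, in bunch crossings; the stage names and values are those on the slide.

    #include <cstdio>

    int main() {
        struct Stage { const char* name; double bx; };
        const Stage budget[] = {
            {"SLB Tx (estimate)", 2.0},
            {"SLB to RCT cable",  2.0},
            {"RCT + GCT",        40.0},
            {"GCT to GT",         3.0},
        };
        double total = 0.0;
        for (const Stage& s : budget) {
            std::printf("%-18s %5.1f bx\n", s.name, s.bx);
            total += s.bx;
        }
        std::printf("%-18s %5.1f bx\n", "Total", total);   // 47 bx, as on the slide
        return 0;
    }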

Slide 15: Latency: Time Multiplex (all numbers in bx)
– SLB to PP serdes, plus 10 m cable delay
– PP processing, plus time multiplexing
– PP to MP serdes, plus 20 m cable delay
– MP processing: 4 bx for electrons (measured), 4 bx for jets & sort (estimate)
– MP to GT serdes, plus 5 m cable delay
– GT processing, plus latency contingency
Total = 47: no latency penalty.

Slide 16: Firmware
[Block diagram: GTX transceivers, pattern RAM, sync and mux blocks, the algorithm block, DAQ capture with counter and buffer, and Ethernet; signals include SyncMasterIn, EnableIn, SyncSlaveOut and Level1Accept; TPGs in, output to the GT]
The electron cluster algorithm is implemented. Now adding sorting and jet clustering.

Slide 17: Firmware Infrastructure - Finished
Optical transceivers:
– Up to 32 GTX transceivers at 5 Gb/s
– Pattern RAM injection & capture
– CRC check
– Low-latency automatic synchronisation: aligns the links on a user-defined signal, e.g. the beginning of a packet or BC0 (sketched below)
DAQ:
– Event pipeline with automatic capture and buffering, i.e. no more adjusting the latency!
Ethernet:
– UDP interface (thanks to Jeremy Mans)
[Photo: lab development system]
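The automatic synchronisation above boils down to noting the clock tick on which each link sees a common marker (e.g. BC0) and delaying the early links until all present it on the same tick. A minimal C++ sketch of that idea; it illustrates the concept only and is not the VHDL sync block.

    #include <algorithm>
    #include <cstdio>
    #include <vector>

    // Given the tick at which each link saw the alignment marker, compute the extra
    // delay each link needs so that all of them line up with the latest arrival.
    std::vector<unsigned> alignmentDelays(const std::vector<unsigned>& markerTick) {
        const unsigned latest = *std::max_element(markerTick.begin(), markerTick.end());
        std::vector<unsigned> delay;
        delay.reserve(markerTick.size());
        for (unsigned t : markerTick) delay.push_back(latest - t);
        return delay;
    }

    int main() {
        // Illustrative arrival ticks for four links with different fibre lengths.
        const std::vector<unsigned> ticks = {12, 15, 13, 15};
        const std::vector<unsigned> d = alignmentDelays(ticks);
        for (std::size_t i = 0; i < d.size(); ++i)
            std::printf("link %zu: delay by %u ticks\n", i, d[i]);
        return 0;
    }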

Slide 18: Firmware Algorithms - Work in progress
Based on the Wisconsin algorithms (Monika Grothe, Michalis Bachtis & Sridhara Dasu).
– Fully implemented & verified: 2x2 cluster finder, electron/photon ID module, cluster overlap filtering, cluster weighting (a simplified cluster-finder sketch follows below).
– Work started, but postponed to allow SW development: cluster isolation, jet finding.
Uses the Modelsim Foreign Language Interface (FLI) to verify the algorithms.
[Diagram: the same TPG input driving the VHDL algorithm and the C++ algorithm, with a check comparing the two outputs]
All work done by Andrew Rose.
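A heavily simplified C++ sketch of what a 2x2 cluster finder does: sum each 2x2 group of towers and keep the candidates above a threshold. It is an illustration only, not the Wisconsin algorithm in the firmware; the threshold, data layout and the omission of overlap filtering and weighting are all simplifications.

    #include <cstdint>
    #include <vector>

    struct Cluster { unsigned eta, phi, et; };

    // Slide a 2x2 window over a tower grid (towers[eta][phi]) and build one candidate
    // cluster per window position whose summed ET exceeds a (hypothetical) threshold.
    std::vector<Cluster> findClusters2x2(const std::vector<std::vector<uint8_t>>& towers,
                                         unsigned threshold) {
        std::vector<Cluster> out;
        for (unsigned e = 0; e + 1 < towers.size(); ++e) {
            for (unsigned p = 0; p + 1 < towers[e].size(); ++p) {
                const unsigned et = towers[e][p] + towers[e][p + 1] +
                                    towers[e + 1][p] + towers[e + 1][p + 1];
                if (et > threshold) out.push_back({e, p, et});
            }
        }
        return out;
    }
    // Overlap filtering and cluster weighting (also listed on the slide) would then
    // prune and refine these candidates; they are not sketched here.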

Slide 19: Firmware verification: test setup
The test setup is slightly different to the proposed system, but the concept is the same. It uses 8-bit rather than 12-bit energies (i.e. current CMS):
– Would require 28 links at 5 Gb/s to load the data in 24 bx.
– The final system uses 72 links across 2 boards running at 10 Gb/s to load 12-bit data in 9 bx.
Different-length fibres are used to make sure the sync unit is operating correctly.
[Diagram: x12 pattern RAMs feeding the algorithm]
Partial electron algo in xc5vtx150t:
– Slice Registers: 21,143 out of 92,800 (22%)
– Slice LUTs: 27,571 out of 92,800 (29%)
– BlockRAM/FIFO: 220
– GTX_DUALs: 13 out of 20 (65%)
Full electron algo in xc5vtx150t:
– Slice Registers: 12,460 / 92,800 = 13%
– Slice LUTs: 17,167 / 92,800 = 18%

Slide 20: Software: Architecture
A hardware-controller PC separates the Control LAN and the user code from the Hardware LAN and the devices. Unlike the current Trigger software architecture, all network traffic is hidden from the end user. This is made possible by a common interface layer within the firmware, mirrored within the software. A simple Python interface is also available.
[Diagram: a single multicore host bridging the Control LAN fabric and the Hardware LAN fabric; user code, network interface, multiplexer layer, transport adapter, asynchronous IO services and kernel]
Andrew Rose, Rob Frazier

Slide 21: Software: Work in progress
Rob Frazier (Bristol) is writing the Controller Hub software within a high-performance web server framework. Andy Rose (Imperial) is writing the low-level client-side software. The project is hosted on HepForge.
Low-level client-side software is finished:
– Hierarchical register map: allows VHDL blocks to be reused, i.e. make a simple VHDL block and then instantiate it multiple times (sketched below).
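A minimal C++ sketch of what a hierarchical register map buys you: a block's registers are described once as offsets, and reusing the block (as with an instantiated VHDL entity) just means adding a different base address. The class and register names here are hypothetical illustrations, not the project's actual API.

    #include <cstdint>
    #include <cstdio>
    #include <map>
    #include <string>

    // A block's registers are defined once, as offsets relative to the block's base address.
    struct BlockMap {
        std::map<std::string, uint32_t> offsets;   // register name -> relative address
    };

    // An instance of the block in the full address space is the map plus a base address.
    struct BlockInstance {
        uint32_t base;
        const BlockMap* map;
        uint32_t address(const std::string& reg) const { return base + map->offsets.at(reg); }
    };

    int main() {
        // One definition of a (hypothetical) transceiver block...
        const BlockMap gtxBlock{{{"ctrl", 0x0}, {"status", 0x1}, {"crc_errors", 0x2}}};

        // ...instantiated twice at different base addresses, mirroring the VHDL reuse.
        const BlockInstance gtx0{0x1000, &gtxBlock}, gtx1{0x1100, &gtxBlock};

        std::printf("gtx0.crc_errors at 0x%04X\n", gtx0.address("crc_errors"));
        std::printf("gtx1.crc_errors at 0x%04X\n", gtx1.address("crc_errors"));
        return 0;
    }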

Slide 22: Current Status
MINI-T5 Rev1 is back:
– USB2 capability
– Avago optics
– It even has a front panel!
Next:
– Sort and jet clustering
– Build a system
[Photos: Main Processor and Pre Processor cards, Rev1]
The current focus has been on the MainProcessor firmware/software; the next focus is the PreProcessor and building a system of multiple cards.

Questions?