CMS Global Calorimeter Trigger Hardware Design 21/9/06.

CMS Level-1 Trigger
- Position of the Global Calorimeter Trigger in the CMS Level-1 Trigger system

RCT Data Output
(Diagram: η-φ map of the calorimeter, with φ running from 0 to 2π.)
- 18 RCT crates cover the CMS calorimeter barrel
- Each crate covers a 0.7 φ x 5 η region
- Outputs electron and jet information to the GCT

GCT Requirements
- Sorts electrons: the 4 highest in energy
- Finds and sorts jets
  - Top 4 by energy; physical size, tau, etc.
  - Sorting criteria may be changed; sort by energy for the initial implementation
  - Algorithm implements a 3x3 sliding window
    - Requires a contiguous data space, spanning RCT crate boundaries
    - A data-sharing scheme must be implemented
  - This algorithm drives the processing requirements
- Processes data at ~250 Gbps
  - Latency requirement of 24 bunch crossings (600 ns)
- Interfaces to 18 RCT crates
  - 108 68-pin parallel cables
  - Differential ECL, not DC balanced
  - Must maintain a ground reference with the entire RCT (an entire row of racks on a different floor)
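The latency and cabling numbers above are easy to cross-check (assuming the LHC's 25 ns bunch spacing, which is standard practice but not stated on the slide):

```python
# Latency budget: 24 bunch crossings at the LHC's 25 ns spacing.
bunch_spacing_ns = 25
latency_ns = 24 * bunch_spacing_ns
print(latency_ns)        # 600 ns, as quoted on the slide

# Cabling: 108 parallel cables shared evenly across 18 RCT crates.
cables_per_crate = 108 // 18
print(cables_per_crate)  # 6 cables per crate
```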

Jet Finder
(Diagram: η-φ data space showing three jet-finder regions with overlaps.)
- Parallel algorithm applies a 3x3 sliding window over each RCT crate's data space
  - Overlaps neighboring crates, so RCT data must be shared between instances
  - Instances must be in close proximity: in the same device if possible, over a high-bandwidth connection if not
- Note the overlap at η0: data is duplicated and sent to both jet finders
- The poster "Revised CMS Global Calorimeter Trigger Functionality & Performance" covers this subject in detail
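In software, the 3x3 sliding window can be sketched as a local-maximum search over the η-φ grid (a simplified illustration, not the actual firmware: among other things, the real algorithm resolves ties between equal neighbors and handles the crate-boundary sharing described above; here φ simply wraps around the cylinder):

```python
def find_jets(et, n_eta, n_phi):
    """Return (eta, phi, Et) for cells whose Et exceeds all 8 neighbors."""
    jets = []
    for e in range(n_eta):
        for p in range(n_phi):
            centre = et[e][p]
            if centre == 0:
                continue
            is_max = True
            for de in (-1, 0, 1):
                for dp in (-1, 0, 1):
                    if de == 0 and dp == 0:
                        continue
                    ne, np_ = e + de, (p + dp) % n_phi  # phi wraps, eta does not
                    if 0 <= ne < n_eta and et[ne][np_] > centre:
                        is_max = False
            if is_max:
                jets.append((e, p, centre))
    return jets

# A 4x4 toy grid with a single energy deposit spread over two cells:
grid = [[0, 0, 0, 0],
        [0, 2, 5, 0],
        [0, 0, 1, 0],
        [0, 0, 0, 0]]
print(find_jets(grid, 4, 4))   # [(1, 2, 5)]
```

Only the peak cell survives; the smaller deposits next to it are suppressed because they have a larger neighbor, which is exactly why neighboring window instances need to see each other's data.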

Design Tradeoffs
- Design on a compressed schedule; reduce risk to the extent possible
  - Base on existing modules
  - Conservative data rates
- Minimize the number of FPGAs to reduce firmware risk
  - Never split algorithms: algorithm size and complexity drive FPGA selection
  - Easier simulation, better synthesis efficiency, superior timing (and lower overall power consumption)
- Judicious use of serializer/deserializers
  - Most efficient method of concentrating data, but a latency penalty of at least 6 clocks
  - Only used on system boundaries: RCT/GCT and GCT/GT
  - Significant negative impact on design complexity: several cards are used mainly as "signal plumbing"

Design Overview
- Three main elements: data transport and physical concentration, trigger processing, and data plumbing and sorting
- Data transport
  - Compresses RCT data and provides electrical isolation
  - Functionally part of the RCT: what we really wanted its output to be
- Trigger processing
  - Implements the jet finders
  - Modular processing element
- Data plumbing
  - Large, physically complex boards, required to deal with multiple wide parallel buses
  - An excellent example of why SERDES technology was developed; unfortunately required here due to latency constraints

Design Overview
- Modular design with 4 card types
- Source card (based on the Imperial College IDAQ VME module)
  - Serializes RCT data and transmits it on fiber
  - Electrically isolates the GCT from the RCT
  - Reduces interface cabling and allows physical data concentration
  - Resides in the RCT racks; also provides RCT readout
- Leaf card (based on the Los Alamos digital channelizer double PMC)
  - Jet processing and fiber receiver
  - Logic capacity driven by the jet finder algorithm
  - Also used for the electron sort
- Wheel card (new CERN design)
  - Used only for jet processing
  - Carries multiple Leaf cards, facilitating Leaf data sharing
  - Sorts the resulting jets
- Concentrator card (new CERN design)
  - Electron and final jet sort
  - Carries Leaf cards for the electron sort
  - Interfaces with the Wheel cards for the final jet sort
  - Slow control, TTC, and DAQ interface

GCT Block Diagram
(Diagram: Leaf cards, Wheel cards, and the Concentrator card.)
- 63 Source cards (not shown)
- 8 Leaf cards
- 2 Wheel cards
- 1 Concentrator

Electrical Interfaces
- LVDS signaling
  - Used between Leaf cards, and from Wheel to Concentrator (cable-based board-to-board connections)
  - 40 MHz DDR required, but can support faster rates
  - Direct FPGA drive in most cases: direct connections provide maximum flexibility and speed
  - Wheel/Concentrator jet data passes through single-ended/differential converters (pin limits due to the large number of signals)
  - Samtec QTS/H differential connectors: high density and speed, rated for multi-GHz operation, with commercial cable assemblies
- DDR used for all single-ended I/O
  - FPGA intercommunication at 40 MHz; short runs allow faster operation
  - Communication with Leaf cards at 40 MHz; the PMC connectors limit speed here
  - Leaf cards use 2.5V LVCMOS with DCI
- Single-ended outputs from FPGAs use DCI drivers
  - Allows controlled-impedance drive, nominally 50 ohms
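For a rough sense of scale (my arithmetic, not from the slide): DDR moves one bit per clock edge, so each line or pair at 40 MHz carries 80 Mbps.

```python
clock_mhz = 40
ddr_mbps_per_line = clock_mhz * 2   # DDR: one bit on each clock edge
print(ddr_mbps_per_line)            # 80 Mbps per line or pair

# So, for example, a 240-pair LVDS interface at this rate moves
# 240 * 80 Mbps = 19.2 Gbps (the Wheel/Concentrator pair count
# appears on a later slide).
print(240 * ddr_mbps_per_line / 1000)   # 19.2 Gbps
```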

Clock Distribution
(Diagrams: clock trees for the Concentrator, Wheel, and Leaf. The Concentrator's TTCrx and QPLL feed 40/80/160 MHz through fan-outs and an FPGA-controlled 4x4 crosspoint switch, with an external input, to the PMC sites, Wheel cards, and local FPGAs. The Wheel's QPLL takes 40 MHz from the Concentrator and fans 40/80 MHz out to its PMC sites and local FPGAs. The Leaf takes 40/80 MHz from the Wheel, with a local oscillator providing an 80 MHz SERDES reference through its own FPGA-controlled crosspoint switch.)
- Fully differential distribution tree
- Controlled by crosspoint switches, allowing stand-alone operation
- DLLs in the FPGAs are used to tune the clocks if necessary

JTAG System
(Diagram: on each of the Concentrator and Wheel, a CPLD-based JTAG switch connects a board header, mode switches, and differential buffers on a Samtec differential connector/cable running from the Concentrator, which also has VME access, down to the Wheel.)
- 6 2.5V JTAG chains and 2 3.3V JTAG chains
- The JTAG switch CPLD is a combinatorial flow-through design
- A single chain, or multiple chains daisy-chained together, is selected via manual front-panel mode switches
- The JTAG source is either the header or a remote device, selected by a mode switch
- Leaf JTAG is a single chain (4 devices)

Source Card
- Based on the IDAQ module designed at Imperial College, simplified due to the large number required for a complete system
- Converts the ECL RCT input to SFP optical
  - Two VHDCI SCSI inputs, 32 bits at 80 MHz
  - Four SFP fiber outputs
- Spartan-3 FPGA
- 4 SERDES/SFP modules
  - 8b/10b encoding: DC balanced, 1.6 Gbps
  - Comma generation (sync)
- TTCrx for time synchronization
- USB slow control, providing RCT readback
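Note that 1.6 Gbps is the encoded line rate; with standard 8b/10b overhead (10 line bits per 8 data bits), the payload per fiber works out to:

```python
line_rate_gbps = 1.6
payload_gbps = line_rate_gbps * 8 / 10   # 8b/10b spends 20% of the line on coding
print(payload_gbps)                      # 1.28 Gbps of RCT data per fiber
```

That 20% tax buys the DC balance and comma characters that the parallel ECL cables lack, which is precisely why the slide calls them out.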

Block Diagram
1. Multilayer PCB
2. USB slow control
3. TTCrx and QPLL
4. VHDCI SCSI inputs
5. Serial SFP outputs
   - Agilent HFBR-5720AL
   - TLK2501 SERDES, rated at 2.5 Gbps
6. Linear power for the SERDES/SFP
7. Switching power for the FPGA
8. Spartan-3 FPGA (3S1000)

Leaf Card
- Based on a satellite channelizer design from Los Alamos Lab, modified to include Virtex-2 Pro FPGAs and MFP connectors
- Main processing engine of the GCT
  - Accepts jet data from 3 RCT crates and electron data from 9 RCT crates
  - 32 fiber-optic links: SNAP-12 MFP optics, rated at 2.5 Gbps
- The jet algorithm drives the capacity
  - 3M gates per jet finder
  - 10 fibers per crate
  - 2 V2Pro70 FPGAs
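A quick input-capacity check for one Leaf (my arithmetic, assuming the fibers run at the Source card's 1.6 Gbps line rate with 8b/10b overhead; the optics and SERDES are rated higher):

```python
links = 32
payload_per_link_gbps = 1.6 * 8 / 10     # 1.28 Gbps after 8b/10b decoding
print(links * payload_per_link_gbps)     # 40.96 Gbps of payload into one Leaf
print(links * 2.5)                       # 80.0 Gbps at the rated line speed
```

Several Leafs at roughly 41 Gbps apiece is consistent with the ~250 Gbps system-level figure quoted in the requirements.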

Block Diagram
(Diagram: two V2P FPGAs with clock distribution from a local oscillator and PMC/coax inputs.)
1. Clock distribution: local oscillator, PMC/coax inputs
2. PMC connectors, 8 fully populated
3. 60-pair differential links
4. Switching power supplies: 1.5V core and 2.5V I/O, phase- and frequency-controlled
5. Linear SERDES supplies
6. 12-channel optical receivers (Agilent AFBR-742B)
7. 14-layer PCB

Implementation
- High-density optical inputs
  - Cannot fit enough single-channel SFP modules
  - "SNAP-12" parallel receiver: 12 channels at 2.5 Gbps, an industry-standard short-distance link
- Xilinx embedded SERDES links (RocketIO)
  - Virtex-2 Pro devices selected: V2P70, with 16 links each
  - Support improved differential I/O; easily obtainable
- No external (off-chip) memory: nice to have, but not required for GCT processing
- Double PMC format
  - Power supply and basic layout retained from the existing design
  - Electrically compatible, but too tall mechanically, so not truly PMC compliant

Routing Parameters
- Length matching
  - Differential lines matched to ¼" as a bus, yielding ½" on board-to-board connections
  - Individual pairs matched to a few mils, in groups of 8 pairs
    - Not required for 40 MHz DDR, but allows a significant speed increase
  - Single-ended lines matched to ½" as a bus, in groups of 8-12 lines
    - Not required for 40 MHz DDR, but allows higher-speed operation
  - Differential SERDES lines matched to 1-2 mils: impedance controlled and individually matched
- Board structure isolates the SERDES with ground planes
  - 50-ohm stripline
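To translate these matching tolerances into time, assume a typical FR-4 stripline delay of roughly 170 ps per inch (my assumption; the slides give only lengths):

```python
ps_per_inch = 170                    # assumed FR-4 stripline propagation delay

print(0.25 * ps_per_inch)            # bus-level 1/4" match  -> ~42 ps of skew
print(0.003 * ps_per_inch)           # 3-mil pair match      -> ~0.5 ps
print(0.002 * ps_per_inch)           # 2-mil SERDES match    -> ~0.3 ps
```

Even the loosest bus-level match is tiny against the 12.5 ns bit period of 40 MHz DDR, which is why the slide notes the tighter matching is "not required" at that rate and only matters for future speed increases and the multi-Gbps SERDES lines.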

Test routing

Power Supplies
- 15A, 1.5V switcher for each V2Pro VccInt
  - Devices can be run at the thermal limit; fan headers are on the board if needed
  - Estimated load is less than 1/2 this figure: 40 MHz at 100% utilization yields 6A
- Single 15A, 2.5V switcher for I/O; estimated load is 1/2 of this capacity
- Switchers powered from 5V, which is not used for other logic
  - Phase and frequency controlled: can optimize for noise or efficiency, and switch out of phase to control surge currents
- Separate linear supplies for SERDES: each FPGA has a local linear SERDES supply
- Optical receivers powered directly from the 3.3V PMC power; the manufacturer claims this is acceptable

Wheel Card
- Carries 3 Leaf cards (double PMC)
  - Compresses (sorts) jet data
  - Calculates Et and the jet count
  - Single-ended electrical interface (40 MHz DDR)
- Interfaces to the Concentrator board
  - High-speed cable interface
  - LVDS electrical interface: 40 MHz DDR required, but could support higher rates
- 9U VME form factor
  - Power only, no VME interface (ECAL backplane)
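The "compresses (sorts)" step reduces the many jet candidates from the Leafs to the few highest-energy ones forwarded downstream. A software model of that reduction (the firmware does this with parallel compare networks, not a sequential sort; the top-4 count comes from the requirements slide):

```python
import heapq

def top_jets(candidates, n=4):
    """Reduce a list of (eta, phi, et) jets to the n highest in Et."""
    return heapq.nlargest(n, candidates, key=lambda jet: jet[2])

found = [(0, 3, 12), (1, 7, 40), (2, 1, 8), (3, 5, 25), (4, 2, 30)]
print(top_jets(found))   # [(1, 7, 40), (4, 2, 30), (3, 5, 25), (0, 3, 12)]
```

This is the "compression": only n candidates cross the Wheel/Concentrator pair-limited interface, regardless of how many the jet finders produced.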

Block Diagram
(Diagram: three Leaf sites, each with connectors J1-J4 on two rows, feed a Jet FPGA and an Energy FPGA through connectors J1-J3; finished jet data and energy-sum data leave through differential buffers, alongside slow control and readback.)

Implementation
- Accepts parallel data from the Leafs on 3 DPMC sites
  - 278 signals on each site (834 total): 186 signals/site to the Jet FPGA, 92/site to the Energy FPGA, all single ended
- Outputs parallel data
  - Electrical interface: 240 differential pairs provided
- Processing: two Xilinx Virtex-4 FPGAs (XC4VLX100-FF1513)
  - An I/O-intensive (as opposed to logic-intensive) design
  - Advanced Virtex-4 I/O features reduce risk: better double-data-rate support and improved differential support
  - One FPGA for jet sorting, one for Et and the jet count
  - The Jet FPGA is pin limited: it requires single-ended output to meet the signal count, with external differential buffers driving the data to the Concentrator

Power Supplies
- 10A, 1.2V switcher for each V4 VccInt
  - Devices can be run near the thermal limit; estimated load is less than 1/2 this figure at 40 MHz
- Two 10A, 2.5V switchers for I/O
  - Leaf cards may be jumpered to provide their own VIO, so the Wheel need only drive its own FPGAs
- 10A, 3.3V switcher for each DPMC site
  - The board design allows substitution of a 5A linear regulator, which would be preferable since the optical receivers are powered directly; questionable margin requires that a switcher site be provided
- All switchers powered from 5V, which is not used for other logic
- Separate 2.5V linear supply for the QPLL and some of the clock distribution

Concentrator
- Carries 2 Leaf cards (double PMC)
  - Sorts electron data
  - Single-ended 40 MHz DDR interface
- Interfaces to the two Wheel cards
  - Sorts jet data
  - Differential 40 MHz DDR cable interface
- Provides the VME interface: slow control and readback
- TTC interface: timing and synchronization
- DAQ interface: S-Link
- Carries the GT interface PMC
- 9U VME form factor (ECAL backplane)
- Complex design
  - Significant data plumbing and congested routing
  - Multiple communication interfaces

Block Diagram
(Diagram: two Leaf PMC sites feed the Electron FPGA (4VLX100); the two Wheel cards feed the Jet FPGA (4VLX100) through differential connectors; a VME/DAQ FPGA (2V3000) connects the VME bus backplane connectors P1-P3, the ECAL S-Link carrier, and the TTC interface; the GT output leaves on its PMC site. Data paths: electron data, jet data, jet count/total Et, slow control/readback, GT output, and DAQ data.)

Implementation
- Processing: two Xilinx Virtex-4 FPGAs (XC4VLX100-FF1513)
  - Must concentrate a large amount of data, so the package with the most I/O was chosen
  - Integrated differential termination makes layout simpler
  - High-speed I/O provides reserve capability
- Communication: a Xilinx Virtex-2 FPGA (XC2V3000-BF957)
  - Robust in a 3.3V environment
  - VME64x interface, S-Link, TTCrx
- Electron FPGA: isolated electrons, non-isolated electrons, energy sums, jet counts
- Jet FPGA: forward jets, central jets, tau jets
- Power supplies identical to the Wheel in number and type

Status
- Schedule requirements
  - Concentrator, 2 Leafs, and 7 Source cards by 1/07
  - Full system by 7/07
- Source card
  - First articles in hand
  - Extensively tested; awaiting integration with the Leaf
- Leaf card
  - First articles in hand; testing underway
  - SERDES tests to begin the first week of October
- Concentrator card
  - In production; first articles due in mid-October
- Wheel card
  - In final layout; production order to be placed by mid-October