CPM FDR: Architecture and Challenges
22nd March, 2005

CPM architecture and challenges
- CP system requirements
- Architecture
- Modularity
- Data Formats
- Data Flow
- Challenges:
  - High-speed data paths
  - Latency

CP system requirements
- Process the −2.5 < η < 2.5 region
  - 50 x 64 trigger towers per layer
  - Two layers
  - 8-bit data (0-255 GeV)
- Relatively complex algorithm
- Output data to the CTP
  - 16 x 3-bit hit counts
  - Each hit condition is a combination of four thresholds
- Output data to the RODs
  - Intermediate results
  - RoI data for the RoIB
- Cluster algorithms (see the sketch below)
  - 4 x 4 x 2 cell environment
  - Sliding window
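
To make the sliding-window idea concrete, here is a minimal Python sketch: it slides a window over the two-layer tower map and counts positions whose core sum passes each of four thresholds. The 2x2 core sum, the random data and the threshold values are illustrative assumptions only; the real CP algorithm also applies isolation tests and saturates the 3-bit hit counts.

```python
# Minimal sliding-window sketch (illustrative, not the documented algorithm).
import numpy as np

N_ETA, N_PHI = 50, 64                             # trigger towers per layer
em  = np.random.randint(0, 256, (N_ETA, N_PHI))   # 8-bit tower energies, layer 1
had = np.random.randint(0, 256, (N_ETA, N_PHI))   # 8-bit tower energies, layer 2

def core_sum(eta, phi):
    """2x2 tower sum over both layers; phi wraps around, eta does not."""
    phis = [(phi + d) % N_PHI for d in (0, 1)]
    return sum(em[e, p] + had[e, p] for e in (eta, eta + 1) for p in phis)

thresholds = [50, 100, 150, 200]      # four hypothetical threshold values
hits = [0] * len(thresholds)
for eta in range(N_ETA - 1):          # slide the window over eta...
    for phi in range(N_PHI):          # ...and (cyclic) phi
        s = core_sum(eta, phi)
        for i, t in enumerate(thresholds):
            hits[i] += s > t
print(hits)                           # window positions passing each threshold
```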

System design considerations
- Several major challenges to overcome:
  - Large processing capacity
  - Data I/O, largely at the input
  - Latency requirements
- Processing must be split over several modules working in parallel
  - But the overlapping nature of the algorithms implies that fan-out is needed
  - Modularity is a compromise between competing requirements
  - A high-connectivity back-plane is required for data sharing
- Data must be ‘compressed’ as much as possible
  - Use data reduction whenever possible
  - Data serialisation at various speeds is used to reduce I/O pin counts

System modularity
- Full system: 50 x 64 x 2 trigger towers
- Four crates, each processing one quadrant in phi: 50 x 16 x 2 core towers per crate
- Eta range split over 14 CPMs: 4 x 16 x 2 core towers per CPM
- Each module contains 8 CP FPGAs: 4 x 2 x 2 core towers per FPGA
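
The modularity figures are easy to cross-check; a sketch of the bookkeeping (the note about partially used end modules is an inference from these numbers, not stated on the slide):

```python
# Bookkeeping for the CP system partitioning quoted above.
towers = 50 * 64 * 2            # full system: eta x phi x layers = 6400
crate  = 50 * 16 * 2            # one phi quadrant                = 1600
cpm    = 4 * 16 * 2             # one CPM's core towers           = 128
fpga   = 4 * 2 * 2              # one CP FPGA's core towers       = 16

assert 4 * crate == towers      # four crates cover the full system
assert 8 * fpga == cpm          # eight CP FPGAs cover a CPM
# 14 CPMs x 4 eta bins = 56 bins for 50 active ones, so the modules
# at the ends of the eta range are presumably only partially used.
print(towers, crate, cpm, fpga)  # 6400 1600 128 16
```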

Board-level fan-out, input signals and back-plane
- The CPM has 64 core algorithm cells
  - 16 x 4 reference towers
  - Obtained from direct PPM connections (2 PPMs per CPM)
- The algorithm requires extra surrounding cells for the ‘environment’
  - One extra below, two above: 19 x 4 x 2 towers in all
- Fan-out in phi achieved via multiple copies of the PPM output data
- Fan-out in eta achieved via the back-plane

Internal fan-out and the Cluster Processing FPGA environment
- Each CP FPGA processes 2 x 4 reference cells
- The algorithm requires 4 x 4 x 2 cells around the reference
- Convolving these gives a 5 x 7 x 2 FPGA environment
- Data received from 18 different serialiser FPGAs:
  - 6 on-board
  - 12 through the back-plane

[Figure: the ‘core’ cells and their environment, with data arriving on-board and from the left, right, above and below.]
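
The 5 x 7 x 2 environment follows directly from convolving the 2 x 4 reference block with the 4 x 4 algorithm window; a quick check of the arithmetic (the exact window alignment is assumed):

```python
# Each 4x4 window anchored on a reference cell extends the 2x4 block
# by (4 - 1) cells in each dimension, in both layers.
core_eta, core_phi = 2, 4
window = 4
env_eta = core_eta + (window - 1)   # 5
env_phi = core_phi + (window - 1)   # 7
print(env_eta, env_phi)             # 5 7 -> a 5x7x2 environment per CP FPGA
```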

CPM data formats – tower data
- 8-bit tower data
- The PPM peak-finding algorithm guarantees that any non-zero data is surrounded by zeroes
  - Allows data encoding/compression
- Two 8-bit towers converted to one 9-bit ‘BC-muxed’ data word
  - An odd-parity bit is added for error detection
- 160 input towers encoded in 80 x 10-bit data streams
- Same format used both on input to the CPM and between the serialiser FPGAs and the CP FPGAs

[Figure: two towers x 8 bits ‘bcmuxed’ into a 10-bit word: 8 data bits + bcmux bit + parity bit.]
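
Because the PPM peak-finder guarantees that a non-zero tower value is surrounded by zeroes in time, a pair of towers can share one 10-bit stream, with the ninth bit flagging which tower the value belongs to. Below is a minimal packing/unpacking sketch; the bit layout (data in bits 0-7, bcmux flag in bit 8, odd-parity in bit 9) is an assumption for illustration, not the documented CPM format.

```python
# Illustrative BC-mux word packing (bit layout assumed, not from the spec):
# bits 0-7 = tower data, bit 8 = bcmux flag, bit 9 = odd-parity bit.
def pack(value, which_tower):
    word = (value & 0xFF) | (which_tower & 1) << 8
    parity = bin(word).count("1") & 1
    word |= (parity ^ 1) << 9          # odd parity: total set bits is odd
    return word

def unpack(word):
    assert bin(word & 0x3FF).count("1") & 1, "parity error"
    return word & 0xFF, (word >> 8) & 1    # (value, bcmux flag)

w = pack(0xA5, 1)
print(hex(w), unpack(w))               # 0x1a5 (165, 1)
```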

CPM data formats – hits and readout
- CPM hit results:
  - 16 x 3-bit saturating sums
  - 8 sent to the left CMM, 8 sent to the right
  - 8 x 3 = 24 result bits plus 1 odd-parity bit added
- DAQ readout
  - Per L1A, 84 x 20 bits of data
  - Bulk of the data is BC-demuxed input data:
    - 10 bits per tower: 8 data bits, 1 parity-error bit, 1 link-error bit
    - 160 direct inputs x 10 bits = 80 x 20-bit words
  - Remainder: 48 bits of hit data, 12 bits of Bcnum, and 20 bits of odd-parity checks
- RoI readout
  - Per L1A, 22 x 16 bits of data
  - Bulk of the data is the individual CP FPGA hits and region locations:
    - 16 bits + 2 location bits + 1 saturation bit + 1 parity-error bit
    - 8 FPGAs each with 2 RoI locations = 8 x 2 x 20 bits
  - Rest is 12 bits of Bcnum and an odd-parity check bit
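
The readout word counts can be reproduced with simple bookkeeping; a sketch of the arithmetic (the split of the last four DAQ words follows the bit counts quoted above):

```python
# DAQ readout per L1A: the bulk is the BC-demuxed input data.
bits_per_tower = 8 + 1 + 1               # data + parity-error + link-error
daq_input_bits = 160 * bits_per_tower    # 1600 bits
assert daq_input_bits == 80 * 20         # 80 of the 84 x 20-bit words
# Remaining 4 words: 48 hit bits + 12 Bcnum bits + 20 parity-check bits.
assert 48 + 12 + 20 == 4 * 20

# RoI readout: one 20-bit word per RoI location.
roi_word_bits = 16 + 2 + 1 + 1           # hits + location + saturation + parity
assert roi_word_bits == 20
roi_words = 8 * 2                        # two RoI locations per CP FPGA
print(daq_input_bits // 20, roi_words)   # 80 16
```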

CPM data flow: signal speeds
- Multiple protocols and data speeds are used throughout the board
- Care is needed to synchronise data at each stage
- This has proved to be the biggest challenge on the CPM

[Figure: signal chain — 400 Mbit/s serial data (480 Mbit/s with protocol) → LVDS deserialiser → 40 MHz parallel data → Serialiser FPGA → 160 MHz serial data → CP FPGA → 40 MHz parallel data → Hit Merger; Readout Controllers output 640 Mbit/s serial data (800 Mbit/s with protocol).]
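
All of these rates are multiples of the 40 MHz bunch-crossing clock; a quick consistency check (the framing overheads shown as extra bits per bunch crossing are assumptions, not taken from the slide):

```python
# Line-rate bookkeeping: every rate is (bits per bunch crossing) x 40 MHz.
BC = 40e6                                      # bunch-crossing frequency, Hz

rates = {
    "LVDS payload   (10 bits/BC)": 10 * BC,    # 400 Mbit/s
    "LVDS on line   (12 bits/BC)": 12 * BC,    # 480 Mbit/s with framing (assumed)
    "backplane pin  ( 4 bits/BC)":  4 * BC,    # 160 MHz serial
    "Glink payload  (16 bits/BC)": 16 * BC,    # 640 Mbit/s
    "Glink on line  (20 bits/BC)": 20 * BC,    # 800 Mbit/s with protocol (assumed)
}
for name, r in rates.items():
    print(f"{name}: {r / 1e6:.0f} Mbit/s")
```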

CPM challenges: high-speed data paths
- 400 (480) Mbit/s input data
  - Needed to reduce input connectivity: 80 differential inputs plus grounds = 200 pins/CPM
  - Previous studies of the LVDS chipset established its viability
  - Works very reliably with test modules (DSS/LSM)
  - Still some questions over pre-compensation and PPM inputs
- 160 MHz CP FPGA input data
  - Needed to reduce back-plane connectivity: 160 fan-in and 160 fan-out pins per CPM
  - Needed to reduce the CP FPGA input pin count: 108 input streams per chip
  - This has been the subject of the most study in prototype testing
- 640 (800) Mbit/s Glink output data
  - Glink chipset successfully used in demonstrators
  - Needed some work to understand the interaction with the RODs
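
A sketch of the connectivity bookkeeping behind these pin counts (the exact ground-pin allocation is inferred from the 200-pin total, not stated on the slide):

```python
# Pin bookkeeping for one CPM, from the figures quoted above.
lvds_pairs  = 80                    # one 10-bit input stream per pair
signal_pins = 2 * lvds_pairs        # 160 pins for the differential pairs
ground_pins = 200 - signal_pins     # grounds make up the 200-pin total
backplane   = 160 + 160             # fan-in + fan-out pins for eta sharing
print(signal_pins, ground_pins, backplane)   # 160 40 320
```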

CPM challenges: latency
- CP system latency budget: ~14 ticks
  - This is a very difficult target
  - Note that the CPM is only the first stage of the CP system: the CMM needs about 5 ticks
- CPM latency – irreducible:
  - Input cables: > 2 ticks
  - LVDS deserialisers: ~2 ticks
  - Mux/demux to 160 MHz: ~1 tick
  - BC-demuxing algorithm: 1 tick
- Remaining budget = 3 ticks!
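
The remaining budget follows directly; a sketch of the tick accounting (the individual figures are the approximations from the list above, taken at their nominal values):

```python
# Latency bookkeeping in 25 ns ticks, following the budget above.
budget = 14                       # CP system latency budget
cmm    = 5                        # downstream merger stage (approx.)
cpm_irreducible = {
    "input cables":          2,   # quoted as "> 2"
    "LVDS deserialisers":    2,   # approx.
    "mux/demux to 160 MHz":  1,   # approx.
    "BC-demuxing algorithm": 1,
}
remaining = budget - cmm - sum(cpm_irreducible.values())
print(remaining, "ticks left for all CPM processing")   # 3
```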

Conclusions
- The CPM is a very complex module
- Difficulties include:
  - High connectivity
  - Multiple time domains
  - Tight constraints on latency
  - Large overall system size
- Extensive testing has shown that the current prototype CPM meets these demands