The Digital Algorithm Processors for the ATLAS Level-1 Calorimeter Trigger
Samuel Silverstein, Stockholm University, for the ATLAS TDAQ collaboration
Digital Algorithm Processors for ATLAS L1Calo, RT2009 Beijing

Acknowledgements
The work presented here is a major collaborative effort among the member institutes in L1Calo:
UK: Rutherford Appleton Laboratory, University of Birmingham, Queen Mary University of London
Germany: University of Heidelberg, Johannes Gutenberg University of Mainz
Sweden: Stockholm University
Thanks to the many collaborators whose work is included in this talk.
NB: Much more information and detail in the conference proceeding and TNS submission.
ATLAS Trigger/DAQ [block diagram]
Level-1 (hardware): calorimeter and muon triggers feed the Central Trigger Processor (CTP); 40 MHz input, L1 accept rate 75 kHz, latency < 2.5 μs.
Level-2 (software): processes Regions of Interest (RoI) delivered via the RoI Builder (RoIB), requesting RoI data from the Readout System (ROS); L2 accept rate ~2 kHz, latency < 40 ms.
Event Filter: full events from the Event Builder; accept rate ~200 Hz, latency ~1 s.
Data rates shown in the diagram: 120 Gb/s (detector readout), 3 Gb/s (event building), 300 Mb/s (to storage).
L1 Calorimeter trigger [block diagram]
Analog tower sums (0.1 × 0.1) enter the PreProcessor (PPr), which feeds two processors:
- Cluster Processor (CP): e/γ/had clusters on 0.1 × 0.1 EM and hadronic towers
- Jet/E_T Processor (JEP): 0.2 × 0.2 EM and hadronic elements
Results go to the CTP. DAQ RODs read out input/output data to DAQ; RoI RODs send feature types/positions over fiber to Level-2. TTCvi provides timing; slow control via CANbus to DCS.
Sliding window algorithms
Jet algorithm: window sizes 0.4 × 0.4, 0.6 × 0.6 and 0.8 × 0.8.
EM cluster algorithm: cluster plus surrounding "environment".
Both require a local maximum, to avoid double counting of features.
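The sliding-window idea can be sketched in software. This is a toy model, not the actual firmware: grid size, threshold and the 2×2 (0.4 × 0.4) window are illustrative, and real hardware resolves ties between equal window sums with a mixed >/>= convention, whereas this sketch uses >= throughout.

```python
# Toy sliding-window feature finder over a 2D grid of jet elements.
# A 2x2 window sum is formed at every position; a candidate is kept only
# where the sum exceeds a threshold AND is a local maximum among the
# neighbouring windows (this is what prevents double counting).

def window_sums(et, w):
    rows, cols = len(et), len(et[0])
    return [[sum(et[i + di][j + dj] for di in range(w) for dj in range(w))
             for j in range(cols - w + 1)]
            for i in range(rows - w + 1)]

def local_maxima(sums, threshold):
    rows, cols = len(sums), len(sums[0])
    hits = []
    for i in range(rows):
        for j in range(cols):
            if sums[i][j] <= threshold:
                continue
            neigh = [sums[a][b]
                     for a in range(max(0, i - 1), min(rows, i + 2))
                     for b in range(max(0, j - 1), min(cols, j + 2))]
            if sums[i][j] >= max(neigh):   # local-maximum requirement
                hits.append((i, j))
    return hits

et = [[0] * 8 for _ in range(8)]
et[3][3], et[4][4] = 50, 30                # one energetic cluster
print(local_maxima(window_sums(et, 2), threshold=10))   # [(3, 3)]
```

Without the local-maximum test, every window overlapping the cluster would fire; with it, only the best-placed window reports a feature.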
Digital algorithm processors
Two independent subsystems: the Cluster Processor (CP) and the Jet/Energy-sum Processor (JEP).
These have evolved together: common architecture and technology choices, and in many cases common hardware.
Outline
- Common architecture: input data links, readout scheme, slow control
- Hardware
- Production/commissioning experience
- Outlook (upgrade)
Input data links
Technology: 480 MBaud serial links, National Semiconductor Bus-LVDS; 10 bits of data @ 40 MHz + 2 sync bits; 11 m shielded parallel-pair cables for all links.
Jet/Energy-sum Processor (2 crates): 0.2 × 0.2 "jet elements" (9 bits + parity), ~1400 links per crate.
Cluster Processor (4 crates): 0.1 × 0.1 "trigger towers" from the PreProcessor (8 bits + parity), ~2240 EM and hadronic trigger towers per crate!
We exploit a feature of the PreProcessor to halve the number of links needed in the CP (to ~1120).
PreProcessor digital filter
A FIR filter followed by a peak finder is used to identify the bunch crossing of each pulse.
The peak-finder output has no consecutive non-zero values!
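A minimal sketch of this pulse processing, with illustrative filter coefficients and pulse shape (not the deployed values): the FIR sharpens the sampled pulse, and the peak finder keeps a sample only where it exceeds its predecessor and is at least its successor, so the output can never have two consecutive non-zero values.

```python
# Toy FIR filter + peak finder in the style of the PreProcessor chain.
# Coefficients and pulse samples are illustrative only.

def fir(samples, coeffs):
    """Sliding dot product of the coefficient window over the samples."""
    n = len(coeffs)
    return [sum(c * samples[i + k] for k, c in enumerate(coeffs))
            for i in range(len(samples) - n + 1)]

def peak_finder(filtered):
    """Keep a sample only if it is a strict rise followed by a non-rise;
    all other samples are zeroed, so non-zeros are never adjacent."""
    out = [0] * len(filtered)
    for i in range(1, len(filtered) - 1):
        if filtered[i - 1] < filtered[i] >= filtered[i + 1]:
            out[i] = filtered[i]
    return out

pulse = [0, 1, 4, 9, 6, 2, 1, 0]           # samples at 40 MHz (toy)
filtered = fir(pulse, [1, 2, 3, 2, 1])     # illustrative coefficients
peaks = peak_finder(filtered)              # [0, 50, 0, 0]
# The guarantee BCMUX relies on:
assert not any(peaks[i] and peaks[i + 1] for i in range(len(peaks) - 1))
```

The single surviving sample identifies the bunch crossing of the pulse; the "no consecutive non-zeros" guarantee is what the BC multiplexing scheme on the next slide exploits.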
BC Multiplexing (BCMUX)
Two neighbouring trigger towers are paired on one link. When a tower is non-zero, two consecutive words are sent: energy (8 bits), parity, and a BCMUX disambiguation flag.
The JEP cannot use BCMUX: for its 4-tower sums, a full BC is not necessarily followed by an empty one.
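A toy model of the multiplexing idea (the flag convention here is invented for illustration, not the exact L1Calo encoding): because the peak finder guarantees a non-zero sample is never followed by another, the otherwise idle slot after a firing bunch crossing can carry the paired tower's value.

```python
# Toy BCMUX: two towers share one link. When either fires at BC n, word
# n carries the first tower's energy with a flag naming it, and word n+1
# carries the paired tower's energy. Both towers are guaranteed zero at
# n+1, so nothing is lost. Flag convention is illustrative only.

def bcmux_encode(tower_a, tower_b):
    """tower_a, tower_b: per-BC energies with no consecutive non-zeros."""
    link, skip = [], False
    for a, b in zip(tower_a, tower_b):
        if skip:                       # slot consumed by previous pair
            skip = False
            continue
        if a or b:
            first_is_a = bool(a)
            link.append((a if first_is_a else b, 0 if first_is_a else 1))
            link.append((b if first_is_a else a, None))  # companion word
            skip = True
        else:
            link.append((0, None))     # empty BC
    return link

def bcmux_decode(link, n_bc):
    a_out, b_out = [0] * n_bc, [0] * n_bc
    i, bc = 0, 0
    while bc < n_bc and i < len(link):
        val, flag = link[i]
        if flag is None:               # empty BC
            i, bc = i + 1, bc + 1
            continue
        companion = link[i + 1][0]
        if flag == 0:
            a_out[bc], b_out[bc] = val, companion
        else:
            b_out[bc], a_out[bc] = val, companion
        i, bc = i + 2, bc + 2          # slot n+1 was the companion word
    return a_out, b_out

a, b = [0, 17, 0, 0, 0], [0, 9, 0, 3, 0]
assert bcmux_decode(bcmux_encode(a, b), 5) == (a, b)
```

This is exactly why the JEP cannot use the trick: 4-tower sums no longer guarantee that a full BC is followed by an empty one, so the companion slot may not be free.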
Readout
Optical fiber with HP/Agilent G-Links: up to 20 parallel bits of user data at 40 MHz.
Readout is encoded in parallel bitstreams; a Data Available (DAV) bit flags a new readout frame arriving at the ROD.
Example: 84-bit CPM DAQ readout frame [figure: G-Link bit vs. time].
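The framing scheme can be sketched as follows. The layout here is illustrative, not the exact CPM format: 12 data lines × 7 ticks happens to give an 84-bit frame, and one extra line models the DAV flag that lets the ROD find frame boundaries in a continuous tick stream.

```python
# Toy G-Link-style framing: each 40 MHz tick carries parallel user bits;
# one bit is the Data Available (DAV) flag, high on the first tick of a
# frame. The receiver (ROD side) uses DAV to delimit frames.

FRAME_TICKS = 7        # e.g. 84 bits on 12 data lines -> 7 ticks (toy)

def send(frames, idle_gap=3):
    """Return (dav, data_bits) ticks: frames separated by idle ticks."""
    ticks = []
    for frame in frames:               # frame: list of per-tick bit lists
        for t, bits in enumerate(frame):
            ticks.append((1 if t == 0 else 0, bits))
        ticks.extend((0, [0] * len(frame[0])) for _ in range(idle_gap))
    return ticks

def receive(ticks):
    """Collect FRAME_TICKS ticks after each DAV flag; ignore idle ticks."""
    frames, current = [], None
    for dav, bits in ticks:
        if dav:
            current = []
        if current is not None:
            current.append(bits)
            if len(current) == FRAME_TICKS:
                frames.append(current)
                current = None
    return frames

frame = [[(t + line) % 2 for line in range(12)] for t in range(FRAME_TICKS)]
assert receive(send([frame, frame])) == [frame, frame]
```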
Slow control
ATLAS DCS uses CANbus to monitor crate temperatures, voltages, etc.
Module-level monitoring in L1Calo: an internal CANbus on the crate backplane; a Fujitsu microcontroller with a 10-bit ADC on each board monitors FPGA temperatures, voltages, currents, etc.
One microcontroller provides the bridge between the internal CANbus and DCS.
Jet and CP hardware [crate layouts]
CP crate: CPU, CAM, 14 CPMs, two CMMs, TCM — one quadrant per crate.
JEP crate: CPU, 16 JEMs, two CMMs, TCM — two quadrants per crate.
Cluster Processor Module (CPM)
Input stage: 20 FPGAs (XCV100E). Processing stage: 8 FPGAs (XCV1000E). Merging stage: 2 FPGAs (XCV100E).
18-layer PCB, minimum feature size 0.003".
[Photo: input, processing and merging stages; backplane connector.]
CPM real-time data path [diagram]
LVDS input from the PreProcessor → input stage → processing stage (with module fan-in/out over the backplane) → merging stage → 8 hit counts to CMM '0' (electron) and 8 hit counts to CMM '1' (electron or tau).
Readout [diagram]
A TTCrx receives TTC signals (L1A, BC clocks). On L1A, tower data from the input stage, RoIs from the processing stage and hits from the merging stage are captured for the DAQ and RoI readout paths.
Control and Monitoring [diagram]
The VME-- bus to the crate CPU gives access to the input, processing and merging stages. An on-board microcontroller (μC) monitors temperatures (°C), voltages (V) and currents (I), reporting over CAN bus to DCS.
Jet/Energy Module (JEM)
Input stage: 16 six-channel LVDS deserialisers (SCAN921260) and 20 FPGAs (XC2V1500). Processing stage: 2 FPGAs (XC2V3000).
14-layer main board with most functionality on daughter modules.
Jet/Energy Module (JEM) [conceptual drawing]
Results merging [diagram]
CPMs/JEMs send local crate data via the backplane to two merger modules in each crate (EM and hadron clusters in the CP; energy sums and jets in the JEP).
A Crate FPGA performs crate-level merging; a System FPGA in one crate performs system-level merging, and the final results go to the CTP.
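The two-stage merging tree can be sketched numerically. The field widths and counts below are illustrative (hit counts are modelled here as 3-bit saturating sums, an assumption, not a statement of the deployed format):

```python
# Toy two-stage results merging: per-crate Crate FPGAs sum module hit
# counts, then a System FPGA sums the crate-level results for the CTP.
# Hit counts are modelled as saturating few-bit fields.

SAT = 7                                # e.g. a 3-bit saturating count

def sat_add(values):
    return min(sum(values), SAT)

def crate_merge(module_hits):
    """module_hits: per-module lists of hit counts, one per threshold."""
    return [sat_add(m[i] for m in module_hits)
            for i in range(len(module_hits[0]))]

def system_merge(crate_results):
    return [sat_add(c[i] for c in crate_results)
            for i in range(len(crate_results[0]))]

# Two crates, each with three modules reporting counts for 2 thresholds:
crate0 = crate_merge([[1, 0], [2, 1], [0, 0]])   # -> [3, 1]
crate1 = crate_merge([[4, 2], [5, 0], [1, 1]])   # -> [7, 3] (saturated)
print(system_merge([crate0, crate1]))            # -> [7, 4]
```

Saturating rather than wrapping keeps a busy event from aliasing to a small count, which matters when the CTP compares multiplicities against thresholds.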
Common Merger Module (CMM)
One multi-purpose module performs cluster, jet or energy-sum merging.
Xilinx System ACE is used to automatically load the correct firmware based on the geographic address.
Processor backplane
1148 pins per slot; 18 layers, 8 of them signal layers (2 environment sharing, 4 results merging, 2 TTC fanout), plus the reduced VME bus.
Backplane hardware
Vertical stiffening bars provide rigidity against insertion/ejection forces (~450 N) and mounting points for cable retention and LV bus-bar hardware.
Rear transition modules for CMM-to-CMM cabling.
Other backplane features
Reduced VME bus (VME--), constrained by the limited pin count: A24/D16 slave cycles, 3.3 V levels.
Extended geographic addressing: position in crate, plus which crate in the system (set by rotary switch).
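Combining the two address sources can be sketched as below. Everything here is an assumption for illustration: the slot count, the ID arithmetic and the firmware file names are hypothetical, standing in for the System ACE firmware selection described for the CMM.

```python
# Toy extended geographic addressing: slot number comes from backplane
# geographic-address pins, crate number from a rotary switch. Together
# they give each module a unique system-wide identity, which can select
# the firmware image to load (file names below are hypothetical).

def module_id(crate, slot, slots_per_crate=21):
    """Unique system-wide ID; slots_per_crate is an assumed crate size."""
    assert 0 <= slot < slots_per_crate
    return crate * slots_per_crate + slot

def firmware_for(crate, slot, fw_map):
    """Pick a firmware image by geographic address, with a fallback."""
    return fw_map.get((crate, slot), "default.ace")

fw_map = {(0, 3): "cmm_jet.ace", (0, 20): "cmm_energy.ace"}
assert firmware_for(0, 3, fw_map) == "cmm_jet.ace"
assert firmware_for(0, 5, fw_map) == "default.ace"
```

The benefit is operational: identical multi-purpose boards can be inserted anywhere, and each configures itself correctly from where it finds itself in the system.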
Fully cabled rack (JEP) [photo]: CMM-CMM merging cables; LVDS input link cables.
Other modules
Common ROD for DAQ and RoI readout: FPGA-based (XC2VP20); many different firmware variants to process the different readout formats; System ACE + geographic addressing used.
Timing and Control Module (TCM): distributes the TTC signal; bridges between DCS and the internal CANbus; VME display.
Clock Alignment Module (CAM): phase measurement between crate and module clocks, important for timing-in of the CP.
Summary — things learned
Common architecture/hardware saved much engineering, development and testing effort, meant fewer module types and spares, and simplified software development.
Even conservative solutions come with costs: there were many compelling reasons to use 400 Mbit/s LVDS, but the resulting cable plant was near the limits of what was feasible; the parallel backplane was a conservative design, but its high pin count caused problems of its own (insertion forces, vulnerable male backplane pins).
Count on production delays! Manufacturer-related errors occurred on many major system components.
Outlook: LHC upgrade
A two-phase LHC luminosity upgrade is expected: Phase 1 (2014): 2-3 × 10^34 cm^-2 s^-1; Phase 2 (2018): 10^35 cm^-2 s^-1.
Phase 1 has a short time scale, so no fundamental change to the system — but we would like to expand L1Calo capabilities: topological triggers? Identifying overlapping features in CP and JEP?
Q: Can we put feature positions into the Level-1 real-time data path for Phase 1?
Data merger transmission
Transmission tests of the longest merger lines: 2.5 V CMOS levels, with parallel destination termination (~JEM) and series source termination (~CPM), at 160 Mbit/s and 320 Mbit/s. [Measurement plots]
L1Calo Phase-1 upgrade [block diagram]
As before, analog tower sums (0.1 × 0.1) → PreProcessor (PPr) → CP (e/γ/had clusters, 0.1 × 0.1) and JEP (jets/E_T, 0.2 × 0.2).
New: cluster RoIs and jet RoIs (and muon RoIs?) sent over high-speed fiber links to a Global Merger, which feeds the CTP.
Thank you for your attention!