Slide 1: Trigger Electronics at SLHC
CMS SLHC CAL-TRIGGER Workshop, Madison, 29/11/07
Costas Foudas, Imperial College London

Overview of this talk:
– The current trigger system in retrospect.
– Ideas for a new trigger system at the SLHC.
– SLHC trigger R&D plans.
– The Trigger WP3 of the UK SLHC proposal.

Contributors: M. Hansen (CERN); C. Foudas, G. Iles, A. Rose, M. Stettler, G. Sidiropoulos (Imperial); D. Baden (Maryland); J. Jones (Princeton); S. Dasu, W. Smith (Wisconsin)

Slide 2: Current LHC Level-1 Triggers

The need to introduce silicon tracking information into the L1 trigger decision at the SLHC forces us to reconsider the architecture of L1 trigger systems:
– Typical L1 triggers are made of highly specialized, custom-built hardware platforms, each processing the trigger information from a specific detector. They are prohibitively expensive, complex, and hard to commission and maintain over the duration of an experiment.
– Even those systems which use FPGAs are not flexible enough to meet the physics requirements of the SLHC era, since the data paths are usually fixed by the PCB design. Hence they cannot respond easily to the unexpected.
– We now find that they often lack speed and processing contingency.
– They lack any form of standardisation.
Recent industry advances in FPGA and optical link technologies allow the design of L1 trigger systems whose algorithm sophistication approaches that of High Level Triggers, which are based on commercial devices (PCs + cross-point switches) programmed in C++.

Slide 3: The CMS L1 Trigger: Looking Forward

The current CMS L1T is made of ~100 different electronic cards:
– Each card needs to be maintained for about a decade.
– Each card needs its own expert, and this will continue to be so during the entire LHC run.
– Each card needs specific software and firmware whose development will tail off over the LHC run only if we don't make drastic changes to the trigger during the run.
L1T systems today, at CMS and elsewhere, lack a standard platform because at the time of their design there was none suitable for all systems. The obvious question is: must this continue to be so at the SLHC?
We believe that we have reached the limit of complexity at the LHC. We propose to develop a system based on a single generic processing card (see talks by M. Stettler, J. Jones) and the uTCA backplane:
– Large FPGAs and cross-point switches provide enhanced computing capability and considerable modularity.
– Multi-Gb/s optical links allow for fast interconnects.
The current GCT system already goes some way in this direction...

Slide 4: Another Way: The GCT Example

The project started in January 06 with the goal of designing and constructing a system which uses the RCT data to provide the final lists of electron (sorting) and jet (finding) triggers to the GT. A large part of this system (electrons) is already commissioned at USC-55. (A sketch of the kind of sorting involved follows after this slide.)
A large factor in the relatively quick evolution of the project is that we borrowed technologies from industry, such as MGTs, optical link standards and of course FPGAs. We also borrowed Matt from LANL.
Worth mentioning:
– GCT has one main processing card, the Leaf card, largely designed by Los Alamos National Laboratory, with the rest of the groups providing interfacing hardware, firmware and software.
– The rest of the GCT cards are there mainly to provide interfacing to the different systems around it, and would not be needed if we had a standard hardware platform.

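As an illustration of the kind of algorithm the GCT runs, below is a minimal C++ sketch of a rank-based electron sort: candidates received from the RCT are ordered by rank and the highest few are kept. The EmCandidate struct, its field names and the choice of keeping four candidates are illustrative assumptions for this sketch, not the actual GCT firmware or CMSSW emulator code.

```cpp
// Minimal sketch of a rank-based electron sort, of the kind the GCT performs
// on RCT candidates. Struct and field names are illustrative, not CMS code.
#include <algorithm>
#include <cstdint>
#include <vector>

struct EmCandidate {
    uint16_t rank;   // transverse-energy rank assigned upstream
    uint8_t  region; // coarse position index (illustrative)
};

// Keep the n highest-rank candidates, highest first.
std::vector<EmCandidate> sortAndTruncate(std::vector<EmCandidate> cands,
                                         std::size_t n = 4) {
    std::sort(cands.begin(), cands.end(),
              [](const EmCandidate& a, const EmCandidate& b) {
                  return a.rank > b.rank;
              });
    if (cands.size() > n) cands.resize(n);
    return cands;
}
```

In firmware the same operation would be implemented as a pipelined compare-and-swap network rather than a sequential sort, but the input/output relation is the one shown above.
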
Slide 5: The GCT Leaf Card

The GCT project was approved in January 06. The electron trigger was commissioned in September 07. The jet finder will be installed in January 08. 16 production Leaf cards have been manufactured and tested.
(Photo of the Leaf card, with annotations:)
– Virtex-II Pro (P70) FPGAs, 16 SERDES per FPGA.
– Input: 32 x 1.6 Gb/s optical links.
– Daisy-chain connectors for exchanging data with other Leaf cards.
– Optimized for jet finding (600 ns latency); a sliding-window sketch follows after this slide.

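Since the Leaf card's main job is jet finding, here is a minimal C++ sketch of a 3x3 sliding-window jet finder over a grid of calorimeter region energies, which is the general style of algorithm run at this stage of the calorimeter trigger. The grid dimensions, edge and tie handling, and the zero-suppression threshold are illustrative assumptions, not the actual GCT firmware.

```cpp
// Minimal sketch of a 3x3 sliding-window jet finder over calorimeter region
// energies. Grid size, edge handling and tie-breaking are illustrative.
#include <cstdint>
#include <vector>

struct Jet { int eta; int phi; uint32_t et; };

// regions[eta][phi] holds region transverse energies on an integer trigger
// scale; phi wraps around, and the eta edges are skipped for simplicity.
std::vector<Jet> findJets(const std::vector<std::vector<uint32_t>>& regions) {
    std::vector<Jet> jets;
    if (regions.empty()) return jets;
    const int nEta = static_cast<int>(regions.size());
    const int nPhi = static_cast<int>(regions[0].size());
    for (int e = 1; e + 1 < nEta; ++e) {
        for (int p = 0; p < nPhi; ++p) {
            const uint32_t centre = regions[e][p];
            uint32_t sumEt = 0;
            bool isLocalMax = true;
            for (int de = -1; de <= 1; ++de) {
                for (int dp = -1; dp <= 1; ++dp) {
                    const uint32_t v = regions[e + de][(p + dp + nPhi) % nPhi];
                    sumEt += v;
                    // Real implementations break ties asymmetrically to avoid
                    // double counting; a plain ">" test is enough for a sketch.
                    if ((de != 0 || dp != 0) && v > centre) isLocalMax = false;
                }
            }
            if (isLocalMax && centre > 0)
                jets.push_back({e, p, sumEt});  // jet ET = 3x3 window sum
        }
    }
    return jets;
}
```

In firmware every 3x3 window would be evaluated in parallel each bunch crossing, which is how a latency of order 600 ns becomes achievable.
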
Slide 6: The GCT System

(Block diagram of the GCT system:) 63 Source cards, 8 Leaf cards, 2 Wheel cards and 1 Concentrator card. Two groups of Source cards (31 and 32) feed the e/gamma Leafs and the jet Leafs (3 jet Leafs per Wheel), and the Wheel cards feed the Concentrator.

Slide 7: A Generic Trigger System

For the past year we have been investigating the possibility of developing a generic trigger system, the post-GCT generation of triggers.
The system is based on the uTCA telecom standard and uses Virtex-5 FPGAs with 16 MGTs per chip at 3 Gb/s (but could have up to 20 MGTs).
The innovative idea here is that the processor card, instead of a second FPGA as in the Leaf card case, has a 72x72 cross-point switch which can be configured either dynamically or statically (see the routing sketch after this slide).
The card has optical sockets, but unlike the Leaf they are not all inputs: they are distributed as 12 inputs, 12 outputs and 8 I/O.
The cards plug into a uTCA crate with a custom active backplane which also carries two 72x72 cross-point switches.
We are already building such a small system to handle the Q/M bits of the CMS L1 Trigger.

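To make the static-versus-dynamic configuration idea concrete, below is a minimal C++ model of a cross-point switch: a 72x72 port map that can be loaded once at configuration time (static use) or rewritten between runs (dynamic use). The class name, the API and the choice of modelling the switch as a simple output-to-input map are illustrative assumptions, not the control interface of any real device.

```cpp
// Toy model of a 72x72 cross-point switch: each output port is driven by at
// most one input port. Names and API are illustrative, not a real device API.
#include <array>
#include <cstdint>
#include <optional>

class CrosspointSwitch {
public:
    static constexpr int kPorts = 72;

    // "Static" use: load a complete routing table once at configuration time.
    void loadTable(const std::array<std::optional<int>, kPorts>& outToIn) {
        routes_ = outToIn;
    }

    // "Dynamic" use: reroute a single output between runs.
    void connect(int input, int output) { routes_[output] = input; }
    void disconnect(int output) { routes_[output].reset(); }

    // Forward one word per port through the current configuration.
    std::array<uint64_t, kPorts>
    route(const std::array<uint64_t, kPorts>& inputs) const {
        std::array<uint64_t, kPorts> out{};
        for (int o = 0; o < kPorts; ++o)
            if (routes_[o]) out[o] = inputs[*routes_[o]];
        return out;
    }

private:
    std::array<std::optional<int>, kPorts> routes_{};
};
```

The point of the real hardware is that this mapping is applied to the serial links themselves, so reprogramming it changes which FPGA or backplane lane sees which optical input without any PCB change.
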
Slide 8: The Proposed System

The Main Processing Card (MPC):
– Receives and transmits data via front-panel optical links.
– An on-board 72x72 cross-point switch allows dynamic routing of the data either to a V5 FPGA or directly to the uTCA backplane.
– The MPC can exchange data with other MPCs either via the backplane or via the front-panel optical links.
The custom uTCA backplane:
– Instrumented with 2 more cross-point switches for extra algorithm flexibility.
– Allows dynamic or static routing of the data to different MPCs.
(Figures: concept drawing of the Main Processing Card; uTCA crate and backplane.)

Slide 9: Capabilities and Aims

This system combines:
– Powerful FPGAs with considerable on-chip computing and I/O capabilities (large RAMs, PowerPC cores, clock management, multi-Gb/s serializers and deserializers, ...).
– State-of-the-art cross-point switches.
– Optical links.
Together these provide an extremely modular and flexible trigger device. It aims to be used as a standard platform for all L1 algorithms at the SLHC:
– It will have the capability to process data from all CMS detectors participating in the first-level trigger decision (muon and silicon trackers, calorimeters, ...).
– It will have the potential to replace the ~100 different devices which constitute the current CMS L1 trigger.
The hardware will be designed at LANL, and collaborating institutes will provide the trigger algorithm firmware in a similar way as they provide filter code for HLT algorithms (a sketch of such a plug-in contract follows after this slide).

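The "central hardware, distributed algorithms" model can be pictured as a plug-in contract: the platform team defines the data-in/data-out interface and each institute supplies an implementation, much as HLT filter modules do in software. The C++ interface below is purely a hypothetical illustration of that contract; it is not an existing CMS or WP3 API, and all names are invented for the sketch.

```cpp
// Hypothetical illustration of a "standard platform + plug-in algorithm"
// contract. Nothing here is an existing CMS interface.
#include <cstdint>
#include <vector>

// One bunch crossing's worth of input words from the optical links / switch.
using LinkData = std::vector<uint64_t>;

// Contract every algorithm block has to satisfy, whatever detector it serves.
class TriggerAlgorithm {
public:
    virtual ~TriggerAlgorithm() = default;
    // Fixed-latency processing step: consume this crossing's input words and
    // produce the words to be sent onwards (to the GT or to another MPC).
    virtual LinkData process(const LinkData& in) = 0;
};

// Example plug-in: an institute-provided algorithm derives from the base
// class and is dropped onto the common card without any hardware change.
class ToyThresholdAlgorithm : public TriggerAlgorithm {
public:
    LinkData process(const LinkData& in) override {
        LinkData out;
        for (uint64_t word : in)
            if (word > threshold_) out.push_back(word);  // placeholder cut
        return out;
    }
private:
    uint64_t threshold_ = 0;
};
```
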
Slide 10: Advantages of this System

Modularity:
– One can take advantage of the backplane and the cross-point switch to connect several processor cards together according to the physics/detector needs. It is reasonable to hope that one can replace the L1 trigger system with a specific configuration of the processor cards.
Proven industry standards:
– uTCA, optical links and cross-point switches.
Standard platform:
– The cards can be produced once by a competent lab and distributed to the institutions interested in developing specific trigger algorithms.
Cost:
– It saves a considerable amount of R&D funds.
L1T sociology changes:
– Smaller groups without in-house state-of-the-art technology capabilities can still contribute, as they do in the HLT.

Slide 11: Comments

All this should not be understood as the end of the need for engineers in triggering. Obviously we need considerable firmware effort to configure the hardware for specific trigger needs. Programming devices with MGTs is not something for the average physicist, nor even for the most hardware-oriented among us: you still need a professional FPGA engineer to contribute here.
Physicists and university engineers can contribute to algorithm development as well as to building the detector-interfacing electronics.
What one saves is the manpower cost of schematic capture, layout and prototype testing for 100 different devices, which is of course considerable. Furthermore, there are considerable savings in long-term maintenance and spares.
In the end the L1T group looks more and more like the HLT group, where one central place provides the hardware platform and the university groups are mainly involved in delivering firmware, software and hardware interfaces. HLT collaborators have been working in this mode for years now.

Slide 12: Institutions Involved

The institutions that have expressed interest in this are:
– UK: Bristol, Imperial
– EU: CERN
– US: Maryland, Los Alamos, Princeton, Wisconsin
Initial discussions on specific responsibilities have indicated an interest in:
– Calorimeter TPG: Maryland
– Tracker TPG: Princeton
– Cal object algorithms: Imperial, Wisconsin
– Tracker object algorithms: Bristol (simulations)
– Tracker-calorimeter matching: Imperial, + ?

Slide 13: Status of the Q/M System

The processor card has been designed (M. Stettler) and the parts have been bought. The card is under layout at Los Alamos.
The backplane has been designed (J. Jones + M. Stettler), but we will wait to test the processor card first.
A uTCA crate and a commercial backplane have been bought and are already at CERN.
I foresee the first prototypes arriving at CERN in January 08.
The architecture of the card is already fixed, so people interested in developing VHDL algorithms could start now, with the aim of having some hardware to test in summer 08.

Slide 14: The CMS UK SLHC Proposal

We have recently submitted a proposal to develop generic triggering devices for tracking and calorimeter triggers at the SLHC. This is part of a larger proposal which includes:
– Tracker Readout (WP2: G. Hall, M. Raymond)
– Trigger Simulations (WP1: D. Newbold)
– Trigger Devices (WP3: C. Foudas)
The proposed trigger programme is to develop the next generation of the CMS Q/M system.
The proposal is at the peer-review stage. The final meeting with the review panel is in January 08. Funding should be available in the second half of 08.

Slide 15: The Trigger WP3 Programme

A 3-year programme is proposed to design and manufacture a system consisting of a uTCA crate with 2-4 MPCs and a uTCA backplane. This system will be used to perform feasibility studies for L1 trigger algorithms.
Year 1 (08-09):
– Y1-A: MPC design complete.
– Y1-B: uTCA backplane schematic capture complete.
– Y1-C: Preliminary MPC firmware ready.
– Y1-D: MPC emulator framework complete.
Year 2 (09-10):
– Y2-A: MPC and uTCA backplane prototypes.
– Y2-B: MPC, uTCA integration complete.
– Y2-C: Initial performance tests complete.
Year 3 (10-11):
– Y3-A: Major algorithm performance tests and comparisons with the emulator.
– Y3-B: Publish the results.

Slide 16: WP3: Resources - Team

UK-based resources:
– C. Foudas, Imperial: PM
– G. Iles, Imperial: Firmware
– M. Noy, Imperial: Firmware
– A. Tapper, RA, Pr. Student: Emulator, Algorithms
LANL (US)-based resources:
– M. Stettler: Chief Engineer
– LANL schematic capture: £250k
– LANL layout: £150k
Materials:
– Hardware manufacturing: £66k
US collaborating institutes:
– LANL
– Univ. of Maryland
– Princeton University
– University of Wisconsin

Slide 17: What Needs to Be Done to Get Started

With everybody busy debugging and commissioning CMS, there is a severe lack of firmware and simulation effort.
Goal 1: Adapt some of the HLT algorithms which use the tracking (muon, tau, electron) to run on FPGAs with reasonable assumptions, and simulate them. The aim is to provide the trigger point of view to the groups which have started designing and simulating the new tracker, and to get a first idea of how the efficiency and the rate change for the various detector configurations (D. Newbold).
Goal 2: We need to optimize (minimize) the input rate; that is, we need to answer the question of what is the minimum amount of data that must be transferred from the detector to provide a decent tracking trigger. For this one needs both firmware and CMSSW simulations (D. Newbold).
Goal 3: We need to port existing toy algorithms to V5 platforms and simulate them. The goal here is to get an idea of the latency of a calorimeter-type trigger and of the various parts of the stub finder, and to gain experience with V5s (a toy stub-finder sketch follows below).
Goal 4: Towards the end of 08 we will start distributing the first processing cards to the groups which have signed up. We need to determine fairly early whether this system can be a trigger standard, so that time is allowed for iterations which can add new features.

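For Goal 3, a "toy" stub finder can be emulated in a few lines before any firmware exists: pair hits in two closely spaced sensor layers of a module and keep pairs whose bend (difference in local coordinate) falls within a window, as a proxy for a transverse-momentum cut. The C++ sketch below is exactly such a toy; the data layout and the window value are illustrative assumptions, not a proposed CMS design.

```cpp
// Toy stub finder: correlate hits in two closely spaced layers of a module
// and keep pairs with a small bend, as a stand-in for a pT cut.
// All numbers and structures are illustrative.
#include <cstdlib>
#include <vector>

struct Hit  { int strip; };           // strip index within the module
struct Stub { int seedStrip; int bend; };

std::vector<Stub> findStubs(const std::vector<Hit>& innerLayer,
                            const std::vector<Hit>& outerLayer,
                            int bendWindow = 3)       // strips; illustrative
{
    std::vector<Stub> stubs;
    for (const Hit& inner : innerLayer) {
        for (const Hit& outer : outerLayer) {
            const int bend = outer.strip - inner.strip;
            // High-pT tracks cross the two layers almost radially, giving a
            // small bend; soft tracks give a large one and are rejected.
            if (std::abs(bend) <= bendWindow)
                stubs.push_back({inner.strip, bend});
        }
    }
    return stubs;
}
```

In firmware the double loop would become a parallel comparison of cluster addresses within a fixed window each bunch crossing; the emulator's job is to define the expected output against which that firmware, and its latency, can be checked.
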