Status of ETD/Online - D. Breton, U. Marconi, S. Luitz

Status of ETD/Online
D. Breton, U. Marconi, S. Luitz
WS summary plenary session, March 19th 2010
SuperB Annecy Workshop, March 2010

Issues from ETD sessions
- Tuesday afternoon, during the detector plenary session, the problem of the trigger latency was raised.
  - We agree that we now have to fix a maximum value for it, in order to allow the subdetectors to design their front-end electronics.
  - Based on experience and on what was implemented in the LHC experiments, we decided to fix it at 4 µs. This was announced in the ETD parallel session and is already written in the WP.
  - This is a maximum value: it could eventually be lowered, but it allows us to envisage a powerful Level 1 trigger. (A sketch of what this number means for front-end buffering follows below.)
- On Wednesday we had the ETD parallel session dedicated to front-end electronics.
  - Things look to be converging in the definition of the internal architecture of the subsystem electronics; no major modifications emerged during the workshop.
  - The budget seems to be nailed down (minor changes occurred this week).
- At the ETD parallel session on Thursday, the problem of a radiation map was raised: we need a more precise idea of the radiation levels for the TDR writing.
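To give a feeling for why fixing the maximum latency matters for front-end design, here is a minimal sketch of the pipeline-buffer depth a subdetector needs while waiting for the Level 1 decision. The 56 MHz clock is the base clock quoted for the PID TDC later in this summary; treating it as a generic front-end sampling clock is an assumption made purely for illustration.

```python
import math

L1_LATENCY_S = 4e-6  # maximum Level 1 latency fixed at this workshop (4 us)

def pipeline_depth(clock_hz: float, latency_s: float = L1_LATENCY_S) -> int:
    """Minimum number of samples a front-end must buffer awaiting L1.

    Illustrative only: a real design adds margin for trigger jitter
    and readout overheads on top of this minimum.
    """
    return math.ceil(clock_hz * latency_s)

# Example: a front-end sampling at 56 MHz needs at least
# 56e6 * 4e-6 = 224 cells in its latency buffer.
print(pipeline_depth(56e6))  # -> 224
```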

Electronics costing

Totals for WBS 1.7 Electronics: EDIA 994 MM (18877 K$ / 13103 k€), Labor 342 MM (6498 K$ / 4511 k€), M&S 10601 K$ / 7359 k€, TOTAL 35976 K$ / 24973 k€.

WBS     Item                       EDIA (MM)  Labor (MM)  M&S (k€)  Note
1.7.1   SVT                        11.0       21.0        468       estimated by Mauro
1.7.2   DCH                        74         76          1390      estimated by Giulietto
1.7.3   PID Barrel (32k channels)  136        18          510       estimated by Dominique
1.7.4   EMC                        110.0      164.0       2271.5
1.7.5   IFR                        37.5       51.0        1239.0    estimated by Angelo
1.7.6   Infrastructure             4          12          247
1.7.7   Systems Engineering        -          -           -         AJR estimates
1.7.8   Hardware Trigger           97         -           532       still based on BABAR's
1.7.9   ETD (without Trigger)      512        -           990.0     cost of ETD electronics

ETD parallel session: 16-channel TDC (D. Breton)
- We are currently developing a TDC for the PID barrel:
  - 16 channels;
  - steps of ~200 ps derived from a 168 MHz clock (3 x 56 MHz), resolution of 70 ps.
- Very simple architecture: hit data is simply pushed out on a parallel 16-bit bus.
  - It can be tailored to the targeted design in the companion FPGA.
  - Almost nothing to program inside.
- Radiation-tolerant design based on AMS 0.35 µm CMOS technology.
- This chip is available to anybody interested inside the SuperB community.
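The quoted step and resolution can be cross-checked with a little arithmetic. Note that the 32-fold subdivision of the clock period used below is an assumption chosen to reproduce the ~200 ps step; the slide does not say how the period is actually divided.

```python
# Illustrative arithmetic only; the factor 32 is an assumed subdivision.
CLOCK_HZ = 168e6             # 3 x 56 MHz
period_ps = 1e12 / CLOCK_HZ  # ~5952 ps clock period
step_ps = period_ps / 32     # ~186 ps, i.e. the quoted "~200 ps" step

# Pure binning would give an RMS of step/sqrt(12) ~ 54 ps, so the quoted
# 70 ps resolution must include other contributions (DNL, jitter, ...).
quantization_rms_ps = step_ps / 12**0.5
print(f"{step_ps:.0f} ps step, {quantization_rms_ps:.0f} ps quantization RMS")
```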

ETD parallel session: links (A. Aloisio)
- The test bench for the radiation qualification of the different links is ready.
- This is a crucial point for the validation of both commercial chipsets (control & readout).

ETD parallel session: coding (S. Cavaliere)
- A preliminary study of the Front-End Control coding for the SuperB serial links has been launched.
- This is an important feature to study for the clock and control distribution.

ETD parallel session: FCTS (D. Charlet)
- Work went ahead on the FCTS design.
- We concentrated on the different possible uses of ATCA crates:
  - custom backplane, custom links on the ATCA backplane, or ATCA itself?
  - should we use ATCA motherboards for the FCTM, or custom boards?
[Figures: custom backplane and crate; ATCA]

ETD parallel session: ECS (D. Charlet)
- The FPGA-based Ethernet interface for the ECS field-bus masters is now running.
  - It is currently being developed for LHCb (the Ethernet interface is an IP from Altera).
- Proposal for a new off-detector slave mezzanine.
[Figures: kit used for the master test bench; master test bench; proposed OFF-detector slave mezzanine; current version of the ON-detector slave mezzanine]

ETD parallel session: ROM (D. Galli)
- R&D is starting on implementing the UDP protocol on an FPGA, to be used as the ROM's output stage toward the PC farm.
- This is an outstanding R&D topic: an R&D project, TeraDAQ (PRIN), has been funded by the Italian Education and Research Ministry (MIUR). A minimal software sketch of the idea is given below.
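To make the data flow concrete, here is a minimal host-side emulation of what such an output stage does: pushing event fragments as UDP datagrams toward a farm node. The endpoint, port, and fragment header layout are invented for illustration; the actual R&D targets an FPGA implementation, not software.

```python
import socket
import struct

# Hypothetical farm-node endpoint; real addresses and the fragment format
# would be fixed by the event-builder design, not by this sketch.
FARM_NODE = ("10.0.0.42", 5044)

def send_fragment(sock: socket.socket, event_id: int, payload: bytes) -> None:
    """Ship one event fragment as a single UDP datagram.

    The 8-byte header (event id + payload length) is an assumed layout,
    just enough for a receiver to reassemble complete events.
    """
    header = struct.pack("!II", event_id, len(payload))
    sock.sendto(header + payload, FARM_NODE)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_fragment(sock, event_id=1, payload=b"\x00" * 512)
```

One design point worth keeping in mind: UDP has no built-in retransmission, so fragment loss has to be handled (or tolerated) at the event-builder level; that trade-off is part of what makes this R&D interesting.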

Points of discussion
- In BABAR, dead time (2.7 µs) was introduced in the FCTS system after a Level 1 trigger decision in order to simplify the front-end.
  - This was not a problem because of the luminosity and the low trigger rate (3 kHz).
  - In SuperB, at 150 kHz, this would be an important source of dead time (see the arithmetic below).
- Our baseline philosophy would rather be to leave the door fully open and put no restriction on the trigger.
  - The minimum distance between triggers would be due only to the capacity of the trigger processors to distinguish between consecutive events (~100 ns).
  - We have to properly manage all sorts of pile-up.
  - But the minimum inter-trigger spacing also fixes the maximum frame size.
- Safety factors on dataflow: we would like to understand the safety factors used in the readout-link calculations for each subsystem.
- Level 1 trigger derandomizers: we have to perform simulations to optimize their depth.
- ECS bandwidth: subdetectors should think about the bandwidth they need to set up the FEE within seconds at startup, or to reload it (which could happen often because of the radiation policy).
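The dead-time argument is simple arithmetic; the sketch below makes it explicit with the numbers quoted above, treating the fixed per-accept dead time with a simple non-paralyzable model (an assumption of this illustration).

```python
# Non-paralyzable dead-time model (assumed simplification):
# fraction of time lost = trigger rate * fixed dead time per accept.
def dead_time_fraction(trigger_rate_hz: float, dead_time_s: float) -> float:
    return trigger_rate_hz * dead_time_s

# BABAR: 2.7 us at 3 kHz -> ~0.8%, negligible.
print(dead_time_fraction(3e3, 2.7e-6))    # ~0.008

# SuperB: the same 2.7 us at 150 kHz would cost ~40% of the data.
print(dead_time_fraction(150e3, 2.7e-6))  # ~0.405

# With no imposed restriction, only the ~100 ns trigger-processor
# resolving time remains: 150 kHz * 100 ns -> ~1.5%.
print(dead_time_fraction(150e3, 100e-9))  # ~0.015
```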

Conclusion on ETD
- Subsystems now have a stable electronics design, as described in the WP.
  - The corresponding budget looks nailed down.
  - However, new questions were raised concerning the effects of pile-up in the EMC.
- Work is going ahead in all areas of the overall system electronics.
  - There are teams working on each of the elements, except the Level 1 trigger (still the job of the conveners ...).
- We now enter the last phase of the TDR, so we have to define our roadmap:
  => is there any new R&D needed?
  => we need more precise information about radiation levels.
- The WP is a great basis for the TDR writing.
  - We have to define the length and content of the different sections, and maybe introduce some new ones.
- We have to know when real chip and board design has to start, in case some of these points turn out to be crucial for the final schedule (the time necessary for series production and testing might be long in some cases) => backward scheduling.

Online Progress
- Currently the main focus is (as it should be) on the longer-lead-time items in ETD.
  - Most components of Online/Online Computing are on "computing timescales".
- Many software components will (hopefully) be shared with Offline/general computing,
  except where Online interacts with ETD: the ROMs and the event builder.
- Looking at using components and building blocks from the LHC experiments.
  - It is probably a good idea to wait and see how they work with real beams.
- Writing the white paper really helped to focus thoughts on some of the Online issues.

Online System Components

Rates and Sizes
- Baseline assumptions: 150 kHz L1-accept rate, 75 kByte event size.
  - HLT (BaBar L3-equivalent) accepts 25 nb => 25 kHz logging rate at 1x10^36.
- ca. 12 GByte/s input rate.
  - Assume x2 "safety" (we can't run at 100%) => 24 GByte/s.
- ca. 2 GByte/s output/logging rate.
- Extrapolated from BaBar; currently the best estimate.
- The event size may increase (e.g. SVT Layer 0), so we need to design in size capability (and/or a safety factor).
  - An after-FEX event-size estimate is needed soon from the sub-detectors.
- The L1-accept rate may also increase (design for luminosity upgrades).
  - Not an issue for Online now, if designed to be scalable.
(The arithmetic behind these numbers is reproduced below.)
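A minimal sketch reproducing the rate arithmetic; all inputs are the slide's own baseline assumptions.

```python
# Reproducing the rate arithmetic on this slide (illustration only).
L1_RATE_HZ  = 150e3   # L1-accept rate
EVENT_KB    = 75      # event size, extrapolated from BaBar
HLT_XSEC_NB = 25      # HLT accept cross-section
LUMI        = 1e36    # instantaneous luminosity, cm^-2 s^-1

input_gbs = L1_RATE_HZ * EVENT_KB * 1e3 / 1e9  # ~11.3 -> "ca. 12 GByte/s"
sized_gbs = 2 * input_gbs                      # x2 safety -> ~24 GByte/s

# 25 nb = 25e-33 cm^2, so logging rate = sigma * L = 25 kHz
log_rate_hz = HLT_XSEC_NB * 1e-33 * LUMI       # 25e3 Hz
log_gbs = log_rate_hz * EVENT_KB * 1e3 / 1e9   # ~1.9 -> "ca. 2 GByte/s"

print(f"input {input_gbs:.1f} GB/s, with safety {sized_gbs:.1f} GB/s, "
      f"logging {log_gbs:.1f} GB/s at {log_rate_hz/1e3:.0f} kHz")
```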

HLT Farm & Logging Standard off-the shelf rack-mount servers Network event builder receivers Receive event fragments from ROMs, build complete events HLT (L3) 10ms/event (baseline assumption, almost 10x BaBar)  1500 cores needed (~150 servers) Data logging & buffering Local disk (few TB/node)? Storage servers over back-end network? Probably 2 day’s worth of local storage (2TByte/node)? Depends on SLD/SLA for data archive facility. No file aggregation into “runs”  bookkeeping Back-end network to archive facility

Ferrara SuperB Computing Workshop
- Goal: determine the R&D needed for the Computing TDR.
- Topics relevant for Online:
  - code sharing with Offline;
  - coding standards, quality assurance, tools;
  - error handling, "fault tolerance";
  - structuring releases & software packages;
  - performance (worst-case / average);
  - framework(s): Offline vs. Online, re-use of existing Online frameworks;
  - supporting multiple platforms: costs & benefits.

Online Next Steps
- Define a roadmap and timetable for Online.
- ROM R&D (what's the best way to build a ROM?):
  - specialized hardware vs. commodity computer + interface boards;
  - FEX / data reduction in the ROMs (a toy sketch follows below).
- Work with the sub-detectors to:
  - identify the processing requirements for FEX/data reduction;
  - determine the output data size (needed for network design and initial farm scaling); there is some flexibility there, but it would be good to settle the event size for all downstream system design and sizing;
  - map the processing requirements onto processing units: CPU (preferred), FPGA, GPU???
- Online software & infrastructure:
  - look at what others are doing; pros and cons of certain approaches;
  - investigate the potential use of existing tools (such as CMS xDAQ);
  - more research on the Online/Offline code-sharing requirements: code, build infrastructure, frameworks, databases, etc.
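For a flavour of what "FEX / data reduction in the ROMs" means in practice, here is a toy zero-suppression pass. The threshold and data layout are invented for the example; a real FEX would be subdetector-specific and, per the list above, might run on FPGAs or GPUs rather than CPUs.

```python
# Toy feature extraction: keep only samples above threshold, together with
# their channel/sample indices, shrinking a raw fragment before event
# building. Threshold and layout are illustrative assumptions only.
from typing import List, Tuple

THRESHOLD = 12  # ADC counts, arbitrary for the example

def zero_suppress(raw: List[List[int]]) -> List[Tuple[int, int, int]]:
    """raw[channel][sample] -> list of (channel, sample, adc) hits."""
    return [(ch, i, adc)
            for ch, wave in enumerate(raw)
            for i, adc in enumerate(wave)
            if adc > THRESHOLD]

# A mostly-empty fragment reduces by orders of magnitude:
raw = [[0] * 256 for _ in range(16)]
raw[3][100] = 57  # a single hit
print(zero_suppress(raw))  # [(3, 100, 57)]
```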