Slide 1: Electronics Systems Discussion
Presented by Gunther Haller, Research Engineering Group, Particle Physics and Astrophysics Division, SLAC, Stanford University
With Ryan Herbst, Dieter Freytag, Martin Breidenbach
SiD Meeting, January 30, 2008

Slide 2: Overview
- Electronics Architecture
- Cold-Machine Readout
- Warm-Machine Readout, focusing here on KPiX
- DAQ Hardware

Slide 3: Electronics Architecture
- The total data rate from each front-end is relatively small, so data from several front-ends can be combined to reduce the number of connections to the outside of the detector
- Front-end ASICs/electronics transmit event data to concentrator-1 boards over a digital interface (optical or electrical, e.g. LVDS)
- Concentrator-1 boards sit close to the front-end and combine the data streams from several front-end ASICs; zero-suppression happens either at the front-end or on the concentrator-1 boards, and no additional processing is needed at this stage
- Event data from the concentrator-1 boards are combined in concentrator-2 boards, which multiplex them onto fewer fibers
- Event data are transmitted over fiber to digital ATCA DAQ boards on top of or next to the detector (see later) to process and switch data packets, then on to an online farm for filtering (if necessary)
[Block diagram: front-end electronics (e.g. ASIC) inside the detector, close to the sensor → concentrator board level 1 (still inside the detector, fan-in of several front-end data sources) → concentrator board level 2 (depending on sub-system, still inside the detector, fan-in of several concentrator-1 boards) → fiber → ATCA DAQ boards on top of or next to the detector → online farm/storage]
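To make the fan-in arithmetic concrete, here is a minimal Python sketch of how two concentrator levels trade link count against per-fiber rate. The function and all parameter values (96 front-ends, 12:1 and 8:1 fan-in, 1 Mb/s per front-end) are hypothetical placeholders, not SiD design numbers.

```python
# Back-of-envelope model of the two-level concentrator fan-in described
# above. All parameters are illustrative, not SiD design values.

def aggregate_rate_mbps(n_frontends, rate_per_frontend_mbps, fan_in_l1, fan_in_l2):
    """Return (L1 boards, L2 boards, rate per L2 output fiber in Mb/s)."""
    n_l1 = n_frontends // fan_in_l1        # concentrator-1 boards needed
    n_l2 = max(1, n_l1 // fan_in_l2)       # concentrator-2 boards needed
    total = n_frontends * rate_per_frontend_mbps
    return n_l1, n_l2, total / n_l2        # rate each L2 fiber must carry

# Example: 96 front-end ASICs at 1 Mb/s each, 12:1 fan-in at L1, 8:1 at L2.
print(aggregate_rate_mbps(96, 1.0, 12, 8))  # -> (8, 1, 96.0)
```

The point of the sketch: the 96 outward links collapse to a single fiber, at the cost of that fiber carrying the full 96 Mb/s aggregate.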

Slide 4: EM Barrel Example (Cold)
- Example: Barrel EMCal, 54,000 KPiX; mean number of hits per train ~4e7
- In-detector chain: FE module of ~12 KPiX → concentrator 1 (zero-suppress & sort) → concentrator 2 (buffer/sort) → 3 Gb/s fiber
- KPiX readout: 1,000 channels per KPiX read out serially at Mb/s-scale rates take ~6 ms per KPiX; 12 KPiX read serially give a total of ~70 ms
- Readout to outside-detector crates via 3 Gbit/s fibers; a single 6-slot crate on top of or at the side of the detector receives the 36 fibers with 5 RCE modules + 1 Cluster Interconnect Module (CIM) and performs the sub-system event build (sort)
- A 4-slot crate (3 RCE modules + 1 CIM) of ATCA processors on top of or next to the detector performs the detector-wide event build, sorted by train # and bunch #
- Rates: 40 khits/train/fiber → ~3 MB/s per fiber; ~3 Mhits/s (~45 MB/s) aggregate; total out of the EM Barrel partition: 1.6 GB/s against an available bandwidth of > 80 Gbit/s (and scalable)
- Sorting and data reduction can be switched into the ATCA processors for data filtering/reduction or into the online farm; a few 10-G Ethernet fibers carry the data off-detector (stationary) to filtering, online analysis & storage
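The rate figures above can be reproduced with a simple model, sketched below under two assumptions of mine that are not stated on the slide: an ILC-like 5 trains per second and ~15 bytes per formatted hit (chosen because together they reproduce both the 3 MB/s per-fiber and 45 MB/s aggregate figures exactly).

```python
# Reproduce the per-fiber and aggregate rates quoted on this slide.
# ASSUMPTIONS (mine, not from the slide): 5 trains/s, 15 bytes/hit.
BYTES_PER_HIT = 15
TRAINS_PER_SEC = 5

def fiber_rate_mb_s(hits_per_train):
    """Data rate in MB/s for a given hit count per train on one fiber."""
    return hits_per_train * TRAINS_PER_SEC * BYTES_PER_HIT / 1e6

print(fiber_rate_mb_s(40_000))            # -> 3.0 MB/s per fiber, as quoted
print(3e6 * BYTES_PER_HIT / 1e6)          # 3 Mhits/s -> 45.0 MB/s aggregate
```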

Slide 5: Warm-Machine Assumptions
- Train length: 100 ns to 1000 ns
- Train repetition rate: 100 Hz to 1 kHz
- Hits: depends on the sub-system; here we look at the calorimeter and assume for now 1 hit per train
- 50 ns granularity on when a hit occurred within the train
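A quick consequence of these assumptions: with 50 ns granularity, the in-train time stamp only needs to distinguish a handful of bins. The numbers below come straight from the slide.

```python
# Number of 50 ns time bins needed to tag when a hit occurred in a train.
BIN_NS = 50
for train_ns in (100, 1000):
    print(train_ns, "ns train ->", train_ns // BIN_NS, "time bins")  # 2 and 20
```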

Slide 6: EM Barrel Example (Warm)
- 1,000 channels for each KPiX
- Zero-suppression on the KPiX is still not needed
- Settle + digitize: 400 µs
- Readout: 1024 channels × 2 words × 14 bits × 50 ns/bit ≈ 1.4 ms per KPiX
- Requires some modification to optimize data in the readout structure
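Writing the readout product out explicitly as a one-line check (the factors, including the 50 ns per-bit period, are taken from the slide):

```python
# KPiX serial readout time from the slide's own factors.
channels, words, bits, ns_per_bit = 1024, 2, 14, 50
readout_ns = channels * words * bits * ns_per_bit
print(readout_ns / 1e6, "ms per KPiX")   # -> 1.4336 ms
```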

Slide 7: EM Barrel Example (Warm)
- Example: Barrel EMCal, 54,000 KPiX
- In-detector: KPiX readout via two levels of concentrator boards (FPGA-based, reconfigurable); FE module of ~12 KPiX → concentrator 1 (zero-suppress & sort) → concentrator 2 (buffer/sort) → 3 Gb/s fiber
- L1 concentrator: zero-suppress and sort; at 20 hits/train/KPiX, the output to the L2 concentrator is 20 hits × (2 words of 14 bits + 26-bit total system address) × 1 kHz × 96 KPiX ≈ 12 MB/s
- Out of the L2 concentrator: 12 MB/s × 16 = 192 MB/s
- Readout to on-top-of-detector crates via 3 Gbit/s fibers; only 1 fiber is needed, assume 2
- On top of the detector: 1 crate of ATCA RCEs performs the sub-system event build (sort), and a fraction of a crate (1 I/O module, 1 processor) performs the detector-wide event build
- Off-detector (stationary): filter, online analysis & storage
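The L1- and L2-concentrator output rates quoted above check out (to rounding) from the slide's own factors:

```python
# L1-concentrator output rate, using only quantities stated on the slide.
hits_per_train = 20
bits_per_hit = 2 * 14 + 26        # two 14-bit words + 26-bit system address
trains_per_sec = 1_000            # 1 kHz train rate (warm machine)
n_kpix = 96                       # KPiX per L1 concentrator group

rate_mb = hits_per_train * bits_per_hit / 8 * trains_per_sec * n_kpix / 1e6
print(rate_mb, "MB/s out of L1")        # -> 12.96, slide rounds to ~12 MB/s
print(rate_mb * 16, "MB/s out of L2")   # -> ~207; slide quotes 12 x 16 = 192
```

The small L2 discrepancy is only because the slide rounds to 12 MB/s before multiplying by 16.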

Slide 8: Concentrator-1
[Block diagram: 8 × 12-KPiX FE modules connect to the concentrator-1 FPGA, which performs zero-suppression, sorting, and buffering and provides control & timing signals; the board also carries memory, power conversion (power in), and fiber conversion onto a 3 Gbit/s full-duplex fiber to/from concentrator 2]
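As a conceptual stand-in for what the concentrator-1 FPGA firmware does, here is a software sketch of zero-suppression followed by a time sort. The hit tuple layout and threshold are invented for illustration and bear no relation to the real KPiX data format.

```python
# Conceptual software model of the concentrator-1 data path:
# drop below-threshold samples, then sort surviving hits by time stamp
# so downstream event building receives time-ordered data.

def zero_suppress_and_sort(samples, threshold):
    """samples: iterable of (channel, timestamp, adc) tuples (hypothetical
    layout). Keep hits above threshold, sorted by timestamp."""
    hits = [s for s in samples if s[2] > threshold]
    return sorted(hits, key=lambda h: h[1])

raw = [(0, 7, 2), (3, 1, 40), (2, 5, 33), (1, 3, 1)]
print(zero_suppress_and_sort(raw, threshold=10))  # -> [(3, 1, 40), (2, 5, 33)]
```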

Slide 9: DAQ Architecture
- One ATCA crate for each sub-system (e.g. EM Barrel, EM Endcap, ..., n crates in total) for partitioning reasons
- In-detector electronics connect over 3 Gbit/s PGP fiber links to the RCEs in the crates on top of or next to the detector; the CIM switches fan out over 10-G Ethernet to the online farm/storage
- Two custom modules: RCE (Reconfigurable Cluster Element) and CIM (Cluster Interconnect Module)

Slide 10: DAQ Sub-System
- Based on ATCA (Advanced Telecommunications Computing Architecture), the next generation of "carrier-grade" communication equipment
- Driven by the telecom industry
- Incorporates the latest trends in high-speed interconnect, next-generation processors, and improved Reliability, Availability, and Serviceability (RAS)
- Essentially, instead of parallel-bus backplanes it uses high-speed serial communication and advanced switch technology within and between modules, plus redundant power, etc.

Slide 11: ATCA Crate
- ATCA is used, e.g., for SLAC LUSI (LCLS Ultra-fast Science Instruments) detector readout for the Linac Coherent Light Source hard X-ray laser project
- Based on a 10-Gigabit Ethernet backplane serial communication fabric
- Two custom boards:
  - Reconfigurable Cluster Element (RCE) module: interface to the detector, up to 8 × 2.5 Gbit/s links to detector modules
  - Cluster Interconnect Module (CIM): managed 24-port 10-G Ethernet switching
- One ATCA crate can hold up to 14 RCEs & 2 CIMs, giving essentially 480 Gbit/s of switch capacity
- SiD needs only ~320 Gbit/s including a factor-of-4 margin, and would use more than one crate (partitioning)
[Photo: ATCA crate with RCE and CIM boards]
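The switch-capacity numbers follow directly: two 24-port 10-G switch ASICs per crate give 480 Gbit/s, and reading the slide's ~320 Gbit/s SiD requirement as a base need of ~80 Gbit/s times the quoted factor-of-4 margin (my interpretation) keeps it comfortably inside capacity:

```python
# Crate switch capacity vs. the SiD requirement quoted on the slide.
ports, gbit_per_port, n_switch_asics = 24, 10, 2
capacity = ports * gbit_per_port * n_switch_asics
print(capacity, "Gbit/s switch capacity")       # -> 480, as quoted

need = 80 * 4   # ASSUMPTION: ~80 Gbit/s base x factor-of-4 margin
print(need, "Gbit/s needed by SiD")             # -> 320 < 480
```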

Slide 12: Reconfigurable Cluster Element (RCE) Boards
- Addresses performance issues with off-the-shelf hardware: processing/switching is limited by the CPU-memory sub-system rather than the CPU's MIPS; also scalability, cost, and networking architecture
- Reconfigurable Cluster Element module with 2 each of the following:
  - Virtex-4 FPGA with 2 PowerPC processors and IP cores
  - 512 MB RLDRAM with an 8 GB/s CPU-data memory interface
  - 10-G Ethernet event-data interface and 1-G Ethernet control interface
  - RTEMS operating system, EPICS
  - Up to 512 GB of FLASH memory
[Photos: Reconfigurable Cluster Element module and rear transition module]

Slide 13: Cluster Interconnect Module
- Network card with 2 × 24-port 10-G Ethernet Fulcrum switch ASICs, managed via a Virtex-4 FPGA
- Interconnects up to 14 in-crate RCE boards
- Interconnects multiple crates or farm machines

Slide 14: Summary
- The KPiX system can also be used for a warm machine, with some modest modifications within KPiX
- The event data rate for SiD can be handled by current technology, e.g. the ATCA system being built for LCLS
- The SiD data rate is dominated by noise & background hits
- Standard ATCA crate technology can be used, e.g. with the existing SLAC custom cluster-element and switch/network modules
- No filtering is required in the DAQ; event data can be moved to the online farm/offline for further filtering/analysis
- Still to investigate: filtering in the ATCA processors