Department of Particle Physics & Astrophysics Representing: Rainer Bartoldus, Ric Claus, Gunther Haller, Ryan Herbst, Martin Kocian, Chris O'Grady, Jim Panetta, Amedeo Perazzo, Steve Tether, Gregg Thayer, Su Dong, Matt Weaver, Matthias Wittgen. Status update on the Detector R & D program for large-scale DAQ systems. CERN/ACES Workshop, Michael Huffer, SLAC National Accelerator Laboratory, March 9-11, 2011 (V1)

Department of Particle Physics & Astrophysics 2 R & D program's goals and phases DAQ/trigger "technology" for next generation HEP experiments… –One component of the ongoing SLAC Detector R & D project "Phase 0" –"Survey the requirements and capture their commonality" –Intended to leverage recent industry innovation –Project's cultural bias: "one size does not fit all". Leads to… –The concept of ubiquitous building blocks The (Reconfigurable) Cluster Element (RCE) The Cluster Interconnect (CI) Industry-standard packaging (ATCA) "Phase 1" –Technology evaluation & demonstration hardware The RCE & CI boards "Phase 2" –Useful, sustainable architecture (how to future-proof) Generic ATCA Front-Board (the COB) & the RCE "GEN-II" "Phase 3" –Meet all RCE-related performance goals Move to ASP (ARM-based silicon) This is where the program is currently executing

Department of Particle Physics & Astrophysics 3 GEN-I RCE board + RTM

Department of Particle Physics & Astrophysics 4 Outline Brief recap of both ongoing and past ATLAS related activities (Re)Introduce the RCE/CE + CI/Cluster Status of "Phase-2" activities (the COB) ATLAS upgrade related activities –IBL Readout (R/O) –Full pixel R/O system replacement –CSC ROD replacement –Forward Muon readout (small wheel replacement) –Case studies on alternative architectures: integration of ROD + ROS functionality, Level 1.5 triggers, ROIB Summary Will focus only on the IBL as one representative example of these activities

Department of Particle Physics & Astrophysics 5 Past & current ATLAS related activities Presentations: –Mike Huffer at ACES (Mar/09 & sessions of last ATUW): –Rainer Bartoldus at ROD workshop (June/09): RCE training workshop at CERN (June/09): Test-stands –Shared RCE test stand at CERN (August/09) –Stony Brook and LBNL (October/2010) –Pixel (February/2010) Expressions of Interest –IBL (September/2009) –R/O systems for Full Pixel Replacement (December/2010) –R/O systems for the small wheel upgrade (December/2010) –DOE Collider Detector R & D LOI (February/2011) Pixel calibration system –3D sensor test at CERN test-beam (June 2010) –Calibration of FE-I4 (December/2010)

Department of Particle Physics & Astrophysics 6 RCE Development Lab at CERN

Department of Particle Physics & Astrophysics 7 Application of RCE to ATLAS Pixel [Block diagram of a typical test stand, also used for cosmic running: an existing Pixel module or IBL FE-I4 module connects over 3 Gb/s links to the HSIO board, which feeds the RCE/CIM over 10-GE Ethernet.] Other applications: Successful integration with CERN test beam. Preparation of IBL stave-0 (16 x FE-I4) readout/calibration tests for Jun/Jul 2011.

Department of Particle Physics & Astrophysics 8 Application of RCE to ATLAS Pixel Example of FE-I4 module tests using cosmic-telescope data collected with the RCE. [Plots: hit time, TOT, and hit-track residual (μm) after crude alignment.]

Department of Particle Physics & Astrophysics Application of RCE to ATLAS Pixel 9 Standard ATLAS pixel calibration console interface as used at Point-1. Running ATLAS TDAQ inter-process communication software on the RCE under RTEMS. Full suite of pixel calibration DSP code adapted to run on RCEs with < 1 FTE-year of effort. The RCE concept with integrated software support has demonstrated the flexibility needed for compatibility and fast progress. Threshold calibration for an existing FE-I3 module with known defects.

Department of Particle Physics & Astrophysics 10 The RCE Based on SoC technology… –Xilinx Virtex-5 & ASP Has three principal components: –Programmable Fabric Paired with configuration flash Both soft & hard silicon (resources) –Programmable Cluster-Element (CE) Paired with configuration flash RAM (DDR2/DDR3), up to 4 Gbytes –Plugins "Glue" fabric resources to the CE Themselves built from fabric resources Comes bundled with two prebuilt: –Network (1-40 Gb/sec Ethernet) –Interface to CE flash The CE has eight (8) sockets –2 prewired to the bundled plugins –6 application-defined The two most "interesting" hardened resources: –DSP tiles (> 200 TeraMACS/sec) –SerDes (+ support) (up to 12.5 Gbits/s)

Department of Particle Physics & Astrophysics 11 The CE (hardware) –Processor (multiple cores) Up to 5000 DMIPS (somewhere between a Core Duo and an i7) Code + data stored on configuration media –Cross-bar Provides the "glue" between processor, memory & plugins > 8 Gbytes/sec of switching bandwidth –P (Peripheral)-Bus In concert with the BSI and bootstrap, provides "plug and play" support for plugins –C (Command)-Bus Allows processor interaction with a plugin concurrent with its memory transfers Extends the processor's instruction set Provision for the user to plug in their own application-specific logic –Boot-Strap-Interface (BSI) I2C slave interface Allows external control & parameterization of the boot process –House-Keeping-Interface (HKI) I2C master interface Allows external configuration, control & monitoring of "slow" devices

Department of Particle Physics & Astrophysics 12 The CE (software) (6-layer stack) –Primary Bootstrap Operating System (O/S) agnostic Driven by the BSI –Operating System (O/S) + BSP Bundled with an Open Source R/T kernel (RTEMS) POSIX-compliant interfaces –Core CE initialization Plugin support Utilities –Plugins Driven by the set of physical plugins present –Well-Known-Services (WKS) Standard BSD Network (IP) stack Telnet, GDB stub, NFS, etc… Set is customizable –Standard GNU cross-development environment Includes remote (GDB) debugger All software is object-oriented –Both C & C++ supported

Department of Particle Physics & Astrophysics 13 The CI & the related concept of a Cluster The CI consists of… –A 96-channel 10G-Ethernet switch: Partitioned into 24 ports of 4 channels each Each port can be configured as: –1-4 10G (KR) –1 40G (KR4) –1 4 x 3.125G (XAUI) –1 SGMII (10, 100, 1000 Base-T) Cut-through, industry's lowest hop latency (ns scale) Full Layer-2 and Layer-3 switching –One (1) RCE Manages the switch (supports all management protocols) Has a fixed connection to one (1) port of the switch The Cluster consists of… –One (1) CI –One or more (up to 96) RCEs
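The port-mode menu on this slide can be turned into a small capacity model. This is an illustrative sketch, not SLAC code: the mode names and per-lane rates come from the slide, while the helper function, its name, and the dictionary layout are hypothetical.

```python
# Hypothetical model of the CI switch channel budget described above:
# 96 channels grouped into 24 ports of 4 channels, each port in one of
# four configurations. Rates are taken from the slide text.
CHANNELS_PER_PORT = 4
NUM_PORTS = 24

# Per-port aggregate bandwidth in Gb/s for each configuration.
PORT_MODES = {
    "KR":    4 * 10.0,   # 1-4 independent 10G (KR) lanes; all 4 used here
    "KR4":   40.0,       # one bonded 40G (KR4) link
    "XAUI":  4 * 3.125,  # 4 x 3.125G XAUI
    "SGMII": 1.0,        # 10/100/1000Base-T, best case 1 Gb/s
}

def switch_bandwidth(mode_counts):
    """Total aggregate bandwidth (Gb/s) for a CI port configuration."""
    assert sum(mode_counts.values()) <= NUM_PORTS, "CI has only 24 ports"
    return sum(PORT_MODES[mode] * n for mode, n in mode_counts.items())

# All 24 ports as 4x10G exposes the full 960 Gb/s of raw channel capacity;
# the same total holds for 24 bonded KR4 ports.
full = switch_bandwidth({"KR": NUM_PORTS})
```

Note that KR (4x10G) and KR4 (1x40G) modes carry the same aggregate bandwidth per port; the choice only changes how the four channels are presented to the far end.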

Department of Particle Physics & Astrophysics 14 “Phase 2”, Cluster-On-Board (COB) “Mezzaninize” RCE & CI concepts –Decouples RCE & CI from ATCA (or any packaging standard) New ATCA Front-Board which supports: –The mezzanine concept Decouples Front-Board from RCE & CI –The cluster concept –IPMI (shelf configuration & monitoring) –Full mesh backplanes Applications require only a single type of board Interoperability with any type of backplane –The ATCA for physics standard (PICMG 3.8) –Complete interoperability with any type of ATCA board –10 Gbits/s signaling (both backplanes & Ethernet switching) –Generic, synchronous Timing & Trigger signal distribution

Department of Particle Physics & Astrophysics 15 COB block diagram Partitioned into five zones (each with an associated mezzanine card) –Four (4) Data Processing Modules (DPM) Process data from the RTM Intended to contain as many as eight (8) RCEs –One (1) Data Transport Module (DTM) Distributes 10-GE Ethernet fabric and timing signals to the DPMs Contains the CI Unlike the DPM, its mezzanine extends to the Front-Panel –IPM controller to monitor and configure the five zones + RTM
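A minimal sketch of the board capacity implied by this partitioning. The two-RCEs-per-DPM split is an inference from "four DPMs" holding "as many as eight RCEs", and the 48 serial channels per COB come from the summary slide; the constants and function names are illustrative, not a spec.

```python
# Back-of-envelope COB capacity model (assumed figures, see lead-in):
# four DPM mezzanine sites, up to two RCEs per DPM, 48 generic serial
# I/O channels per board.
DPMS_PER_COB = 4
RCES_PER_DPM = 2        # inferred: 8 RCEs spread over 4 DPMs
FIBERS_PER_COB = 48     # generic serial I/O channels per COB

def shelf_capacity(num_cobs):
    """RCE and fiber-input counts for a shelf holding `num_cobs` COBs."""
    return {
        "rces":   num_cobs * DPMS_PER_COB * RCES_PER_DPM,
        "fibers": num_cobs * FIBERS_PER_COB,
    }

# A fully populated 14-slot ATCA shelf (the topology slide's example)
# would then carry 112 RCEs and 672 fiber inputs.
full_shelf = shelf_capacity(14)
```

Since the DTM's CI switch supports up to 96 RCEs per Cluster, a single COB's eight RCEs sit comfortably inside one Cluster, with the shelf backplane federating the per-board Clusters.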

Department of Particle Physics & Astrophysics 16 COB (board layout and Front-Panel) [Board layout showing the P1 and P2 connectors, Zone 3 (PICMG 3.8) RTM interface, the switch, the IPMC, the DPM 0-3 and DTM mezzanine sites, the TTC and "busy" interfaces, and DC-DC conversion/filtering.]

Department of Particle Physics & Astrophysics 17 Ethernet topology in a 14-slot shelf…

Department of Particle Physics & Astrophysics 18 IBL R/O reusing current RODs –4 crates –8 fiber inputs/ROD (56 RODs) –Calibration information must flow through the VME bus & RCC –No explicit provision for FTK

Department of Particle Physics & Astrophysics 19 IBL R/O proposal based on the COB… –Interfaces remain identical (TTC & ROS) –1 crate (shelf), 12 COBs –48 fiber inputs/COB (10 COBs) –Calibration data has an independent, 160 gigabit/sec (Ethernet) path –Adding one more COB + SLINK/RTM easily accommodates FTK access
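The consolidation claim (4 crates down to 1 shelf) can be checked against the channel counts on this slide and the previous one. A worked-arithmetic sketch, assuming the "(10 COBs)" figure refers to the boards needed to absorb the input load served today by 56 RODs:

```python
# Input-channel comparison between the legacy ROD system and the
# proposed COB shelf, using the counts quoted on the two slides.
LEGACY_RODS, FIBERS_PER_ROD = 56, 8
COBS, FIBERS_PER_COB = 10, 48

legacy_fibers = LEGACY_RODS * FIBERS_PER_ROD   # 448 fiber inputs today
cob_fibers = COBS * FIBERS_PER_COB             # 480 fiber inputs proposed

# The 10-COB shelf covers every legacy input with channels to spare,
# which is what makes the 4-crates-to-1-shelf consolidation possible.
spare = cob_fibers - legacy_fibers
```

The spare channels, plus the two unused slots in a 12-COB shelf, are the headroom the slide exploits when it says one extra COB + SLINK/RTM accommodates FTK access.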

Department of Particle Physics & Astrophysics [SLINK RTM photo: SFP+ module (1 of 8) and Avago 12-channel receivers & transmitters (2 of 8).]

Department of Particle Physics & Astrophysics 21 Summary Phase I of the program is complete –Working prototypes (the CI and CIM boards) deployed in real experiments Phase II is well underway with the COB board & RCE "GEN-II" –All ATLAS upgrade activities are based on this board. Its mezzanine boards… Will have 8 RCEs and support 48 channels of generic serial I/O Support the current Trigger Timing & Control (TTC) system & "busy" GEN-I has demonstrated that the RCE is a viable software platform –Successful port of the pixel calibration software –Operation as a cosmic telescope using real FEE hardware (the FE-I4) The proposed new R/O system for the IBL is based on the COB –Will integrate (plug) into the present TDAQ structure –Much smaller footprint (4 crates -> 1 shelf) –Has two purpose-built RTMs (SLINK & R/O) –Capable of transferring the entire calibration data volume at L1 rates –Designed to easily accommodate the FTK