Communication Models for Run Control with ATCA-based Systems

Presentation transcript:

Communication Models for Run Control with ATCA-based Systems
Outline:
- Introduction
- Architecture models and survey
- Outlook
I apologize for the bias towards my work in the ATLAS Level-1 Central Trigger (L1CT) on the Phase-1 Upgrade of the Muon-to-Central-Trigger-Processor Interface (MUCTPI).
xTCA Interest Group - 15-JUN-2017, R. Spiwoks

Definition: Run Control
Run Control = Trigger/DAQ control communication:
- Send control commands, e.g. start, stop, pause, run calibration, etc.
- Load configuration data, e.g. look-up-table files, algorithm parameters, etc.
- Collect monitoring data, e.g. counters, selected event data, etc.
It is NOT:
- slow control: voltages, currents, temperatures, etc. (→ IPMI)
- event data, except for monitoring of selected event data (→ readout links)
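As a purely illustrative sketch (not the ATLAS TDAQ API; all names and types below are invented), the three kinds of run-control traffic listed above could be captured in an interface like this:

```cpp
// Illustrative sketch only: a hypothetical run-control interface capturing the
// three kinds of traffic described above (commands, configuration, monitoring).
// Names are invented for this example and are not the ATLAS TDAQ API.
#include <cstdint>
#include <map>
#include <string>

enum class RunCommand { Start, Stop, Pause, Resume, RunCalibration };

class RunControllable {
public:
    virtual ~RunControllable() = default;

    // Control commands, e.g. start, stop, pause, run calibration
    virtual void execute(RunCommand cmd) = 0;

    // Configuration data, e.g. look-up-table files, algorithm parameters
    virtual void loadConfiguration(const std::string& lutFile,
                                   const std::map<std::string, uint32_t>& parameters) = 0;

    // Monitoring data, e.g. counters
    virtual std::map<std::string, uint64_t> readCounters() = 0;
};
```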

Legacy model for VME
[Diagram: ATLAS Run Control ↔ (IP/Ethernet) ↔ VME SBC ↔ (VME bus) ↔ MIOCTnn/MICTP/MIROD modules, each with a VME I/F FPGA and an internal bus to FPGA1, FPGA2, …]
- Hardware modules are based on VME.
- A single-board computer (SBC) communicates on one side with the Run Control via IP/Ethernet.
- On the other side, the SBC communicates with the hardware modules via VME: read/write cycles, single or block (access pattern sketched below).
- Hardware modules have a dedicated FPGA with VME I/F firmware and an internal bus to the other FPGAs with individual strobe lines.
- Almost all ATLAS sub-detectors use this model.
- ATLAS/TDAQ provides support for:
  - purchasing of SBCs
  - a common VME driver and library
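A minimal sketch of that access pattern follows. The wrapper class, method names and addresses are invented for illustration and the register map is mocked locally; the actual ATLAS/TDAQ common VME driver and library have a different interface.

```cpp
// Sketch only: invented names, not the ATLAS/TDAQ common VME library.
// Illustrates the access pattern (single and block read/write cycles) that a
// run controller on the SBC performs towards a module's VME I/F FPGA.
#include <cstdint>
#include <map>
#include <vector>

class VmeMaster {
public:
    explicit VmeMaster(uint32_t baseAddress) : base_(baseAddress) {}

    // Single-word cycle: one VME access per register
    uint32_t read(uint32_t offset) const {
        uint32_t addr = base_ + offset;
        return regs_.count(addr) ? regs_.at(addr) : 0;
    }
    void write(uint32_t offset, uint32_t value) { regs_[base_ + offset] = value; }

    // Block transfer (BLT) cycle: many words in one access, e.g. a look-up table
    void writeBlock(uint32_t offset, const std::vector<uint32_t>& data) {
        for (uint32_t i = 0; i < data.size(); ++i) regs_[base_ + offset + 4 * i] = data[i];
    }

private:
    uint32_t base_;                      // VME base address of the module
    std::map<uint32_t, uint32_t> regs_;  // stands in for the real VME cycles
};

int main() {
    VmeMaster mioct(0x08000000);         // base address is made up
    mioct.write(0x0010, 0x1);            // e.g. an enable bit in a control register
    std::vector<uint32_t> lut(4096, 0);
    mioct.writeBlock(0x8000, lut);       // e.g. load a look-up table
    return mioct.read(0x0010) == 0x1 ? 0 : 1;
}
```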

Model 1 for ATCA: IPbus
[Diagram: ATLAS Run Control ↔ PC (uHAL) ↔ UDP/IP/Ethernet ↔ MUCTPI ATCA blade: IPbus FPGA ↔ internal links ↔ FPGA1, FPGA2]
- Provided by CMS: based on firmware and software.
- An FPGA receives UDP packets and performs read/write transactions with the other (processing) FPGAs.
- ATLAS/TDAQ provides the software library (uHAL) as part of its releases (usage sketched below).
- In the ATLAS L1CT, we have tested it and it does work:
  ⇒ we consider it a fall-back solution for the L1CT/MUCTPI.
- But: UDP is not a reliable protocol, packets can get lost, and for multiple clients the ControlHub software, written in Erlang, is needed …
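For illustration, a minimal uHAL read/write sketch. The connection file, device id and node names ("MUCTPI", "CSR.CONTROL", "CSR.STATUS") are assumptions; in a real setup they come from the board's own connection file and address table.

```cpp
// Minimal uHAL (IPbus client library) usage sketch.
#include "uhal/uhal.hpp"
#include <iostream>

int main() {
    uhal::ConnectionManager manager("file://connections.xml");  // assumed file name
    uhal::HwInterface hw = manager.getDevice("MUCTPI");         // assumed device id

    // Queue a write and a read; nothing goes over UDP yet
    hw.getNode("CSR.CONTROL").write(0x1);
    uhal::ValWord<uint32_t> status = hw.getNode("CSR.STATUS").read();

    // dispatch() sends the queued IPbus transactions as UDP packets
    hw.dispatch();

    std::cout << "status = 0x" << std::hex << status.value() << std::endl;
    return 0;
}
```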

Model 2 for ATCA: RemoteBus (L1CT)
[Diagram: ATLAS Run Control ↔ PC ↔ TCP/IP/Ethernet ↔ MUCTPI ATCA blade: SoC ↔ internal links ↔ FPGA1, FPGA2]
- The MUCTPI uses a "System on a Chip" (SoC), i.e. an FPGA with an embedded processor running embedded Linux (Xilinx Zynq).
- Client-server, request-response approach:
  - client = TDAQ controller on the PC, sends requests
  - server = process on the Zynq, receives requests and sends responses
- Uses TCP: a reliable protocol, i.e. no data loss.
- Synchronous approach, as before with VME, but allowing multiple clients and a multi-threaded server.
- Several modes of working:
  - single and block read/write functions (as before with VME)
  - remote functions for more complex hardware access (like a Remote Procedure Call, RPC), e.g. I2C, SPI, JTAG, etc.
  - queuing of several requests: bundle several requests before sending them together ⇒ mitigates the latency overhead (sketched below)
- Functionality is extended by using C++ inheritance to add more complex functions.
- The Yocto/OpenEmbedded framework is used for building the Linux operating system and the RemoteBus software.
⇒ We currently use it to test a prototype of the new L1CT/MUCTPI.
- This is one example of a remote procedure call developed by the L1CT; other implementations exist in ATLAS.
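A compact sketch of the request-queuing idea, with invented names (not the actual L1CT RemoteBus code); the server side is mocked by a local register map so the example is self-contained, where the real client would send the bundled requests in one TCP message.

```cpp
// Sketch only: invented names, not the actual L1CT RemoteBus code. Illustrates
// bundling several read/write requests so they travel in one TCP round trip.
#include <cstdint>
#include <map>
#include <vector>

class RemoteBusClient {
public:
    // In the real system the constructor would open a TCP connection to the SoC server.
    void queueWrite(uint32_t address, uint32_t value) {
        requests_.push_back({true, address, value});
    }
    void queueRead(uint32_t address) {
        requests_.push_back({false, address, 0});
    }
    // Send all queued requests together and collect the read responses.
    std::vector<uint32_t> dispatch() {
        std::vector<uint32_t> replies;
        for (const auto& r : requests_) {                 // here: executed locally;
            if (r.isWrite) regs_[r.address] = r.value;    // really: one TCP message
            else replies.push_back(regs_[r.address]);     // to the server on the Zynq
        }
        requests_.clear();
        return replies;
    }

private:
    struct Request { bool isWrite; uint32_t address; uint32_t value; };
    std::vector<Request> requests_;
    std::map<uint32_t, uint32_t> regs_;  // stands in for the server/FPGA registers
};

int main() {
    RemoteBusClient muctpi;
    muctpi.queueWrite(0x0100, 0xCAFE);   // bundle several requests ...
    muctpi.queueRead(0x0100);
    return muctpi.dispatch().front() == 0xCAFE ? 0 : 1;  // ... one round trip
}
```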

Model 3 for ATCA: TDAQ/Embedded Linux
[Diagram: ATLAS Run Control ↔ MUCTPI ATCA blade: SoC (or CoM) ↔ internal links ↔ FPGA1, FPGA2]
- The TDAQ controller runs directly on the SoC →
  - How difficult is it to port the ATLAS TDAQ software?
  - How much effort to maintain it?
  - How much effort to fulfil CERN/IT's security requirements?
- In the L1CT, we have started to evaluate porting ATLAS TDAQ to embedded Linux using the Yocto/OpenEmbedded framework → technical student project, started 03/17, so far going quite well …
- Alternatively, a CoM ("Computer on Module" ⇒ "PC on ATCA blade") could be used.

Survey: Run Control with ATCA in ATLAS

ATLAS Project | Hardware (SoC, FPGA) | Software/Firmware
gFEX (v2) | Xilinx Zynq 7045 | Linux (Yocto/OpenEmbedded) + IPbus (software emulation)
gFEX (v3) | Xilinx Zynq Ultrascale+ MPSoC ZU19EG | Linux (Yocto/OpenEmbedded) + TDAQ – planned
TOPO | Xilinx Kintex7 325 | IPbus (firmware)
jFEX | Xilinx Zynq 7030 | IPbus (firmware on Zynq/PL); Linux? + software?
eFEX | Xilinx Virtex7 550, 690 | GBT (via Hub module); control traffic with deterministic latency – possibility
MUCTPI | Xilinx Zynq | IPbus (firmware on Zynq/PL) – fallback; Linux (Yocto/OpenEmbedded) + RemoteBus (L1CT); Linux (Yocto/OpenEmbedded) + TDAQ – project

Survey: Run Control with ATCA in ATLAS (cont'd)

ATLAS Project | Hardware | Software/Firmware
CSC ROD | Xilinx Zynq | Linux (RTEMS and ArchLinux) + RPC + JSON (four daughter boards: RTEMS, one: ArchLinux)
Pixel-Chip test stand (gen1) | Xilinx Virtex 4 (PPC405) | Linux (RTEMS) + TDAQ4 (private port to PPC)
Pixel-Chip test stand (gen3) | Xilinx Zynq | Linux (ArchLinux) + TDAQ5 (private port) – discontinued; Linux (ArchLinux) + Remote Call Framework (RCF)
AFP prototype | (same as Pixel test stand gen3) | Linux (ArchLinux) + TDAQ5 (private port) – discontinued
NSW Trigger Processor | GBT-SCA (FE ASICs), E-Links (FPGAs) | None
TileCal PreProcessor (ROD) | Prototype: Xilinx Virtex7 + Kintex7 | IPbus
FTK Data Formatter (DF), FTK Level-2 Interface Card (FLIC) | Xilinx Virtex7 |
LAr LATOME | Altera Arria 10 FPGA |

Survey: Run Control with ATCA in ATLAS
My observations:
- Many ATLAS projects are using IPbus and people are happy to use it.
- Many ATLAS projects are using or plan to use a SoC, all of which are based on the Xilinx Zynq (ARMv7 processors, 32-bit).
  Note: the next generation (Xilinx Zynq Ultrascale+) is based on ARMv8 processors (64-bit).
- A few different implementations of RPC-like applications on embedded Linux exist and people are happy to use them.

Outlook
Several RPC-like solutions exist:
- Could the RPC-like applications be unified?
- Could TDAQ provide an RPC stub for the TDAQ run controllers?
I don't know the answer, but a better way to unite our efforts could be the following: porting TDAQ to an embedded Linux has definite advantages:
- No need for an intermediate layer like IPbus (software & firmware) or RPC-like applications (software).
- It looks like the legacy model of SBC and VME: TDAQ controllers can be written in a similar way.
- Common low-level functionality for inter-FPGA communication, I2C, SPI, JTAG, etc. could be provided in a way similar to ATLAS ROD Crate DAQ (common drivers and libraries; one possible shape of such an interface is sketched below).
- Embedded Linux provides a full operating system which can run many user applications and allows direct interactive access (ssh).
→ In the ATLAS L1CT, we are currently investigating the possibility of porting TDAQ to the Zynq: technical student project, started 03/17.
→ If possible, could a port of the software be maintained by ATLAS TDAQ?
→ What support could possibly be provided by CERN/IT? Investigate the possibility of having CERN CentOS (CC) for ARMv8 processors?
What do other experiments do?
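To make the "common low-level functionality" point concrete, here is a hypothetical sketch of an abstract hardware-access interface that run controllers could be written against, with concrete back ends for memory-mapped access on the SoC, IPbus, or an RPC-like client; none of these names exist in TDAQ today.

```cpp
// Hypothetical sketch only: a common low-level hardware-access interface,
// similar in spirit to the ROD Crate DAQ common drivers and libraries.
#include <cstddef>
#include <cstdint>
#include <vector>

class HwAccess {
public:
    virtual ~HwAccess() = default;

    // Register access towards the (processing) FPGAs
    virtual uint32_t read(uint32_t address) = 0;
    virtual void     write(uint32_t address, uint32_t value) = 0;

    // Common low-level buses (I2C, SPI, ...) exposed in one place
    virtual std::vector<uint8_t> i2c(uint8_t bus, uint8_t device,
                                     const std::vector<uint8_t>& tx, size_t nRx) = 0;
    virtual std::vector<uint8_t> spi(uint8_t chipSelect,
                                     const std::vector<uint8_t>& tx) = 0;
};

// Possible back ends (not implemented here):
//   class MappedAccess : public HwAccess { /* memory-mapped registers on the SoC */ };
//   class IpbusAccess  : public HwAccess { /* via uHAL over UDP */ };
//   class RemoteAccess : public HwAccess { /* via an RPC-like client over TCP */ };
```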

Let's build intelligent run control directly into each ATCA blade!