The DataFlow of the ATLAS Trigger and Data Acquisition System
Giovanna Lehmann (CERN EP/ATD), on behalf of the ATLAS Trigger/DAQ DataFlow subsystem
2003 Conference for Computing in High Energy and Nuclear Physics (CHEP), La Jolla, California

Outline
- ATLAS: interaction rates and event sizes
- The Trigger/DAQ architecture
- The DataFlow
- ROS design & performance
- LVL2 dataflow design & performance
- Event Builder design & performance
- Conclusions & outlook

CERN Accelerator Complex
- Colliding particles: protons
- Centre-of-mass energy: 14 TeV
- Bunch crossing rate: 40 MHz
- Interaction rate: 10^9 Hz
- Event size: 1-2 MB

Trigger/DAQ Architecture
[Diagram: three-level trigger architecture and DataFlow components]
- LVL1 (latency 2.5 μs): calorimeter and muon trigger chambers, front-end pipelines; reduces the 40 MHz bunch-crossing rate to a 75 kHz accept rate. Detector data flow from the RODs into the ROS (ROBs + IOManager) over 1628 ROLs, at up to 120 GB/s in total.
- LVL2 (latency ~10 ms): RoI Builder (ROIB), L2 supervisors (L2SV) and O(100) L2 processors (L2P) on the LVL2 network (L2N); RoI requests fetch ~2% of the event data from the ROS (~3 GB/s); accept rate ~2 kHz.
- Event building: on LVL2 accepts the DFM steers EB requests and clears; O(100) SFIs read from O(100) ROSs over the EB network (EBN), ~3+3 GB/s (input plus output).
- Event Filter (latency ~1 s): O(1000) EF processors (EFP) on the EF network (EFN); accept rate ~0.2 kHz, ~300 MB/s to mass storage via the SFOs.

The ROS
- Receive & buffer event fragments from the 1628 detector ROLs (up to 160 MB/s per ROL)
- Send selected event fragments on request:
  - RoI requests: high rate, low data volume (rate: LVL1 rate, 75 kHz; volume: ~2% of ROLs)
  - EB requests: low rate, high data volume (rate: ~3% of LVL1 rate, ~2 kHz; volume: complete event data)
- Provide fragment sampling for data monitoring
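To make the two request profiles concrete, here is a minimal C++ sketch of the dispatch logic described on this slide. It is not the actual ATLAS IOManager code; Fragment, DataRequest and the RobIn interface are illustrative stand-ins for the real classes.

```cpp
// Sketch of the ROS request dispatch (illustrative names, not ATLAS code).
#include <cstdint>
#include <functional>
#include <vector>

struct Fragment {
    uint32_t level1Id;                 // events are addressed by LVL1 ID
    std::vector<uint8_t> payload;
};

enum class RequestType { RoI, EventBuild, Clear };

struct DataRequest {
    RequestType      type;
    uint32_t         level1Id;
    std::vector<int> rols;             // RoI request: the few ROLs inside the RoI
};

// Stand-in for the custom RobIn module buffering fragments per ROL.
class RobIn {
public:
    Fragment fetch(int /*rol*/, uint32_t l1id) {
        return Fragment{l1id, {/* payload from the ROL buffer */}};
    }
    void release(uint32_t /*l1id*/) { /* free buffer pages for this event */ }
};

void serveRequest(RobIn& robin, const std::vector<int>& allRols,
                  const DataRequest& req,
                  const std::function<void(const Fragment&)>& send) {
    switch (req.type) {
    case RequestType::RoI:             // ~75 kHz, only ~2% of the ROLs
        for (int rol : req.rols) send(robin.fetch(rol, req.level1Id));
        break;
    case RequestType::EventBuild:      // ~2 kHz, every ROL of this ROS
        for (int rol : allRols) send(robin.fetch(rol, req.level1Id));
        break;
    case RequestType::Clear:           // grouped clears free buffer space
        robin.release(req.level1Id);
        break;
    }
}
```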

ROS High-Level Components
[Diagram: ROS subsystem between the RODs and the L2 & EB systems, with data requests, event fragments, monitoring data and control/configuration paths]
- RobIn (custom module) ×500: receives and buffers event fragments from the RODs
- I/O Manager (SW process) ×150: serves data requests from L2 & EB
- Local Controller (SW process): control/configuration and monitoring data, interfaced to the Online SW

Test Setup: ROS Performance
- ROS implemented on a 2 GHz PC with 4 PCI buses (64-bit/66 MHz)
- 3 RobIn emulators on PCI, each simulating 4 input channels, i.e. 12 ROLs per ROS; the on-board "local" bus is limited to 266 MB/s
- I/O to/from an L2 & EB emulator, connected to the ROS through a GE switch:
  - sends RoI/EB requests and clears to the ROS
  - receives data fragments back
  - uses TCP as communication protocol (maximum possible overhead for message passing)

ROS Performance
[Plot: measured ROS performance under ATLAS baseline conditions (from the paper model, which contains a safety factor of 4 with respect to physics simulation)]

The LVL2 DataFlow
- Receive RoI information from LVL1, at the LVL1 rate (75 kHz)
- Form a LVL1 result record: build one record per event, at the LVL1 rate
- Retrieve RoI data from the ROSs: ~2% of the full event (~30 kB)
- Forward the LVL2 decision to the EB at the LVL2 accept rate (rejects are grouped)
- Forward the LVL2 decision record to the EB at the LVL2 accept rate (~2 kHz)
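The per-event LVL2 cycle listed above can be summarised in a short sketch. The interfaces below are assumptions (L2svClient, RosClient and friends are our own names, not ATLAS classes); the sketch only fixes the sequence: assignment from the supervisor, RoI data collection from the ROSs, selection, decision.

```cpp
// Illustrative sketch of one L2PU cycle (hypothetical interfaces).
#include <cstdint>
#include <vector>

struct RoIRecord { uint32_t level1Id; std::vector<int> roiRols; };
struct Fragment  { uint32_t level1Id; std::vector<uint8_t> payload; };

struct RosClient {            // stand-in for the RoI data-collection layer
    std::vector<Fragment> collect(const RoIRecord& /*roi*/) {
        return {};            // ~30 kB in total, fetched from a few ROSs
    }
};

bool runSelection(const std::vector<Fragment>& /*data*/) {
    return false;             // trigger algorithms; the ~10 ms dominate the cycle
}

struct L2svClient {           // stand-in for the supervisor connection
    RoIRecord nextAssignment() { return {}; }
    void sendDecision(uint32_t /*l1id*/, bool /*accept*/) {
        // rejects are buffered and forwarded in groups
    }
};

void l2puLoop(L2svClient& l2sv, RosClient& ros) {
    for (;;) {
        RoIRecord roi = l2sv.nextAssignment();   // from the L2SV, at LVL1 rate
        auto data     = ros.collect(roi);        // RoI data, ~2% of the event
        bool accept   = runSelection(data);
        // on accept, the detailed LVL2 record is also made available to the EB
        l2sv.sendDecision(roi.level1Id, accept);
    }
}
```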

LVL2 High-Level Components
[Diagram: LVL2 subsystem between LVL1, the ROS and the EB, with RoI requests/data, decisions and control/configuration paths]
- RoIBuilder (custom module) ×1: receives RoI information from LVL1
- L2SV (SW process) ×10: supervises the L2PUs and forwards decisions to the EB
- L2PU (SW process): requests and receives RoI data from the ROSs
- pROS (SW process) ×1: receives the L2 record for the Event Builder
- DC Controller (SW process): control/configuration, interfaced to the Online SW

Performance of the RoI Builder, L2SV and pROS
- RoI Builder: the custom-built 12U VME prototype has achieved the required performance
- LVL2 supervisor: ~30 kHz measured per L2SV on a 2.4 GHz dual-CPU PC, insensitive to the number of L2PUs
- pROS: not a demanding application; the requirement to receive <10 kB at the LVL2 accept rate (~3 kHz) and forward it to the EB is largely satisfied

Test Setup: Performance of the L2PU
- ROS emulators used to send data over Gbit Ethernet
- RoI data collection always takes a small fraction of the ~10 ms needed for LVL2 event processing
- From a dataflow point of view, far fewer than 100 L2PUs could already sustain the LVL1 rate
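A back-of-the-envelope reading of that last point (our own arithmetic, using only the round numbers quoted on these slides): a farm must keep N = R × t events in flight, so dataflow alone occupies N = 75 000 /s × t_collect L2PUs. N << 100 therefore corresponds to an RoI collection time t_collect well below 100 / (75 000 /s) ≈ 1.3 ms per event, indeed a small fraction of the ~10 ms LVL2 processing budget.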

The Event Builder
- Receive LVL2 decisions at the LVL2 accept rate (~2 kHz; rejects are grouped)
- Request data from the ROSs and the pROS
- Build complete events (~70 MB/s into every SFI); depending on the ROS implementation, its fragments are merged into one
- Distribute clears to the ROSs and the pROS, at a rate below the LVL2 accept rate (clears are grouped)
- Forward complete events to the EF (~70 MB/s out of every SFI)
- Provide fragment sampling for data monitoring
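A compact sketch of this flow, under hypothetical interfaces (Dfm, Sfi, RosProxy are illustrative names, not the real applications' classes): the DFM assigns each accepted event to an SFI, and the SFI gathers one merged fragment per ROS before forwarding the complete event to the EF.

```cpp
// Illustrative sketch of DFM assignment and SFI event building.
#include <cstddef>
#include <cstdint>
#include <utility>
#include <vector>

struct EventFragment { uint32_t level1Id; std::vector<uint8_t> data; };
struct FullEvent     { uint32_t level1Id; std::vector<EventFragment> fragments; };

struct RosProxy {                          // one per ROS (plus one for the pROS)
    EventFragment request(uint32_t l1id) { return {l1id, {}}; }
};

class Sfi {
public:
    explicit Sfi(std::vector<RosProxy> sources) : sources_(std::move(sources)) {}

    // Build one complete event: one request/response per ROS, then merge.
    FullEvent build(uint32_t l1id) {
        FullEvent ev{l1id, {}};
        ev.fragments.reserve(sources_.size());
        for (auto& ros : sources_) ev.fragments.push_back(ros.request(l1id));
        return ev;                          // ready to be forwarded to the EF
    }
private:
    std::vector<RosProxy> sources_;
};

class Dfm {
public:
    explicit Dfm(std::size_t nSfis) : nSfis_(nSfis) {}

    // On a LVL2 accept, pick the next SFI (round-robin here for simplicity;
    // a real assignment policy can take SFI occupancy into account).
    std::size_t assign(uint32_t /*l1id*/) { return next_++ % nSfis_; }
private:
    std::size_t nSfis_;
    std::size_t next_ = 0;
};
```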

EB High-Level Components
[Diagram: EB subsystem between LVL2, the ROS/pROS and the EF, with EB requests/data, LVL2 decisions, clears and control/configuration paths]
- DFM (SW process) ×1: receives LVL2 decisions and distributes clears to the ROSs and the pROS
- SFI (SW process) ×50: requests and receives event fragments, builds complete events, forwards them to the EF and provides monitoring data
- DC Controller (SW process) ×10: control/configuration, interfaced to the Online SW

Test Setup: EB Performance
[Diagram: 16 ROS emulators, up to 8 SFIs and the DFM connected through a switch]
- Many ROSs to many SFIs; SFI applications were run on 2.4 GHz dual-CPU PCs
- ROS emulators: ALTEON programmable GE NICs, raw Ethernet communication protocol, each simulating n sources

EB Performance
[Plot: event-building rate vs. number of SFIs, for 8 ROLs/ROS with flow control and for 1 ROL/ROS without flow control; the 16 ROS emulators limit the single-frame-message measurement]
- EB rate with 8 SFIs: ~350 Hz (~17% of the required ATLAS EB rate)

Conclusions & Outlook
- All elements of the DataFlow system have shown that they can satisfy the ATLAS requirements already with the present implementations and today's technology
- From now on, emphasis will be put on the performance of the integrated DataFlow system
- Testbeds are being set up to measure its behaviour, and the first results are encouraging

Spares

Results of Test 1 (no I/O to LVL2 & EB)
[Plot: ROS performance under ATLAS baseline conditions (from the paper model)]

Test Setup 2: Scaling of the LVL2 Network
- From a dataflow point of view, a few L2PUs already sustain a large fraction of the LVL1 rate

Test Setup 1: Performance of the DFM
[Diagram: a tester (L2SV + n SFIs) exchanging grouped LVL2 decisions, end-of-event (EoE) messages, DFM decisions and grouped clears (group size 300) with the DFM]
- The tester application emulates the L2SV and many SFIs
- The DFM handles the full I/O, as for real ATLAS
- The DFM is exposed to the full input message rate from the tester
- The DFM sends to non-existing destinations (a connectionless protocol is used)
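As an illustration of the grouped clears used here (group size 300), a minimal batching helper might look as follows; the class and method names are ours, not the DFM's. Batching keeps the clear-message rate at the ROSs roughly 300 times below the event-handling rate.

```cpp
// Sketch of grouped clears (illustrative interface, not the DFM's code).
#include <cstddef>
#include <cstdint>
#include <vector>

class ClearBatcher {
public:
    explicit ClearBatcher(std::size_t groupSize = 300) : groupSize_(groupSize) {}

    // Called once an event is fully built (or rejected by LVL2).
    void eventDone(uint32_t l1id) {
        pending_.push_back(l1id);
        if (pending_.size() >= groupSize_) flush();
    }
private:
    void flush() {
        multicastClears(pending_);   // one message to all ROSs and the pROS
        pending_.clear();
    }
    void multicastClears(const std::vector<uint32_t>& /*ids*/) { /* network send */ }

    std::size_t groupSize_;
    std::vector<uint32_t> pending_;
};
```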

DFM Performance
[Plot: DFM rate for raw Ethernet frames and for UDP, compared with the required ATLAS event-building rate]
- Test on a 2.2 GHz dual-CPU PC; the achievable rate is a function of the CPU clock

Test Setup 1: SFI Performance
[Diagram: 16 ROS emulators, the DFM, one SFI and the EF connected through a 1 Gbit/s Ethernet switch]
- Many ROSs to 1 SFI; the SFI application was run on a 2.4 GHz dual-CPU PC
- ROS emulators: ALTEON programmable GE NICs, raw Ethernet communication protocol, each simulating n sources

SFI Performance
[Plot: SFI throughput vs. number of ROLs per ROS, for EB only and with output to the EF; the I/O limit lies at 95 MB/s, otherwise the 2.4 GHz CPU is the limit]
- Reaching the I/O limit at 95 MB/s, otherwise CPU limited
- 35% performance gain with at least 8 ROLs/ROS
- Will approach the I/O limit for 1 ROL/ROS with a faster CPU
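One way to read the 35% gain (our own back-of-the-envelope reasoning, not stated on the slide): when each ROS concatenates the fragments of its ROLs into a single response, the number of request/response exchanges the SFI performs per event drops by the concentration factor; at ATLAS scale that is from one per ROL (1628 at 1 ROL/ROS) to roughly 1628/8 ≈ 200 at 8 ROLs/ROS. The per-message CPU overhead then shrinks accordingly, and the ~95 MB/s link I/O limit, rather than the CPU, becomes the bottleneck.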