A Scalable 1 MHz Trigger Farm Prototype with Event-Coherent DMA Input
V. Lindenstruth et al., KIP Uni-Heidelberg and CERN, RT2003, 22 May 2003

Similar presentations
Clara Gaspar on behalf of the LHCb Collaboration, “Physics at the LHC and Beyond”, Quy Nhon, Vietnam, August 2014 Challenges and lessons learnt LHCb Operations.

7 Nov 2002 Niels Tuning - Vertex A vertex trigger for LHCb The trigger for LHCb ….. and the use of the Si vertex detector at the first and second.
A Gigabit Ethernet Link Source Card Robert E. Blair, John W. Dawson, Gary Drake, David J. Francis*, William N. Haberichter, James L. Schlereth Argonne.
Copyright© 2000 OPNET Technologies, Inc. R.W. Dobinson, S. Haas, K. Korcyl, M.J. LeVine, J. Lokier, B. Martin, C. Meirosu, F. Saka, K. Vella Testing and.
27 th June 2008Johannes Albrecht, BEACH 2008 Johannes Albrecht Physikalisches Institut Universität Heidelberg on behalf of the LHCb Collaboration The LHCb.
The LHCb DAQ and Trigger Systems: recent updates Ricardo Graciani XXXIV International Meeting on Fundamental Physics.
CHEP04 - Interlaken - Sep. 27th - Oct. 1st 2004 T. M. Steinbeck for the Alice Collaboration 1/20 New Experiences with the ALICE High Level Trigger Data Transport.
Timm M. Steinbeck - Kirchhoff Institute of Physics - University Heidelberg - DPG 2005 – HK New Test Results for the ALICE High Level Trigger.
CHEP03 - UCSD - March 24th-28th 2003 T. M. Steinbeck, V. Lindenstruth, H. Tilsner, for the Alice Collaboration Timm Morten Steinbeck, Computer Science.
Router Architectures An overview of router architectures.
Fermilab Scientific Computing Division Fermi National Accelerator Laboratory, Batavia, Illinois, USA. The Power of Data Driven Triggering DAQ Topology.
K. Honscheid RT-2003 The BTeV Data Acquisition System RT-2003 May 22, 2002 Klaus Honscheid, OSU  The BTeV Challenge  The Project  Readout and Controls.
KIP TRACKING IN MAGNETIC FIELD BASED ON THE CELLULAR AUTOMATON METHOD TRACKING IN MAGNETIC FIELD BASED ON THE CELLULAR AUTOMATON METHOD Ivan Kisel KIP,
Data Acquisition Backbone Core DABC J. Adamczewski, H.G. Essel, N. Kurz, S. Linev GSI, Darmstadt The new Facility for Antiproton and Ion Research at GSI.
Architecture and Dataflow Overview LHCb Data-Flow Review September 2001 Beat Jost Cern / EP.
A TCP/IP transport layer for the DAQ of the CMS Experiment Miklos Kozlovszky for the CMS TriDAS collaboration CERN European Organization for Nuclear Research.
The High-Level Trigger of the ALICE Experiment Heinz Tilsner Kirchhoff-Institut für Physik Universität Heidelberg International Europhysics Conference.
Boosting Event Building Performance Using Infiniband FDR for CMS Upgrade Andrew Forrest – CERN (PH/CMD) Technology and Instrumentation in Particle Physics.
C.Combaret, L.Mirabito Lab & beamtest DAQ with XDAQ tools.
LECC2003 Amsterdam Matthias Müller A RobIn Prototype for a PCI-Bus based Atlas Readout-System B. Gorini, M. Joos, J. Petersen (CERN, Geneva) A. Kugel, R.
Use of GPUs in ALICE (and elsewhere) Thorsten Kollegger TDOC-PG | CERN |
ALICE Upgrade for Run3: Computing HL-LHC Trigger, Online and Offline Computing Working Group Topical Workshop Sep 5 th 2014.
Fast reconstruction of tracks in the inner tracker of the CBM experiment Ivan Kisel (for the CBM Collaboration) Kirchhoff Institute of Physics University.
Management of the LHCb DAQ Network Guoming Liu * †, Niko Neufeld * * CERN, Switzerland † University of Ferrara, Italy.
1 Network Performance Optimisation and Load Balancing Wulf Thannhaeuser.
Design Criteria and Proposal for a CBM Trigger/DAQ Hardware Prototype Joachim Gläß Computer Engineering, University of Mannheim Contents –Requirements.
KIP Ivan Kisel JINR-GSI meeting Nov 2003 High-Rate Level-1 Trigger Design Proposal for the CBM Experiment Ivan Kisel for Kirchhoff Institute of.
Standalone FLES Package for Event Reconstruction and Selection in CBM DPG Mainz, 21 March 2012 I. Kisel 1,2, I. Kulakov 1, M. Zyzak 1 (for the CBM.
Latest ideas in DAQ development for LHC B. Gorini - CERN 1.
1 Open charm simulations ( D +, D 0,  + c ) Sts geometry: 2MAPS +6strip (Strasbourg geo) or 2M2H4S (D+ and D - at 25AGeV); TOOLS: signal (D +  K - 
2003 Conference for Computing in High Energy and Nuclear Physics La Jolla, California Giovanna Lehmann - CERN EP/ATD The DataFlow of the ATLAS Trigger.
LHCb DAQ system LHCb SFC review Nov. 26 th 2004 Niko Neufeld, CERN.
HIGUCHI Takeo Department of Physics, Faculty of Science, University of Tokyo Representing dBASF Development Team BELLE/CHEP2000 Distributed BELLE Analysis.
June 17th, 2002Gustaaf Brooijmans - All Experimenter's Meeting 1 DØ DAQ Status June 17th, 2002 S. Snyder (BNL), D. Chapin, M. Clements, D. Cutts, S. Mattingly.
Modeling PANDA TDAQ system Jacek Otwinowski Krzysztof Korcyl Radoslaw Trebacz Jagiellonian University - Krakow.
Methods for fast reconstruction of events Ivan Kisel Kirchhoff-Institut für Physik, Uni-Heidelberg FutureDAQ Workshop, München March 25-26, 2004 KIP.
Management of the LHCb Online Network Based on SCADA System Guoming Liu * †, Niko Neufeld † * University of Ferrara, Italy † CERN, Geneva, Switzerland.
Cellular Automaton Method for Track Finding (HERA-B, LHCb, CBM) Ivan Kisel Kirchhoff-Institut für Physik, Uni-Heidelberg Second FutureDAQ Workshop, GSI.
1 PCI fragment buffers Input links TAGnet link protocol for generating event-coherent DMA bursts in trigger farms Hans Muller, Filipe Vinci dos Santos,
The CMS Event Builder Demonstrator based on Myrinet Frans Meijers. CHEP 2000, Padova Italy, Feb The CMS Event Builder Demonstrator based on Myrinet.
Reconstruction Chain used for the D Meson Analysis Ivan Kisel Kirchhoff Institute of Physics University of Heidelberg, Germany CBM Collaboration Meeting.
Future experiment specific needs for LHCb OpenFabrics/Infiniband Workshop at CERN Monday June 26 Sai Suman Cherukuwada Sai Suman Cherukuwada and Niko Neufeld.
Kalman Filter based Track Fit running on Cell S. Gorbunov 1,2, U. Kebschull 2, I. Kisel 2,3, V. Lindenstruth 2 and W.F.J. Müller 1 1 Gesellschaft für Schwerionenforschung.
1 Farm Issues L1&HLT Implementation Review Niko Neufeld, CERN-EP Tuesday, April 29 th.
Management of the LHCb DAQ Network Guoming Liu *†, Niko Neufeld * * CERN, Switzerland † University of Ferrara, Italy.
Pierre VANDE VYVRE ALICE Online upgrade October 03, 2012 Offline Meeting, CERN.
LVDS switch: failure tolerance in 2D torus topology Gloria Torralba Kirchhoff-Institut für Physik - University of Heidelberg Dtp. Electronic Engineering.
DAQ Overview + selected Topics Beat Jost Cern EP.
Network On Chip Cache Coherency Final presentation – Part A Students: Zemer Tzach Kalifon Ethan Kalifon Ethan Instructor: Walter Isaschar Instructor: Walter.
Markus Frank (CERN) & Albert Puig (UB).  An opportunity (Motivation)  Adopted approach  Implementation specifics  Status  Conclusions 2.
July 22, 2002Brainstorming Meeting, F.Teubert L1/L2 Trigger Algorithms L1-DAQ Trigger Farms, July 22, 2002 F.Teubert on behalf of the Trigger Software.
LECC2004 Boston Matthias Müller The final design of the ATLAS Trigger/DAQ Readout-Buffer Input (ROBIN) Device B. Gorini, M. Joos, J. Petersen, S. Stancu,
ROD Activities at Dresden Andreas Glatte, Andreas Meyer, Andy Kielburg-Jeka, Arno Straessner LAr Electronics Upgrade Meeting – LAr Week September 2009.
The Evaluation Tool for the LHCb Event Builder Network Upgrade Guoming Liu, Niko Neufeld CERN, Switzerland 18 th Real-Time Conference June 13, 2012.
HTCC coffee march /03/2017 Sébastien VALAT – CERN.
Electronics Trigger and DAQ CERN meeting summary.
Enrico Gamberini, Giovanna Lehmann Miotto, Roland Sipos
TELL1 A common data acquisition board for LHCb
Controlling a large CPU farm using industrial tools
RT2003, Montreal Niko Neufeld, CERN-EP & Univ. de Lausanne
The LHCb Event Building Strategy
LHCb Trigger and Data Acquisition System Requirements and Concepts
John Harvey CERN EP/LBC July 24, 2001
Event Building With Smart NICs
LHCb Trigger, Online and related Electronics
The LHCb High Level Trigger Software Framework
The LHCb Level 1 trigger LHC Symposium, October 27, 2001
LHCb Trigger LHCb Trigger Outlook:
TELL1 A common data acquisition board for LHCb
Presentation transcript:

A Scalable 1 MHz Trigger Farm Prototype with Event-Coherent DMA Input
V. Lindenstruth, D. Atanasov, I. Kisel, A. Walsch (KIP, Uni-Heidelberg, Germany)
H. Muller, D. Altmann, A. Guirao, F. Vinci dos Santos (CERN, Geneva, Switzerland)
Outline: LHCb Level-1 Trigger, Trigger Concept, Trigger Prototype, Trigger Simulation, Trigger Algorithm

Level-1 Trigger for LHCb
1. Find VELO 2D tracks and reconstruct the 3D primary vertex
2. Reconstruct high impact-parameter tracks in 3D
3. Extrapolate to TT through the small magnetic field -> PT
4. Match tracks to L0 muon objects -> PT and PID
5. Select B events using impact-parameter and PT information
6. Use T1-3 data to improve the selection further (5-10% of events)
(Figure: VELO, TT and T1-3 layout with example e, h and μ tracks)
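To make the decision flow above concrete, here is a minimal, self-contained C++ sketch of the final selection step. Every type, cut value and function name in it is an illustrative assumption, not the actual LHCb Level-1 code.

```cpp
// A minimal sketch of the Level-1 decision described above. All cuts, types
// and numbers are illustrative placeholders, not the real trigger code.
#include <cmath>
#include <cstdio>
#include <vector>

struct Track { double impactParam; double pt; bool hasL0Muon; };

bool level1Accept(const std::vector<Track>& tracks)
{
    // Step 5 of the slide: count high impact-parameter, high-pT candidates.
    // Steps 1-4 (2D/3D VELO tracking, TT extrapolation, L0 muon matching)
    // are assumed to have filled impactParam, pt and hasL0Muon already.
    int nCandidates = 0;
    for (const Track& t : tracks) {
        bool displaced  = std::fabs(t.impactParam) > 0.1; // placeholder cut [mm]
        bool hardOrMuon = (t.pt > 1.0) || t.hasL0Muon;    // placeholder cut [GeV]
        if (displaced && hardOrMuon) ++nCandidates;
    }
    return nCandidates >= 2;  // placeholder: require two B-like tracks
}

int main()
{
    std::vector<Track> evt = { {0.15, 1.4, false}, {0.20, 0.8, true}, {0.01, 0.3, false} };
    std::printf("L1 accept: %s\n", level1Accept(evt) ? "yes" : "no");
}
```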

Trigger Concept
- Reduce the rate from 1 MHz to 40 kHz
- Send data to the RUs (8 kB/evt at 1 MHz -> 8 GB/s): VELO + TT + L0DU; T1-3 on CPU demand (~5-10% of events)
- Traffic shaping -> use a Scheduler
- NIC with Remote Direct Memory Access
- Prototype: 2D torus with 32 dual nodes at 1.24 MHz
- Trigger farm: 3D torus with up to 1200 CPUs
(Figure: TagNet, Scheduler, RUs and CNs in a PC farm arranged as a 2D torus with X->Y routing)
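A quick back-of-the-envelope check of the quoted bandwidth, written as a small C++ snippet; the number of readout units used for the per-RU figure is purely an illustrative assumption.

```cpp
// Reproduces the aggregate bandwidth quoted above: 8 kB/event at 1 MHz.
#include <cstdio>

int main()
{
    const double eventSize = 8e3;  // bytes per event (slide: 8 kB/evt)
    const double inputRate = 1e6;  // events per second (slide: 1 MHz)
    const double aggregate = eventSize * inputRate;  // bytes/s
    std::printf("aggregate input bandwidth: %.1f GB/s\n", aggregate / 1e9);

    const int nReadoutUnits = 32;  // assumption, for illustration only
    std::printf("per-RU bandwidth for %d RUs: %.0f MB/s\n",
                nReadoutUnits, aggregate / nReadoutUnits / 1e6);
}
```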

Scheduler: Basic Block Diagram
- The supervisor of the system
- Handles coherent data transfer between the RUs and CNs
- Feeds the TagNet with tags for synchronous data transfer in the RUs
(Block diagram: Scheduler Core with Tag Input and Output Stages, TagNet Feed and Feedback, User Control, Control and Status Registers, event entries, and a list of free computing-node IDs)
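The scheduling behaviour described above can be modelled in a few lines of software. The real Scheduler is a dedicated hardware block; all class and method names below are hypothetical.

```cpp
// Hypothetical software model of the Scheduler: pair each incoming event
// entry with a free compute node and emit a tag on the TagNet, so that all
// RUs push that event's fragments to the same node (event coherence).
#include <cstdint>
#include <cstdio>
#include <queue>

struct Tag { uint32_t eventId; uint16_t targetCN; };

class Scheduler {
public:
    void eventArrived(uint32_t eventId) { events_.push(eventId); dispatch(); }
    void nodeBecameFree(uint16_t cnId)  { freeCNs_.push(cnId);   dispatch(); } // TagNet feedback
private:
    void dispatch() {
        while (!events_.empty() && !freeCNs_.empty()) {
            Tag tag{events_.front(), freeCNs_.front()};
            events_.pop(); freeCNs_.pop();
            sendOnTagNet(tag);  // every RU sees the same (event, CN) assignment
        }
    }
    void sendOnTagNet(const Tag& t) {
        std::printf("tag: event %u -> CN %u\n", unsigned(t.eventId), unsigned(t.targetCN));
    }
    std::queue<uint32_t> events_;   // "event entries"
    std::queue<uint16_t> freeCNs_;  // "free CN IDs entries"
};

int main() {
    Scheduler s;
    for (uint16_t cn = 0; cn < 4; ++cn) s.nodeBecameFree(cn);
    for (uint32_t ev = 100; ev < 103; ++ev) s.eventArrived(ev);
}
```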

Readout Unit (RU)
(Block diagram: Data In -> Subevent Buffer -> DMA -> PCI bus -> NIC interface with NIC In / NIC Out; Tag In -> Tag Buffer -> C/M? decision -> Command Execution or Message Execution -> MUX -> Tag Out; command tags and message tags are handled separately)
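The C++ sketch below mimics what the block diagram implies an RU does with incoming tags: a command tag triggers a DMA burst of the buffered subevent towards the NIC (event-coherent, since all RUs receive the same tag), while a message tag is handled locally. The software form and all names are assumptions; the actual RU is hardware.

```cpp
// Hypothetical sketch of the RU dataflow implied by the block diagram.
#include <cstdint>
#include <cstdio>
#include <map>
#include <vector>

struct Tag { bool isCommand; uint32_t eventId; uint16_t targetCN; };

class ReadoutUnit {
public:
    void storeSubevent(uint32_t eventId, std::vector<uint8_t> data) {
        subeventBuffer_[eventId] = std::move(data);
    }
    void onTag(const Tag& tag) {           // "C/M ?" decision
        if (tag.isCommand) startDma(tag);  // Command Execution
        else std::printf("message tag for event %u handled locally\n", unsigned(tag.eventId));
        forward(tag);                      // MUX -> Tag Out
    }
private:
    void startDma(const Tag& tag) {
        auto it = subeventBuffer_.find(tag.eventId);
        if (it == subeventBuffer_.end()) return;
        std::printf("DMA %zu bytes of event %u over PCI to NIC, destination CN %u\n",
                    it->second.size(), unsigned(tag.eventId), unsigned(tag.targetCN));
        subeventBuffer_.erase(it);         // buffer slot freed after the burst
    }
    void forward(const Tag&) { /* pass the tag on to the next RU on the TagNet */ }
    std::map<uint32_t, std::vector<uint8_t>> subeventBuffer_;
};

int main() {
    ReadoutUnit ru;
    ru.storeSubevent(100, std::vector<uint8_t>(128));
    ru.onTag(Tag{true, 100, 7});
}
```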

Trigger Farm Prototype in Heidelberg
- >1 MHz event rate
- 64 CPUs in a 2D torus
- 6 Gbit/s NIC
- 1 year
- 480 MB/s point-to-point, 450 MB/s x-y

GUI of the Prototype
- Automatic setup of the compute farm
- Configure and control processes on every CN

3D Torus Topology
- TagNet: schedules and sends small data packets
- Core network (3D): distributes data to the target compute nodes
- Cover network (1D): increases the number of compute nodes
- X->Y->Z routing path
(Figure: 4x4x(1+2+1) torus of RUs and CNs with Scheduler, TagNet and data links along the x, y, z axes)
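As an illustration of the X->Y->Z routing path named above, the sketch below performs dimension-ordered routing on a small torus; the torus size and the shortest-wraparound rule are assumptions made only for this example.

```cpp
// Dimension-ordered X->Y->Z routing on a torus: route one dimension at a
// time, taking the shorter way around the ring in each dimension.
#include <array>
#include <cstdio>

using Coord = std::array<int, 3>;                // (x, y, z)
constexpr std::array<int, 3> kSize = {4, 4, 4};  // illustrative torus size

int stepTowards(int from, int to, int size) {
    if (from == to) return 0;
    int fwd = ((to - from) % size + size) % size;  // hops going "up" with wraparound
    return (fwd <= size - fwd) ? +1 : -1;          // choose the shorter direction
}

void route(Coord node, const Coord& dst) {
    for (int dim = 0; dim < 3; ++dim) {            // X first, then Y, then Z
        while (node[dim] != dst[dim]) {
            node[dim] = (node[dim] + stepTowards(node[dim], dst[dim], kSize[dim])
                         + kSize[dim]) % kSize[dim];
            std::printf("hop -> (%d,%d,%d)\n", node[0], node[1], node[2]);
        }
    }
}

int main() { route({0, 0, 0}, {3, 1, 2}); }
```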

Ptolemy II Simulation of the Trigger
- 3D torus (6x6x8)
- 275 CNs

Simulation of the Trigger: Results
- 128 B/RU: 2.1 MHz measured
- 1200 CPUs, +5% T1-3
- Compact, scalable, fast, fast response
(Plots: simulated rates and latencies; legend entries include VELO, VELO events, VELO T1-3, New VELO, Scheduler)

Tracking Efficiency and PV Resolution
(Table: tracking efficiency for the track subsets reference B long, reference primary long, reference B, reference primary, reference set, all set, extra set, plus clone and ghost rates)
Primary-vertex resolution (core): Z ≈ 46 μm, X/Y ≈ 17 μm

Trigger Performance
1) Tracking efficiency: 97-99%
2) PV resolution: 46 μm
3) Timing: 4.8 ms per event on the CPU (Cellular Automaton algorithm)
Expecting a factor 7-8 in CPU power by 2007 (PASTA report), we are already within 1 ms.
FPGA co-processor at 50 MHz with 8 processing units running in parallel: mean 15 μs, max ~130 μs per event.
(Histograms: events vs. processing time; CPU distribution on a ms scale with values up to ~17 ms, FPGA co-processor distribution on a μs scale)
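The "within 1 ms" statement is simply the measured CPU time scaled by the expected gain in CPU power:

```latex
t_{2007} \approx \frac{4.8\ \mathrm{ms}}{7\ \text{to}\ 8} \approx 0.60\ \text{to}\ 0.69\ \mathrm{ms} < 1\ \mathrm{ms}
```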

Summary
- Architecture: demonstrated with a 3D torus and TagNet
- Prototype: 64 CPUs have shown stable operation at >1 MHz
- Simulation: based on the prototype measurements
- Algorithm: high performance on tracks and vertices
- Cost: 1300 kCHF (500 CPUs) / 2300 kCHF (1000 CPUs)
- Team: Heidelberg, CERN and Dubna