Online View and Planning
LHCb Trigger Panel
Beat Jost, CERN/EP

Architecture (Reminder)
[Slide: overall online system architecture diagram, not transcribed]

Features
- Support of Level-1 (L1) and HLT on a common infrastructure
- Two separate multiplexing layers, reflecting the different data rates of L1 and HLT
- Common CPU farm for both traffic streams
- Scalable, e.g. for an L1 upgrade to include the trackers
- Separate storage network to decouple the data flows
- Driven by trigger needs

Physical Implementation – Rack Layout in D1
- Room for 50 racks
- All racks identical
- Up to 2300 1U boxes (≤46 boxes/rack)
- Up to 150 subfarms (2-3 subfarms/rack)
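As a quick consistency check, the totals follow directly from the per-rack limits; a minimal Python sketch (the per-rack figures are from the slide, the totals are derived arithmetic):

```python
# Sizing check for the D1 rack layout (per-rack limits from the slide above).
RACKS = 50               # room for 50 identical racks
BOXES_PER_RACK = 46      # at most 46 1U boxes per rack
SUBFARMS_PER_RACK = 3    # 2-3 subfarms per rack; take the upper limit

max_boxes = RACKS * BOXES_PER_RACK        # 50 * 46 = 2300 1U boxes
max_subfarms = RACKS * SUBFARMS_PER_RACK  # 50 * 3  = 150 subfarms

print(f"max 1U boxes: {max_boxes}")
print(f"max subfarms: {max_subfarms}")
```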

Physical Implementation – Farm Rack
- Up to 46 1U servers in one 59U rack
- 2 (3) subfarms per rack
- 2 (3) data switches and one controls switch
- 1 patch panel for external connections (total of 9 links)
- Upgradeable to 3 subfarms per rack
- 2 Gb/s input bandwidth, upgradeable to 4 Gb/s
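What the rack-level input bandwidth means per server is easy to work out; a small sketch, assuming the 2 Gb/s (4 Gb/s after upgrade) figure applies to a fully populated rack of 46 servers (the per-server share is a derived, illustrative number, not from the slide):

```python
SERVERS_PER_RACK = 46  # fully populated rack, as above

for rack_input_gbps in (2, 4):  # 2 Gb/s now, upgradeable to 4 Gb/s
    per_server_mbps = rack_input_gbps * 1000 / SERVERS_PER_RACK
    print(f"{rack_input_gbps} Gb/s rack input -> ~{per_server_mbps:.0f} Mb/s per server")
# prints ~43 Mb/s per server now, ~87 Mb/s after the upgrade
```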

Physical Implementation – Rack Layout in D2
- 5 SFC racks
- 1 rack for the readout network
- Many racks for patch panels

Physical Implementation – Farm Controls
- Building block of 9U height consisting of:
  - 4 SFCs (upgradeable to 5)
  - 2 controls PCs
  - 1 patch panel
  - 2 spare spaces for upgrading the number of subfarms
  - One storage switch aggregating the output data from the SFCs
  - Patch panel for the storage uplink: 1 (2) Gb Ethernet links per crate
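The building block can also be written down as a simple inventory; a sketch with illustrative field names (the quantities are from the slide, the dictionary keys are not an official schema):

```python
# One 9U farm-controls building block, as itemised on the slide.
building_block = {
    "height_U": 9,
    "sfcs": 4,                 # upgradeable to 5
    "controls_pcs": 2,
    "patch_panels": 2,         # 1 for external connections + 1 for the storage uplink
    "spare_slots": 2,          # for upgrading the number of subfarms
    "storage_switches": 1,     # aggregates the output data from the SFCs
    "storage_uplinks_gbe": 1,  # 1 (optionally 2) Gb Ethernet links per crate
}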

Plans
- Mid 2004
  - Install UTP links between D3 and D2 and between D2 and D1 (patch panel to patch panel), ~2500 links
  - Fibre links between the surface and the underground area
- End 2004 / beginning 2005
  - Install basic computing infrastructure
    - Servers in the computer room
    - Basic links top-to-bottom (initially Gb Ethernet, later probably 10 Gb Ethernet)
  - Acquisition of the first basic network equipment and a rudimentary CPU farm (basically the HLT system)
- During 2005
  - Commissioning of the DAQ (actually online) system
- Beginning 2006
  - Preparation of the acquisition of the final readout network and CPU farm
- Mid 2006
  - Installation of the final infrastructure (farm + switches)

Open Questions
- Exact composition of the L1 trigger
  - So far assumed: VELO, TT, L0DU
  - Others?
- Is data from the Readout Supervisor ever needed in L1?
  - Information available at L0:

      L0 bunch current         8 bits
      Bunch ID (RS)           12 bits
      GPS                     40 bits
      Detector status         24 bits
      L0 Event ID             24 bits
      Trigger type             3 bits
      L0 Force bit             1 bit
      Bunch ID (L0DU)         12 bits
      BX type                  2 bits
      L0 synch error           1 bit
      L0 synch error forced    1 bit

- Need to know where (in which racks) the subdetector electronics will reside (needed for the network connections)
- Need to know event sizes (per FE board, preferably)
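One way to gauge whether shipping the Readout Supervisor information to L1 is affordable is to total the field widths in the table above; a minimal Python sketch (the field list is from the slide, the byte count is derived):

```python
# L0 information fields and their widths in bits, from the table above.
L0_FIELDS = [
    ("L0 bunch current",       8),
    ("Bunch ID (RS)",         12),
    ("GPS",                   40),
    ("Detector status",       24),
    ("L0 Event ID",           24),
    ("Trigger type",           3),
    ("L0 Force bit",           1),
    ("Bunch ID (L0DU)",       12),
    ("BX type",                2),
    ("L0 synch error",         1),
    ("L0 synch error forced",  1),
]

total_bits = sum(width for _, width in L0_FIELDS)
print(f"total: {total_bits} bits = {total_bits // 8} bytes per event")  # 128 bits = 16 bytes
```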