LHCb Trigger, Online and related Electronics

LHCb Trigger, Online and related Electronics
Artur Barczyk, CERN PH/LBC, on behalf of the LHCb collaboration

LHCb Trigger-DAQ system overview
- Beam crossing rate: 40 MHz; visible interaction rate: 10 MHz
- Two-stage trigger system:
  - Level-0: hardware, accept rate 1 MHz
  - High Level Trigger (HLT): software, accept rate 2 kHz
- Level-1 electronics: interface to the readout network
- Readout network: Gigabit Ethernet, full readout at 1 MHz
- HLT trigger farm: ~1800 nodes
(Diagram: front-end boards, L0 trigger, readout network, Timing and Fast Control, CPU farm, permanent storage)
X Pisa Meeting on Advanced Detectors, Artur Barczyk, CERN PH/LBC
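The rate figures above imply large rejection factors per stage; a few lines of illustrative Python (not LHCb software, just arithmetic on the numbers quoted on the slide) make them explicit:

```python
# Sketch of the rate reductions quoted on the overview slide.
# All figures come from the slide itself; this is illustrative
# arithmetic, not part of the LHCb trigger software.

def reduction(rate_in_hz: float, rate_out_hz: float) -> float:
    """Rejection factor achieved by a trigger stage."""
    return rate_in_hz / rate_out_hz

visible_rate = 10e6   # visible interactions: 10 MHz
l0_accept    = 1e6    # Level-0 (hardware) accept rate: 1 MHz
hlt_accept   = 2e3    # HLT (software) accept rate: 2 kHz

print(f"L0 rejection:  x{reduction(visible_rate, l0_accept):.0f}")
print(f"HLT rejection: x{reduction(l0_accept, hlt_accept):.0f}")
print(f"Overall:       x{reduction(visible_rate, hlt_accept):.0f}")
```

The two stages together reduce the visible interaction rate by a factor 5000 before events reach permanent storage.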

The Front-End system
Level-0 electronics:
- Sub-detector specific implementation
- On-detector, radiation hard
- Event data buffered during Level-0 trigger processing
Level-1 electronics:
- Behind the radiation wall
- Performs zero suppression
- Interfaces to the Readout Network with a readout-specific protocol
(Diagram: analog FE, ADC, L0 buffer, L0 derandomizer, zero suppression, MUX, L1 input buffer, output buffer, link interface to the DAQ; L0 trigger processor, L0 Decision Unit, Readout Supervisor, TTCrx, L0 throttle fan-out)

Level-0 Trigger
- Custom electronics with fixed latency: 4 µs including link delays; processing time: 2 µs
- Pile-up system: determines the number of interactions per crossing
- Calorimeter trigger: high-ET clusters, SPD multiplicity
- Muon trigger: high-pT muons
- L0 Decision Unit: evaluates the trigger information and presents the L0 decision to the Readout Supervisor
(Diagram: pile-up system, calorimeter trigger and muon trigger feeding the L0 Decision Unit, which reports to the Readout Supervisor)

Level-0 Pile-up System
- Identifies bunch crossings with multiple interactions
- Uses hits in two silicon planes upstream of the IP
- Histograms the track origin along the beam axis (ZV)
- Hits belonging to the highest peak are masked, then a second peak is searched for
- Information sent to the L0 Decision Unit: number of tracks in the second peak, hit multiplicity
- Performance: efficiency for 2 interactions ~60% at 95% purity
- Latency: ~1 µs
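The peak-search procedure above can be sketched in a few lines of Python. This is an illustrative toy, not the real firmware: the bin width, the masking window and the `find_vertices` helper are assumptions made for the example.

```python
# Toy sketch of the pile-up peak search: histogram candidate track
# origins along the beam axis (ZV), mask the bins of the highest peak,
# then look for a second peak. Bin width and masking window are
# illustrative assumptions, not LHCb firmware parameters.

def find_vertices(zv_values, bin_width=5.0, mask_halfwidth=1):
    # Fill a simple histogram of track origins on the beam axis.
    hist = {}
    for z in zv_values:
        b = int(z // bin_width)
        hist[b] = hist.get(b, 0) + 1

    # First peak: bin with the most entries.
    peak1 = max(hist, key=hist.get)
    n1 = hist[peak1]

    # Mask the first peak and its neighbours, then search again.
    for b in range(peak1 - mask_halfwidth, peak1 + mask_halfwidth + 1):
        hist.pop(b, None)
    if not hist:
        return n1, 0
    peak2 = max(hist, key=hist.get)
    return n1, hist[peak2]

# Two simulated interactions: tracks clustered near z = 0 and z = 40 mm.
tracks = [0.5, 1.2, -0.8, 2.1, 0.3, 41.0, 39.5, 40.2]
n_first, n_second = find_vertices(tracks)
print(n_first, n_second)  # tracks in the first and second peak
```

The number of tracks in the second peak is exactly the quantity the slide says is forwarded to the L0 Decision Unit.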

Level-0 Calorimeter Trigger
- Finds high-ET candidates in regions of 2x2 cells
- Particle ID from ECAL and HCAL energy, pre-shower and SPD information
- ET threshold: ~3 GeV
- Information sent to the L0 Decision Unit: highest-ET candidate per type (electron, photon, π0, hadron), total calorimeter energy, SPD multiplicity
- Performance: efficiency for hadronic channels 30-50%
- Latency: ~1 µs
(Diagram: SPD/PreShower, ECAL and HCAL front-end and validation cards feeding the selection crates)
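The 2x2-cell candidate search can be illustrated with a toy sliding-window scan. The `highest_2x2_candidate` helper and the grid below are assumptions for illustration, not the actual trigger firmware:

```python
# Minimal sketch of a 2x2-cell cluster search: slide a 2x2 window over
# a grid of calorimeter cell ETs and keep the highest-ET candidate
# above threshold. Illustrative only; not the real selection logic.

def highest_2x2_candidate(et, threshold_gev=3.0):
    """Return ((row, col), ET) of the best 2x2 sum above threshold, else None."""
    best = None
    rows, cols = len(et), len(et[0])
    for r in range(rows - 1):
        for c in range(cols - 1):
            s = et[r][c] + et[r][c + 1] + et[r + 1][c] + et[r + 1][c + 1]
            if s >= threshold_gev and (best is None or s > best[1]):
                best = ((r, c), s)
    return best

# Toy 4x4 grid of cell ETs in GeV.
cells = [
    [0.1, 0.2, 0.0, 0.1],
    [0.3, 2.5, 1.8, 0.2],
    [0.2, 1.1, 0.9, 0.1],
    [0.0, 0.1, 0.2, 0.0],
]
print(highest_2x2_candidate(cells))
</imports>```

Summing over 2x2 windows rather than single cells captures the full shower energy, which is why the ~3 GeV threshold is applied to the window sum.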

Level-0 Muon Trigger
- Searches for straight lines in stations M2-M5, then finds matching hits in M1
- Momentum resolution: ~20% for b-decays
- Information sent to the L0 Decision Unit: the 2 highest-pT candidates per quadrant
- Performance: efficiency for B→J/ψ(µµ)X ~88%
- Latency: ~1 µs

Level-0 Decision Unit
- Logical OR of high-ET candidates
- Cuts on global variables: tracks in second vertex (3), pile-up multiplicity (112 hits), SPD multiplicity (280 hits), total ET (5 GeV)
- Applied thresholds:

  Channel      Threshold (GeV)   Incl. rate (kHz)
  Hadron           3.6               705
  Electron         2.8               103
  Photon           2.6               126
  π0 local         4.5               110
  π0 global        4.0               145
  Muon             1.1
  Di-muon pT       1.3
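The decision logic, a logical OR over the channel thresholds gated by the global cuts, can be sketched as below. The cut directions and the `l0_decision` helper are illustrative assumptions, since the slide lists only the cut values:

```python
# Hedged sketch of the L0 decision: an OR over the per-channel ET/pT
# thresholds from the table above, vetoed by the global event cuts.
# The comparison directions are assumptions for illustration only.

L0_THRESHOLDS_GEV = {  # values from the threshold table
    "hadron": 3.6, "electron": 2.8, "photon": 2.6,
    "pi0_local": 4.5, "pi0_global": 4.0, "muon": 1.1, "dimuon": 1.3,
}

def l0_decision(candidates_gev, second_vertex_tracks, pileup_mult, spd_mult):
    # Global event cuts (directions assumed; slide quotes only the values).
    if second_vertex_tracks >= 3:
        return False
    if pileup_mult >= 112 or spd_mult >= 280:
        return False
    # Logical OR of all candidates above their channel threshold.
    return any(candidates_gev.get(ch, 0.0) >= thr
               for ch, thr in L0_THRESHOLDS_GEV.items())

print(l0_decision({"hadron": 4.0}, 1, 50, 100))   # hadron above threshold
print(l0_decision({"photon": 2.0}, 1, 50, 100))   # nothing above threshold
```

A single candidate above its threshold accepts the event, which is what makes the per-channel inclusive rates in the table meaningful on their own.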

Level-1 Electronics
- Common Level-1 board receives Level-0 accepted events
- Sub-detector specific links: copper for the VELO, optical for the other sub-detectors
- Analog signals are digitized on the board
- Level-0 throttle signal raised on input-buffer occupancy
- Performs zero suppression and event formatting for the DAQ
- Quad-GbE NIC as a plug-in card: ~475 MB/s output bandwidth per board

Readout Network
- Gigabit Ethernet from the Level-1 boards to the farm nodes
- ~300 Level-1 front-end modules (not all use all 4 interfaces), ~750 input links
- Event Filter Farm: ~1800 nodes (estimated from the 2005 Real-Time Trigger Challenge results), organised in 50 sub-farms of up to 44 nodes each
- Total system throughput: 50 GB/s, designed for 80% average output-link utilisation
- Routed network: single core router (Force10 E1200, 1260 GbE ports), routing switches in each sub-farm, static routes
(Diagram: L1 FE boards with 1-4 GbE uplinks into the core router, 12 x 1 GbE links down to each of the 50 sub-farms)

Event Building Traffic
- Each Level-1 module contains one fragment of a given event
- The Readout Supervisor broadcasts the address of the destination node to all Level-1 boards (push protocol)
- The Readout Network guarantees delivery of all event fragments to a single node, giving the event-building traffic pattern
- Data is embedded in IP packets; no transport-layer protocol
- Multi-Event Packets: several event fragments are packed into a single IP packet
  - Reduces the frame rate, interrupt rate and CPU load
  - Lower network-protocol overhead, hence better bandwidth utilisation
(Plot: packet rate [MHz] and bandwidth available for user data [%] versus data payload [B])
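Why multi-event packets improve bandwidth utilisation follows from the fixed per-frame overhead; the sketch below uses standard Ethernet/IPv4 header sizes, which are not quoted on the slide:

```python
# Back-of-the-envelope sketch: per-frame Ethernet + IP overhead is
# fixed, so packing several event fragments into one IP packet raises
# the share of the link carrying user data. Overhead figures are
# standard Ethernet/IPv4 numbers, not taken from the slide.

ETH_OVERHEAD = 14 + 4 + 8 + 12   # MAC header + FCS + preamble + interframe gap
IP_HEADER    = 20                # IPv4 header, no options

def user_data_fraction(fragment_size, fragments_per_packet):
    payload = fragment_size * fragments_per_packet
    on_wire = payload + IP_HEADER + ETH_OVERHEAD
    return payload / on_wire

for n in (1, 4, 10):
    f = user_data_fraction(fragment_size=100, fragments_per_packet=n)
    print(f"{n:2d} fragments/packet: {f:.1%} of the wire is user data")
```

For 100-byte fragments the user-data share rises from about 63% with one fragment per packet to about 95% with ten, while the frame rate (and hence the receiver's interrupt rate) drops by the same factor of ten.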

Readout Network Scalability
- Currently estimated event size: ~35 kB, hence 50 GB/s network throughput including safety margins
- The LHC pilot run in 2007 will give real data sizes
- A scalable design is needed, also for possible upgrade scenarios
- Achieved through modularity in the FE design, multiple interfaces on each Level-1 board, and modularity at sub-farm level
(Diagram: L1 FE boards with 1-4 GbE uplinks, 6 x 1 GbE per sub-farm, 50 sub-farms)
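The sizing arithmetic behind the quoted figures, as a quick check:

```python
# Event size times the Level-0 accept rate gives the raw event-building
# throughput; the quoted 50 GB/s design figure then corresponds to a
# safety margin on top of that. All inputs are from the slides.

event_size_bytes = 35e3   # ~35 kB estimated event size
l0_rate_hz       = 1e6    # full readout at 1 MHz

raw_throughput = event_size_bytes * l0_rate_hz          # bytes/s
print(f"raw:    {raw_throughput / 1e9:.0f} GB/s")
print(f"design: 50 GB/s -> margin of {50e9 / raw_throughput:.2f}x")
```

So ~35 GB/s of raw event data, with the 50 GB/s design leaving roughly a 1.4x margin for protocol overhead and larger-than-expected events once the 2007 pilot run provides real data sizes.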

Event Filter Farm
- Farm composed of 1U rack-mountable PCs, vendor independent
- Horizontal cooling with a heat exchanger at the back of the rack
- Dual-CPU nodes: one event-building process per node, one trigger process per CPU
- Same code runs "on-line" and "off-line"; only the running parameters change
- Farm nodes run independently of each other, which allows partitioning; nodes can be dynamically included in the system

High Level Trigger
- Uses final-quality information from the detector: tracking information combined with muons, hadrons and electrons
- The starting point is the L0 decision
- 4 "alleys" are defined, depending on the L0 decision; each alley is independent and provides a summary to the selection algorithms
- Decision based on reconstructed objects:
  - Exclusive selection: reconstructed B decays
  - Inclusive selection (inclusive B or D*): used for systematic studies

HLT Alleys
- Each alley consists of 3 major steps:
  1. Level-0 trigger confirmation: fast rejection using reconstructed VELO tracks matched to L0 objects
  2. Pre-trigger (fast rejection): primary vertex, VELO-TT matched tracks
  3. Reconstruction and decision: alley-dependent trigger algorithm, long tracking using all tracking detectors
- Example: the hadron alley

Summary
- The LHCb trigger consists of 2 levels:
  - Level-0: hardware, custom electronics, 1 MHz accept rate
  - High Level Trigger: software, CPU farm, 2 kHz accept rate
- Readout network based on copper Gigabit Ethernet: 50 GB/s throughput, scalable
- Trigger farm processes 1 MHz of events on ~1800 processing nodes
- Installation of the trigger and DAQ systems has started; commissioning from Q3 2006
- Will be ready for the LHC pilot run in 2007