New Test Results for the ALICE High Level Trigger
Timm M. Steinbeck - Kirchhoff Institute of Physics - University Heidelberg - DPG 2005 – HK 41.2
Slide 2: ALICE High Level Trigger Overview
● ALICE: LHC heavy-ion experiment
● Up to 15,000 particles per event
● HLT input data rate: up to 25 GB/s
● HLT output to storage: 1.2 GB/s
● Online reconstruction in several steps
● ADC values → 3D coordinates → track segments → tracks (see the sketch below)
● No latency requirements
● Large commodity PC cluster
● Hierarchy levels (HL) to match the physical detector layout
[Figure: raw ADC sample values illustrating the reconstruction input]
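The reconstruction chain above maps onto a sequence of processing steps. A minimal sketch of that step structure, with stub functions whose names and data layouts are assumptions for illustration only (the real HLT reconstruction code is not shown in these slides):

```python
# Illustrative sketch only: stub implementations of the reconstruction chain
# ADC values -> 3D space points -> track segments -> tracks.
# All function names and data layouts here are assumptions.

def adc_to_spacepoints(adc_values):
    """Cluster finding: turn raw ADC values into 3D space points (stub)."""
    # Placeholder: pretend every non-zero ADC sample yields one space point.
    return [(i, 0.0, 0.0) for i, v in enumerate(adc_values) if v > 0]

def spacepoints_to_segments(spacepoints):
    """Track-segment finding within one detector sector (stub)."""
    # Placeholder: group consecutive space points into fixed-size segments.
    return [spacepoints[i:i + 4] for i in range(0, len(spacepoints), 4)]

def segments_to_tracks(segments):
    """Track merging and fitting across sectors (stub)."""
    # Placeholder: treat every segment as one reconstructed track.
    return [{"points": seg} for seg in segments]

def reconstruct(adc_values):
    """Run the full chain on one event's worth of ADC values."""
    return segments_to_tracks(spacepoints_to_segments(adc_to_spacepoints(adc_values)))

if __name__ == "__main__":
    tracks = reconstruct([1, 2, 123, 255, 100, 30, 5, 0, 4, 3])
    print(f"reconstructed {len(tracks)} (toy) tracks")
```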
Slide 3: ALICE High Level Trigger Overview
Software framework to transport data through the cluster
● Requirements: flexible, efficient, fault-tolerant
● Components that communicate via a defined interface
● Interface does not copy large event data
  – Only descriptors
  – Lowers CPU load for data transport
● Components to define the flow of data in the cluster
● Component templates for application-specific functionality
● Components can be plugged together to create systems with arbitrary functionality
● Data-driven architecture (see the sketch below)
  ● Components receive data from other components
  ● Process the received data
  ● Forward output data to the next component
[Diagram: two chained components, each with Input → Processing → Output]
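A minimal sketch of the descriptor-passing idea behind this interface, assuming invented class and field names (this is not the actual framework API, which the slides do not show):

```python
# Minimal sketch of a descriptor-passing publisher/subscriber interface.
# NOT the real HLT framework API; all names here are assumptions.
from dataclasses import dataclass

@dataclass
class EventDescriptor:
    """Lightweight reference to event data; the payload itself is not copied."""
    event_id: int
    offset: int   # start of the data block inside a shared buffer
    size: int     # size of the data block in bytes

class Publisher:
    def __init__(self):
        self._subscribers = []

    def subscribe(self, subscriber):
        self._subscribers.append(subscriber)

    def announce(self, descriptor: EventDescriptor):
        # Only the small descriptor travels between components.
        for sub in self._subscribers:
            sub.new_event(descriptor)

class PrintSubscriber:
    def new_event(self, descriptor: EventDescriptor):
        print(f"event {descriptor.event_id}: "
              f"{descriptor.size} bytes at offset {descriptor.offset}")

if __name__ == "__main__":
    pub = Publisher()
    pub.subscribe(PrintSubscriber())
    pub.announce(EventDescriptor(event_id=1, offset=0, size=4096))
```

Because only the descriptor is handed around, the cost of announcing an event is independent of the event size, which is what keeps the CPU load of the data transport low.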
Slide 4: Interface Test Results
Benchmarks of the Publisher/Subscriber component interface
● Only the interface was benchmarked
● No shared-memory access
● Pentium 4 family influenced by L1 cache size (only 8 kB)
● Maximum rate: >110 kHz (dual Opteron)
● CPU overhead for a full event round-trip: <18 μs
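A toy illustration of how per-event CPU overhead and maximum rate are related in such a round-trip measurement (not the published benchmark code; the numbers above come from the production framework on the quoted hardware):

```python
# Toy round-trip timing sketch: illustrates the measurement structure only.
# The stand-in pipeline does no real work, so the numbers it prints are
# meaningless; only the overhead/rate relation is the point.
import time

class Pipeline:
    """Stand-in for a publish -> subscribe -> release round trip."""
    def round_trip(self, event_id: int):
        # In a real measurement this would announce a descriptor to a
        # subscriber and wait until the event is released again.
        return event_id

def benchmark(n_events: int = 1_000_000) -> None:
    pipeline = Pipeline()
    start = time.perf_counter()
    for event_id in range(n_events):
        pipeline.round_trip(event_id)
    elapsed = time.perf_counter() - start
    per_event_us = elapsed / n_events * 1e6
    rate_khz = n_events / elapsed / 1e3
    print(f"per-event overhead: {per_event_us:.2f} us, rate: {rate_khz:.1f} kHz")

if __name__ == "__main__":
    benchmark()
```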
Slide 5: HLT in TPC Sector Beamtest
Goals of HLT participation in the TPC testbeam:
– Test of the DAQ-HLT interface
– Gain real-world operation experience
[Setup diagram: detector prototype read out via optical links into DAQ PCs 1-3 (PCI cards, Fast Ethernet, 3x 250 GB disk, mass storage at 1 MB/s) and into HLT PCs 1 and 2 via RORC PCI cards; HLT PC 3 with an HLT-Out PCI card and an optical link back to the DAQ; HLT PCs interconnected via Gigabit Ethernet at 10 MB/s]
Slide 6: TPC Sector Beamtest HLT Software
● EventRateSubscriber measures the event receiving rate (see the sketch below)
● StorageWriter writes data to local disk
[Component diagram: RORC Publishers on HLT PCs 1 and 2 receive the optical-link data and feed SubscriberBridgeHead/PublisherBridgeHead pairs to HLT PC 3, where an EventMerger feeds a StorageWriter (HDD), the HLT-Out Subscriber and the EventRateSubscriber; a local StorageWriter also writes to HDD on HLT PC 2]
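A sketch of what an event-rate measuring subscriber can look like, in the spirit of the EventRateSubscriber component (the interface and names are assumptions):

```python
# Sketch of an event-rate measuring subscriber; names and the callback
# interface are assumptions, not the actual component code.
import time

class EventRateSubscriber:
    def __init__(self, report_every: int = 1000):
        self._report_every = report_every
        self._count = 0
        self._t0 = time.perf_counter()

    def new_event(self, descriptor) -> None:
        """Called by the framework for every received event descriptor."""
        self._count += 1
        if self._count % self._report_every == 0:
            elapsed = time.perf_counter() - self._t0
            print(f"receiving rate: {self._count / elapsed:.1f} events/s")

if __name__ == "__main__":
    sub = EventRateSubscriber(report_every=500)
    for event_id in range(2000):
        sub.new_event({"event_id": event_id})
```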
Slide 7: HLT in Readout of TRD Beamtest
● No optical link interface for the TRD prototype
● Use the HLT output card for the DAQ connection
● Adapt the HLT input component to the readout PCI card interface of the TRD prototype (ACEX card)
● TRDFormatter checks the data and adds a header for the optical link (see the sketch below)
● Optional local output via StorageWriter
[Setup diagram: detector prototype → LVDS link → ACEX PCI card in the HLT PC → ACEX Publisher → TRDFormatter → HLT-Out PCI card → optical link → PCI card in the DAQ PC; monitoring and a StorageWriter (HDD) attached locally]
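A sketch of a TRDFormatter-like step that validates a raw data block and prepends a header; the header layout used here (magic word, event ID, payload size) is an assumption, not the actual optical-link format:

```python
# Sketch of a TRDFormatter-like step: sanity-check a raw data block and
# prepend a header before it goes out on the optical link.
# The header layout is an assumption made for this sketch.
import struct

MAGIC = 0x54524400  # arbitrary marker chosen for this sketch ("TRD\0")

def format_trd_block(event_id: int, payload: bytes) -> bytes:
    """Validate the payload and return header + payload."""
    if len(payload) == 0:
        raise ValueError("empty TRD data block")
    if len(payload) % 4 != 0:
        raise ValueError("TRD data block not 32-bit aligned")
    header = struct.pack("<III", MAGIC, event_id, len(payload))
    return header + payload

if __name__ == "__main__":
    block = format_trd_block(42, bytes(range(16)))
    print(f"formatted block: {len(block)} bytes (12-byte header + payload)")
```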
Slide 8: HLT TaskManager Control System
● System to control the large number of components in the final HLT
● Fault-tolerant
● No single points of failure
● Hierarchical, master-slave
● Flexible configuration
● XML and Python based (see the sketch below)
● Interface libraries to communicate with the controlled programs
[Architecture diagram: the TaskManager contains a core, a configuration engine reading an XML configuration file, a program state & action engine with an embedded Python interpreter, and a program interface engine talking to the controlled programs via an interface library]
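A sketch of the TaskManager idea: an XML-configured controller that drives its controlled programs through state transitions. The tag names, states and configuration layout below are assumptions for illustration only:

```python
# Sketch of an XML-configured, state-driving controller in the spirit of the
# TaskManager.  Tag names, states and the config layout are assumptions.
import xml.etree.ElementTree as ET

EXAMPLE_CONFIG = """
<taskmanager id="slave0">
  <process id="rorc-publisher" command="RORCPublisher -link 0"/>
  <process id="storage-writer" command="StorageWriter -dir /data"/>
</taskmanager>
"""

class ControlledProgram:
    """Stand-in for the interface library talking to one controlled program."""
    def __init__(self, proc_id: str, command: str):
        self.proc_id, self.command = proc_id, command
        self.state = "off"

    def action(self, name: str, target_state: str) -> None:
        print(f"[{self.proc_id}] {name} -> {target_state}")
        self.state = target_state

class TaskManager:
    def __init__(self, config_xml: str):
        root = ET.fromstring(config_xml)
        self.tm_id = root.get("id")
        self.programs = [ControlledProgram(p.get("id"), p.get("command"))
                         for p in root.findall("process")]

    def transition(self, action: str, target_state: str) -> None:
        # Apply the same action to every controlled program.
        for prog in self.programs:
            prog.action(action, target_state)

if __name__ == "__main__":
    tm = TaskManager(EXAMPLE_CONFIG)
    tm.transition("start", "running")
    tm.transition("stop", "stopped")
```

In the real system a master TaskManager would issue such transitions to several slave TaskManagers, matching the hierarchical, master-slave layout shown on the next slide.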
Slide 9: HLT Control System
System used to control the HLT in the TPC beamtest
– Status & command flow is shown in black
– Data flow is shown in yellow
[Control diagram: a Master TaskManager supervises Slave TaskManagers 0-2, one per HLT PC; the slaves control the data-transport components of the TPC beamtest chain (RORC Publishers, StorageWriters and SubscriberBridgeHeads on HLT PCs 1 and 2; PublisherBridgeHeads, EventMerger, HLT-Out Subscriber and EventRateSubscriber on HLT PC 3)]
Slide 10: Global Processing Test
● Proof-of-concept global online grid system
● High Level Trigger system distributed globally
● Not just distributed on one cluster
● Distributed over five clusters
● Bergen & Tromsø, Norway; Dubna, Russia; Heidelberg, Germany; Cape Town, South Africa
● Test ran for more than 15 h
● Framework designed for operation in a cluster also works globally (see the sketch below)
[Map: participating sites Cape Town, Heidelberg, Bergen, Dubna and Tromsø]
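The slides do not show how the sites were coupled, but the SubscriberBridgeHead/PublisherBridgeHead pairs from the beamtest chains suggest the pattern: the subscriber side ships event blocks over the network to a publisher side at the remote site. A purely illustrative sketch of that pattern over TCP (the wire format, a 4-byte length prefix plus payload, is an assumption):

```python
# Purely illustrative sketch of bridging event data between two sites over
# TCP, in the spirit of a SubscriberBridgeHead/PublisherBridgeHead pair.
# The framing (4-byte little-endian length prefix + payload) is an assumption.
import socket
import struct
import threading

ready = threading.Event()

def recv_exact(conn: socket.socket, n: int) -> bytes:
    """Receive exactly n bytes or raise."""
    buf = b""
    while len(buf) < n:
        chunk = conn.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("connection closed early")
        buf += chunk
    return buf

def publisher_bridge(listen_port: int) -> None:
    """Remote side: accept one framed event block and hand it on locally."""
    with socket.socket() as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind(("127.0.0.1", listen_port))
        srv.listen(1)
        ready.set()                      # tell the sender we are listening
        conn, _ = srv.accept()
        with conn:
            (size,) = struct.unpack("<I", recv_exact(conn, 4))
            payload = recv_exact(conn, size)
            print(f"remote site received event block of {len(payload)} bytes")

def subscriber_bridge(host: str, port: int, payload: bytes) -> None:
    """Local side: serialize one event block and ship it to the remote site."""
    with socket.create_connection((host, port)) as cli:
        cli.sendall(struct.pack("<I", len(payload)) + payload)

if __name__ == "__main__":
    t = threading.Thread(target=publisher_bridge, args=(50007,))
    t.start()
    ready.wait()
    subscriber_bridge("127.0.0.1", 50007, b"\x00" * 1024)
    t.join()
```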
Slide 11: Global Processing Test
Setup & processing steps used:
● Bergen, Norway: simulated TPC clusterfinding, 3 PCs
● Heidelberg, Germany: simulated TPC tracking, 2 PCs
● Tromsø, Norway: simulated TPC clusterfinding, 2 PCs
● Dubna, Russia: simulated TPC clusterfinding, 1 PC
● Cape Town, South Africa: simulated DiMuon tracking + global event merging, 7 PCs
Slide 12: Conclusion
● The ALICE High Level Trigger data transport software has reached a mature state
● Performance on current CPUs is sufficient for use in the HLT
● Stable
● Can also operate in a globally distributed, online grid-like system
● Used successfully in two beamtests
  – TPC beamtest for interface tests
  – TRD beamtest for production data readout
● Current and future development concentrates on the higher layers, e.g. the control system