Workshop ALICE Upgrade Overview Thorsten Kollegger for the ALICE Collaboration ALICE | Workshop | 13.03.2013.


A Large Ion Collider Experiment

ALICE Upgrade Overview
Planned for 2018, the LHC 2nd Long Shutdown ("Upgrade of the ALICE Experiment", LoI, CERN-LHCC).
Physics goals:
- High-precision measurements of rare probes at low pT, which cannot be selected with a trigger and therefore require a large sample of events recorded on tape
- Target Pb-Pb recorded luminosity ≥ 10 nb^-1 → 8 × 10^10 events
- pp (at 5.5 TeV) recorded luminosity ≥ 6 pb^-1 → 1.4 × 10^11 events
- …and a significant improvement of the vertexing and tracking capabilities

ALICE Upgrade Overview
The upgrade plan entails building:
- A new, high-resolution, low-material ITS
- An upgraded TPC, replacing the MWPCs with GEMs and adding new pipelined readout electronics
- Upgraded readout electronics for the TRD, TOF, PHOS and Muon Spectrometer
- Upgraded forward trigger detectors and ZDC
- Upgraded online systems
- An upgraded offline reconstruction and analysis framework and code

Requirements
- Sample the full 50 kHz Pb-Pb interaction rate (current limit ~500 Hz: a factor 100 increase)
- Typical Pb-Pb event size of 22 MByte → ~1.1 TByte/s detector readout → ~500 PByte per heavy-ion period (1 month)
However:
- storage bandwidth is limited to ~20 GByte/s
- many physics probes have low S/B, so the classical trigger/event-filter approach is not efficient
- …and all this data has to be reconstructed
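
The bandwidth figures above follow from simple arithmetic; a quick sketch (the effective data-taking time is an inference from the ~500 PByte figure, not a number from the slide):

```python
# Back-of-envelope check of the readout figures quoted above.
interaction_rate_hz = 50_000      # 50 kHz Pb-Pb interaction rate
event_size_mbyte = 22             # typical Pb-Pb event size

readout_tbyte_s = interaction_rate_hz * event_size_mbyte / 1e6
print(readout_tbyte_s)            # 1.1 TByte/s detector readout

# The quoted ~500 PByte per heavy-ion period implies this much
# effective data-taking time within the one-month period:
effective_days = (500e3 / readout_tbyte_s) / 86400
print(round(effective_days, 1))   # ~5.3 days of effective beam time
```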

Strategy
Data reduction by online reconstruction:
- Store only the reconstruction results, discard the raw data
- Demonstrated with TPC data since Pb-Pb 2011
- Optimized data structures for lossless compression
- Algorithms designed to allow for "offline" reconstruction passes with improved calibrations
→ This implies a much tighter coupling between the online and offline computing systems
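
As an illustration of the "optimized data structures for lossless compression" point (a toy sketch with hypothetical values, not ALICE code or the real TPC format): grouping like fields together before handing the data to a general-purpose compressor typically compresses much better than interleaving records.

```python
import random
import struct
import zlib

random.seed(1)
# Toy "clusters": (pad, time, charge) triples with regular structure,
# standing in for real detector cluster data (hypothetical values).
clusters = [(i % 140, (7 * i) % 1000, random.randint(40, 60))
            for i in range(10_000)]

# Record-interleaved layout: pad, time, charge, pad, time, charge, ...
interleaved = b"".join(struct.pack("<HHH", *c) for c in clusters)

# Column-oriented layout: all pads, then all times, then all charges.
columnar = b"".join(
    struct.pack(f"<{len(clusters)}H", *(c[i] for c in clusters))
    for i in range(3))

print(len(zlib.compress(interleaved)), len(zlib.compress(columnar)))
```

The columnar layout lets the compressor exploit the regularity within each field, which is the same idea behind tailoring the online data format before storage.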

Data Bandwidth
The LHC luminosity variation during a fill and the running efficiency are taken into account for the average output to the computing center.
[Table: per-detector bandwidth for TPC, TRD, ITS, Others and Total, with columns "Input to Online System (GByte/s)", "Peak Output to Local Data Storage (GByte/s)" and "Avg. Output to Computing Center (GByte/s)"]

TPC Data Reduction
The first steps, up to clustering, run on the FPGA of the detector link receiver. Further steps require full event reconstruction; the pattern recognition requires only a coarse online calibration.

Data Format | Data Reduction Factor | Event Size (MByte)
Raw Data | 1 | 700
FEE Zero Suppression | 35 | 20
HLT Clustering & Compression | 5-7 | ~3
Remove clusters not associated to relevant tracks | 2 | 1.5
Data format optimization | 2-3 | <1
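
The reduction factors in the table multiply; a quick consistency check, using the lower bound of each quoted range:

```python
# Cumulative TPC data reduction, using the lower bound of each
# factor quoted in the table above.
factors = {
    "FEE zero suppression": 35,
    "HLT clustering & compression": 5,   # quoted as 5-7
    "cluster removal": 2,
    "data format optimization": 2,       # quoted as 2-3
}
total = 1
for factor in factors.values():
    total *= factor
print(total)  # 700: a 700 MByte raw event shrinks to ~1 MByte
```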

TPC Data Reduction
- The first compression steps have been used in production since the 2011 Pb-Pb run
- Online-found TPC clusters are the basis for the offline reconstruction
- Current R&D: using online-found TPC tracks to complement offline seed finding, and online calibration
[Plot: HLT compression performance, Pb-Pb 2011]

Current Online Systems
Different technologies/techniques are used in the DAQ and HLT, e.g. Ethernet and InfiniBand.

Combined DAQ/HLT System
[Diagram: the detectors (TPC, TRD, ITS, TOF, PHOS, EMCal, Muon, plus the trigger detectors) are read out over ~2500 DDL3s at 10 Gb/s into ~250 FLPs equipped with RORC3 cards; the FLPs feed ~1250 EPNs over a farm network at 2x10 or 40 Gb/s; the EPNs write to data storage over a storage network; the FTP distributes clock/L0/L1.]

Detector Readout
Combination of continuous and triggered readout, with continuous readout for the TPC and ITS:
- At 50 kHz, ~5 events overlap in the TPC during its drift time of 92 µs
- Continuous readout minimizes the needed bandwidth
- Implies a change from event granularity in the online systems to time windows containing multiple events
- Implies event building only after partial reconstruction
A Fast Trigger Processor (FTP) complements the CTP: it provides clock/L0/L1 to the triggered detectors, and to the TPC/ITS for data tagging and test purposes.
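
The "~5 events during the drift time" figure is just the interaction rate times the drift time:

```python
# Average number of interactions overlapping in the TPC drift volume.
interaction_rate_hz = 50_000   # 50 kHz Pb-Pb interaction rate
drift_time_s = 92e-6           # TPC drift time of 92 us
overlap = interaction_rate_hz * drift_time_s
print(round(overlap, 1))       # 4.6, i.e. the "~5 events" quoted above
```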

DDL/RORC Development
Data Link:
- DDL1 (now): 2 Gbit/s
- DDL2 (LS1): 6 Gbit/s
- DDL3 (LS2): 10 Gbit/s
Receiver Card (FPGA):
- RORC1 (now): 2 DDL1, PCI-X & PCIe Gen1 x4
- RORC2 (LS1, HLT): 12 DDL2, PCIe Gen2 x8
- RORC3 (LS2): DDL3, PCIe Gen3

Network Requirements
- Total number of nodes: ~1500
- FLP node output: up to 12 Gbit/s
- EPN node input: up to 7.2 Gbit/s
- EPN node output: up to 0.5 Gbit/s
Two technologies are available:
- 10/100 Gbit Ethernet (currently used in the DAQ)
- QDR/FDR InfiniBand (40/56 Gbit, used in the HLT)
Either would allow constructing a network satisfying these requirements even today.
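
Combining these per-node figures with the node counts from the combined DAQ/HLT slide (~250 FLPs, ~1250 EPNs) gives the aggregate traffic the farm network has to carry; a rough sketch:

```python
# Aggregate farm-network bandwidth implied by the per-node limits
# above; node counts are taken from the combined DAQ/HLT slide.
n_flp, flp_out_gbit = 250, 12
n_epn, epn_in_gbit = 1250, 7.2

flp_aggregate_gbit = n_flp * flp_out_gbit   # total FLP egress
epn_capacity_gbit = n_epn * epn_in_gbit     # total EPN ingress headroom
print(flp_aggregate_gbit, epn_capacity_gbit)  # 3000 9000.0
```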

Network Throughput Tests
Different topologies are under study to minimize cost, optimize failure tolerance and simplify cabling.
[Diagram: Spine & Leaf and Fat-Tree topologies built around director switches]

Processing Power
Estimate for the online systems, based on the current HLT processing power:
- ~2500 cores distributed over 200 nodes
- 108 FPGAs on H-RORCs for cluster finding; 1 FPGA is equivalent to ~80 CPU cores
- 64 GPGPUs for tracking (NVIDIA GTX480 + GTX580)
Scaling to the 50 kHz rate gives the required number of cores, with additional processing power from FPGAs + GPGPUs, to be provided by multi-core nodes in 2018.
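
The slide's own target core count did not survive transcription; a naive linear scaling, using the factor-100 rate increase from the Requirements slide and ignoring the FPGA/GPU shares, illustrates the order of magnitude involved:

```python
# Naive linear scaling of the current HLT capacity to 50 kHz.
# This is an illustration only: the real estimate also folds in
# the FPGA and GPGPU processing shares listed above.
current_cores = 2_500
rate_factor = 100            # ~500 Hz today -> 50 kHz
needed_cores = current_cores * rate_factor
print(needed_cores)  # 250000 core-equivalents before any offload
```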

Processing Power
The estimate of the available processing power is based on scaling by Moore's law. However, single-core clock speed is no longer increasing; the gains come from multi/many-core processors instead. The reconstruction software needs to adapt to make full use of these resources.
[Picture from Herb Sutter: "The Free Lunch Is Over: A Fundamental Turn Toward Concurrency in Software", Dr. Dobb's Journal, 30(3), March 2005 (updated)]

Parallel Reconstruction
Tracking is the most time-consuming reconstruction step in ALICE. A multi-threaded tracking algorithm was developed for the HLT, and also adapted to GPUs (NVIDIA Fermi, CUDA).
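
The TPC's division into independent readout sectors is what makes the first tracking stage parallelizable. A minimal stand-in sketch (plain Python threads with a placeholder worker, not the actual HLT tracker):

```python
from concurrent.futures import ThreadPoolExecutor

def track_sector(sector_id: int) -> str:
    # Placeholder for per-sector cluster-to-track-segment finding;
    # each TPC sector can be processed independently.
    return f"segments(sector {sector_id})"

# The TPC has 36 readout sectors (18 per end-cap side).
with ThreadPoolExecutor(max_workers=8) as pool:
    results = list(pool.map(track_sector, range(36)))
print(len(results))  # 36 per-sector results, later merged across sectors
```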

Computing Model
[Diagram: the O2 online-offline facility, fed by DAQ and HLT, performs reconstruction, calibration and re-reconstruction against an online raw data store; AliEn connects it to the T1/T2/T3 grid sites (custodial data store, reconstruction, analysis, simulation), to a CAF On Demand analysis facility with a data cache, and to a private CERN cloud and public clouds used for simulation.]

Summary
- The ALICE physics program requires handling 50 kHz minimum-bias Pb-Pb collisions (1 TByte/s) in the online and offline systems
- The strategy to handle this load is an ambitious data-volume reduction: a first-pass online reconstruction, with the raw data discarded, on a combined DAQ/HLT/offline farm (O2)
- R&D is under way towards the Online & Computing TDR at the end of 2014
An interesting future ahead…


Backup - Event Size
Expected data sizes for minimum-bias Pb-Pb collisions at full LHC energy.
[Table: per-detector event size (MByte) for TPC, TRD, ITS, Others and Total, with columns "After Zero Suppression" and "After Data Compression"]

Why triggering does not work: the open charm case
Estimated signal statistics and trigger rate for minimum-bias Pb-Pb collisions, at a hadronic interaction rate of 50 kHz. B′ is the background in the broad invariant-mass range (±12σ). Triggering on D0, Ds and Λc (pT > 2 GeV/c) → ~36 kHz.