Data rate reduction in ALICE

Data volume and event rate

TPC detector:
- data volume = 300 Mbyte/event
- data rate = 200 Hz

System bandwidths (300 Mbyte/event x 200 Hz = 60 Gbyte/sec at the front end):
- front-end electronics: 60 Gbyte/sec
- DAQ – event building: 15 Gbyte/sec
- Level-3 system: < 1.2 Gbyte/sec
- permanent storage system: < 2 Gbyte/sec

Data rate reduction

Volume reduction:
- regions-of-interest and partial readout
- data compression: entropy coder, vector quantization, TPC-data modeling

Rate reduction:
- (sub-)event reconstruction and event rejection before event building

Regions-of-interest and partial readout (1)

- Selection of TPC sector and eta-slice based on a TRD track candidate
- Momentum filter for D0 decay tracks based on TPC tracking

Regions-of-interest and partial readout (2)

Momentum filter for D0 decay tracks based on TPC tracking: pT > 0.8 GeV/c vs. all pT
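The momentum filter itself is simple; a minimal sketch of the pT cut that defines the readout regions (the track representation here is hypothetical, not the HLT data structure):

```python
def roi_tracks(tracks, pt_cut=0.8):
    """Momentum filter: keep TPC track candidates above the D0-decay
    threshold (pT > 0.8 GeV/c, as on the slide); their trajectories
    define the regions of interest for partial readout."""
    return [t for t in tracks if t["pt"] > pt_cut]

# Hypothetical track candidates (pt in GeV/c):
tracks = [{"id": 1, "pt": 0.3}, {"id": 2, "pt": 1.2}, {"id": 3, "pt": 0.9}]
selected = roi_tracks(tracks)
```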

Data compression: Entropy coder

Variable-length coding: short codes for frequent values, long codes for infrequent values (exploiting the probability distribution of 8-bit TPC data).

Results (Arne Wiebalck, diploma thesis, Heidelberg):
- NA49: compressed event size = 72%
- ALICE: compressed event size = 65%
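The variable-length-coding idea can be illustrated with a standard Huffman construction; the ADC value frequencies below are illustrative, not the measured TPC distribution:

```python
import heapq
from collections import Counter

def huffman_codes(samples):
    """Build a Huffman code table from observed ADC samples:
    frequent values get short codes, infrequent ones long codes."""
    freq = Counter(samples)
    # Heap entries: (frequency, tie-breaker, {symbol: code-so-far}).
    heap = [(n, i, {sym: ""}) for i, (sym, n) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:  # degenerate single-symbol input
        (_, _, table), = heap
        return {sym: "0" for sym in table}
    tie = len(heap)
    while len(heap) > 1:
        n1, _, t1 = heapq.heappop(heap)
        n2, _, t2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in t1.items()}
        merged.update({s: "1" + c for s, c in t2.items()})
        heapq.heappush(heap, (n1 + n2, tie, merged))
        tie += 1
    return heap[0][2]

# Toy 8-bit ADC distribution: baseline-like values dominate.
data = [0] * 70 + [1] * 15 + [2] * 10 + [7] * 5
codes = huffman_codes(data)
# The most frequent symbol gets the shortest code.
assert len(codes[0]) == min(len(c) for c in codes.values())
```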

Data compression: TPC – RCU

TPC front-end electronics system architecture and readout control unit (RCU): pipelined Huffman encoding unit, implemented in a Xilinx Virtex 50 chip*.

* T. Jahnke, S. Schoessel and K. Sulimma, EDA group, Department of Computer Science, University of Frankfurt

Data compression: Vector quantization

Sequence of ADC values on a pad = vector. Vector quantization = transformation of vectors into codebook entries: each vector is compared against the code book and replaced by the index of the closest entry; the quantization error is the remaining distance to that entry.

Results (Arne Wiebalck, diploma thesis, Heidelberg):
- NA49: compressed event size = 29%
- ALICE: compressed event size = 48%–64%
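A sketch of the lookup step, with a hypothetical codebook of pulse shapes (real codebooks are trained on data and much larger):

```python
import math

def quantize(vector, codebook):
    """Map a pad's ADC-time sequence to the index of the nearest
    codebook entry (minimum Euclidean distance); only this index
    needs to be stored or transmitted."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(range(len(codebook)), key=lambda i: dist(vector, codebook[i]))

# Illustrative codebook of typical pulse shapes (hypothetical values):
codebook = [
    (0, 0, 0, 0),     # empty pad
    (5, 40, 38, 6),   # single cluster
    (8, 30, 32, 28),  # overlapping clusters
]
idx = quantize((6, 42, 35, 5), codebook)
```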

Data compression: TPC-data modeling

- Fast local pattern recognition → local track parameters
- Track and cluster modeling:
  - simple local track model (e.g. helix) → track parameters
  - analytical cluster model
  - quantization of deviations from track and cluster model
  - comparison to raw data
- Result: NA49: compressed event size = 7%
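The quantization-of-deviations idea can be sketched with a straight-line local track model standing in for the helix (step size and fit are illustrative assumptions):

```python
def model_residuals(positions, step=0.05):
    """Fit a straight local track model to cluster positions along
    consecutive padrows, then return the model parameters plus
    coarsely quantized residuals (deviations from the model); only
    these small integers need to be stored, not full coordinates."""
    n = len(positions)
    xs = list(range(n))
    mx = sum(xs) / n
    my = sum(positions) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, positions))
             / sum((x - mx) ** 2 for x in xs))
    resid = [round((y - (my + slope * (x - mx))) / step)
             for x, y in zip(xs, positions)]
    return (my, slope), resid

# Clusters lying almost exactly on a line give near-zero residuals:
params, resid = model_residuals([1.0, 1.5, 2.0, 2.5])
```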

Event rejection

Trigger/dataflow diagram (L0 → L1 → L2 → HLT → DAQ, ordered by time and causality); event sizes and number of links are for the TPC only:
- Other trigger detectors deliver the L0 pretrigger; the TRD trigger (~2 kHz) supplies e+e- tracks and seeds to the global trigger.
- L0: readout of TPC and other detectors.
- L1: tracking of e+e- candidates inside the TPC; selection of regions of interest.
- L2 accept: verify the e+e- hypothesis using track segments and space points; otherwise reject the event.
- Zero-suppressed TPC data, sector-parallel: 216 links, 83 MB/evt; detector raw data readout for debugging.
- Binary lossless data compression (RLE, Huffman, LZW, etc.): 45 MB/evt.
- HLT online data reduction (tracking, reconstruction, partial readout, compression): 4–40 MB/evt.

Fast pattern recognition

Essential part of the HLT system:
- crude complete event reconstruction → monitoring, event rejection
- redundant local tracklet finder for cluster evaluation and data modeling → efficient data compression
- selection of (eta, phi, pT)-slices → ROI
- momentum filter → ROI
- high-precision tracking for selected track candidates → event rejection

Requirements on the TPC-RORC design concerning HLT tasks

Transparent mode:
- transferring raw data to DAQ

Processing mode:
- Huffman decoding
- unpacking
- 10-to-8 bit conversion
- pattern recognition: cluster finder, Hough-transformation tracker
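The 10-to-8 bit conversion is typically a nonlinear mapping that keeps fine granularity for small amplitudes; the sqrt-like scheme below is a generic sketch, not necessarily the exact mapping used in the ALICE electronics:

```python
import math

def compress_10to8(adc10):
    """Nonlinear 10-to-8 bit conversion: fine steps at small
    amplitudes, coarser steps at large ones (generic sqrt scheme)."""
    assert 0 <= adc10 < 1024
    return round(math.sqrt(adc10) * 255.0 / math.sqrt(1023.0))

def expand_8to10(adc8):
    """Approximate inverse, e.g. for monitoring on the receiver side."""
    return round((adc8 * math.sqrt(1023.0) / 255.0) ** 2)
```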

TPC PCI-RORC

HLT TPC PCI-RORC:
- backwards compatibility
- fully programmable → FPGA coprocessor

Simple PCI-RORC (board diagram): PCI bus – PCI bridge – glue logic – DIU interface – DIU card, plus FPGA coprocessor with SRAM.

Per-sector preprocessing dataflow (diagram):
- Detector front-end electronics → RCU: raw data, 10-bit dynamic range, zero-suppressed; Huffman encoding (and vector quantization).
- RCU → RORC: Huffman decoding, unpacking, 10-to-8 bit conversion; fast cluster finder (simple unfolding, flagging of overlapping clusters) → cluster list; fast vertex finder; fast track finder initialization (e.g. Hough transform) → Hough histograms, peak finder; raw data passed on.
- Receiver node: preprocessing per sector.
- Global node: vertex position.

FPGA coprocessor: cluster finder

Fast cluster finder:
- up to 32 padrows per RORC
- up to 141 pads/row and up to 512 timebins/pad
- internal RAM: 2 x 512 x 8 bit
- timing (in clock cycles, e.g. 5 nsec) 1 : #(cluster-timebins per pad) / 2 + #clusters → outer padrow: 150 nsec/pad, 21 µsec/row
- centroid calculation: pipelined array multiplier

1. Timing estimates by K. Sulimma, EDA group, Department of Computer Science, University of Frankfurt
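The centroid calculation that the pipelined array multiplier implements is an ADC-weighted mean; the same arithmetic in software:

```python
def centroid(samples):
    """ADC-weighted centroid of a cluster along one coordinate
    (pad or time); samples are (position, ADC) pairs.  The FPGA
    realizes this multiply-accumulate with a pipelined array
    multiplier; here it is written out directly."""
    total = sum(adc for _, adc in samples)
    return sum(pos * adc for pos, adc in samples) / total

# Cluster spanning time bins 10-12 on one pad (illustrative ADC values):
pos = centroid([(10, 5), (11, 20), (12, 5)])
```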

FPGA coprocessor: Hough transformation

Fast track finder: Hough transformations 2
- (row, pad, time)-to-(2/R, phi, eta) transformation
- (n-pixel)-to-(circle-parameter) transformation
- feature extraction: local peak finding in parameter space

2. E.g. see "Pattern Recognition Algorithms on FPGAs and CPUs for the ATLAS LVL2 Trigger", C. Hinkelbein et al., IEEE Trans. Nucl. Sci. 47 (2000) 362.

Per-sector processing dataflow (diagram):
- RORC → receiver node: raw data, 8-bit dynamic range, decoded and unpacked; vertex position, cluster list.
- Slicing of the padrow-pad-time space into sheets of pseudo-rapidity, subdividing each sheet into overlapping patches.
- Fast track finder A: track follower.
- Fast track finder B: 1. Hough transformation, 2. Hough maxima finder, 3. tracklet verification → track segments.
- Cluster deconvolution and fitting in sub-volumes in (r, phi, eta) → updated vertex position, updated cluster list, track segment list.

Hough transform (1) Data flow

Hough transform (2) eta-slices

Hough transform (3) Transformation and maxima search
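The transformation and maxima search can be sketched in software; binning and track parameters below are illustrative, and the vertex constraint (circles through the origin) matches the 2/R parameterization named on the earlier slide:

```python
import math
from collections import Counter

def hough_circle(points, n_psi=180, kappa_step=0.001):
    """Fill a (psi, kappa) Hough histogram for circles through the
    origin (vertex-constrained tracks): a hit (r, phi) lies on such a
    circle iff kappa = 2*sin(phi - psi)/r, where psi is the emission
    angle and kappa = 2/R the curvature.  The histogram peak is the
    track candidate."""
    hist = Counter()
    for r, phi in points:
        for i in range(n_psi):
            psi = -math.pi / 2 + math.pi * i / n_psi
            kappa = 2.0 * math.sin(phi - psi) / r
            hist[(i, round(kappa / kappa_step))] += 1
    return hist.most_common(1)[0]  # ((psi_bin, kappa_bin), votes)

# Hits of one track with psi = 0.3 rad, kappa = 0.01 (R = 200, arbitrary units):
true_psi, true_kappa = 0.3, 0.01
hits = [(r, true_psi + math.asin(true_kappa * r / 2.0))
        for r in (40, 60, 80, 100, 120, 140)]
(psi_bin, kappa_idx), votes = hough_circle(hits)
```

All six hits vote into the same (psi, kappa) cell, so the peak recovers the generated parameters to within one bin.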

FPGA coprocessor: Implementation of Hough transform

FPGA coprocessor prototype

FPGA candidates:
- Altera Excalibur (256 kbyte SRAM)
- Xilinx Virtex II (3.9 Mbit dual-port SRAM, … Mbit distributed SRAM, 420 MHz)
- external high-speed SRAM

Board (diagram): PCI bus – PCI bridge – glue logic – DIU interface/DIU card, SIU interface/SIU card to the RCU, FPGA coprocessor, SRAM, FEP RAM.