1 High Level Processing & Offline Event Selection
Dieter Roehrich, UiB
Outline:
– Data volume and event rates
– Processing concepts
– Storage concepts
2 Data volume
Event size into the High Level Processing System (HLPS):
– Central Au+Au collision @ 25 AGeV: 335 kByte
– Minimum bias collision: 84 kByte
– Triggered collision: 168 kByte
Relative sizes of data objects:
– RAW data (processed by the online event selection system) = 100%
– Event Summary Data (ESD) – global re-fitting and re-analysis of PID remain possible:
  – reconstructed event + compressed raw data (e.g. local track model + hit residuals) = 20%
  – reconstructed event + compressed processed data (e.g. local track model + error matrix) = 10%
– Physics Analysis Object Data (AOD) – vertices, momenta, PID = 2%
– Event tags for offline event selection (TAG) = << 1%
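For concreteness, a minimal Python sketch that turns the relative sizes above into absolute per-event numbers, using the 335 kByte central Au+Au event as reference; the percentages are the slide's, the script is just arithmetic:

```python
# Absolute data-object sizes implied by the relative sizes above,
# for a central Au+Au event with 335 kByte of RAW data.

RAW_KBYTE = 335

fractions = {
    "RAW":                               1.00,
    "ESD (track model + hit residuals)": 0.20,
    "ESD (track model + error matrix)":  0.10,
    "AOD (vertices, momenta, PID)":      0.02,
}

for name, frac in fractions.items():
    print(f"{name}: {RAW_KBYTE * frac:.1f} kByte")
# TAG objects are << 1%, i.e. well below ~3 kByte per event
```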
3 Event rates
J/ψ:
– signal rate @ 10 MHz interaction rate = 0.3 Hz
– irreducible background rate = 50 Hz
Open charm:
– signal rate @ 10 MHz interaction rate = 0.3 Hz
– background rate into HLPS = 10 kHz
Low-mass di-lepton pairs:
– signal rate @ 10 MHz interaction rate = 0.5 Hz
– no event selection scheme applicable – minimum bias event rate = 25 kHz
4 Data rates
Data rates into the HLPS:
– open charm: 10 kHz × 168 kByte = 1.7 GByte/s
– low-mass di-lepton pairs: 25 kHz × 84 kByte = 2.1 GByte/s
Data volume per year with no HLPS action: 10 PByte/year
(for scale: ALICE = 10 PByte/year, of which 25% raw, 25% reconstructed, 50% simulated)
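The input rates are simple products of event rate and event size. A small sketch reproducing the slide's numbers; decimal units (1 kByte = 1000 Byte) are assumed, since they are what makes the quoted figures come out:

```python
# HLPS input data rate = event rate [Hz] x event size [Byte].
# Decimal units (1 kByte = 1000 Byte) are assumed here.

KBYTE = 1_000

streams = {
    "open charm":               (10_000, 168 * KBYTE),  # 10 kHz, 168 kByte
    "low-mass di-lepton pairs": (25_000,  84 * KBYTE),  # 25 kHz,  84 kByte
}

for name, (rate_hz, event_bytes) in streams.items():
    print(f"{name}: {rate_hz * event_bytes / 1e9:.1f} GByte/s")
# -> open charm: 1.7 GByte/s
# -> low-mass di-lepton pairs: 2.1 GByte/s
```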
5 Processing concept
HLPS tasks:
– event reconstruction with offline quality
– sharpen the open charm selection criteria – reduce the event rate further
– create compressed ESDs
– create AODs
No offline re-processing: merely unpacking and disseminating the data takes about as much CPU time as the reconstruction itself. Hence:
– RAW → ESD: never
– ESD → ESD': only exceptionally
6 Data Compression Scenarios
Loss-less data compression:
– run-length encoding (standard technique)
– entropy coder (Huffman)
– Lempel-Ziv
Lossy data compression:
– compress 10-bit ADC values into 8-bit values using a logarithmic transfer function (standard technique; sketched below)
– vector quantization
– data modeling
Perform all of the above wherever possible.
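Of the lossy techniques listed, the 10-bit → 8-bit logarithmic mapping is the easiest to make concrete. A minimal sketch; the slide does not give the actual transfer curve, so the log1p form below is an assumption, chosen only to show the principle (fine resolution at small amplitudes, coarse at large ones):

```python
import numpy as np

# Lossy 10-bit -> 8-bit ADC compression via a logarithmic transfer
# function: small amplitudes keep fine resolution, large ones are
# quantized coarsely. The exact curve is an assumption, not the
# experiment's.

ADC_IN_MAX = 1023    # 10-bit input range
ADC_OUT_MAX = 255    # 8-bit output range
SCALE = ADC_OUT_MAX / np.log1p(ADC_IN_MAX)

def compress(adc10: np.ndarray) -> np.ndarray:
    return np.round(SCALE * np.log1p(adc10)).astype(np.uint8)

def expand(adc8: np.ndarray) -> np.ndarray:
    # Approximate inverse; the rounding error is the accepted loss.
    return np.round(np.expm1(adc8 / SCALE)).astype(np.uint16)

raw = np.array([0, 3, 17, 120, 1023])
print(compress(raw))           # 8-bit codes
print(expand(compress(raw)))   # reconstructed 10-bit values
```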
7 Data compression: entropy coder
Variable-length coding (e.g. Huffman coding): short codes for frequent values, long codes for infrequent values.
Result: compressed event size = 72% of the original.
[Figure: probability distribution of 8-bit NA49 TPC data]
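A minimal sketch of the Huffman construction (textbook version, not the NA49 implementation): build the code table from the value frequencies, so that the peaked ADC distribution shown in the figure translates into short codes for the common values:

```python
import heapq
from collections import Counter

def huffman_table(data):
    """Build {symbol: bitstring}; assumes >= 2 distinct symbols."""
    # Heap entries: [weight, [symbol, code], [symbol, code], ...]
    heap = [[w, [sym, ""]] for sym, w in Counter(data).items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        lo, hi = heapq.heappop(heap), heapq.heappop(heap)
        for pair in lo[1:]:
            pair[1] = "0" + pair[1]   # prepend left-branch bit
        for pair in hi[1:]:
            pair[1] = "1" + pair[1]   # prepend right-branch bit
        heapq.heappush(heap, [lo[0] + hi[0]] + lo[1:] + hi[1:])
    return dict(heap[0][1:])

# Toy ADC stream, strongly peaked at small values (hypothetical data
# mimicking the shape of the distribution in the figure).
adc = [0, 0, 0, 1, 0, 2, 0, 1, 5, 0, 0, 1]
table = huffman_table(adc)
bits = sum(len(table[v]) for v in adc)
print(f"compressed size: {100 * bits / (8 * len(adc)):.0f}% of the original")
```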
8 Data compression: vector quantization
A vector is, for example:
– a sequence of ADC values on a pad
– a calorimeter tower
– ...
Vector quantization = transformation of vectors into code-book entries: each input vector is compared against a code book and only the index of the best-matching entry is stored; the residual mismatch is the quantization error.
Result (NA49 TPC data): compressed event size = 29%
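A minimal vector-quantization sketch. The 4-entry code book of pad vectors is invented toy data; in practice the code book would be trained offline on representative vectors:

```python
import numpy as np

# Vector quantization: replace each vector (here: a short sequence of
# ADC values on a pad) by the index of its nearest code-book entry.
# Only the index is stored; the distance to that entry is the
# quantization error.

codebook = np.array([      # hypothetical 4-entry code book
    [0,  0,  0, 0],
    [0,  5,  5, 0],
    [2,  9,  9, 2],
    [5, 14, 14, 5],
], dtype=float)

def quantize(vec: np.ndarray) -> tuple[int, float]:
    dist = np.linalg.norm(codebook - vec, axis=1)
    i = int(np.argmin(dist))
    return i, float(dist[i])

idx, err = quantize(np.array([1.0, 8.0, 10.0, 2.0]))
print(idx, err)   # a 2-bit index replaces four ADC values
```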
9 Data Compression – data modeling (1)
Data model adapted to TPC tracking: store only the (small) deviations from the model (A. Vestbø et al., to be published in Nucl. Instr. Meth.). The cluster model depends on the track parameters.
Standard loss-less algorithms – entropy encoders, vector quantization, ... – achieve a compression factor of ~2 (J. Berger et al., Nucl. Instr. Meth. A489 (2002) 406).
[Figures: tracking efficiency and relative pt resolution [%] before and after compression, dN_ch/dη = 1000]
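The idea condenses into a few lines: fit a track model, then store only its parameters plus quantized cluster residuals, which are small, narrowly distributed and entropy-code well. A straight line stands in for the real track/cluster model below, and the 0.05-pad quantization step is an arbitrary assumption:

```python
import numpy as np

# Data modeling: store track parameters + small quantized residuals
# instead of absolute cluster positions. A straight-line model stands
# in for the real track model; the 0.05-pad step is an assumption.

rng = np.random.default_rng(0)
pad_rows = np.arange(10, dtype=float)
clusters = 0.5 * pad_rows + 3.0 + rng.normal(0, 0.1, 10)  # measured positions

slope, intercept = np.polyfit(pad_rows, clusters, 1)      # the "model"
residuals = clusters - (slope * pad_rows + intercept)     # small deviations
quantized = np.round(residuals / 0.05).astype(np.int8)    # cheap to store

print(slope, intercept)   # a few floats per track
print(quantized)          # small integers, ideal for an entropy coder
```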
10 Data Compression – data modeling (2)
Towards larger multiplicities:
– cluster fitting and deconvolution: fitting of n two-dimensional response functions (e.g. Gauss distributions; sketched below)
– analyzing the remnant and keeping the "good" clusters
– arithmetic coding of the pad and time information
[Figure: compressed tracks/clusters and leftovers; scale: 100 MeV/c]
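The fitting step amounts to a small nonlinear fit per patch of pad/time data. A sketch with n = 2 two-dimensional Gaussian response functions; the 8×8 patch, the width and the amplitudes are invented test values, not detector parameters:

```python
import numpy as np
from scipy.optimize import curve_fit

# Deconvolution sketch: fit two 2-D Gaussian response functions to an
# 8x8 patch of pad/time ADC values to separate overlapping clusters.

SIGMA = 1.2  # assumed common cluster width (pads / time bins)

def two_gauss(coords, a1, p1, t1, a2, p2, t2):
    pad, time = coords
    g = lambda a, pc, tc: a * np.exp(-((pad - pc)**2 + (time - tc)**2)
                                     / (2 * SIGMA**2))
    return g(a1, p1, t1) + g(a2, p2, t2)

pad, time = np.meshgrid(np.arange(8.0), np.arange(8.0))
coords = (pad.ravel(), time.ravel())

truth = (100, 2.5, 3.0, 60, 4.5, 4.0)                   # two overlapping clusters
rng = np.random.default_rng(1)
adc = two_gauss(coords, *truth) + rng.normal(0, 2, 64)  # noisy patch

fit, _ = curve_fit(two_gauss, coords, adc, p0=(80, 2, 3, 80, 5, 4))
print(np.round(fit, 2))   # recovered amplitudes and positions
```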
11 Data Compression – data modeling (3)
Achieved compression ratios and corresponding efficiencies: compression factor 10.
[Table: compression ratios and efficiencies]
12 Storage concept
Main challenge of processing heavy-ion data: logistics
– no archival of raw data
– storage of ESDs:
  – advanced compression techniques bring them to 10–20% of the raw size
  – only one pass
– multiple versions of AODs