ALICE Computing Model – The ALICE raw data flow
P. VANDE VYVRE – CERN/PH
Computing Model WS – 09 Dec 2004 – CERN

Outline
- TRG, DAQ, HLT architecture
- Dataflow inside the DAQ, from the detectors to storage
- Raw data format inside the DAQ
- Trigger readout
- HLT decision handling, HLT readout
- DATE ROOT recorder
- Conclusion

TRG, DAQ, HLT architecture
[Architecture diagram. Trigger path: the CTP distributes L0, L1a and L2 through the LTUs and TTC to the detector front-end electronics (FERO), with BUSY and Rare/All feedback. Data path: event fragments arrive over 262 DDLs into 329 D-RORCs hosted by 175 detector LDCs; 123 DDLs feed the HLT farm front-end processors (FEP) through H-RORCs; the HLT output returns over 10 DDLs and 10 D-RORCs into 10 HLT LDCs. Sub-events are built in the LDCs (EDM, load balancing) and shipped over the Event Building Network to 50 GDCs, where complete events are written as files through the Storage Network to 25 TDS and the PDS; 5 DSS complete the system.]

Data Format
Physics data:
- raw data to DAQ and HLT = f(interaction, triggers L0, L1, L2)
- raw data to storage = f(raw data, mode, HLT decision and processing)
Event structure:
- an event is a set of sub-events (selection by trigger and HLT)
- each sub-event is made of one or several event fragments
- event fragments include the LHC clock at the time of the interaction (LHC clock = orbit number + bunch crossing) and the trigger parameters
Both the LHC clock and the trigger parameters are distributed by the Central Trigger Processor to the electronics readout of all detectors.
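The "Event identification" slide below gives the field widths of this time tag (12 bits of bunch crossing, 24 bits of orbit number). A minimal sketch of how such a tag could be packed into a single identifier word; the function name and the packing layout are illustrative assumptions, not the DATE format:

```cpp
#include <cstdint>

// Pack the LHC clock (orbit number + bunch crossing) into one
// 36-bit event tag: orbit in bits [35:12], BC in bits [11:0].
// Field widths follow the slides; the packing itself is illustrative.
uint64_t makeEventId(uint32_t orbit, uint16_t bunchCrossing) {
    return (static_cast<uint64_t>(orbit & 0xFFFFFF) << 12)
         | (bunchCrossing & 0xFFF);   // 3564 bunches per orbit fit in 12 bits
}
```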

Distribution of HLT decisions inside DAQ
[Diagram: same TRG/DAQ/HLT layout as the architecture slide. The L2 trigger pattern is delivered to the LDCs and to the HLT farm; the HLT refines the original LDC pattern and produces an HLT output pattern; the refined LDC pattern and the HLT output pattern are distributed to the LDCs and GDCs over the Event Building Network.]

General structure
- Event fragment (produced at the DDL/RORC level): the equipment payload, i.e. DDL header and data
- Sub-event (built in the LDC): base header + header extension, followed by the event fragments, each preceded by its equipment header
- Event (built in the GDC): base header, followed by the sub-events
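A minimal sketch of this nesting as C++ structures, purely to mirror the diagram; all type names, field names and sizes are hypothetical, not the DATE definitions:

```cpp
#include <cstdint>
#include <vector>

// Hypothetical types mirroring the diagram; not the DATE structures.
struct EventFragment {               // produced at the DDL/RORC level
    std::vector<uint32_t> payload;   // equipment payload: DDL header + data
};

struct Equipment {                   // wrapped by the LDC around each fragment
    uint32_t equipmentHeader[4];     // size, type, id, ... (illustrative)
    EventFragment fragment;
};

struct SubEvent {                    // assembled in the LDC
    uint32_t baseHeader[17];         // common base header
    uint32_t headerExtension[8];     // optional header extension
    std::vector<Equipment> equipment;
};

struct Event {                       // assembled in the GDC
    uint32_t baseHeader[17];
    std::vector<SubEvent> subEvents;
};
```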

DDL Common Data Header
- Event identification: bunch crossing and orbit number
- Consistency check: Mini-Event ID
- Trigger parameters:
  - physics trigger: L1 trigger flags, trigger classes, ROI
  - software trigger: participating sub-detectors

Event identification
Global event identification [L2a message]:
- 12 bits for the bunch crossing number (3564 bunches per orbit)
- 24 bits for the orbit number (2^24 × 88 µs ≈ 1476 s ≈ 24 min)
- further event identification added by software in the DAQ
Local event identification, for verification [local TTCrx]:
- Mini-Event ID: 12 bits of the local TTCrx bunch-crossing counter
- essential for the data consistency check
- if exactly one TTCrx per DDL → no problem, automatic check in the DAQ software
- if more than one TTCrx per DDL → the content of one TTCrx goes in the header, all TTCrx contents go in the data
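A minimal sketch of the automatic consistency check mentioned above, comparing the bunch-crossing number of the global event ID with the 12-bit Mini-Event ID from the local TTCrx; the function is illustrative, not the DATE API (a real check may also have to allow for a fixed offset between the two counters):

```cpp
#include <cstdint>

// Compare the global BC number (from the L2a message) with the
// Mini-Event ID (12-bit BC counter of the local TTCrx). Both are
// 12-bit quantities, so a masked comparison is enough.
bool miniEventIdConsistent(uint16_t l2aBunchCrossing, uint16_t ttcrxBcCounter) {
    return (l2aBunchCrossing & 0xFFF) == (ttcrxBcCounter & 0xFFF);
}
```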

Trigger parameters
- Trigger status (8 bits) [L1 or L2a message]:
  - L2SwC: trigger-type flag (physics = 0, software = 1)
  - ESR: enable segmented readout (if used by the detector) [L1]
  - ClT: calibration trigger flag (only if L2SwC = 1)
  - RoC: readout control bits (4 bits) (only if L2SwC = 1) [L1]
- Participating sub-detectors (24 bits) [L2a message]:
  - if L2SwC = 0 (physics): L2Cluster [6..1] (cluster [6..1])
  - if L2SwC = 1 (software): L2Detector [24..1] (detector [24..1])
- Trigger classes (50 bits) [L2a message] (if L2SwC = 0, physics); each trigger class defines:
  - a logic condition
  - a trigger cluster, i.e. a set of detectors (e.g. all detectors for the central and minimum-bias triggers, Muon + Pixel for the muon trigger)
  - frequent/rare: for feedback from the DAQ
  - downscaling
- ROI (Region Of Interest): 36 bits
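A minimal sketch of decoding the 8-bit trigger status word; only the field names and widths come from the slide, the bit positions are assumptions made for illustration:

```cpp
#include <cstdint>

// Illustrative decoding of the 8-bit trigger status; bit positions
// are assumed, only field names and widths come from the slide.
struct TriggerStatus {
    bool    l2SwC;  // trigger type: 0 = physics, 1 = software
    bool    esr;    // enable segmented readout
    bool    clT;    // calibration trigger (meaningful only if l2SwC)
    uint8_t roC;    // 4 readout-control bits (meaningful only if l2SwC)
};

TriggerStatus decodeTriggerStatus(uint8_t word) {
    TriggerStatus s;
    s.l2SwC = word & 0x01;
    s.esr   = word & 0x02;
    s.clT   = word & 0x04;
    s.roC   = (word >> 4) & 0x0F;
    return s;
}
```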

CTP Readout, Interaction Record & Scalers
Central Trigger Processor (CTP) output:
- CTP readout (sent for every L2a over DDL):
  - a data block of a few words from the CTP to the trigger LDC
  - mainly a sub-event header (event ID, etc.)
  - the CTP contribution to physics events
- Interaction record (sent on a regular basis over DDL):
  - one record for every orbit, containing the orbit number (all orbits should be present)
  - one word for every bunch crossing in which an interaction was detected: bunch crossing and interaction type (peripheral/semi-central)
- Scalers: sent by the CTP to the DAQ on a regular basis
The DAQ software will format and handle these data flows:
- CTP readout: part of the physics events
- interaction records and scalers: separate streams

CTP Readout (Physics Trigger)
Each word carries a 12-bit payload and is tagged with BlockID = 0; DNC = Do Not Care (0). The bracketed ranges give the bits of the source quantity carried by each word:
- Event ID 1: bunch crossing [11-0]
- Event ID 2: orbit ID [23-12]
- Event ID 3: orbit ID [11-0]
- spare word (all DNC)
- L2Class [48-37]
- L2Class [36-25]
- L2Class [24-13]
- L2Class [12-1]
- flags word: ESR, L2SwC = 0, L2Cluster [6-1], L2Class [50-49]

CTP Readout (Software Trigger)
Same word format, tagged with BlockID = 0; DNC = Do Not Care (0):
- Event ID 1: bunch crossing [11-0]
- Event ID 2: orbit ID [23-12]
- Event ID 3: orbit ID [11-0]
- spare word (all DNC)
- L2Detector [24-13]
- L2Detector [12-1]
- two spare words (all DNC)
- flags word: L2SwC = 1, ClT

CTP Readout (Interaction Record)
Words tagged with BlockID = 1; DNC = Do Not Care (0):
- orbit ID [23-12], with ERR flag
- orbit ID [11-0], with ERR flag
- one word per bunch crossing in which an interaction was detected: bunch crossing [11-0], with InT flag (interaction type)
- an incomplete record is marked with hFFF
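Across the three layouts above, the BlockID bit separates the two CTP streams: BlockID = 0 words belong to the CTP readout, which is event-built with the physics data, while BlockID = 1 words form the interaction record, which goes to a separate stream. A minimal dispatch sketch; the position of the BlockID bit and the payload mask are assumptions for illustration:

```cpp
#include <cstdint>
#include <vector>

// Illustrative CTP word dispatch: BlockID = 0 -> CTP readout stream,
// BlockID = 1 -> interaction record stream. The bit position of
// BlockID is assumed; only its values come from the slides.
constexpr uint32_t kBlockIdBit = 1u << 15;

void dispatchCtpWords(const std::vector<uint32_t>& words,
                      std::vector<uint32_t>& ctpReadoutStream,
                      std::vector<uint32_t>& interactionRecordStream) {
    for (uint32_t w : words) {
        if (w & kBlockIdBit)
            interactionRecordStream.push_back(w & 0xFFF);  // payload [11-0]
        else
            ctpReadoutStream.push_back(w & 0xFFF);
    }
}
```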

Data handling and formatting in LDCs and GDCs
[Same architecture diagram as the "TRG, DAQ, HLT architecture" slide: event fragments arrive over the DDLs into the detector LDCs, are formatted into sub-events, shipped over the Event Building Network to the GDCs, built into complete events, and written as files through the Storage Network to the TDS and PDS.]

HLT decision handling in LDC
[Diagram: in a detector LDC, the D-RORC (DDL DIU/SIU) delivers the raw data into the DATE banks, where the readout process assembles the event fragments; a decision agent receives the HLT decisions through the NIC, and the recorder ships only the selected sub-events to the Event Building Network.]

HLT Data Readout
All data sent by the HLT over DDL will be event-built and recorded.
- Decisions:
  - accepted events: event-built according to the decision
  - rejected events: a trace of the decision is kept
- ESD produced by the HLT:
  - accepted events: the ESD and any other HLT data are event-built with the raw data
  - rejected events: if needed, all ESDs could be kept as a separate ESD stream
A typical application of an HLT decision is sketched below.
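A minimal sketch of such decision handling; the types and function names are illustrative, not the DATE decision-agent interface:

```cpp
#include <cstdint>
#include <utility>
#include <vector>

// Illustrative HLT decision handling: accepted sub-events are queued
// for event building, rejected ones are dropped while a trace of the
// decision is kept (the ESD may still go to a separate stream).
struct HltDecision { uint64_t eventId; bool accepted; };

struct RawSubEvent { uint64_t eventId; std::vector<uint8_t> data; };

void applyHltDecision(const HltDecision& d, RawSubEvent&& sub,
                      std::vector<RawSubEvent>& toEventBuilder,
                      std::vector<HltDecision>& decisionTrace) {
    decisionTrace.push_back(d);                    // trace kept for every event
    if (d.accepted)
        toEventBuilder.push_back(std::move(sub));  // event-built and recorded
    // rejected: raw sub-event discarded
}
```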

Event building and data recording in GDCs
[Diagram: in each GDC, sub-events (raw data, HLT data) and HLT decisions arrive from the Event Building Network through the NIC into the DATE data banks; the event builder assembles complete accepted events, which the ROOT recorder writes to the Storage Network.]
Event builder:
- in: sub-events
- out: I/O vector, a set of pointer/size pairs
ROOT recorder:
- ROOT data format
- possibly parallel streams
- CASTOR file system
- interfaced to the GRID (GRID catalog)
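The I/O vector produced by the event builder is a set of pointer/size pairs, so a complete event can be handed to the output without copying the sub-event payloads. A minimal sketch using the POSIX scatter/gather write as an analogy; this is an illustration, not the DATE event builder:

```cpp
#include <sys/uio.h>   // struct iovec, writev
#include <unistd.h>
#include <vector>

// Illustrative event building step: gather the sub-event buffers of
// one accepted event into an I/O vector (pointer/size pairs) and hand
// the whole event to the output in a single scatter/gather write.
struct SubEventBuffer { void* data; size_t size; };

ssize_t writeEvent(int fd, const std::vector<SubEventBuffer>& subEvents) {
    std::vector<iovec> iov;
    iov.reserve(subEvents.size());
    for (const SubEventBuffer& s : subEvents)
        iov.push_back({s.data, s.size});   // no copy of the payload itself
    return writev(fd, iov.data(), static_cast<int>(iov.size()));
}
```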

DATE ROOT recorder
- New program in development (part of DATE V5)
- Structure of a DATE application (interfaces to AFFAIR, Infologger, etc.)
- Data formatting with the data handling library of AliMDC (profits from years of development and tests)
- Data format: TFile object with a collection of raw data and ESD
- Files created in CASTOR
- File structure: complete events, all consistency checks done
- Interface to the GRID (AliEn/gLite/YAG):
  - declare the CASTOR file
  - create metadata: trigger parameters and run conditions
- Tested in the current Computing Data Challenge
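A minimal sketch of the recording step as a generic ROOT example: a TFile holding a tree of complete events. The file, tree and branch names are hypothetical, and this is not the AliMDC code:

```cpp
// Generic ROOT illustration of the recorder output format: a TFile
// holding a tree of raw events. Names are hypothetical, not AliMDC's.
#include "TFile.h"
#include "TTree.h"

void recordExample() {
    TFile file("raw_run0.root", "RECREATE");     // would be a CASTOR path
    TTree tree("RAW", "Complete accepted events");

    Int_t eventSize = 0;
    tree.Branch("eventSize", &eventSize, "eventSize/I");
    // ... branches for the raw payload and the HLT ESD would go here ...

    eventSize = 1024;   // placeholder: fill one dummy event
    tree.Fill();

    file.Write();
    file.Close();
}
```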

Conclusion
- Data format based on the requirements and the architecture
- All elements of physics data (event fragments, sub-events and events) are tagged with trigger information
- CTP readout: one stream event-built with the physics data, plus two asynchronous streams (interaction records and scalers)
- HLT readout: decisions and ESD
- DATE recorder: formatting by the AliMDC library, interfaced to the GRID