Status of implementation of Detector Algorithms in the HLT framework
Calibration Session – OFFLINE week (16-03-2007)
M. Richter


Status of implementation of Detector Algorithms in the HLT framework
Calibration Session – OFFLINE week (16-03-2007)
M. Richter, D. Rörich, S. Bablok (IFT, University of Bergen); P.T. Hille (University of Oslo); M. Ploskon (IKF, University of Frankfurt); S. Popescu, V. Lindenstruth (KIP, University of Heidelberg); Indranil Das (Saha Institute of Nuclear Physics)

TOC
HLT functionality
HLT interfaces
–HLT ↔ DCS
–HLT ↔ OFFLINE
–HLT interface to AliEve
Synchronisation via ECS
Status of Detector Algorithms
–general remarks
–TPC
–TRD
–PHOS
–DiMuon

HLT functionality
Trigger
–Accept/reject events
 verify dielectron candidates
 sharpen the dimuon transverse momentum cut
 identify jets
 ...
Select
–Select regions of interest within an event
 remove pile-up in p-p
 filter out low-momentum tracks
 ...
Compress
–Reduce the amount of data required to encode the event as far as possible without losing physics information

HLT interfaces
ECS:
–Controls the HLT via well-defined states (SMI)
–Provides general experiment settings (type of collision, run number, ...)
DCS:
–Provides the HLT with current detector parameters (voltages, temperature, ...) → Pendolino
–Receives data processed by the HLT (TPC drift velocity, ...) → FED portal (Front-End-Device portal)
OFFLINE:
–Interface to fetch data from the OCDB → Taxi (OFFLINE → HLT)
–Provides OFFLINE with calculated calibration data → Shuttle portal (HLT → OFFLINE)
HOMER:
–HLT interface to AliEve for online event monitoring

[Diagram: Data flow in HLT — ECS proxy, Taxi (OCDB via AliEn → HCDB local cache), Pendolino portal (DCS Archive DB/PVSS → DA_HCDB), FED portal (DIM subscriber), Shuttle portal (FES, MySQL), DAs with PubSub data sinks, HOMER/AliEve; detector-responsibility, framework and interface components are distinguished.]

Synchronisation via ECS
[State diagram: OFF → INITIALIZING → INITIALIZED → CONFIGURING → CONFIGURED → ENGAGING → READY → RUNNING, and back via COMPLETING, DISENGAGING, DEINITIALIZING; ECS commands: INITIALIZE, SHUTDOWN, CONFIGURE + params, RESET, ENGAGE, DISENGAGE, START, STOP; transitions out of the transient states are implicit.]
Annotations on the ECS interface:
–distribution of the current version of the HCDB to the DA nodes (DA_HCDB); DAs request their DA_HCDB
–the Pendolino fetches data from the DCS archive DB and stores it in the DA_HCDB
–filling of the FileExchangeServer (FES) and MySQL DB, from which the Offline Shuttle can fetch the data

HLT ↔ DCS interface
FED portal:
–DIM channels (services published on the HLT side)
–partially implements the FED API
–Subscriber component of the HLT framework
–PVSS panels on the DCS side integrate the data into the DCS system
–DCS-related data is stored in the DCS Archive DB (HLT-cluster monitoring [HLT]; TPC drift velocity, ... [detector specific])
Pendolino:
–contacts the DCS Amanda server (DCS Archive DB)
–fetches current running conditions (temperature, voltages, ...)
–feeds the content into the DA_HCDB
–requests in regular time intervals: three Pendolinos, each with a different frequency (fast, medium, slow)

[Diagram: HLT ↔ DCS interface — DCS-side portal (portal-dcs, dcs-vobox) with the Pendolino (incl. detector preprocessing) feeding the DA_HCDB from the DCS Archive DB/PVSS; DIM subscriber implementing the PubSub/FED API; SysMes triggers the synchronisation; DAs read via the AliRoot CDB access classes; detector-responsibility, framework and interface components are distinguished.]

HLT ↔ DCS interface
Pendolino details:
–Three different frequencies:
 fast Pendolino: 10 sec - 1 min
 normal Pendolino: 1 min - 5 min
 slow Pendolino: over 5 min
–Response time: ~13000 values per second
–Remark: the requested values can be up to 2 min old (this is the time that can pass until the data is shifted from the DCS PVSS to the DCS Archive DB)

HLT ↔ OFFLINE interface
Taxi portal:
–Requests the OCDB and caches its content locally (HCDB)
–Provides calibration objects to the Detector Algorithms (DAs) inside the HLT; copied locally to the DA nodes before each run (DA_HCDB)
–DA_HCDB accessible via the AliRoot CDB access classes
Shuttle portal:
–Collects calibration data from the Detector Algorithms
–Provides the data to OFFLINE; fetched after each run by the Offline Shuttle
–Data exchanged via the FileExchangeServer (FES)
–Meta data stored in a MySQL DB

HLT ↔ OFFLINE interface
[Diagram: Taxi portal — the Taxi (portal-taxi0, vobox-taxi0) pulls the OCDB content into the HCDB cache; the TaskManager distributes the DA_HCDB to the DA nodes; the ECS proxy supplies the current run number and triggers the update; DAs read via the AliRoot CDB access classes; SysMes monitors the process.]

HLT ↔ OFFLINE interface
[Diagram: Shuttle portal — DAs deliver calibration data via the TaskManager to portal-shuttle0 (Subscriber), which fills the FES and the MySQL DB; the ECS proxy notifies "collecting finished"; the Offline Shuttle transfers the data to the OCDB.]

AliEve HLT event display (example: TPC)
–existing infrastructure (M. Tadel) adapted to the HLT with minimal effort
–connect to the HLT from anywhere within the GPN
–ONE monitoring infrastructure for all detectors, using the HOMER data transport abstraction

Status Detector Algorithms (general remarks)
–The HLT provides service and infrastructure to run Detector Algorithms, e.g. reconstruction and calibration algorithms
–Offline code can run on the HLT if it fulfils the requirements given by the following constraints:
 limited accessibility of (global) AliRoot data structures
 processing of each event is distributed over many nodes
 no node has the full event data of all stages available
–A Detector Algorithm interfaces to the HLT data chain via a processing component
–The processing component implements the HLT component interface
–General principle: data input → HLT → data output; only the input, DCS and calibration data is available for processing

Status Detector Algorithms (general remarks)
[Diagram: the identical chain of HLT processors/Detector Algorithms fed either online (RORC publishers, output via HLT out as data to DAQ) or offline (offline source and sink interfaces in AliHLTReconstructor, reading data from DAQ).]
–Completely identical HLT processing can run in both the online and the offline framework
–dedicated data structures are shipped between components; these can be ROOT TObjects
–DAs must work entirely on incoming data
–dedicated publisher components for special data are possible
–the HLT produces ESD files, filled with the data it can reconstruct/provide

Status Detector Algorithms (general remarks)
1) no access to (global) AliRoot data structures
 (a) DAs have no AliRunLoader instance
 (b) DAs run as separate processes; no data exchange via global variables
 (c) DAs can only work on incoming data and OCDB data
2) a proper data transport hierarchy must be deployed by the DA, i.e. access to any data through global methods/objects from lower hierarchies is penalty code
3) structures/objects for data exchange have to be optimized
4) TObjects used for data transport must declare pointer-type members as transient members ( //! ), with initialization properly handled by the constructor
5) in principle all offline code using the AliReconstructor plugin scheme can run on the HLT, if a proper data transport hierarchy is deployed

Status Detector Algorithms (TPC status)
–full TPC reconstruction running in the HLT, output in ESD format
–TPC calibration tasks defined by the TPC group
–the TPC group decided to extensively use the HLT's computing capabilities for calibration tasks
–several prototype DAs developed
–commissioning of the calibration algorithms starts soon

Status Detector Algorithms (TPC task list)
HLT online monitoring for the TPC
–Calibration:
 1-d histograms for pedestal runs and noise calibration
 1-d histograms for pad-by-pad calibration (time offset, gain and width of the time response function) for the pulser run and during normal data taking
 1-d histograms for the gain calibration during the krypton run, cosmics, laser and data taking
 TPC drift velocity
Data Quality Monitoring (DQM)
–Online monitoring:
 3D reconstructed track view, optionally together with the 3D detector geometry
 drift velocity monitoring
 pad-by-pad signal
 charge per reconstructed track monitoring

Status Detector Algorithms (TRD status)
Clusterization algorithm
–Ready and working
–Uses the offline clusterizer directly
Standalone tracker
–Almost ready (within the next 1-2 weeks)
–HLT component implemented
–A few fixes within the AliRoot TRD offline code still to be done; the HLT will run 100% offline code here too
PID component
–Pending; the offline code is in its finalization stage; again, no change of the offline algorithms within the HLT
Triggering scenarios under consideration

Status Detector Algorithms (TRD status)
Calibration
–Native AliRoot OCDB calibration data access (provided via the HLT Taxi)
–Production of reference data for the calibration algorithms
 Ready and working
 Uses offline code directly
Monitoring
–Prototype ready
–Integration into AliEve will follow
[Figure: TRD clusters reconstructed on the HLT]

Status Detector Algorithms (TRD status)
Calibration:
–Histogram production ready and working (MCM tracklet based)
–Each HLT component has OCDB access (just like in Offline) via local (HLT node) storage; the TRD chain is using OCDB data (1:1 offline AliRoot code)
–The TRD preprocessor derives the calibration parameters from the input histograms collected on the HLT
TRD local reconstruction on the HLT is almost complete (local tracking still on the way...)
Calibration histograms are produced
First HLT monitoring code emerging soon (also AliEve support)

Status Detector Algorithms (TRD task list)
Short term:
–PID
–Track merging with the TPC (and eventually the ITS)
Long term:
–Physics trigger scenarios

Status Detector Algorithms (PHOS status)
Current status:
–running the full PHOS HLT chain (5 modules) with raw data simulated in AliRoot
–successful test on a simulated HLT "cluster" consisting of 3 laptops
–fast and accurate online evaluation of cell energies
–calibration data: continuous accumulation of the per-channel energy distribution; calibration data written to ROOT files at end of run; the histograms have been evaluated visually and look reasonable
–raw data can be written to files untouched by the HLT (HLT mode A)
–calibration data can be accumulated over several runs
–event display: display of events & calibration data for 5 modules using HOMER; collection of data from several nodes to be visualized in a single event display
–the PHOS HLT analysis chain has run successfully, distributed over 21 nodes of the HLT cluster at P2

Status Detector Algorithms (PHOS task list)
Currently ongoing work:
–Implementation of DQM
–Integration of end-of-run calibration procedures, DA
–Implementation of a fast Phi0 invariant mass algorithm
–Testing and benchmarking of the processing chain on the HLT cluster
–Preparations for PDC07
Near-future plans:
–Integration with ECS, DCS, Shuttle etc.
–Testing of the HLT processing chain on beam-test data
–Making of ESDs to be sent to DAQ with HLT-out
–Running of the PHOS HLT processing chain on data files and ROOT files
–Minor improvements to the online display
–Finalization and documentation of the internal PHOS HLT data format

Status Detector Algorithms (DiMuon status and task list)
Present status:
–Standalone hit reconstruction is ready and implemented in the HLT environment of the CERN PC farm
–First results of the resolution test of the dHLT chain at CERN, with raw data generated using AliRoot
–Processing time for multiple events is large compared to standalone mode
–The full dHLT chain is working and up on the UCT cluster
Future to-do list:
–Improvement of the processing time
–Integration of the tracker algorithm in the CERN HLT
–Implementation of the full chain, along with debugging and benchmarking
–Preparing the output in ESD format
–Efficiency check of the dHLT chain using beam-test data

Information on the web and talks of the HLT session at the last ALICE week

Backup slides

Status Detector Algorithms (TRD DA overview)

Status Detector Algorithms (TRD status)

Resolution of the dHLT hit reconstruction
Note: the resolution in the Y direction is far better than in the X direction due to the detector geometry: the minimum pad size in the bending plane is ~0.5 cm, whereas in the non-bending direction it is ~0.71 cm.

HLT ↔ ECS interface
State transition commands from ECS:
–INITIALIZE, CONFIGURE (+ params), ENGAGE, START, ...
–Mapped to TaskManager states
CONFIGURE parameters:
–HLT_MODE: the mode in which the HLT shall run (A, B or C)
–BEAM_TYPE: pp (proton-proton) or AA (heavy ion)
–RUN_NUMBER: the run number for the current run
–DATA_FORMAT_VERSION: the expected output data format version
–HLT_TRIGGER_CODE: ID defining the current HLT trigger classes
–CTP_TRIGGER_CLASS: the trigger classes in the Central Trigger Processor
–HLT_IN_DDL_LIST: list of DDL cables on which the HLT can expect event data in the coming run; the structure will look like the following: :, :, ...
–HLT_OUT_DDL_LIST: list of DDLs on which the HLT can send data to DAQ