HLT – ECS, DCS and DAQ interfaces – Sebastian Bablok, UiB


HLT – ECS Interface
Control of the HLT by the Experiment Control System (ECS) for initialisation and runtime operations.
Interface: represented as well-defined states and transitions managed by Finite State Machines (FSMs).
– ECS controls the HLT via the HLT proxy, implemented in SMI++ (external communication via DIM)
– internal communication uses the interface library of the TaskManager control system

HLT proxy states and transitions: subdivided into two types of states
– “stable” states (OFF, INITIALIZED, CONFIGURED, READY, RUNNING, ERROR): transitions only via well-defined commands
– intermediate states (RAMPING_UP_&_INITIALIZING, CONFIGURING, ENGAGING, DISENGAGING, RAMPING_DOWN): longer transition procedures – implicit transition when the procedure is finished
Implementation:
– SMI++ (common among all ALICE systems), single instance, not partitioned
– the HLT proxy is only active during the (de-)initialisation and configuration phases; in the RUNNING state the cluster processes event data autonomously
– internal communication: interface library of the TaskManager (TM) control system
– the HLT proxy contacts multiple master TMs of the cluster (majority vote – fault tolerance, avoiding a single point of failure)
– master TMs are networked with all slave TMs
– slave TMs control the PubSub system (RORC publisher and others)
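The majority vote among the master TaskManagers can be sketched as follows. This is an illustration of the fault-tolerance idea only; the function name and response format are invented for the example and are not part of the actual TaskManager interface library.

```python
from collections import Counter

def majority_vote(responses):
    """Pick the answer reported by a majority of master TaskManagers.

    `responses` is a list of state strings, one per master TM that
    answered. A strict majority is required; otherwise the proxy
    cannot trust any single master (hypothetical policy for this sketch).
    """
    if not responses:
        raise ValueError("no master TaskManager responded")
    answer, votes = Counter(responses).most_common(1)[0]
    if votes <= len(responses) // 2:
        raise RuntimeError("no majority among master TaskManagers")
    return answer
```

With three master TMs, one faulty reply is tolerated: `majority_vote(["READY", "READY", "ERROR"])` still yields `"READY"`, which is the point of avoiding a single point of failure.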

HLT proxy – States
[State diagram: stable states OFF, INITIALIZED, CONFIGURED, READY, RUNNING, ERROR and intermediate states RAMPING_UP_&_INITIALIZING, CONFIGURING, ENGAGING, DISENGAGING, RAMPING_DOWN. Commands: initialize, configure, engage, start, stop, disengage, reset, shutdown; intermediate states end with an implicit transition.]
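The state machine above can be sketched as a transition table. This is a minimal illustration, not the SMI++ implementation; the exact set of transitions is approximated from the diagram, and the `"<done>"` pseudo-command stands in for the implicit transitions out of the intermediate states.

```python
# (state, command) -> next state; "<done>" marks implicit transitions
# that fire automatically when an intermediate procedure finishes.
TRANSITIONS = {
    ("OFF", "initialize"): "RAMPING_UP_&_INITIALIZING",
    ("RAMPING_UP_&_INITIALIZING", "<done>"): "INITIALIZED",
    ("INITIALIZED", "configure"): "CONFIGURING",
    ("CONFIGURING", "<done>"): "CONFIGURED",
    ("CONFIGURED", "engage"): "ENGAGING",
    ("ENGAGING", "<done>"): "READY",
    ("READY", "start"): "RUNNING",
    ("RUNNING", "stop"): "READY",
    ("READY", "disengage"): "DISENGAGING",
    ("DISENGAGING", "<done>"): "CONFIGURED",
    ("CONFIGURED", "reset"): "INITIALIZED",
    ("INITIALIZED", "shutdown"): "RAMPING_DOWN",
    ("RAMPING_DOWN", "<done>"): "OFF",
}

class HLTProxy:
    def __init__(self):
        self.state = "OFF"

    def command(self, cmd):
        try:
            self.state = TRANSITIONS[(self.state, cmd)]
        except KeyError:
            self.state = "ERROR"  # command not defined for current state
        # intermediate states complete on their own ("implicit transition")
        while (self.state, "<done>") in TRANSITIONS:
            self.state = TRANSITIONS[(self.state, "<done>")]
        return self.state
```

Since only stable states survive a call to `command()`, the ECS never observes an intermediate state from outside once the procedure has finished, which mirrors the "stable vs. intermediate" split described above.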

Status
– HLT proxy implemented
– communication with ECS tested
– communication with TaskManager implemented but not yet tested

HLT – DCS Interface
HLT is both a consumer and a producer of data provided/archived by DCS.

HLT & Databases
The HLT will access several databases, which serve different purposes and are under different administration. For now, access to three DBs is foreseen:
– Configuration DB (DCS)
– PVSS archive (DCS) and Condition DB
– Calibration DB (detector-specific)
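The split by purpose and administration can be summarised in a small table. The structure below is purely illustrative; the names, fields, and helper function are invented for this sketch and are not an actual HLT API.

```python
# Hypothetical summary of the databases the HLT is foreseen to access.
HLT_DATABASES = {
    "configuration": {"admin": "DCS", "use": "HLT/detector configuration data"},
    "conditions":    {"admin": "DCS", "use": "PVSS archive / condition data"},
    "calibration":   {"admin": "detector groups", "use": "detector-specific calibration"},
}

def databases_administered_by(admin):
    """Return, sorted, the names of DBs under the given administration."""
    return sorted(name for name, info in HLT_DATABASES.items()
                  if info["admin"] == admin)
```

For example, `databases_administered_by("DCS")` returns the two DCS-administered databases, making the administrative split explicit.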

HLT integration into DCS
Online Calibration Modules (one per detector)

HLT – DCS data source model

HLT – DCS data flow

HLT – DCS interface

Status
– conceptual design done
– implementation will start this week in collaboration with the DCS group

HLT – DAQ interfaces
Hardware interfaces
– H-RORC with DIU/SIU
– see hardware talks
Software interfaces
– output data format

HLT – DAQ output data format
DAQ handles HLT output like data from any other detector (except for the trigger decision).
Output data format:
– trigger decision: list of DDLs to read (detector DDLs and HLT DDLs)
– Event Summary Data (ESD) in case of an accept
– candidate ID (ID of the trigger class used)
– preprocessed event data [optional]: (partially) reconstructed events, compressed raw data, preprocessed detector-specific data (e.g. Photon Spectrometer pulse-shape analysis)
– Start-Of-Data (SOD) and End-Of-Data (EOD)
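The output record listed above can be sketched as a simple data structure. The field names and the `accepted` rule are invented for this illustration; they are not the actual ALICE data format, which is defined by the HLT/DAQ interface, not here.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class HLTOutput:
    """Illustrative sketch of one HLT output record sent to DAQ."""
    ddl_list: List[int]                 # trigger decision: DDLs to read out
    candidate_id: Optional[int] = None  # ID of the trigger class used
    esd: Optional[bytes] = None         # Event Summary Data, present on accept
    preprocessed: List[bytes] = field(default_factory=list)  # optional payloads

    @property
    def accepted(self) -> bool:
        # Hypothetical convention for this sketch: an event counts as
        # accepted if the trigger decision selects at least one DDL.
        return bool(self.ddl_list)
```

A rejected event would then simply carry an empty DDL list and no ESD, while an accepted one lists the detector and HLT DDLs to be read out.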

Status
Implemented.