ITS combined test seen from DAQ and ECS
F. Carena, J-C. Marin

HW components (seen by DAQ)
- 3 detectors producing data
- 3 DDLs (Detector Data Links)
- 3 RORCs (Read-Out Receiver Cards), of different versions: pRORC and D-RORC
- 3 LDCs (Local Data Concentrators)
- 1 GDC (Global Data Collector)
- 1 control station
- Fast Ethernet network
- Local disks as storage area

[Diagram: HW components seen by DAQ. SDD, SPD, and SSD each send data over a DDL to an LDC; the three LDCs connect via Fast Ethernet to the GDC (with local disks) and to the control station.]
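The same chain can be written down as data. A sketch, assuming hypothetical host names and an illustrative pRORC/D-RORC assignment (the slides only say that different versions were mixed):

```c
/* Sketch (not actual DATE configuration) of the ITS test readout
 * chain as data: each detector drives one DDL into one RORC hosted
 * by one LDC, and all LDCs feed the single GDC over Fast Ethernet.
 * Host names and the pRORC/D-RORC assignment are hypothetical. */
struct ReadoutChain {
    const char *detector;   /* SPD, SDD or SSD                  */
    const char *rorc;       /* receiver card version in the LDC */
    const char *ldc;        /* Local Data Concentrator host     */
};

static const struct ReadoutChain its_chains[3] = {
    { "SPD", "pRORC",  "ldc-spd" },
    { "SDD", "D-RORC", "ldc-sdd" },
    { "SSD", "D-RORC", "ldc-ssd" },
};
```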

DAQ software
- DATE 5.0
- Can be controlled by the ECS
- Allows multiple parallel data-acquisition tasks
- The Run Control and its Human Interface are two interacting processes that may run on different PCs (see the sketch below)
- One single DATE_SITE describing the ITS setup (SDD, SPD, and SSD)
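The Run Control / Human Interface split is easiest to see as a message exchange. A minimal sketch in C with a hypothetical message layout, not DATE's actual protocol:

```c
/* Hedged sketch of the RC/HI split: the Human Interface is a thin
 * client that only sends commands and receives state updates, which
 * is what allows the two processes to run on different PCs.
 * Message layout and names are hypothetical. */
typedef enum { RC_IDLE, RC_CONFIGURED, RC_RUNNING } RcState;
typedef enum { CMD_DEFINE, CMD_START, CMD_STOP } RcCommand;

typedef struct {
    RcCommand command;   /* what the operator asked for      */
    char      site[16];  /* DATE_SITE the command applies to */
} HiToRc;

typedef struct {
    RcState state;       /* current run-control state        */
    char    info[64];    /* human-readable status for the HI */
} RcToHi;
```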

DAQ tasks
- Data taking for individual detectors (standalone mode)
  - Without event building
  - Data recorded on LDC local disks
- Data taking for ITS (the 3 detectors, or any combination of 2)
  - With event building based on the event identification (orbit counter + bunch crossing) given by the 'CTP' (see the sketch below)
  - Data recorded on GDC local disks
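A minimal sketch of how sub-events can be matched by that event identification; types and names are illustrative, not the actual DATE event-builder API:

```c
/* Sub-event matching on the GDC: the identifier delivered by the
 * 'CTP' in the Common Data Header is the LHC orbit counter plus the
 * bunch-crossing number within the orbit. */
#include <stdbool.h>
#include <stdint.h>

typedef struct {
    uint32_t orbit;   /* LHC orbit counter               */
    uint16_t bc;      /* bunch crossing within the orbit */
} EventId;

/* Sub-events shipped by different LDCs belong to the same physics
 * event exactly when both counters match. */
static bool same_event(EventId a, EventId b)
{
    return a.orbit == b.orbit && a.bc == b.bc;
}
```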

DAQ implementation
- 4 DAQ Run Control processes run permanently on the control machine:
  - 1 for each detector
  - 1 for the global data acquisition
  - with Human Interfaces on any DAQ machine
- The 4 processes compete for resources (the LDCs); resolving these access conflicts is left to the Experiment Control System (see the sketch below)
- Synchronizing the DAQ and the trigger is also left to the ECS
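A sketch of the arbitration rule the ECS has to enforce, assuming a simplified ownership model (not actual ECS code):

```c
/* Illustrative arbitration rule: an LDC belongs to at most one run
 * control at a time, so a detector cannot run standalone while the
 * global ITS run owns its LDC. */
enum RunControl { NONE, RC_SDD, RC_SPD, RC_SSD, RC_ITS };

struct LdcResource {
    const char     *name;
    enum RunControl owner;   /* NONE when the LDC is free */
};

/* Grant the LDC only if it is free; on conflict the requesting run
 * control must wait until the current owner releases it. */
static int acquire_ldc(struct LdcResource *ldc, enum RunControl rc)
{
    if (ldc->owner != NONE)
        return -1;
    ldc->owner = rc;
    return 0;
}
```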

{About the trigger … from Pedja

Simplified block diagram of the LTU

Test-beam system connections

An LTU process allows:
- setting the mode of operation of an LTU (emulator or 'CTP')
- enabling/disabling the BUSY2 connector
A TPA (Trigger Partition Agent) process coordinates the settings (mode and BUSY2 option) of the LTUs involved in a global run. To emulate the CTP, one LTU becomes master and plays the role of the CTP (see the sketch below).
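A hedged sketch of that coordination step, with hypothetical names; enabling BUSY2 on every LTU is an assumption here, not something the slides state:

```c
/* Sketch of the TPA coordination for a global run: one LTU is put in
 * emulator mode and acts as the CTP master, while the others run in
 * 'CTP' mode, driven by the master. Names are hypothetical. */
enum LtuMode { MODE_EMULATOR, MODE_CTP };

struct Ltu {
    const char  *detector;
    enum LtuMode mode;
    int          busy2_enabled;  /* BUSY2 connector on (1) / off (0) */
};

static void tpa_configure(struct Ltu ltu[], int n, int master)
{
    for (int i = 0; i < n; i++) {
        ltu[i].mode = (i == master) ? MODE_EMULATOR : MODE_CTP;
        ltu[i].busy2_enabled = 1;  /* assumed: all take part in busy logic */
    }
}
```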

… from Pedja}

ITS seen from ECS
From the ECS point of view, the ITS is made of 8 objects, each associated with a process (see the sketch below):
- 4 Run Controls, controlling data acquisition for the standalone detectors and for the global run
- 3 LTUs, sending trigger signals to the detectors under 'CTP' control or using the LTU emulator
- 1 TPA, synchronizing the LTUs (2 or 3) participating in global runs to simulate the CTP
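The eight objects can be pictured as a table of state machines. A simplified sketch with hypothetical names and states (the real ECS models richer state diagrams for each object):

```c
/* The ITS partition as the ECS sees it: 4 run controls + 3 LTUs
 * + 1 TPA, each a process whose state the ECS tracks and steers.
 * States and names here are simplified and hypothetical. */
enum EcsState { DISCONNECTED, READY, RUNNING };

struct EcsObject {
    const char   *name;
    enum EcsState state;
};

static struct EcsObject its_partition[8] = {
    { "RC_SDD",  DISCONNECTED }, { "RC_SPD",  DISCONNECTED },
    { "RC_SSD",  DISCONNECTED }, { "RC_ITS",  DISCONNECTED },
    { "LTU_SDD", DISCONNECTED }, { "LTU_SPD", DISCONNECTED },
    { "LTU_SSD", DISCONNECTED }, { "TPA_ITS", DISCONNECTED },
};
```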

ITS overall view

Include/Exclude Detectors from ITS partition

All together

SDD and SSD

SDD and SPD

SPD and SSD

SPD standalone

Major results
A major step towards the final TRG/DAQ/ECS system.
DAQ:
- DATE 5.0 successfully used
- New Run Control allowing multiple parallel data-acquisition tasks
- Same version of 'readout' for all the detectors
- Event building based on information taken from the Common Data Header
- Monitoring possible at LDC and GDC level
- MOOD intensively used by SSD
ECS:
- The major components ready and used extensively to run all the possible combinations of 1/2/3 detectors
- Interfaces with DAQ and Trigger worked without problems

Feedback
- The access-control mechanism used by DAQ and ECS works and is necessary, but it is not robust enough to survive process crashes.
- The access-control mechanism makes some operations (e.g. redefining DAQ run parameters) difficult for an operator with limited expertise.
- The ECS Human Interface can be improved (number of windows, more information, ...).
- An interactive tool to record operator comments must be provided.