The Control and Hardware Monitoring System of the CMS Level-1 Trigger
Ildefons Magrans, Computing and Software for Experiments I
IEEE Nuclear Science Symposium, 30th October 2007

Outline: 1. Context  2. Concept  3. Framework  4. System  5. Services

Context
- ~55 million channels, ~1 MByte per event
- 40 MHz bunch-crossing rate, ~20 events per BX
- 100 kHz Level-1 accept rate, no dead time
- ~100 Hz to permanent storage
- L1 Decision Loop: HARDWARE. CMS Control System: SOFTWARE.
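For scale, these numbers imply the following approximate data rates (a back-of-the-envelope estimate, not stated on the slide, taking the 100 Hz as the rate written to storage):

\[ 100\,\mathrm{kHz} \times {\sim}1\,\mathrm{MB/event} \approx 100\,\mathrm{GB/s}\ \text{into the event builder}, \qquad 100\,\mathrm{Hz} \times {\sim}1\,\mathrm{MB/event} \approx 100\,\mathrm{MB/s}\ \text{to storage}. \]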

Hardware context
- 3.2 µs L1-Trigger Decision Loop; parts of the loop are within the project context, others are out of the project context.
- Configuration: 64 crates, O(10^3) boards, firmware ~15 MB/board, O(10^2) registers/board, 8 independent detector partitions.
- Testing: O(10^3) links.
- Integration coordination: large number of involved institutes.

Software context
- Run Control and Monitoring System (RCMS): overall experiment control and monitoring. The RCMS framework is implemented in Java.
- Detector Control System (DCS): detector safety, gas and fluid control, cooling system, rack and crate control, high and low voltage control, and detector calibration. DCS is implemented with PVSS II.
- Cross-platform Data AcQuisition middleware (XDAQ): C++ component-based distributed programming framework, used to implement the distributed event builder.
- L1-Trigger Control and Hardware Monitoring System: provides machine and human interfaces to operate, test and monitor the Level-1 decision loop hardware components.
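Since the system is built on XDAQ, it helps to recall what an XDAQ component looks like. The following is a minimal illustrative sketch in the style of an XDAQ application that exposes one SOAP callback; the class and callback names are invented for this example, it is not the Trigger Supervisor cell code, and exact signatures (e.g. exception specifications on callbacks) vary between XDAQ releases.

```cpp
// Illustrative XDAQ-style application: registers a SOAP callback so that a
// remote peer can trigger an action. Names are examples, not framework code.
#include "xdaq/Application.h"
#include "xoap/MessageReference.h"
#include "xoap/MessageFactory.h"
#include "xoap/Method.h"

class ExampleSupervisor : public xdaq::Application
{
public:
    XDAQ_INSTANTIATOR();

    ExampleSupervisor(xdaq::ApplicationStub* stub)
        : xdaq::Application(stub)
    {
        // Bind the SOAP command "Configure" to a member callback.
        xoap::bind(this, &ExampleSupervisor::onConfigure, "Configure", XDAQ_NS_URI);
    }

    xoap::MessageReference onConfigure(xoap::MessageReference msg)
    {
        // ... access and program the hardware here ...
        return xoap::createMessage();   // empty SOAP reply as acknowledgement
    }
};

XDAQ_INSTANTIATOR_IMPL(ExampleSupervisor)
```

Such an application is compiled as a shared library and loaded by the xdaq executable; the Trigger Supervisor cell is, in essence, a much richer application of this kind.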

Concept
[1] Conceptual design (IEEE TNS, vol. 53, no. 2, April 2006)
[2] Prototype (IEEE NSS 2005, Puerto Rico)

Baseline Infrastructure
CMS official software frameworks for developing distributed systems: DCS, RCMS, XDAQ.
Requirements: the subsystems' Online SoftWare Infrastructure (OSWI, C++ on Linux) needs to be integrated, and the infrastructure should be oriented towards developing SCADA systems.

Framework              | Subsystem OSWI integration effort (C++, Linux) | Supervisory and control infrastructure development effort
DCS (PVSS II, Windows) | ++                                             | Ok
RCMS (Java)            | +                                              | +
XDAQ (C++, Linux)      | Ok                                             | +

Baseline: an XDAQ-based solution, plus additional development to reach a SCADA framework.

The Cell
- Synchronous and asynchronous SOAP API, plus an automatically generated HTTP/CGI interface (e.g. for cell FSM operation).
- Plug-ins:
  - Command plug-ins: RPC methods that extend the SOAP API.
  - Monitoring items.
  - FSM plug-ins.
  - Control panel plug-ins that extend the default GUI (e.g. GT panel, DTTF panel).
- Xhannel infrastructure: designed to simplify access to web services (SOAP and HTTP/CGI) from operation transition methods, e.g. towards Tstore (DB), the monitor collector, or other cells.

The Trigger Supervisor Framework Components
- RCMS components.
- Tstore: DB interface, exposes SOAP. 1 per system.
- Monitor Collector: polls all cell sensors. 1 per system.
- Mstore: interfaces the Monitor Collector with Tstore. 1 per system.
- Job Control: remote start-up of XDAQ applications. 1 per host.
- XS: reads the logging database. 1 per cell.
- Monitor Sensor: cell interface to poll monitoring information. 1 per cell.
- Cell: facilitates subsystem integration and operation (additional development, next slide). 1 per crate.
- Log Collector: collects log statements from the cells and forwards them to consumers. 1 per system.
The system is based uniquely on these components.

L1 Trigger Control System
A hierarchical system of cells enhances:
- Distributed development
- Subsystem control
- Partial deployment
- Graceful degradation (illustrated in the sketch below)
- Centralized access to DBs
Framework for the Configuration and Interconnection Test services.
- 1 crate ~ 1 cell
- Multicrate subsystems ~ 2 levels of subsystem cells (1 subsystem central cell)
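To make the hierarchy concrete, the sketch below is purely illustrative (all types and functions are placeholders, not the framework's API): the central cell fans a transition out to its subsystem central cells and tolerates an unreachable subsystem, which is what allows partial deployment and graceful degradation.

```cpp
// Illustrative fan-out of a transition from the central cell to the subsystem
// central cells. All names are hypothetical placeholders.
#include <iostream>
#include <stdexcept>
#include <string>
#include <vector>

// Stand-in for a SOAP call to a remote cell; a real call could throw on failure.
bool sendTransition(const std::string& cellUrl, const std::string& transition)
{
    std::cout << "SOAP " << transition << " -> " << cellUrl << "\n";
    return true; // pretend the remote cell acknowledged
}

int main()
{
    const std::vector<std::string> subsystemCells = {
        "http://gt-cell:2000", "http://gmt-cell:2000", "http://dttf-cell:2000"};

    // Fan the transition out; a failing subsystem is reported but does not
    // abort the others (graceful degradation, partial deployment).
    for (const auto& url : subsystemCells) {
        try {
            sendTransition(url, "Configure");
        } catch (const std::exception& e) {
            std::cerr << "subsystem unreachable: " << url << " (" << e.what() << ")\n";
        }
    }
    return 0;
}
```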

L1 Trigger Monitoring System
- 1 cell ~ 1 Monitor Sensor
- System ~ 1 Monitor Collector and 1 Mstore (centralized system)
- Centralized access to DBs
Framework for the Hardware Monitoring service.

L1 Trigger Logging and Start-up Systems
- 1 cell ~ 1 XS
- System ~ 1 Log Collector (centralized system)
- 1 host ~ 1 Job Control (JC)
Auxiliary systems.

Control System Services: Configuration
New service methodology:
1. Define an FSM plug-in in the central cell.
2. Define the operation transition methods (= define the FSM expected from the subsystem central cells).
This is also the subsystem integration coordination strategy, and the same methodology can be applied to multicrate subsystems.
The Configuration service: 1. configures the L1 Trigger hardware, 2. configures the partition.
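As a rough illustration of the methodology (all class and method names are hypothetical and do not reproduce the real framework API): the FSM plug-in of the central cell declares the states and transitions, and the transition method encodes the two configuration actions.

```cpp
// Hypothetical sketch of the Configuration FSM plug-in in the central cell;
// the base class and helpers are placeholders, not the real framework code.
#include <iostream>
#include <string>

class CellFsm {                                   // assumed framework base class
protected:
    void addState(const std::string& s)
    { std::cout << "state: " << s << "\n"; }
    void addTransition(const std::string& from, const std::string& to, const std::string& name)
    { std::cout << "transition: " << from << " -" << name << "-> " << to << "\n"; }
};

class ConfigurationFsm : public CellFsm {
public:
    ConfigurationFsm() {
        // Step 1: define the FSM plug-in (states and transitions) in the central cell.
        addState("Halted");
        addState("Configured");
        addTransition("Halted", "Configured", "configure");
    }

    // Step 2: the operation transition method; it defines what is expected from
    // the subsystem central cells and performs the two configuration actions.
    void configure(const std::string& key) {
        configureTriggerHardware(key);   // 1. configure the L1 Trigger hardware
        configurePartition(key);         // 2. configure the partition
    }

private:
    void configureTriggerHardware(const std::string& key)
    { std::cout << "configure hardware with key " << key << "\n"; }
    void configurePartition(const std::string& key)
    { std::cout << "configure partition with key " << key << "\n"; }
};

int main() { ConfigurationFsm fsm; fsm.configure("physics_v1"); }
```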

Crate Cell Services
- New command plug-ins extend the cell API.
- Additional FSMs fulfill the requirements of experts during commissioning and testing operations.
- Control panel plug-ins extend the default cell GUI with expert-oriented control panels.
Crate cell level services = expert-level facilities that replace standalone tools and programs.
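For example, a command plug-in is essentially a named RPC entry point exposed through the cell's SOAP API. The sketch below is purely illustrative: the base class, its signature and the example command are invented for this explanation and do not reproduce the real framework code.

```cpp
// Hypothetical sketch of a crate-cell command plug-in: a named RPC entry point
// the framework would expose through the cell's SOAP API.
#include <iostream>
#include <map>
#include <string>

class CellCommand {                               // assumed framework base class
public:
    virtual ~CellCommand() = default;
    // Parameters arrive as name/value strings extracted from the SOAP request.
    virtual std::string execute(const std::map<std::string, std::string>& params) = 0;
};

// Expert command: read back one register of one board in the crate.
class ReadRegister : public CellCommand {
public:
    std::string execute(const std::map<std::string, std::string>& params) override
    {
        const std::string board = params.at("board");
        const std::string addr  = params.at("address");
        // ... VME access to the board would go here ...
        std::cout << "reading " << addr << " on board " << board << "\n";
        return "0x0000";                          // value returned in the SOAP reply
    }
};

int main()
{
    ReadRegister cmd;
    std::cout << cmd.execute({{"board", "GTL9"}, {"address", "0x10"}}) << "\n";
}
```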

Monitoring System Services
Making a new monitoring item visible to the central collector requires:
1. Declare it in an XML file (shared with the central collector).
2. Define a callback routine in the crate cell (see the sketch below).
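A minimal sketch of step 2, assuming a simple name-to-callback registry; the registry, item name and value type are placeholders, and the real cell/sensor interface is richer.

```cpp
// Hypothetical sketch: the crate cell registers a callback that the monitor
// sensor invokes when the item declared in the shared XML file (here called
// "crate.temperature") is polled.
#include <functional>
#include <iostream>
#include <map>
#include <string>

// Minimal stand-in for the sensor's item registry.
std::map<std::string, std::function<double()>> monitoringItems;

double readCrateTemperature()
{
    // ... read the value from the crate controller here ...
    return 31.5;   // degrees Celsius (dummy value)
}

int main()
{
    // Step 2: bind the callback to the item name declared in the XML file.
    monitoringItems["crate.temperature"] = readCrateTemperature;

    // The monitor sensor would later poll every registered item and push the
    // values to the central monitor collector.
    for (const auto& item : monitoringItems)
        std::cout << item.first << " = " << item.second() << "\n";
}
```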

Summary
- HW Context: problem definition.
- Concept: conceptual design of the solution ~ agreement with all involved parties.
- Prototype: proof of concept and better understanding of the requirements.
- SW Context: control system and available facilities.
- Framework: filling the gap between the available software facilities and the ideal framework.
- System: distributed software system = flexible services provider.
- Services: solution.