The Detector Control System – FERO related issues

Peter Chochula, on behalf of the DCS team
ALICE workshop on radiation hard electronics, August 30, 2004

The DCS sub-systems

The main role of the DCS is to ensure the safe operation of ALICE. To fulfil this task it has to communicate with a large number of devices.
The DCS devices are logically grouped into several sub-systems, which are operated separately and synchronized via FSM tools implemented in the SMI++ package.
The main DCS sub-systems are:
- High Voltage (HV)
- Low Voltage (LV)
- Front-End and Readout Electronics (FERO)
- Gas
- Cooling
Some detectors implement additional sub-systems, such as laser control.
On top of this, the DCS communicates with the LHC, the Detector Safety System, external services …
In this talk only issues related directly to the FEE will be discussed.

The DCS Architecture and its Place in the ALICE Online Hierarchy

The DCS is a strictly hierarchical online system; its synchronization with the DAQ, TRG and HLT is achieved via the ECS.
There are no horizontal connections between the online systems.
Efficient operation of the ALICE sub-detectors depends on communication between the online systems, and this is strongly influenced by the FERO design.
[Diagram: the ECS sits above the DCS, DAQ/RC, TRG and HLT; each online system branches into the detectors (SPD, TPC, …), and within the DCS each detector branches into its LV, HV, FERO, Gas, … sub-systems.]

The (simplified) DCS operation of a device

[Diagram: Configuration Database → Device Configuration → Device Control (regulation, switching, handling of alarms and exceptions …) → Device Monitoring → Archive]
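To make the cycle above concrete, here is a minimal C++ sketch of driving a single device through it. The ConfigDb, Device and Archive types, the channel name and the alarm limit are invented placeholders for this illustration and are not part of the actual DCS software.

```cpp
#include <iostream>
#include <string>
#include <vector>

// Hypothetical stand-ins for the real DCS components, for illustration only.
struct ConfigDb {                                   // configuration database
    std::vector<double> loadSettings(const std::string&) { return {1.0, 2.0}; }
};
struct Device {                                     // the controlled device
    void   configure(const std::vector<double>&) {}
    double readValue() { return 0.5; }
    void   switchOff() {}
};
struct Archive {                                    // DCS archive
    void store(const std::string& dev, double v) { std::cout << dev << " " << v << "\n"; }
};

// One pass through the configure -> control/monitor -> archive cycle of one device.
void operateDevice(ConfigDb& db, Device& dev, Archive& arch,
                   const std::string& name, double alarmLimit)
{
    dev.configure(db.loadSettings(name));           // device configuration
    for (int cycle = 0; cycle < 10; ++cycle) {
        double v = dev.readValue();                 // device monitoring
        arch.store(name, v);                        // archiving
        if (v > alarmLimit) {                       // alarm / exception handling
            dev.switchOff();
            break;
        }
    }
}

int main() {
    ConfigDb db; Device dev; Archive arch;
    operateDevice(db, dev, arch, "SPD_LV_01", 4.2); // invented channel name and limit
}
```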

Two main groups of questions related to FERO need to be solved:
- How to access the electronics?
- How to operate the electronics?

How to access the devices

Unlike the other online systems, the DCS communicates with a wide range of different devices.
The nature of the acquired data, as well as its handling, differs from sub-system to sub-system.
There is no unique, standardized mechanism for accessing the DCS devices:
- Profibus
- Ethernet
- JTAG
- EasyNet
- RS-232
- VME
- CANbus
- Custom solutions…

Access to the DCS hardware

The main operation tool of the DCS is a commercial SCADA system: PVSS-II.
PVSS-II offers a limited set of tools for accessing the hardware; native PVSS-II drivers are used where appropriate (e.g. for RS-232).
The aim is to use a set of software interfaces which hide the complexity of the underlying hardware.
The ALICE DCS profits from (and contributes to) common developments between the four LHC experiments – the JCOP project.
The JCOP framework software package provides solutions for most standard devices (power supplies, ELMB-based monitors, etc.).

The hardware access standards in the DCS – OPC

Most commercial devices are shipped with software based on the OPC industrial standard.
An OPC client is integrated in PVSS.
The DCS team is testing the OPC servers and provides feedback to the manufacturers where needed.
Reports are regularly given at the DCS workshops organized during the ALICE weeks.

The main problem of FERO control in ALICE is that different detectors use different access strategies.
The FERO of some sub-detectors is accessed exclusively via the DDL, which belongs to the DAQ domain and is handled by the DAQ/RC.
Other architectures use a wide variety of access paths, which are handled by the DCS.
In principle, all ALICE sub-detectors can be grouped into the four architectural classes shown on the next slide.

FERO Access Architectures

[Diagram: four architectural classes.
Class A – control and monitoring of the FERO via the DDL.
Class B – control of the FERO via the DDL, monitoring via a non-DDL path.
Classes C and D – control and monitoring of the FERO via non-DDL paths.]

The hardware access standards in the DCS – the FED

The ALICE FERO (and some additional devices, such as the Arem-Pro power supplies) is not covered by the JCOP developments.
Unfortunately, the operation of this group of devices proves to be very complicated.
The Front-End Device (FED) concept for accessing this group of devices has been elaborated (see the presentations at the DCS workshops and the TB in May 2004).
The corresponding software architecture (called the FED Server) is based on the DIM protocol, which is also integrated in PVSS-II.

FED Architecture

[Diagram: the ECS sits above the DAQ/RC and the DCS. Class A+B control goes from the DAQ/RC through the DDL software to the DDL; class B,C,D control and the monitoring of all classes go from the DCS through the FED Client to the FED Servers running on the control CPUs, which reach the FERO (hardware layer) via Profibus, JTAG, etc., or via the DDL software and the DDL.]
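On the DCS side, the FED Client essentially subscribes to the services published by a FED Server and sends commands to it over DIM (in the real system this happens through the DIM client integrated in PVSS-II). Below is a minimal stand-alone client sketch using DIM's dic.hxx; the service and command names and the payload are invented for the illustration.

```cpp
#include <chrono>
#include <iostream>
#include <thread>
#include <dic.hxx>   // DIM client API; DIM_DNS_NODE must point at the DIM name server

// Subscribe to a (hypothetical) temperature service published by a FED Server.
class TemperatureInfo : public DimInfo {
public:
    TemperatureInfo() : DimInfo("SPD_FED/TEMPERATURE", -999.f) {}  // -999 = "no link" value
    void infoHandler() override {
        std::cout << "new temperature: " << getFloat() << "\n";    // called on every update
    }
};

int main() {
    TemperatureInfo temperature;                          // the subscription starts here
    char configure[] = "CONFIGURE";                       // invented command payload
    DimClient::sendCommand("SPD_FED/COMMAND", configure); // send a command to the server
    while (true)                                          // keep the process alive for updates
        std::this_thread::sleep_for(std::chrono::seconds(1));
}
```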

Architecture of the FED Server

[Diagram: a (PVSS) DIM client or another software client exchanges commands and data services with the DIM server inside the FED Server; below it sit the control agents (CA1 … CAi), the monitoring agents (MA1 … MAi) and a database, and at the bottom the device driver(s) talking to the hardware.]
The DIM interface layer allows for communication with the higher levels of software.
The application layer contains the detector control and monitoring code (the agents).
The hardware access layer contains the device drivers.
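A rough illustration of these three layers: a stripped-down FED Server skeleton built on the DIM C++ server API (dis.hxx). The server, service and command names, the command string and the hardware read-out are invented placeholders; a real FED Server would plug detector-specific agents and drivers into this structure.

```cpp
#include <chrono>
#include <iostream>
#include <string>
#include <thread>
#include <dis.hxx>   // DIM server API; DIM_DNS_NODE must point at the DIM name server

// Hardware access layer (placeholder): in reality a Profibus/JTAG/... device driver.
float readTemperatureFromHardware() { return 25.0f; }

// DIM interface layer: commands arriving from the FED Client (e.g. from PVSS-II).
class CommandHandler : public DimCommand {
public:
    CommandHandler() : DimCommand("SPD_FED/COMMAND", "C") {}      // "C" = string payload
    void commandHandler() override {
        std::string cmd = getString();
        if (cmd == "CONFIGURE")                                    // control agent (placeholder)
            std::cout << "application layer: configuring the FERO...\n";
        else
            std::cout << "unknown command: " << cmd << "\n";
    }
};

int main() {
    float temperature = 0.f;
    DimService temperatureService("SPD_FED/TEMPERATURE", temperature);  // published data
    CommandHandler commands;
    DimServer::start("SPD_FED_SERVER");                  // register with the DIM name server

    while (true) {                                        // monitoring agent (placeholder)
        temperature = readTemperatureFromHardware();
        temperatureService.updateService();               // push the new value to the clients
        std::this_thread::sleep_for(std::chrono::seconds(1));
    }
}
```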

How to operate the FED

The architecture of the FED Server also covers some basic operational aspects.
The implemented commands allow for integration with the DCS.
The DCS handles the cross-dependencies between sub-systems: e.g. the FERO configuration is pending until the LV becomes operational, which in turn depends on the cooling status, etc.

Standard FED Operation

[State diagram: states OFF, Configuring, Configured and Running; commands Configure (OFF → Configuring), Re-Configure (Configured → Configuring), Stop Run (Running → Configured) and Switch-Off (Configured → OFF).]
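A minimal C++ sketch of this state machine, useful mainly to make the allowed transitions explicit. The internal "Done" transition from Configuring to Configured and the "Start Run" command into Running do not appear on the slide and are assumptions made here.

```cpp
#include <iostream>
#include <string>

// States and commands as on the slide; "Done" and "Start Run" are assumed additions.
enum class FedState { OFF, Configuring, Configured, Running };

FedState handleCommand(FedState s, const std::string& cmd)
{
    if (s == FedState::OFF         && cmd == "Configure")    return FedState::Configuring;
    if (s == FedState::Configuring && cmd == "Done")         return FedState::Configured;  // assumed
    if (s == FedState::Configured  && cmd == "Re-Configure") return FedState::Configuring;
    if (s == FedState::Configured  && cmd == "Start Run")    return FedState::Running;     // assumed
    if (s == FedState::Running     && cmd == "Stop Run")     return FedState::Configured;
    if (s == FedState::Configured  && cmd == "Switch-Off")   return FedState::OFF;
    std::cerr << "command '" << cmd << "' not allowed in the current state\n";
    return s;                                                // illegal commands are ignored
}

int main() {
    FedState s = FedState::OFF;
    for (const char* cmd : {"Configure", "Done", "Start Run", "Stop Run", "Switch-Off"})
        s = handleCommand(s, cmd);
}
```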

Additional (detector-specific) FED commands

A number of FED Server commands are unique to individual sub-detectors, e.g. the JTAG chain verification in the SPD, autocalibration, …
In close collaboration with the DCS contact persons we try to identify these commands and propose solutions.
In some cases only very limited feedback is provided – mostly because the software design has not yet reached the implementation phase.
FED Server prototyping is well advanced for some detectors (SPD, TPC, TRD, PHOS) and we profit greatly from the very good collaboration with the detector groups.

FERO Device Monitoring Principle in the DCS

[Plot: values are acquired from the hardware at a fixed sampling interval; a value is published (and recorded in the DCS) only when it leaves the publishing deadband around the last published value; the PVSS alarm limits are indicated above and below the band.]
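A minimal C++ sketch of this filtering, with made-up numbers: each acquired sample is compared with the last published value, published only when it leaves the deadband, and checked against the alarm limits.

```cpp
#include <cmath>
#include <iostream>

// Publish an acquired value only when it leaves the deadband around the last
// published value, and flag it when it crosses the alarm limits.
struct Monitor {
    double deadband, alarmLow, alarmHigh;
    double lastPublished = NAN;

    void acquire(double value) {                       // called once per sampling interval
        if (std::isnan(lastPublished) || std::fabs(value - lastPublished) > deadband) {
            lastPublished = value;                     // value recorded in the DCS
            std::cout << "published " << value;
            if (value < alarmLow || value > alarmHigh) // alarm limit check
                std::cout << "  <-- ALARM";
            std::cout << "\n";
        }                                              // otherwise the sample is dropped
    }
};

int main() {
    Monitor m{0.2, 1.0, 4.0};                          // arbitrary deadband and alarm limits
    for (double v : {2.00, 2.05, 2.10, 2.50, 4.30, 2.40})
        m.acquire(v);
}
```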

FERO operation

A number of questions still need to be answered:
- What detector-specific commands need to be implemented?
- How do we monitor and treat SEUs?
- What are the sub-system and system dependencies (switching order, …)?
- What parameters need to be monitored, and at what frequencies?
- What are the expected actions if some parameters are out of range? (Sometimes it is sufficient to record the anomaly in the archive, sometimes we can recover the settings, in some cases the run must be stopped, …)
- In some cases the DCS is expected to monitor, for example, local trigger counters – what happens if these are out of range?
- WHO will start the software developments on the detector side and WHEN, and HOW and WHERE do we test the prototypes?

A typical problem requiring synchronization between online systems (example – a class A device)

[Diagram: a VR failure occurs (e.g. due to an SEU); as a consequence the FERO gets mis-configured; the DCS performs the recovery action on the VR and informs the DAQ and TRG via the ECS; the DAQ then reloads the FERO.]

Device Configuration and Archiving

All configuration data is stored in the online database; the present implementation is based on Oracle.
Monitoring and alarm limits are read by PVSS and sent to the FED Servers.
FEE parameters (thresholds, mask matrices, etc.) are read directly by the FED Servers in order to minimize the amount of data passed through PVSS-II.
All acquired data, alarms and errors are stored in the DCS archive (Oracle) and displayed in PVSS panels.
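As an illustration of the direct path from the Oracle configuration database to a FED Server (bypassing PVSS-II), here is a sketch using the Oracle C++ Call Interface (OCCI). The credentials, table and column names are invented for the example; the actual ALICE schema is different.

```cpp
#include <iostream>
#include <map>
#include <occi.h>

using namespace oracle::occi;

// Hypothetical credentials, table and column names, for illustration only.
std::map<int, int> loadThresholds(int version)
{
    std::map<int, int> thresholds;                       // channel -> threshold
    Environment* env  = Environment::createEnvironment();
    Connection*  conn = env->createConnection("dcs_user", "dcs_pwd", "alice_dcs_db");

    Statement* stmt = conn->createStatement(
        "SELECT channel, threshold FROM spd_fero_config WHERE version = :1");
    stmt->setInt(1, version);
    ResultSet* rs = stmt->executeQuery();
    while (rs->next())
        thresholds[rs->getInt(1)] = rs->getInt(2);       // FEE parameters read directly

    stmt->closeResultSet(rs);
    conn->terminateStatement(stmt);
    env->terminateConnection(conn);
    Environment::terminateEnvironment(env);
    return thresholds;
}

int main() {
    for (const auto& [channel, thr] : loadThresholds(42))
        std::cout << "channel " << channel << " threshold " << thr << "\n";
}
```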

DCS Configuration Database

[Diagram: the configuration DB holds the system static configuration, the device static and dynamic configuration (common solutions, FW devices only) and the FERO static and dynamic configuration (ALICE specific); the configurations are applied to the hardware through PVSS-II and the underlying software.]
The ALICE FERO configuration will be shared between the TRG, DAQ, DCS and ECS.

DCS Archiving

[Diagram: in the present scheme PVSS writes to its own archive and to a condition DB; in the new scheme the PVSS archive and the condition DB are kept in an ORACLE database.]
The new archiving model will be introduced in PVSS-II v. 3; the release is scheduled for September 2004.
Depending on its performance, the CondDB model could be dropped.

Configuration and Archiving Related Questions

Again, there are questions to be answered:
- What are the configuration parameters and how are they used (data size, loading sequence, …)?
- What are the requirements for data archival? Interface to offline, analysis procedures, …?
- How are the configuration parameters obtained? Calibration procedures, data analysis, …
- Configuration database updates – who, when, where, how?
The DCS team is presently testing prototypes based on ORACLE; the first results will be presented at the coming DCS workshop.

Conclusions

- The FERO access strategy has been developed.
- Prototyping is proceeding well for the SPD, HMPID, TPC, TRD and PHOS.
- Operational requirements are being studied.
- Feedback is essential: please make sure that the DCS URDs are kept up to date – the DCS team is happy to help you.
- The DCS lab and its infrastructure are ready for testing the prototypes.
- Many questions are still open; please inform us about the progress of your developments and your plans.