The Control System for the ATLAS Pixel Detector


The Control System for the ATLAS Pixel Detector Kerstin Lantzsch, CERN/Bergische Universität Wuppertal Gentner Day, 18.11.2009

The ATLAS Pixel Detector and its Evaporative Cooling System
The detector: 1744 detector modules, arranged in 3 barrel layers and 3 disks per endcap; optical data transfer for 272 readout groups (halfstaves/disksectors); 80 million readout channels; length: 1.3 m, diameter: 34.4 cm, weight: ~4.4 kg.
The cooling system: 88 parallel cooling circuits (PCCs): 56 PCCs with 2 staves each, 24 PCCs with 2 disk sectors each, 8 PCCs for the 272 optoboards.
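The numbers on this slide fit together, which a quick arithmetic check makes explicit (all figures are taken from the slide; the grouping names are invented for readability):

```python
# Consistency check of the slide's figures (no external data).
pccs = {"stave_loops": 56, "disk_loops": 24, "optoboard_loops": 8}
assert sum(pccs.values()) == 88   # 88 parallel cooling circuits in total

readout_groups = 272              # halfstaves/disksectors
modules = 1744                    # detector modules
# Each optoboard serves 6 or 7 modules, so the average must lie in between:
avg = modules / readout_groups
assert 6 <= avg <= 7
print(f"average modules per readout group: {avg:.2f}")  # ~6.41
```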

Tasks of the Detector Control System
Control: integration of the hardware into the DCS software and its control for the supply of the detector.
Monitoring and Archiving: supervise values relevant to the state of the detector and archive them for later troubleshooting.
Operation: provide the tools for the operation of the detector by the shift crew.
Interlock: ensure the safety of the detector and personnel.

Hardware
The DCS hardware supervises the detector modules, the optoboards, the readout crates, and the environment. It consists of:
The power supply system
Monitoring units
The interlock system

Software
PVSS II (by ETM, Eisenstadt, Austria) was selected by the LHC experiments' Joint Controls Project (JCOP) as the basis for the individual control systems. Pixel DCS is built of 3 software layers:
Frontend Integration Tools (FIT): establish communication with the hardware; data structures follow hardware properties.
System Integration Tool (SIT): mapping between hardware and detector; initialises archiving of DCS data.
Finite State Machine (FSM): overview and operation of the detector; the main shifter tool; protection routines.
Tools are provided by the JCOP framework and ATLAS DCS.
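The role of the mapping layer can be illustrated with a small sketch: readings keyed by hardware channel (the FIT view) are re-keyed by detector element for the layers above (the SIT view). This is only an illustration; the real tools are PVSS-based, and all channel and module names here are hypothetical:

```python
# FIT view: data organised by hardware (power-supply crate/channel).
# All names and values below are invented for illustration.
hardware_readings = {
    "ps_crate1.ch03": {"voltage": 2.0, "current": 0.8},
    "ps_crate1.ch04": {"voltage": 2.1, "current": 0.7},
}

# SIT view: mapping between hardware channels and detector geography.
channel_to_module = {
    "ps_crate1.ch03": "L0_B01_S1_M3",
    "ps_crate1.ch04": "L0_B01_S1_M4",
}

# Re-key the same data by detector element, as the FSM layer needs it.
by_module = {channel_to_module[ch]: vals for ch, vals in hardware_readings.items()}
print(by_module["L0_B01_S1_M3"])  # {'voltage': 2.0, 'current': 0.8}
```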

The FSM
"The FSM" is a hierarchical structure of objects that behave like separate FSMs; they are described by states, transitions, and commands. In the hierarchy, commands are propagated down to the children and states are propagated up. A tool for the creation of control systems on an FSM basis is provided centrally: SMI++ (State Management Interface). The FSM is a combination of SMI++ and PVSS: while state and command propagation are handled by SMI++ processes, the generation of the device states and the execution of commands is pure PVSS. While there are some central requirements, the items specific to subdetector operation must be implemented by the subdetectors. ATLAS-specific is the subdivision into state (operational mode) and status (health of the system).
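The propagation principle described above can be sketched in a few lines of Python. This is only an illustrative model, not the SMI++/PVSS implementation; node names, states, and commands are invented:

```python
# Minimal FSM-tree sketch: commands propagate down to the children,
# states propagate up to the parent.
class FsmNode:
    def __init__(self, name, children=()):
        self.name = name
        self.children = list(children)
        self._state = "OFF"

    def command(self, cmd):
        """Propagate a command down the tree; leaves act on it."""
        if self.children:
            for child in self.children:
                child.command(cmd)
        else:
            # A leaf (device unit) executes the command directly.
            self._state = {"SWITCH_ON": "ON", "SWITCH_OFF": "OFF"}.get(cmd, self._state)

    @property
    def state(self):
        """A parent's state is derived from its children's states."""
        if not self.children:
            return self._state
        states = {c.state for c in self.children}
        return states.pop() if len(states) == 1 else "MIXED"

root = FsmNode("PIXEL", [FsmNode("LAYER0", [FsmNode("RG1"), FsmNode("RG2")])])
root.command("SWITCH_ON")   # travels down to RG1 and RG2
print(root.state)           # ON (derived bottom-up)
```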

The Pixel FSM
The FSM tree for the Pixel detector is driven by its mechanical structure and the granularity of readout and services, while the states and commands are adapted to operational aspects specific to the pixel detector; this is driven by the readout group and its devices. Additional layers of protection come in the form of software interlocks built into the FSM scripts, which perform automated actions to protect the detector that cannot be expected of the shifter.

The Pixel FSM Tree
Subdetector
DAQ partition
Layer / Cooling
Readout / services
Device units

The Readout Group
The readout group mainly consists of the 6/7 detector modules serviced by one optoboard, as well as their common supply units. The synchronization of commands is handled by a special device unit. The device units are rather complex (e.g. module state and status depend on 6 parameters). The calculation of state and status is distributed over 17 processes ("managers") on 3 DCS PCs. These PVSS-based scripts are also responsible for software interlocks in case of high temperature, critical current values, or problems in the cooling system.
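The state/status derivation of a module device unit might look as follows. This is a hedged sketch only: the real logic runs in PVSS managers, and the thresholds, parameter names, and state names here are invented for illustration (the six arguments mirror the "6 parameters" mentioned above):

```python
# Illustrative device-unit logic: derive (state, status) from six module
# parameters. Thresholds and names are hypothetical.
def module_state_status(vdd, vdda, hv, temp, i_vdd, configured):
    # Software interlock: act autonomously on a critical temperature.
    if temp > 40.0:                  # hypothetical limit in deg C
        return "OFF", "FATAL"        # would also trigger a switch-off
    if i_vdd > 2.0:                  # hypothetical current limit in A
        return "ON", "ERROR"
    if vdd > 0 and vdda > 0 and hv > 0:
        # READY is reached passively, once the DAQ has configured the module.
        return ("READY" if configured else "ON"), "OK"
    return "OFF", "OK"

print(module_state_status(2.0, 1.6, 150.0, 25.0, 0.8, True))  # ('READY', 'OK')
print(module_state_status(2.0, 1.6, 150.0, 45.0, 0.8, True))  # ('OFF', 'FATAL')
```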

State Diagram of the Readout Group
The state of the readout group is determined by the states of the device units. Not all states can be reached by DCS actions; most prominently, READY can only be reached passively, after the DAQ sends the configuration to the modules. Defined states allow for the independent operation of single components. The pixel-specific states have proven adequate for operation.

Detector Overview
Overview of all the readout groups in one panel. Not all nodes must contribute to the parent's state (e.g. when disabled due to hardware problems). In STANDBY for "Beam".
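The rule that disabled nodes do not contribute to the parent's state can be sketched as follows; the state names and the aggregation details are illustrative only:

```python
# Aggregate a parent's state from its children, ignoring disabled nodes
# (e.g. readout groups taken out because of hardware problems).
def parent_state(children):
    """children: list of (state, enabled) pairs."""
    states = {state for state, enabled in children if enabled}
    if not states:
        return "DISABLED"
    return states.pop() if len(states) == 1 else "MIXED"

# Two groups in STANDBY, one disabled group: the parent shows STANDBY.
groups = [("STANDBY", True), ("STANDBY", True), ("OFF", False)]
print(parent_state(groups))  # STANDBY
```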

Summary and Outlook
Work on the FSM had to cover the evolving requirements in moving from the 10% system test to the final system in the pit, which is the main interface for the operation of the detector. In addition to providing software interlocks, scripts are in place which handle the reaction to and interaction with external systems, mainly the cooling system, but also the handling of beam-related issues. The handling of the DCS crates is integrated into the FSM, allowing for easier recovery in case of shutdowns. Besides the work on the detector control system, I want to start looking into correlations between information from Pixel and the ATLAS Beam Conditions Monitor (BCM).