Clara Gaspar, April 2006 LHCb Experiment Control System Scope, Status & Worries

Clara Gaspar, April 2006 ECS Scope
[Diagram: the Experiment Control System spans the whole experiment. On the DAQ side it controls the detector channels, front-end electronics, L0 trigger, TFC, readout network, high-level trigger and storage; on the DCS side the devices (HV, LV, gas, temperatures, etc.); and it interfaces to external systems (LHC, technical services, safety, etc.)]

Clara Gaspar, April 2006 Project Organization
[Organigram: the Online Project comprises the DAQ (Readout), the TFC (Timing and Fast Control) and the Experiment Control System; the ECS covers Detector Control, DAQ Control, TFC Control, L0 Trigger Control, the Sub-detectors and Infrastructure Control]

Clara Gaspar, April 2006 Central team
❚ Responsibilities:
❙ Provide guidelines & tools for sub-detectors:
❘ JCOP + LHCb specific
❙ Provide complete central applications:
❘ Central DAQ/TFC/HLT control, Run Control, etc.
❙ Provide central services:
❘ Infrastructure (PCs, network, databases, etc.)
❙ Provide support to sub-systems:
❘ During development, installation and commissioning
❙ Coordination & integration

Clara Gaspar, April 2006 Sub-detectors/Sub-systems
❚ Responsibilities:
❙ Integrate their own devices
❘ Mostly from the FW (JCOP + LHCb)
❙ Build their control hierarchy
❘ According to the guidelines (using templates)
❙ Test, install and commission their sub-systems
❘ With the help of the central team

Clara Gaspar, April 2006 Tools: The Framework
❚ JCOP + LHCb Framework (based on PVSS II)
❙ A set of tools to help sub-systems create their control systems:
❘ Complete, configurable components:
〡 High voltages, low voltages, temperatures (e.g. CAEN, WIENER, ELMB (ATLAS), etc.)
❘ Tools for defining user components:
〡 Electronics boards (SPECS/CC-PC)
〡 Other home-made equipment (DIM protocol; see the sketch below)
❘ Other tools, for example:
〡 FSM for building hierarchies
〡 Configuration DB
〡 Interface to the Conditions DB
〡 Archiving, alarm handling, etc.
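
As a concrete illustration of the DIM mechanism mentioned above, here is a minimal sketch of a DIM server publishing a single monitored value, the way a piece of home-made equipment would make a reading available to PVSS. The service and server names and the readTemperature() helper are illustrative, not actual LHCb conventions:

```cpp
// Minimal DIM server sketch: publishes one value that a PVSS/framework
// client can subscribe to via the DIM name server.
#include <dis.hxx>      // DIM server classes (DimService, DimServer)
#include <unistd.h>     // sleep()

// Hypothetical hardware access; real equipment would be read out
// through its own low-level library.
static float readTemperature() { return 25.0f; }

int main()
{
    float temperature = readTemperature();

    // Publish the value as a named DIM service (name is made up).
    DimService tempService("MYDET/BOARD01/TEMP", temperature);

    // Start serving under a (made-up) server name.
    DimServer::start("MYDET_BOARD_SERVER");

    for (;;) {
        temperature = readTemperature();
        tempService.updateService(temperature);  // push update to clients
        sleep(5);
    }
    return 0;
}
```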

Clara Gaspar, April 2006 Tools: Electronics Interfaces
❚ Interfaces proposed:
❙ SPECS/ELMB – for electronics in radiation areas
❙ CC-PC – for electronics in counting rooms
➨ Status:
❘ HW & low-level software ready
❘ PVSS interface ready (allows access to any board)
❘ A tool to model/describe electronics boards
❘ TELL1 control being developed
❘ Tools are starting to be used by sub-systems
[Diagram: a control PC reaching electronics boards via SPECS (one master, several slaves), CAN and Ethernet, each board exposing I2C and JTAG buses]
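
As an illustration of the board-access pattern behind these interfaces (write a setting to a register over I2C, then read it back to verify), here is a sketch in which i2cWrite/i2cRead merely simulate a board's register map; the real SPECS/CC-PC library calls are not shown in the source:

```cpp
// Write-and-verify of one register on an electronics board over I2C.
// The i2cWrite/i2cRead functions are stand-ins simulating the board;
// a real system would call the SPECS or CC-PC low-level library.
#include <cstdint>
#include <iostream>
#include <map>

static std::map<int, uint8_t> fakeRegisters;   // simulated board memory

bool i2cWrite(uint8_t slave, uint8_t reg, uint8_t value)
{
    fakeRegisters[(slave << 8) | reg] = value;
    return true;
}

bool i2cRead(uint8_t slave, uint8_t reg, uint8_t& value)
{
    value = fakeRegisters[(slave << 8) | reg];
    return true;
}

// Configure a register and read it back: the usual pattern when
// loading board settings for a given running mode.
bool configureRegister(uint8_t slave, uint8_t reg, uint8_t value)
{
    if (!i2cWrite(slave, reg, value)) return false;
    uint8_t readBack = 0;
    return i2cRead(slave, reg, readBack) && readBack == value;
}

int main()
{
    bool ok = configureRegister(0x40, 0x01, 0x7F);
    std::cout << (ok ? "register verified" : "configuration failed") << "\n";
}
```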

Clara Gaspar, April 2006 Tools: Databases
❚ Three logical databases in the Online System
[Diagram: PVSS connects the experimental equipment to the three databases (Configuration DB, PVSS Archive, Conditions DB), with data flowing on to Offline]

Clara Gaspar, April 2006 Tools: Databases
❚ Configuration DB (Oracle) contains:
❘ All data needed to configure the HW or SW for the various running modes (e.g. HV settings, pedestals, trigger settings, etc.)
➨ Status: first versions exist
❘ LHCb part: connectivity & queries (partitioning)
❘ JCOP part: device settings (running modes and versions)
❚ PVSS Archive (Oracle) contains:
❘ All monitoring data read from the HW, for monitoring and debugging of the Online System (e.g. HV readings, temperatures, etc.)
➨ Status: delivered with PVSS (performance is still a worry)
❚ Conditions DB (COOL/Oracle) contains:
❘ The subset of the monitoring data read from the HW that is needed for offline processing
❘ Some configuration data once it has been used (e.g. trigger settings)
➨ Status: COOL tools provided by LCG, PVSS interface being developed
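
A sketch of the resulting data routing, assuming a simple "for offline" flag per reading; struct and function names are illustrative, not the actual framework API:

```cpp
// Routing sketch: every reading goes to the PVSS archive; only readings
// flagged for offline use are also written to the Conditions DB.
#include <iostream>
#include <string>
#include <vector>

struct Reading {
    std::string tag;        // e.g. "VELO/HV/CH03/VMON" (made-up tag)
    double      value;
    bool        forOffline; // part of the subset needed offline?
};

void archiveToPvss(const Reading& r)        // monitoring & debugging
{
    std::cout << "PVSS archive:  " << r.tag << " = " << r.value << "\n";
}

void writeToConditionsDb(const Reading& r)  // consumed offline via COOL
{
    std::cout << "Conditions DB: " << r.tag << " = " << r.value << "\n";
}

void route(const std::vector<Reading>& readings)
{
    for (const Reading& r : readings) {
        archiveToPvss(r);            // all monitoring data is archived
        if (r.forOffline)
            writeToConditionsDb(r);  // only the flagged subset
    }
}

int main()
{
    route({ {"VELO/HV/CH03/VMON", 1499.2, true},
            {"VELO/TEMP/SENSOR7", 21.4, false} });
}
```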

Clara Gaspar, April 2006 Tools: Integration (FSM)
❚ Started defining guidelines:
❙ Naming conventions
❙ Standard “domains” per sub-detector
❙ Standard states & actions per domain

Clara Gaspar, April 2006 Guidelines: FSM Domains
❚ Standard “domains” per sub-detector:
❙ DCS
❘ DCS infrastructure (cooling, gas, temperatures, pressures, etc.), normally stable throughout a running period
❙ HV
❘ High voltages, or in general components that depend on the status of the LHC machine (fill related)
❙ DAQ
❘ All electronics and components necessary to take data (run related)
❙ DAQI
❘ Infrastructure necessary for the DAQ to work (computers, networks, electrical power, etc.), in general also stable throughout a running period
❚ And standard states & transitions per domain
❙ Templates available as a FW component
❚ Documentation available in EDMS

Clara Gaspar, April 2006 Guidelines: FSM States
❚ State diagram for the Trigger and DAQ domains:
❙ Possible intermediate “CONFIGURING” and “STARTING” states if operations are slow…
[State diagram: NOT_READY -(Configure)-> READY -(Start)-> RUNNING -(Stop)-> READY -(Reset)-> NOT_READY; Recover leads from ERROR back to NOT_READY; UNKNOWN shown alongside]
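
The same diagram rendered as a transition table; in the real system this logic lives in the FSM (SMI++) objects, and the Stop/Reset/Recover targets below are one plausible reading of the arrows:

```cpp
// The DAQ-domain state diagram as a transition table. This rendering
// only makes the allowed transitions explicit.
#include <iostream>
#include <string>

enum class State { NOT_READY, READY, RUNNING, ERROR, UNKNOWN };

const char* name(State s)
{
    switch (s) {
        case State::NOT_READY: return "NOT_READY";
        case State::READY:     return "READY";
        case State::RUNNING:   return "RUNNING";
        case State::ERROR:     return "ERROR";
        default:               return "UNKNOWN";
    }
}

State apply(State s, const std::string& action)
{
    if (s == State::NOT_READY && action == "Configure") return State::READY;
    if (s == State::READY     && action == "Start")     return State::RUNNING;
    if (s == State::RUNNING   && action == "Stop")      return State::READY;
    if (s == State::READY     && action == "Reset")     return State::NOT_READY;
    if (s == State::ERROR     && action == "Recover")   return State::NOT_READY;
    return s;  // action not allowed in this state: no change
}

int main()
{
    State s = State::NOT_READY;
    for (const char* a : {"Configure", "Start", "Stop"}) {
        s = apply(s, a);
        std::cout << a << " -> " << name(s) << "\n";
    }
}
```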

Clara Gaspar, April 2006 LHCb Hierarchy
[Diagram: the ECS tree. Below the top ECS node sit the domains Infrast., DCS, HV, DAQI, DAQ, TFC, L0, HLT and LHC; below those, per-sub-detector nodes (VELO DCS, VELO HV, VELO DAQI, VELO DAQ, MUON DCS, MUON HV, MUON DAQI, MUON DAQ, …), then sub-system nodes (VELO DCS_1/2, VELO DAQ_1/2, SubFarm 1…N) down to the devices (VELO Dev1…DevN)]
❚ “Green” means centrally provided

Clara Gaspar, April 2006 Application: Infrastructure
❚ Integration of infrastructure services:
❘ Power distribution and rack/crate control (CMS)
❘ Cooling and ventilation control
❘ Magnet control (monitoring)
❘ Gas control
❘ Detector Safety System
➨ Status: all advancing in parallel (mostly JCOP)
❚ And interfaces to:
❘ The LHC machine
❘ The Access Control System
❘ The CERN Safety System
➨ Status: tools exist (DIP protocol)
❚ Sub-detectors can use these components:
❙ For defining logic rules (using their states)
❙ For high-level operation (when applicable)

Clara Gaspar, April 2006 Application: TFC Control
❚ Integrates: CC-PC, FSM, Configuration DB, etc.
❙ Being used by several sub-detectors

Clara Gaspar, April 2006 Application: CPU Control
❚ For all control PCs and farm nodes:
❘ Very complete monitoring of:
〡 Processes running, CPU usage, memory usage, etc.
〡 Network traffic
〡 Temperature, fan speeds, etc.
❘ Control of processes:
〡 Job description DB (Configuration DB)
〡 Start/stop any job (type) on any group of nodes
❘ Control of the PC:
〡 Switch on/off/reboot any group of nodes
❙ FSM automated monitoring (& control):
❘ Sets a node to “ERROR” when its monitored data is bad (see the sketch below)
❘ Can/will take automatic actions
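
A sketch of that automated-monitoring rule; the thresholds and the NodeStats record are illustrative, and real limits would come from configuration:

```cpp
// Automated-monitoring sketch: declare a farm node "ERROR" when any
// monitored quantity is out of range.
#include <iostream>
#include <string>

struct NodeStats {
    double cpuUsage;     // fraction of CPU in use, 0..1
    double memUsage;     // fraction of memory in use, 0..1
    double temperature;  // degrees Celsius
    int    missingJobs;  // expected jobs (job description DB) not running
};

std::string evaluate(const NodeStats& s)
{
    // Illustrative limits only.
    if (s.cpuUsage > 0.98 || s.memUsage > 0.95 ||
        s.temperature > 70.0 || s.missingJobs > 0)
        return "ERROR";  // the FSM can then restart jobs or reboot the node
    return "READY";
}

int main()
{
    NodeStats node{0.35, 0.50, 45.0, 0};
    std::cout << "node state: " << evaluate(node) << "\n";
}
```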

Clara Gaspar, April 2006 Application: HLT Control
❚ Integrated for the Real Time Trigger Challenge:
❙ Monitoring & control of:
❘ Farm infrastructure
❘ Trigger jobs (Gaudi)
❘ Event Builder
❘ (MEP Producer)
❘ Run Control
➨ The RTTC was one of the pioneers of the integration of ECS components
➨ Being upgraded for the 1 MHz readout
[Diagram: control PCs supervising the event-builder switches, the sub-farm controllers (SFC) with their farm PCs, and the farm switches]

Clara Gaspar, April 2006 RTTC Run-Control

Clara Gaspar, April 2006 Next “Challenges”
❚ VELO “ACDC”, now and in June (Alignment Challenge and Detector Commissioning)
❙ Complete readout chain (1/4 of the VELO):
❘ Front-end boards, TELL1 boards, TFC system, readout network, small farm
❙ Several DCS components:
❘ A few ELMBs (temperature and LV monitoring), CAEN Easy (LV), ISEG (HV)
❙ Run Control
❚ This setup will evolve into a combined test beam in September with other sub-detectors

Clara Gaspar, April 2006 Other “Challenges”
❚ Configuration DB tests
❙ Installing an Oracle RAC in our private network (with help/expertise from Bologna)
➨ Simulate load -> test performance
➨ Test with more than one DB server
❚ HLT 1 MHz scheme (RTTC II)
❙ Full sub-farm with the final number of processes in each node
➨ Test performance and functionality of the monitoring & control system
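
A sketch of the kind of load test meant here: several parallel clients hammer the database while the aggregate query rate is measured; runQuery() is a stand-in for a real Oracle Configuration DB call, and the client and query counts are arbitrary:

```cpp
// Load-test sketch: N parallel clients issue (simulated) database
// queries; the total throughput is printed at the end.
#include <atomic>
#include <chrono>
#include <iostream>
#include <thread>
#include <vector>

std::atomic<long> queries{0};

void runQuery()                    // stand-in for an Oracle query
{
    std::this_thread::sleep_for(std::chrono::milliseconds(2));
    ++queries;
}

int main()
{
    const int nClients = 16;       // e.g. parallel sub-detector tasks
    std::vector<std::thread> clients;
    auto t0 = std::chrono::steady_clock::now();

    for (int i = 0; i < nClients; ++i)
        clients.emplace_back([] { for (int q = 0; q < 500; ++q) runQuery(); });
    for (auto& c : clients) c.join();

    double secs = std::chrono::duration<double>(
                      std::chrono::steady_clock::now() - t0).count();
    std::cout << queries / secs << " queries/s\n";
}
```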

Clara Gaspar, April 2006 Manpower
❚ Our team (Online people participating in controls):

Task                                    | CERN     | External effort
Coordination                            | 0.5      |
DIM & FSM (SMI++) tools                 | 0.5      | 1 (Rutherford)
Electronics HW (SPECS, CC-PC)           |          | (Orsay/Genova)
Electronics SW tools (SPECS, CC-PC)     | 2        |
DCS + infrastructure                    | 1        |
Configuration DB                        | 1        |
Run Control & sub-detector integration  | 1        | 0.5 (Marseille)
Farm control & Gaudi monitoring         | 2        | 1.5 (Bologna)
Central DAQ & TFC control               | 1        |
Conditions DB interface                 | 1 (soon) |

❚ If it does not decrease, and with JCOP's help, this seems adequate

Clara Gaspar, April 2006 Coordination
❚ Main aims:
❙ Promote commonality
❘ To allow reuse of software & ease maintenance:
〡 Standard electronics interfaces
〡 Central purchasing of PCs, network, etc.
❙ Provide centrally as much as possible:
❘ To avoid sub-detector developments and try to optimize performance:
〡 Integrated access to electronics boards
〡 FSM guidelines with a template implementation
〡 Integrated Configuration DB access (the “Configurator” object, sketched below)
❙ Obtain a coherent and homogeneous system
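
A sketch of the "Configurator" idea: one object per sub-system fetches the settings for a requested running mode and applies them to its devices. loadSettings() stands in for the real (Oracle) Configuration DB query; all names and values are illustrative:

```cpp
// "Configurator" sketch, typically driven by the FSM "Configure" action.
#include <iostream>
#include <map>
#include <string>

using Settings = std::map<std::string, double>;

// Stand-in for a Configuration DB query (made-up settings).
Settings loadSettings(const std::string& runningMode)
{
    if (runningMode == "PHYSICS")
        return { {"HV.channel01", 1500.0}, {"FE.threshold", 12.0} };
    return { {"HV.channel01", 0.0}, {"FE.threshold", 0.0} };
}

class Configurator {
public:
    void configure(const std::string& runningMode)
    {
        // Apply each (device, value) pair to the hardware; here we
        // only print what would be set.
        for (const auto& entry : loadSettings(runningMode))
            std::cout << "set " << entry.first << " = " << entry.second << "\n";
    }
};

int main()
{
    Configurator c;
    c.configure("PHYSICS");
}
```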

Clara Gaspar, April 2006 Coordination
❚ Communication channels:
❙ ECS web page
❘ Info, guidelines, software
❙ 1st ECS tutorial organized recently
❘ Very successful (71 participants!?)
❙ ECS section in the LHCb week Online meeting
❙ Many individual discussions per sub-detector (case-by-case basis)

Clara Gaspar, April 2006 Integration & Commissioning
❚ Sub-system commissioning
❙ Done by each team within their own schedule (e.g. once cables and power are available)
❘ With our help
❚ Central ECS & infrastructure commissioning (PCs, network, DBs and central applications)
❙ Done centrally, in parallel
❚ Integration of sub-systems (into the ECS)
❙ Done centrally, one by one, as each becomes ready
❘ With their help

Clara Gaspar, April 2006 Worries
❚ Technical worries
❙ Performance
❘ PVSS (for complex electronics and farm jobs)
〡 Assumed 1 PC per ~50 boards / ~50 farm nodes
〡 But then: our system is highly parallel (we could buy more PCs), and we will try disabling the “Last Value Archive”…
❘ Database access (Configuration DB, Archive, Conditions DB)
〡 But then: we will have an Oracle RAC with several DB servers…
❘ FSM hierarchies (in other experiments)
〡 Too many nodes -> needs a coordination effort (>1000 nodes for HV before the real channels)

Clara Gaspar, April 2006 Worries
❚ Non-technical worries
❙ Support:
❘ FwDIM and FwFSM are widely used (FwFarmMon)
〡 We need more (and better) help from the JCOP/central DCS teams
❘ Support to the sub-detectors
〡 Try to make sure they respect the rules…
❙ Schedule:
❘ Sub-detectors are late (just starting)
〡 But then: more and better tools are available!
❙ Commissioning:
❘ All the problems we don't know about yet…
〡 For example: error detection and recovery