Mitglied der Helmholtz-Gemeinschaft PANDA MVD Slow Control Issues Harald Kleines, Forschungszentrum Jülich, ZEA-2.

Overall Structure of the MVD Control System
Server level: EPICS servers implemented on EPICS IOCs provide HW abstraction
- Server-level code has to be implemented by the MVD team
Client level: EPICS client applications (implemented on client computers) responsible for monitoring, user operation and online storage (of process values, setups and alarms)
Two types of client applications:
- Generic applications provided by the DCS team, e.g. alarm system, database application, …
- MVD-specific applications provided by the MVD team, e.g. GUIs for cooling plant or power supply operation
Three privilege levels on the client side: pure monitoring, standard operation and expert operation
- Expert operation will be required for detector start-up and shut-down as well as for maintenance
- EPICS support for privilege levels: unclear
[Slide diagram: client computers and a database at the client level communicate with the EPICS IOCs at the server level via the Channel Access protocol (based on Ethernet + UDP)]
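The client/server split above follows standard EPICS practice: the IOCs publish process variables (PVs) and the clients read, write and monitor them over Channel Access. As a minimal sketch (not part of the MVD design), the example below shows how a Python client based on the pyepics library could subscribe to a PV; the PV name is a placeholder invented for the illustration.

```python
# Minimal sketch of an EPICS Channel Access client using pyepics.
# The PV name below is a placeholder; the real MVD naming convention is not defined here.
import time
from epics import PV

def on_change(pvname=None, value=None, **kw):
    # pyepics calls this whenever the IOC posts a new value for the monitored PV.
    print(f"{pvname} -> {value}")

temperature = PV("PANDA:MVD:ENV:Temp001")   # hypothetical environmental channel
temperature.add_callback(on_change)         # subscribe (Channel Access monitor)

while True:
    time.sleep(1.0)                         # keep the client alive; callbacks run in the background
```

A generic DCS client (alarm system, archiver) and an MVD-specific GUI would both sit on top of exactly this kind of PV access.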

MVD Slow Control Subsystems
The MVD Slow Control System is partitioned into individual subsystems:
- Subdivision according to functional groups
- Dedicated interconnects between subsystems are possible, e.g. a hardwired shutdown of the power supplies in the case of leak detection in the cooling system
- Each subsystem includes client applications as well as EPICS IOCs, to which all HW devices are connected
- Client applications of different subsystems may share the same client computer
Hardware connections of the IOCs:
- Ethernet for power supplies and crate control
- GBT (indirectly via the MicroTCA backplane) for detector configuration, run control and environmental monitoring
- PROFINET to the PLC for the cooling system and environmental monitoring
[Slide diagram: the subsystems Cooling System, High Voltage System, Low Voltage System, Environmental Monitoring, Detector Configuration, Run Control and Crate/Rack Control]

GBT-SCA
- Developed by CERN as part of the GBT chipset
- The GBT-SCA chip could reduce cabling in the Low Voltage and the Environmental Monitoring System
- Local sensors directly connectable to ADC channels
- Remote sensors connectable via I2C or SPI
- Unfortunately there is no space for the GBT-SCA on the service board developed for TOPIX in Torino
- This service board is supposed to be used also for the PASTA ASIC in the strip part
- So it seems that the GBT-SCA cannot be used for the PANDA MVD

EPICS IOC in the MicroTCA Crate for GBT-based Slow Control Functions
- EPICS IOC implemented on the AMC CPU
- Communication with the MMBs via PCIe on the backplane
- GBT downstream to the front end: configuration data + commands
- GBT upstream: detector data + slow control information from the GBT-SCA + TOPIX/PASTA temperatures and bias voltages
[Slide diagram: MicroTCA crate with an AMC CPU (EPICS IOC, Ethernet/CA protocol), a timing AMC (SODA), MMB 1 to MMB 10 with GBT links, and an MCH; 4-lane PCIe point-to-point connections to the MCH; clocks, triggers and control signals according to MTCA.4 („uTCA for Physics“)]

Cooling System
- Responsible for the control and monitoring of the cooling plant (water tank level, pumps, heat exchanger, chillers) and of the cooling lines
Detector safety function:
- Leak detection must lead to an immediate power shutdown of the MVD (and possibly of additional detectors???)
Sensors + actuators:
- Temperature sensors, pressure sensors, flow regulators, shut-off valves for the cooling lines
- Signals for the cooling plant (water tank, pumps, heat exchanger, chillers, water cleaning, ...) not completely clear
It will be a PLC-based system implemented by the MVD team. It is intended to use a failsafe Siemens S7-1500 PLC, which is connected to the EPICS IOC via PROFINET. In order to distribute digital and analogue I/Os over the detector, ET200MP decentralized periphery systems are intended to be used.
- A global alarm from the DCS should be hardwired to the PLC
- A hardwired signal from the PLC to the DCS should be provided in order to trigger a global detector power down
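The leak shutdown itself is hardwired between the PLC and the power supplies, independent of EPICS. Purely as an illustration of how the same condition could additionally be acted on at the EPICS client level, here is a hedged sketch; the PV names are invented placeholders.

```python
# Sketch of a complementary, software-level reaction to a leak alarm on the EPICS client side.
# This does NOT replace the hardwired PLC interlock described above; PV names are invented.
from epics import PV, caput

LEAK_PV      = "PANDA:MVD:COOL:LeakDetected"      # hypothetical: 0 = ok, 1 = leak
HV_SWITCH_PV = "PANDA:MVD:HV:Global:Switch"       # hypothetical global HV on/off PV

def on_leak(pvname=None, value=None, **kw):
    if value == 1:
        caput(HV_SWITCH_PV, 0)    # request a power shutdown through EPICS as a second path
        print("Leak detected - HV shutdown requested via EPICS")

leak = PV(LEAK_PV)
leak.add_callback(on_leak)        # in a real client this runs inside a long-lived process
```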

Schematic overview of the Cooling Plant (by Silvia Coli)

Cooling system sensors and actuators
Cooling plant devices (channel count):
- tank level transmitter: 1
- pump pressure transmitter: 1
- dissolved O2 sensor: 1
- temperature sensors: 1
- heat exchanger system (from ex.): 2
- vacuum tank pressure sensor: 1
- electrovalves for tank vacuum pump: 1
- N2 flow meter: 2
- N2 pressure regulator: 2
- flow meter: 1
- electrovalves for liquid ring vacuum pump: 4
- vacuum pump: 1
Cooling line devices (channel count):
- flow meters: 85
- pressure regulators: 85
- shut-off pneumatic valves (in), normally closed: 85
- shut-off pneumatic valves (out), normally open: 85
- temperature sensors: 85
- pressure sensors (in): 85
- pressure sensors (out): 85

Cooling Plant PLC System
S7-1500/ET200MP:
- Digital and analog I/O modules for sensors + actuators
- Failsafe CPUs are available today, but no failsafe I/O modules → optionally ET200SP or ET200M have to be used
The operation panel, the S7-1500 PLC and ET200MP system 1 will be in a rack at the cooling plant location. Subdivision into several racks or decentralized boxes is possible, even IP67 systems. ET200MP system 2 will be near the PANDA detector.

High Voltage System
- Responsible for the control and monitoring of the detector high voltages and currents (including up- and down-ramping)
- Hardwired shutdown function in the case of alarms (environmental monitoring, cooling plant, global alarm from the DCS)
- It is intended to use commercial systems, probably Wiener/ISEG MPOD-based systems (preferred) or CAEN multi-channel power supplies (no decision up to now). An industrial type of PC with Linux will be used as the EPICS IOC. The connection between the IOC and the power supply crates will be based on Ethernet.
- Number of channels: 480 (each V, I)
[Slide diagram: EPICS IOC connected to the power supply crates via local Ethernet, Channel Access protocol]
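Since neither the MPOD nor the CAEN option has been chosen yet, any scripting of the HV channels is speculative. Assuming the eventual IOC exposes per-channel PVs, a ramped turn-on of one channel could look roughly like the sketch below; all PV names, the ramp rate and the target voltage are assumptions for illustration.

```python
# Hedged sketch of switching on and ramping one HV channel through EPICS.
# PV names and values are invented; MPOD/ISEG or CAEN hardware normally performs
# the ramp internally once a ramp rate and a target voltage have been set.
import time
from epics import caput, caget

CH = "PANDA:MVD:HV:Ch001"            # hypothetical channel prefix

caput(f"{CH}:RampRate", 5.0)         # assumed ramp rate in V/s
caput(f"{CH}:VSet", 120.0)           # assumed target voltage in V
caput(f"{CH}:Switch", 1)             # switch the channel on

# Simple polling loop until the measured voltage has reached the set point.
while True:
    v = caget(f"{CH}:VMon")
    if v is None or abs(v - 120.0) < 1.0:
        break
    print("V =", v, "I =", caget(f"{CH}:IMon"))
    time.sleep(1.0)
```

The same access pattern would apply to the Low Voltage system below if Wiener MPOD crates with MPV modules are chosen there.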

Low Voltage System
- Responsible for the control and monitoring of the supply voltages of the front-end electronics
- Hardwired shutdown function in the case of alarms (environmental monitoring, cooling plant, global alarm from the DCS)
- It is intended to use a commercial system. Wiener MPOD crates with MPV modules are the preferred solution, but no decision up to now
- Number of channels: 1584 (each V, I)
- Under evaluation for the strip detector: GBT-SCA for PASTA voltages and currents
- Pixel detector: 810 bias voltages from TOPIX
[Slide diagram: EPICS IOC connected to the power supply crates via local Ethernet, Channel Access protocol]

Environmental Monitoring
- Responsible for the monitoring of temperatures and humidity in the detector
- Detector power shutdown in the case of critical temperature or humidity
Pixel part:
- Temperature monitoring by the sensor integrated in the TOPIX ASIC
- Additional temperature and humidity sensors wired outside of the MVD
Strip part:
- Under evaluation: temperature monitoring by the PASTA ASIC
- Under evaluation: sensors connected to the GBT-SCA, either directly to ADC ports or via I2C or SPI (thus avoiding wiring of sensors outside of the MVD)
Number of sensors:
- Temperature sensors: 116
- Humidity sensors: 8
- TOPIX temperatures: 810
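A periodic scan of the external temperature channels is one way an EPICS client could implement the shutdown condition in software (the hard interlock would again be wired separately). The sketch below uses the sensor count from this slide; the PV naming scheme and the threshold are assumptions.

```python
# Sketch of a periodic scan over the 116 external temperature sensors.
# PV names and the alarm threshold are placeholders for illustration only.
from epics import caget

T_LIMIT = 40.0    # assumed over-temperature threshold in degC

def check_temperatures():
    alarms = []
    for i in range(1, 117):                      # 116 temperature sensors
        pv = f"PANDA:MVD:ENV:Temp{i:03d}"
        value = caget(pv, timeout=1.0)
        if value is not None and value > T_LIMIT:
            alarms.append((pv, value))
    return alarms

if __name__ == "__main__":
    for pv, value in check_temperatures():
        print(f"over-temperature: {pv} = {value:.1f} degC")
```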

Detector Configuration
- TOPIX and PASTA parameters have to be set; TOPIX and PASTA operation has to be monitored
- Configuration data will be downloaded via GBT
- The data format still has to be defined
- Size of the parameter set: ca. 10 MBytes

Run Control
- Operational mode changes of the DAQ system have to be translated into corresponding actions of the MVD
- The data acquisition of the MVD has to be monitored and statistics have to be collected
- Core functions have to be implemented on the MMBs, which will be controlled via the CPU (serving as an EPICS IOC) in the corresponding MicroTCA crate

Crate/Rack Control
The status of the crates has to be controlled: counting room crates, high and low voltage power supply crates
- Reset + power cycle of crates and modules
- Monitoring of temperatures, voltages and currents in crates and modules
- Water cooling is not required?
MicroTCA crates (with MMBs) and ATCA crates (with Compute Nodes):
- Use crate management based on IPMI (over Ethernet)
- Unclear: usage of an IPMI management console or translation to EPICS?
- The EPICS IOC or IPMI management console will be based on one common industrial type of PC for all crates
Power supply crates: use the Ethernet protocol defined by Wiener or CAEN. EPICS servers as well as client applications are supposed to be available.
Number of MicroTCA crates: 8
Number of power supply crates: ca. 19
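If the IPMI data are not translated to EPICS, the common industrial PC could still query the crates with the standard ipmitool utility. The sketch below wraps such a query in Python; the MCH address and the credentials are placeholders.

```python
# Sketch of reading MicroTCA/ATCA crate sensors over IPMI (lanplus) with ipmitool.
# Host address and credentials are placeholders; output parsing is simplified.
import subprocess

def read_crate_sensors(host, user="admin", password="admin"):
    """Return the parsed 'ipmitool sensor' lines for one crate's MCH / shelf manager."""
    out = subprocess.run(
        ["ipmitool", "-I", "lanplus", "-H", host, "-U", user, "-P", password, "sensor"],
        capture_output=True, text=True, check=True,
    ).stdout
    # Each line has the form "name | value | unit | status | thresholds ..."
    return [line.split("|") for line in out.splitlines() if line.strip()]

if __name__ == "__main__":
    for fields in read_crate_sensors("192.168.1.10"):   # placeholder MCH address
        if len(fields) >= 4:
            print(fields[0].strip(), fields[1].strip(), fields[3].strip())
```

Whether this stays an IPMI management console or is translated into EPICS PVs is exactly the open point listed above.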

Status and Activities Planned in 2015
Status: conceptual work is being done
Plans for 2015/2016:
- Design of the PLC system for the cooling plant
- Tests with HV and LV power supplies??
Number of FTEs in 2015: ca. 0.2 (from ZEA-2 for PLC design)