Controls & Monitoring Overview
J. Leaver, 03/06/2009

Slide 2: Outline
Overview of MICE
– Systems requiring control & monitoring
Outline of C&M framework
– EPICS

Slide 3: MICE Layout

Slide 4: Required C&M Systems: Beamline

Slide 5: Required C&M Systems: Beamline
Target
– Drive (PSU, extraction, temperature monitoring, etc.)
– Controller (actuation enable, dip depth, timing, etc.)
– Beam Loss (Sector 7/8, total, cf. target position)
Beamline Magnets [Q1-Q9, D1-D2] (PSU)
Pion Decay Solenoid (PSU, cryo)

Slide 6: Required C&M Systems: Instrumentation

Slide 7: Required C&M Systems: Instrumentation
Fermilab Beam Profile Monitors (PMT, sensor temperature)
Time of Flight System (PMT HV)
Cherenkov System (PMT HV, temperature, humidity)
Diffuser (motor control, disc position)
Tracker
– Spectrometer Solenoid (PSU, cryo)
– Magnetic Field Probes (B-field, sensor temperature)
– AFEIIts (readout configuration & control, temperature monitoring)
– AFEIIt Infrastructure (PSU, cryo)
Calorimeter System
– KL Calorimeter (PMT HV)
– Electron Muon Ranger (PMT HV)

Slide 8: Required C&M Systems: Cooling Channel

Slide 9: Required C&M Systems: Cooling Channel
Hydrogen Absorbers
– Focus Coils (PSU, cryo)
– Hydrogen System (PSU, pumps, valves, vacuum, cryo)
RF Cavities
– Coupling Coils (PSU, cryo)
– RF System (PSU, amplifiers, B-field probes, feedback)

Slide 10: Required C&M Systems: Auxiliary
DATE Status
– Run state of the global DAQ system (armed, taking data, idle, error, etc.)
Network Status
– State of each PC on the DAQ & Control Networks (link up/down, correct ID, necessary services running)

Slide 11: Required C&M System Features
Reliability
– Need months/years of uninterrupted runtime (no C&M = no experiment)
– Need effective help & support when problems occur, plus the ability to resolve issues without waiting for vendor support (availability of source code)
Scalability
– Facility to add new systems to the existing setup with minimal development overhead/performance impact

Slide 12: Required C&M System Features
Ease of use
– But extensible for arbitrarily complex custom systems
Error reporting
– Clear alarms when fault conditions occur
Data archiving
– Permanent & accessible record of all significant monitoring parameters

Slide 13: EPICS
Collaboration
– Worldwide collaboration sharing designs, software & expertise for implementing large-scale control systems
Control System Architecture
– Client/server model with an efficient communication protocol (Channel Access) for passing data
– Distributed real-time database of machine values
Software Toolkit
– Collection of software tools which can be integrated to provide a comprehensive & scalable control system

Slide 14: Why Use EPICS for MICE?
Free, open-source software
Mature packages
– Stable, robust infrastructure
Widely used in scientific laboratories & industry
– Significant expertise available
– Good support/documentation
Collaboration members with prior EPICS experience (e.g. Fermilab → DØ)
Many tools available for common C&M tasks
– (Relatively) easy to develop new applications

Slide 15: EPICS Fundamentals
Each control/status parameter is represented by a 'Process Variable' (PV)
– PV: a named piece of data (e.g. a temperature) with a set of attributes (e.g. safe operating limits)
Channel Access (CA) servers provide access to PVs
CA clients read/write PVs to perform control & monitoring tasks
Network-based, distributed system
– PVs can be spread over multiple servers, accessed transparently over the network
[Diagram: control & monitoring applications acting as Channel Access clients, connected over the network to Channel Access servers hosting Process Variables]
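To make the PV read/write model concrete, here is a minimal sketch of a CA client written against the C Channel Access API from EPICS base (valid C++, in the same spirit as the custom C++ clients mentioned later). The PV name MICE:TOF:PMT01:HV and the setpoint value are purely illustrative assumptions, not actual MICE names.

// Minimal Channel Access client sketch: read a PV, then write a new setpoint.
// Build against EPICS base (link with libca and libCom).
#include <cadef.h>
#include <cstdio>

int main()
{
    const char *pvName = "MICE:TOF:PMT01:HV";   // hypothetical PV name
    chid channel;
    double value = 0.0;

    // Initialise the CA client context and connect to the PV.
    SEVCHK(ca_context_create(ca_disable_preemptive_callback), "ca_context_create");
    SEVCHK(ca_create_channel(pvName, NULL, NULL, CA_PRIORITY_DEFAULT, &channel),
           "ca_create_channel");
    SEVCHK(ca_pend_io(5.0), "connect timeout");

    // Read the current value.
    SEVCHK(ca_get(DBR_DOUBLE, channel, &value), "ca_get");
    SEVCHK(ca_pend_io(5.0), "ca_get timeout");
    printf("%s = %f\n", pvName, value);

    // Write a new value (e.g. a revised HV setpoint).
    double setpoint = 1500.0;
    SEVCHK(ca_put(DBR_DOUBLE, channel, &setpoint), "ca_put");
    SEVCHK(ca_flush_io(), "ca_flush_io");

    ca_clear_channel(channel);
    ca_context_destroy();
    return 0;
}

The client never needs to know which server hosts the PV or what hardware sits behind it; name resolution and transport are handled entirely by Channel Access.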

Slide 16: EPICS Servers
EPICS servers typically connect to hardware
– PVs often correspond directly to physical device states or control settings
Two types of server:
– Input/Output Controller (IOC)
– Portable CA server
– Approximately equal numbers of each type are used in MICE
[Diagram: IOC (Channel Access, record database, device/driver support) or Portable CA server (Channel Access plus custom code), each connecting hardware to the network]

Slide 17: EPICS Servers (IOC)
System configuration & hardware I/O connections defined by a PV record database
Existing driver support for many devices & standard communication interfaces
Server creation is often a matter of database configuration rather than software development
Runs on vxWorks/RTEMS boards, or on PCs (Linux, Windows, Mac, etc.)
Suitable for most 'standard' hardware systems
[Diagram: IOC structure (Channel Access, record database, device/driver support) connecting hardware to the network]
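As an illustration of what such a record database looks like, the sketch below defines a single analog-input record with alarm limits. The record name, field values and use of the Soft Channel device support are assumptions chosen for the example, not actual MICE database content; a real IOC would name the appropriate device support in DTYP.

record(ai, "MICE:TOF:PMT01:TEMP") {
    field(DESC, "TOF PMT temperature")   # description shown by client tools
    field(SCAN, "1 second")              # process (read) the record once per second
    field(DTYP, "Soft Channel")          # placeholder; real hardware needs real device support
    field(EGU,  "C")                     # engineering units
    field(HIGH, "40")                    # minor alarm threshold
    field(HSV,  "MINOR")
    field(HIHI, "50")                    # major alarm threshold
    field(HHSV, "MAJOR")
}

Loading such a file at IOC startup (with dbLoadRecords) is enough to publish the PV to any CA client on the network, with no additional application code.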

Slide 18: EPICS Servers (Portable CA Server)
C++ library containing the CA & PV framework
Use with arbitrary hardware-access code to produce fully custom servers
MICE wrapper framework greatly simplifies creation of portable servers
Runs on PCs (Linux, Windows, Mac, etc.)
Suitable for non-standard hardware & complex software architectures
[Diagram: Portable CA server structure (Channel Access plus custom code) connecting hardware to the network]

Slide 19: EPICS Clients
Clients provide the (user) interface to the C&M system
– Abstracted from hardware → clients only need to access PVs
Packages available to facilitate building control/monitoring displays
– Extensible Display Manager (EDM), Motif Editor and Display Manager (MEDM), etc.
– Interactive graphical environments enabling PVs to be attached to control/display widgets
– No low-level programming required
– ~60% of MICE C&M systems use EDM

Slide 20: EPICS Clients
Not limited to standard GUI builders: CA bindings exist for many languages/tools
– C/C++, Java, MATLAB, Perl, Python, LabVIEW, etc.
– ~30% of MICE C&M systems use a custom C++ wrapper, with Qt GUIs
Well-established client tools provide core C&M functionality
– Alarm Handler, Channel Archiver, etc.
– Minimises required infrastructure development effort
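As a sketch of what the C/C++ binding looks like for monitoring (the programmatic counterpart of attaching a PV to a display widget), the fragment below subscribes to value updates on a PV and prints each change. The PV name MICE:CKOV:HUMIDITY is a hypothetical example.

// Sketch: monitor a PV and print every value update.
#include <cadef.h>
#include <cstdio>

// Called by Channel Access whenever the PV's value changes.
static void onUpdate(struct event_handler_args args)
{
    if (args.status == ECA_NORMAL && args.type == DBR_DOUBLE) {
        const double *pValue = static_cast<const double *>(args.dbr);
        printf("%s = %f\n", ca_name(args.chid), *pValue);
    }
}

int main()
{
    chid channel;
    evid subscription;

    SEVCHK(ca_context_create(ca_disable_preemptive_callback), "ca_context_create");
    SEVCHK(ca_create_channel("MICE:CKOV:HUMIDITY", NULL, NULL,   // hypothetical PV
                             CA_PRIORITY_DEFAULT, &channel),
           "ca_create_channel");
    SEVCHK(ca_pend_io(5.0), "connect timeout");

    // Request a callback on every value or alarm change.
    SEVCHK(ca_create_subscription(DBR_DOUBLE, 1, channel, DBE_VALUE | DBE_ALARM,
                                  onUpdate, NULL, &subscription),
           "ca_create_subscription");

    // Hand control to Channel Access; callbacks fire inside ca_pend_event().
    while (true) {
        ca_pend_event(1.0);
    }
    return 0;
}

Tools such as the Alarm Handler and Channel Archiver are built on the same subscription mechanism, which is why they work unchanged with any PV regardless of the server behind it.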

Slide 21: C&M Systems Overview