Controls and Monitoring Status Update (J. Leaver, 29/05/2009)

Infrastructure

Slide 3: Infrastructure Issues
General EPICS infrastructure
– EPICS server / client organisation
– Unification of control systems
Remote access
– Monitoring
– Controls
Configuration database
Schedule

Slide 4: EPICS Client / Server Overview
(Overview diagram)

Slide 5: EPICS Server / Client Organisation
Wide variety of EPICS server applications permitted
– Typically connect to physical hardware
Impossible to enforce common interface/processor/OS specifications
– Each server is maintained by the 'owner' of the respective control system
Strict central administration unnecessary – the 'end user' is only concerned with the availability of PVs on the network
EPICS clients also varied, but must be uniformly accessible
– Users should not have difficulty finding/launching clients
– Applications should be consistently organised/updated
– MICE Online Group (MOG) responsibility

Slide 6: EPICS Client Organisation
All client-side applications run on miceecserv
– Central installation repository greatly simplifies configuration/maintenance/backup
– MOG collates the individual applications, applies updates when available from the control system 'owners'
(Diagram: EPICS server applications (IOCs and a Portable CA Server) on the Controls Network; EPICS client applications on miceecserv, miceopi1 and miceopi2)

Slide 7: EPICS Client Organisation
Client control/monitoring GUIs are viewed directly on miceecserv, or on one of 2 'Operator Interface' PCs
– OPI PCs act as 'dumb terminals', running displays from miceecserv via SSH
(Diagram: as on slide 6)

Slide 8: Unification of Control Systems
At user level: a simple 'wrapper' GUI provides a menu for launching the individual client applications (see the sketch below)
At system level: employ 2 standard EPICS tools (running as background services on miceecserv)
– Alarm Handler: monitors all servers & warns operators of abnormal/dangerous conditions
– Channel Archiver: automatically records PV parameters to disk & provides several visualisation options
See PH's talk
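The wrapper GUI need only map menu entries to client executables. A minimal illustrative sketch in Python/Tkinter; the client names and paths are hypothetical placeholders, not the actual MICE applications:

```python
# Minimal client-launcher sketch: one button per control system GUI.
# Application names and paths are hypothetical placeholders.
import subprocess
import tkinter as tk

CLIENTS = {
    "Target Controller": ["/opt/mice/clients/target_gui"],
    "Beam Loss Monitor": ["/opt/mice/clients/beamloss_gui"],
    "Tracker AFEIIt":    ["/opt/mice/clients/afe2t_gui"],
}

root = tk.Tk()
root.title("MICE C&M Launcher")
for name, cmd in CLIENTS.items():
    # Each button spawns the client as an independent process, so
    # closing the launcher does not kill any running GUIs.
    tk.Button(root, text=name,
              command=lambda c=cmd: subprocess.Popen(c)).pack(fill="x")
root.mainloop()
```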

Slide 9: User Interface

Slide 10: User Interface
(Screenshot: large wall-mounted display showing the Alarm Handler, a message log, and any important parameters for the current run)

Slide 11: User Interface
(Screenshot: standard desktop monitor showing the client application launcher and a client GUI)

Slide 12: User Interface
(Screenshot: display connected to miceecserv)

Slide 13: User Interface
(Screenshot: displays connected to miceopi1 and miceopi2)

Slide 14: Remote Monitoring: General Principles
Remote users should have a simple, easily accessible interface for routine monitoring
'Expert' remote users should have access to monitoring displays which match those in the MLCR
No machine on the Controls Network should be directly accessible over the internet
System load generated by remote monitoring should have minimal impact on control & monitoring services

Slide 15: Remote Monitoring: Web Server
(Architecture diagram: EPICS IOCs and a Portable CA Server on the Controls Network feed the Channel Archiver on miceecserv; the PV archive is served via a Data Server and a CGI-exporting web server (NFS mount) on the PPD Network, reaching web browsers and the Java Archive Viewer over the internet through the RAL Gateway)

Slide 16: Remote Monitoring: Direct PV Access
Could recreate the normal client displays using a web interface, but this would involve impractical development overheads
– Instead, provide direct read-only access to PVs so the actual client GUIs may be run remotely
(Diagram: a read-only CA Gateway exposes Controls Network PVs, via the RAL Gateway, to standard client GUIs running on remote PCs)

Slide 17: Remote Monitoring: Direct PV Access
The CA Gateway makes PVs available across subnets (with full access control), while minimising the load on the underlying servers (see the sketch below)
To simplify end-user support, a virtual machine disk image containing EPICS + all client applications will be made available
(Diagram: as on slide 16)
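From the remote side, reaching the read-only gateway is a matter of pointing Channel Access at it before connecting. A minimal sketch assuming the pyepics bindings, with a hypothetical gateway hostname and PV name:

```python
# Read a PV through the (read-only) CA Gateway from outside the
# Controls Network. The hostname and PV name are hypothetical.
import os

# Point Channel Access at the gateway only, before the CA context is
# created (i.e. before the first pyepics call).
os.environ["EPICS_CA_ADDR_LIST"] = "cagateway.pp.rl.ac.uk"
os.environ["EPICS_CA_AUTO_ADDR_LIST"] = "NO"

import epics  # pyepics

value = epics.caget("MICE:TARGET:DEPTH")    # reads succeed
print("Target dip depth:", value)

ok = epics.caput("MICE:TARGET:DEPTH", 0.0)  # writes are refused by the
print("Write accepted?", ok)                # gateway's access rules
```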

Slide 18: Remote Control
Where possible, operations affecting the state of any MICE system should only be performed within the MLCR
– Remote users accessing controls can lead to unknown/unexpected running conditions, so this should be discouraged
If necessary, off-site experts will be permitted to run control client applications on miceecserv, via SSH through the RAL Gateway
– Each expert will have an account on miceecserv which only contains the client applications for their designated system

Slide 19: Configuration Database
Necessary to integrate the control systems with the central MICE configuration database:
1) Read set point values from the database
2) Upload PV values to the EPICS servers
3) Modify PVs with the client GUIs
4) Download PV values from the EPICS servers
5) Write new set point values to the database
For (2) & (4), propose use of the standard EPICS Backup & Restore Tool (BURT)
– Backs up/restores PV values to/from snapshot files

Slide 20: Configuration Database
BURT snapshot files may be written in 'Self-Describing Data Sets' (SDDS) format
For (1) & (5), propose development of an application to write/read database values to/from SDDS files (see the sketch below)
– A C API for generating SDDS snapshots is provided with BURT
– C/C++ APIs for the database (PostgreSQL) are available
NB: the configuration database interface is still in very early planning stages; details to be discussed/decided
– Have not rejected the possibility of developing a custom backup/restore client which accesses the database directly
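As an illustration of step (1), the sketch below reads set points from PostgreSQL and writes a simplified name/value snapshot file. The connection details, table and column names are hypothetical, and the output deliberately ignores the full SDDS syntax; the production tool would use the C APIs noted above:

```python
# Sketch of step (1): read set points from the configuration database
# and write them to a snapshot file for upload to the EPICS servers.
# Connection details and table/column names are hypothetical; the
# output is a simplified name/value format, not full SDDS as used by BURT.
import psycopg2

conn = psycopg2.connect(host="micecdb", dbname="configdb", user="cm_reader")
cur = conn.cursor()
cur.execute(
    "SELECT pv_name, set_value FROM setpoints WHERE config_tag = %s",
    ("run_2009_05",))

with open("setpoints.snap", "w") as snap:
    for pv_name, set_value in cur.fetchall():
        snap.write("%s %s\n" % (pv_name, set_value))

cur.close()
conn.close()
```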

Slide 21: Infrastructure Schedule
(Schedule chart)

Control & Monitoring Systems

Slide 23: C&M Systems Overview
(Overview diagram)

C&M Systems Developed by Local MICE Community

Slide 25: Target: Controller
The existing Target Controller system is stable/reliable, but only has a 'push button' interface & limited upgradeability
Currently undergoing a complete redesign to increase functionality and enable PC control
Based on the USBDAQ
– Contains a 1M-gate FPGA
– USB interface for PC communication
Will be fully integrated with EPICS
Responsible for system: Paul Smith (UOS); James Leaver (IC)
Responsible for EPICS C&M: James Leaver (IC)
Due: July 2009; Dec 2009

Slide 26: Target: Controller
In the hardware/firmware design stage; EPICS development not yet commenced
Stage 1 upgrade will be complete end of July 2009
– Interfaces the USBDAQ with the existing analogue electronics
– EPICS C&M system recreating the current 'push button' controls (actuation, target dip depth, timing)
Stage 2 upgrade to be completed end of December 2009
– Redesign of the analogue electronics
– Enables fine control of subsystems
Responsible for system: Paul Smith (UOS); James Leaver (IC)
Responsible for EPICS C&M: James Leaver (IC)
Due: July 2009; Dec 2009

Slide 27: Target: Beam Loss
The beam loss IOC reads the local data archive written by the DAQ system
Clients provide a virtual scope display, history plots & analysis
System functionally complete, but requires final selection of the algorithm for calculating 'absolute' beam loss
Responsible for system: Paul Smith; Paul Hodgeson (UOS); James Leaver (IC)
Responsible for EPICS C&M: Pierrick Hanlet (IIT)
Due: Now; Sep 2009

Slide 28: FNAL Beam Profile Monitors
EPICS server/client applications complete
Well tested; used for monitor calibration procedures
Responsible for system: Alan Bross (FNAL)
Responsible for EPICS C&M: James Leaver (IC)
Due: Now

Slide 29: Cherenkov System
Responsible for system: Lucien Cremaldi; David Sanders (OLEMISS)
Responsible for EPICS C&M: Pierrick Hanlet (IIT)
Due: Sep 2009

Slide 30: Tracker: Magnetic Field Probes
NIKHEF Hall probes will be installed
– In the homogeneous region of the Tracker volume
– At the Z-edges of the Tracker volume
– Outside the solenoids (backup check of field polarity)
Hall probes are read out via a CAN interface using a Windows application
A Portable CA server reads the parameters from the Windows PC via a network socket (see the sketch below)
Monitors the B-field (X, Y, Z components) + probe temperature
Responsible for system: Frank Filthaut (RUN)
Due: Nov 2009
(Diagram: Hall Probes → CAN bus → standalone probe interface (Windows PC) → network socket → EPICS server (Linux PC))
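A sketch of the Linux-side server, assuming a simple line-oriented text protocol from the Windows probe interface and using the pcaspy Portable CA Server wrapper as a Python stand-in; the protocol, hostname and PV names are all assumptions:

```python
# Portable-CA-server-style sketch: poll the Windows probe interface
# over a TCP socket and publish the B-field components + temperature
# as PVs. The one-line "bx by bz temp" text protocol, hostname and
# PV names are assumptions for illustration.
import socket
from pcaspy import SimpleServer, Driver

PREFIX = "MICE:TRACKER:HALL1:"
PVDB = {"BX": {"prec": 4}, "BY": {"prec": 4},
        "BZ": {"prec": 4}, "TEMP": {"prec": 2}}

class ProbeDriver(Driver):
    def __init__(self):
        Driver.__init__(self)
        # Line-buffered connection to the Windows probe application.
        self.buf = socket.create_connection(("probe-pc", 5050)).makefile()

    def poll(self):
        # Read one measurement line and push the values to the PVs.
        bx, by, bz, temp = (float(v) for v in self.buf.readline().split())
        for name, val in zip(("BX", "BY", "BZ", "TEMP"), (bx, by, bz, temp)):
            self.setParam(name, val)
        self.updatePVs()

server = SimpleServer()
server.createPV(PREFIX, PVDB)
driver = ProbeDriver()
while True:
    server.process(0.1)  # service Channel Access requests
    driver.poll()        # NB: a production server would read in a
                         # separate thread, so a stalled probe link
                         # cannot block CA processing
```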

Slide 31: Tracker: Magnetic Field Probes
C&M system functionally complete
– Just requires error-handling refinements & definition of alarm limits
To be installed at RAL in November 2009
– Dependent on the Tracker schedule; could change
No dedicated client will be written; it is sufficient to display the parameters via the Channel Archiver Data Server
Responsible for system: Frank Filthaut (RUN)
Due: Nov 2009

Slide 32: Tracker: AFEIIts
AFEIIt configuration, control & monitoring software complete
Finalisation of DATE integration details required
– Need a DATE-side client to enable/disable triggers (i.e. run control)
Responsible for system: Alan Bross (FNAL)
Responsible for EPICS C&M: James Leaver (IC); Jean-Sebastien Graulich (UNIGE)
Due: Now; June 2009

Slide 33: Tracker: AFEIIt Infrastructure
'Infrastructure' corresponds to the miscellaneous auxiliary hardware associated with the AFEIIts
– Somewhat ill-defined, since most hardware (AFEIIt cryo systems & safety interlocks) is integrated with the Spectrometer Solenoid controls
Currently require C&M for the AFEIIt power supplies
– 4 Wiener PSUs (1 per cryo)
– CAN bus or RS232 communication interface; intend to use RS232 for simplicity
– No progress yet; expect manpower to be available for completion in August
Additional C&M requirements may develop (to be discussed)
Responsible for system: Alan Bross (FNAL)
Responsible for EPICS C&M: James Leaver (IC)
Due: Aug 2009; TBD

Slide 34: Hydrogen Absorbers: Focus Coils
The Absorber Focus Coils are expected to require C&M systems very similar to the Pion Decay Solenoid & Spectrometer Solenoids
It would be most efficient for DL to take over the project (wealth of relevant expertise)
– Unfortunately prevented by MICE funding constraints
– Task assigned to the MOG
Responsible for system: Wing Lau (OU)
Responsible for EPICS C&M: Pierrick Hanlet (IIT); TBD
Due: May 2010; TBD

Slide 35: Hydrogen Absorbers: Focus Coils
If possible, will attempt to use DL's existing magnet designs as a template
– DL C&M systems have vxWorks IOCs; for MICE to develop vxWorks software, an expensive (~£15.2K) licence is required. Investigate replacement with RTEMS controllers (a 'similar' real-time OS, free to develop for)
– DL systems include custom in-house hardware, not available for general MICE usage; will check alternatives
However, will consider the possibility of an entirely new design (perhaps with Linux PC-based IOCs)
Responsible for system: Wing Lau (OU)
Responsible for EPICS C&M: Pierrick Hanlet (IIT); TBD
Due: May 2010; TBD

Slide 36: Hydrogen Absorbers: Focus Coils
Work on the Focus Coil C&M system has not yet commenced
– Need to confirm the availability of PH
– Assistance from the FNAL Controls Group would be highly beneficial; need to discuss
Expect to start the project in September 2009
Responsible for system: Wing Lau (OU)
Responsible for EPICS C&M: Pierrick Hanlet (IIT); TBD
Due: May 2010; TBD

Slide 37: RF Cavities: Coupling Coils
The Cavity Coupling Coil C&M situation is identical to the Focus Coils
– Similar requirements to the other MICE magnets
– MOG responsibility (need to confirm PH's availability)
– Project should run in parallel with the Focus Coil C&M system
Responsible for system: Derun Li; Steve Virostek (LBNL)
Responsible for EPICS C&M: Pierrick Hanlet (IIT); TBD
Due: Sep 2010; TBD

Slide 38: DATE Status
Need a mechanism for reporting the current DAQ state via EPICS
A simple ('dumb') data server hosts the DATE status PV
A client application reads the DATE status from the DIM server and forwards the value to the EPICS server (see the sketch below)
Server & display client complete; the DATE-side client remains to be implemented
Responsible for system: Jean-Sebastien Graulich (UNIGE)
Responsible for EPICS C&M: James Leaver (IC); Jean-Sebastien Graulich (UNIGE)
Due: Jun 2009
(Diagram: DATE client → EPICS data server hosting a single 'status' PV)
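A sketch of the forwarding client, with the DIM read stubbed out as a hypothetical helper and an assumed PV name:

```python
# DATE-status forwarder sketch: poll the DAQ run state and mirror it
# into the EPICS 'status' PV. get_date_status() is a hypothetical
# stand-in for the real DIM client call; the PV name is assumed.
import time
import epics  # pyepics

STATUS_PV = "MICE:DAQ:DATE:STATUS"

def get_date_status():
    """Placeholder for the DIM subscription returning the DATE
    run-control state, e.g. 'stopped', 'configured' or 'running'."""
    raise NotImplementedError

while True:
    epics.caput(STATUS_PV, get_date_status())
    time.sleep(1.0)  # 1 Hz is ample for a run-state display
```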

Slide 39: Network Status
Need to verify that all machines on the DAQ & control networks are functional throughout MICE operation
Two types of machine
– Generic PC (Linux, Windows)
– 'Hard' IOC (vxWorks, possibly RTEMS)
The EPICS Network Status server contains one status PV for each valid MICE IP address
Responsible for system: Anyone with a PC/IOC in the MLCR/Hall
Responsible for EPICS C&M: James Leaver (IC)
Due: Aug 2009

Slide 40: Network Status
Read status, PC:
– SSH into the PC (verifies network connectivity & PC identity)
– If successful, check the list of currently running processes for the required services
Read status, 'hard' IOC:
– Check that the standard internal status PV is accessible, with valid contents (e.g. the 'TIME' PV, served by all MICE 'hard' IOCs)
Both checks are sketched below.
Responsible for system: Anyone with a PC/IOC in the MLCR/Hall
Responsible for EPICS C&M: James Leaver (IC)
Due: Aug 2009
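Both checks reduce to a few lines each; a sketch assuming pyepics, with a hypothetical identity-file path, service name and host list:

```python
# Network-status sketch: a PC is 'up' if SSH succeeds, its identity
# file is present and the required service is running; a 'hard' IOC
# is 'up' if its internal TIME PV answers. The identity-file path,
# service name and hostnames are assumptions.
import subprocess
import epics  # pyepics

def pc_alive(host, service="date_readout"):
    # Single SSH round trip: confirm the identity file, then look for
    # the required service in the process list (both must succeed).
    cmd = ["ssh", "-o", "ConnectTimeout=5", host,
           "cat /etc/mice-id && pgrep -x %s" % service]
    return subprocess.call(cmd, stdout=subprocess.DEVNULL,
                           stderr=subprocess.DEVNULL) == 0

def ioc_alive(pv_prefix):
    # 'Hard' IOCs serve a standard status PV; valid contents within
    # the timeout mean the IOC is reachable and running.
    return epics.caget(pv_prefix + ":TIME", timeout=2.0) is not None

for host in ("miceecserv", "miceopi1", "miceopi2"):
    print(host, "OK" if pc_alive(host) else "DOWN")
```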

Slide 41: Network Status
Currently have a working prototype
– EPICS server connects to the PCs via SSH, checks the contents of a 'key' ID file
– Client displays the status of all PCs, scanning at a user-specified period (with a 'check now' override)
Need to add service checking & 'hard' IOC support
Responsible for system: Anyone with a PC/IOC in the MLCR/Hall
Responsible for EPICS C&M: James Leaver (IC)
Due: Aug 2009

Slide 42: Unassigned Control Systems
The following systems currently have no allocated C&M effort:
– Time of Flight system
– Diffuser
– Calorimeter system
Request help from the MICE community to identify the system requirements
Need to find additional C&M resources
– MOG operating at full capacity & no funds for DL to undertake these projects
– Expect those responsible for each system will be required to implement the corresponding EPICS controls
– Assistance from the FNAL Controls Group would be welcome (to be discussed)

Slide 43: MICE Community C&M Projects Schedule
(Schedule chart)

C&M Systems Developed by Daresbury

Slide 45: Target: Drive
Significant work required for the Target upgrade
– Additional temperature sensors
– Split power supply to reduce current → duplication of C&M components
On schedule for Target installation
Responsible for system: Paul Smith; Paul Hodgeson; Chris Booth (UOS)
Responsible for EPICS C&M: Adrian Oates; Graham Cox (DL)
Due: Aug 2009

Slide 46: Beamline Magnets
C&M system complete
– DL provides ongoing support & maintenance
Responsible for system: Martin Hughes (RAL)
Responsible for EPICS C&M: Peter Owens (DL)
Due: Now

Slide 47: Pion Decay Solenoid
C&M system complete
– DL provides ongoing support & maintenance
Responsible for system: Mike Courthold (RAL)
Responsible for EPICS C&M: Adrian Oates; Graham Cox (DL)
Due: Now

Slide 48: Tracker: Spectrometer Solenoids
Controls rack layout essentially complete
– Associated wiring diagrams ~50% complete; require ~4 weeks' work
– Rack, cabling, distribution costs: ~£5K
C&M system to follow the standard DL design
– Controls interface hardware costs: ~£13K
– Software development effort: ~0.4 man-years
Responsible for system: Steve Virostek (LBNL)
Responsible for EPICS C&M: Adrian Oates; Graham Cox (DL)
Due: Possibly Sep 2009

Slide 49: Tracker: Spectrometer Solenoids
Work currently halted due to budget constraints; 3 options:
– Allow DL to complete the project: requires ~£18K capital + man-years of effort
– Take DL's current design & complete it within the collaboration: requires ~£18K capital + ~£15.2K vxWorks developer licence + man-years of effort; insufficient MICE manpower available…
– Discard DL's design & start over within the collaboration: unknown capital requirements (likely ~£18K) + ~1.5 man-years of effort; insufficient MICE manpower available…
Responsible for system: Steve Virostek (LBNL)
Responsible for EPICS C&M: Adrian Oates; Graham Cox (DL)
Due: Possibly Sep 2009

Slide 50: Tracker: Spectrometer Solenoids
The only reasonable option: provide DL with funds to complete the project
Cannot pay for the work out of the UK budget
– Possibly utilise the common fund? AB currently in negotiations with MZ
Must decide on a course of action (preferably before the end of CM24)
Responsible for system: Steve Virostek (LBNL)
Responsible for EPICS C&M: Adrian Oates; Graham Cox (DL)
Due: Possibly Sep 2009

Slide 51: H2 Absorbers: Hydrogen System
DL have acquired the necessary safety training
Started evaluating PLC systems
Very early stages of development
– However, full funding already allocated
– No immediate problems
Responsible for system: Yury Ivanyushenkov; Tom Bradshaw (RAL)
Responsible for EPICS C&M: Adrian Oates; Graham Cox (DL)
Due: May 2010…?

Slide 52: RF Cavities: RF System
Andy Moss has the system well under control
First, a local amplifier PLC to monitor safety interlocks; software development by Chris White
– When installed at RAL, intend to implement EPICS readout via Ethernet or RS232
Second, a Low Level RF (LLRF) system
– Reads power levels from the 3 amplifiers + cavity probe signals
– Implements an amplifier drive feedback loop to stabilise RF phase & amplitude
– LLRF cards designed by Larry Doolittle (LBNL); the corresponding IOC to be written by Dimity Tettyleman
DL to build the LLRF cards + IOC crate
No contracts placed yet, but expect to test the LLRF system on the DL RF test stand before the end of the year
Other details to be finalised
Responsible for system: Andy Moss (ASTeC)
Responsible for EPICS C&M: Dimity Tettyleman (LBNL); Adrian Oates; Graham Cox (DL)
Due: Sep 2010…?

Important Points & Actions for the MICE Community

Slide 54: Items Which Require Action!
Must find resources within the MICE community to complete EPICS C&M systems for:
– Time of Flight system
– Diffuser
– Calorimeter system
Must resolve the issue of funding for DL's work on the Spectrometer Solenoids
PH's contract expires very soon…
– He is essential to the success of the Online Group
– If he is not re-employed, we won't have: the Alarm Handler, Channel Archiver, remote parameter monitoring, or C&M systems for CKOV, Focus Coils, Coupling Coils, etc.