Controls & Monitoring Status Update
J. Leaver, 05/11/2009
Infrastructure
21/01/2016, Imperial College
Infrastructure Issues
– Control PCs & servers
– Application management: client-side application launcher; soft IOC run control
– Configuration Database (CDB): EPICS interface; ensuring the validity of recorded run parameters
– Protocols for updating & maintaining software on control PCs (see PH's talk)
– Alarm Handler (see PH's talk)
– Channel Archiver (see PH's talk)
– Remote Access (see PH's talk)
Control PCs & Servers: Current Status
Machines: miceserv1, miceecserv, miceopipc1, micecon1, target2, miceiocpc1, target1ctl, miceioc2, miceioc4, miceioc1, miceioc5
Control PCs & Servers: Current Status
General purpose server PC
– Currently runs EPICS servers for: FNAL BPMs; DATE Status; Config. DB User Entry Data; CKOV Temp. & Humidity Monitor; DSA Neutron Monitor
– Will also run EPICS servers for: Network Status; CKOV + TOF CAEN HV Supplies
Control PCs & Servers: Current Status
Target DAQ & Control PC
– Currently runs: Target / Beam Loss Monitor DAQ
– Will run EPICS servers for: Beam Loss Monitor; Target Controller
Control PCs & Servers: Current Status
Target Drive IOC (vxWorks)
– EPICS server for Target PSU & extraction motor
Control PCs & Servers: Current Status
Beamline Magnets IOC (vxWorks)
– EPICS server for Q1-9, D1-2
Control PCs & Servers: Current Status
Decay Solenoid IOC (vxWorks)
Control PCs & Servers: Current Status
Linde Refrigerator IOC (PC)
Control PCs & Servers: Current Status
miceecserv: 'EPICS Client' Server PC
– Runs all client-side control & monitoring applications
– Runs infrastructure services: Alarm Handler; Channel Archiver
Large wall-mount display shows:
– Alarm Handler panel
– Log message viewer
– The display may also be used to show any (non-interactive) panel containing information that must be monitored for the duration of a specific run
Control PCs & Servers: Current Status
miceserv1: Gateway / Archiver Web Server PC
– Runs the Channel Access Gateway, providing read-only access to PVs between the MICE Network & heplnw17
– Runs a web server enabling read-only access to Channel Archiver data
– Currently running a non-standard OS for control PCs; will be reformatted after the current November/December run period (see PH's talk)
Control PCs & Servers: Current Status
miceopipc1: General purpose Operator Interface PC
– Primary access point for users to interact with control & monitoring panels
– Essentially a 'dumb' X server: runs all applications via SSH from miceecserv
Control PCs & Servers: Current Status
micecon1 & target2: Additional General purpose Operator Interface PCs
– Currently running a non-standard OS for control PCs
– Usable, but not optimally configured
– Cannot be disturbed at present; will be reformatted after the current November/December run period (see PH's talk)
– Shall be renamed miceopipc2 & miceopipc3
Application Launcher
The new application launcher replaces the DL TCL script:
– XML configuration file; easy to add items
– Unlimited subcategory levels
– Provides a real-time display of application status
– Configurable response to existing application instances
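An XML configuration with unlimited subcategory levels maps naturally onto a recursive parse. The sketch below is a minimal illustration of that idea; the element names, attributes, and `response` values are assumptions for illustration, not the actual MICE launcher schema.

```python
# Hypothetical sketch of launcher XML-config handling. Element and
# attribute names here are invented stand-ins for the real schema.
import xml.etree.ElementTree as ET

CONFIG = """
<launcher>
  <category name="Target">
    <category name="Monitoring">
      <app name="Beam Loss Viewer" command="blviewer" response="ignore"/>
    </category>
    <app name="Target Control" command="targetctl" response="inhibit"/>
  </category>
</launcher>
"""

def build_menu(element):
    """Recursively convert <category>/<app> elements into a nested dict,
    so subcategory levels can nest without limit."""
    menu = {}
    for child in element:
        if child.tag == "category":
            menu[child.attrib["name"]] = build_menu(child)
        elif child.tag == "app":
            menu[child.attrib["name"]] = {
                "command": child.attrib["command"],
                "response": child.attrib.get("response", "inhibit"),
            }
    return menu

menu = build_menu(ET.fromstring(CONFIG))
```

Adding a new item is then a one-line change to the XML file, with no code modification, which is presumably why the XML approach was preferred over the old TCL script.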
Application Launcher: App Status
Possible application states:
– Application is running
– Application is running, but was not executed by this launcher
– Application was previously launched by an external process, but is no longer running (return value unknown)
– Application was killed by a signal & is no longer running
– Application quit with an error code & is no longer running
Application Launcher: External App Response
Multiple application launchers will be operated simultaneously:
– On miceopipc1-3 (via SSH from miceecserv) & on miceecserv itself
– Need to ensure that shifters using different launchers do not 'conflict'
If an operator attempts to execute an application that is already running, the launcher has a configurable response:
– Ignore: launch another instance
– Inhibit: prevent another instance from running
– Kill: close the existing instance & run a new one (e.g. could be used for a 'master' override control panel)
Typical configuration:
– Only one instance of each 'control' application may run (cannot have multiple users modifying the same parameter!)
– Unlimited numbers of monitoring panels may be opened
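The three responses above can be sketched as a small decision function. This is an illustrative stand-in, not the launcher's real implementation: the `RUNNING` dict models whatever process table the real launcher consults, and real process kill/spawn calls are omitted.

```python
# Illustrative sketch of the documented Ignore / Inhibit / Kill responses
# to launching an application that is already running.
RUNNING = {}   # app name -> instance count (stand-in for a process table)

def launch(name, response):
    """Return the action taken: 'started', 'refused', or 'restarted'."""
    if RUNNING.get(name, 0) == 0:
        RUNNING[name] = RUNNING.get(name, 0) + 1
        return "started"
    if response == "ignore":      # launch another instance anyway
        RUNNING[name] += 1
        return "started"
    if response == "inhibit":     # refuse a second instance
        return "refused"
    if response == "kill":        # replace the existing instance
        RUNNING[name] = 1
        return "restarted"
    raise ValueError(f"unknown response policy: {response}")
```

With the typical configuration, control panels would be registered with `inhibit` (one writer per parameter) and monitoring panels with `ignore` (any number of readers).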
Soft IOC Management
The application launcher is primarily concerned with managing client-side control & monitoring panels running on miceecserv. Run control must also be implemented for the corresponding EPICS servers:
– 'Hard IOCs' running on vxWorks (i.e. servers provided by DL) are always 'on' → require no routine external intervention
– 'Soft IOCs' running on control PCs (i.e. servers produced within the Collaboration) are executed like any other application → require user control
Why can't soft IOCs just run at system start-up, like any other service?
– That assumes servers run perpetually, unattended – not true!
– Configuration files sometimes need to be modified, requiring a server restart
– Servers sometimes crash due to hardware problems, requiring a restart
– Hardware may need to be turned off or reconfigured – impossible while a soft IOC is running
– Shifters should not have to worry about Linux service management
Soft IOC Management
CmdExServer provides similar functionality to the normal application launcher, but with an EPICS interface:
– Each IOC PC runs an instance of the CmdExServer at start-up
– The CmdExServer manages local soft IOCs
– A client-side 'remote application launcher' communicates with all CmdExServers & allows the user to start/stop IOCs
Current server allocation:
– miceiocpc1: CmdExServer; FNAL BPM Server; DATE Status Server; Config. DB User Entry Data Server; Network Status Server; etc.
– micetk1pc: CmdExServer; AFEIIt Server
– target1ctl: CmdExServer; Beam Loss Monitor Server; Target Controller Server
NB: the current remote launcher configuration only includes a subset of the servers assigned to miceiocpc1; others will be added as they become available
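A per-PC manager that starts and stops configured server processes on request might look like the sketch below. This is an assumption-laden stand-in: in MICE the start/stop requests arrive over CmdExServer's EPICS interface, whereas here plain method calls stand in for that, and the "IOC" command is a placeholder process that just sleeps.

```python
# Minimal sketch of CmdExServer-style soft-IOC run control: one manager
# per IOC PC, starting and stopping configured server processes.
import subprocess
import sys

class SoftIocManager:
    def __init__(self, commands):
        self.commands = commands   # ioc name -> argv list
        self.procs = {}

    def start(self, name):
        """Start the named soft IOC; refuse if it is already running."""
        if name in self.procs and self.procs[name].poll() is None:
            return False
        self.procs[name] = subprocess.Popen(self.commands[name])
        return True

    def stop(self, name):
        """Stop the named soft IOC; refuse if it is not running."""
        proc = self.procs.get(name)
        if proc is None or proc.poll() is not None:
            return False
        proc.terminate()
        proc.wait()
        return True

# Placeholder 'IOC': a process that sleeps until terminated.
mgr = SoftIocManager({"fnal_bpm": [sys.executable, "-c",
                                   "import time; time.sleep(60)"]})
```

This also shows why start-up-time service management is insufficient: the explicit `stop` path is exactly what a shifter needs before reconfiguring hardware or editing a server's configuration file.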
Configuration Database: EPICS Interface
The custom EPICS PV backup & restore client is functionally complete:
– Enables manual backup & restore of set point values
– Automatically backs up set parameters when DATE signals end of run
Currently stores values in a local XML file archive:
– Automatic backup files are transferred to the CDB via SSH/SCP to heplnw17
– Temporary solution → will be replaced with direct SOAP XML transactions once RAL networking issues are resolved (a publicly accessible web server is needed on heplnw17)
– Restoration of parameters from the CDB will also be implemented once SOAP transfers are possible
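The temporary local-XML-archive step can be sketched as a simple serialise/deserialise pair. The PV names, XML element names, and values below are invented for illustration; a real client would read the set points over Channel Access rather than from a literal dict.

```python
# Sketch of set-point backup & restore via a local XML archive (the
# temporary, pre-SOAP solution). All names and values are illustrative.
import xml.etree.ElementTree as ET

def backup(setpoints):
    """Serialise a {pv_name: value} mapping to an XML string."""
    root = ET.Element("backup")
    for name, value in sorted(setpoints.items()):
        ET.SubElement(root, "pv", name=name, value=str(value))
    return ET.tostring(root, encoding="unicode")

def restore(xml_text):
    """Recover the {pv_name: value} mapping from an XML backup."""
    return {pv.attrib["name"]: float(pv.attrib["value"])
            for pv in ET.fromstring(xml_text)}

# Hypothetical beamline-magnet set points.
archive = backup({"MICE:Q1:CURRENT": 174.5, "MICE:D1:CURRENT": 420.0})
```

The same XML payload could later be handed to a SOAP transaction instead of being copied over SCP, which is presumably why an XML intermediate format was chosen.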
Configuration Database: User Entry
Not all parameters required for a CDB 'run' entry are available through normal EPICS channels:
– The relevant IOCs & integration with the DAQ are not yet complete
– Currently only beamline magnet currents can be backed up from 'live' servers
(Quasi-)temporary solution:
– A generic EPICS data server (the CDB Data Server on miceiocpc1) hosts PVs for all values missing from existing IOCs, so they can be read by the backup/restore client
– A user entry client allows the shifter to enter the required parameters before initiating a run
– As future work progresses, unnecessary user entry items will be removed; however, some degree of manual data entry will always be required
Ensuring the Validity of CDB Entries
It is vital that set point values remain fixed during each standard run: if a set point value recorded in the CDB does not represent the physical state of the system for the entire run, the data are invalid.
The following protocol ensures that invalid runs are correctly identified:
– The CDB data server hosts a run status PV
– The user entry client automatically sets run status to true when the user submits the current run parameters (at this stage, users should not modify set point values again until the run is complete)
– A dedicated monitor IOC checks all set point PVs while DATE is in the 'data taking' state → sets run status to false if any value changes (to do)
– The Alarm Handler monitors run status → immediately warns that the run is invalid if any user modifies a set point value (to do)
– run status is incorporated in the CDB run parameters record
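The monitor-IOC step of this protocol amounts to comparing live set points against the snapshot submitted at run start, and latching the run-status flag to false on any mismatch. The sketch below shows that logic only; PV access is replaced by plain dicts, and all names are illustrative.

```python
# Sketch of the run-validity check: latch run_status to False if any
# set point drifts from the snapshot taken when the run was submitted.
class RunValidityMonitor:
    def __init__(self, setpoints):
        self.snapshot = dict(setpoints)  # values submitted at run start
        self.run_status = True           # stand-in for the run status PV

    def scan(self, current):
        """Called periodically while DATE is in the 'data taking' state."""
        if self.run_status and current != self.snapshot:
            self.run_status = False      # run is flagged invalid
        return self.run_status

# Hypothetical run with a single magnet set point.
mon = RunValidityMonitor({"MICE:Q1:CURRENT": 174.5})
```

Note the latch: once a value has changed, the run stays flagged invalid even if the set point is later restored, since the recorded parameters no longer describe the whole run.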
Control & Monitoring Systems
21/01/2016Imperial College 24 C&M Systems Overview
C&M Systems Developed by Local MICE Community
Target: Controller
Target Controller Stage 1 upgrade underway:
– Hardware essentially complete
– Currently working on Controller firmware (P. Smith)
Software nearing completion:
– Hardware driver framework in place
– All EPICS server / client functionality implemented, tested using a 'virtual' Target Controller device
– Remaining task: write the low-level hardware driver plug-in once the firmware is complete
Stage 1 upgrade control functionality includes:
– Set delays
– Set / monitor Target depth
– Park / hold, start / stop actuation
– Monitor hardware status
Target: Beam Loss
Standalone DAQ system upgrades:
– Final algorithm selected for 'absolute' beam loss calculation (thanks to AD for the implementation)
– Standalone event viewer can now follow the real-time output of the DAQ
The Target Beam Loss IOC will read the local data archive written by the DAQ & serve the values as PVs:
– Enables integration with the Alarm Handler & readout of actuation numbers for CDB run data entries
– The IOC will use the same analysis & file access code as the standalone DAQ, via C wrapper functions
– See PH's talk
DATE Status
Need a mechanism for reporting the current DAQ state via EPICS:
– Required for user feedback, alarm handling & triggering automatic CDB run set point value backups
– A simple ('dumb') data server hosts a single DATE status PV
– A client application reads the DATE status from the DIM server & forwards the value to the EPICS server
– Server & display client are complete; DATE integration should be complete before the end of the Collaboration Meeting (JSG…?)
Network Status
Need to verify that all machines on the Control & DAQ networks are functional throughout MICE operation. There are two types of machine:
– Generic PC (Linux, Windows)
– Hard IOC (vxWorks)
The EPICS Network Status server contains one status PV for each valid MICE IP address.
Read status (PC):
– SSH into the PC (using key files, for security); verifies network connectivity & PC identity
– If successful, check the list of currently running processes for the required services
Read status (hard IOC):
– Check that a standard internal status PV is accessible, with valid contents (e.g. the 'TIME' PV, served by all MICE 'hard' IOCs)
Network Status
Server & client are functionally complete:
– The client displays the status of all PCs & hard IOCs, scanning at a user-specified period (with a 'check now' override)
Currently not configured for use on the MICE network:
– Need to compile the list of required services for each MICE PC
– Network specification not yet finalised; need to confirm how the status server will connect to PCs on both the Control & DAQ networks
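The two status checks described above can be sketched as follows. Real SSH and Channel Access calls are replaced by injected functions so the logic is self-contained; the host names, service lists, and the `None`-means-unreachable convention are all assumptions for illustration.

```python
# Sketch of the two network-status checks: process-list inspection for
# PCs, and a status-PV read-back for hard IOCs. All I/O is injected.
def pc_status(host, required, ssh_process_list):
    """A PC is 'up' if SSH succeeds and every required service is running.
    ssh_process_list(host) returns a process list, or None on SSH failure."""
    procs = ssh_process_list(host)
    if procs is None:
        return False
    return all(service in procs for service in required)

def hard_ioc_status(host, read_pv):
    """A hard IOC is 'up' if its internal status PV (e.g. TIME) reads back.
    read_pv(host, pv_name) returns the PV value, or None if unreachable."""
    return read_pv(host, "TIME") is not None

# Hypothetical network state: one PC with its expected services.
fake_procs = {"miceiocpc1": ["CmdExServer", "fnal_bpm_server"]}
status = pc_status("miceiocpc1", ["CmdExServer"], fake_procs.get)
```

Injecting the transport also makes the outstanding configuration work concrete: the per-host `required` lists are exactly the "list of required services for each MICE PC" that still has to be compiled.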
Other 'Local' C&M Systems
– TOF; CKOV; Diffuser; DSA Neutron Monitor; KL Calorimeter; Electron Muon Ranger
– Some of these are covered in PH's talk; the others remain unallocated
Daresbury C&M Status Update
A. Oates
Recent DL Controls Work
– New 'IOC controls server' installed: provides boot/configuration parameters for the DL hard IOCs (vxWorks) miceioc1, miceioc2 & miceioc4, and enabled the Online Group to take ownership of the old 'IOC controls server' (miceserv1) & its associated Channel Archiver duties
– Provided the Online Group with general EPICS assistance: EPICS architecture guidance; Channel Archiver web server installation
– Changed both Target Drive systems to incorporate new split rail power supplies
– Fault diagnosis & correction on Target 2 Drive controls in R78
– Prepared a quotation for the Cooling Channel Control System
– Completed the requested changes to the Magnet PSU databases
Schedule & Final Notes
Schedule
(Schedule charts not captured in the transcript)
Final Notes
Good progress on all fronts. The groundwork required to unify the MICE control systems & create a user-friendly interface is well underway:
– Alarm Handler & Channel Archiver operational
– Unified application launchers complete (the bad old days of logging into individual PCs & typing commands are over!)
– CDB interface functional
Lots of work still to do:
– The controls effort is 'significantly understaffed' (DAQ & Controls Review, June '09)
– We are always operating at full capacity
– 'Controls & Monitoring' is such a broad subject that many unexpected additional requirements crop up all the time
– Please be patient!