
1 MICE CM June 2009 / Jean-Sebastien Graulich / Slide 1
Online Summary
Jean-Sebastien Graulich, Geneva
o Achievements Since CM23
o Control Room
o Controls
o Detector DAQ
o Open Issues
o Summary of the summary

2 MICE CM June 2009 / Jean-Sebastien Graulich / Slide 2
Achievements since CM23
 Network in MLCR redesigned
 Target DAQ integrated into Detector DAQ
 EPICS Archiver and Alarm Handler implemented
 EPICS Client Server installed in MLCR
   First steps toward Unified Control System
 Progress in Control and DAQ Interface
 Progress in CAM and DAQ Interface with Configuration DataBase
 Interface between DAQ and G4MICE improved
 Problem with Event building understood

3 MICE CM June 2009 / Jean-Sebastien Graulich / Slide 3
MLCR Network upgrade
 Craig McWaters and Mike Courthold sorted out the chaotic situation
 MiceNet (172.16.246.XXX) used for both:
   DAQ: 172.16.246.1 -> 172.16.246.99
   CAM: 172.16.246.100 -> 172.16.246.253
 All DAQ machines moved to MiceNet
 Structure created for all future machines
 Strong safety policy enforced:
   Strong passwords (ask your MOM)
   Non-standard port for ssh
   No root ssh
 + We now have a printer in the MLCR!
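The DAQ/CAM split of the MiceNet address space can be expressed as a small lookup. A minimal sketch, assuming only the host-number ranges shown on the slide (the function name is illustrative):

```python
from ipaddress import IPv4Address, ip_network

MICENET = ip_network("172.16.246.0/24")
DAQ_HOSTS = range(1, 100)      # 172.16.246.1   -> 172.16.246.99
CAM_HOSTS = range(100, 254)    # 172.16.246.100 -> 172.16.246.253

def micenet_role(addr: str) -> str:
    """Classify an address as a DAQ or CAM machine on MiceNet."""
    ip = IPv4Address(addr)
    if ip not in MICENET:
        return "not on MiceNet"
    host = int(ip) - int(MICENET.network_address)  # last octet on a /24
    if host in DAQ_HOSTS:
        return "DAQ"
    if host in CAM_HOSTS:
        return "CAM"
    return "unassigned"  # .0 (network) and .254/.255 fall outside both ranges
```

For example, `micenet_role("172.16.246.42")` returns `"DAQ"`, while `172.16.246.150` falls in the CAM block.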

4 MICE CM June 2009 / Jean-Sebastien Graulich / Slide 4
EPICS Client / Server Overview

5 Imperial College / Slide 5
Network Status
 Currently have a working prototype:
  – EPICS server connects to PCs via SSH, checks contents of 'key' ID file
  – Client displays status of all PCs, scans at a user-specified period (with 'check now' override)
 Need to add service checking & 'hard' IOC support
Responsible for System: Anyone with a PC/IOC in the MLCR/Hall
Responsible for EPICS C&M: James Leaver (IC)
Date Due: Aug 2009
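The prototype's scan loop can be sketched as follows. This is an illustration only: the key-file path, the expected ID string, and the callback shape are assumptions, not the actual Imperial implementation.

```python
import subprocess
from typing import Callable, Dict, Iterable

KEY_FILE = "/etc/mice-id"   # hypothetical location of the 'key' ID file
EXPECTED_ID = "MLCR-OK"     # hypothetical expected file contents

def probe_via_ssh(host: str) -> str:
    """Read the ID file over SSH; BatchMode avoids hanging on a password prompt."""
    result = subprocess.run(
        ["ssh", "-o", "BatchMode=yes", host, "cat " + KEY_FILE],
        capture_output=True, text=True, timeout=10, check=True)
    return result.stdout.strip()

def scan(hosts: Iterable[str],
         probe: Callable[[str], str] = probe_via_ssh) -> Dict[str, str]:
    """One scan pass: run at the user-specified period, or on 'check now'."""
    status = {}
    for host in hosts:
        try:
            status[host] = "OK" if probe(host) == EXPECTED_ID else "BAD ID"
        except Exception:
            status[host] = "UNREACHABLE"
    return status
```

Injecting the probe keeps the loop testable without any SSH access: a fake probe such as `lambda h: "MLCR-OK"` exercises the status logic directly.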

6 MICE CM June 2009 / Jean-Sebastien Graulich / Slide 6
CAM/DAQ Special Case: Target DAQ
 Target data is really both CAM and DAQ
   Double flow is implemented
   Based on a single piece of I/O software -> no issue with format change
 Waiting for the target to be online for final tests
 Will require dedicated testing runs

7 MICE CM June 2009 / Jean-Sebastien Graulich / Slide 7
Archiver and Alarm Handler

8 MICE CM June 2009 / Jean-Sebastien Graulich / Slide 8
EPICS Client Server Installed
 All client-side applications run on miceecserv
   Central installation repository greatly simplifies configuration/maintenance/backup
   MOG collates individual applications, applies updates when available from control system 'owners'
 Client control/monitoring GUIs viewed directly on miceecserv, or on one of 2 'Operator Interface' PCs
   OPI PCs act as 'dumb terminals', running displays from miceecserv via SSH
[Diagram: miceecserv, miceopi1, miceopi2; EPICS IOCs and a Portable CA Server on the Controls Network; EPICS server applications and EPICS client applications]

9 Imperial College / Slide 9
Unified User Interface
[Diagram: large wall-mounted display (Alarm Handler, message log, client application launcher) and 3 standard desktop monitors (client GUIs), connected to miceecserv, miceopi1, and miceopi2]

10 Imperial College / Slide 10
C&M Systems Overview

11 Imperial College / Slide 11
Tracker: Spectrometer Solenoids
 Work currently halted due to budget constraints
 3 options:
  – Allow DL to complete the project
      Requires ~£18K capital + 0.4 man-years effort
  – Take DL's current design & complete it within the collaboration
      Requires ~£18K capital + ~£15.2K vxWorks developer licence + 0.6-0.8 man-years effort
      Insufficient MICE manpower available…
  – Discard DL's design & start over within the collaboration
      Unknown capital requirements (likely ~£18K)
      Requires ~1.5 man-years effort
      Insufficient MICE manpower available…
Responsible for System: Steve Virostek (LBNL)
Responsible for EPICS C&M: Adrian Oates; Graham Cox (DL)
Date Due: Possibly Sep 2009

12 MICE CM June 2009 / Jean-Sebastien Graulich / Slide 12
Detector DAQ Issues
 Three priorities were defined at CM23:
   Limit on data size
   Event building
   CAM/DAQ interface
 Limit on data size
   Affects both DAQ and monitoring
   First action is on the data size itself: simply reducing the number of samples per fADC channel -> factor of ~5
   Requires fADC firmware upgrade; first attempt failed
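The back-of-the-envelope behind the "factor of ~5" can be sketched as below. The sample counts, sample width, and header size are illustrative assumptions, since the slide gives only the final factor:

```python
# All numbers here are hypothetical, chosen only to illustrate the scaling.
SAMPLES_BEFORE = 500   # assumed samples per fADC channel per trigger
SAMPLES_AFTER = 100    # assumed samples after the firmware upgrade
BYTES_PER_SAMPLE = 2   # assumed sample width
HEADER_BYTES = 8       # assumed fixed per-channel overhead

def channel_bytes(n_samples: int) -> int:
    """Raw bytes one channel contributes to an event."""
    return HEADER_BYTES + n_samples * BYTES_PER_SAMPLE

factor = channel_bytes(SAMPLES_BEFORE) / channel_bytes(SAMPLES_AFTER)
print(f"event-size reduction factor ~ {factor:.1f}")
```

Because the fixed header does not shrink, the realized factor sits slightly below the 5x reduction in samples.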

13 MICE CM June 2009 / Jean-Sebastien Graulich / Slide 13
Event building problem
 Vassil Verguilov developed software to investigate the problem
   The Trigger Time Tag and Bunch ID from the TDC can be used to understand what happened
 Each TDC records:
   Trigger Time Tag (TTT): 27 bits, 800 ns LSB, ~100 s full range
   Bunch ID (BID): 12 bits, 25 ns LSB, ~100 µs full range
   Clocks run independently
 It is possible to retrieve the absolute time of the particle trigger
   With respect to board reset (boards not yet synchronized)
   With some ambiguities… and complications…
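A sketch of how the TTT can help diagnose event building, assuming unsynchronized boards whose relative offset has already been subtracted; the tolerance and function name are illustrative:

```python
TTT_BITS, TTT_LSB = 27, 800e-9   # Trigger Time Tag: 27 bits, 800 ns per tick
BID_BITS, BID_LSB = 12, 25e-9    # Bunch ID: 12 bits, 25 ns per tick

TTT_SPAN = 2 ** TTT_BITS         # counter wraps after ~107 s (the "~100 s" range)

def same_trigger(ttt_a: int, ttt_b: int, tol_ticks: int = 1) -> bool:
    """Do two boards' Trigger Time Tags plausibly tag the same particle
    trigger?  Compared modulo the 27-bit wrap-around, which is one source
    of the "ambiguities" on the slide; any per-board offset (boards are
    reset, not synchronized) must be subtracted before calling this."""
    diff = (ttt_a - ttt_b) % TTT_SPAN
    return min(diff, TTT_SPAN - diff) <= tol_ticks
```

The modular comparison matters: two tags one tick apart across the wrap point (0 and 2^27 - 1) still refer to nearly the same instant.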

14 MICE CM June 2009 / Jean-Sebastien Graulich / Slide 14
CAM/DAQ Interface
 Two-way link:
   EPICS should know the DATE status
   A summary of EPICS data should be inserted in the online data stream
   The run should stop automatically when CAM goes into a severe alarm state
 DATE status monitoring
   James Leaver has provided the EPICS server, EPICS client, test software, and status display
   Next step: a small piece of code reading the real status from DATE
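The automatic-stop half of the two-way link amounts to a watchdog. A minimal sketch with hypothetical callbacks (`read_severity`, `stop_run`), since neither the EPICS record names nor the DATE control call are given on the slide:

```python
from typing import Callable

SEVERE = 2   # hypothetical numeric threshold for a "severe" alarm state

def cam_watchdog(read_severity: Callable[[], int],
                 stop_run: Callable[[], None]) -> str:
    """One poll of the CAM -> DAQ direction of the link: stop the run
    automatically when CAM reports a severe alarm."""
    if read_severity() >= SEVERE:
        stop_run()
        return "run stopped"
    return "run continues"
```

In practice this would be called periodically (or driven by an EPICS alarm callback), with `stop_run` wrapping whatever run-control command DATE exposes.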

15 MICE CM June 2009 / Jean-Sebastien Graulich / Slide 15
CAM data Online
 Principle re-discussed: CAM data appended at the end of the run file vs CAM data inside the spill data
 The issue: backward/forward compatibility of the data format
   Solved by James Leaver: an XML format description file will be provided and appended to the run file
   Requires some "intelligent" unpacking code
 CAM data will be stored spill by spill
   (Took just ~4 years to understand that AB was right… :-)
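Unpacking driven by an appended XML format description might look like the sketch below; the record layout and field names are invented for illustration, not the actual MICE format:

```python
import struct
import xml.etree.ElementTree as ET

# Hypothetical format-description file appended to a run file.
FORMAT_XML = """
<cam-record version="1">
  <field name="spill" type="uint32"/>
  <field name="magnet_current" type="float32"/>
</cam-record>
"""

STRUCT_CODE = {"uint32": "I", "float32": "f", "int16": "h"}

def build_unpacker(xml_text: str):
    """Turn the XML description into field names plus a struct format,
    so old data stays readable as long as its description travels with it."""
    root = ET.fromstring(xml_text)
    fields = root.findall("field")
    names = [f.get("name") for f in fields]
    fmt = "<" + "".join(STRUCT_CODE[f.get("type")] for f in fields)
    return names, fmt

def unpack_record(xml_text: str, raw: bytes) -> dict:
    """Decode one binary CAM record using its own format description."""
    names, fmt = build_unpacker(xml_text)
    return dict(zip(names, struct.unpack(fmt, raw)))
```

This is the sense in which the unpacking code is "intelligent": the binary layout is not hard-coded, so adding or reordering fields only changes the XML, not the reader.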

16 MICE CM June 2009 / Jean-Sebastien Graulich / Slide 16
Trigger Issues
 No modification since the last CM
 The effect of the trigger condition choice on TOF calibration is now clearly understood
   Flexibility has a huge cost!
   TOF calibration constants depend on the trigger source
   Do we really need so many trigger conditions?
 Decision on the future of GVA1: it will stay in place at least until Step 2 is complete

17 MICE CM June 2009 / Jean-Sebastien Graulich / Slide 17
Online Software
 Unpacking library has been upgraded
   New fADC (V1731) for CKOV
   Some bugs fixed
 Interface with G4MICE
   Vassil Verguilov has optimized the DATE reader for both clarity and efficiency
   Spill structure implemented in G4MICE (for scaler data, number of particle triggers per DAQ event, etc.)
   David Adey reproduced the online histograms with G4MICE; analysis is under way
   See the Software session for more details
 Known issues (no change since CM23)
   Online monitoring GUI tends to stall at the end of a run
   Need better online plots for the fADCs: the current plots are not effective at detecting broken channels

18 MICE CM June 2009 / Jean-Sebastien Graulich / Slide 18
Front End Electronics
 TOF
   Cabling for TOF TDC synchronization is finished (internal clock and trigger distribution); will need some dedicated beam time for testing
   TOF2: some electronics are still missing
     Shapers/Splitters: production finished, under evaluation
     TDC: one board missing
     fADC: 3 boards missing
 EMR: Electron Muon Ranger (aka SW)
   Prototype being tested by Michela Prest and Erik Valaza

19 MICE CM June 2009 / Jean-Sebastien Graulich / Slide 19
Schedule Milestones (for DDAQ)
 Target DAQ included in Detector DAQ: mid February
 DAQ migration to the DAQ/Control network: March 06
 Installation of the Online Reconstruction Farm: March 06
 Upgrade of fADC firmware: March 06 -> July 09
 TOF2 Shaper/Splitter production: mid March; needs testing
 CAM data in online data stream: end of April -> June 09
 Tracker integrated in DAQ and OLM: end of April -> July 09
 TOF TDC clock synchronization: end of April -> July 09
 DAQ review: May 2009 -> June 09
 Burst Gate Signal in the trigger system: needs beam
 Approval of SW/EMR front-end electronics: in prototype phase
 Production of SW/EMR front-end electronics: September 2009

20 Imperial College / Slide 20
Items Which Require Action!
 Must find resources within the MICE community to complete EPICS C&M systems for:
  – Time of Flight system
  – Diffuser
  – Calorimeter system
 Must resolve the issue of funding for DL's work on the Spectrometer Solenoids
 PH's contract expires very soon…
  – He is essential to the success of the Online Group
  – If he is not re-employed, we won't have: Alarm Handler, Channel Archiver, remote parameter monitoring, C&M systems for CKOV, Focus Coils, Coupling Coils, etc.

21 MICE CM June 2009 / Jean-Sebastien Graulich / Slide 21
Summary
-------------- DDAQ --------------
 Network in MLCR is under control
 Target DAQ is integrated into the online data stream
 Interface between DAQ and G4MICE improved
 Problem with event building understood
 DAQ review will happen on Thursday June 4, 2009
-------------- CAM --------------
 Infrastructure design has been re-examined
 EPICS Client Server installed in MLCR
 EPICS Archiver and Alarm Handler implemented
   First steps toward Unified Control System
 Progress in Control and DAQ Interface
 Manpower is critical: we should not let go of the rare & precious expertise we have now

