
Review of Daresbury Workshop. MICE DAQ Workshop, Fermilab, February 10, 2006. A. Bross.


1 Review of Daresbury Workshop. MICE DAQ Workshop, Fermilab, February 10, 2006. A. Bross

2 Daresbury MICE DAQ WS
- The first MICE DAQ Workshop was held at Daresbury Lab in September 2005.
- The focus was to give an overview of the experiment's requirements and to discuss possible hardware and software implementations.
- Topics included the front end, monitoring, and controls.

3 (Image-only slide: Paul Drumm, DAQ&C-WS, August/September 2005)

4 (Image-only slide)

5 DAQ Concepts

6 Introduction (Jean-Sébastien Graulich, Daresbury, Aug 2005)
- Normal detector DAQ is synchronised with the MICE beam.
- We want RF-on and RF-off data (50/50?).
- We need calibration data for each run.
- We want RF-noise data, taken in a dedicated run.

7 Detector DAQ vs. Control & Monitoring (Jean-Sébastien Graulich, Daresbury, Aug 2005)
Why separate the particle-detector DAQ from the control-sensor DAQ?
Detector DAQ:
- Synchronised with the beam
- Very fast reaction time (~µs)
- High transfer rate (~50 MB/s)
- Read and store; no time for on-line processing
- Limited user interface: run control only
(Slow) Control and Monitoring:
- Continuous and permanent
- Very reliable (safety issue)
- Deals with a lot of different hardware
- Read and check: calibration, alarm management at different levels, soft interlocks, corrective actions, history logging, etc.
- Extended UI: set many parameters, manage complicated initialisation procedures, etc.

8 Concept Schematic (Jean-Sébastien Graulich; diagram, labels: MICE Beam Line, MICE Cooling Channel, MICE Detectors, Detector DAQ, Slow Control, Data Storage, Monitoring Data Log, Run Control UI, MICE User Interface, Environment, RF Phase)

9 The Parts of MICE Parts is Parts

10 Beam Line Overview and Controls Needs (Paul Drumm, DAQ&C-WS, August/September 2005)

11 Beam Line Overview (Paul Drumm)
What's in the beam line?
- ISIS synchronisation pulses
- Target
- Beam loss
- Beam-optics elements (power supplies): quads, dipoles, and the muon decay solenoid; trim coils, water flow, temperature
- Vacuum system: warm ion pumps
- Muon decay solenoid: cryogenics & vacuum
- Diffuser: ID sensor
- Diagnostics: beam scintillators (not used at the same time as MICE running)

12 Power Supplies: Control & Diagnostics (Paul Drumm)
(Diagram: dipoles, quads, and the decay solenoid on the beam line; the target; ISIS beam-loss monitors and cycle information; the solenoid cryogenics and control system; MICE diagnostics; vacuum valves and pumps)
- Assume control feedback is done in the power supply.
- "Control" consists of setting the current I in each element.
- "Diagnostic" output = alarms/mimics etc.; input parameters:
  - Quads: I, V (x4 each, x9)
  - Dipoles: I, V (x1 each, x2)
  - Solenoid: I, V
  - Diffuser: bar-code reader?
- Environmental parameters (marshalled elsewhere?): cooling-water temperature, fault conditions (power supplies), fault conditions (solenoid cryogenics)
- Vacuum, per pump set: three gauges (rough, turbo, system); integrated system control through a PLC or an off-the-shelf controller: pump on/off, valve open/closed, automatic start-up & shutdown
- DAQ and control system: a hybrid

13 Target System (Paul Drumm)
- Target motion is locked to the ISIS cycle.
- (Diagram: readout system, target mechanism, control system, drive, G-MS, PC system running Linux/EPICS, event bus, Ethernet infrastructure)
- System implemented in the MICE hall.

14 (Paul Drumm) A mix of levels of monitoring & control:
- Dedicated hardware to drive the motors: reads the position, controls the current demand.
- Control system: used to set parameters in the hardware (amplitude, delays, speed); used to read & analyse diagnostic info (position...).
- GUI: simplified user interaction; used to display diagnostics info.

15 Cooling Channel (Paul Drumm)

16 Physics Parameters (Paul Drumm)
The monitoring system can record many parameters, but what are the critical ones? Is the following sufficient?
- Beam line & target: almost irrelevant.
  - Dipole 1 = momentum of the pions
  - Dipole 2 = momentum setting of the muon beam
  - Diffuser = energy loss = momentum into MICE (a poor measurement in any case; these quantities are measured by TOF and the tracker)
  - Use a table of settings: record the table ID when loaded; "all OK" should be a gate condition (sketched below).
- Magnets/absorber: table of settings, as above.
- RF: more dynamic information is needed.
  - High-Q system: what does this mean for V & phase?
  - Tuning mechanism: no details yet; pulse to pulse.
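As a rough illustration of the "all OK" gate idea, a minimal sketch in Python; the element names and the 1% demand/readback tolerance are assumptions, not anything specified in the talk:

    # Hypothetical "all OK" gate: the gate condition holds only when every
    # element's readback agrees with its setting from the loaded table.
    TOLERANCE = 0.01  # assumed 1% demand/readback agreement

    def all_ok(settings: dict, readbacks: dict) -> bool:
        """True only if every element is within tolerance of its set point."""
        return all(
            abs(readbacks[name] - demand) <= TOLERANCE * abs(demand)
            for name, demand in settings.items()
        )

    # Example: load a settings table, record its ID, then gate each spill.
    settings = {"D1:I": 440.0, "D2:I": 300.0, "Q1:I": 120.0}  # hypothetical table
    readbacks = {"D1:I": 440.2, "D2:I": 299.9, "Q1:I": 120.1}
    print(all_ok(settings, readbacks))  # True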

17 Superconducting Magnets (Paul Drumm)
- Current setting:
  - Process control: turn on/off/adjust
  - Passive monitoring: status
- Power supply = lethargic parameters; transducers = V & I
- Temperature monitoring
- Magnetic-field measurements: 3-D Hall probes; CANbus, with PCI/USB/VME interfaces available; feedback system or...
- Fault warnings: temperatures / operational features
- MLS: record settings, report faults, "all OK" in gate

18 Absorber (and Focus Coils) (Paul Drumm)
Absorber & focus-coil module (see Mike Courthold's talk on hydrogen & absorber systems):
- Process control
- Passive monitoring, with a suite of instruments: temperature, fluid level, thickness (tricky)
- Fault warnings
- MLS: record settings, report faults, "all OK" in gate

19 RF (& Coupling Coil) (Paul Drumm)
- RF cavity: tuning, cavity power level (amplitude), cavity phase
- RF power: low-level RF (~kW), driver RF (300 kW), power RF (4 MW)
- MICE has separate tasks to develop the cavity and power systems, but they are closely linked in a closed loop.

20 The Parts of MICE Parts is Parts

21 MICE Safety System (D. E. Baynham, T. W. Bradshaw, M. J. D. Courthold, Y. Ivanyushenkov)

22 MICE Layout

23 MICE Hazards (diagram of hazards by subsystem)
- Beam: radiation
- RF: radiation, HV, HV sparks
- LH2 absorbers: fire, explosion, overpressure, material brittleness, skin burn
- Magnets: high magnetic field, hence high mechanical forces; magnetic stray field
- Trackers and particle detectors: photo-detectors and front-end electronics, optical fibres; HV on the TOF
Monitoring the MICE safety systems is an important component of the monitoring system.

24 Controls and Monitoring Architecture

25 EPICS Experience at Fermilab (Geoff Savage, Controls and Monitoring Group, August 2005)

26 What is EPICS?
- Experimental Physics and Industrial Control System: a collaboration, a control-system architecture, and a software toolkit
- An integrated set of software building blocks for implementing a distributed control system
- www.aps.anl.gov/epics

27 Why EPICS?
- Availability of device interfaces that match, or are similar to, our hardware
- Ease with which the system can be extended to include our experiment-specific devices
- Existence of a large and enthusiastic user community that understands our problems and is willing to offer advice and guidance

28 EPICS Architecture (diagram: OPIs and IOCs connected by a LAN)
- OPI (Operator Interface): a workstation running EPICS tools. Operating systems: Linux and Windows XP.
- IOC (Input/Output Controller): a platform supporting the EPICS run-time database. Examples: a VME-based PPC processor running vxWorks, or a Linux workstation.
- LAN: the communication path for Channel Access (CA), the EPICS communication protocol.

29 IOC Architecture

30 IOC Database
- A collection of records.
- Each record represents a system parameter (process variable, PV): it has a unique name and a set of attributes; the attributes and the value can be modified.
- Records must process to do something (see the example below): an input record can read a value every 10 seconds; a CA write to an output record causes that record to process.
- A record is either input or output, not both.
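For concreteness, a minimal sketch of what such a record definition looks like in an EPICS database file; the PV name, description, and units here are hypothetical:

    # Hypothetical input record: reads a soft-channel temperature every 10 seconds.
    record(ai, "MICE:ABS1:TEMP") {
        field(DESC, "Absorber temperature")
        field(SCAN, "10 second")
        field(DTYP, "Soft Channel")
        field(EGU,  "K")
    }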

31 EPICS Records
- Input: Analog In (AI), Binary In (BI), String In (SI)
- Algorithm/control: Calculation (CALC), Subroutine (SUB); a CALC example follows below
- Output: Analog Out (AO), Binary Out (BO)
- Custom: only needed when existing record types, or a collection of existing record types, are inadequate
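As a sketch of the algorithm/control layer, a CALC record (with hypothetical PV names) that averages two inputs once per second:

    # Hypothetical calculation record: averages two temperature PVs once per second.
    record(calc, "MICE:ABS1:TEMP:AVG") {
        field(INPA, "MICE:ABS1:TEMP1")
        field(INPB, "MICE:ABS1:TEMP2")
        field(CALC, "(A+B)/2")
        field(SCAN, "1 second")
    }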

32 Channel Access (diagram: an EPICS tool acting as CA client connects over the LAN to an IOC acting as CA server)
- One client can connect to many servers.
- A channel connects to the value or an attribute of a PV; each channel has a unique name = record name + attribute.
- CA services (a client-side sketch follows below):
  - Search: find a channel by name
  - Get: retrieve a channel value
  - Put: modify a channel value
  - Add monitor: notification of state change
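A sketch of what these four services look like from a client. This uses the pyepics Python binding (a later tool than this 2005 talk, shown only for illustration), and the PV names are hypothetical; the Search happens implicitly when a name is first used:

    from epics import caget, caput, camonitor

    # Get: retrieve a channel value.
    current = caget("MICE:Q1:CURRENT")

    # Put: modify a channel value. ".HIHI" addresses an attribute (an alarm
    # limit) of the same PV: channel name = record name + attribute.
    caput("MICE:Q1:CURRENT.HIHI", 250.0)

    # Add monitor: ask the server for notifications on state change.
    def on_change(pvname=None, value=None, **kw):
        print(pvname, "->", value)

    camonitor("MICE:Q1:CURRENT", callback=on_change)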

33 MIL-STD-1553B Bus
- Detector access is restricted while running.
- Provides a robust and highly reliable connection to electronics in the remote collision hall.
- We developed a queuing driver, device support, and a generic record.
- 12 IOCs with ~70 1553 busses running from the counting house to the detector, and ~10 busses within the counting house.

34 Describe a Device with Records (an example follows below)
- Template file: multiple record definitions with substitution parameters; this defines a device.
- Generator file: defines instances of a template by assigning values to the substitution parameters in the template file.
- EPICS database: an ASCII file produced from the template and generator files by an EPICS tool; at database load, the record instances are read by the IOC to create its record database.
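A sketch of the pattern in standard EPICS template/substitution syntax; the file names, macros, and PV names are all hypothetical:

    # ps.template -- hypothetical template: one readback record per device
    record(ai, "$(DEV):CURRENT") {
        field(DESC, "$(DESC)")
        field(SCAN, "1 second")
    }

    # ps.substitutions -- hypothetical generator file: three instances
    file ps.template {
        pattern { DEV,       DESC               }
                { "MICE:Q1", "Quad 1 current"   }
                { "MICE:Q2", "Quad 2 current"   }
                { "MICE:D1", "Dipole 1 current" }
    }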

35 Centralized Database System (diagram; elements: Oracle hardware database, template file, DB creation, EPICS database, record extract, generator file, IOC id, template extract, instance creation, Python scripts, web access)

36 Support for CFT
- Device support for the AFE board using the MIL-1553 driver
- AFE and AFE power-supply templates
- Record and device support for the SVX sequencers
- Expert GUI for downloading (does not use COMICS)
- Calibration software: runs on a processor in the VRB crate; communicates with the AFE

37 (Diagram of the readout chain; labels: Analog Front End, fibre, waveguide, cryostat, VLPC, stereo board, cassette, MIL-1553 communication)

38 EPICS at Fermilab
- SMTF (superconducting module test facility): cavity test in October; DOOCS for the LLRF, which speaks CA; EPICS for everything else; no ACNET (the Beams Division controls system).
- ILC: control system to be decided; management structure now forming.

39 EPICS Lessons
- Three layers:
  - Tools: EPICS tools; OPI development with the CA library
  - Applications: build base; build an application by combining existing pieces; develop device templates
  - Development: records, device, and driver support
- EPICS is not simple to use: you need expertise in each layer, but support is an email away.
- IOC operating-system selection matters.

40 Integration of DAQ & Control Systems (Paul Drumm)

41 Monitoring System (Paul Drumm)

42 Data Stream? (Paul Drumm; diagram: a clocked stream in which a controls header, the MHz DAQ data, and a controls footer carrying a DB pointer repeat for each cycle)
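One possible reading of this slide, sketched in Python; the record layout, field names, and sizes are entirely hypothetical:

    import struct
    import time

    # Hypothetical spill record: slow-control snapshots bracket the fast DAQ
    # data, and a DB pointer ties the spill to the conditions database.
    HEADER_FMT = "<IdI"  # spill number, clock timestamp, DB pointer (assumed)

    def write_spill(f, spill_no: int, db_pointer: int, daq_payload: bytes):
        # Controls header, stamped against the common clock.
        f.write(struct.pack(HEADER_FMT, spill_no, time.time(), db_pointer))
        # MHz data (DAQ) for this spill.
        f.write(daq_payload)
        # Controls footer: repeat the DB pointer so the spill is self-describing.
        f.write(struct.pack("<I", db_pointer))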

43 System Concept? (Paul Drumm; diagram: equipment connects to control PCs; the control PCs, a master monitor & logging node, and control/display stations share an Ethernet bus and an event (trigger) bus)

44 An idea of the DAQ architecture (diagram)
- Upstream and downstream trackers (4096 channels each): VLPC cassettes #1-#4 (left and right) are read out through SERDES links #1-#8 and SVX sequencers (SASeq #1-#4) into a Bit3 interface per tracker; each tracker also has a Bit3/1553 slow-control path and cryostat control/monitoring.
- Upstream and downstream Tracker Collectors feed a Tracker Builder; the Tracker, PID, and Beam Builders feed the MICE Builder, which connects to MICE Storage and MICE Control.
- Quoted data volumes: 4 kB/event and 8 MB/spill for one tracker; 4 kB/event and 4 MB/spill for the other.
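A quick sanity check, if the quoted figures are taken at face value: 8 MB/spill divided by 4 kB/event is about 2000 events/spill, and 4 MB/spill divided by 4 kB/event is about 1000; at the ~50 MB/s transfer rate quoted on slide 7, the ~12 MB of tracker data per spill would take roughly a quarter of a second to move.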

45 Online System

46 UniDaq
- Makoto described the KEK UniDaq system.
  - Used for the fibre-tracker test at KEK; it worked well, and the interface to the tracker readout (AFEII-VLSB) would be the same as in MICE.
- It could be used as the MICE online system.
- It needs an event builder; this is a significant issue.

47 ALICE approach (HLT / monitoring)
- The HLT is embedded in the DAQ.
- More than one hardware trigger, so it is not straightforward to change the trigger; but the DAQ can be very flexible with standard technology.
- Easy to partition, down to the level of a single front end.

48 ALICE HLT

49 Ready-to-use GUIs
Run control should be implemented as a state machine for proper handling of state changes (a sketch follows below):
- Configure and partition
- Set run parameters and connect
- Select active processes and start them
- Start/stop

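A minimal sketch, in Python, of a run-control state machine covering the four steps above; the state and command names are assumptions, not any particular framework's API:

    # Hypothetical run-control state machine: each command is allowed only
    # from specific states, mirroring configure -> connect -> start -> stop.
    TRANSITIONS = {
        ("idle",       "configure"): "configured",  # configure and partition
        ("configured", "connect"):   "connected",   # set run parameters, connect
        ("connected",  "start"):     "running",     # start the active processes
        ("running",    "stop"):      "connected",   # stop; ready for the next run
    }

    class RunControl:
        def __init__(self):
            self.state = "idle"

        def handle(self, command: str) -> str:
            key = (self.state, command)
            if key not in TRANSITIONS:
                raise RuntimeError(f"'{command}' not allowed in state '{self.state}'")
            self.state = TRANSITIONS[key]
            return self.state

    rc = RunControl()
    for cmd in ("configure", "connect", "start", "stop"):
        print(cmd, "->", rc.handle(cmd))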

51 Conclusions: never underestimate...
- Users are not experts: give them the tools to work and to report problems effectively.
- Flexible partitioning.
- Event building with accurate fragment alignment and validity checks, state reporting, and reliability (see the sketch after this list).
- Redundancy / fault tolerance.
- A proper run control with a state machine, simplifying for the users the tasks of partition, configure, trigger selection, start, and stop.
- A good monitoring framework, with a clear-cut separation between DAQ services and monitoring clients.
- Extensive and informative logging.
- GUIs.
This represents a significant amount of work, and it has not yet started!
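To make "fragment alignment and validity checks" concrete, a minimal sketch; the fragment format (dicts carrying source, event_id, and data) and the source list are hypothetical:

    # Hypothetical event builder: merge per-source fragments into one event,
    # refusing to build if fragments are misaligned or missing.
    SOURCES = {"tracker_up", "tracker_down", "pid", "beam"}  # assumed sources

    def build_event(fragments: list) -> dict:
        event_ids = {f["event_id"] for f in fragments}
        if len(event_ids) != 1:  # alignment check: all fragments, same event
            raise ValueError(f"fragment misalignment: event ids {sorted(event_ids)}")
        seen = {f["source"] for f in fragments}
        if seen != SOURCES:      # validity check: one fragment from each source
            raise ValueError(f"missing fragments from {sorted(SOURCES - seen)}")
        return {
            "event_id": event_ids.pop(),
            "payload": {f["source"]: f["data"] for f in fragments},
        }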

52 Conclusions Coming out of Daresbury
- Decisions were made:
  - EPICS
  - Linux
  - VME/PC, with an Ethernet backbone
  - Event bus (maybe)
- The start of a specifications document, which has had one iteration.

53 Next
- Specify/evaluate as much of the hardware as possible:
  - The front-end electronics need to be fully specified, since they can affect many aspects of the DAQ.
  - Crate specs:
    - PCs
    - Various interface cards
- Assemble a test system:
  - Core system: we must make a decision here (UniDaq, an ALICE-like TB system, ...?). I think we need to make this decision relatively soon, and it will likely involve some arbitrariness; whoever does the work will bring a particular point of view.
  - VME backbone
  - Trigger architecture
  - Front-end electronics
  - EPICS interface: need experts to be involved

