Review of Daresbury Workshop MICE DAQ Workshop Fermilab February 10, 2006 A. Bross

Daresbury MICE DAQ WS
- The first MICE DAQ Workshop was held at Daresbury Lab in September 2005
- The focus was to give an overview of the requirements of the experiment and to discuss possible hardware and software implementations
- It included front-end electronics, monitoring, and controls


DAQ Concepts

Introduction (Jean-Sébastien Graulich, Daresbury, Aug 2005)
- Normal detector DAQ is synchronised with the MICE beam
- We want RF-on and RF-off data (50/50?)
- We need calibration data, for each run
- We want RF-noise data, in a dedicated run

DDAQ vs C&M – why separate the particle-detector DAQ from the control-sensor DAQ?

Detector DAQ:
- Synchronised with the beam
- Very fast reaction time (~µs)
- High transfer rate (~50 MB/s)
- Read and store; no time for on-line processing
- Limited user interface: run control only

(Slow) Control and Monitoring:
- Continuous and permanent
- Very reliable (safety issue)
- Deals with a lot of different hardware
- Read and check: calibration, alarms managed at different levels, soft interlocks, taking actions, logging history, etc.
- Extended UI: set many parameters, manage complicated initialisation procedures, etc.

Concept Schematic
[Diagram – the MICE beam line, cooling channel and detectors feed two paths: the detector DAQ (to data storage) under a run-control UI, and the slow control/monitoring system (to a monitoring data log) under the MICE user interface; environment sensors and the RF phase also enter the monitoring path.]

The Parts of MICE Parts is Parts

Beam Line Overview and Controls Needs (slides from Paul Drumm, DAQ & Controls WS, August/September 2005)

Beam Line Overview – what's in the beam line?
- ISIS synchronisation pulses
- Target
- Beam loss
- Beam optics elements (power supplies): quads, dipoles and the muon decay solenoid; trim coils, water flow, temperature
- Vacuum system – warm ion pumps
- Muon decay solenoid cryogenics & vacuum
- Diffuser – id-sensor
- Diagnostics – beam scintillators (not at the same time as MICE running)

Power Supplies – Control & Diagnostics
[Diagram – beam line elements: dipoles, quads, muon decay solenoid; target; valves and pump sets.]
- Assume control feedback is done in the power supply
- "Control" consists of setting I in each element
- "Diagnostic": output = alarms/mimics etc.; input parameters:
  - Quads: I, V – x4 each, x9
  - Dipoles: I, V – x1 each, x2
  - Solenoid: I, V
  - Diffuser bar-code reader?
- Environmental parameters (marshalled elsewhere?): cooling-water temperature; fault conditions (power supplies); fault conditions (solenoid cryogenics)
- Vacuum – per pump set: three gauges (rough, turbo, system); integrated system control through a PLC or an off-the-shelf controller (pump on/off, valve open/closed, auto start-up & shutdown)
- Target / ISIS: BLM, cycle information
- Solenoid cryogenics & control system
- MICE diagnostics
- DAQ ↔ control system hybrid

Target System
- Target motion is locked to the ISIS cycle
[Diagram – target mechanism with drive and readout system; control system (G-MS); PC system running Linux/EPICS; event bus and Ethernet infrastructure.]
- System implemented in the MICE hall

Mix of levels of monitoring & control:
- Dedicated hardware to drive the motors: reads the position, controls the current demand
- Control system: used to set parameters in the hardware (amplitude, delays, speed); used to read & analyse diagnostic info (position, ...)
- GUI: simplified user interaction; used to display diagnostics info

Cooling Channel

Physics Parameters
- The monitoring system can record many parameters, but: what are the critical parameters? Is the following sufficient?
- Beam line & target: almost irrelevant
  - Dipole one = momentum of the pions
  - Dipole two = momentum setting of the muon beam
  - Diffuser = energy loss = momentum into MICE – a poor measurement; ε & p are measured by TOF and Tracker anyway
  - Use a table of settings: record the table-ID when loaded; "all OK" should be a gate condition
- Magnets/absorber: table of settings, as above
- RF: more dynamic information is needed
  - High-Q system – what does this mean for V & φ?
  - Tuning mechanism – no details yet – pulse to pulse

Superconducting Magnets
- Current setting:
  - Process control – turn on/off/adjust
  - Passive monitoring – status
- Power supply = lethargic parameters; transducers = V & I
- Temperature monitoring
- Magnetic field measurements: 3-D Hall probes; CANbus, with PCI/USB/VME interfaces available; feedback system or ...
- Fault warnings: temperatures / operational features
- MLS → record settings, report faults, "all OK" in gate

Absorber (and Focus Coils)
- Absorber & focus-coil module: see Mike Courthold's talk on hydrogen & absorber systems
- Process control
- Passive monitoring: a suite of instruments – temperature, fluid level, thickness (tricky)
- Fault warnings
- MLS → record settings, report faults, "all OK" in gate

RF (& Coupling Coil)
- RF cavity: tuning; cavity power level (amplitude); cavity phase
- RF power: low-level RF (~kW); driver RF (300 kW); power RF (4 MW)
- MICE has separate tasks to develop the cavity and the power systems, but they are closely linked in a closed loop

The Parts of MICE Parts is Parts

MICE Safety System DE Baynham TW Bradshaw MJD Courthold Y Ivanyushenkov

MICE Layout

MICE Hazards
- Beam: radiation
- RF: radiation; HV → sparks
- LH2: fire, explosion, overpressure, material brittleness, skin burn
- Tracker / LH2: high magnetic field → high mechanical forces; magnetic stray field
- Particle detectors (Tracker, TOF): HV; photo-detectors and front-end electronics; optical fibres
Monitoring the MICE safety systems is an important component of the monitoring system.

Controls and Monitoring Architecture

EPICS Experience at Fermilab Geoff Savage August 2005 Controls and Monitoring Group

What is EPICS?
- Experimental Physics and Industrial Control System: a collaboration, a control-system architecture, and a software toolkit
- An integrated set of software building blocks for implementing a distributed control system

Why EPICS?
- Availability of device interfaces that match, or are similar to, our hardware
- Ease with which the system can be extended to include our experiment-specific devices
- Existence of a large and enthusiastic user community that understands our problems and is willing to offer advice and guidance

EPICS Architecture
[Diagram – Operator Interfaces (OPIs) and Input/Output Controllers (IOCs) connected by a Local Area Network (LAN).]
- OPI – a workstation running EPICS tools; operating systems: Linux and Windows XP
- IOC – a platform supporting the EPICS run-time database; examples: a VME-based PPC processor running vxWorks, or a Linux workstation
- LAN – the communication path for Channel Access (CA), the EPICS communication protocol

IOC Architecture

IOC Database
- A collection of records
- Each record represents a system parameter (process variable, PV): a unique name; a set of attributes; attributes and value can be modified
- Records must process to do something:
  - An input record can read a value every 10 seconds
  - A CA write to an output record causes the record to process
  - A record is either input or output, not both
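To make the two processing modes concrete, here is a minimal sketch in standard EPICS database syntax; the PV names are hypothetical, not actual MICE records:

    # Input record: processes (reads its value) every 10 seconds
    record(ai, "MICE:SOL:TEMP") {
        field(DESC, "Solenoid temperature")
        field(SCAN, "10 second")      # periodic scan
        field(EGU,  "K")
    }

    # Output record: passive, so a Channel Access write makes it process
    record(ao, "MICE:QUAD1:ISET") {
        field(DESC, "Quad 1 current demand")
        field(SCAN, "Passive")        # processed by the CA put
        field(EGU,  "A")
    }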

EPICS Records
- Input: Analog In (AI), Binary In (BI), String In (SI)
- Algorithm/control: Calculation (CALC), Subroutine (SUB)
- Output: Analog Out (AO), Binary Out (BO)
- Custom: only needed when existing record types, or a collection of existing record types, are inadequate
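For the algorithm/control category, a CALC record can compute a value from other PVs; a hedged sketch, again with invented names:

    # Recomputes the average whenever either input changes (CP monitors)
    record(calc, "MICE:SOL:TEMP:AVG") {
        field(INPA, "MICE:SOL:TEMP1 CP")
        field(INPB, "MICE:SOL:TEMP2 CP")
        field(CALC, "(A+B)/2")
        field(EGU,  "K")
    }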

Channel Access
[Diagram – EPICS tools (CA clients) talk over the LAN to IOCs (CA servers).]
- One client can connect to many servers
- A channel connects to the value or an attribute of a PV; each channel has a unique name = record name + attribute
- CA services:
  - Search – find a channel by name
  - Get – retrieve a channel value
  - Put – modify a channel value
  - Add monitor – notification of state change
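The four CA services map directly onto client calls. A sketch using the pyepics Python bindings (a later, widely used CA client, not the tool set described in this talk; the PV names are again hypothetical):

    import epics  # pyepics: Python bindings to Channel Access

    # Search + Get: name resolution happens on the first access
    temp = epics.caget("MICE:SOL:TEMP")        # value (.VAL) of the PV
    sevr = epics.caget("MICE:SOL:TEMP.SEVR")   # an attribute of the same PV

    # Put: modify a channel value (processes a passive output record)
    epics.caput("MICE:QUAD1:ISET", 35.0, wait=True)

    # Add monitor: the callback fires on every state change of the channel
    def on_change(pvname=None, value=None, **kw):
        print(pvname, "->", value)

    epics.camonitor("MICE:SOL:TEMP", callback=on_change)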

MIL-STD-1553B Bus
- Detector access is restricted while running
- Provides a robust and highly reliable connection to the electronics in the remote collision hall
- Developed a queuing driver, device support, and a generic record
- 12 IOCs, with ~ busses running from the counting house to the detector and ~10 busses within the counting house

Describe a Device with Records
- Template file: multiple record definitions with substitution parameters – this defines a device
- Generator file: defines instances of a template, assigning values to the substitution parameters in the template file
- An EPICS tool expands the two into an EPICS database: instances of records, loaded as an ASCII file and read by the IOC to create the record database
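In standard EPICS practice the template/generator pair can look like the sketch below (expanded with the msi tool, or by dbLoadTemplate on the IOC); the device type, macros and names are invented for illustration:

    # ps_channel.template -- defines one power-supply "device"
    record(ao, "$(P)$(PS):ISET") {
        field(DESC, "$(DESC) current demand")
        field(EGU,  "A")
    }
    record(ai, "$(P)$(PS):IREAD") {
        field(SCAN, "1 second")
        field(EGU,  "A")
    }

    # ps_channels.substitutions -- the generator file: one line per instance
    file "ps_channel.template" {
        pattern { P,          PS,      DESC       }
                { "MICE:BL:", "QUAD1", "Quad 1"   }
                { "MICE:BL:", "QUAD2", "Quad 2"   }
                { "MICE:BL:", "DIP1",  "Dipole 1" }
    }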

Centralized Database System
[Diagram – an Oracle hardware database plus template files drive DB creation; record-extract and template-extract generator files, selected by IOC id, feed instance creation of the EPICS databases; implemented with Python scripts, with web access.]

Support for CFT
- Device support for the AFE board using the MIL-1553 driver
- AFE and AFE power-supply templates
- Record and device support for the SVX sequencers
- Expert GUI for downloading (does not use COMICS)
- Calibration software: runs on a processor in the VRB crate; communicates with the AFE

Analog Front End
[Diagram – readout chain: fiber → wave guide → cryostat (VLPC cassette) → analog front end / stereo board, with MIL-1553 communication.]

EPICS at Fermilab
- SMTF – superconducting module test facility
  - Cavity test in October
  - DOOCS for LLRF – it speaks CA
  - EPICS for everything else
  - No ACNET (the Beams Division controls system)
- ILC
  - Control system still to be decided
  - Management structure now forming

EPICS Lessons
- Three layers:
  - Tools – EPICS tools; OPI development with the CA library
  - Applications – build base; build an application by combining existing pieces; develop device templates
  - Development – record, device, and driver support
- EPICS is not simple to use: expertise is needed in each layer, but support is an e-mail away
- IOC operating-system selection

Integration of DAQ & Control Systems

Monitoring System

Data Stream?
[Schematic – the DAQ's MHz data blocks are bracketed by a controls header and a controls footer, with DB pointers linking each block to the controls database, all against a common clock.]
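As a toy illustration of the proposed interleaving (every field name below is invented; nothing here is specified by the workshop), one spill in such a stream might be modelled as:

    from dataclasses import dataclass

    @dataclass
    class SpillRecord:
        controls_header: dict   # slow-controls snapshot at spill start
        daq_data: bytes         # fast (MHz-rate) detector data for the spill
        controls_footer: dict   # slow-controls snapshot at spill end
        db_pointer: int         # key into the controls database for this spill
        clock: float            # common timestamp aligning DAQ and controls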

System Concept?
[Schematic – equipment connected to control PCs; a master monitor & logging node plus control and display stations, tied together by an event (trigger) bus and an Ethernet bus.]

An Idea of the DAQ Architecture
[Diagram – upstream and downstream tracker readout: VLPC cassettes (#1–#4, left/right) feed SASeq boards (#1–#4) and SERDES links (#1–#8) into Bit3 interfaces, with tracker slow control and cryostat control/monitor alongside; upstream and downstream Tracker Collectors feed a Tracker Builder, which joins the PID and Beam Builders in the MICE Builder, backed by MICE Storage and MICE Control. Scale: 4096 channels; ~4 kBytes/event, 4–8 MBytes/spill per tracker.]

Online System

UniDaq
- Makoto described the KEK UniDaq system
  - Used for the fiber tracker test at KEK; it worked well, and the interface to the tracker readout (AFEII-VLSB) would be the same as in MICE
- Could be used as the MICE online system
- Needs an event builder – a significant issue

ALICE approach
- HLT embedded in the DAQ
- More than one hardware trigger → not straightforward to change the trigger
- But the DAQ can be very flexible with standard technology; easy to partition down to the level of a single front-end
- HLT / monitoring

ALICE HLT

Ready-to-use GUIs
- Run control should be implemented as a state machine, for proper handling of state changes:
  - Configure and partition
  - Set run parameters and connect
  - Select active processes and start them
  - Start/stop

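A minimal sketch of such a run-control state machine, in Python (the states and transitions follow the slide above; everything else is an assumption):

    from enum import Enum, auto

    class RunState(Enum):
        IDLE = auto()
        CONFIGURED = auto()
        CONNECTED = auto()
        RUNNING = auto()

    # action -> (required state before, state after)
    TRANSITIONS = {
        "configure": (RunState.IDLE,       RunState.CONFIGURED),  # configure and partition
        "connect":   (RunState.CONFIGURED, RunState.CONNECTED),   # set run parameters, connect
        "start":     (RunState.CONNECTED,  RunState.RUNNING),     # start active processes
        "stop":      (RunState.RUNNING,    RunState.CONNECTED),
    }

    class RunControl:
        def __init__(self):
            self.state = RunState.IDLE

        def do(self, action):
            before, after = TRANSITIONS[action]
            if self.state is not before:
                raise RuntimeError(f"{action!r} not allowed in {self.state.name}")
            self.state = after

    rc = RunControl()
    for step in ("configure", "connect", "start", "stop"):
        rc.do(step)   # an illegal ordering raises instead of corrupting the run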

Conclusions: never underestimate ...
- Users are not experts: provide them the tools to work and to report problems effectively
- Flexible partitioning
- Event building with accurate fragment alignment and validity checks, state reporting, and reliability
- Redundancy / fault tolerance
- A proper run control with a state machine, simplifying for the users the tasks of partition, configure, trigger selection, start, stop
- A good monitoring framework, with a clear-cut separation between DAQ services and monitoring clients
- Extensive and informative logging
- GUIs
This represents a significant amount of work, and it is not yet started!

Conclusions Coming out of Daresbury
- Made decisions!
  - EPICS
  - Linux
  - VME/PC, with an Ethernet backbone
  - Event bus (maybe)
- The start of a specifications document
  - Has had one iteration

Next
- Specify/evaluate as much of the hardware as possible
  - Front-end electronics needs to be fully specified, since it can affect many aspects of the DAQ
  - Crate specs: PCs, various interface cards
- Assemble a test system
  - Core system: must make a decision here (UNIDAQ, an ALICE-like TB system, ...?). I think we need to make this decision relatively soon, and it will likely involve some arbitrariness; whoever does the work will bring a particular "POV"
  - VME backbone
  - Trigger architecture
  - Front-end electronics
  - EPICS interface: need experts to be involved