1
LUSI Controls and Data Systems W.B.S. 1.6
Gunther Haller, Project Manager. Presented by Perry Anthony. Breakout Presentation, April 20-22, 2009.
2
Content
- Overview
- Controls System
- Data System
- CXI and XPP Detector Control/Data Chain
- Common Diagnostics
3
Controls W.B.S. Scope
[Beamline layout figure: X-ray transport from the Near Experimental Hall (AMO, SXR, XPP hutches) to the Far Experimental Hall (XCS mono, XCS, CXI, hutches 1-6); portions labeled as part of the LCLS installation and part of LCLS FES ARRA funds.]
- A separate WBS 1.6 combines all LUSI control & data needs (XPP, CXI, XCS, diagnostics & common optics) due to commonality in requirements, design, implementation, installation, and integration
- Common control and data systems design for LUSI and the rest of the photon beam-line instruments (AMO, SXR, FES)
4
Scope – WBS 1.6 Control & Data Systems
Included in W.B.S. 1.6:
- All controls & DAQ, labor and M&S, for XPP, CXI, XCS instrument components and diagnostics/common optics included in the baseline
- Includes controllers, racks, cables, switches, installation
- Data storage and processing for FEH instruments
- Initial offline (more effort will be on the operating budget)
- Input signals to the LCLS machine protection system link-node modules
Provided by LCLS X-Ray End Station controls (CAM is G. Haller):
- Personnel protection system
- Machine protection system (LCLS modules, fibers)
- Laser safety system
- Accelerator timing
- Femtosecond laser timing
- Network architecture & security
- Data storage and processing for NEH
- User safeguards
- Laser controls
- CXI 2D detector controls
Interfaces are described in the ICD between XES and LUSI (released document).
5
ESDs and ICDs
Two types of documents are required for each instrument: Engineering Specification Documents (ESDs) and Interface Control Documents (ICDs).
- XPP: SP XPP Controls ESD; SP XPP Controls & DAQ ICD; SP XPP DAQ ESD
- CXI: SP CXI Controls ESD; SP CXI Controls & DAQ ICD; SP CXI DAQ ESD
- XCS: SP XCS Controls ESD; SP XCS Controls & DAQ ICD; SP XCS DAQ ESD
- Diagnostics: SP LUSI Common Diagnostics & Optics ESD
All documents released.
6
Reviews
- Preliminary Design Reviews for instrument controls and data systems are held before the instrument FIDR
- XPP Controls and Common Diagnostics PDRs held February 7, 2009
- CXI and XCS PDRs scheduled for May 11, 2009
- X-Ray End-Station (XES) reviews are under LCLS XES, not part of LUSI; common services: e.g. networking, DAQ, PPS, LSS, MPS
7
Controls: EPICS
- In use at BaBar, APS, ALS; it is the LCLS control system
- Basic EPICS control and monitoring:
  - Vacuum: instruments, connecting 'pipes'
  - Valve control
  - Timing/triggering (timing strobe from EVR)
  - Motion control ('stages')
  - Camera control
  - Bias voltage supplies
  - 120-Hz (slow) analog-to-digital converters
  - Digital I/O bits/states
  - Temperatures
- Hardware: as much as feasible chosen from the LCLS repertoire; new controllers added based on instrument requirements
8
Common Controls Hardware
Examples:
- Racks
- VME crates
- Motorola CPUs
- Timing EVR PMC cards
- CameraLink PMC cards
- VME ISEG HV supplies
- Analog-to-digital converter modules
- Solenoid controllers
- PLCs
- Network switches
- Terminal servers (Ethernet-to-serial port)
9
EPICS/Python/Qt
- EPICS (Experimental Physics and Industrial Control System): control software for real-time systems; monitor (pull scheme), alarm, archive; widely used at SLAC and other labs
- Python/Qt provides the user interface between the EPICS drivers and records and the user
- The system is used for XTOD and AMO and is provided as part of the XES Photon Controls Infrastructure
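As a concrete illustration of the Python/Qt layer on top of EPICS, the sketch below polls a single process variable with pyepics and shows it in a Qt label. The PV name, the one-second polling period, and the choice of pyepics/PyQt5 are assumptions made for the example; this is not LUSI code.

    # Minimal Python/Qt monitor for one EPICS PV (illustrative sketch).
    import sys
    from epics import PV                      # pyepics Channel Access client
    from PyQt5.QtWidgets import QApplication, QLabel
    from PyQt5.QtCore import QTimer

    PV_NAME = "XPP:EXAMPLE:PRESSURE"          # hypothetical PV name

    app = QApplication(sys.argv)
    pv = PV(PV_NAME)                          # connect to the channel
    label = QLabel("connecting...")
    label.setWindowTitle("EPICS monitor")
    label.show()

    def refresh():
        # Poll the current value and update the display (pull scheme).
        value = pv.get()
        label.setText(f"{PV_NAME} = {value}")

    timer = QTimer()
    timer.timeout.connect(refresh)
    timer.start(1000)                         # refresh once per second

    sys.exit(app.exec_())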
10
Example of Python/Qt User Interface
11
Example: Vacuum
- All gauge controllers are MKS 937A
- Interface: terminal server (DIGI TS16 MEI), Automation Direct PLC
- All ion pump controllers are Gamma Vacuum DIGITEL MPC dual
- All valves are controlled by a PLC relay module
- The out/not-out state of all valves goes into the MPS system to prevent damage if a valve closes unexpectedly
12
Example: Motion Control
The system provides support for all motions.
- Motors:
  - IMS MDrive Plus2 integrated controller and motor
  - IMS MForce Plus2 controller for control of in-vacuum and other specialized motors
  - Newport motor controllers
  - Others as required
- Pneumatic motion: solenoid driver chassis (SLAC)
- Articulated detector holder (robot arm): controls group to work with an outside integrator to interface it to the EPICS control system
13
High-Level Applications
To allow commissioners and users of each experiment to:
- Use a common interface to both the DAQ system and EPICS
- Speed up the development cycle by using a high-level programming language, while still being able to easily build critical sections in C/C++
- Easily develop new applications
- Provide a GUI integrated with the programming language
- Re-use code developed by other LUSI experiments
Approach:
- Python as the high-level scripting language: easy to learn, fast development cycle, extensible, open source, powerful, relatively fast
- Qt as the graphical user interface
- Framework and support for scientists provided by PCDS
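To illustrate the point about keeping critical sections in compiled code, the generic sketch below (not LUSI code) uses Python's ctypes module to call a routine from the standard C math library; the same mechanism lets a Python application call project-specific C/C++ libraries where performance matters.

    # Call a compiled C routine from Python via ctypes (generic example).
    import ctypes
    import ctypes.util

    # Locate and load the standard C math library (assumes it can be found).
    libm = ctypes.CDLL(ctypes.util.find_library("m"))
    libm.sqrt.restype = ctypes.c_double
    libm.sqrt.argtypes = [ctypes.c_double]

    print(libm.sqrt(2.0))   # 1.4142..., computed by the compiled C routine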
14
Controls Status
- ESDs and ICDs released for all instruments
- Hardware order lists for LUSI XPP, CXI, XCS are available
- XPP items are being ordered
15
Overall Status
- Control and data systems hardware and software components to be provided are agreed on and documented
- XPP controls & data systems items are being ordered
- The following services required by XPP are already in place in hutch 3, or will be soon, months before required by XPP: Hutch Protection System, Laser Safety System, user safeguards, Machine Protection System, network, timing (accelerator as well as femtosecond laser), racks including AC connections and cooling, data processing and storage, offsite data access and transport
- Racks for XPP are on order; long-haul cable installation contracts are in progress
- Software is in progress
16
Data Sub-System: Data Systems
- The challenge is to perform data correction and image processing while keeping up with continuous incoming data streams
- LUSI benefits from the involvement of the SLAC Particle Physics and Astrophysics group, which has substantial experience acquiring, processing, and archiving large data volumes at high rates
- A common dataflow/processing/storage & offline-interface DAQ is used for instrument components in the real-time detector data chain (BNL & Cornell 2-D detectors, future SXR detector, waveform sampler, etc.), minimizing development, production, commissioning, and maintenance effort
- Data rate/volume of the CXI experiment (comparable to other LUSI experiments):
  - LCLS pulse repetition rate: 120 Hz
  - Detector size: 1.2 Mpixel
  - Intensity depth: 14 bit
  - Success rate: 30%
  - Average data rate: 0.6 Gbit/s
  - Peak data rate: 1.9 Gbit/s
  - Daily duty cycle: 50%
  - Accumulated for 1 station: 3.1 TB/day
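A quick back-of-the-envelope check of the CXI numbers in the list above (the slide's exact rounding conventions are an assumption):

    # Recompute the CXI data-rate figures from the quoted parameters.
    rep_rate   = 120      # Hz, LCLS pulse repetition rate
    pixels     = 1.2e6    # detector size
    depth_bits = 14       # intensity depth
    success    = 0.30     # fraction of pulses kept
    duty       = 0.50     # daily duty cycle

    peak_gbps  = rep_rate * pixels * depth_bits / 1e9   # ~2.0 Gbit/s (slide: 1.9)
    ave_gbps   = peak_gbps * success                    # ~0.6 Gbit/s
    tb_per_day = ave_gbps * duty * 86400 / 8 / 1e3      # ~3.3 TB/day (slide: 3.1)
    print(peak_gbps, ave_gbps, tb_per_day)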
17
Data System Architecture
[Architecture diagram: instrument-specific digitizers, cameras, and 2D detectors, plus timing and beam-line data, feed the Photon Control Data Systems (PCDS) chain (L0: Control, L1: Acquisition, L2: Processing, L3: Data Cache), with output to SCCS offline.]
- Level 0 (Control): run & configuration control; run & telemetry monitoring
- Level 1 (Acquisition): image acquisition, calibration; event building with beam-line data; correction using calibration constants; data reduction (vetoing, compression)
- Level 2 (Processing): pattern recognition, sort, classify, alignment, reconstruction
- Level 3 (Online Archiving): NEH/FEH local data cache; the local cache can buffer up to 4 days' worth of data; offline will transport data to the tape staging area in the SCCS Computer Center
18
LUSI Data Acquisition
- Cornell and Brookhaven 2-D pixel detectors are configured & read out using the SLAC ATCA Reconfigurable Cluster Element modules (details in the following slides)
- XPP: XAMP detector with custom ASIC
- CXI: detector with custom ASIC
19
XAMP 2D-Detector Control and DAQ Chain
[Readout chain diagram: Brookhaven XPP/XCS 2D detector/ASIC (XAMP for XPP) on the beamline instrument, SLAC FPGA front-end board, fiber link to an ATCA crate with SLAC DAQ boards, e.g. the SLAC Reconfigurable Cluster Element module.]
- LUSI instrument custom integrated circuits from Brookhaven are already connected at SLAC to the SLAC LCLS high-performance DAQ system
- XPP BNL XAMP detector: 1,024 x 1,024 array; uses 16 64-channel FexAmp BNL custom ASICs
  - Instantaneous readout: 16 ASICs x 4 ch x 20 MHz x 16 bit ≈ 20 Gbit/s into the FPGA
  - FPGA output: 250 MB/s at 120 Hz (1024 x 1024 x 2 bytes x 120)
- In addition, BNL has an ATCA crate with SLAC modules to develop software and test with the detector
- ATCA (Advanced Telecommunications Computing Architecture): based on a backplane serial communication fabric, 10-G Ethernet
- 2 SLAC custom boards (also used in other SLAC experiments): 8 x 2.5 Gbit/s links to detector modules; dataflow and processing; managed 24-port 10-G Ethernet switching (essentially 480 Gbit/s switch capacity); naturally scalable
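A similar sanity check of the XAMP readout figures quoted above:

    # Recompute the XAMP data rates from the slide's parameters.
    n_asics  = 16         # 64-channel FexAmp ASICs
    channels = 4          # serial output channels per ASIC
    clock_hz = 20e6       # 20 MHz
    bits     = 16

    into_fpga_gbps = n_asics * channels * clock_hz * bits / 1e9  # ~20.5 Gbit/s
    frame_bytes    = 1024 * 1024 * 2                             # 16-bit pixels
    out_mb_per_s   = frame_bytes * 120 / 1e6                     # ~252 MB/s at 120 Hz
    print(into_fpga_gbps, out_mb_per_s)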
20
Example: XPP Online Processing
- Electronics gain correction (in RCE): the response of the amplifying electronics is mapped during calibration; science data images are corrected for channel gain non-uniformity and non-linearity
- Dark image correction (in RCE): dark images are accumulated between x-ray pulses; the averaged dark image is subtracted from each science data image
- Flat-field correction (in RCE): each science data image is corrected for non-uniform pixel response
- Event filtering (in RCE or later): events are associated with beam-line data (BLD) via timestamp and vetoed based upon BLD values; the veto action is recorded; images may be sparsified by predefined regions of interest
- Event binning (processing stage): images (and normalization) belonging to the same bin (dt, Eg, ..) are summed together
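The numpy sketch below illustrates the per-image corrections listed above; the array names, the ordering, and the floating-point arithmetic are illustrative only, since the actual corrections run in the RCE hardware/software.

    import numpy as np

    def correct_image(raw, gain, dark, flat):
        """Apply gain, dark, and flat-field corrections to one raw frame.

        raw  : 2-D array of ADC counts from the detector
        gain : per-pixel gain map from electronics calibration
        dark : averaged dark image accumulated between x-ray pulses
        flat : per-pixel response (flat-field) map, normalized to 1
        """
        img = raw.astype(np.float32) / gain   # gain non-uniformity correction
        img -= dark                           # dark-image subtraction
        img /= flat                           # flat-field correction
        return img

    # Tiny synthetic example with made-up calibration constants.
    rng  = np.random.default_rng(0)
    raw  = rng.integers(0, 2**14, size=(1024, 1024))
    gain = np.full((1024, 1024), 1.02, dtype=np.float32)
    dark = np.full((1024, 1024), 100.0, dtype=np.float32)
    flat = np.ones((1024, 1024), dtype=np.float32)
    corrected = correct_image(raw, gain, dark, flat)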
21
Example: XPP Monitoring
- A copy of the data is distributed (multicast) to monitoring nodes on the DAQ subnet
- The monitoring nodes will provide displays for experimenters' viewing:
  - Corrected XAMPS images at ≥ 5 Hz
  - Histories of veto rates, beam intensity, and other BLD values
  - Reduced analysis of sampled binned data (versus scan parameter)
- Implemented with Qt (C++/Python open-source GUI)
22
Example: XPP XAMPS Data Rates
[Dataflow diagram: within the Photon Control Data Systems (PCDS), XPP-specific digitizers and cameras, timing, and beam-line data feed L0: Control (one node) and L1: Acquisition (many nodes); the XAMPS detector delivers 240 MB/s (480 MB/s) over 4 x 2.5 Gbit/s PGP links into L1 (RCE), which sends < 200 MB/s over 10 GbE to L2: Processing and L3: Data Cache.]
- Binned data, n x (200 MB – 20 GB), archived at end of run (minutes – hours)
- Expect ~ 6 – 60 GB/day
23
CXI 2D-Detector Mechanical/Electrical Vacuum Assembly
[Mechanical assembly drawing (SLAC PPA Engineering), shown with one quadrant raft removed; callouts: positioning plate (supports the quadrant raft, mounts to the drive system), cam follower mounted to the torque ring, hole size remotely adjustable via PCDS controls, pixel detectors, cut-outs in the base plate for cold straps and cables, cold strap.]
24
Quadrant Board and Electrical Interfaces
- The quadrant raft provides structural support and stability for the double-detector packages
- Feet mount on the quadrant raft through holes in the quadrant boards; this is also the thermal path
- Quadrant boards provide the grounding, power, and signal interface to the PAD detector package
- 1 flex cable per detector; 1 FPGA on each quadrant board
[Figure callouts: cold strap, quadrant raft, mounting feet, double-detector package (4 per quadrant), detector, quadrant boards 1 and 2.]
25
ASIC Board
- Rigid-flex ASIC board (SLAC design)
- ASIC bump-bonded to detector
- ASIC/detector package bonded to carrier board
26
CXI 2D-Detector Control and DAQ Chain
[Readout chain diagram: Cornell detector/ASIC with SLAC quadrant board and carrier board inside the vacuum, connected through ground isolation and fiber to an ATCA crate with SLAC DAQ boards.]
- Each Cornell detector has ~36,000 pixels
- Controlled and read out using a Cornell custom ASIC: ~36,000 front-end amplifier circuits and analog-to-digital converters
- Initially 16 x 32,000-pixel devices, then up to 64 x 32,000-pixel devices
- 4.6 Gbit/s average with > 10 Gbit/s peak
27
DAQ Status
- Re-used a significant fraction of the BaBar DAQ software
- Implemented "zero-copy" transmission/reception of network data (hard in Linux)
- Running the full DAQ chain (EVG/EVR/L0/L1/L2/L3): configuring and reading out e.g. Acqiris/Opal1000 with "zero-copy" of objects in memory (better performance); generating official data files and iterating over them
- XPP and CXI detector/ASIC connected to the LCLS system and functional
28
Common Diagnostics Readout
- E.g. intensity monitors, profile monitors, intensity-position monitors
- E.g. Canberra PIPS or IRD SXUV large-area diodes (single or quad)
- Amplifier/shaper/ADC for control/calibration/readout
[Figure: four-diode quad-detector geometry relative to the target and the FEL beam, with labels R1, R2, q1, q2, L; on-board calibration circuits not shown.]
- Board designed, fabricated, loaded, and in test
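As a generic illustration (not necessarily the LUSI readout or calibration algorithm), a four-diode intensity-position monitor is commonly read back by summing the diode signals for total intensity and forming difference-over-sum ratios for position:

    def quad_diode_readback(a, b, c, d, k=1.0):
        """a, b, c, d: background-corrected diode signals; the quadrant
        labeling and the calibration constant k are placeholder assumptions.
        Returns (intensity, x, y)."""
        total = a + b + c + d
        x = k * ((a + d) - (b + c)) / total   # horizontal difference-over-sum
        y = k * ((a + b) - (c + d)) / total   # vertical difference-over-sum
        return total, x, y

    print(quad_diode_readback(1.0, 1.1, 0.9, 1.0))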
29
Interface to LCLS/X-Ray End-Station Infrastructure
- Machine timing (~20 ps jitter)
- Laser timing (< 100 fs jitter)
- 120 Hz beam data
- Machine protection system
- Hutch protection system
- Laser safety system
- Networking
- EPICS server
30
120-Hz Data Feedback Loop
- Low-latency 120 Hz beam-line data communication
- Uses the existing second Ethernet port on the IOCs; no custom or additional hardware required
- UDP multicast, raw Ethernet packets
[Diagram: RF phase cavity, accelerator, and experiment IOCs connected by a dedicated 120-Hz network with timing.]
- Real-time per-pulse information can be used for, e.g.:
  - Vetoing of image samples (using accelerator data)
  - Adjustment of accelerator or photon beamline components based on instrument/diagnostics results
  - Compensation of drifts, etc.
  - Transport of electro-optic timing results to hutch experiments
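A minimal sketch of a receiver for such 120-Hz multicast beam-line data; the multicast group, port, and packet handling below are placeholders, not the actual network configuration or packet format.

    import socket
    import struct

    GROUP, PORT = "239.255.0.1", 10148   # hypothetical multicast group/port

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", PORT))

    # Join the multicast group on the default interface.
    mreq = struct.pack("4sl", socket.inet_aton(GROUP), socket.INADDR_ANY)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

    while True:
        data, addr = sock.recvfrom(1500)   # one datagram per 120-Hz pulse
        # Decode the per-pulse payload here, e.g. for image vetoing.
        print(addr, len(data))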
31
Organization (WBS 1.6 CAM: G. Haller)
- Deputy: P. Anthony
- Online: A. Perazzo
- Controls: R. Machet
- DAQ: C. O'Grady
- Infrastructure: R. Rodriguez
- Offline Computing: I. Gaponenko
- The technical leaders are also responsible for the AMO, SXR, and XES-provided photon-area controls/DAQ/infrastructure needed by LUSI: this lowers the risk of interface issues, provides high efficiency, and ensures common solutions; there is no issue with manpower, and the instruments are time-phased
- Instrument scientists: XPP (D. Fritz), CXI (S. Boutet), XCS (A. Robert), Diagnostics/Common Optics (Y. Feng), Detectors (N. Van Bakel)
32
Some Milestones
XPP:
- Controls PDR (done): Feb 2009
- Controls FDR: Oct 2009
- Start installation of controls: Jan 2010
- Controls ready to use: Jun 2010
CXI:
- Controls PDR: May 2009
- Start installation of controls: Apr 2010
- Controls ready to use: Oct 2010
XCS:
- Controls FDR: Jan 2010
- Start installation of controls: Oct 2011
- Controls ready to use: Apr 2011
33
Summary
- Interface and requirements documents released: it is clear what needs to be done; no issues; the design meets requirements
- Design advanced: most items (hardware and software) are already used in XTOD and AMO, ahead of XPP (and CXI, XCS)
- XPP Preliminary Design Review completed; most items are similar to XTOD and AMO, which both already had Final Design Reviews for controls and data systems (XTOD is starting to be installed, AMO will follow in August 09)
- CXI and XCS Preliminary Design Reviews scheduled for May
- Technical and cost/schedule risks are low: it is already known what is being used and in what quantities, and XPP items are already being ordered
- Configuration and data acquisition for the 2D detectors using the SLAC ATCA system is well advanced
- Data processing for XPP is defined and in progress
- Team: engineers and technicians from the PPA Research Engineering Group; sufficient manpower available for LUSI