Slide 1: LUSI WBS 1.6 Controls and Data Systems
Gunther Haller (haller@slac.stanford.edu), Sub-System Manager
LUSI DOE Review, Breakout Session, August 20, 2008

Slide 2: Content
- Scope
- Cost & schedule: WBS organization, cost, schedule
- Control and data system architecture
  - Control system architecture: devices, controller examples
  - Data system: data systems architecture; science data acquisition & processing; DAQ components; high-level applications; online archive; offline file management, metadata
- Summary

Slide 3: LUSI Controls & Data Systems Location
- Beam line: XPP, LCLS AMO, X-Ray Transport Tunnel (XRT, 200 m), XCS, HEDS, CXI, SXR imaging
- Controls & data systems hardware/software is located in:
  - Hutches 3, 4, and 5 and the control rooms for hutches 3, 4, and 5 (H = hutch)
  - X-ray tunnel
  - NEH and FEH server rooms (NEH: H3, H3 control room & server room; FEH: H4/5, H4/5 control rooms & server room)
- Common control and data systems design for photon beam-line/instruments (XTOD, AMO, LUSI, SXR)

Slide 4: Scope – WBS 1.6 Control & Data Systems
Included in WBS 1.6:
- All controls & DAQ, labor and M&S, for XPP, CXI, and XCS instrument components, with diagnostics/common optics included in the baseline
- Controllers, racks, cables, switches, installation
- Data storage and processing for the FEH
- Initial offline system (further effort will be on the operating budget)
- Input signals to the LCLS machine-protection-system link-node modules
Provided by LCLS X-Ray End Station controls (CAM is G. Haller):
- Personnel protection system
- Machine protection system (LCLS modules, fibers)
- Laser safety system
- Accelerator timing
- Femtosecond laser timing
- Network architecture & security
- Data storage and processing for the NEH
- User safeguards
- Laser controls
- CXI 2D detector controls
Interfaces are described in 1.1-517, the ICD between XES and LUSI (released document).

Slide 5: Performance Requirements
- From the LUSI Performance Execution Plan (PEP)
- This presentation shows how these requirements will be fulfilled

Slide 6: 1.6 WBS to Level 4 – Example: XPP

Slide 7: WBS 1.6.2 Common Controls
- Photon beam feedback
- Electron beam feedback
- Hutch environmental measurement
- FEH data storage
- Data processing: initial Level 2 processing
- Racks & cables: non-hutch racks and cables, mainly FEH

Slide 8: WBS 1.6.3 XPP, 1.6.4 CXI, 1.6.5 XCS
- Requirements, design, setup
- Standard hutch controls
  - Hutch cables, racks, installation
  - Workstations
  - Beamline processor
  - Channel-access gateway
  - Machine protection system interface
- Specific controls
  - Valve/vacuum controls
  - Pop-in profile monitor
  - Pop-in intensity monitor
  - Intensity position monitor
  - Slit controls
  - Instrument-specific controls for each section of the instrument

Slide 9: WBS 1.6.6 Offline Computing
- Data-format API
- Data catalog
- Metadata management
- Processing framework
- Workflow
- Pipeline

Slide 10: Cost Methodology
Basis for agreement on what components need to be controlled and how: detailed Engineering Specification Documents (ESDs) for each instrument. All ESDs are approved and released. There are two ESDs per instrument:
- Controls ESD: describes the devices to be controlled (e.g., motion, vacuum) and the EPICS processing to be performed (e.g., scanning)
- DAQ ESD: describes the devices to be read into the DAQ (e.g., 2D detectors, waveform sampling, some 120-Hz cameras) and the online processing to be performed
Plus one ESD for diagnostics.
Basis for agreement on who is responsible for what and where the interface is: Interface Control Documents (ICDs). The ICDs for all instruments are approved and released.

Slide 11: Example XPP Beam Line
Start from the beam line and itemize the controls.

Slide 12: ESDs and ICDs
XPP
- SP-391-001-21 XPP Controls ESD
- SP-391-001-22 XPP Controls & DAQ ICD
- SP-391-001-23 XPP DAQ ESD
CXI
- SP-391-001-13 CXI Controls ESD
- SP-391-001-14 CXI Controls & DAQ ICD
- SP-391-001-18 CXI DAQ ESD
XCS
- SP-391-001-24 XCS Controls ESD
- SP-391-001-25 XCS Controls & DAQ ICD
- SP-391-001-26 XCS DAQ ESD
Diagnostics
- SP-391-001-19 LUSI Common Diagnostics & Optics ESD
All documents are at http://confluence.slac.stanford.edu/display/PCDS/LUSI+Document+Page

Slide 13: Cost Methodology
Bottom-up estimate: a supporting Excel spreadsheet organized by WBS, created from ESD content (the agreement between scientist and controls).
- Labor: number of hours and detailed tasks for each WBS, based on experience from previous SLAC experiments
- Material: each individual component to be purchased is listed with its price under each WBS
  - Each item carries a reference number pointing to the LUSI Controls Item List spreadsheet, a list of every controls item used for LUSI
  - More than 95% of components are supported by quotes or purchase orders; every item on the item list is backed by a quote or purchase-order printout

Slide 14: WBS Spreadsheet Example
Columns: WBS | Activity | BOE | Hours | Cost | Item | Item # | Count | $/each | Total
The Item # references the Controls Item List (see next slide).

Slide 15: LUSI Controls Item List
- The slide shows the first 14 of ~70 separate items on the LUSI Controls Item List
- Components in the WBS spreadsheet refer to this reference number
- Price-support pages containing copies of previous orders or quotes are labeled with this item #

Slide 16: Contingency
Contingency is calculated for each element from two factors:
- Design maturity: 6 levels for labor, 5 levels for M&S
- Judgment factor: risks, exchange rate, etc.
Contingency is held at the project level.
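As a sketch, the per-element contingency arithmetic described above might look like the following. The maturity percentages and the factor names here are invented placeholders for illustration, not the project's actual factor tables:

```python
# Hypothetical design-maturity factors; the real project defines
# 6 labor levels and 5 M&S levels, which are not reproduced here.
MATURITY_PCT = {"conceptual": 40, "preliminary": 25, "final": 15, "as-built": 5}

def contingency(base_cost, maturity, judgment_pct):
    """Contingency dollars = base cost x (maturity % + judgment %)."""
    pct = MATURITY_PCT[maturity] + judgment_pct
    return base_cost * pct / 100.0
```

For example, a $100,000 element at "preliminary" maturity with a 10% judgment factor would carry $35,000 of contingency, rolled up and held at the project level.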

Slide 17: Project Budget
WBS | Control Accounts | Work Packages | Value
1.1 | 6 | 12 | $5,461,314
1.2 | 14 | 49 | $5,942,486
1.3 | 11 | 45 | $9,486,460
1.4 | 16 | 45 | $7,715,265
1.5 | 10 | 39 | $6,383,995
1.6 (G. Haller) | 20 | 289 | $7,135,691
Total BAC: $42,125,211

WBS 1.6 by resource type: Labor $3,409,458; Non-Labor $3,726,233; Total BAC $7,135,691.
- Detailed bottom-up cost estimate
- Labor: number of hours listed for each task
- All M&S itemized to the component level, almost 100% supported by vendor quotes or recent purchase orders

Slide 18: Schedule
- All tasks and materials (order, award, receive dates) are in P3
- 1.6 is internally linked with predecessors and successors
- "Available" milestones for each deliverable are identified, entered, and linked to instrument "Need" milestones
- Resources are leveled

Slide 19: Milestones
XPP
- XPP Controls PDR: Dec 08
- CD-3A – XPP Instrument Start Construction: Jun 09
- XPP Controls FDR: Sep 09
- XPP Controls available: Mar 10
- CD-4A – XPP Start Operation: Dec 10
CXI
- CXI Controls PDR: Sep 09
- CD-3B – CXI Instrument Start Construction: Apr 10
- CXI Controls FDR: Jun 10
- CXI Controls available: Nov 10
- CD-4B – CXI Start Operation: Dec 11
XCS
- XCS Controls PDR: Nov 09
- CD-3C – XCS Instrument Start Construction: Apr 10
- XCS Controls FDR: Feb 11
- XCS Controls available: Jul 11
- CD-4C – XCS Start Operation: Aug 12

Slide 20: CDA Schedule Critical Envelope
CDA has multiple deliveries to the instruments and is heavily driven by their needs. The project will monitor the strings of activities with the least float.

Slide 21: Slow Controls Tasks & Hardware
EPICS
- In use at BaBar, APS, and ALS; it is the LCLS control system
Basic EPICS control and monitoring
- Vacuum: instruments, connecting pipes
- Valve control
- Timing/triggering (timing strobe from EVR)
- Motion control (stages)
- Camera control
- Bias voltage supplies
- 120-Hz (slow) analog-to-digital converters
- Digital I/O bits/states
- Temperatures
Hardware
- Chosen from the LCLS repertoire as much as feasible
- New controllers added based on instrument requirements

Slide 22: Common Controls Hardware Examples
- Racks
- VME crates
- Motorola CPUs
- Timing EVR PMC cards
- CameraLink PMC cards
- VME ISEG HV supplies
- Analog-to-digital converter modules
- Solenoid controllers
- PLCs
- Network switches
- Terminal servers (Ethernet-to-serial port)

Slide 23: Example: Motion Systems
Newport XCS controller

Slide 24: Common Diagnostics Readout
Four-diode design, used for, e.g., intensity monitors, profile monitors, and intensity position monitors.
[Figure: quad-detector target geometry in the FEL beam; on-board calibration circuits not shown]
- E.g., Canberra PIPS or IRD SXUV large-area diodes (single or quad)
- Amplifier/shaper/ADC for control/calibration/readout

Slide 25: Interface to LCLS
Interfaces to the LCLS/X-Ray End-Station infrastructure:
- Machine timing (~20 ps jitter)
- Laser timing (< 100 fs jitter)
- 120-Hz beam data
- Machine protection system
- Hutch protection system
- Laser safety system
- Networking
- EPICS server

Slide 26: 120-Hz Data Feedback Loop
Low-latency 120-Hz beam-line data communication:
- Uses the existing second Ethernet port on the IOCs; no custom or additional hardware required
- UDP multicast, raw Ethernet packets
[Figure: accelerator, EO timing, and experiment IOCs on a shared 120-Hz network]
Real-time per-pulse information can be used for, e.g.:
- Vetoing image samples (using accelerator data)
- Adjusting accelerator or photon beam-line components based on instrument/diagnostics results
- Compensating drifts, etc.
- Transporting the electro-optic timing result to hutch experiments
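A minimal sketch of the per-pulse record exchange over UDP multicast. The record layout, group address, and port here are illustrative assumptions, not the actual wire format used on the 120-Hz network:

```python
import socket
import struct

# Hypothetical per-pulse record: 32-bit pulse id + 64-bit beam reading
FMT = "!Id"
GROUP, PORT = "239.255.0.1", 10120   # illustrative multicast group/port

def pack_record(pulse_id, beam_value):
    """Serialize one 120-Hz record for multicast transmission."""
    return struct.pack(FMT, pulse_id, beam_value)

def unpack_record(payload):
    """Recover (pulse_id, beam_value) on a receiving IOC or DAQ node."""
    return struct.unpack(FMT, payload)

def open_sender():
    """UDP socket configured for low-latency, link-local multicast."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)
    return s

# Sender side (sketch): open_sender().sendto(pack_record(1234, 13.6), (GROUP, PORT))
```

Because every subscriber on the segment receives the same datagram, the accelerator, the EO timing system, and the hutch experiments can all consume the same per-pulse record without point-to-point links.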

Slide 27: Data Sub-System
Data rate/volume of the CXI experiment (comparable to the other LUSI experiments):
- LCLS pulse repetition rate: 120 Hz
- Detector size: 1.2 Mpixel
- Intensity depth: 14 bit
- Success rate: 30%
- Average data rate: 0.6 Gbit/s
- Peak data rate: 1.9 Gbit/s
- Daily duty cycle: 50%
- Accumulation for 1 station: 3.1 TB/day
Difference from conventional X-ray experiments: a high peak rate and a large volume, comparable to high-energy-physics experiments such as BaBar at SLAC. The challenge is to perform data correction and image processing while keeping up with the continuous incoming data stream. The SLAC Particle Physics and Astrophysics group involved has an advantage here: substantial experience acquiring and processing data at these rates.
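The table's average rate and daily volume follow directly from the peak rate, success rate, and duty cycle; a worked check of the arithmetic (TB here is decimal, 10^12 bytes):

```python
def station_rates(peak_gbps, success_rate, duty_cycle, seconds_per_day=86400):
    """Average rate (Gbit/s) and accumulated volume (TB/day) for one station."""
    avg_gbps = peak_gbps * success_rate
    tb_per_day = avg_gbps * 1e9 / 8 * duty_cycle * seconds_per_day / 1e12
    return avg_gbps, tb_per_day

avg, daily = station_rates(peak_gbps=1.9, success_rate=0.30, duty_cycle=0.50)
# avg ~ 0.57 Gbit/s (the table rounds to 0.6); daily ~ 3.08 TB/day (table: 3.1)
```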

Slide 28: Coherent Imaging of Single Molecules
Diffraction from a single molecule: a single LCLS pulse yields a noisy diffraction pattern of unknown orientation. Combine 10^5 to 10^7 measurements into a 3D dataset: classify/sort, average, align. Reconstruct by oversampling phase retrieval (Miao, Hodgson, Sayre, PNAS 98 (2001)). The highest achievable resolution is limited by the ability to group patterns of similar orientation (Gösta Huldt, Abraham Szöke, Janos Hajdu, J. Struct. Biol. 2003; 02-ERD-047).

Slide 29: Data System Architecture
Photon Control Data Systems (PCDS) levels, fed by timing and beam-line data:
- L0: Control
- L1: Acquisition
- L2: Processing
- L3: Data Cache
Detector-specific and experiment-specific parts: the detector + ASIC and the front-end electronics (FEE).
- The detector may be bump-bonded to the ASIC or integrated with the ASIC
- The FEE provides local configuration registers and state machines, and an ADC if the ASIC has analog outputs
- The FEE uses an FPGA to transmit to the DAQ system

Slide 30: Level 0 Nodes
Level 0 (Control) nodes are the DAQ operator consoles. They provide:
- Run control: partition management, data flow
- Detector control: configuration (modes, biases, thresholds, etc.)
- Run monitoring: data quality
- Telemetry monitoring: temperatures, currents, voltages, etc.
They manage all L1, L2, and L3 nodes in a given partition (i.e., the set of DAQ nodes used by a specific experiment or test stand).

Slide 31: Level 1 Nodes
Level 1 (Acquisition) nodes:
- Receive 120-Hz timing signals, send triggers to the FEE, and acquire FEE data
- Error detection and recovery of the FEE data
- Control of FEE parameters
- Calibration: dark-image accumulation and averaging; transfer-curve mapping and gain calculation; neighbor-pixel cross-talk calculation
- Event-building of FEE science data with beam-line data
- Image processing: pedestal subtraction using calibration constants, cross-talk corrections; partial data reduction (compression); rejection using 120-Hz beam-line data
- Processing is envisioned both in software and in firmware (VHDL)
- Send collected data to Level 2 nodes over 10 Gb/s Ethernet
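The dark-image averaging and pedestal subtraction above reduce to simple per-pixel arithmetic. A pure-Python sketch over flattened pixel lists (the real system applies calibration constants in C++ software or VHDL firmware, not Python):

```python
def average_darks(dark_frames):
    """Per-pixel mean over accumulated dark images -> pedestal array."""
    n = len(dark_frames)
    return [sum(pixel_values) / n for pixel_values in zip(*dark_frames)]

def pedestal_subtract(frame, pedestals):
    """Apply the pedestal calibration constants to one science frame."""
    return [raw - ped for raw, ped in zip(frame, pedestals)]

# Two tiny 2-pixel dark frames -> pedestals, then correct a science frame
peds = average_darks([[100, 102], [102, 104]])      # [101.0, 103.0]
corrected = pedestal_subtract([111, 113], peds)     # [10.0, 10.0]
```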

Slide 32: Level 2 & 3 Nodes
Level 2 (Processing):
- High-level data processing: learn, pattern recognition, sort, classify (e.g., combine 10^5 – 10^7 images into a 3D dataset); alignment, reconstruction
- Currently evaluating different ATCA blades for L2 nodes
- Send processed data to L3 over 10 Gb/s Ethernet
Level 3 (Data Cache):
- Provides data storage, located in the server room in the experimental hall
- The offline system transfers data from the local cache to the tape staging system in the SLAC central computing facilities
- Must be able to buffer data in local storage during downtimes of the staging system

Slide 33: ATCA Crate
ATCA (Advanced Telecommunications Computing Architecture) is based on a serial backplane communication fabric; we use 10-Gigabit Ethernet.
Two custom boards:
- Reconfigurable Cluster Element (RCE) module: interfaces to the detector, with up to 8 x 2.5 Gbit/s links to detector modules
- Cluster Interconnect Module (CIM): managed 24-port 10-G Ethernet switching
One ATCA crate can hold up to 14 RCE boards and 2 CIMs, giving essentially 480 Gbit/s of switch capacity. The design is naturally scalable, and capacity can also be scaled up across crates.
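The quoted crate capacity is just ports times link speed summed over the two CIMs; a worked check:

```python
def crate_switch_capacity_gbps(cims=2, ports_per_cim=24, gbps_per_port=10):
    """Aggregate Ethernet switching capacity of one fully populated ATCA crate."""
    return cims * ports_per_cim * gbps_per_port

# 2 CIMs x 24 ports x 10 Gb/s = 480 Gbit/s, matching the slide's figure
```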

Slide 34: Reconfigurable Cluster Element (I)
The RCE is the most challenging of the Level 1 node types.
- Custom SLAC-made ATCA board, used in other SLAC experiments
- Based on system-on-chip (SoC) technology, implemented with Xilinx Virtex-4 FX-family devices
The Xilinx devices provide:
- Reconfigurable FPGA fabric
- DSPs (200 for the XC4VFX60)
- Generic CPUs (two PowerPC 405 cores running at 450 MHz for the XC4VFX60); PowerPC remains the CPU choice for IP cores in next-generation FPGAs
- TEMAC: Xilinx tri-mode Ethernet hard cores
- MGT: Xilinx multi-gigabit transceivers, 622 Mb/s to 6.5 Gb/s (16 for the XC4VFX60)
[Photo: RCE with RTM]

Slide 35: Reconfigurable Cluster Element (II)
- System memory subsystem: 512 MB of RAM; the memory controller provides 8 GB/s overall throughput; uses Micron RLDRAM II
- Platform flash memory subsystem: stores the firmware code for the FPGA fabric
- Configuration flash memory subsystem: 128 MB of configuration flash with a dedicated file system for storing software code and configuration parameters (up to 16 selectable images)
- Storage flash memory subsystem (optional): up to 1 TB of persistent storage flash per RCE (currently 256 GB per RCE); low-latency/high-bandwidth access through I/O channels using PGP; uses Samsung K9NBG08 (32 Gb per chip)

Slide 36: RCE Software
- Ported an open-source real-time kernel: RTEMS (Real-Time Executive for Multiprocessor Systems)
- Wrote the BSP, mainly in C++ plus some C and assembly
- Wrote a 10-Gb Ethernet driver and PGP drivers for bulk data, and a 1-Gb management interface driver
- Built an interface to the RTEMS TCP/IP network stack
- Developed a specialized network stack for zero-copy Ethernet traffic

Slide 37: Cluster Interconnect Module
ATCA network card:
- Custom SLAC-made board based on two 24-port 10-Gb Ethernet switch ASICs from Fulcrum, with up to 480 Gb/s total bandwidth
- Managed via a Virtex-4 device (currently XC4VFX12)
- Fully managed layer-2, cut-through switch
- Interconnects up to 14 in-crate RCE boards (i.e., 28 RCEs), and multiple crates for additional scalability
Fully configurable:
- Designed to optimize crates populated with RCE boards
- Can use ATCA redundant lanes for additional bandwidth if desired
- Can use 2.5-Gb/s connections in place of standard 1-Gb/s Ethernet
- At the same time, may be configured to connect standard ATCA blades

Slide 38: Experiment Front-End Board
Interfaces to the detector ASIC:
- Control signals
- Row/column clocks
- Biases/thresholds
- Analog pixel voltage
Contains:
- Communication IP core
- Local configuration state machine
- Local image-readout state machine
Example: SLAC board
- FPGA with MGT interfaces, up to 4 x 2.5 Gbit/s fiber I/O
- ~200 digital I/O
- VHDL-programmed; includes the communication IP core provided by SLAC

Slide 39: CXI 2D-Detector Control and DAQ Chain
Chain: Cornell detector/ASIC -> SLAC FPGA front-end board (across the vacuum and ground-isolation boundary) -> fiber -> ATCA crate with SLAC DAQ boards.
- Each Cornell detector has ~36,000 pixels, controlled and read out using a Cornell custom ASIC (~36,000 front-end amplifier circuits and analog-to-digital converters)
- Initially 16 x 32,000-pixel devices, then up to 64 x 32,000-pixel devices
- 4.6 Gbit/s average with > 10 Gbit/s peak

Slide 40: Calibration & Distribution (using the SLAC DAQ)

Slide 41: Noise (using the SLAC DAQ)

Slide 42: XPP 2D-Detector Control and DAQ Chain
BNL XAMP detector:
- 1,024 x 1,024 pixel array
- Uses 16 FexAmp 64-channel BNL custom ASICs
- Instantaneous readout: 4 ch x 20 MHz x 16 bit per ASIC, about 20 Gbit/s into the FPGA across all 16 ASICs
- FPGA output: 250 MB/s at 120 Hz (1024 x 1024 x 2 bytes x 120)
The FexAmp prototype ASIC has been received at SLAC, and configuration and readout tests using the SLAC LCLS DAQ system have begun.
Chain: detector ASIC board (readout ASIC plus ADCs) -> SLAC standard front-end board -> fiber -> SLAC LCLS DAQ ATCA crate.
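The readout figures multiply out as follows, assuming (as the channel counts suggest) that the 4 x 20 MHz x 16-bit stream is per ASIC and that all 16 ASICs are read in parallel:

```python
# Instantaneous input to the FPGA: 16 ASICs x 4 outputs x 20 MHz x 16 bit
in_gbps = 16 * 4 * 20e6 * 16 / 1e9        # 20.48, i.e. ~20 Gbit/s

# FPGA output at 120 Hz: 1024 x 1024 pixels x 2 bytes x 120 frames/s
out_mb_s = 1024 * 1024 * 2 * 120 / 1e6    # ~251.7, i.e. ~250 MB/s
```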

Slide 43: DAQ Waveform Sampling Digitizer
Agilent Acqiris DC282 high-speed 10-bit cPCI digitizer:
- 4 channels
- 2-8 GS/s sampling rate
- Acquisition memory from 1024 kpoints to 1024 Mpoints
- Low-dead-time (350 ns) sequential recording with time stamps
- 6U PXI/CompactPCI standard, 64-bit, 66-MHz PCI bus
- Sustained transfer rate of up to 400 MB/s to the host SBC

Slide 44: High-Level Applications
Goals, to allow the commissioners and users of each experiment to:
- Use a common interface to both the DAQ system and EPICS
- Speed up the development cycle with a high-level programming language, while still being able to build critical sections in C/C++
- Easily develop new applications
- Use a GUI integrated with the programming language
- Re-use code developed by other LUSI experiments
Choices:
- Python as the high-level scripting language: easy to learn, fast development cycle, extensible, open source, powerful, relatively fast
- Qt as the graphical user interface
The framework and support for scientists are provided by PCDS.
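A sketch of what a common Python-side interface over both back ends could look like. All class and method names here are hypothetical illustrations, not the PCDS framework's actual API:

```python
class Channel:
    """Uniform handle over an EPICS PV or a DAQ data source (illustrative)."""

    def __init__(self, backend, name):
        self._backend = backend
        self._name = name

    def get(self):
        return self._backend.read(self._name)

    def put(self, value):
        self._backend.write(self._name, value)


class DictBackend:
    """Stand-in backend so the sketch runs without EPICS or DAQ libraries."""

    def __init__(self):
        self._values = {}

    def read(self, name):
        return self._values[name]

    def write(self, name, value):
        self._values[name] = value
```

User scripts would then address motors, PVs, and DAQ quantities through the same `get`/`put` calls, with only the backend differing, which is the point of the common interface.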

Slide 45: Online Archive
The online archive has a dual role:
- Store science and EPICS data for retrieval/monitoring/analysis by the online system
- Allow DAQ and controls to keep operating during downtimes of the offline staging system
The archive size depends on the average data rate and the estimated downtime. Initial assumptions:
- 2 MB per image, 120-Hz pulse rate, 30% success rate, 50% daily duty cycle: ~3.1 TB/day
- 4 days estimated downtime of the offline staging system (eventually up to 7 days)
- Start with 12 TB, growing to 20 TB before all 3 instruments are operating
- Must scale easily to accommodate larger detectors
- Must initially be able to store > 250 MB/s
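The sizing assumptions above multiply out as follows (a worked check; TB is decimal, 10^12 bytes):

```python
def archive_sizing_tb(image_mb, rate_hz, success, duty, downtime_days):
    """Daily volume and the buffer needed to ride out a staging outage."""
    daily_tb = image_mb * 1e6 * rate_hz * success * duty * 86400 / 1e12
    return daily_tb, daily_tb * downtime_days

daily, buffer_tb = archive_sizing_tb(
    image_mb=2, rate_hz=120, success=0.30, duty=0.50, downtime_days=4)
# daily ~ 3.11 TB/day, matching the slide's ~3.1 TB/day;
# a 4-day outage needs ~12.4 TB, consistent with the initial 12 TB
```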

Slide 46: Online Archive Data Format
- The online system acquires data from instruments as C++ objects; each class represents an instrument data type or an instrument configuration
- Classes may also describe processed instrument data, or EPICS data needed for data analysis
- Data are written to disk in the native object-oriented DAQ format, stored in its memory representation
- The classes are designed for high performance and self-description:
  - Minimize the read/write operations needed to re-create or store an object
  - Maximize the ability to adapt to changes in the data structures (e.g., the number of pixels for a given detector) without introducing a new class
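The self-describing idea can be sketched as a small header carrying a type id, a version, and the payload length, so a reader can skip unknown objects or adapt to a changed layout without a new class. The layout below is an illustrative assumption, not the actual DAQ format:

```python
import struct

HEADER = "!HHI"  # type id, version, payload length (hypothetical layout)
HDR_LEN = struct.calcsize(HEADER)

def write_obj(type_id, version, payload):
    """Prepend a self-describing header to one object's raw bytes."""
    return struct.pack(HEADER, type_id, version, len(payload)) + payload

def read_obj(buf, offset=0):
    """Decode one object; returns (type_id, version, payload, next_offset)."""
    type_id, version, length = struct.unpack_from(HEADER, buf, offset)
    start = offset + HDR_LEN
    return type_id, version, buf[start:start + length], start + length
```

Because each record announces its own length, a stream of heterogeneous objects can be walked record by record, and an object whose type or version is unrecognized can simply be skipped.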

Slide 47: Archive and Offline System
The interface between the archive and the offline system has two parts:
- Files are staged on dedicated local disk, with a 10 Gb/s link between the NEH and SCCS for bulk data transfer; a replicated MySQL database maintains the transfer state
- A MySQL database in the PCDS enclave shares metadata: availability of a file, completion of a file copy operation, etc.
The 1.6-526 Online/Offline ICD (Interface Control Document) has been released. Offline will store the data in HDF5 files, compatible with the NeXus standard for X-ray, neutron, and muon data.
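A sketch of the transfer-state bookkeeping, using SQLite as a runnable stand-in for the replicated MySQL database; the table and column names are invented for illustration:

```python
import sqlite3

con = sqlite3.connect(":memory:")  # stand-in for the replicated MySQL DB
con.execute("CREATE TABLE transfers (path TEXT PRIMARY KEY, state TEXT)")

def stage(path):
    """Register a file staged on local disk, awaiting copy to SCCS."""
    con.execute("INSERT INTO transfers VALUES (?, 'staged')", (path,))

def mark_copied(path):
    """Record completion of the bulk copy over the 10 Gb/s link."""
    con.execute("UPDATE transfers SET state = 'copied' WHERE path = ?", (path,))

def pending():
    """Files still waiting on the offline staging system."""
    rows = con.execute("SELECT path FROM transfers WHERE state = 'staged'")
    return [row[0] for row in rows]
```

Both sides consult the shared state: the online side keeps writing and staging during an offline outage, and the offline side drains `pending()` when the staging system returns.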

Slide 48: File Management, Metadata
The 1.6-118 Offline Data Management System document has been released.
File management:
- A central file manager tracks all files [iRODS]
- A high-performance parallel file system is used for disk storage [Lustre]
- A tape system is used for long-term archiving [HPSS]
- Network-based and disk-based (e-SATA, USB) export interfaces
Metadata:
- A science metadata database contains user, run, instrument, and pulse attributes from the online system
- Additional user and run information is replicated from the electronic logbook
- All metadata may be queried to locate files, and portions of files, of interest
- Metadata is exported with the data

Slide 49: Analysis Options
Users can work through the NeXus API or the HDF5 file API directly.
- Tools and analysis packages (open source): Open Genie, LAMP, GumTree, Nathan, Redas, Scilab, Amortool, and more
- Open-source libraries: Cactus, Chombo, dxhsf5, H5PartRoot, HL-HDF, ParaView, PyTables, VisAD, and many more
- Commercial: IDL-HDF5, Matlab, Mathematica, O-Matrix, ViTables, and more

Slide 50: Scientific Computing
Scientific computing for LUSI science:
- Opportunities and needs are being evaluated; they depend strongly on the detailed nature of the science
- Unprecedented size (for photon science) of the data sets to be analyzed
- Unprecedented computational needs (for photon science), comparable in scale to a major high-energy physics experiment, but with a greater need for flexibility
- The main scientific computing effort is not part of the baseline

Slide 51: Risk
IF there are major changes in the scope, performance, existence, or placement of CXI/XPP/XCS instrumentation due to evolving user requirements, THEN it might be difficult to meet the schedule and budget as specified in P3.
Mitigation:
- Release engineering requirement documents (already done)
- Adhere to the BCR process (an LCLS requirement)
- Participate in the experimental-area design process (already participating)

Slide 52: Controls/DAQ Team Leaders
1.6 CAM: G. Haller (CAM: Control Account Manager); Deputy: P. Anthony
- Online: A. Perazzo
- Controls: D. Nelson
- DAQ: C. O'Grady
- Infrastructure: R. Rodriguez
- Offline Computing: S. Luitz
Scientists:
- XPP: D. Fritz
- CXI: S. Boutet
- XCS: A. Robert
- Diagnostics/Common Optics: Y. Feng
- Detectors: N. Van Bakel
The technical leaders are also responsible for the AMO and XES-provided photon-area controls/DAQ/infrastructure needed by LUSI. This lowers the risk of interface issues, provides high efficiency, and ensures common solutions. Manpower is not an issue, and the instruments are time-phased; LUSI controls could be accelerated, driven entirely by budget availability.

Slide 53: Summary
- All control and data systems requirements in the LUSI Performance Execution Plan will be met by the system presented for WBS 1.6
- Technical, cost, and schedule risks are low
  - Well-documented agreements with the instruments
  - Re-use of LCLS controls software and hardware where appropriate
  - Bottom-up cost estimate with detailed quotes for each component
  - Schedule fully linked and resource-leveled
- The data subsystem concept and architecture are well developed
  - Standard interface to all detectors
  - CXI and XPP/XCS detector ASICs are already being configured and read out using the LCLS DAQ system
  - The data management system provides high bandwidth and is scalable
  - Leverages significant SLAC expertise in data acquisition and management
- Ready to be approved for the cost and schedule baseline

Slide 54: END OF PRESENTATION

