Data Acquisition, Diagnostics & Controls (DAQ)

Data Acquisition, Diagnostics & Controls (DAQ) System Acceptance Testing
Annual NSF Review of Advanced LIGO Project, June 24 – June 26, 2014
Rolf Bork, CIT
LIGO-G1400671-v1, 2014 June 16

DAQ Scope
- Provide core software for real-time control and data acquisition
  - Standard application development environment
  - Common software to perform real-time scheduling and I/O functions
- Provide a Timing Distribution System (TDS) to synchronize all control and DAQ systems to GPS
- Provide real-time data acquisition software/hardware for Physical Environment Monitoring (PEM) sensors
- Provide a system to acquire, store and distribute data from the real-time control systems

DAQ System Design Overview
- Timing Distribution System (TDS), slaved to GPS, provides clocks to PCI Express (PCIe) modules in the I/O chassis
- PCIe modules interface to the control computer via a PCIe fiber link
- The control computer acquires data and transmits it to the DAQ data concentrator (DC) via the network
- The DC assembles data from all controllers and broadcasts full data blocks every 1/16 second (sketched below)
- FrameWriter computers format the data and write it to disk as 64 s data frames
- The Network Data Server (NDS) provides data on demand, either live or from disk
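The block-assembly step performed by the DC can be illustrated with a short sketch. This is a minimal, hedged illustration of the idea only: the controller names and the broadcast callback are placeholders, not the actual aLIGO implementation (which is written in C).

```python
# Minimal sketch of the data-concentrator assembly step described above:
# per-controller blocks for a given 1/16 s cycle are collected, and a full
# block is broadcast only once every expected controller has reported.
# Controller names and the broadcast callback are illustrative placeholders.
from collections import defaultdict

EXPECTED_CONTROLLERS = {"ctrl-a", "ctrl-b", "ctrl-c"}   # hypothetical controller hosts

pending = defaultdict(dict)   # cycle number -> {controller: data block}

def on_block(cycle, controller, block, broadcast):
    """Store one controller's block; broadcast the assembled set once complete."""
    pending[cycle][controller] = block
    if set(pending[cycle]) == EXPECTED_CONTROLLERS:
        broadcast(cycle, pending.pop(cycle))

# Example: blocks from all three controllers complete broadcast cycle 0.
received = []
for host in sorted(EXPECTED_CONTROLLERS):
    on_block(0, host, b"data", lambda c, blocks: received.append(c))
print("broadcast cycles completed:", received)
```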

DAQ Test Facilities
- Dedicated DAQ Test Stands (DTS)
  - Scaled versions of the production systems
  - Located at Caltech, LHO and LLO
  - Testing performed using automated regression test software, described in LIGO-T1300721
  - Testing includes both hardware and software
- Employed in various interferometer R&D systems
  - Caltech 40m lab
  - MIT LASTI
  - AEI (Hannover, Germany) laser and 10 m labs

Timing Distribution System (TDS)
- Contracted to Columbia Univ. for manufacture and test after a joint development effort
- Design described in Imre Bartos et al., Class. Quantum Grav. 27, 084025 (2010)
- IRIG-B timing fanout provides accurate time information to the computers
- Timing Slave provides accurate 65536 Hz clocks to the ADC/DAC modules

TDS Testing
- Individual TDS chassis tested prior to delivery, each in accordance with its own test procedure
  - Overview contained in LIGO-T0900050
  - List of all documentation linked from LIGO-E090003
- Continuous on-line verification
  - Each real-time control computer verifies timing via the TDS duotone and IRIG-B signals (see the sketch below)
  - TDS comparators report diagnostics as compared to an atomic clock
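The duotone check mentioned above can be illustrated with a short calculation: fit the two duotone sinusoids in one second of ADC data and convert their relative phase into a timing offset. This is a hedged sketch only; the 960 Hz / 961 Hz component frequencies and the 65536 Hz sample rate are assumptions taken from typical aLIGO CDS descriptions rather than from this presentation, and the production diagnostic is implemented differently in the real-time C code.

```python
# Hedged sketch of a duotone timing check: least-squares fit of two sinusoids
# to one second of ADC samples, converting the beat phase into a time offset
# from the second boundary.  Frequencies and sample rate are assumptions.
import numpy as np

FS = 65536             # assumed ADC sample rate (Hz)
F1, F2 = 960.0, 961.0  # assumed duotone component frequencies (Hz)

def duotone_offset(samples):
    """Estimate the timing offset (seconds) of the duotone from the 1 PPS mark."""
    t = np.arange(len(samples)) / FS
    design = np.column_stack([
        np.sin(2 * np.pi * F1 * t), np.cos(2 * np.pi * F1 * t),
        np.sin(2 * np.pi * F2 * t), np.cos(2 * np.pi * F2 * t),
        np.ones_like(t),                       # allow a DC offset in the data
    ])
    coeffs, *_ = np.linalg.lstsq(design, samples, rcond=None)
    phi1 = np.arctan2(coeffs[1], coeffs[0])    # phase of the F1 component
    phi2 = np.arctan2(coeffs[3], coeffs[2])    # phase of the F2 component
    return (phi2 - phi1) / (2 * np.pi * (F2 - F1))   # beat phase -> seconds

# A perfectly aligned duotone should report an offset of (near) zero.
t = np.arange(FS) / FS
clean = np.sin(2 * np.pi * F1 * t) + np.sin(2 * np.pi * F2 * t)
print(f"offset = {duotone_offset(clean) * 1e6:.3f} us")
```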

PEM Equipment
- For PEM, DAQ provides only the interface to the DAQ system, i.e. it does not provide any of the sensing instruments
- For each IFO, DAQ provides:
  - 3 standard I/O chassis
  - 4 ADC modules and associated Anti-Aliasing (AA) chassis
- All AA chassis tested in accordance with test plan LIGO-T1000673; reports attached to the equipment traveler documentation
- All I/O chassis tested in the Caltech DAQ test system using an automated script; reports auto-generated and attached to the equipment traveler documentation (linked from LIGO-E1200651)
- (Slide figure: standard I/O chassis showing the 17-slot PCIe bus with PCIe uplink, I/O timing bus, Timing Slave, I/O interface module and 24 V DC supply)

DAQ Computing / Storage Equipment Design Overview
- Data Concentrator (DC) (2)
  - Collects data from all real-time control computers via a 1 GigE to 10 GigE Ethernet switch and broadcasts the data onto a separate 10 GigE network
  - One unit on-line, the second a hot backup
- FrameWriter (2)
  - Receives data from the DC
  - Formats data into the LVC standard Frame format
  - Writes data to disk: a 24 TByte local disk and the Data Analysis group disk farm
- Network Data Server (NDS) (2)
  - Provides real-time or stored data on request to various control room software tools
  - NDS clients also developed for Perl, Python and Matlab (example below)
- Two computers running the Solaris operating system connect the disk systems via QFS
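Since the slide notes NDS clients for Python among other languages, a minimal hedged example of fetching a channel through the nds2 Python bindings follows. The host name, channel name and GPS interval are placeholders, not values from this presentation, and attribute names may vary between client versions.

```python
# Hedged sketch: fetching archived data through the NDS2 Python client bindings.
# Host, channel name and GPS interval are placeholders.
import nds2

conn = nds2.connection('nds.example.ligo.org', 31200)   # hypothetical NDS2 server
start_gps, stop_gps = 1080000000, 1080000016             # arbitrary 16 s span
channels = ['H1:PEM-EXAMPLE_CHANNEL']                     # placeholder channel name

buffers = conn.fetch(start_gps, stop_gps, channels)
for name, buf in zip(channels, buffers):
    print(name, len(buf.data), "samples")                 # buf.data is a numpy array
```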

DAQ Primary Requirements
- Support the following continuous data acquisition rates:
  - 4 MByte/sec from each real-time control application, up to 20 MByte/sec from each real-time control computer
    - Verified in both off-line and on-line testing
  - Aggregate data rate of up to 30 MByte/sec from all real-time control computers (total of data to be archived plus user-selected test point data)
    - LHO system presently running at 62 MByte/sec
  - Write data to disk at 10 MByte/sec (compressed)
    - LHO system presently running at 26 MByte/sec: 16 MByte/sec commissioning frame plus 10 MByte/sec science frame (see the arithmetic sketch below)
- Provide the capability to access data, live and archived, from the various control room operations support tools
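As a quick consistency check on the disk-write numbers quoted above (and using the 64 s frame length from the design overview slide), the arithmetic below shows how the two frame streams add up and what frame-file sizes they imply. It is plain arithmetic on the quoted rates, not a measured result.

```python
# Arithmetic on the quoted compressed write rates: the commissioning and
# science frame streams sum to the stated 26 MByte/s, and each stream is
# written as 64-second frame files (frame length from the design overview).
FRAME_LEN_S = 64

rates_mbyte_per_s = {"commissioning frame": 16, "science frame": 10}

print("total compressed write rate:", sum(rates_mbyte_per_s.values()), "MByte/s")
for name, rate in rates_mbyte_per_s.items():
    print(f"{name}: ~{rate * FRAME_LEN_S} MByte per 64 s frame file")
```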

Software Testing
- The primary development project for DAQ is software
- Software test progression:
  - Off-line test system
  - R&D lab installation
  - Production system installation
- Off-line test systems employ the Jenkins continuous integration tool for test management (a schematic nightly sequence is sketched below):
  - Code checked out nightly from the SVN repository
  - All user applications recompiled
  - A representative set of user applications and test applications started on the real-time control computers
  - Test programs run to verify proper operation
  - Test reports auto-generated using the doxygen tool
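A hedged, schematic sketch of the kind of nightly sequence such a Jenkins job might drive is shown below. The repository URL, build command and test entry point are placeholders; the actual aLIGO regression test scripts are the ones described in LIGO-T1300721.

```python
# Schematic sketch of a nightly regression driver of the kind a Jenkins job
# might invoke.  The repository URL, build target and test runner path are
# placeholders, not the actual aLIGO CDS scripts.
import subprocess
import sys

REPO_URL = "https://example.org/svn/cds/trunk"   # placeholder SVN URL
WORKDIR = "cds-nightly"

def run(cmd):
    """Run one build/test step, echoing it and aborting the nightly on failure."""
    print("+", " ".join(cmd))
    if subprocess.run(cmd).returncode != 0:
        sys.exit(1)

run(["svn", "checkout", REPO_URL, WORKDIR])          # nightly checkout from SVN
run(["make", "-C", WORKDIR, "all"])                  # recompile all user applications
run([f"{WORKDIR}/scripts/run_regression_tests"])     # placeholder test entry point
```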

Continuous On-Line Testing
- Real-time control and acquisition systems continuously run various diagnostics, as described in LIGO-T1100625
- DAQ performs various data integrity checks:
  - A CRC checksum is sent with all data at each interface; the receiving code performs its own CRC calculation and verifies a match (see the sketch below)
  - Each DAQ application performs a CRC checksum on the data channel list to verify that channel-to-data ordering is correct
  - CRC checksums of the frame files written by the two FrameWriters are compared to verify that they are identical
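The sender/receiver CRC comparison can be illustrated with a few lines of Python; zlib.crc32 is used here purely for illustration, since the slide does not specify the CRC polynomial or framing used by the production (C) DAQ code.

```python
# Minimal illustration of the sender/receiver CRC check described above.
# zlib.crc32 (CRC-32) is used only for illustration; the polynomial and framing
# of the production DAQ code are not specified on this slide.
import zlib

def send_block(payload: bytes):
    """Sender side: compute a checksum and transmit it alongside the data."""
    return payload, zlib.crc32(payload)

def receive_block(payload: bytes, sent_crc: int) -> bool:
    """Receiver side: recompute the checksum and verify it matches."""
    return zlib.crc32(payload) == sent_crc

data, crc = send_block(b"example 1/16-second data block")
assert receive_block(data, crc)             # intact block passes
assert not receive_block(data + b"x", crc)  # corrupted block is detected
print("CRC checks passed")
```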

DAQ Status
- Installation complete
  - In continuous operation at both sites for several years
- Formal fabrication acceptance review complete
  - Documentation linked from LIGO-E1200645
- All third-interferometer equipment inventoried and committed to long-term storage