1
DAQ Software
Gordon Watts, UW Seattle
Director's Review, December 8, 1999
- Introduction to the System
- Goals for Installation & Commissioning
- Software Tasks & Manpower
2
Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 2
Run II DAQ
- Readout channels: will be ~800,000 per event in Run 2
- Data rates:
  - 60-80 crates
  - Initial design capacity: ~1000 Hz, 250 MBytes/sec into the DAQ/L3 farm
- Integration & control: with the online system
3
Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 3
Goals
- Continuous operation
  - Version 0: PC and Run 1 DAQ hardware simulate the functionality of the Run 2 system
  - Looks similar to the final system to both Level 3 and the outside user
- Integrate with hardware as it arrives, with only small perturbations
- Reliability
- Integration with the online system (monitoring, errors, etc.), so we don't get calls at 4 am
- Careful testing as we go along
  - Test stand at Brown
  - Si test and other bootstrap operations here
- System isn't fragile
  - If things aren't done in the exact order, deal with it and give understandable error messages
- All code kept in a code repository (VSS)
4
[Diagram: front-end crates feed VRC 1 through VRC 8 over segment data cables and front-end token readout loops; primary Fiber Channel loops #1 through #8 run through SB 1 through SB 4 to the L3 nodes (1 of 16 shown); the ETG, tied to the Trigger Framework by the event tag loop, steers the event flow; the L3 nodes send data over Ethernet to the collector/router.]
We have to write software for all of this.
5
Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 5
L3 Software
[Diagram: the L3 software components: VRC, SBs, ETG, L3 farm node, L3 Supervisor, L3 Monitor, and SC nodes, connected to the online system through the collector/router.]
6
Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 6
L3 Software
During running, the DAQ hardware is stand-alone
- Running components do not require software intervention on an event-by-event basis, except for monitoring
- Software must deal with initialization and configuration only
Except for the Farm Node, DAQ components require very little software
- VRC and SB are simple, similar control programs with almost no parameter settings
- ETG is similar, with more sophisticated software to handle routing table configuration
- Farm Node and Supervisor are the only components that require significant programming effort; the Monitor node to a lesser extent
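A minimal sketch of what "very little software" means for a component like the VRC or SB: a small control program that only reacts to configure/start/stop requests and never touches individual events. All names and the parameter strings here are illustrative, not the real component code.

```cpp
// Sketch of a simple DAQ component control program (VRC/SB style).
// Hypothetical names; the real components take their commands from the
// L3 Supervisor rather than from a hard-coded main().
#include <iostream>
#include <map>
#include <string>

enum class State { Idle, Configured, Running };

class ComponentControl {
public:
    void configure(const std::map<std::string, std::string>& params) {
        // Load the (few) parameters the hardware needs and program it once.
        params_ = params;
        state_ = State::Configured;
    }
    void start() { if (state_ == State::Configured) state_ = State::Running; }
    void stop()  { if (state_ == State::Running)   state_ = State::Configured; }
    State state() const { return state_; }
private:
    State state_ = State::Idle;
    std::map<std::string, std::string> params_;
};

int main() {
    ComponentControl vrc;
    vrc.configure({{"loop", "1"}, {"crates", "8"}});  // illustrative parameters
    vrc.start();
    // While running, the hardware moves events on its own; the program only
    // wakes up again for monitoring requests or a stop command.
    vrc.stop();
    std::cout << "final state: " << static_cast<int>(vrc.state()) << "\n";
}
```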
7
Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 7
ETG Interface
[Diagram: the ETG node runs the ETG program control, backed by disk, with embedded systems; it talks to the L3 Supervisor and L3 Monitor over DCOM and connects to the Trigger Framework (triggers, disable).]
Similar to the VRC (and SB); will reuse software.
8
Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 8
Filter Process
The physics filter executes in a separate process
- Isolates the framework from crashes
  - The physics analysis code changes much more frequently than the framework once the run has started
  - Crash recovery saves the event, flags it, and ships it up to the online system for debugging
- Raw event data is stored in shared memory
[Diagram: framework and filter process communicating through shared memory and mutexes, based on Run 1 experience.]
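A sketch of the isolation idea only: the event sits in memory shared between the framework and a separate filter process, so a filter crash cannot take the framework down, and the framework can flag the offending event. The production framework ran on NT with named shared memory and mutexes; this POSIX fork/mmap version is just a stand-in, with hypothetical structure and field names.

```cpp
// Sketch of crash isolation via a separate filter process (POSIX stand-in
// for the NT shared-memory/mutex scheme described above).
#include <sys/mman.h>
#include <sys/wait.h>
#include <unistd.h>
#include <cstring>
#include <iostream>

struct EventBuffer {
    bool flagged_for_debug;   // set if the filter crashed on this event
    char data[1024];          // raw event data (tiny, for illustration)
};

int main() {
    // Shared memory visible to both the framework and the filter process.
    auto* ev = static_cast<EventBuffer*>(mmap(nullptr, sizeof(EventBuffer),
        PROT_READ | PROT_WRITE, MAP_SHARED | MAP_ANONYMOUS, -1, 0));
    std::strcpy(ev->data, "raw event from the readout");
    ev->flagged_for_debug = false;

    pid_t filter = fork();
    if (filter == 0) {                       // --- filter process ---
        // Physics filter code runs here; a crash only kills this process.
        bool keep = std::strlen(ev->data) > 0;    // stand-in for the filter decision
        _exit(keep ? 0 : 1);
    }
    int status = 0;                          // --- framework process ---
    waitpid(filter, &status, 0);
    if (WIFSIGNALED(status)) {
        // Filter crashed: keep the event, flag it, and ship it to the online
        // system for debugging instead of losing it.
        ev->flagged_for_debug = true;
    }
    std::cout << "filter crashed? " << ev->flagged_for_debug << "\n";
    munmap(ev, sizeof(EventBuffer));
}
```

Using a separate process rather than a thread is the point of the slide: a crash in a thread would bring down the whole framework, while a crashed process can simply be restarted against the same shared-memory buffer.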
9
Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 9
Physics Filter Interface
ScriptRunner
- Framework that runs physics tools and filters to make the actual physics decision
- Cross-platform code (NT, Linux, IRIX, OSF?)
- Managed by the L3 Filters Group
[Diagram: the L3 filter process wraps ScriptRunner behind the L3 framework interface.]
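A sketch of the kind of boundary the diagram implies, with hypothetical class and method names (this is not the real ScriptRunner API): the framework sees only an abstract filter interface, and ScriptRunner plugs in behind it, which is what lets the same physics code run offline and online on different platforms.

```cpp
// Hypothetical interface between the L3 framework and the physics filter.
#include <cstddef>
#include <cstdint>
#include <iostream>
#include <memory>
#include <string>

// What the framework needs from any physics-decision engine.
class L3FilterInterface {
public:
    virtual ~L3FilterInterface() = default;
    virtual void configure(const std::string& triggerList) = 0;
    // Returns true if the event passes and should be shipped out.
    virtual bool filterEvent(const uint8_t* raw, std::size_t nbytes) = 0;
};

// Adapter that would hand the event to ScriptRunner's tools and filters.
class ScriptRunnerFilter : public L3FilterInterface {
public:
    void configure(const std::string& triggerList) override {
        std::cout << "loading trigger list: " << triggerList << "\n";
    }
    bool filterEvent(const uint8_t*, std::size_t nbytes) override {
        return nbytes > 0;   // stand-in for the real physics decision
    }
};

int main() {
    std::unique_ptr<L3FilterInterface> filter =
        std::make_unique<ScriptRunnerFilter>();
    filter->configure("example_trigger_list");        // illustrative name
    uint8_t event[16] = {};
    std::cout << "pass? " << filter->filterEvent(event, sizeof(event)) << "\n";
}
```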
10
Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 10
L3 Supervisor
Manages configuration of the DAQ/trigger farm
- About 60 nodes
Command planning
- The online system will send Level 3 simple commands
- L3 must translate them into the specific commands to each node to achieve the online system's requests
Supports
- Multiple runs
- Partitioning the L3 farm
- Node crash and recovery
- Generic error recovery, with minimal impact on running
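A sketch of the command-planning idea, with invented node names and command strings (the real COOR syntax is not reproduced here): a single simple request is expanded into per-node commands, and only the nodes in the requesting run's partition of the farm are touched, which is what makes multiple simultaneous runs possible.

```cpp
// Sketch of expanding one simple request into per-node commands for a
// partition of the farm. All names and command text are illustrative.
#include <iostream>
#include <string>
#include <vector>

struct NodeCommand { std::string node; std::string command; };

std::vector<NodeCommand> planConfigure(const std::string& run,
                                       const std::vector<std::string>& partition) {
    std::vector<NodeCommand> plan;
    plan.push_back({"vrc01", "configure run=" + run});
    plan.push_back({"etg01", "load_routing run=" + run});
    for (const auto& node : partition)            // only this run's farm nodes
        plan.push_back({node, "start_filter run=" + run});
    return plan;
}

int main() {
    // Two runs can own disjoint partitions of the ~60-node farm.
    auto plan = planConfigure("run_104", {"l3node07", "l3node08", "l3node09"});
    for (const auto& c : plan) std::cout << c.node << ": " << c.command << "\n";
}
```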
11
Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 11
Error Logging & Monitoring
Error logging
- The L3 Filters group will use the ZOOM ErrorLogger
  - Adopted a consistent set of standards for reporting errors
- Plug-in module to get the errors off the Level 3 nodes
  - Sent to the monitor process for local relay to the online system
  - Logfiles written in a standard format; trying to agree with the online group to make this standard across all online components
Monitoring
- Non-critical information: event counters, buffer occupancy, etc.
- Variables declared and mapped to shared memory; a slow repeater process copies the data to the monitor process
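A sketch of the plug-in idea for getting errors off a node. The ZOOM ErrorLogger API is not reproduced; the destination interface, message fields, and logfile format below are all illustrative assumptions.

```cpp
// Sketch of an error-forwarding plug-in: filter code reports through a
// common logger, and a plug-in destination relays the message toward the
// monitor process in one standard one-line format.
#include <iostream>
#include <memory>
#include <string>
#include <vector>

struct ErrorMessage { std::string severity, module, text; };

class ErrorDestination {                       // the plug-in point
public:
    virtual ~ErrorDestination() = default;
    virtual void report(const ErrorMessage& m) = 0;
};

class MonitorRelayDestination : public ErrorDestination {
public:
    void report(const ErrorMessage& m) override {
        // Hypothetical standard format; the same line would go to the logfile
        // and be relayed to the online system.
        std::cout << "L3NODE | " << m.severity << " | " << m.module
                  << " | " << m.text << "\n";
    }
};

int main() {
    std::vector<std::unique_ptr<ErrorDestination>> sinks;
    sinks.emplace_back(new MonitorRelayDestination);
    ErrorMessage m{"WARNING", "muon_filter", "hit bank missing, event flagged"};
    for (auto& s : sinks) s->report(m);
}
```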
12
Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 12
DAQ/Trigger Integration
Between the hardware and the online system the interface is minimal
- Data output
- Control
- Monitor information
The implications are not minimal
[Diagram: trigger and readout (detector, L1/L2, TCC, readout crates, data cables) feed Level 3 (NT L3 nodes, L3 Supervisor, L3 Monitor), which connects over Ethernet to the online host system (COOR, monitor, collector/router, data logger, RIP disk, FCC) on UNIX hosts, with DAQ and detector consoles attached.]
13
Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 13
Software Integration
Integration outside of the Level 3 software
Integration with offline (where we meet)
- Level 3 filter: must run the same offline and online
- DOOM/DSPACK
Control and monitor communication
- Uses the ITC package, the online group's standard communications package
Requires offline-like code releases built on the online system
14
Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 14
NT Releases
The build is controlled by SoftRelTools
- Hundreds of source files, so a build system is required
- UNIX-centric (offline); too much work to maintain two build systems
SRT2 NT integration is done
- SRT2 is the build system
- Set us back several months; no assigned person
Li (NIU MS) is building NT releases now
- Just starting…
- Starting with a small DAQ-only release: DSPACK + friends, itc, thread_util, l3base
- Next step is to build a release of 30+ packages: everything we had in the nt00.12.00 release
15
Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 15
NT Releases
Progress is slow
- The build system is still in flux!
What does it affect?
- ScriptRunner + filters + tools + ITC
- The 10% test right now
- Our ability to test the system now (dummy versions of the ScriptRunner interface)
- Regular NT trigger releases must occur by March 15, 2000: muon filtering in L3 is one of the commissioning milestones
16
Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 16
Scheduling
Conflicting requirements
- Must be continuously available starting now
- Must upgrade and integrate the final hardware as it arrives
Software is impacted
- Must upgrade in tandem and without disturbing the running system
Tactic
- Version 0 of the software
- Upgrade adiabatically
- Interface to internal components remains similar
- Interface to the online system does not change
- Test stand at Brown University for testing
17
Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 17
VRC Interface
[Diagram: the VRC node runs the VRC program control, backed by disk, with embedded systems; VBD data cables bring in data at 50 MB/s, the FCI from the last SB and the FCI to the 1st SB run at 100 MB/s, and the node talks to the L3 Supervisor and L3 Monitor over DCOM.]
18
Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 18
VRC Interface (V0)
[Diagram: the Version 0 VRC node runs the VRC program control, backed by disk; VBD data cables bring in data at 50 MB/s, two VME/MPM links (to SB/ETG) carry 100 Mb/s, and the node talks to the L3 Supervisor and L3 Monitor over DCOM.]
19
Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 19
November 1, 1999
- Read raw data from the FEC into the VRC
- Send raw data in offline format to the online system
- Control via COOR (held up by NT releases)
[Status diagram: COOR, L3 Supervisor, detector/VRC, collector/router, and auto start utility linked by ITC and DCOM; items marked done, started (FEC path at 50%), or not started.]
20
Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 20
February 15, 2000
- Multicrate readout
- Internal communication done via ACE (already implemented)
[Status diagram: COOR, L3 Supervisor, detector/VRCs, SB/ETG, L3 farm node, collector/router, and auto start utility linked by ITC, DCOM, and ACE; items marked done, started, or not started, with progress at 25%, 50%, and 75%.]
21
Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 21
March 15, 2000
- Muon filtering in Level 3
  - ScriptRunner interface must be up
  - NT releases must be regular
[Status diagram: same components as above, linked by ITC, DCOM, and ACE; items marked done, started, or not started, with progress at 20% and 50%.]
22
Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 22
May 1, 2000
- Multistream readout
  - Ability to partition the L3 farm (multiple simultaneous runs)
  - Route events by trigger bits
  - ScriptRunner does output streams
[Status diagram: same components as above; items marked done, started, or not started, with progress at 10%, 25%, and 45%.]
23
Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 23
Test Stands
Detector subsystems have individual setups
- Allows them to test readout with the final configuration
- Allows us to test our software early: high-speed running, stress tests for the DAQ software
Subsystems have some unique requirements
- Necessary for error-rate checking in the Si, for example
- Separate software development branches: attempt to keep as close as possible to the final L3 design to avoid support headaches
24
Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 24
Test Stands
Three test stands currently in operation
- Brown test stand
  - Tests hardware prototypes
  - Primary software development
- Silicon test stand
  - An L3 node directly reads out a front-end crate
  - Helping us and the Si folks test readout, perform debugging, and make system improvements
- CFT test stand
  - Instrumented and ready to take data (missing one tracking board (VRBC) to control readout)
25
Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 25
10% Test
The Si test stand will evolve into full-blown readout
- 10% test: single-barrel readout
- Requires a full L3 node
- Tests out the silicon filter code (ScriptRunner, trigger tools, etc.); NT releases must be up to speed for this
This is in progress as we speak
- The ScriptRunner components are held up by NT releases
26
Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 26
People
Joint effort
- Brown University
- University of Washington, Seattle
People:
- Gennady Briskin, Brown
- Dave Cutts, Brown
- Sean Mattingly, Brown
- Gordon Watts, UW
- +1 post-doc from UW
- Students (>1)
27
Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 27
Tasks
- VRC: simple; once done it will require few modifications (1/4 FTE)
- SB: simple; once done it will require few modifications (very similar to the VRC) (1/4)
- ETG: complex initialization required; the hardware interface is not well understood yet and requires little work now. By the time the VRC ramps down, this will ramp up (1/2)
- Farm Node: a large amount of work left to do in communication with the supervisor and with ScriptRunner. Will require continuous work as the system gains in complexity (3/4)
28
Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 28
Tasks
- L3 Supervisor: complex communication with COOR; started, but will require continuous upgrades as the system develops in complexity (1/2)
- Monitoring: initial work done by undergraduates. Has to interface to the outside world. No one is working on it at the moment (1/4)
- NT Releases: offloading to an NIU student. Requires continuous work and interfacing with many different software developers (1)
- L3 Filter Integration: done by hand now; will have to be made automatic, taking advantage of offline tools (1/2)
29
Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 29
Conclusions
NT releases have been the biggest delay
- Keeping up with the offline changes requires constant vigilance
- Offloading this task to a dedicated person
- Impacts the 10% test and the March 15 milestone
The group is the correct size to handle the task
- Continuous operation
- Integrating the new hardware with the software
- As long as this group isn't also responsible for releases
Current weak points
- Monitoring
- Integration with the online system (log files, error messages, etc.)
30
Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 30
L3 Farm Node
[Diagram: the L3 node framework sits between a DMA-capable VME-PCI bridge (reading MPMs in the VME crate at 48 MB/s each) and dedicated 100 Mbit/s Ethernet to the online collector/router; it comprises a node-VME I/O module, shared memory buffers, an L3 filter interface module feeding the L3 filter processes, a collector/router module, and a control, monitoring, and error module.]
A prototype of the framework is finished and runs in the Silicon Test Stand. A second version will be finished by Jan 1, with improved speed, better interaction between processes, a new interface, and better stability.
31
Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 31
Details of the Filter Node
[Diagram: the filter node dataflow. The MPM reader gets a pointer to an event buffer, configures the MPMs to receive a new event, waits until the complete event arrives in the MPM, loads the event data into a shared memory buffer, and inserts the event pointer into the next queue. Event validation checks FEC presence and checksums. The L3 filter input and output interface processes move events between the pool, validation, filter, and output event queues around the shared memory event buffers and the L3 filter process. The collector/router network interface determines where each event should be sent and sends it to the collector/router node; data goes to the online host system. The L3 supervisor, monitor, and error interfaces share a command/monitor/error shared memory region.]
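A sketch of the dataflow shown above, reduced to its essentials: events move between the reader, validation, filter, and output stages as pointers passed through queues, while the event data itself stays in the buffers. This single-threaded, in-process version with dummy data is only illustrative; the real node runs these stages as separate processes around shared memory.

```cpp
// Sketch of the filter-node pipeline: stages hand event pointers down a
// chain of queues rather than copying the event data.
#include <iostream>
#include <queue>
#include <vector>

struct Event { int id; std::vector<unsigned char> data; bool pass = false; };

int main() {
    std::queue<Event*> validationQ, filterQ, outputQ;
    std::vector<Event> buffers = {{1, {0x1}}, {2, {}}, {3, {0x3}}};  // "shared memory"

    // MPM reader: wait for a complete event, then hand a pointer onward.
    for (auto& ev : buffers) validationQ.push(&ev);

    // Validation: check FEC presence / checksums (here: non-empty data).
    while (!validationQ.empty()) {
        Event* ev = validationQ.front(); validationQ.pop();
        if (!ev->data.empty()) filterQ.push(ev);     // bad events would be flagged
    }

    // Filter: the L3 filter process marks events to keep.
    while (!filterQ.empty()) {
        Event* ev = filterQ.front(); filterQ.pop();
        ev->pass = true;                             // stand-in for the filter decision
        if (ev->pass) outputQ.push(ev);
    }

    // Output: decide which collector/router node gets the event and send it.
    while (!outputQ.empty()) {
        Event* ev = outputQ.front(); outputQ.pop();
        std::cout << "event " << ev->id << " sent to collector/router\n";
    }
}
```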
32
Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 32
L3 Supervisor Interface
Receives and interprets COOR commands and turns them into internal state objects
The next step is communication to the clients: VRC/ETG/SB/L3Node
[Diagram: online system commands enter through the COOR command interface as a configuration request; the desired configuration, the current configuration database, and the resource allocator feed the command generator and sequencer, which issue direct commands to the L3 node clients.]
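A sketch of the supervisor's internal flow as drawn above: parse a request into a desired-configuration state object, compare it with the current configuration, and have the sequencer issue only the commands needed to get from one to the other. The command text, node names, and data structures are all illustrative, not the real COOR protocol.

```cpp
// Sketch: COOR request -> desired configuration -> sequenced direct commands.
#include <iostream>
#include <map>
#include <string>

using Config = std::map<std::string, std::string>;   // node -> desired task

Config parseCoorCommand(const std::string& cmd) {
    Config desired;
    if (cmd == "begin_run muon")                      // toy command, not COOR syntax
        for (int n = 7; n <= 9; ++n)
            desired["l3node0" + std::to_string(n)] = "muon_filter";
    return desired;
}

void sequence(const Config& current, const Config& desired) {
    // Issue commands only where the desired state differs from the current one.
    for (const auto& [node, task] : desired)
        if (!current.count(node) || current.at(node) != task)
            std::cout << "send to " << node << ": configure " << task << "\n";
}

int main() {
    Config current;                                   // farm is idle
    Config desired = parseCoorCommand("begin_run muon");
    sequence(current, desired);                       // minimal set of direct commands
}
```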
33
Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 33
Auto Start System
[Diagram: the auto start service on each client machine gets its package list from the configuration database, installs packages from the package database, and manages the running packages; it can also change packages, get status, reboot, etc.]
Designed to automatically start after a cold boot and bring a client to a known idle state
Also manages software distribution
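A sketch of the cold-boot behavior described above, with invented package and node names: the service asks the configuration database which packages this node should run, installs any that are missing, and starts them so the node comes up in a known idle state.

```cpp
// Sketch of an auto-start service bringing a client to a known idle state
// after a cold boot. Package names and the "database" lookup are illustrative.
#include <iostream>
#include <set>
#include <string>
#include <vector>

std::vector<std::string> packageListFor(const std::string& node) {
    // Stand-in for a query to the configuration database.
    if (node.rfind("l3node", 0) == 0)
        return {"l3framework", "l3filter", "monitor_repeater"};
    return {"vrc_control"};
}

int main() {
    const std::string node = "l3node07";
    std::set<std::string> installed = {"l3framework"};    // survived the reboot

    for (const auto& pkg : packageListFor(node)) {
        if (!installed.count(pkg)) {
            std::cout << "installing " << pkg << "\n";    // fetch from the package database
            installed.insert(pkg);
        }
        std::cout << "starting " << pkg << " (idle)\n";   // bring it to the known idle state
    }
}
```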
34
Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 34
Timeline
[Gantt chart, January 2000 through February 2001: detector installation and hookup (ICD, FPS, Lum (L0), fiber tracker install/hookup, 1/2 then all VLPCs installed, 1st CFT crate operational, waveguide production, final CFT electronics, tracking front-end, SMT install/hookup, forward MDT and pixel planes install/survey (A&B, then C), EMC assembly/install/alignment, end toroids, CAL BLS, ECS hookup), CC and ECN cosmics, roll-in and shield-wall removal, and the cosmic ray commissioning phases: Phase I central muon, DAQ, RECO, trigger, tracker front-end; Phase II fiber tracker, preshowers, VLPCs, CFT, forward muon; Phase III full cosmic ray run (add TRIG, SMT, CAL). DAQ available. 1st collaboration commissioning milestone: Feb 15, 2000. Beam-ready; Run II begins Mar 1, 2001.]
35
Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 35
Silicon Test Display
[Screenshots: the master GUI, monitor counters, and raw data viewer running on CPU1 and CPU2.]
36
Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 36
Monitoring
Non-essential information, but helpful for debugging
Two sources of information:
- Level 3 physics trigger info
  - Accept rates, filter timing
  - The framework ships a binary block of data out to a concentrator (1 of 50), which combines it and re-presents it
- Framework items
  - Event counters, node state
So others can read without impacting the system
37
Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 37
Monitoring
Framework items use a shared memory scheme:
[Diagram: framework processes 1-3 write into shared memory; a slow retransmitter process serves the rest of the world over TCP/IP (ACE).]
- Rest of the world: requests particular items and an update frequency
- Framework process: saves the name, type, and data of each monitored item
  - Data type is arbitrary; implemented with template classes
- NT native now, soon ACE; try to reuse online software
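A sketch of the template-class idea: framework code declares a monitored variable of arbitrary type, the variable registers its name and a way to read its current value, and a slow retransmitter walks the table on request. In the real scheme the values live in shared memory so other processes can read them without impacting the framework; this in-process version, with hypothetical names, only shows the declaration/registration pattern.

```cpp
// Sketch of template-based monitoring: declare once, update cheaply, let a
// slow repeater copy the values out on request.
#include <functional>
#include <iostream>
#include <map>
#include <string>

std::map<std::string, std::function<std::string()>>& registry() {
    static std::map<std::string, std::function<std::string()>> r;  // stand-in for shared memory
    return r;
}

template <typename T>
class Monitored {
public:
    Monitored(const std::string& name, T initial) : value_(initial) {
        // Register the name and a reader for this item's current value.
        registry()[name] = [this] { return std::to_string(value_); };
    }
    Monitored& operator=(T v) { value_ = v; return *this; }   // cheap update in the framework
private:
    T value_;
};

int main() {
    Monitored<long>   eventsSeen("events_seen", 0);
    Monitored<double> bufferOccupancy("buffer_occupancy", 0.0);
    eventsSeen = 1250;
    bufferOccupancy = 0.37;

    // Slow retransmitter: on request, copy out the named items and forward
    // them to the monitor process (here, just print them).
    for (const auto& [name, read] : registry())
        std::cout << name << " = " << read() << "\n";
}
```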