LAT HSK Data Handling from B33 Cleanroom
ISOC Software Architecture
Import Flow Diagram
Backup Slides
HSK Data Collection
Flight hardware in the cleanroom is polled by the LATTE HSK Server.
Sampled HSK values are stored in a MySQL database.
Every two hours, new data is dumped to binary files on the DMZ drive.
–This shared volume is mounted by LATTE workstations in the cleanroom, and by a small number of designated hosts on the SLAC network.
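For concreteness, the two-hour dump step might look like the Python sketch below. The table and column names, the binary record layout, the connection parameters, and the DMZ mount point are all assumptions; the slides only describe the overall flow (poll into MySQL, then periodically write binary dump files).

    import struct
    import time
    import mysql.connector

    DMZ_DIR = "/nfs/dmz/hsk"   # assumed mount point of the shared DMZ volume

    def dump_new_samples(since):
        """Write HSK samples newer than 'since' (UNIX seconds) to a binary file on the DMZ volume."""
        # placeholder host, credentials, and schema names
        db = mysql.connector.connect(host="latte-db", user="hsk", password="secret",
                                     database="latte_hsk")
        cur = db.cursor()
        cur.execute("SELECT mnemonic_id, sample_time, value FROM hsk_samples "
                    "WHERE sample_time > %s", (since,))
        path = "%s/hsk_%d.dat" % (DMZ_DIR, int(time.time()))
        with open(path, "wb") as out:
            for mnemonic_id, sample_time, value in cur:
                # one fixed-size record per sample: mnemonic id, UNIX time, float value
                out.write(struct.pack("<IId", mnemonic_id, sample_time, float(value)))
        cur.close()
        db.close()
        return path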
HSK Data Export
A cron job on the ‘glast02’ host moves the database-dump files from the DMZ volume to the U12 volume.
–This volume is accessible to (most) GLAST-project hosts and to the LSF batch farm.
The cron job also injects a new instance of the ‘b33-hsk-import’ task into the pipeline.
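A minimal sketch of what that cron job could do, assuming placeholder directory paths and an illustrative injection command; the actual ISOC pipeline client and its arguments are not named in these slides.

    import glob
    import os.path
    import shutil
    import subprocess

    DMZ_DIR = "/nfs/dmz/hsk"   # assumed DMZ mount on glast02
    U12_DIR = "/nfs/u12/hsk"   # assumed U12 mount (GLAST-project hosts, LSF farm)

    def export_and_inject():
        for src in glob.glob(DMZ_DIR + "/hsk_*.dat"):
            dst = os.path.join(U12_DIR, os.path.basename(src))
            shutil.move(src, dst)   # relocate the dump file to the U12 volume
            # hand the file to the processing pipeline as a new 'b33-hsk-import' task
            # instance; 'inject-task' is an illustrative command name only
            subprocess.check_call(["inject-task", "b33-hsk-import", "HSK_FILE=" + dst])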
HSK Data Import
The pipeline schedules execution of the Oracle-import program (written in Python) on the LSF batch farm.
–Processing status can be tracked through the pipeline web pages.
The Python program reads the binary data file, creates change records and statistics records, and posts them to Oracle.
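A simplified sketch of that read/derive/post sequence, assuming the same fixed-size record layout as above and hypothetical Oracle table names (hsk_changes, hsk_stats); only the overall flow follows the slide.

    import struct
    import cx_Oracle

    REC = struct.Struct("<IId")   # assumed record layout: mnemonic id, UNIX time, value

    def import_file(path, dsn):
        changes = []   # (mnemonic_id, time, value) rows where the value changed
        stats = {}     # mnemonic_id -> (min, max, sum, count)
        last = {}
        with open(path, "rb") as f:
            while True:
                buf = f.read(REC.size)
                if len(buf) < REC.size:
                    break
                mid, t, val = REC.unpack(buf)
                if last.get(mid) != val:          # change record: value differs from previous sample
                    changes.append((mid, t, val))
                    last[mid] = val
                lo, hi, total, n = stats.get(mid, (val, val, 0.0, 0))
                stats[mid] = (min(lo, val), max(hi, val), total + val, n + 1)
        conn = cx_Oracle.connect(dsn)
        cur = conn.cursor()
        cur.executemany("INSERT INTO hsk_changes (mnemonic_id, sample_time, value) "
                        "VALUES (:1, :2, :3)", changes)
        cur.executemany("INSERT INTO hsk_stats (mnemonic_id, min_val, max_val, mean_val) "
                        "VALUES (:1, :2, :3, :4)",
                        [(m, lo, hi, total / n) for m, (lo, hi, total, n) in stats.items()])
        conn.commit()
        conn.close()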
Web Data Access
The test-data processing pipeline populates the ‘ELogReport’ database table with information about each test run.
–This information can be queried to provide in-test time spans of interest for querying trending data.
–It can also be used to determine the serial numbers of hardware under test (with some limitations).
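As an illustration, a trending query could be bounded by the ELogReport entry for a run; the column names used here (run_id, start_time, stop_time, serial_no) are assumptions, since the slides do not show the table schema.

    import cx_Oracle

    def run_time_span(run_id, dsn):
        """Return (start, stop, serial number) for a test run, for bounding trending queries."""
        conn = cx_Oracle.connect(dsn)
        cur = conn.cursor()
        cur.execute("SELECT start_time, stop_time, serial_no FROM ELogReport "
                    "WHERE run_id = :1", (run_id,))
        row = cur.fetchone()
        conn.close()
        return row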