STO Control Interfaces ● Gondola command and control interface ● Instrument and data acquisition interface ● Limitations of both and how they affect observing ● Schedule and Responsibilities

Commands & Control Overview
● Four computers:
– Command & Control Computer (CCC): "direct link" to ground; commands high-level gondola operation.
– Actuators Control Computer (MAX3): responsible for acquiring housekeeping information and for pointing control.
– Science Control Computer (SCC), or Data Acquisition Computer (DAC): controls the FFT spectrometer, data flow, storage, and processing.
– Instrument Control Computer (ICC): controls instrument hardware operations.
● STO relies entirely on CSBF's Support Instrument Package (SIP) for communications with the ground.
● Fort Sumner configuration:
– Uses the Mini SIP exclusively.
– LOS UHF communications.
– Simulates TDRSS & IRIDIUM channels.
● Antarctica configuration:
– LOS UHF for the first ~24 hours and at turnaround (~15 days later).
– Main channel: TDRSS, 6 kbit/s downlink, commanding every ~15 minutes.
– Secondary channel: IRIDIUM, 255 bytes per 15 minutes.

CDR 10/09/05 Command & Control Computer
● Runs two processes in parallel, communicating via a pipe.
● Autonomous Control Executive (ACE):
– Written in C.
– Schedules balloon operations: telescope deployment, sets track state, sets telescope focus, filter settings, ...
– Carries out science mission objectives: autonomous scheduling of the observing sequence.
– Handles DAC and detector operations.
– Main interface between the ground operator and the payload subsystems.
● Instrument Control Interface (ICI):
– Written in C++.
– A simple continuous-loop process.
– Interfaces the ACE process with the payload subsystems.
– Handles all communications:
● Receives ground commands from the SIP.
● Collects & transmits housekeeping data to the SIP.
● Handles I/O with all instrument controllers.

ACE
● Uses the POSIX pthreads library to generate multiple threads sharing the same resources.
● Uses SWIG (Simplified Wrapper & Interface Generator) to create a module for the scripting language Python.
– SWIG is a compiler that takes ANSI C declarations and turns them into a file containing the C code for binding C functions, variables, and constants to a scripting language (here, Python).
– The generated C code is compiled into a module that can be imported from Python.

ACE (2)
● From the Python shell the operator can access all the routines written for the ACE. Examples:
– devUnstow(): unstow the telescope
– devStow(): stow the telescope
– devTrackOn(): turn on tracking
– devTrackOff(): turn off tracking
– centerCalib(): start the Sun center-seeking procedure
– gotoHelio(lat, long): point the telescope at a heliographic location
– camRec(#frames): save #frames to disk and log the recording
– ...
● A set of the most commonly used commands can be transmitted as discrete commands via a pull-down menu (or push-button) from the GSE console, giving more reliable transmission of commands.
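A sketch of how such discrete uplink commands could map onto the same Python routines exposed in the ACE shell. The command codes, function names, and reply strings are illustrative, not the flight values.

```python
# Hypothetical discrete-command table: a small numeric code is cheaper
# and more robust to uplink than an arbitrary Python string.

def dev_unstow():
    return "unstow commanded"

def dev_track_on():
    return "tracking on"

DISCRETE_COMMANDS = {
    0x01: dev_unstow,
    0x02: dev_track_on,
}

def handle_discrete(code):
    """Look up and execute a discrete command; reject unknown codes."""
    handler = DISCRETE_COMMANDS.get(code)
    if handler is None:
        return "NAK: unknown command 0x%02x" % code
    return handler()
```

The pull-down menu on the GSE console would then only ever transmit codes that exist in the table, which is what makes the discrete path more reliable than free-form command strings.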

Communications: Fort Sumner Configuration

Communications: Antarctica Configuration

Ground Support
● Uses GSEOS software:
– Developed for APL spacecraft missions: MESSENGER, New Horizons.
– Used previously with FGE and SBI.
– STO protocol decoding done in Python.
– Easily extensible graphical display interface.
– Uplink command handler in a C++ module.
● Currently a basic interface with separate uplink and downlink windows. Can be extended to include history, editing, etc., if desired.
● Provides "direct" access to the SBI Python interface.
● Interface with CSBF Command & Control for Fort Sumner:
– RS232 connection with the Fort Sumner CSBF GSE.
– LOS UHF communications with the ballooncraft.
– Internet connection to other STO GSEs in Arizona, Maryland, ...
● Interface with CSBF Command & Control for Antarctica:
– RS232 connection with the Williams Field CSBF GSE for LOS UHF communication with the ballooncraft.
– Internet connection to the STO GSE computer at the OCC in Palestine (TX) for TDRSS & IRIDIUM communication with the ballooncraft.
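To illustrate the kind of Python-side protocol decoding mentioned above, here is a toy telemetry packet parser. The packet layout (sync word, ID byte, length byte, payload) is entirely hypothetical; the real STO/GSEOS frame formats are not reproduced here.

```python
import struct

# Illustrative only: a made-up housekeeping packet layout.
SYNC = 0xEB90  # hypothetical sync word

def parse_packet(buf):
    """Decode one toy packet: big-endian sync word, ID, length, payload."""
    sync, pkt_id, length = struct.unpack_from(">HBB", buf, 0)
    if sync != SYNC:
        raise ValueError("bad sync word")
    payload = buf[4:4 + length]
    return pkt_id, payload

# Build and decode a sample packet.
pkt = struct.pack(">HBB", SYNC, 0x21, 3) + b"\x01\x02\x03"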

GSE Station (screenshots): LOS video, GSEOS main telemetry console, GSEOS command window

GSEOS main telemetry console

Let's expand this block

Test Flight Block and Cabling Diagram

Control Computers
● Embedded systems are robust and simple solutions.
– 1-2 watt, 200 MHz ARM board: CF & USB storage, ethernet, serial, ADC, digital I/O, PC/104 bus (Linux and BSD).
– 3-4 watt, 500 MHz ARM board: SD & USB storage, programmable FPGA, ethernet, serial, digital I/O, PC/104 bus (Linux and BSD).
– CF storage is available in industrial grade up to 32 GB, SDHC up to 16 GB.

● Two TS-ARM boards flew on the STO test flight without a hitch.
● Cold testing in the (UNSW) lab.
● A TS-ARM board controls the Pre-HEAT 20 cm tipper telescope and FFT spectrometer at Dome A, Antarctica.

STO System Software
● NetBSD: Open Source BSD UNIX.
– Small and simple, with an emphasis on portability, correctness, and clean design.
– The port to the TS-7200 took a weekend.
– Cleanly abstracted hardware interfaces; kernel debugging.
– Almost magical cross-building toolkit.
– Recognition that developer time is not infinite.
– The best system software is the software you didn't know you were using.

Potential dual-string control system
● One computer manages the data system; the other acts as the instrument controller.
● Careful design of digital bus generation on the digital I/O lines and ADCs allows the two ARM computers to swap roles on the fly if needed.
● Only serial subsystems are hard-wired to a specific computer, and none of them is particularly critical.
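One common way to realize this kind of on-the-fly role swap is a heartbeat scheme: each computer takes over its peer's duties if the peer goes silent. The slide does not specify the mechanism, so the class names and timeout below are purely illustrative.

```python
import time

# Illustrative heartbeat-based failover, assuming a 10 s silence threshold.
HEARTBEAT_TIMEOUT = 10.0  # seconds without a peer heartbeat before takeover

class ControlComputer:
    def __init__(self, role):
        self.role = role                      # "data" or "instrument"
        self.last_peer_heartbeat = time.monotonic()

    def peer_heartbeat(self):
        """Called whenever a heartbeat message arrives from the peer."""
        self.last_peer_heartbeat = time.monotonic()

    def check_peer(self, now=None):
        """If the peer has gone silent, absorb its role as well."""
        now = time.monotonic() if now is None else now
        if now - self.last_peer_heartbeat > HEARTBEAT_TIMEOUT:
            self.role = "data+instrument"
        return self.role
```

Since only the serial lines are hard-wired, a takeover in this scheme would cost access to the failed computer's serial devices but preserve everything routed over the shared digital buses.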

Instrument & Data Mgmt. System
● Items to control:
– Receiver subsystems (mixer bias, LO, synthesizer, etc.) (RS232, DIO, I2C, and SPI interfaces).
– Flip mirror (direct DIO through an SSR).
– Spectrometer and data system (ethernet).
● Interplay of these items with "gondola systems":
– The data system needs a TCS header (timestamp, RA/Dec, etc.).
– Needs synchronization with the telescope for OTF mapping.
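As a concrete (but hypothetical) example of commanding one of these items, a bias-setting routine might compose an ASCII command string for the serial or socket interface. The "SET <subsystem> <channel> <value>" format and the safety limit below are invented for illustration; the real protocol is not specified in these slides.

```python
# Hypothetical ASCII command format for receiver subsystem control.
BIAS_LIMIT_MV = 20.0  # illustrative safety limit, not a real hardware value

def bias_command(subsystem, channel, millivolts):
    """Build one ASCII command line, refusing out-of-range bias requests."""
    if not (0.0 <= millivolts <= BIAS_LIMIT_MV):
        raise ValueError("bias out of range")
    return "SET %s %d %.3f\n" % (subsystem.upper(), channel, millivolts)

cmd = bias_command("mixer", 1, 2.5)
```

Range checks of this kind belong in the low-level interface precisely because, in flight, commands may arrive from autonomous scripts as well as from the ground operator.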

Control Software
● Each hardware component has a separate TCP/IP socket server associated with it. The server listens on that socket's port for ASCII text commands to perform.
● Watchdog timers allow software and hardware to be reset automatically should they become unresponsive.
● Low-level server code is written in C; client code for observing programs is written in object-based, higher-level languages, e.g. Perl, Ruby, or Python.
● Instrument control interfaces can be driven through a standalone GUI or a web browser on the ground, and through GSEOS in flight (via Python scripts).
● We (C. Kulesa and C. Martin) have downloaded the SBI control code/simulator from Pietro and Harry, and have dovetailed the instrument control software into the APL system.
● The OTF mapping scheme, modifications to the SBI scheduler, and integration of the STO data server are the principal changes.
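A minimal sketch of the per-component socket-server pattern described above, in Python rather than the C of the flight servers. The command names and the placeholder temperature reading are illustrative.

```python
import socketserver

# One command table per hardware component; replies are ASCII strings.
COMMANDS = {
    "PING": lambda: "PONG",
    "GET_TEMP": lambda: "77.3",  # placeholder housekeeping reading
}

def dispatch(line):
    """Parse one ASCII command line and produce a reply string."""
    stripped = line.strip()
    name = stripped.split()[0].upper() if stripped else ""
    handler = COMMANDS.get(name)
    return handler() if handler else "ERR unknown command"

class DeviceHandler(socketserver.StreamRequestHandler):
    def handle(self):
        # One connection can issue many newline-terminated commands.
        for raw in self.rfile:
            reply = dispatch(raw.decode("ascii"))
            self.wfile.write((reply + "\n").encode("ascii"))

# One such server would run per hardware component, e.g.:
# socketserver.TCPServer(("", 5001), DeviceHandler).serve_forever()
```

Keeping each device behind its own port is what lets a watchdog restart one unresponsive server without disturbing the others.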

Simplified view of the socket servers

Example of control flow implemented on the STO test flight
● The observer issues a GSEOS position-switch script.
● The telescope is moved into position for the on-source integration (ACE + MAX3).
● Once on-source, the DAC is commanded to begin a timed integration. The DAC acknowledges the command request.
● The DAC checks with the ICC and spectrometer to see if everything is ready; on OK, it commands the FFT spectrometer to integrate.
● After the integration, the FFT spectrometer uses an anonymous-FTP service to move the level 0 data to a ramdisk on the DAC.
● The DAC scoops up the level 0 data and splats a FITS header on it based on pointing data from MAX3 and instrument housekeeping from the ICC.
● Once integration is complete, the DAC spools the data to the CCC for downlink, if requested, and tells ACE that it is done. An error code is reported otherwise.
● The same process is repeated for off-source integrations until the position-switched sequence is complete.
● Tony Stark accessed the downlinked data via a Samba share and wrote an interface to the Bell Labs COM software to examine the data.
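The sequence above can be condensed into a short pseudocode-style loop. The `slew` and `integrate` callables below are toy stand-ins for the real ACE/MAX3 and DAC/spectrometer interfaces, which are not reproduced here.

```python
def position_switch(pointings, integrate, slew):
    """Alternate ON/OFF integrations; return the collected level 0 scans."""
    scans = []
    for target, kind in pointings:      # kind is "ON" or "OFF"
        slew(target)                    # ACE + MAX3 move the telescope
        data = integrate()              # DAC commands the FFT spectrometer
        scans.append((kind, data))
    return scans

# Toy stand-ins for the real subsystems:
slew_log = []
scans = position_switch(
    [("source", "ON"), ("reference", "OFF")],
    integrate=lambda: "spectrum",
    slew=slew_log.append,
)
```

In the real system each step also carries the acknowledgement and error-code handshakes described above; the sketch keeps only the control skeleton.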

On-board data processing
● We are generalizing FCRAO's OTFtool and OTFmap software for use with STO, for both online and offline reductions.
● This takes the level 0/1 data to level 2 (i.e., to baselined, regridded OTF spectral cubes).
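The "baselined" half of that level-2 step is conceptually simple: fit a low-order polynomial to the line-free channels of each spectrum and subtract it. This is a generic sketch of that operation (not the OTFtool/OTFmap implementation), using synthetic data.

```python
import numpy as np

def remove_baseline(spectrum, line_mask, order=1):
    """Fit a polynomial to line-free channels and subtract it everywhere."""
    chans = np.arange(spectrum.size)
    fit_to = ~line_mask                       # fit only line-free channels
    coeffs = np.polyfit(chans[fit_to], spectrum[fit_to], order)
    return spectrum - np.polyval(coeffs, chans)

# Synthetic spectrum: sloped baseline plus a flat-topped "line".
spec = 0.01 * np.arange(100) + 2.0
spec[40:60] += 5.0
mask = np.zeros(100, dtype=bool)
mask[40:60] = True                            # channels to exclude from the fit
clean = remove_baseline(spec, mask)
```

Regridding is the harder half, since the OTF samples fall on scan tracks rather than a regular grid; that is where the FCRAO code carries most of its value.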

Data Selection

Data Assessment

Regridding

Output Modules
● FITS-compliant data cubes are the default.
– Only interfaces for HIPE and Miriad are needed.
– Need to ensure that all header tags and keys are supplied.
● GILDAS files (gdf and class):
– The 2001 interface is already supported; it needs updating for GILDAS 2009.
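"Ensuring all header tags and keys are supplied" comes down to writing well-formed FITS header cards. FITS headers are fixed 80-character records (8-character keyword, "= ", value, optional comment); the simplified formatter below illustrates the idea, though the real standard also fixes value-column conventions that this sketch glosses over.

```python
def fits_card(keyword, value, comment=""):
    """Format one simplified 80-character FITS header card."""
    if isinstance(value, str):
        val = "'%-8s'" % value          # FITS strings are quoted, >= 8 chars
    elif isinstance(value, bool):
        val = "T" if value else "F"     # FITS logical values
    else:
        val = repr(value)
    card = "%-8s= %20s / %s" % (keyword.upper()[:8], val, comment)
    return card[:80].ljust(80)          # cards are exactly 80 characters

card = fits_card("CTYPE3", "VELO-LSR", "spectral axis")
```

In practice a library (e.g. astropy or cfitsio) would generate these, but a checklist of required keywords (CTYPEn, CRVALn, CDELTn, ...) is exactly what the HIPE and Miriad interfaces depend on.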

Limitations
● Pointing and scheduling.
● Data-taking cadence (< 5 Hz).
● On-board data storage.
● HEB mixer, IF, and FFTS stability:
– Can we get away with the combination of load chopping and position-switched OTF?
– Frequency switching?

Pointing Limitations: the sky viewed from 80° S

Figure labels: Galactic Center; STO survey region; upper elevation limit (balloon avoidance zone)

Survey Feasibility / Optimization
● Not all of the STO survey area is visible at all times of the mission, due to pointing constraints:
– l = -20° is visible 90% of the time.
– l = -40° is visible 45% of the time.
– l = -60° is visible 20% of the time.
● This restricts the amount of time we will be able to observe at high Galactic longitude.
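Those visibility fractions translate directly into available observing time. The arithmetic below assumes, purely for illustration, a 14-day flight; the actual mission duration is not stated on this slide.

```python
# Back-of-the-envelope observing time per longitude, for an assumed
# (hypothetical) 14-day mission.
MISSION_DAYS = 14.0

visibility = {-20: 0.90, -40: 0.45, -60: 0.20}  # fractions from the slide

days_available = {l: frac * MISSION_DAYS for l, frac in visibility.items()}
# At l = -60 deg only ~1/5 of the flight is usable, so roughly 2.8 days
# of a 14-day mission, before any weather or scheduling losses.
```

This is why the high-longitude end of the survey dominates the scheduling problem: the scheduler must seize those windows whenever they occur.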

On-The-Fly Mapping Strategy
● ~30-second OTF scans, as per the Allan variance measurements.
● Normally, we would position the secondary or the telescope off-source for a reference scan. For STO, this may not be practical. Instead, we will use internal cold load(s) for an immediate reference.
● Off-source reference scans and calibrations will be performed at regular intervals.
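Using the internal load as the immediate reference amounts to the standard (ON − REF)/REF × Tsys estimate with the load spectrum in place of a sky reference. The sketch below uses synthetic numbers; the real calibration chain (and the Tsys value) is not specified in these slides.

```python
def calibrate(on, ref, tsys):
    """Estimate antenna temperature per channel: Tsys * (ON - REF) / REF."""
    return [tsys * (s - r) / r for s, r in zip(on, ref)]

on  = [1.10, 1.00, 1.05]                 # raw ON-source counts (synthetic)
ref = [1.00, 1.00, 1.00]                 # internal load counts (synthetic)
ta  = calibrate(on, ref, tsys=2000.0)    # assumed Tsys, in K
```

The trade-off the slide is weighing: the load reference is always available mid-scan, but unlike an off-source sky reference it does not cancel atmospheric or optics emission, hence the periodic true off-source scans.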

Optimization spreadsheet: two examples
● 1-second spectrometer dumps, scanning at 15”/s.
● 2-second spectrometer dumps, scanning at 10”/s.
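The quantity these two spreadsheet cases trade off is the angular distance covered per spectrometer dump, which sets the spatial sampling of the OTF map:

```python
def sample_spacing(dump_s, scan_rate_arcsec_s):
    """Angular distance (arcsec) covered during one spectrometer dump."""
    return dump_s * scan_rate_arcsec_s

case1 = sample_spacing(1.0, 15.0)   # 15 arcsec per dump
case2 = sample_spacing(2.0, 10.0)   # 20 arcsec per dump
```

Whether either spacing adequately samples the beam depends on the beam size at the observing frequency, which this slide does not state; the spreadsheet presumably folds that in along with the < 5 Hz data-taking cadence limit noted earlier.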