1 SDO Ground System Mission PDR
Raymond J. Pages, Ground System Manager William J. Potter, Deputy Ground System Manager

2 Agenda
Ground System Overview: Raymond J. Pages
Operations Concepts: William J. Potter
Development Status: William J. Potter
Ground System Element Designs: William J. Potter
Science Operations Centers: Instrument Teams
Verification Approach: Raymond J. Pages
Schedule: Raymond J. Pages
Ground System Risks: Raymond J. Pages
Summary: Raymond J. Pages

3 Major Mission Drivers and Impacts
Three major drivers shaped the formulation and implementation of the SDO mission concept and ground system:
High data rate, driven by increased instrument observation resolution and cadence over previous similar solar science missions (SOHO, TRACE, etc.)
- 130 Mbps continuous science data downlink (1.4 Terabytes/day)
Instrument science loss requirements, driven by science data completeness needs
- HMI science data completeness has the most stringent requirement: 95% of all possible 10-minute observation periods over the mission life must be captured and delivered to the SOC
- For a 10-minute observation to be valid, 99.99% of its data must be error-free and delivered to the SOC
Instrument science data processing, analysis, and user community access, driven by the scale and volume of continuous science data
- Requires significant data organization, processing, and storage capability
- 1.4 Terabytes/day of raw science data expand into data products that are used for scientific analysis
- Results in a significant data storage, organization, and analysis implementation challenge
- Science data management leveraged and expanded from existing approaches proven on previous solar science predecessor missions: reuse of algorithms, user interfaces, and data processing approach
- Higher data volume addressed by planning for conservative technology improvements to lower cost, but can be implemented with current technology
Maintaining health and safety of SDO in geosynchronous orbit, in continuous contact with dedicated ground antennas, is easier than for most GSFC missions
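The headline volume and completeness numbers above can be sanity-checked with a few lines of arithmetic. A minimal illustrative sketch in C (not project software; it assumes decimal terabytes and a perfectly continuous downlink):

```c
/* Sanity check of the slide's figures. Assumptions: 130 Mbps continuous
 * Ka-band science rate, 86400 s/day, decimal (SI) terabytes. */
#include <stdio.h>

int main(void) {
    const double rate_bps  = 130e6;
    const double bytes_day = rate_bps * 86400.0 / 8.0;
    printf("Daily science volume: %.2f TB\n", bytes_day / 1e12); /* ~1.40 */

    /* HMI completeness: 95% of 10-minute windows must be valid, and a
     * window is valid only if 99.99% of its data is error-free. */
    const double windows_day = 86400.0 / 600.0;                  /* 144 */
    printf("Valid 10-min windows needed per day: >= %.1f of %.0f\n",
           0.95 * windows_day, windows_day);
    return 0;
}
```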

4 SDO Ground System Architecture
[Architecture diagram, dated 10/21/03. Two dedicated SDO ground sites at White Sands each receive S-band tracking, command, and housekeeping telemetry (72-hour storage) and 150 Mbps Ka-band science data (48-hour storage); an external tracking station provides backup S-band with the same interfaces. Science data flows through the DDS (including 30-day science data storage) to the SOCs: HMI and AIA data to the JSOC (Stanford / LMSAL, Palo Alto, CA) and EVE data to the EVE SOC (LASP, Boulder, CO), at roughly 55, 67, and 7 Mbps respectively. The SDO Mission Operations Center hosts the Telemetry & Command (T&C) system (ASIST/FEDS), the Flight Dynamics System (orbit determination, maneuver planning, product generation, R/T attitude determination, sensor/actuator calibration), mission planning & scheduling (plan daily/periodic events, create engineering plan, generate daily loads), HK Level-0 processing and data archival (DHDS), trending, automated operations and anomaly detection, DDS and ground station control, and the Alert Notification System (ANS). The Flight Software Maintenance Lab (FLATSAT) exchanges memory dumps, flight software loads, simulated commands, and simulated housekeeping telemetry with the MOC; the SOCs exchange science planning, FDS products, R/T housekeeping telemetry, and instrument commands/loads.]

5 SDO Science Data Path Implementation & Flow
Observatory science data collection:
- No data recorder, dramatically easing overall system complexity and cost
- Data loss allocations characterized in the data capture budget meet instrument science requirements
- BER allocation for radiation SEE data loss
Dual antenna sites (SDO dedicated):
- Both antennas actively receiving and transferring data to the DDS
- Geographically diverse (3 miles apart) to address localized rain attenuation
- Each antenna has a 48-hour data buffer to avoid data loss in transferring data to the ground station DDS
- Dual antennas dramatically increase system reliability, removing the need for a more costly, higher-reliability single antenna and for quick-turnaround maintenance to avoid prohibitive data loss in the event of a single-antenna anomaly
RF link: BER allocation in the data capture budget
Ground station:
- The Data Distribution System (DDS) provides 30-day temporary data storage; it replaces an Observatory recorder and eliminates complex recorder management on the Observatory
- Dual 30-day recorders allow for system anomalies and maintenance, as well as data storage for retransmission in the event of SOC data loss or line outages
- NO PROCESSING OF SCIENCE DATA AT GROUND STATION: dramatically eases complexity and cost of the ground system, and allows "bent pipe" transfer of instrument science data (HMI/AIA, EVE) to the SOCs
SOC data collection & analysis:
- SOCs can request retransmission of science data from the 30-day DDS store over existing high-speed data lines
- Removes quick-turnaround data evaluation, reducing SOC complexity and cost
Data transmission to SOCs:
- High-speed data lines with 1.5x capacity allow for retransmission of data to the SOCs from the 30-day DDS storage
- Result is essentially "error free" data transmission from the DDS to the SOCs and the science community
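As a rough cross-check of the storage and line sizing above, a short sketch (illustrative only; the per-SOC rates are the ones shown on the architecture slide):

```c
/* Sizing sketch: 30-day DDS store at the raw science volume, and SOC line
 * rates at the 1.5x retransmission margin quoted above. Illustrative. */
#include <stdio.h>

int main(void) {
    const double sci_tb_day = 1.4;                       /* raw science */
    printf("30-day DDS store: %.0f TB\n", 30.0 * sci_tb_day); /* ~42 TB */

    const char  *soc[]  = {"HMI", "AIA", "EVE"};
    const double mbps[] = {55.0, 67.0, 7.0};             /* per-SOC rates */
    for (int i = 0; i < 3; i++)
        printf("%s line at 1.5x: %5.1f Mbps\n", soc[i], 1.5 * mbps[i]);
    return 0;
}
```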

6 SDO Mission Overview - Ground Segment
Mission Operations Center at GSFC
- Typical observatory control center based on the MAP, EO-1, and IMAGE architecture
"Ground Station" designed to collect/distribute science data to instrument analysis sites
- Redundant antenna sites with 48-hour "line outage" function; new build of dedicated facility at White Sands
- Fault-tolerant 30-day short-term data archive (DDS): 42 TB total
- No data processing; primarily a "bent pipe" function with data accounting
- New build of dedicated ground station at White Sands
- Unprocessed science data distribution to SOCs using commercial Ethernet networks
Three Science Operations Centers (SOCs)
- Monitor instrument performance, health and safety (typical of most instruments)
- Plan science operations, configure and control instruments (typical of most instruments)
- Science data processing, analysis, and distribution to the user community: receive and store raw science data; decompress raw data to Level 0; process science data into "products"; catalog and index data; archive all data for backup; analyze science products and address Level 1 science questions; deliver science products; provide access for the user community

7 Operations Concept - Mission Phases
Launch and Acquisition
Orbit Circularization & In-Orbit Checkout
Instrument Commissioning
Science Mission Phase
- Nominal Mode
- Periodic Calibrations/Housekeeping
- Eclipse
- Stationkeeping & Momentum Management
Safehold & Emergency Modes
Disposal

8 Launch and Acquisition – Ground Ops
Phase covering pre-launch configuration until the Observatory is power-positive and pointing at the sun
Launch until separation is approximately 45 minutes, to a separation altitude of 300 km
Transmitter powered on minutes before separation to allow for telemetry at separation
At separation, telemetry and command are available through an external network ground station: Overberg, South Africa, or Perth or Dongara, Australia
- TDRSS available as contingency/backup
Attitude Control System acquires the sun in 45 minutes
Once the Observatory is power-positive, deploy the HGAs by ground command
- Nominally, deployment is scheduled early, before the S/C (including the HGA hinges) begins to cool, even though the HGAs are not needed until much later
Operations are directed from GSFC with a minimal power-up team at KSC

9 Orbit Circularization & In-Orbit Checkout
Phase used during the first weeks to circularize the orbit from GTO, place SDO in its geosynchronous slot at 102ºW, and check out and calibrate the Observatory
Spacecraft components brought on-line, and capabilities/modes checked:
- ACS/Propulsion checkout and calibration
- Safehold checkout (both ACEs)
- Sensor checkout/calibration: IRUs, star trackers
- Inertial hold / slew capability checkout prior to first planned maneuver
- Thruster checkout prior to first planned maneuver (phasing for 5 lb thrusters)
Instruments not powered on until all large apogee maneuvers are complete (after the 3rd maneuver) or later
- High-rate science data system (Ka-Comm, Ka-XMTR, HGAs) off
Four (4) large Apogee Motor Firing (AMF) and three (3) small Trim Motor Firing (TMF) maneuvers are planned to place SDO in its final geosynchronous slot over a two-week period
- AMF maneuvers use the 100 lb thruster; TMF maneuvers use the 5 lb thrusters
- Total maximum duration for any maneuver activity will be less than ~90 minutes: maximum slew time of 20 minutes before/after, plus settling, with a 50-minute maximum delta-V
- Observatory may be pointed to any orientation during a maneuver, so power, thermal, and other designs must take 90 minutes of off-pointing as a requirement
- Maneuvers are not time critical; if a maneuver is aborted or missed, it can be made up later with no penalty
Observatory communications via external ground networks and the SDO ground station
- SDO dedicated ground station not available for continuous coverage until the 3rd apogee maneuver
- Thruster burns must be started and completed within view of one (or more) ground stations
- Consideration given to slight delay of apogee burns until the Observatory is in sight of a station
Commands for maneuvers uploaded to the Absolute Time Sequence buffer, rather than singularly commanded from the ground

10 Instrument Commissioning
Instrument calibration and commissioning begins once on-station and lasts 30 to 60 days
Instruments are powered on (if not already) and optics doors are opened
Observatory communications through the SDO ground station for S- and Ka-band
High-rate science system brought on line (Ka-Comm, Ka-XMTR, HGAs)
HGA calibration is performed to remove static misalignments
- SDO Ground Station tracks RF power while the HGA performs raster slews
Instruments begin producing science data
- Science data distributed directly from the SDO ground station/DDS to the SOCs
Instrument operations support from both MOC and SOCs
Spacecraft supports instrument calibration roll maneuvers and off-point maneuvers
- Maneuvers similar to the periodic instrument calibration maneuvers described later

11 Nominal Mode (Science Phase)
Expected to be the phase the mission stays in 99% of the time once at GEO, with few operational activities/interruptions normally planned
Ka-band science data is downlinked through the SDO ground station and distributed to the SOCs on a continuous basis
S-band housekeeping data is collected by the ground site and distributed to the MOC, which further distributes data to the SOCs
- Nominal downlink rate is 64 kbps to the SDO ground station (data on RF carrier)
- Twice-daily periods of S-band omni antenna RF interference may degrade H/K data; the S-band data rate may be reduced to improve link margin during interference times
Orbit tracking operations performed two consecutive days a week
- 6 passes (30 minutes each) per day from the SDO ground station and 1 pass (30 minutes) per day from an external ground station (Hawaii); potential to reduce tracking to bi-weekly
- RF reconfiguration required for tracking: data placed on subcarrier and data rate lowered
Instrument SOCs will have a normal window each weekday to command instruments and uplink loads, with all commands passing through the MOC to the ground site
- Anticipate weekly loads, since instruments are full-sun viewing with routine operations
- Contingency command periods if on-duty FOT member(s) are contacted and bring up the command link
Spacecraft data recorder maintains a circular buffer with 24 hours of housekeeping data in order to capture anomalies in case of data loss (see the sketch after this slide)
Attitude control system autonomously points the reference boresight to the sun and maintains proper rotation about the sunline
- Proper orientation is achieved by an inertial slew to the sunline using star tracker attitude, then switching to the guide telescope for science pointing
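The 24-hour on-board recorder described above is, in effect, a circular buffer: once a day's worth of housekeeping frames is held, the newest frame overwrites the oldest. A minimal sketch of the idea (frame size and count are illustrative, not flight values):

```c
/* Circular-buffer sketch of the 24-hour housekeeping recorder concept. */
#include <stdint.h>
#include <string.h>

#define FRAME_BYTES 256        /* illustrative frame size */
#define FRAMES_HELD 1024       /* stand-in for ~24 hours of frames */

static uint8_t  ring[FRAMES_HELD][FRAME_BYTES];
static uint32_t head;          /* next slot to overwrite */

void record_frame(const uint8_t *frame) {
    memcpy(ring[head], frame, FRAME_BYTES);
    head = (head + 1) % FRAMES_HELD;   /* wrap: oldest frame is dropped */
}
```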

12 A Week in the Life of the SDO

13 Periodic Interruptions to Science Mode
There are several periodic interruptions to the nominal science mission mode (tallied in the sketch below):
- Stationkeeping maneuvers: twice a year
- Momentum unload maneuvers: every 4 weeks
- Instrument calibration (roll and off-point) maneuvers:
  - Roll maneuvers to observe solar shape: twice a year
  - Off-point maneuvers for flat fielding and optical distortion calibration: twice a year
  - Alignment adjustments, to align instrument to reference: every 2 weeks
  - Cruciform off-point and FOV maps: quarterly
- Eclipses (Earth and Moon): two eclipse seasons a year, centered around the spring and fall equinoxes
- HGA handovers: twice a year
Periodic HGA calibration may be required to maintain the HGA pointing requirement
- Thermal effects (on the HGA boom) may degrade HGA pointing
- RF signal strength degrades rapidly as HGA boresight pointing drifts outside the nominal antenna beamwidth
All scheduled interruptions which cause science data loss are included in the Data Capture Budget
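Summed over a year, the cadences above imply on the order of fifty scheduled interruptions feeding the Data Capture Budget. A small tally sketch (counts follow the stated cadences; eclipses are seasonal and handled separately):

```c
/* Yearly tally of the scheduled science-mode interruptions listed above. */
#include <stdio.h>

int main(void) {
    int per_year = 2      /* stationkeeping maneuvers */
                 + 13     /* momentum unloads, ~every 4 weeks */
                 + 2 + 2  /* roll and off-point calibrations */
                 + 26     /* alignment adjustments, every 2 weeks */
                 + 4      /* cruciform off-points / FOV maps, quarterly */
                 + 2;     /* HGA handovers */
    printf("Scheduled interruptions per year: %d (plus eclipse seasons)\n",
           per_year);
    return 0;
}
```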

14 East-West Stationkeeping
Stationkeeping is shown here in December and June, but the current plan is for September and March (near the equinoxes) to improve thruster alignment with the anti-velocity vector.

15 Safehold/Emergency Modes
Several capabilities will exist on the Observatory for "safing" in the event of an anomaly, as described earlier in this PDR
In addition to on-board detection and notification, ground system elements will:
- perform real-time monitoring of all flight subsystems and instruments 24x7
- capture all health and safety data for the life of the mission
- perform long-term trend analysis of key subsystem components
- utilize MOC and SOC paging systems to alert staff of detected anomalies or limit violations during unstaffed hours

16 Disposal
At end of mission, NASA policy requires disposal of SDO into an orbit that won't interfere with other spacecraft
- Increase altitude to >300 km above the GEO orbit; the actual disposal altitude is GEO + X km, where X is a function of the spacecraft mass and cross-sectional area
- Operations similar to orbit circularization at beginning of life
- To ensure enough power for operations, instruments and science-oriented spacecraft components will be powered off

17 Development Status – Highlights since SCR
Completed mission operations concepts
Completed Level-3 requirements for ground system elements
Conducted Ground System Requirements Review in September 2003
Delivered ASIST workstations to flight software and attitude control subsystem for I&T support
Completed MOC trade studies to select the subsystems comprising the MOC
Selected two vendors for high-rate Front-End Processor evaluation by the DDS
Conducted preliminary design peer reviews for all ground system elements with the exception of the AIA SOC
Commenced reconfiguration of the future SDO MOC location in Bldg 14, second floor
Developed consolidated HMI and AIA science operations approach (JSOC)

18 Ground System Reviews

GS SRR, 30 Sep 03
- Review team: R. Menrad (Chair), S. Scott, R. Mahmot, M. Foote, R. Schweiss, J. Donohue, G. Iona, J. Wolfgang, J. Bristow, S. Bundick, T. Cygnarowicz
- RFAs: 36, all submitted for closure
- Most significant RFA: Add SOC requirements to Ground System Level-3 document (DMR)
HMI Instrument PDR, Nov 03
- Review team: T. Cygnarowicz (Chair), R. Chalmers, S. Graham, V. Hall, F. Herrero, C. Rice, M. Roberto, J. Schepis, S. Scott, J. Srour, E. Waluschka
- No SOC/Ops-related RFAs; no impacts to SOC design
EVE Instrument PDR, Dec 03
Comm Network Peer Review, 28 Jan 04
- Review team: R. Kemp (Chair), M. Foote, S. Douglas, J. Cameron, A. Operchuk
- RFAs: 7, all open
- Most significant RFA: Separate high-rate science data lines from backup housekeeping data lines
Operations Peer Review, 5 Feb 04
- Review team: P. Crouse (Chair), R. Burley, C. Bengston, R. Lonigro, J. Pepoy, A. Dress, H. Benefield
- Most significant RFA: Characterize omni antenna nulls and impact on forward/return links
MOC Peer Review, 11 Feb 04
- Review team: M. Rackley (Chair), S. Coyle, H. Dew, D. Mandl, R. Lonigro, M. Woodard
- RFAs: 13
- Most significant RFA: Reassess decision to not have complete instrument databases in the MOC
DDS/Ground Station Peer Review, 17 Feb 04
- Review team: J. Martin (Chair), F. Stocklin, R. Conrad, B. Gioannini, R. Burley, H. Dew, J. Gurman, D. Israel, P. Militch, M. Rackley, T. Walsh
- RFAs: 12
- Most significant RFA: Increase raw Ka-band storage from 48 hours to 120 hours

19 Ground System-Level Documents
DDS to SOC ICD (CDR; Bialas)
MOC to SOC ICD (Tann)
MOC to External Network ICD
SGTS to MOC ICD (Anderson)
WSC Controller ICD
FlatSat to MOC ICD (Walters)
Detailed Mission Reqs. Doc (PDR; Pages)
L&EO Handbook (MOR; Ferrara)
Observatory Ops Handbook (FORR; Ward)
GS to Ops Transition Plan
Product Dev. Handbook
SDO GTS Maint. Manual
SGTS SOW
Project Database Mgmt Plan
Mission Ops Support Plan
MOC Reqs. Spec.
DDS Reqs. Spec.
WSC Facility MOU (SRR)
Tlm & Cmd Handbook (Maldonado)
DDS As-Built Design Doc.
MOC As-Built Design Doc.
SDO GTS As-Built Plans
MOC Design Document
GS Readiness Test Plan (Oertly)
SDO GTS Acceptance Test Plan
DDS User's Guide
MOC User's Guide
IT Risk Management Plan (Spinolo)
Info. Technology Security Plan
SDO GSt Facility Plans
Flt Operations Handbook
Flight Operations Procedures
Flight Operations Plan
GS Contingency Plan
GSRT Reqs. Matrix
SDO GTS Users Guide
External Network Ops Agreement
MOC Facility Ops Handbook
Network Ops Support Plan
WSC to MOC Ops Agreement
SOC to MOC Ops Agreement
FSW Maint. Agreement
WSC Facility Ops
GS Freeze Plan
Product Development Handbook

20 Documentation Status
Mission Operations Concept Document (MOCD): CCB approved
Detailed Mission Requirements (DMR) for SDO Ground System: submitted for formal review with CCR
Product Development Handbook
Interface Control Documents (ICDs):
- MOC-to-SOC ICD: draft under review
- DDS-to-SOC ICD: draft under review
- WSC-to-MOC ICD: draft in work
- GSt-to-MOC ICD: draft in work
- External Network-to-MOC ICD: waiting on identification of supporting networks
Ground System Requirements Specifications: final versions under review
Ground System Element Design Specifications: draft versions under review

21 Trade Studies Since SDO SCR
Antenna Trade Study
- Evaluation in progress to identify external networks to support early mission and nominal mission phases, including TDRS
DDS Trade Study
- Evaluation in progress on selection of a commercial vendor for high-speed front-end processors
MOC Trade Studies
- Evaluation completed for in-house mission-unique designs vs. COTS vs. GOTS vs. custom applications
- Finding: the five largest subsystems will be COTS/GOTS

22 SDO Dedicated Antenna - Architecture

23 Dedicated SDO Ground Network Support S-Band Hardware Architecture

24 S-Band Description
The downlink path consists of a low noise amplifier (LNA), fiber optic modems, down-converters, and the Receive Range Command Processor (RRCP)
- The RRCP performs all receive demodulation, bit and frame synchronization, and CCSDS packet handling; it also performs all command formatting, command verification, and ranging / range-rate functions
- Interface to DDS and MOC via LAN/WAN
The uplink path consists of the RRCP, an up-converter, fiber optic modems, and a power amplifier
The S-band system will provide:
- 2 kbps uplink for commanding
- health and safety real-time downlink (32 kbps nominal), plus 32 kbps for housekeeping dumps to the MOC
- 72-hour archive of all spacecraft telemetry
- initial signal acquisition
- angle, range, and Doppler tracking data
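The 72-hour telemetry archive above is small at these S-band rates. A back-of-envelope check, assuming the combined 64 kbps housekeeping stream runs continuously:

```c
/* Rough size of a 72-hour archive of the 64 kbps housekeeping stream. */
#include <stdio.h>

int main(void) {
    const double bytes = 64e3 * 72.0 * 3600.0 / 8.0;
    printf("72-hr HK archive: %.2f GB\n", bytes / 1e9);   /* ~2.07 GB */
    return 0;
}
```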

25 SDO Antenna Architecture – Ka-Band

26 Ka-Band Description
The spacecraft signal will be received at a frequency of 26.5 GHz and amplified by the LNAs prior to down-conversion to a lower intermediate frequency (IF)
Gigabit Ethernet will be used to allow the data from the Second TDRS Ground Terminal (STGT) (northern antenna) to reach the White Sands Ground Terminal (WSGT) (southern) site
The Ka-band receiver (part of the DDS) demodulates the high-rate science data from the spacecraft
The Ka-band system will provide:
- continuous downlink
- high-rate science data to the DDS
- precision antenna pointing

27 SDO Ground Network Description
Two 9.3-meter antennas configured for S-band and Ka-band; both are required to support high-rate data capture in the face of:
- Weather (rain, clouds)
- Maintenance
- Failures (mechanical, electrical)
Antennas will be located at White Sands, NM, approximately 3 miles apart (a dual-site availability sketch follows below)
Co-located Ka-band receivers / Front-End Processors (FEPs) will be installed (part of the DDS)
Gigabit Ethernet will be used to provide connectivity between the primary and secondary sites
Both antennas will be online all the time, capturing telemetry and forwarding it to the DDS
Each antenna will have redundant equipment chains
Station operations will be automated, with status monitored from the MOC during normal mission operations
Station control and reconfigurations can be conducted remotely by the MOC or on-site at WSC
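The dual-site argument can be illustrated with a toy availability model. The per-site outage figure below is invented for illustration, and real weather outages at sites only 3 miles apart are not fully independent:

```c
/* Toy availability model for two antenna sites. Assumes independent
 * outages, which rain correlation will degrade in practice. */
#include <stdio.h>

int main(void) {
    const double p_down = 0.02;   /* assumed per-site outage probability */
    printf("Single-site availability: %.2f%%\n", 100.0 * (1.0 - p_down));
    printf("Dual-site availability:   %.4f%%\n",
           100.0 * (1.0 - p_down * p_down));
    return 0;
}
```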

28 SDO Ground Station Make/Buy
Implementation will use sites at WSC and available infrastructure and personnel
The SDO antennas will be a competitive commercial buy
- The SDO Project will issue a second Request For Information (RFI) in June 2004 to begin the search for a vendor
- The vendor will be responsible for design, fabrication, installation, testing, and training
- The vendor will provide all necessary documentation to maintain and operate the antenna systems
- The SDO Project will be responsible for acceptance testing and will take full possession of the antennas when all test criteria have been met
Commercial off-the-shelf products will be used extensively in this development effort

29 SDO Ground Station Heritage
Ground system personnel have procured, installed, and tested ground stations that have supported launch and operations:
- Polar Ground Network (PGN) in support of the EOS missions Terra, Aqua, and soon Aura
- Landsat-7 Ground Station at Sioux Falls in support of QuikScat and Landsat-7
- DataLynx commercial ground station at Poker Flat in support of several NASA missions
Use of the Ka-band frequency brings special challenges:
- Tighter pointing constraints in order to support autotracking
- Higher antenna surface requirements
- Weather impacts
Mitigation of Ka-band challenges:
- Selection of vendor will emphasize prior expertise in Ka-band antenna system development
- Leverage NASA experience from Wallops Ka-band prototyping activities
- Leverage WSC expertise in antenna system support
- Several tracking sources: WSC col tower, ER-2 test flights, the Sun
- Environmental trade study identified the WSC location as the best site

30 External Network Mission Support
External network support for early mission phases is driven by ground station views of critical observatory maneuvers:
- Separation from the launch vehicle
- First apogee
- Four maneuver burns
- Three trim burns
External network support for the nominal mission is driven by ground view of the geosynchronous orbit:
- Support twice-weekly tracking passes
- Support observatory contingencies
A view period and requirements analysis of all mission phases will determine the external network(s) best suited to meet the Project's needs while minimizing costs and complexity

31 DDS Architecture

[Architecture diagram. At each antenna site (STGT and WSGT, Bldg T-1 Rm 150 GCE), Ka-band RF gear feeds a demodulator/bit sync into prime and hot-spare Front-End Processors (FEP 1, FEP 2, spare FEPs) that perform frame sync, NRZ-M to NRZ-L conversion, 150 Mbps Viterbi decoding, re-interleaving, Reed-Solomon decoding, pseudo-random noise removal, and VCDU sorting into temporary VCDU files. The DDS Core System performs file-exchange quality comparison across the FEP file sets, maintains a temporary online archive on a Fibre Channel SAN with QAC and control files, and runs a retransmit manager and file output processor that deliver archived telemetry files and near-real-time data retransmissions over TCP/IP on Gigabit Ethernet to the EVE SOC (LASP, Boulder, CO), AIA SOC (LMSAL, Palo Alto, CA), and HMI SOC (Stanford). DDS and antenna status & control are exchanged with the MOC at GSFC.]

32 DDS Description
All Ka-band data will be demodulated, decoded, and temporarily stored as VCDUs at the Front-End Processor (FEP) in a two-day circular buffer
- A set of files will be created for each instrument
- Prime and spare FEPs will be located at each antenna site
Instrument VCDU files will be transferred from the FEPs in real time to the DDS Core System
- The DDS Core System will accept data from either or both FEPs
A "best quality" instrument data file is generated from the FEP data sets; "best quality" is VCDU-based (a selection sketch follows below)
The "best quality" data files will be stored on line for 30 days, then deleted
Quality and Accounting (QAC) metadata files will be generated for each instrument file
Instrument data files and metadata files will be automatically transferred to the SOCs with minimal delay (approx. 1 minute)
A data catalog will be maintained to facilitate retransmission and deletion
Retransmissions will be done only at the request of the SOCs
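One way to picture the VCDU-based "best quality" selection is a per-sequence-number vote between the two FEP copies. A hypothetical sketch (the structures and policy are illustrative; the real formats belong to the DDS specification):

```c
/* "Best quality" selection sketch: for each VCDU sequence count, prefer an
 * error-free copy from either FEP; fall back to an errored copy; a miss at
 * both FEPs becomes a gap recorded in the QAC metadata. Illustrative only. */
#include <stdbool.h>

typedef struct {
    unsigned seq;      /* VCDU sequence counter */
    bool     present;  /* received at this FEP */
    bool     rs_ok;    /* passed Reed-Solomon decoding */
} vcdu_meta;

/* Returns 1 or 2 for the FEP whose copy to keep, 0 if neither has one. */
int best_copy(const vcdu_meta *fep1, const vcdu_meta *fep2) {
    if (fep1->present && fep1->rs_ok) return 1;
    if (fep2->present && fep2->rs_ok) return 2;
    if (fep1->present) return 1;
    if (fep2->present) return 2;
    return 0;
}
```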

33 DDS Make/Buy and Heritage
Ka FEP
- Ka FEPs will be commercially procured hardware components
- The Ka FEP will include the RF intermediate frequency (IF) section
- Ka FEPs will be used for Observatory I&T, with that experience leveraged for the operations version
DDS Core System
- The DDS Core System will be assembled in-house
- The DDS 40-Terabyte RAID system will be procured as a system
- DDS system software will be developed in two builds with some prototyping
  - The quality-compare function will be prototyped using the MIDEX approach
  - The 1st build will provide full functionality
  - The 2nd build will provide bug fixes and enhancements
- All software will be written in C
- All machines will be UNIX/Linux based

34 MOC Architectural Diagram
[MOC data-flow diagram. The MOC exchanges station schedules, acquisition data, and tracking data with the external network; FDS products, calibration and scheduling requests, and the science schedule with the mission planning system; and observatory commands, R/T and playback telemetry, and tracking data with the SDO ground site. The T&C system (ASIST/FEDS) exchanges housekeeping telemetry, memory dumps, simulated commands, flight software loads, and simulated housekeeping telemetry with the Flight Software Maintenance Lab (FLATSAT). Instrument commands, command load requests, science planning requests, instrument calibration requests, and observatory housekeeping telemetry flow between the MOC and the SDO SOCs and SOC staff. The MOC also handles DDS control and status, ground site control and status, H&S files from the DDS, the trending system, events/logs, a web server, and the Alert Notification System, which exchanges paging messages and responses with FOT staff.]

35 MOC Architecture - Major Subsystems
ASIST/FEDS: Telemetry & Command (T&C) Subsystem
- Provides real-time commanding and monitoring of the spacecraft
- Processes, distributes, and archives housekeeping telemetry
- Supports spacecraft and instrument I&T activities
- Provides remote control and monitoring of the SDO ground station and DDS
- Supports automation
Mission Planning & Scheduling (MPS) Subsystem
- Provides spacecraft and ground system planning and scheduling functions
- ASIST-based command load generation
Flight Dynamics Subsystem (FDS)
- Provides orbit determination and products
- Provides attitude determination, prediction, ACS sensor calibration, and products
- Performs maneuver planning
Integrated Trending & Analysis System (ITPS): Trending Subsystem
- Provides spacecraft engineering data analysis and long-term trending capabilities
Alert Notification System (ANS): Paging Subsystem
- Monitors event messages from the T&C subsystem and pages FOT and SOC personnel when key events occur (a paging-logic sketch follows below)
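The ANS behavior described above amounts to scanning the T&C event stream and paging on key events. A hypothetical sketch (the message keywords and the page_fot() hook are invented for illustration):

```c
/* Paging-logic sketch for the Alert Notification System. Illustrative:
 * real event formats and pager interfaces belong to the ANS design. */
#include <stdio.h>
#include <string.h>

static void page_fot(const char *msg) {    /* stand-in for the pager I/F */
    printf("PAGE: %s\n", msg);
}

void process_event(const char *event_line) {
    if (strstr(event_line, "LIMIT_VIOLATION") != NULL ||
        strstr(event_line, "RED_ALARM") != NULL)
        page_fot(event_line);  /* alert FOT/SOC staff during unstaffed hours */
}
```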

36 MOC Software Heritage
S/W Component | Implementation Approach | Heritage | New Development Effort
ASIST | GOTS | GSFC missions: MAP, IMAGE, EO-1; I&T support: XTE, TRMM, FUSE | Low
Mission Planning System | GOTS | GSFC missions: EO-1 | Low
FDS | GOTS & COTS | GSFC missions: EOS, EO-1, SMEX, GOES & others | Low
Alert Notification System | GOTS | GSFC missions: EO-1 | Low
ITPS | GOTS | GSFC missions: Landsat-7, SOHO | Medium

37 Communications Network Description
Dedicated ground stations give the opportunity for a dedicated communications ground network tailored to the SDO mission:
- High science data rates
- Tailored communication lines based on best market value
- SOC real-time mission involvement
- Standard routed IP technology
- Ethernet system interfaces
- Common carrier interfaces
- Nominal firewall requirements
- Nascom IPNOC monitored, maintained, and operated
- NPG 2810-compliant IT Security Plan tailored to SDO

38 Communications Network Architecture

39 Science Operations Centers (SOC)

40 HMI-AIA Joint Science Operations Center
HMI and AIA teams plan to merge the HMI and AIA SOCs to form a Joint Science Operations Center (JSOC)
- Science data capture through Level-0 processing is identical, with similar data volume
- AIA Level-1 processing matches the HMI pipeline model
The JSOC will be at both Stanford and the Lockheed-Martin Solar and Astrophysics Lab (LMSAL)
- SOC ops for both HMI and AIA at LMSAL
- Data capture, pipeline processing, online archive, tape archive, and export functions at Stanford
- HMI higher-level processing at Stanford; AIA higher-level processing at LMSAL
- High-speed network connection allows "near-local-disk" speeds between the two sites
The dataflow numbers shown in the following slides are for the Joint SOC

41 JSOC Implementation – HMI Component
The HMI SOC provides two primary functions:
HMI Instrument Operations
- Responsible for instrument science planning, commanding, operations, and health and safety monitoring
- Development based on previous-mission (SOHO, TRACE) development and operations (not a driver)
HMI Science Ground Data System
- The SOC-GDS (Ground Data System) captures, processes, archives, and exports data while supporting local and remote science investigations
- Captures data from the DDS, processes it to Level-0 images and then to Level-1 science observables, performs pipeline processing of higher-level "standard" helioseismology data products, and maintains the permanent archive of raw and higher-level data
- Supports science analysis by the local investigators and provides data and some computing to the HMI Co-Investigator team
- The SOC will provide the data in a form useful to accomplish the "Level 1" science goals of HMI; those goals can be achieved by the science team if sufficiently supported
- Significant heritage from the SOHO/MDI investigation (same PI and large overlap in science team); SOC data system user support follows the same model as MDI
- Similar algorithms/operation, with the system designed to accommodate a larger data volume

42 JSOC Implementation - AIA Component
Instrument MOC (JMOC): monitors health and safety and sends instrument commands
- Hardware and software developed by LMSAL as operational GSE for test and integration of HMI and AIA
- Responsible for instrument commanding, operations, and health and safety monitoring
- Development based on previous-mission (SOHO, TRACE, SXI, FFP) GSE development
- Minimal operations: commands sent only for software uploads, calibration, and occasional operational mode selection
Science Processing Center: provides data for scientific analysis and quick look
- Pipeline software developed and operated by Stanford
- All computers, disk drives, and tape libraries in a single computer system
- AIA quick-look and calibration software developed by LMSAL
- Some software for special science products developed by Co-Is and foreign collaborators
- Catalog uses formats developed for CoSEC and VSO
AIA CPU processing task is approximately 160 times that required for TRACE; the cost estimate assumes a factor-of-4 increase in CPU speed from 2001 to 2007 (see the arithmetic sketch below)
AIA on-line disk storage estimated at 270 Terabytes, assuming the average cost per Terabyte drops by a factor of 4 over the duration of the mission
- 120 Terabytes purchased in Phase C-D and 150 Terabytes in Phase E
- On-line data available on the Web
2500-Terabyte archive on robotic tape libraries for access to the entire mission database
- 240 Terabytes in Phase C-D and 2260 TB in Phase E
- Archive data available via web request, typically within 24 hours
Two web sites:
- Public site open to all, with limited data transfer allowed
- Professional site for data and processing requests
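The sizing claims above reduce to simple arithmetic; a sketch of the numbers as stated (the 4x CPU-speed gain is the slide's own assumption):

```c
/* Arithmetic behind the AIA sizing claims quoted above. */
#include <stdio.h>

int main(void) {
    /* 160x TRACE processing with an assumed 4x CPU-speed gain 2001->2007 */
    printf("TRACE-era CPU equivalents needed: %.0f\n", 160.0 / 4.0);

    printf("On-line disk:  %d TB\n", 120 + 150);   /* Phase C-D + Phase E */
    printf("Tape archive:  %d TB\n", 240 + 2260);  /* = 2500 TB total */
    return 0;
}
```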

43 HMI & AIA JSOC Architecture
[Architecture diagram. Housekeeping flows from the MOC (GSFC) to HMI & AIA operations and a housekeeping database at LMSAL; science data flows from the DDS (White Sands) to a redundant data capture system with a 30-day archive at Stanford, then through the JSOC pipeline processing system to the catalog, primary archive (with offsite/offline copies), and data export & web service. LMSAL hosts high-level data import, the AIA analysis system, a local archive, and quicklook viewing; exports serve the world science team, forecast centers, EPO, and the public.]

44 HMI - SOC Processing and Data Flow
[Processing and data-flow diagram (GB/day) including the LMSAL secure host; the same numbers are itemized on the next slide: joint ops housekeeping 0.04 with quick look; data capture 1230 into the redundant data capture system (2 processors, 30-day cache of 40 TB each); Level 0 (HMI & AIA) 1610; Level 1 (HMI) 1210; HMI high-level processing 75 (~200 processors); online data 325 TB + 50 TB/yr; LMSAL link (AIA Level 0, HMI magnetograms) 1200, with 1820 rarely needed; data exports 240; science archive 440 TB/yr (offsite); HMI science analysis archive 650 TB/yr; SDO scientist & user interface.]

45 HMI - SOC Processing and Data Flow
DDS data ingest: 1230 GB/day
Redundant data capture system: 2 processors; 30-day cache of 40 TB
Tape archive: 440 TB/year, sent offsite
JSOC Pipeline Processing System: 2 processors for Level 0, 16 processors for Level 1, ~200 processors for higher levels
- Level 0: 1610 GB/day; Level 1: 1210 GB/day; high level: 75 GB/day
- Developed by launch; software upgraded over mission life
LMSAL link: 1200 GB/day
Data Catalog and Archive System: 325 TB disk cache; 500 TB near-line tape system; 50 TB/yr permanent online disk; 650 TB/yr tape storage; Oracle database manages the data; all on a SAN
Exports: 240 GB/day
Data Export System: web access to the data archive; developed by launch, hardware upgraded over mission life

46 HMI - SOC Pipeline

[HMI data analysis pipeline diagram, from Level-0 filtergrams through Level-1 data products to higher-level science products. Level-1 branches include Doppler velocity (heliographic Doppler velocity maps, tracked tiles of Dopplergrams), Stokes I,V filtergrams, continuum brightness (tracked full-disk 1-hour averaged continuum maps, brightness features, solar limb parameters), I,Q,U,V full-disk 10-minute averaged maps, line-of-sight magnetograms (fast algorithm), and vector magnetograms (inversion algorithm). Higher-level products include egression and ingression maps, time-distance cross-covariance functions, ring diagrams, wave phase-shift maps, wave travel times, local wave frequency shifts, spherical harmonic time series to l=1000, mode frequencies and splittings, far-side activity index, deep-focus v and cs maps (0-200 Mm), high-resolution v and cs maps (0-30 Mm), Carrington synoptic v and cs maps, full-disk velocity v(r,Θ,Φ) and sound speed cs(r,Θ,Φ) maps (0-30 Mm), internal sound speed cs(r,Θ) and internal rotation Ω(r,Θ) (0<r<R), brightness images, line-of-sight and vector magnetic field maps, coronal magnetic field extrapolations, and coronal and solar wind models.]

47 AIA Data Flow

[Diagram: generation of Level 1, 1a, and 2 data from the Stanford pipeline.]

48 AIA Implementation Data Flow
Data from the Stanford pipeline: AIA Level 0 decompressed images and HMI Level 1a magnetograms, in near real time (1.1 TB/day)
Tape archive / backup of AIA Level 0 + HMI magnetograms: 1.4 TB/day, for life
AIA science data production (developed by launch; software and hardware upgraded over mission life): 1.18 TB/day, 100 TB total cache
- Quick-look movies (Level 1a) and browser catalog index
- Calibrated selected regions (Level 1a)
- Calibrated Level 0 (Level 1)
- Temperature maps (Level 2)
- Field-line models (Level 2)
On-line survey data for public outreach and some forecasting: public & user community via open web connection, 8 GB/day
On-line basic data for science analysis: SDO science team via controlled web connection, 100 GB/day

49 JSOC Development & Acquisition Strategy
SOC ops system developed at LMSAL as an evolution of the existing operating MDI, TRACE, SXI, etc. programs
- During instrument build, used with EGSE for I&T
- During operations, used about an hour/day for health checks and a day/week for command loads
SOC data system developed at Stanford as an evolution of the existing operating MDI data system:
First 2 years:
- Procure development system with most likely components (e.g., tape type, cluster vs. SMP, SAN vs. NAS, etc.)
- Modify pipeline and catalog infrastructure and implement on the prototype system
- Modify the analysis module API for greater simplicity and compliance with the pipeline
- Develop calibration software modules
Two years prior to launch:
- Complete Level-1 analysis development; verify with HMI test data
- Populate the prototype system with MDI data to verify performance
- Procure, install, and verify computer hardware
- Implement higher-level pipeline processing modules with Co-I support
During Phase E:
- Add media and disk-farm capacity in a staged plan, in half-year or yearly increments
- For the first two years of the mission, continue Co-I pipeline testing support

50 EVE SOC Overview
EVE Science Operations Center (SOC)
Monitor instrument performance, health and safety
Plan science operations, configure and control instruments
Science data processing, analysis, and distribution to the user community:
- Receive and store raw science data: Tb/day (archive to tape)
- Decompress raw data to Level 0: Tb (no EVE data compression)
- Process science data into "products": Tb/day (reduction of images to spectra)
- Catalog and index data: N/A for EVE (same as delivered science products, Tb/day)
- Archive all data for backup: Tb/day (archive to tape)
- Data on-line for mission: Tb/day
- Analyze science products, answer Level 1 science questions
- Deliver science products: Tb/day; solar EUV spectral irradiance data products on 10-sec cadence and daily averages
- Provide access for user community: daily data products available from the EVE FTP site; mission sets available on tape

51 EVE SOC Implementation Overview
EVE Science Operations Center (SOC) in Boulder, Colorado
EVE Instrument MOC (IMOC)
- Responsible for instrument commanding, operations, and health and safety monitoring
- Development based on previous missions (TIMED, SORCE); very little new IMOC development required
EVE Science Processing Operations Center (SPOC)
- Algorithms are similar to the TIMED SEE, SORCE, and SOHO SEM instruments, but the volume and scale of the processing is larger
- Archived EVE data volume is 200x the SORCE volume: use robotic tape storage units
- Continuous 24-7 data flow: auto-switching of redundant interface computers
- On-line data volume is only 5x the SORCE volume
Data products
- Space weather data product: near-real-time solar EUV indices delivered to NOAA SEC
- Solar EUV spectral irradiance: available daily; 10-sec integrations and daily average
- EVE data served to the public on an FTP site (1.4 GB/day; see the cross-check sketch below)
EVE SOC development
- In-house development
- Mission operations software (COTS-like from LASP): OASIS-CC for monitoring & commanding; OASIS-PS for planning & scheduling
- Science processing software (custom): heritage algorithms and software from TIMED SEE, SORCE, and SOHO SEM
- Stage for the first year of operations prior to launch, then increment storage capability each year
- Today's technology is assumed for the EVE SOC budget and processing estimates (no new technology required)
- No contingency backup approach required for EVE
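As a cross-check of the EVE volumes, the continuous 7 Mbps stream works out to under a tenth of a terabyte a day, consistent with the archive rates on the SPOC block diagram. A short sketch (overheads ignored):

```c
/* Cross-check of EVE data volumes from the 7 Mbps continuous rate. */
#include <stdio.h>

int main(void) {
    const double tb_day = 7e6 * 86400.0 / 8.0 / 1e12;   /* ~0.076 TB/day */
    printf("EVE raw downlink: %.3f TB/day\n", tb_day);
    printf("Public FTP share (1.4 GB/day): %.1f%% of raw\n",
           100.0 * 1.4e9 / (tb_day * 1e12));
    return 0;
}
```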

52 EVE SOC Design Overview
[Design diagram. The SPOC receives science data from the WSC DDS (with acknowledgments) and produces data products; EVE SOC instrument operations exchange commands, EVE observation procedures, planning/scheduling information, spacecraft activities, and spacecraft housekeeping/engineering data with the GSFC MOC, backed by an operations database; a public-access Internet server provides data products to the user community.]

53 EVE SPOC Architecture Block Diagram
[Block diagram. A DDS interface cluster (developed before instrument I&T) ingests the 7 Mbps stream into the basic raw data pipeline, with Level 0 the same as raw; data are archived to on-line tape with offsite tape backup (roughly 0.07 TB/day held 60 days, and 0.16 TB/day for life). The EVE central cluster (18 computers) produces Levels 1-3 (developed before I-I&T and by launch, upgraded over life); on-line private products and a public FTP site (10 computers) feed space weather operations and science data analysis to answer Level 1 questions, at about 3 GB/day private and 1.4 GB/day public for life. 1 EVE computer = dual 2.0 GHz Mac G5.]

54 SDO Facilities
At GSFC:
- MOC located on the 2nd floor of Building 14, acquired through the Space Utilization Committee
- Total MOC area is ~2940 ft²
- MOC consists of: Missions Operation Room (MOR), Missions Analysis Room (MAR), Subsystem Engineering Support (SES), Science Operations Support (SOS)
- Office space also acquired in Building 14: Rooms E212, E214, E218, E222, E226, and E232; total office space is ~1720 ft²
At WSC:
- Facility agreement documented in the WSC-GSFC MOU, authored by the Ground System Manager
- WSGT Building T-1 and STGT Building T-2; total WSGT area is ~320 ft², total STGT area is ~250 ft²
- Two antenna pad locations acquired: WSGT primary antenna site adjacent to the DDC; STGT secondary antenna site

55 Ground System Verification Architecture

[Verification architecture diagram. The GSFC Spacecraft I&T Facility (with MOC I&T support and a data source for all Ka science data), the KSC launch site, NASA/GSFC CTV data collection, the WSC col tower, the SDO antenna sites at WSC (north and south), and external ground station sites exchange commands, ephemeris and planning data, and housekeeping and tracking data with the GSFC Mission Operations Center (MOC). The DDS delivers science data to the SOCs, along with DDS and antenna status and control; the MOC exchanges housekeeping data, FDS and planning information, and status and requests with SOC GSE at the EVE SOC (LASP), AIA SOC, and HMI SOC.]

56 Ground System Verification Overview
Ground system readiness is to be determined through a series of ground system verification tests, operations tests, and simulations
Element-level testing
- Software and hardware provider system testing
- Element acceptance testing to verify each system's functionality and performance
- MOC development and test in the spacecraft lab I&T environment
Ground System Readiness Tests (GSRT)
- Verify integration of the ground system elements, from interface data flows between elements through full integrated end-to-end system validation
- Demonstrate/prove that the ground system satisfies the requirements
- Demonstrate element-to-element interfaces and functional compatibility
- Establish ground system end-to-end compatibility with the observatory and instruments
RF compatibility tests
- Verify the RF links between the spacecraft and the ground system antenna designs during spacecraft integration
- Provide spacecraft RF capability to test the actual ground system antennas at WSC
Mission operations simulations
- Verify procedures and personnel operations in long-duration operational environments

57 Verification Approach (continued)
The Ground System Readiness Test Plan will describe the overall effort of ground system testing and the tests designed to demonstrate the ground system's readiness to support the SDO mission. The GSRT Plan will specify:
- Test configurations, team roles and responsibilities, and processes for discrepancy tracking, resolution, and retest
- Test readiness dates, dependencies, and objectives
- Target dates for reviews of test procedures, scripts, and data resources
Test resources include:
- Instrument data gathered during instrument tests
- Spacecraft and Observatory data captured during spacecraft I&T
- Simulators such as FLATSAT, and Engineering Test Units (ETUs) developed by the Project
- Simulators, modulators, and digital and RF interface equipment provided by the GSFC CTV Section
A portable spacecraft Ka-band modulator is desired for testing the dedicated ground antenna systems; this may be flown on the NASA ER-2 plane for a dynamic test

58 Ground System Schedule Overview
[Schedule chart, dated 10/24/03, spanning CY 2003 Q4 through CY 2008. Mission milestones: SRR/SCR 4/03-9/03, ICR 3/04, PDR 6/04, CR 2/05, CDR, PER, and PSR through 1/07-1/08, launch 4/08. Ground system reviews: SRR 9/03, PDR 4/04, CDR 1/05, MOR 1/07, FORR 1/08. Ground tracking station: Ka-band frequency approval 7/03; construction/test through antenna site completion; FDF certification 4/06-3/07. Data Distribution System: design and DDS ops system prototype/implementation through 1/07; breadboard I&T FEP 3/04; high-rate I&T FEPs 10/05-6/06. Communications networks: MOC voice and ops T&C 4/05-3/06; WSC LAN & SOC voice 10/06; science 12/06. Mission Operations Center: design 9/03-8/04, implementation and I&T with ASIST releases R1, R2, and R3 at 7/05, 2/06, and 8/06; facility design/construction complete 2/05, facility complete 3/06. Science Operations Centers: JSOC design, implementation, and test, ready 4/07; EVE SOC design, implementation, and test, ready 4/07. Ground system readiness testing: acceptance testing from 1/07; GSRT and CTV activities 8/06-2/07 and through 1/08.]

59 Ground System Risks
There are no "Yellow" or "Red" ground system risks
The Ground System Feasibility Presentation conducted on 19 February 2004 yielded three action items:
- Investigate potential NASA systems as backups to the dedicated Ka-band and S-band forward & return links
- Analyze Code 450 EPGN lessons learned and apply findings to the SDO Ground Network design
- Reassess drivers for the science data capture requirement of 99.99%, 95% of the time, and compare SDO science data product processing requirements vs. EOS products
All action items are being worked, with responses due by the Confirmation Review in June 2004

60 Ground Segment Assessment
MOC implementation not seen as a driver
- Significant heritage from previous in-house missions
- Achievable with current technology; costs compare with other new-development missions
Ground station has challenges, but they appear reasonable
- Antenna ground stations: no new technology required; the RF data rate is proven
- DDS: storage requirements can be met with today's technology; the "bent pipe" concept with no processing is low risk and can be implemented with today's technology and commercial land lines
SOCs are the primary driver for data processing, storage, and analysis
- SOC algorithms, processing techniques, and user interfaces are proven, based on previous mission experience
- Significant heritage/re-use of data processing algorithms from SOHO, TRACE, & TIMED
- The scale and volume of data received by the SOCs are the primary difference and driver
- Clear understanding of how to address data management, based on previous science experience
- HMI and AIA implementations assume incremental and reasonable technology advancements (evolutionary, not revolutionary)
- The system can be implemented with today's technology, but would incur additional costs over current estimates

61 Conclusion
The ground system design meets requirements with margin, and we are ready to proceed to critical design.

