Rosetta Magnetic Field PDS Review B. J. Anderson.


Completeness of data
Science needs: See actions 1, 8.
Calibration levels: See actions 2-3, 5-7.
Housekeeping and science operations info.: See action 4.
Software: Not evaluated.
Documentation: See actions 9.
Calibration information: See actions 10.

PDS Compliance
Data structures: Appear to be PDS standard.
Conventions used: Standard time and magnetic field conventions are used.
Geometry information: Did not see SPICE kernel information – if these kernels are provided under another Rosetta delivery, they should be referenced in the MAG documentation.
SIS document: Seems complete.
Completeness:
– Documentation: YES (with a few minor corrections)
– Catalogs: Not evaluated.
– Index files: Not evaluated.
– Processing levels: YES.
– Data coverage: YES.

Recommended Revisions (1-3 of 13)

1. Time offsets due to the PIU (Power or Plasma Interface Unit?) software filter algorithm can be large (>100 s). This offset is applied only to UTC, not to OBT, in the products. Action: Clarify that this time correction was accounted for when relating the heater, reaction wheel, and other spacecraft events (power system transitions, etc. – though these are not mentioned) to the magnetic field signals in the IB and OB sensor data.

2. Corrections for the Lander heater current and reaction wheel signals were applied. Action: Spacecraft events (heater currents, solar array angles, high-gain antenna angles, reaction wheel rates) need to be provided (they are not included in the delivery) and registered in UTC, since the PIU time offset is included only in the UTC magnetic field data. (A minimal alignment sketch follows this slide.)

3. Temperature corrections were critical in calibrating the data. Action: Clarify the following: Was the time offset applied when performing the in-flight temperature-correction calibration, or were the temperature and magnetic field data aligned in OBT? (The latter would be wrong, given the long time offsets that the PIU software introduces.)
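To illustrate the concern in items 1-2: once the spacecraft event telemetry is available, aligning it with the field data should be done entirely on the UTC tags, since only those include the PIU offset. The following is a minimal sketch, not the team's pipeline; the function and argument names (field_at_events, field_utc_s, field_b, event_utc_s) are assumptions for illustration only.

```python
import numpy as np

def field_at_events(field_utc_s, field_b, event_utc_s):
    """Interpolate field components onto spacecraft event times (illustrative).

    field_utc_s : UTC timestamps of the field samples, in seconds; per the
                  data set, these already include the PIU filter correction.
    field_b     : (N, 3) array of magnetic field samples (nT).
    event_utc_s : UTC timestamps of heater / reaction-wheel / power events.
    """
    # Compare only UTC-tagged quantities: the PIU time offset is applied to
    # the UTC column of the products, not to OBT, so mixing OBT event times
    # with UTC field times could misalign the comparison by more than 100 s.
    event_utc_s = np.asarray(event_utc_s)
    return np.column_stack(
        [np.interp(event_utc_s, field_utc_s, field_b[:, k]) for k in range(3)]
    )
```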

Recommended Revisions (4-7 of 13)

4. Spacecraft (SC) currents evidently contribute to the magnetic field signals. Action: Specify where SC telemetry for currents and gimbaled structures is provided in the PDS delivery. If it is not in another data set, provision for it should be made in the magnetometer delivery.

5. The correction for reaction wheel signals is discussed in the frequency domain. Action: Clarify the following: How was the phase of the reaction wheel signal established? How was the conversion back to the time domain done (presumably via an inverse FFT)? What time windows were used for the FFT and IFFT to implement the correction (i.e., what frequency resolution was available to resolve the reaction wheels)? (A generic sketch of such a correction follows this slide.)

6. The data quality flag is very useful. Action: The definitions of perfect, good, and poor correlation between the IB and OB sensors are not given and should be added.

7. Clarification of the data quality flag definition: Action: The measure used to determine whether IB and OB show different long-term behavior is not given. That is, what does "long-term" mean? Longer than what time interval?
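For reference, a generic frequency-domain removal of a narrowband reaction wheel tone looks like the sketch below. This is not the team's documented algorithm (item 5 asks for that documentation); the function name, notch width, and wheel frequency are placeholders. It does show why the FFT window length matters: it fixes the frequency resolution df = 1/T.

```python
import numpy as np

def notch_wheel_tone(b, fs, f_wheel, half_width_hz=0.05):
    """Remove a narrowband reaction-wheel tone in the frequency domain (sketch).

    b             : 1-D array, one magnetic field component (nT)
    fs            : sampling rate (Hz)
    f_wheel       : wheel rotation frequency (Hz), e.g. from housekeeping
    half_width_hz : half-width of the notch placed around f_wheel
    """
    # The window length T = len(b) / fs sets the frequency resolution
    # df = 1 / T, which must be fine enough to separate the wheel tones
    # from each other and from the natural signal.
    spec = np.fft.rfft(b)
    freqs = np.fft.rfftfreq(len(b), d=1.0 / fs)
    # Zeroing the bins removes the amplitude and phase of the tone together,
    # which is why the phase handling in the actual correction matters.
    spec[np.abs(freqs - f_wheel) <= half_width_hz] = 0.0
    return np.fft.irfft(spec, n=len(b))
```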

Recommended Revisions (8 of 13)

8. The quality flag string is very useful. It would perhaps be useful to indicate which quality flags would be considered appropriate for science analysis. Action: Add the set of flags that can be regarded as indicating valid science data. Presumably these are:
Char 1: 0 or 1
Char 2: 0 or 1
Char 3: 1 (i.e., one would assume that only boom-deployed data are appropriate for science)
Char 4: 0
Char 5: 0 or 1 (*see notes below)
Char 6: 0
Char 7: x
Char 8: x
Thus, the following eight strings indicate valid science data: "xx000100", "xx010100", "xx000110", "xx010110", "xx000101", "xx010101", "xx000111", "xx010111". Is this true? (A flag-matching sketch follows this slide.)
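As a usage illustration only, an investigator could screen records against the presumed valid-science strings as below; the set itself, and the treatment of "x" as a wildcard position, are assumptions to be confirmed by the instrument team.

```python
# Candidate valid-science flag strings copied from the slide above; whether
# this set is correct is exactly the question posed to the instrument team.
VALID_SCIENCE_FLAGS = (
    "xx000100", "xx010100", "xx000110", "xx010110",
    "xx000101", "xx010101", "xx000111", "xx010111",
)

def is_valid_science(flag: str) -> bool:
    """Return True if an 8-character quality flag matches a valid pattern.

    'x' in the reference strings is treated as a wildcard position.
    """
    return any(
        len(flag) == len(ref)
        and all(r == "x" or r == f for r, f in zip(ref, flag))
        for ref in VALID_SCIENCE_FLAGS
    )
```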

Recommended Revisions (9-11 of 13)

9. The data set under review consists of only a few days of data. Action: Would it not be reasonable to request that some form of survey/overview plots generated by the team be provided in a platform-independent format, e.g. PNG or JPEG? (Note that such plots are included in the flight reports.) Separate stand-alone plot files would be extremely useful to investigators troubleshooting their own implementations of file readers and displays.

10. Extensive information on the amplitude response of the magnetometers versus frequency is provided. However, no information about the corresponding phase relationship is given. Action: The calibrated phase function versus frequency and/or the fixed time delay of the instrument should be given. (The sketch after this slide shows the relation between a fixed delay and phase.)

11. No information is given about the PIU filter, its frequency response, or its phase response. Action: Since the PIU time offset is dramatic, the frequency and phase responses of the PIU software should be documented as well.
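If the instrument (or PIU filter) response is well approximated by a pure time delay tau, the requested phase curve follows directly from phi(f) = -2*pi*f*tau, i.e. -360*f*tau in degrees. A minimal sketch, with the delay value a hypothetical placeholder to be supplied by the documentation:

```python
import numpy as np

def phase_from_delay_deg(freqs_hz, delay_s):
    """Phase (degrees) implied by a pure time delay: phi(f) = -360 * f * tau."""
    return -360.0 * np.asarray(freqs_hz) * delay_s

# Example: a (hypothetical) 0.5 s delay produces -18 degrees of phase at 0.1 Hz.
print(phase_from_delay_deg([0.1, 1.0], 0.5))
```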

Recommended Revisions (12-13 of 13)

12. Calibration gain, temperature, offsets, etc.: a veritable mountain of data and plots is provided, but it is not until page 163(!) of the RO_IGM_TR00003.pdf document that a useful summary of the results is presented. This summary should be given in a suitably named calibration-results document so that it can be readily located. In this regard, RO_IGEP_TR0028.pdf is a very useful document. Action: The calibration summary (RO_IGEP_TR0028.pdf) should be suitably named so that investigators can readily find this information.

13. The poor quality of the figures in RO_IGEP_TR013.pdf (and .doc) is unfortunate. These are highly compressed graphic files, which are barely legible. Action: Perhaps better-quality originals are still available?