Software Benchmarking Results
V. Husson, Honeywell Technology Solutions Inc.
ILRS Analysis Working Group (AWG) Meeting, Washington DC, Oct 3-4, 2002

Slide 1: Software Benchmarking Results (V. Husson)

Slide 2: Benchmark Report Card (Orbit Comparisons)

Slide 3: Benchmark Report Card (Residual and Correction Comparisons)

Slide 4: Benchmark Report Card (SINEX File Comparisons)

Slide 5: Orbit Definitions
- Orbit A: nominal model (initial orbit; NOTHING adjusted during the run)
- Orbit B: fixed EOP and station coordinates (iterated orbit; ONLY the orbit adjusted)
- Orbit C: final orbit (ALL adjusted: orbit, station positions, biases, EOP)
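
The three runs differ only in which parameter groups are estimated. As a minimal sketch (the flag names are hypothetical, not taken from any of the benchmark software packages), the configurations amount to:

    # Which parameter groups each benchmark solution adjusts.
    ORBIT_RUNS = {
        "A": dict(orbit=False, stations=False, biases=False, eop=False),  # nominal model
        "B": dict(orbit=True,  stations=False, biases=False, eop=False),  # orbit-only iteration
        "C": dict(orbit=True,  stations=True,  biases=True,  eop=True),   # final solution
    }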

Slide 6: Summary of Solutions

Slide 7: Radial Comparisons (Orbit A & B)
[Figure: radial difference plots vs CSR for Orbit A and Orbit B: CRL-CSR, IAAK-CSR, NERC-CSR, NASDA-CSR, JCET-CSR, AUSLIG-CSR]
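
A radial comparison projects the position difference between two ephemerides onto the radial unit vector at each common epoch. A minimal numpy sketch (array names are illustrative, not from the benchmark software):

    import numpy as np

    def radial_difference(r_test, r_ref):
        # r_test, r_ref: (N, 3) geocentric positions at common epochs, same frame.
        # Returns (N,) radial differences; positive = test orbit higher than reference.
        r_hat = r_ref / np.linalg.norm(r_ref, axis=1, keepdims=True)
        return np.einsum("ij,ij->i", r_test - r_ref, r_hat)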

Slide 8: Radial Comparisons (Orbit A & B)

Slide 9: GEODYN Radial Analysis (JCET vs GEOS)

Slide 10: Radial Comparisons (Orbit C)

Slide 11: Radial Comparisons (Orbit C)

Slide 12: Radial Comparisons (Orbit C)

Slide 13: Radial Discontinuities (Orbit C)
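
A radial discontinuity here is the jump between consecutive arcs evaluated at their common boundary epoch. A small sketch under that assumption (function name hypothetical):

    import numpy as np

    def radial_discontinuity(pos_end_of_arc, pos_start_of_next_arc):
        # Both positions are 3-vectors evaluated at the same boundary epoch.
        r_hat = pos_end_of_arc / np.linalg.norm(pos_end_of_arc)
        return float(np.dot(pos_start_of_next_arc - pos_end_of_arc, r_hat))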

Slide 14: Radial Discontinuities (Orbit B)

Slide 15: Radial Discontinuities (Orbit A)

Slide 16: Residual Comparisons
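
For these comparisons, a residual is observed-minus-computed range: half the two-way time of flight scaled by the speed of light, minus the modeled station-satellite range after refraction, center-of-mass, and relativity corrections are applied. A minimal sketch (sign conventions for the corrections vary between software packages):

    C_LIGHT = 299792458.0  # m/s

    def range_residual(two_way_tof_s, computed_range_m, refraction_m, com_m, relativity_m):
        observed = 0.5 * two_way_tof_s * C_LIGHT  # one-way range from two-way flight time
        return observed - (computed_range_m + refraction_m + com_m + relativity_m)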

Slide 17: Residual Comparisons (Orbit A)

Slide 18: Residual Analysis of 1st Pass (Orbit A)

Slide 19: GEODYN Residual Analysis
[Figure: residual difference plots for Orbits A, B, and C: JCET - ASI, GEOS - ASI, JCET - GEOS]

Slide 20: GEODYN Residual Analysis (Midnight Crossing)

Slide 21: NASDA Residuals (Orbit B)
Large residuals on the first 3 normal points on Nov 1.

Slide 22: Residual Comparison (Orbit B)

Slide 23: Residual Comparison (NERC)

Slide 24: Refraction Analysis
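
The standard SLR refraction model of this era was Marini-Murray. As a reference point for the comparisons that follow, here is a sketch of that model as commonly documented (e.g. in the IERS Conventions); the coefficients should be checked against the original reference before any real use:

    import math

    def marini_murray_m(elev_rad, lat_rad, height_km, p_mbar, t_kelvin, e_mbar, wavelength_um):
        # Elevation-dependent tropospheric range correction, in meters.
        f_lam = 0.9650 + 0.0164 / wavelength_um**2 + 0.000228 / wavelength_um**4
        f_site = 1.0 - 0.0026 * math.cos(2.0 * lat_rad) - 0.00031 * height_km
        k = 1.163 - 0.00968 * math.cos(2.0 * lat_rad) - 0.00104 * t_kelvin + 0.00001435 * p_mbar
        a = 0.002357 * p_mbar + 0.000141 * e_mbar
        b = (1.084e-8 * p_mbar * t_kelvin * k
             + 4.734e-8 * (p_mbar**2 / t_kelvin) * (2.0 / (3.0 - 1.0 / k)))
        sin_e = math.sin(elev_rad)
        return (f_lam / f_site) * (a + b) / (sin_e + (b / (a + b)) / (sin_e + 0.01))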

Slide 25: Refraction Analysis

Slide 26: NASDA/IAAK Refraction Analysis

Slide 27: NASDA Refraction Analysis

Slide 28: IAAK Refraction Analysis

Slide 29: Refraction Analysis of 1st Pass

Slide 30: CSR Refraction Analysis

Slide 31: CSR Refraction Analysis

Slide 32: GEODYN Refraction Analysis

Slide 33: DGFI Refraction Analysis

Slide 34: Center of Mass Corrections
- Everyone is using .251 meters for LAGEOS.
- The JCET CoM corrections in their V4.cor files are in error (these files state a CoM of .252 m vs .251 m, but .251 m was actually used).
- Software changes may be necessary to accommodate system-dependent LAGEOS CoM corrections.
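
If system-dependent corrections are adopted, the single constant becomes a per-station lookup. A minimal sketch (the table is deliberately left empty; 0.251 m is the agreed LAGEOS value above):

    LAGEOS_COM_DEFAULT_M = 0.251   # value all centers used in this benchmark
    LAGEOS_COM_BY_STATION = {}     # station-dependent values would be filled in here

    def lageos_com_m(station_id):
        # Center-of-mass correction for a given station (the sign convention
        # depends on the software package applying it).
        return LAGEOS_COM_BY_STATION.get(station_id, LAGEOS_COM_DEFAULT_M)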

Slide 35: Relativity Correction
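
For reference, the relativistic range correction the centers compared is, in its standard (Shapiro delay) form, delta_rho = (2*GM/c^2) * ln((r_sta + r_sat + rho)/(r_sta + r_sat - rho)), which is at the millimeter level for LAGEOS. A minimal sketch:

    import math

    GM_EARTH = 3.986004418e14   # m^3/s^2
    C_LIGHT = 299792458.0       # m/s

    def shapiro_correction_m(r_station_m, r_satellite_m, rho_m):
        # r_station_m, r_satellite_m: geocentric distances; rho_m: station-satellite range.
        s = r_station_m + r_satellite_m
        return (2.0 * GM_EARTH / C_LIGHT**2) * math.log((s + rho_m) / (s - rho_m))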

Slide 36: Relativity Corrections

Slide 37: Relativity Corrections

Slide 38: Relativity Corrections

Slide 39: SINEX File Comparisons (Parameters and Unknowns)
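
Comparing SINEX submissions largely means aligning the SOLUTION/ESTIMATE blocks parameter by parameter. A minimal parser sketch, assuming the standard whitespace-delimited SINEX estimate layout (index, type, code, pt, soln, epoch, unit, constraint, value, std dev):

    def read_sinex_estimates(path):
        # Returns {(param_type, site_code, soln_id): (value, sigma)}.
        estimates, in_block = {}, False
        with open(path) as f:
            for line in f:
                if line.startswith("+SOLUTION/ESTIMATE"):
                    in_block = True
                elif line.startswith("-SOLUTION/ESTIMATE"):
                    break
                elif in_block and not line.startswith("*"):  # '*' lines are comments
                    fields = line.split()
                    key = (fields[1], fields[2], fields[4])
                    estimates[key] = (float(fields[8]), float(fields[9]))
        return estimates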

Slide 40: Range Bias Comparison (Orbit C)

Slide 41: Height Comparisons (Orbit C)

Slide 42: Height Comparisons (Orbit C) with Range Bias Removed
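
The improvement once biases are removed reflects the strong coupling between a constant range bias and station height: the partial of range with respect to height is roughly sin E, so a least-squares height estimate absorbs a bias b as approximately b * sum(sin E) / sum(sin^2 E) over a station's observations. A small illustrative sketch of that projection (not taken from the benchmark itself):

    import numpy as np

    def bias_into_height_m(bias_m, elevations_rad):
        # One-parameter fit d_range ~ dU * sin(E): height error absorbed
        # from an unmodeled constant range bias.
        s = np.sin(np.asarray(elevations_rad))
        return bias_m * s.sum() / (s ** 2).sum()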

Slide 43: Lessons Learned
- Need to specify the minimum resolution of parameters to be compared.
- Need a clearer definition of model standards.

Slide 44: Future Software Modifications That May Require Benchmark Testing
- Station-dependent CoM corrections
- Use of a bias file for a priori biases
- Multi-color data capability
- Weighting data based on the number of observations per bin (see the sketch below)
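
The last item follows the usual statistics of normal points: if a bin averages n independent full-rate returns, its noise is reduced by roughly sqrt(n). A minimal sketch of weighting under that assumption (names hypothetical):

    import math

    def normal_point_sigma_m(single_shot_sigma_m, n_obs):
        # Assumed normal-point uncertainty after averaging n_obs returns in a bin.
        return single_shot_sigma_m / math.sqrt(n_obs)

    # The least-squares weight would then be 1.0 / normal_point_sigma_m(...)**2.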

Slide 45: Recommendations
- QC your own files before you submit your solution.
- Report all range corrections in the residual file to at least .001 meter (i.e., millimeters).
- Verify whether any problems found in the benchmark will impact the corresponding POS/EOP solution(s).
- Put the benchmarking presentations on-line ASAP.
- Distribute findings to ACs not in attendance ASAP.
- In the POS/EOP pilot project, submit at least one solution with the .orb and .res files to ensure that problems identified in the benchmark do not "sneak" back in.

Slide 46: What's Next
- What analysis should be performed that has not yet been performed?
- Establish pass/fail criteria for the report card.
- Test for time-of-flight and epoch rounding/truncation issues.
- Do we need to modify our modeling requirements?
- Should we test and isolate any particular types of models (e.g., range bias estimation, along-track acceleration)?
- Should we expand the dataset to include LAGEOS-2 and/or Etalon, or other satellites?
- SP3 format for orbits (are we ready?)
- Separate the orbit and software benchmarking?
- Document and distribute results.
- List action items.
- Anything else?