ASEN 5070: Statistical Orbit Determination I Fall 2014

ASEN 5070: Statistical Orbit Determination I Fall 2014 Professor Brandon A. Jones Lecture 36: Fixed-Interval Smoothing

Announcements
- Homework 10 due on Friday
- Lecture quiz due by 5pm on Wednesday

Lecture Quiz 7

Question 1 (Percent Correct: 68%)
When analyzing the performance of a filter, one should characterize the residuals. When assessing accuracy based on post-fit residual ratios, we expect random deviations that are mostly between -3 and 3.
- True
- False

Question 2 (Percent Correct: 44%)
Assume we have a scenario where we want to estimate the position and velocity of a satellite and the height of a table in the Engineering Center, for a total of n = 7 state parameters. We are given m >> 7 independent observations of the range and range-rate of the satellite taken from a station in Hawaii (p = 2). Hence, in this case, the H-tilde matrix is 2x7 and there are many observation times. We use the batch processor and accumulate the sum of H^T H for each observation time. There is no a priori information available. What conclusion can we draw from this scenario?
- The system is observable and the accumulated information matrix is rank n (28%)
- The system is unobservable and the accumulated information matrix is rank n (28%)
- The system is unobservable and the accumulated information matrix is rank p-1 (0%)
- The system is unobservable and the accumulated information matrix is rank n-1 (44%)
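The most-chosen answer (unobservable, rank n-1) can be illustrated numerically. This sketch uses random measurement partials of my own invention, with the seventh column (the table height) zeroed out because no range or range-rate measurement depends on it:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 7         # estimated state parameters (position, velocity, table height)
p = 2         # range and range-rate at each observation time
n_times = 50  # many observation times (m >> n)

info = np.zeros((n, n))  # accumulated information matrix, sum of H~^T H~
for _ in range(n_times):
    H_tilde = rng.standard_normal((p, n))
    H_tilde[:, 6] = 0.0  # the table height never appears in the partials
    info += H_tilde.T @ H_tilde

# The last row and column are identically zero, so the matrix is singular.
print(np.linalg.matrix_rank(info))  # rank n - 1 = 6
```

No amount of additional data fixes this: every accumulated term has a zero seventh column, so the information matrix can never reach full rank.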

Question 3 (Percent Correct: 60%)
Consider the scenario where:
- There are two spacecraft in Earth orbit.
- The two vehicles are in a two-body gravity field.
- There is no modeling error.
- We have an infinite-precision computer.
- Measurement error is negligible, with a small variance and zero mean.
- Range observations are gathered relative to a ground station with a fixed location in inertial space.
- We make no other assumptions on the scenario.
Which of the following statements apply to this scenario?
- The position and velocity states of both spacecraft are observable (20%)
- The position and velocity states of neither spacecraft are observable (60%)
- We can estimate the state of one spacecraft at a time (8%)
- If we add a third spacecraft to the scenario, then we can estimate the states of all three (12%)

Question 4 (Percent Correct: 92%)
When characterizing the performance of a filter by inspecting the covariance matrix, we should take a look at:
- The variances on the diagonal of the matrix
- The correlation coefficients using the off-diagonal terms
- Both of the above
- Neither of the above

Question 5 (Percent Correct: 40%)
In the method of orthogonal transformations, we generate the matrix Q such that QH = [ R; 0 ], and the resulting R matrix is orthogonal.
- True
- False
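The statement is False: the orthogonal factor is Q, while R is upper triangular. A small NumPy sketch (an arbitrary 4x2 matrix of my own choosing) shows the roles of the two factors; note that NumPy factors H = QR, so its Q transposed plays the role of the Q in QH = [R; 0]:

```python
import numpy as np

rng = np.random.default_rng(1)
H = rng.standard_normal((4, 2))  # stand-in for a measurement-partials matrix

# Full QR factorization: H = Q @ R_full, with R_full = [R; 0]
Q, R_full = np.linalg.qr(H, mode="complete")

print(np.allclose(Q.T @ Q, np.eye(4)))          # True: Q is orthogonal
print(np.allclose(np.tril(R_full[:2], -1), 0))  # True: R is upper triangular
print(np.allclose(R_full[2:], 0))               # True: the bottom block is zero
```

It is the triangular structure of R, not any orthogonality of R, that makes the least squares solution cheap to recover by back-substitution.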

Lecture Quiz 8

Question 1 (Percent Correct: 52%)
For the problem below:
- There are n estimated states.
- There are m observations available over the data fit span.
- There are p observations available at a single point in time.
- The process noise u(t) is Gaussian with zero mean, covariance Q(t), and zero correlation in time.
- The vector u(t) is a vector of length s.
The process-noise transition matrix is:
- p x s (33%)
- s x m (5%)
- n x s (52%)
- m x n (10%)

Question 2 (Percent Correct: 90%)
One advantage of the Kalman filter over the batch processor is the ability to easily add a process noise model to the state dynamics.
- True
- False

Question 3 (Percent Correct: 67%)
To account for temporal correlation in unknown accelerations, methods based on a Gauss-Markov process add the estimation of a stochastic acceleration to the estimated state vector.
- True
- False
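As a sketch of what such a process looks like (the constants here are illustrative values of my own choosing, not values from the lecture), a first-order Gauss-Markov acceleration can be propagated as a discrete AR(1) sequence whose steady-state variance is set by the time constant and driving-noise strength:

```python
import numpy as np

# First-order Gauss-Markov (FOGM) process:
#   eta_{k+1} = exp(-beta * dt) * eta_k + w_k,   w_k ~ N(0, q_d)
# Choosing q_d = sigma2 * (1 - exp(-2*beta*dt)) gives steady-state variance sigma2.
rng = np.random.default_rng(2)

beta = 0.1     # inverse time constant [1/s] (illustrative)
dt = 1.0       # sample interval [s]
sigma2 = 1.0   # target steady-state variance
m = np.exp(-beta * dt)
q_d = sigma2 * (1.0 - m**2)

n_steps = 200_000
eta = np.zeros(n_steps)
w = np.sqrt(q_d) * rng.standard_normal(n_steps - 1)
for k in range(n_steps - 1):
    eta[k + 1] = m * eta[k] + w[k]

# The sample variance approaches the steady-state value sigma2.
print(np.var(eta))
```

Unlike white noise, successive samples are correlated over roughly 1/beta seconds, which is exactly the temporal correlation the question refers to.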

Question 4 (Percent Correct: 14%)
When using the State Noise Compensation (SNC) model of process noise, which of the following is assumed? (Select all that apply.)
- Linear dynamics (52%)
- Dense data, i.e., a small delta time (62%)
- The perturbing acceleration has zero correlation in time (76%)
- The random vector u(t) may be modeled as a discrete-time random process (71%)

Question 5 (Percent Correct: 38%)
To approximate the stochastic term in the analytic solution of a first-order Gauss-Markov process, we use an equivalent process. This equivalent process is selected to approximate the full probability density function of the stochastic term.
- True
- False

Fixed-Interval Smoothing

Motivation
The batch processor provides an estimate based on the full span of data; without process noise, the batch and sequential estimates (mapped to a common epoch) are equivalent. When including process noise, we lose this equivalence between the batch and any of the sequential processors. Is there some way to update the estimated state using information gained from future observations?

Smoothing
Smoothing is a method by which a state estimate (and optionally, the covariance) may be constructed using observations both before and after the epoch of interest.
Step 1: Process all observations forward in time using a CKF with process noise (SNC, DMC, etc.).
Step 2: Starting with the last observation processed, smooth backward through the observations.
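The two-step procedure can be sketched in Python. This is a minimal illustration, not the course's code: it assumes the forward CKF has stored its filtered products, its time-updated (a priori) products, and the state transition matrices, and the function and variable names are my own. The backward pass follows the standard Rauch-Tung-Striebel recursion.

```python
import numpy as np

def rts_smooth(x_filt, P_filt, x_pred, P_pred, Phis):
    """Backward smoothing pass over products stored by the forward CKF.

    x_filt[k], P_filt[k]     -- filtered (post-update) state and covariance at t_k
    x_pred[k+1], P_pred[k+1] -- time-updated (a priori) state and covariance at t_{k+1}
    Phis[k]                  -- state transition matrix Phi(t_{k+1}, t_k)
    """
    N = len(x_filt)
    xs = [None] * N
    Ps = [None] * N
    xs[-1], Ps[-1] = x_filt[-1], P_filt[-1]  # smoother is anchored at the last update
    for k in range(N - 2, -1, -1):
        # Smoothing gain: S_k = P_k Phi^T (P_bar_{k+1})^{-1}
        S = P_filt[k] @ Phis[k].T @ np.linalg.inv(P_pred[k + 1])
        xs[k] = x_filt[k] + S @ (xs[k + 1] - x_pred[k + 1])
        Ps[k] = P_filt[k] + S @ (Ps[k + 1] - P_pred[k + 1]) @ S.T
    return xs, Ps
```

Because each smoothed estimate leans on the one after it, the pass must run backward from the final observation; every smoothed estimate then reflects the entire data span.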

Notation
As presented in the book, the most common source of confusion for the smoothing algorithm is the notation. An estimated value, vector, or matrix carries a subscript denoting the time of the current estimate and a superscript denoting that the estimate is based on observations up to and including that time.

Smoothing visualization
Process the observations forward in time. Then imagine processing them backward in time (given everything needed to do that).


Smoothing visualization
Smoothing does not actually combine the forward and backward solutions, but thinking of it that way helps conceptualize what smoothing does. Smoothing results in a much more consistent solution over time, and it produces an optimal estimate using all observations.

Smoothing Caveats
- If you use process noise (or some other way to inflate the covariance), the optimal estimate at any time effectively pays attention only to nearby observations. While this is good, it also means smoothing doesn't always have a large effect.
- Smoothing does not remove the white noise on the signals. It is not a "cleaning" function; it is a "use all the data for your estimate" function.

Smoothing of State Estimate
First, we use the relationship between the filtered estimate at one time and the time-updated (a priori) estimate at the next. [Slide equations not captured in the transcript.] If Q = 0, the smoothed estimate reduces to the filtered estimate mapped through the state transition matrix.
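The equations on this slide did not survive the transcript. Assuming the standard fixed-interval (Rauch-Tung-Striebel) form, which is consistent with the surrounding discussion, the smoothed state relation is likely:

```latex
\hat{x}_k^{\,l} = \hat{x}_k^{\,k} + S_k \left( \hat{x}_{k+1}^{\,l} - \bar{x}_{k+1}^{\,k} \right),
\qquad
S_k = P_k^{\,k} \, \Phi^{T}(t_{k+1}, t_k) \left( \bar{P}_{k+1}^{\,k} \right)^{-1},
```

where $\bar{P}_{k+1}^{\,k} = \Phi(t_{k+1},t_k)\,P_k^{\,k}\,\Phi^{T}(t_{k+1},t_k) + \Gamma Q \Gamma^{T}$. If $Q = 0$, then $S_k = \Phi^{-1}(t_{k+1},t_k) = \Phi(t_k,t_{k+1})$, and the smoothed estimate is simply the filtered estimate mapped through the dynamics.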

Smoothing of State Estimate
Hence, during the forward CKF pass we store, at each observation time, the quantities needed by the backward pass: the updated (filtered) state and covariance, the time-updated (a priori) state and covariance, and the state transition matrix.

Smoothing of Covariance
Optionally, we may also smooth the state error covariance matrix.
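The covariance-smoothing equation is also missing from the transcript; in the standard Rauch-Tung-Striebel form it is likely:

```latex
P_k^{\,l} = P_k^{\,k} + S_k \left( P_{k+1}^{\,l} - \bar{P}_{k+1}^{\,k} \right) S_k^{T},
\qquad
S_k = P_k^{\,k} \, \Phi^{T}(t_{k+1}, t_k) \left( \bar{P}_{k+1}^{\,k} \right)^{-1}
```

Since the smoothed covariance at $t_{k+1}$ is never larger than the a priori covariance there, the correction term is negative semidefinite: smoothing can only shrink the covariance relative to the filter.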

Smoothing Algorithm
[The algorithm's equations from this slide were not captured in the transcript.]


Smoothing
If we suppose that there is no process noise (Q = 0), then the smoothing algorithm reduces to the CKF mapping relationships.
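To sketch why (a reconstruction, since the slide's equations are missing): with $Q = 0$ the a priori covariance is $\bar{P}_{k+1} = \Phi P_k \Phi^{T}$, so the smoothing gain collapses to the inverse transition matrix,

```latex
S_k = P_k \, \Phi^{T}(t_{k+1},t_k) \left( \Phi \, P_k \, \Phi^{T} \right)^{-1}
    = \Phi^{-1}(t_{k+1},t_k) = \Phi(t_k, t_{k+1}),
```

and the smoothed quantities are simply the mapped ones:

```latex
\hat{x}_k^{\,l} = \Phi(t_k, t_{k+1}) \, \hat{x}_{k+1}^{\,l},
\qquad
P_k^{\,l} = \Phi(t_k, t_{k+1}) \, P_{k+1}^{\,l} \, \Phi^{T}(t_k, t_{k+1}).
```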

An example: 4-41 and 4-42 Book p. 283

An example: 4-41 and 4-42 Book p. 284

Smoothing
Say there are 100 observations. We want to construct new estimates at each time using all of the data, i.e., estimates based on observations up to and including the 100th.
