1
ASEN 5070: Statistical Orbit Determination I, Fall 2014
Professor Brandon A. Jones
Lecture 36: Fixed-Interval Smoothing
2
Announcements
Homework 10 due on Friday
Lecture quiz due by 5pm on Wednesday
3
Lecture Quiz 7
4
Question 1 Percent Correct: 68%
When analyzing the performance of a filter, one should characterize the residuals. When assessing accuracy based on post-fit residual ratios, we expect random deviations that are mostly between -3 and 3.
True / False
5
Question 2 Percent Correct: 44%
Assume we have a scenario where we want to estimate the position and velocity of a satellite and the height of a table in the Engineering Center, for a total of n=7 state parameters. We are given m >> 7 independent observations of the range and range-rate of the satellite taken from a station in Hawaii (p=2). Hence, in this case, the H-tilde matrix is 2x7 and there are many observation times. We use the Batch processor and accumulate the sum of H^T*H for each observation time. There is no a priori information available. What conclusion can we draw from this scenario?
The system is observable and the accumulated information matrix is rank n (28%)
The system is unobservable and the accumulated information matrix is rank n (28%)
The system is unobservable and the accumulated information matrix is rank p-1 (0%)
The system is unobservable and the accumulated information matrix is rank n-1 (44%)
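To illustrate the rank argument with a toy numerical sketch (not part of the quiz; the sizes and random partials are made up): a state parameter that never appears in any observation contributes a zero column to H-tilde, so the accumulated information matrix comes up one rank short.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, n_times = 7, 2, 50

info = np.zeros((n, n))
for _ in range(n_times):
    H = np.zeros((p, n))
    H[:, :6] = rng.normal(size=(p, 6))   # range/range-rate partials w.r.t. position and velocity
    # column 6 (the table height) stays zero: no observation depends on it
    info += H.T @ H

print(np.linalg.matrix_rank(info))   # 6, i.e., n - 1
```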
6
Question 3 Percent Correct: 60%
Consider the scenario where:
There are two spacecraft in Earth orbit.
The two vehicles are in a two-body gravity field.
There is no modeling error.
We have an infinite-precision computer.
Negligible measurement error, with a small variance and zero mean.
Range observations are gathered relative to a ground station with a fixed location in inertial space.
We make no other assumptions on the scenario.
Which of the following statements apply to this scenario?
The position and velocity states of both spacecraft are observable (20%)
The position and velocity states of neither spacecraft are observable (60%)
We can estimate the state of one spacecraft at a time (8%)
If we add a third spacecraft to the scenario, then we can estimate the states of all three (12%)
7
Question 4 Percent Correct: 92%
When characterizing the performance of a filter by inspecting the covariance matrix, we should take a look at:
The variances on the diagonal of the matrix
The correlation coefficients using the off-diagonal terms
Both of the above
Neither of the above
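As a small illustration (a hypothetical numpy sketch with a made-up covariance matrix), both quantities can be pulled directly from P:

```python
import numpy as np

# Hypothetical 3x3 covariance matrix, for illustration only.
P = np.array([[ 4.0, 1.2, -0.6],
              [ 1.2, 9.0,  2.1],
              [-0.6, 2.1,  1.0]])

sigmas = np.sqrt(np.diag(P))          # standard deviations from the diagonal variances
corr = P / np.outer(sigmas, sigmas)   # correlation coefficients from the off-diagonal terms
print(sigmas)
print(corr)
```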
8
Question 5 Percent Correct: 40%
In the method of orthogonal transformations, we generate the matrix Q such that QH = [ R; 0 ], and the resulting R matrix is orthogonal.
True / False
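For context, a brief numpy sketch (an illustration, not part of the quiz) showing which factor is orthogonal and which is upper triangular in a QR factorization. numpy returns H = QR, so its Q-transpose plays the role of the Q in the statement above:

```python
import numpy as np

H = np.random.default_rng(0).normal(size=(6, 3))   # an illustrative tall H matrix
Q, R = np.linalg.qr(H, mode="complete")            # H = Q R, with R's lower rows zero

print(np.allclose(Q.T @ Q, np.eye(6)))             # True: Q is orthogonal
print(np.allclose(R[:3, :], np.triu(R[:3, :])))    # True: R's top block is upper triangular
print(np.allclose(R[3:, :], 0))                    # True: Q^T H = [R; 0]
```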
9
Lecture Quiz 8
10
Question 1 Percent Correct: 52%
For the problem below:
There are n estimated states.
There are m observations available over the data fit span.
There are p observations available at a single point in time.
The process noise u(t) is Gaussian with zero mean, covariance Q(t), and zero correlation in time.
The vector u(t) is a vector of length s.
The process-noise transition matrix is: (see the relation sketched below)
p x s (33%)
s x m (5%)
n x s (52%)
m x n (10%)
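For reference, a sketch of where the process-noise transition matrix appears in the linearized state propagation (written in the course's usual notation, not copied from the slide). Since it multiplies the s-vector u_k and must produce an n-vector, it is n x s:

$$ \bar{\mathbf{x}}_{k+1} \;=\; \underbrace{\Phi(t_{k+1},t_k)}_{n\times n}\;\underbrace{\hat{\mathbf{x}}_k}_{n\times 1} \;+\; \underbrace{\Gamma(t_{k+1},t_k)}_{n\times s}\;\underbrace{\mathbf{u}_k}_{s\times 1} $$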
11
Question 2 Percent Correct: 90%
One advantage of the Kalman filter over the Batch processor is the ability to easily add a process noise model to the state dynamics.
True / False
12
Question 3 Percent Correct: 67%
To account for temporal correlation in unknown accelerations, methods based on a Gauss-Markov process add the estimation of a stochastic acceleration to the estimated state vector.
True / False
13
Question 4 Percent Correct: 14%
When using the State Noise Compensation (SNC) model of process noise, which of the following is assumed? (select all that apply)
Linear dynamics (52%)
Dense data, i.e., a small delta time (62%)
The perturbing acceleration has zero correlation in time (76%)
The random vector u(t) may be modeled as a discrete-time random process (71%)
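As an illustration of how the dense-data assumption is used (a sketch of the commonly used approximation for a position/velocity state driven by white acceleration noise; the function name and sample values are made up):

```python
import numpy as np

def snc_process_noise(sigma_u, dt, dim=3):
    """Approximate SNC contribution Gamma*Q*Gamma^T for a [position; velocity] state
    driven by white acceleration noise of strength sigma_u^2, assuming dt is small
    (dense data) so the dynamics are nearly linear over the step."""
    I = np.eye(dim)
    return sigma_u**2 * np.block([
        [dt**3 / 3.0 * I, dt**2 / 2.0 * I],
        [dt**2 / 2.0 * I, dt * I],
    ])

Qk = snc_process_noise(sigma_u=1e-8, dt=10.0)   # illustrative values only
print(Qk.shape)   # (6, 6)
```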
14
Question 5 Percent Correct: 38%
To approximate the stochastic term in the analytic solution of a first-order Gauss-Markov process, we use an equivalent process. This equivalent process is selected to approximate the full probability density function of the stochastic term.
True / False
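To make the "equivalent process" concrete, here is a hypothetical sketch of simulating a first-order Gauss-Markov process in which the stochastic integral in the analytic solution is replaced by a discrete zero-mean Gaussian whose variance matches that of the integral (beta, sigma_u, and dt are illustrative parameters):

```python
import numpy as np

def fogm_sequence(beta, sigma_u, dt, n_steps, x0=0.0, rng=None):
    """Simulate x' = -beta*x + u(t), u white with strength sigma_u^2, using the
    discrete equivalent process for the stochastic term."""
    rng = np.random.default_rng() if rng is None else rng
    m = np.exp(-beta * dt)                              # deterministic decay per step
    var_w = sigma_u**2 * (1.0 - m**2) / (2.0 * beta)    # variance of the equivalent discrete noise
    x = np.empty(n_steps + 1)
    x[0] = x0
    for k in range(n_steps):
        x[k + 1] = m * x[k] + rng.normal(0.0, np.sqrt(var_w))
    return x

print(fogm_sequence(beta=1e-3, sigma_u=1e-6, dt=10.0, n_steps=5))
```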
15
Fixed Interval Smoothing
16
Motivation
The batch processor provides an estimate based on a full span of data. Without process noise, the batch estimate is equivalent to that of the sequential processors when mapped to a common epoch. When including process noise, we lose this equivalence between the batch and any of the sequential processors. Is there some way to update the estimated state using information gained from future observations?
17
Smoothing
Smoothing is a method by which a state estimate (and optionally, the covariance) may be constructed using observations before and after the epoch.
Step 1. Process all observations using a CKF with process noise (SNC, DMC, etc.).
Step 2. Start with the last observation processed and smooth back through the observations.
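A minimal sketch of this two-pass structure, assuming hypothetical ckf_step and smooth_step callables that implement the filter and smoother relations given on the following slides:

```python
# Hypothetical two-pass driver: forward CKF, then backward smoothing.
def smoothed_solution(observations, x0, P0, ckf_step, smooth_step):
    # Step 1: forward pass -- run the CKF, saving what the smoother needs.
    forward = []
    x, P = x0, P0
    for obs in observations:
        record = ckf_step(x, P, obs)          # dict with the filtered x, P, etc.
        forward.append(record)
        x, P = record["x_hat"], record["P"]

    # Step 2: backward pass -- start at the last observation and smooth back.
    smoothed = [forward[-1]]                  # smoothed = filtered at the final time
    for record in reversed(forward[:-1]):
        smoothed.insert(0, smooth_step(record, smoothed[0]))
    return smoothed
```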
18
Notation
As presented in the book, the most common source of confusion for the smoothing algorithm is the notation. For any value/vector/matrix, the subscript denotes the time of the current estimate and the superscript denotes the last observation used: for example, $\hat{\mathbf{x}}_k^{\,\ell}$ is the estimate at $t_k$ based on observations up to and including $t_\ell$.
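A few instances of the notation as it appears in the smoother relations below, assuming the book's convention:

$$ \hat{\mathbf{x}}_k^{\,k}\ \text{(filtered)}, \qquad \bar{\mathbf{x}}_{k+1}^{\,k}\ \text{(predicted)}, \qquad \hat{\mathbf{x}}_k^{\,\ell},\ \ell>k\ \text{(smoothed)} $$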
19
Smoothing visualization
[Figure: estimates from processing the observations forward in time, and from processing them backward in time (given everything needed to do that).]
21
Smoothing visualization
Smoothing does not actually combine the forward and backward solutions, but you can think about it that way in order to conceptualize what smoothing does. Smoothing results in a much more consistent solution over time, and it results in an optimal estimate using all observations.
22
Smoothing Caveats:
If you use process noise or some other way to increase the covariance, the result is that the optimal estimate at any time really only pays attention to observations nearby. While this is good, it also means smoothing doesn’t always have a big effect.
Smoothing shouldn’t remove the white noise found on the signals. It’s not a “cleaning” function, it’s a “use all the data for your estimate” function.
23
Smoothing of State Estimate
First, we use the smoothing gain
$$ S_k \;=\; P_k^{\,k}\,\Phi^T(t_{k+1},t_k)\,\big(\bar{P}_{k+1}^{\,k}\big)^{-1} $$
to relate the smoothed estimate at $t_k$ to the one at $t_{k+1}$:
$$ \hat{\mathbf{x}}_k^{\,\ell} \;=\; \hat{\mathbf{x}}_k^{\,k} \;+\; S_k\big(\hat{\mathbf{x}}_{k+1}^{\,\ell} - \bar{\mathbf{x}}_{k+1}^{\,k}\big), \qquad \bar{\mathbf{x}}_{k+1}^{\,k} = \Phi(t_{k+1},t_k)\,\hat{\mathbf{x}}_k^{\,k}. $$
If Q = 0, then $\bar{P}_{k+1}^{\,k} = \Phi P_k^{\,k}\Phi^T$, so $S_k = \Phi(t_k,t_{k+1})$ and the smoother simply maps the later estimate backward with the state transition matrix.
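A one-step sketch of these relations (hypothetical helper; x_hat and P are the filtered quantities at t_k, Phi is the transition matrix from t_k to t_{k+1}, Pbar_next is the time-updated covariance at t_{k+1}, and x_smooth_next is the smoothed state at t_{k+1}):

```python
import numpy as np

def smooth_state_step(x_hat, P, Phi, Pbar_next, x_smooth_next):
    """One backward step of the fixed-interval smoother for the state only."""
    S = P @ Phi.T @ np.linalg.inv(Pbar_next)   # smoothing gain S_k
    x_bar_next = Phi @ x_hat                   # predicted state at t_{k+1} (zero-mean noise)
    return x_hat + S @ (x_smooth_next - x_bar_next), S
```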
24
Smoothing of State Estimate
Hence, in the CKF, we store at each measurement time: the filtered estimate $\hat{\mathbf{x}}_k^{\,k}$ and covariance $P_k^{\,k}$, the state transition matrix $\Phi(t_{k+1},t_k)$, and the time-updated covariance $\bar{P}_{k+1}^{\,k}$ (equivalently, $Q_k$ and the process-noise transition matrix), since these are exactly the quantities the smoother needs.
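A sketch of one stored record per measurement time, using a hypothetical structure that holds exactly the quantities listed above:

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class CkfRecord:
    """Quantities saved at each CKF measurement update for later smoothing."""
    x_hat: np.ndarray      # filtered state estimate at t_k
    P: np.ndarray          # filtered covariance P_k
    Phi: np.ndarray        # state transition matrix Phi(t_{k+1}, t_k)
    Pbar_next: np.ndarray  # time-updated covariance at t_{k+1} (includes the process-noise term)
```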
25
Smoothing of Covariance
Optionally, we may smooth the state error covariance matrix:
$$ P_k^{\,\ell} \;=\; P_k^{\,k} \;+\; S_k\big(P_{k+1}^{\,\ell} - \bar{P}_{k+1}^{\,k}\big)S_k^T $$
26
Smoothing Algorithm
27
Smoothing Algorithm
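Putting the pieces together, a sketch of the full backward recursion built from the relations above and the hypothetical CkfRecord objects stored during the forward pass (not a transcription of the slide):

```python
import numpy as np

def fixed_interval_smoother(records):
    """Backward recursion over the stored CKF records (last record = final time).
    Returns lists of smoothed states and covariances, earliest time first."""
    xs = [records[-1].x_hat]     # at the final time the smoothed and filtered solutions agree
    Ps = [records[-1].P]
    for rec in reversed(records[:-1]):
        S = rec.P @ rec.Phi.T @ np.linalg.inv(rec.Pbar_next)      # smoothing gain S_k
        x_bar_next = rec.Phi @ rec.x_hat                          # predicted state at t_{k+1}
        xs.insert(0, rec.x_hat + S @ (xs[0] - x_bar_next))        # smoothed state
        Ps.insert(0, rec.P + S @ (Ps[0] - rec.Pbar_next) @ S.T)   # smoothed covariance
    return xs, Ps
```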
28
Smoothing
If we suppose that there is no process noise (Q = 0), then the smoothing algorithm reduces to the CKF mapping relationships: the smoothed solution is just the final filtered estimate and covariance mapped backward through the state transition matrix.
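To see the reduction explicitly (a short derivation consistent with the relations above):

$$ Q = 0 \;\Rightarrow\; \bar{P}_{k+1}^{\,k} = \Phi P_k^{\,k}\Phi^T \;\Rightarrow\; S_k = P_k^{\,k}\Phi^T\big(\Phi P_k^{\,k}\Phi^T\big)^{-1} = \Phi^{-1}(t_{k+1},t_k) = \Phi(t_k,t_{k+1}), $$
$$ \hat{\mathbf{x}}_k^{\,\ell} = \hat{\mathbf{x}}_k^{\,k} + \Phi(t_k,t_{k+1})\big(\hat{\mathbf{x}}_{k+1}^{\,\ell} - \Phi(t_{k+1},t_k)\hat{\mathbf{x}}_k^{\,k}\big) = \Phi(t_k,t_{k+1})\,\hat{\mathbf{x}}_{k+1}^{\,\ell}. $$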
29
An example: 4-41 and 4-42 Book p. 283
30
An example: 4-41 and 4-42 Book p. 284
31
Smoothing Say there are 100 observations
We want to construct new estimates using all of the data, i.e., $\hat{\mathbf{x}}_k^{\,100}$ for every $k = 1, \dots, 100$.
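As a usage sketch continuing the hypothetical helpers above (the placeholder records stand in for what the forward CKF pass would actually store):

```python
import numpy as np

# Build 100 dummy CkfRecord objects, then run the backward smoothing pass.
rng = np.random.default_rng(1)
n = 6
records = [
    CkfRecord(x_hat=rng.normal(size=n),
              P=np.eye(n) * 0.1,
              Phi=np.eye(n),                 # placeholder dynamics
              Pbar_next=np.eye(n) * 0.2)     # placeholder time-updated covariance
    for _ in range(100)
]
x_smoothed, P_smoothed = fixed_interval_smoother(records)
print(len(x_smoothed))   # 100: one smoothed estimate per observation time
```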