12.540 Principles of the Global Positioning System Lecture 13 Prof. Thomas Herring Room 54-820A; 253-5941



Estimation Summary
– First-order Gauss-Markov processes
– Kalman filters: estimation in which the parameters to be estimated are changing with time

Specific common processes
– White noise: autocorrelation is a Dirac delta function; PSD is flat; the integral of the power under the PSD is the variance of the process (true in general)
– First-order Gauss-Markov (FOGM) process (one of the most common models in Kalman filtering)

Other characteristics of FOGM

Characteristics of FOGM
This process noise model is very useful because as β, the inverse correlation time, goes to infinity (zero correlation time), the process becomes white noise. When the correlation time goes to infinity (β → 0), the process becomes a random walk (i.e., a sum of white noise). NOTE: a random walk is not a stationary process because its variance tends to infinity as time goes to infinity. In the FOGM solution equation, note the damping term e^{-βΔt} x, which keeps the process bounded.
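As a concrete illustration (a minimal sketch, not from the original slides; the values of β, Δt, and σ are arbitrary), the discrete FOGM recursion x_{k+1} = e^{-βΔt} x_k + w_k, with the driving-noise variance chosen as σ²(1 − e^{−2βΔt}), stays bounded with stationary variance σ²:

```python
import numpy as np

def simulate_fogm(beta, dt, sigma, n, seed=0):
    """Simulate a first-order Gauss-Markov process.

    x_{k+1} = exp(-beta*dt) * x_k + w_k, with the white-noise variance
    chosen so the stationary variance of x is sigma**2.
    """
    rng = np.random.default_rng(seed)
    phi = np.exp(-beta * dt)           # damping term e^{-beta dt}
    q = sigma**2 * (1.0 - phi**2)      # driving white-noise variance
    x = np.empty(n)
    x[0] = rng.normal(0.0, sigma)      # start in the stationary state
    for k in range(1, n):
        x[k] = phi * x[k - 1] + rng.normal(0.0, np.sqrt(q))
    return x

x = simulate_fogm(beta=1.0, dt=0.1, sigma=2.0, n=200_000)
print(np.var(x))   # close to sigma**2 = 4: the process stays bounded
```

If instead the driving-noise variance is held fixed, the β → 0 limit of the same recursion is a random walk (variance grows without bound), and the β → ∞ limit is white noise, matching the limits described above.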

Formulation of Kalman filter
A Kalman filter is an implementation of a Bayes estimator. The basic concept behind the filter is that some of the parameters being estimated are random processes; as data are added to the filter, the parameter estimates depend on the new data and on the changes in the process noise between measurements. Parameters with no process noise are called deterministic.

Formulation
For a Kalman filter, you have measurements y(t) with noise v(t) and a state vector (parameter list) whose elements have specified statistical properties.
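The slide's equations did not survive the transcript; in the standard form (a reconstruction, with A the design matrix, S the state transition matrix, and V and W the measurement- and process-noise covariances used on the following slides):

```latex
y_t = A_t x_t + v_t, \qquad \langle v_t v_t^{T} \rangle = V_t
```
```latex
x_{t+1} = S_t x_t + w_t, \qquad \langle w_t w_t^{T} \rangle = W_t
```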

Basic Kalman filter steps
The Kalman filter can be broken into three basic steps:
Prediction: using the process noise model, "predict" the parameters at the next data epoch
– Subscript denotes the time a quantity refers to; superscript denotes the data included

Prediction step
The state transition matrix S projects the state vector (parameters) forward to the next time:
– For random walks: S = 1
– For rate terms: S is the matrix [1 Δt; 0 1]
– For FOGM: S = e^{-βΔt}
– For white noise: S = 0
The second equation projects the covariance matrix of the state vector, C, forward in time, with contributions from the state transition and from the process noise (W matrix). The W elements are 0 for deterministic parameters.
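These cases can be written down directly; a small sketch (NumPy, with Δt, β, and the covariance values chosen arbitrarily for illustration) builds the transition matrices above and propagates the covariance as C′ = S C Sᵀ + W:

```python
import numpy as np

dt, beta = 1.0, 0.5   # epoch spacing and FOGM inverse correlation time (arbitrary)

# State transition matrices for the cases above
S_rw   = np.array([[1.0]])                  # random walk
S_rate = np.array([[1.0, dt], [0.0, 1.0]])  # position + rate
S_fogm = np.array([[np.exp(-beta * dt)]])   # FOGM
S_wn   = np.array([[0.0]])                  # white noise

# Covariance propagation for the position + rate case:
# C' = S C S^T + W, with process noise only on the rate
C = np.diag([4.0, 1.0])    # current state covariance
W = np.diag([0.0, 0.25])   # W element is 0 for the deterministic position
C_pred = S_rate @ C @ S_rate.T + W
print(C_pred)
```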

Kalman Gain
The Kalman gain is the matrix that allocates the differences between the observations at time t+1 and their values predicted from the current state vector, weighting them according to the noise in the measurements and the noise in the state vector.

Update step
The step in which the new observations are "blended" into the filter and the covariance matrix of the state vector is updated. The filter has now been updated to time t+1, and measurements from t+2 can be added, and so on until all the observations have been added.
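Putting the three steps together for the position + rate case (a minimal numerical sketch; the matrices and noise values are illustrative, not from the slides):

```python
import numpy as np

# State: [position, rate]; one observation of position per epoch
S = np.array([[1.0, 1.0], [0.0, 1.0]])   # state transition (dt = 1)
W = np.diag([0.0, 0.25])                 # process noise (rate is stochastic)
A = np.array([[1.0, 0.0]])               # design matrix: we observe position
R = np.array([[0.5]])                    # measurement noise variance

x = np.array([0.0, 1.0])                 # current state estimate
C = np.diag([4.0, 1.0])                  # current state covariance

# Prediction step
x_pred = S @ x
C_pred = S @ C @ S.T + W

# Kalman gain: allocates the observation-minus-prediction difference
K = C_pred @ A.T @ np.linalg.inv(A @ C_pred @ A.T + R)

# Update step: blend in the new observation y
y = np.array([1.4])
x_new = x_pred + K @ (y - A @ x_pred)
C_new = C_pred - K @ A @ C_pred

print(x_new, C_new[0, 0])   # position variance shrinks after the update
```

Note that the update decrements the covariance matrix C, which is the source of the rounding-error concern discussed on the next slide.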

Aspects to note about Kalman filters
How is the filter started? Need to start with an a priori state vector and covariance matrix (basically at time 0).
Notice, in updating the state covariance matrix C, that at each step the matrix is decremented. If the initial covariances are too large, there is significant rounding error in the calculation. E.g., if a position is assumed ±100 m (variance 10^10 mm²) a priori and the data determine it to 1 mm, then C is decremented by 10 orders of magnitude (double precision has only about 16 significant digits).
Square-root-information filters overcome this problem but usually take longer to run than a standard Kalman filter.
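The rounding problem is easiest to see in single precision, where it becomes catastrophic (the numbers here are illustrative): with an a priori variance 12 orders of magnitude above the data variance, the naive decrement (1 − K)·C annihilates the answer, while the algebraically equivalent form C·r/(C + r) survives:

```python
import numpy as np

C = np.float32(1e10)   # huge a priori variance
r = np.float32(1e-2)   # data determine the parameter far better

K = C / (C + r)                       # gain rounds to exactly 1.0 in float32
naive  = (np.float32(1.0) - K) * C    # naive covariance decrement -> 0.0
stable = C * r / (C + r)              # equivalent form, no cancellation

print(naive, stable)   # 0.0 vs ~0.01
```

The same cancellation occurs in double precision once the dynamic range approaches its ~16 significant digits, which is why square-root-information formulations are preferred for loosely constrained filters.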

"Smoothing" filters
In a standard Kalman filter, the stochastic parameters obtained during the filter run are not optimal, because they do not contain the information about the deterministic parameters that is obtained from future data. A smoothing Kalman filter runs the filter forwards in time (FRF) and backwards in time (BRF), taking the full average of the forward filter at the update step with the backward filter at the prediction step.

Smoothing filters
The full average can be derived from the filter equations. The smoothing filter is
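The slide's equations were lost in the transcript; the standard form of this covariance-weighted combination of the forward estimate (x̂_f, C_f) and backward estimate (x̂_b, C_b) is (a reconstruction, not the original slide content):

```latex
C_s = \left( C_f^{-1} + C_b^{-1} \right)^{-1}, \qquad
\hat{x}_s = C_s \left( C_f^{-1} \hat{x}_f + C_b^{-1} \hat{x}_b \right)
```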

Properties of smoothing filter
Deterministic parameters (i.e., no process noise) should remain constant, with constant variance, in the smoothed results.
The solution takes about 2.5 times longer to run than a forward filter alone.
If only the deterministic parameters are of interest, then only the FRF is needed.

Note on a priori constraints
In a Kalman filter, a priori covariances must be applied to all parameters, but they cannot be too large or else large rounding errors occur (non-positive-definite covariance matrices).
The error due to a priori constraints is given approximately by formulas derived from the filter equations, assuming uncorrelated parameter estimates and an a priori variance that is large compared to the intrinsic variance with which the parameter can be determined.

Errors due to a priori constraints
Note: the error depends on the ratio of the a posteriori to the a priori variance, rather than on the absolute magnitude of the error in the a priori value.
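The slide's formulas were lost; in scalar form, the statement is roughly that for an a priori value x₀ with constraint variance σ₀² on a parameter whose data-determined (a posteriori) variance is σ̂², the bias in the estimate scales as (a reconstruction under the stated assumptions, not the original formula):

```latex
\Delta x \;\approx\; \frac{\hat{\sigma}^{2}}{\sigma_0^{2}}\,(x_0 - x),
\qquad \frac{\hat{\sigma}^{2}}{\sigma_0^{2}} \ll 1
```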

Contrast between WLS and Kalman filter
– In Kalman filters, a priori constraints must be given for all parameters; this is not needed in weighted least squares (although it can be done).
– Kalman filters allow zero-variance parameters; this cannot be done in WLS, since the inverse of the constraint matrix is needed.
– Kalman filters allow zero-variance data; again, this cannot be done in WLS, due to the inverse of the data covariance matrix.
– Kalman filters provide a method for applying absolute constraints; parameters can only be tightly constrained in WLS.
– In general, Kalman filters are more prone to numerical stability problems and take longer to run (strictly, many more parameters).
– Process noise models can be implemented in WLS, but they are very slow.

Applications in GPS
– Most handheld GPS receivers use Kalman filters to estimate velocity and position as functions of time.
– Clock behaviors are "white noise" and can be treated with a Kalman filter.
– Atmospheric delay variations are ideal for filter application.
– Stochastic variations in satellite orbits.