1
Lesson 2 – Kalman Filters
2
Agenda
Applications
What is a Kalman Filter
Conceptual Overview
Design Kalman Filters
Example
MATLAB Demo
3
Applications
Tracking objects: missiles, hands, lip reading
Economics
Navigation (Google car)
4
What is a Kalman Filter?
A recursive data processing algorithm.
Recursive? The KF does not require all previous data to be kept in storage.
The Kalman filter finds the optimal averaging factor (weighting) for each subsequent state, and in effect it retains a summary of the past states in its current estimate.
5
Previous lesson: Monte Carlo localization
The world is divided into discrete grid cells.
The posterior distribution is approximated as a histogram.
Now we look at the Kalman Filter:
The posterior distribution is given by a Gaussian.
A Gaussian is a continuous function whose area underneath sums to one.
6
Conceptual overview: lost on a 1D line
A measurement at t1: mean = z1 and variance = σ²_z1
Optimal estimate of position: x̂(t1) = z1
Variance of the error in the estimate: σ²_x(t1) = σ²_z1
The boat is assumed to be in the same position at time t2.
7
Conceptual overview: so we have the prediction x̂⁻(t2)
A better measurement at t2: mean = z2 and variance = σ²_z2
We need to correct the prediction using the measurement to obtain x̂(t2).
The corrected estimate lies closer to whichever of the prediction x̂⁻(t2) and the measurement z(t2) is more trusted (has the smaller variance).
8
Conceptual overview
The corrected mean is the new optimal estimate of position.
The new variance is smaller than either of the two previous variances.
(Figure: corrected optimal estimate x̂(t2), prediction x̂⁻(t2), measurement z(t2))
9
Conceptual overview: the optimal estimate x̂(t2) is the mean µ of the combined Gaussian
x̂⁻: prediction (a priori estimate)
x̂: update (a posteriori estimate)
K: Kalman gain
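The combination formulas themselves appear only as figures on the slides. As a reference, here is a minimal Python sketch of the standard 1D Gaussian fusion that this slide depicts (the function name is ours, not from the slides):

```python
def fuse_1d(x_pred, var_pred, z, var_z):
    """Combine a 1D Gaussian prediction with a 1D Gaussian measurement."""
    K = var_pred / (var_pred + var_z)      # Kalman gain: how much to trust the measurement
    x_new = x_pred + K * (z - x_pred)      # corrected mean (the optimal estimate)
    var_new = (1.0 - K) * var_pred         # corrected variance, smaller than either input
    return x_new, var_new
```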
10
Conceptual overview: at time t3, the boat moves with velocity dx/dt = u
Naïve prediction x̂⁻(t3): shift the probability distribution to the right by the motion.
This would work if we knew the velocity exactly (a perfect model).
11
Conceptual overview
It is better to assume an imperfect model by adding Gaussian noise: dx/dt = u + w
u = nominal velocity, w = noise term (uncertainty, with some variance)
The distribution for the prediction x̂⁻(t3) both moves (away from x̂(t2)) and spreads out.
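A minimal sketch of this prediction step in Python, assuming the velocity noise w contributes a variance of var_w per unit time (the function name and that noise model are assumptions, not from the slides):

```python
def predict_1d(x_est, var_est, u, var_w, dt):
    """Propagate a 1D Gaussian estimate forward under dx/dt = u + w."""
    x_pred = x_est + u * dt            # mean shifts by the nominal motion
    var_pred = var_est + var_w * dt    # variance grows: the distribution spreads out
    return x_pred, var_pred
```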
12
Conceptual overview: now we take a measurement at t3
We need to once again correct the prediction, just as before.
(Figure: corrected optimal estimate x̂(t3), measurement z(t3), prediction x̂⁻(t3))
13
Design Kalman Filters
14
Design Kalman Filters
Process to be estimated (state space):
xk = A·xk-1 + B·uk + wk-1, where w is the process noise with covariance Q
zk = H·xk + vk, where v is the measurement noise with covariance R

Prediction (x̂⁻k is the estimate based on measurements at previous time steps):
x̂⁻k = A·x̂k-1 + B·uk
P⁻k = A·Pk-1·Aᵀ + Q, where P⁻k is the prior error covariance

Correction (x̂k has additional information: the measurement at time k):
x̂k = x̂⁻k + K·(zk − H·x̂⁻k)
K = P⁻k·Hᵀ·(H·P⁻k·Hᵀ + R)⁻¹
Pk = (I − K·H)·P⁻k
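As a companion to these equations, here is a minimal sketch of the two steps in Python/NumPy (the course demo itself is in MATLAB); the function names are ours, and the matrix symbols follow the slide:

```python
import numpy as np

def kf_predict(x, P, A, B, u, Q):
    """Prediction step: propagate the estimate and its error covariance."""
    x_prior = A @ x + B @ u            # x̂⁻k = A·x̂k-1 + B·uk
    P_prior = A @ P @ A.T + Q          # P⁻k = A·Pk-1·Aᵀ + Q
    return x_prior, P_prior

def kf_correct(x_prior, P_prior, z, H, R):
    """Correction step: blend the prediction with the measurement zk."""
    S = H @ P_prior @ H.T + R                   # innovation covariance
    K = P_prior @ H.T @ np.linalg.inv(S)        # Kalman gain
    x_post = x_prior + K @ (z - H @ x_prior)    # a posteriori estimate
    P_post = (np.eye(P_prior.shape[0]) - K @ H) @ P_prior
    return x_post, P_post
```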
15
Design Kalman Filters
16
Design Kalman Filters
17
Kalman Filters - Example
Consider an object falling under a constant gravitational field. Let y(t) denote the height of the object, then:
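The equations of motion appear only as a figure on the slide. Under a constant gravitational field g they are presumably the standard kinematics, y''(t) = −g, which, discretized with a time step Δt and writing v for the velocity dy/dt, give:

y_k = y_{k-1} + v_{k-1}·Δt − ½·g·Δt²
v_k = v_{k-1} − g·Δt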
18
Kalman Filters - Example
Construct the state-space model from these equations, given that we can perform measurements zk of the height; that is, find A, B, uk, and H in:
xk = A·xk-1 + B·uk
zk = H·xk
19
Kalman Filters - Example
Construct the state-space model from the equations, given measurements zk of the height.
Solution: the matrices A, B, H and the input uk (shown as a figure on the slide; a reconstruction follows below).
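The solution itself is only a figure in the transcript. A standard choice consistent with the discretized kinematics above, taking the state as x_k = [y_k, v_k]ᵀ and the input as u_k = g, is presumably:

A = [1  Δt; 0  1]
B = [−½·Δt²; −Δt]
H = [1  0]

so that x_k = A·x_{k-1} + B·u_k reproduces the two kinematic equations and z_k = H·x_k measures the height y_k.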
20
Example: MATLAB demo (a Python sketch of a comparable demo follows below).
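The MATLAB script itself is not included in the transcript. As a stand-in, here is a minimal, self-contained Python/NumPy sketch that simulates the falling object, adds measurement noise, and runs the predict/correct equations above; all parameter values (g, Δt, noise levels, initial guesses) are illustrative assumptions.

```python
import numpy as np

# Illustrative parameters (assumptions, not from the slides)
g, dt, n_steps = 9.81, 0.1, 50
meas_std = 2.0                 # std. dev. of the height measurement noise

# State x = [height, velocity]; input u = g (falling-object model)
A = np.array([[1.0, dt],
              [0.0, 1.0]])
B = np.array([[-0.5 * dt**2],
              [-dt]])
H = np.array([[1.0, 0.0]])
Q = 1e-4 * np.eye(2)           # small assumed process noise covariance
R = np.array([[meas_std**2]])
u = np.array([[g]])

rng = np.random.default_rng(0)
x_true = np.array([[100.0], [0.0]])   # true state: start 100 m up, at rest
x_est = np.array([[95.0], [1.0]])     # deliberately wrong initial guess
P = 10.0 * np.eye(2)                  # initial error covariance

for k in range(n_steps):
    # True dynamics and a noisy height measurement
    x_true = A @ x_true + B @ u
    z = H @ x_true + rng.normal(0.0, meas_std, size=(1, 1))

    # Predict
    x_est = A @ x_est + B @ u
    P = A @ P @ A.T + Q

    # Correct
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
    x_est = x_est + K @ (z - H @ x_est)
    P = (np.eye(2) - K @ H) @ P

    print(f"k={k:2d}  true height={x_true[0,0]:7.2f}  "
          f"measured={z[0,0]:7.2f}  estimated={x_est[0,0]:7.2f}")
```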