HCI/ComS 575X: Computational Perception Instructor: Alexander Stoytchev
The Kalman Filter (Part 1). HCI/ComS 575X: Computational Perception, Iowa State University, Spring 2006. Copyright © 2006, Alexander Stoytchev. February 13, 2006
Maybeck, Peter S. (1979). Chapter 1 in "Stochastic Models, Estimation, and Control", Mathematics in Science and Engineering Series, Academic Press.
Greg Welch & Gary Bishop (2001). SIGGRAPH 2001 Course: "An Introduction to the Kalman Filter".
Rudolf Emil Kalman
Application: Lunar Landing
Application: Radar Tracking
Application: Missile Tracking
Application: Sailing
Application: Robot Navigation
Application: Other Tracking
Application: Head Tracking
Face & Hand Tracking
Quick Review of Probability
What is a probability?
Probability of A or B occurring
Probability of both A and B occurring together (assuming that they are independent of each other)
Conditional Probability
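For reference, the standard formulas behind these three review slides (the slide equations themselves are not reproduced in this text):

    P(A \cup B) = P(A) + P(B) - P(A \cap B)
    P(A \cap B) = P(A)\,P(B) \quad \text{(when A and B are independent)}
    P(A \mid B) = \frac{P(A \cap B)}{P(B)}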
Example: Rolling Dice
Example: Rolling Dice. Event A = {the sum of the numbers the dice show is 7 or 9}. Event B = {the second die shows 2 or 3}.
Example: Rolling Dice What is P(A)? What is P(B)? What is P(A or B)? What is P(A and B)? What is P(A given B)?
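A minimal Python sketch (not part of the original slides) that answers these questions by enumerating all 36 equally likely outcomes:

    from fractions import Fraction

    # All 36 equally likely outcomes of rolling two dice.
    outcomes = [(d1, d2) for d1 in range(1, 7) for d2 in range(1, 7)]

    A = {o for o in outcomes if o[0] + o[1] in (7, 9)}   # sum is 7 or 9
    B = {o for o in outcomes if o[1] in (2, 3)}          # second die shows 2 or 3

    def P(event):
        return Fraction(len(event), len(outcomes))

    print("P(A)       =", P(A))       # 5/18  (10/36)
    print("P(B)       =", P(B))       # 1/3   (12/36)
    print("P(A or B)  =", P(A | B))   # 19/36
    print("P(A and B) =", P(A & B))   # 1/12  (3/36)
    print("P(A | B)   =", Fraction(len(A & B), len(B)))  # 1/4 (3/12)

Note that P(A and B) = 1/12 is not equal to P(A)·P(B) = 5/54, so these particular events are not independent.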
Cumulative Probability Density. In the continuous domain the probability of the variable taking any single exact value is zero, therefore we need something else to describe a probability distribution.
Example of a Cumulative Density Function
Properties of the cumulative density functions
Probability Density Function (pdf)
Properties of the probability density functions
Probability Density Function (pdf)
Relationship between pdf and cdf
Summary Random Variable: Cumulative Density Function: Probability Density Function:
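In symbols, the summary amounts to the standard definitions (stated here because the original slide equations are not included in this text):

    F_X(x) = P(X \le x) = \int_{-\infty}^{x} f_X(t)\,dt, \qquad f_X(x) = \frac{dF_X(x)}{dx}
    F_X(-\infty) = 0, \quad F_X(+\infty) = 1, \quad f_X(x) \ge 0, \quad \int_{-\infty}^{+\infty} f_X(x)\,dx = 1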
Mean (Average)
Mean (Discrete Case)
Expected Value (Discrete Case)
Expected Value (Continuous Case)
The same idea applies to functions of the random variable X. Discrete case: E[g(X)] = Σ g(x) P(X = x). Continuous case: E[g(X)] = ∫ g(x) f(x) dx.
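A minimal Python sketch of the discrete-case formulas, using a fair six-sided die as the example (the die is my choice, not the slides'):

    from fractions import Fraction

    # A fair six-sided die: each value has probability 1/6.
    values = range(1, 7)
    p = Fraction(1, 6)

    # E[X] = sum over x of x * P(X = x)
    mean = sum(x * p for x in values)

    # E[g(X)] for g(x) = x**2, computed the same way
    second_moment = sum(x**2 * p for x in values)

    print(mean)            # 7/2
    print(second_moment)   # 91/6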
Moments. Let X be a random variable with pdf f(x). Then the k-th moment is given by E[X^k] = ∫ x^k f(x) dx (or Σ x^k P(X = x) in the discrete case).
Second Moment
Variance. Let µ = E[X]. Then the variance is σ² = E[(X − µ)²].
A trick for computing the variance: σ² = E[(X − µ)²] = E[X² − 2µX + µ²] = E[X²] − 2µE[X] + µ² = E[X²] − 2µ² + µ² = E[X²] − µ²
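A quick worked check of this identity, using the fair-die numbers from the sketch above (not from the slides): E[X²] = 91/6 and µ = 7/2, so

    \sigma^2 = E[X^2] - \mu^2 = \frac{91}{6} - \left(\frac{7}{2}\right)^{2} = \frac{182 - 147}{12} = \frac{35}{12}

which matches evaluating E[(X − µ)²] term by term.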
Standard Deviation: has the same units as the variable X; is equal to the positive square root of the variance.
Example. Data Set (of 4 numbers): 1, 2, 3, 4. N = 4. Mean = sum/N = (1 + 2 + 3 + 4)/4 = 10/4 = 2.5
Example. Data Set: 1, 2, 3, 4. µ = 2.5, N = 4. σ² = [ (1 − µ)² + (2 − µ)² + (3 − µ)² + (4 − µ)² ] / (N − 1) = [ 2.25 + 0.25 + 0.25 + 2.25 ] / (4 − 1) = 5 / 3 = 1.666(6)
Example: Shortcut Way. σ² = [ Σx² − (Σx)²/N ] / (N − 1) = [ (1 + 4 + 9 + 16) − 10²/4 ] / (4 − 1) = (30 − 25) / 3 = 5/3 = 1.666(6)
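The same numbers can be verified with Python's standard library (a sanity check, not part of the original slides):

    import statistics

    data = [1, 2, 3, 4]

    print(statistics.mean(data))      # 2.5
    print(statistics.variance(data))  # 1.666..., i.e. 5/3 (sample variance, divides by N - 1)
    print(statistics.stdev(data))     # ~1.291, the positive square root of the variance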
Gaussian Properties
The Gaussian Function
Gaussian pdf
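For reference, the Gaussian pdf with mean µ and variance σ² has the standard form (presumably the function plotted on these slides):

    f(x) = \frac{1}{\sigma \sqrt{2\pi}}\, \exp\!\left( -\frac{(x - \mu)^2}{2\sigma^2} \right)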
Properties If and Then
pdf for
Properties
Summation and Subtraction
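The standard result this slide presumably refers to: if X ~ N(µ1, σ1²) and Y ~ N(µ2, σ2²) are independent, then

    X + Y \sim N(\mu_1 + \mu_2,\ \sigma_1^2 + \sigma_2^2), \qquad X - Y \sim N(\mu_1 - \mu_2,\ \sigma_1^2 + \sigma_2^2)

Note that the variances add in both cases; subtraction does not cancel uncertainty.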
Noise Models
White Noise Properties Time domain Frequency domain
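A minimal numpy sketch (my own illustration, not from the slides) of the two defining properties: in the time domain white-noise samples are uncorrelated, and in the frequency domain the power is spread evenly (a flat spectrum):

    import numpy as np

    rng = np.random.default_rng(0)

    # Zero-mean Gaussian white noise with variance sigma**2.
    n = 4096
    sigma = 1.0
    w = rng.normal(0.0, sigma, n)

    # Time domain: the sample autocorrelation at any nonzero lag is near zero.
    autocorr_lag1 = np.mean(w[:-1] * w[1:])
    print(autocorr_lag1)  # close to 0

    # Frequency domain: average power per frequency bin is close to sigma**2
    # in every part of the spectrum (flat spectrum).
    power = np.abs(np.fft.rfft(w))**2 / n
    half = len(power) // 2
    print(power[:half].mean(), power[half:].mean())  # both close to 1.0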
The Kalman Filter: Theory
Definition: A Kalman filter is simply an optimal, recursive data processing algorithm. Under some assumptions, the Kalman filter is optimal with respect to virtually any criterion that makes sense.
Definition “The Kalman filter incorporates all information that can be provided to it. It processes all available measurements, regardless of their precision, to estimate the current value of the variables of interest.” [Maybeck (1979)]
Why do we need a filter? No mathematical model of a real system is perfect; there are real-world disturbances; and sensors are imperfect.
Application: Radar Tracking
Conditional density of position based on measured value of z1 [Maybeck (1979)] (figure labels: measured position, uncertainty)
Conditional density of position based on measurement of z2 alone [Maybeck (1979)] (figure labels: measured position 2, uncertainty 2)
Conditional density of position based on data z1 and z2 [Maybeck (1979)] (figure label: position estimate)
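The result illustrated here, as derived in Maybeck's Chapter 1: given two Gaussian measurements z1 and z2 with variances σ1² and σ2², the conditional density of position is again Gaussian, with

    \mu = \frac{\sigma_2^2}{\sigma_1^2 + \sigma_2^2}\, z_1 + \frac{\sigma_1^2}{\sigma_1^2 + \sigma_2^2}\, z_2, \qquad \frac{1}{\sigma^2} = \frac{1}{\sigma_1^2} + \frac{1}{\sigma_2^2}

so the combined uncertainty σ² is smaller than that of either measurement alone; this variance-weighted update is the one-dimensional core of the Kalman filter.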
Propagation of the conditional density [Maybeck (1979)]
More about this Next Time
THE END