Introduction to Kalman Filters

Introduction to Kalman Filters CEE 6430: Probabilistic Methods in Hydrosciences Fall 2008 Acknowledgements: numerous sources on the WWW, books, and papers

Overview What could Kalman Filters be used for in the Hydrosciences? What is a Kalman Filter? Conceptual Overview The Theory of the Kalman Filter (only the equations you need to use) Simple Example (talked through in detail with handouts)

A “Hydro” Example Suppose you have a hydrologic model that predicts river water level every hour (using the usual inputs). You know that your model is not perfect and you don’t trust it 100%. So you want to send someone to check the river level in person. However, the river level can only be checked once a day, around noon, not every hour. Furthermore, the person who measures the river level cannot be trusted 100% either. So how do you combine both estimates of river level (from the model and from the measurement) so that you get a ‘fused’ and better estimate? – Kalman filtering

Graphically speaking

What is a Filter by the way? Class – define mathematically what a filter is (make an analogy to a real filter). Other applications of Kalman Filtering (or filtering in general): your car’s GPS (predict and update location), surface-to-air missiles (hitting the target), ship or rocket navigation (Apollo 11 used some sort of filtering to make sure it didn’t miss the Moon!)

The Problem in General (let’s get a little more technical) Sometimes the system state and the measurement may be two different things (not like the river level example). [Block diagram: External Controls and System Error Sources feed the System, which has a System State (desired but not known); Measuring Devices, subject to Measurement Error Sources, produce Observed Measurements; an Estimator turns these into the Optimal Estimate of System State.] The system state cannot be measured directly. We need to estimate it “optimally” from the measurements.

What is a Kalman Filter? A recursive data processing algorithm that generates the optimal estimate of the desired quantities given the set of measurements. Optimal? For a linear system with white Gaussian errors, the Kalman filter is the “best” estimate based on all previous measurements. For a non-linear system, optimality is ‘qualified’. Recursive? It doesn’t need to store all previous measurements and reprocess all the data at each time step.

Conceptual Overview Simple example to motivate the workings of the Kalman Filter The essential equations you need to know (Kalman Filtering for Dummies!) Examples: Prediction and Correction

Conceptual Overview Lost on a 1-dimensional line (imagine that you are guessing your position by looking at the stars with a sextant). Position: y(t). Assume Gaussian-distributed measurements.

Conceptual Overview State space: position. Measurement: position. Sextant measurement at t1: mean = z1 and variance = σ²z1. Optimal estimate of position: ŷ(t1) = z1. Variance of error in estimate: σ²x(t1) = σ²z1. The boat is in the same position at time t2, so the predicted position is z1. But the sextant is not perfect.

Conceptual Overview So we have the prediction ŷ-(t2) of the state (by looking at the stars at t2) and a measurement using GPS, z(t2). GPS measurement at t2: mean = z2 and variance = σ²z2. We need to correct the sextant prediction using the measurement to get ŷ(t2). The estimate should be closer to the more trusted measurement – should we do linear interpolation?

Conceptual Overview [Figure: three Gaussians – the prediction ŷ-(t2), the measurement z(t2), and the corrected optimal estimate ŷ(t2) between them.] The Kalman filter helps you fuse measurement and prediction on the basis of how much you trust each (I would trust the GPS more than the sextant). The corrected mean is the new optimal estimate of position (basically you’ve ‘updated’ the sextant’s predicted position using the GPS). The new variance is smaller than either of the previous two variances.
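To make ‘fuse’ precise: this slide is the standard product-of-two-Gaussians result. Writing σ²p for the prediction variance and σ²z for the GPS variance (my labels, not the slides’):

\[
\hat{y}(t_2) = \frac{\sigma_z^2\,\hat{y}^-(t_2) + \sigma_p^2\,z(t_2)}{\sigma_p^2 + \sigma_z^2},
\qquad
\sigma^2(t_2) = \frac{\sigma_p^2\,\sigma_z^2}{\sigma_p^2 + \sigma_z^2} < \min\left(\sigma_p^2,\ \sigma_z^2\right)
\]

The corrected mean is a variance-weighted average of prediction and measurement, which is why it lands nearer the more trusted GPS, and why the new variance is smaller than either.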

Conceptual Overview (The Kalman Equations) Lessons so far: Make a prediction based on previous data: ŷ-k, σ-k. Take a measurement: zk, σz. Then: Optimal estimate (ŷ) = Prediction + (Kalman Gain) * (Measurement - Prediction) Variance of estimate = Variance of prediction * (1 - Kalman Gain)
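These two update equations are easy to check in code. A minimal scalar sketch (the function and variable names are mine, for illustration only):

```python
# Scalar measurement update implementing the two equations above.
def update(y_pred, var_pred, z, var_z):
    """Blend a Gaussian prediction with a Gaussian measurement."""
    K = var_pred / (var_pred + var_z)   # Kalman gain: how much to trust z
    y = y_pred + K * (z - y_pred)       # optimal estimate
    var = var_pred * (1 - K)            # variance of estimate (always shrinks)
    return y, var

# Sextant prediction 10.0 (variance 4.0) fused with GPS fix 12.0 (variance 1.0):
print(update(10.0, 4.0, 12.0, 1.0))     # -> (11.6, 0.8), closer to the GPS
```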

Conceptual Overview What if the boat were now moving? At time t3, the boat moves with velocity dy/dt = u. Naïve approach: shift the probability density to the right to get the naïve prediction ŷ-(t3). This would work if we knew the velocity exactly (a perfect model).

Conceptual Overview But you may not be so sure about the exact velocity. Better to assume an imperfect model by adding Gaussian noise: dy/dt = u + w. The distribution for the prediction ŷ-(t3) then both moves (away from ŷ(t2)) and spreads out.
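In code, the prediction step is just a shift plus a variance inflation. A scalar sketch (q, the variance of w per unit time, is my symbol, not the slides’):

```python
# Scalar time update: the mean shifts by the modeled motion and the
# variance grows by the process noise. Illustrative numbers only.
def predict(y, var, u, q, dt=1.0):
    y_pred = y + u * dt        # distribution moves with velocity u
    var_pred = var + q * dt    # ...and spreads out, since the model is imperfect
    return y_pred, var_pred

print(predict(11.6, 0.8, u=2.0, q=0.5))   # -> (13.6, 1.3)
```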

Conceptual Overview Now we take a measurement at t3: z(t3) from the GPS. We need to once again correct the sextant prediction ŷ-(t3) to get the corrected optimal estimate ŷ(t3). Same as before.

Conceptual Overview Lessons learnt from the conceptual overview: Initial conditions (ŷk-1 and σk-1). Prediction (ŷ-k, σ-k): use the initial conditions and a model (e.g. constant velocity) to make a prediction. Measurement (zk): take a measurement. Correction (ŷk, σk): use the measurement to correct the prediction by ‘blending’ prediction and residual – always a case of merging only two Gaussians. Result: an optimal estimate with smaller variance.
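Putting the whole cycle together as a self-contained scalar sketch (the boat, the noise levels, and all numbers are synthetic, chosen only to illustrate the predict/measure/correct loop):

```python
# Self-contained 1-D predict/measure/correct loop with synthetic data.
import random

y, var = 0.0, 1.0          # initial conditions: yhat_{k-1} and its variance
u, q, r = 1.0, 0.1, 0.5    # velocity, process-noise variance, measurement-noise variance
true_y = 0.0
random.seed(0)

for k in range(1, 6):
    y_pred, var_pred = y + u, var + q            # prediction (time update)
    true_y += u                                  # the real boat moves...
    z = true_y + random.gauss(0.0, r ** 0.5)     # ...and we observe it noisily
    K = var_pred / (var_pred + r)                # correction (measurement update)
    y = y_pred + K * (z - y_pred)
    var = var_pred * (1 - K)
    print(f"k={k}: estimate={y:+.2f} (true {true_y:.1f}), variance={var:.2f}")
```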

Blending Factor If we are sure about the measurements: the measurement error covariance R decreases to zero, so K increases and weights the residual more heavily than the prediction. If we are sure about the prediction: the prediction error covariance P-k decreases to zero, so K decreases and weights the prediction more heavily than the residual.
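In the scalar case the gain reduces to K = P-k/(P-k + R), and both limits are easy to check numerically (a throwaway sketch):

```python
# Quick check of both limits with the scalar gain K = P / (P + R).
for P, R in [(1.0, 1e-9), (1e-9, 1.0)]:
    print(f"P={P}, R={R} -> K={P / (P + R):.3f}")
# P=1.0, R=1e-09 -> K=1.000  (trust the measurement: residual dominates)
# P=1e-09, R=1.0 -> K=0.000  (trust the prediction: residual ignored)
```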

The set of Kalman Filtering Equations in Detail
Prediction (Time Update):
(1) Project the state ahead: ŷ-k = A ŷk-1 + B uk
(2) Project the error covariance ahead: P-k = A Pk-1 AT + Q
Correction (Measurement Update):
(1) Compute the Kalman gain: K = P-k HT (H P-k HT + R)-1
(2) Update the estimate with measurement zk: ŷk = ŷ-k + K (zk - H ŷ-k)
(3) Update the error covariance: Pk = (I - K H) P-k
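A compact NumPy sketch of these five equations. The constant-velocity river-level example (the choices of A, B, H, Q, R and all numbers) is my illustration, not the course’s code:

```python
import numpy as np

def kalman_step(y, P, z, A, B, u, H, Q, R):
    """One full cycle: time update, then measurement update."""
    # Prediction (time update)
    y_pred = A @ y + B @ u                      # (1) project the state ahead
    P_pred = A @ P @ A.T + Q                    # (2) project the error covariance
    # Correction (measurement update)
    K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)  # (1) Kalman gain
    y_new = y_pred + K @ (z - H @ y_pred)       # (2) update estimate with z_k
    P_new = (np.eye(len(y)) - K @ H) @ P_pred   # (3) update error covariance
    return y_new, P_new

dt = 1.0
A = np.array([[1.0, dt], [0.0, 1.0]])   # state = [river level, rate of change]
B = np.zeros((2, 1)); u = np.zeros(1)   # no control input in this example
H = np.array([[1.0, 0.0]])              # we measure only the level
Q = 0.01 * np.eye(2)                    # process-noise covariance (assumed)
R = np.array([[0.25]])                  # measurement-noise covariance (assumed)

y, P = np.zeros(2), np.eye(2)
y, P = kalman_step(y, P, z=np.array([1.2]), A=A, B=B, u=u, H=H, Q=Q, R=R)
print(y, P)
```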

Assumptions behind the Kalman Filter The model you use to predict the ‘state’, and the relation between the state and the measurement, both need to be LINEAR functions of the state (so how do we use non-linear rainfall-runoff models?). The model error and the measurement error (noise) must be Gaussian with zero mean.

What if the noise is NOT Gaussian? Given only the mean and standard deviation of the noise, the Kalman filter is the best linear estimator; non-linear estimators may be better. Why is Kalman Filtering so popular? · Good results in practice due to optimality and structure. · Convenient form for online, real-time processing. · Easy to formulate and implement given a basic understanding. · Measurement equations need not be inverted. It is ALSO popular in the hydrosciences: weather/oceanography/hydrologic modeling and data assimilation.

Now, to understand the jargon (you may begin the handouts): First, read the handout by P. D. Joseph. Next, read the handout by Welch and Bishop titled ‘An Introduction to the Kalman Filter’ (you can skip pages 4-5 and 7-11; pages 7-11 are on ‘Extended Kalman Filtering’, for non-linear systems). Read the solved example on pages 11-16.

Homework (conceptual) Explain in NO MORE THAN 1 PAGE the example that you read on pages 11-16 of the handout by Welch and Bishop. Basically, I want you to give me a simple conceptual overview of why and how ‘filtering’ was applied, using the previous analogy of a boat lost at sea. DUE – same date as the class project report. EXTRA CREDIT (5% marks) – if you review (3-4 pages) Kalman’s classic 1960 paper (handout). EXTRA CREDIT (5% marks) – if you turn in a detailed summary of the STEVE software (what it is, pros/cons, etc.)

References Kalman, R. E. 1960. “A New Approach to Linear Filtering and Prediction Problems”, Transactions of the ASME – Journal of Basic Engineering, pp. 35-45 (March 1960). Welch, G. and Bishop, G. 2001. “An Introduction to the Kalman Filter”, http://www.cs.unc.edu/~welch/kalman/ By the way, Dr. Rudolf Kalman is alive and living well today.