HCI/ComS 575X: Computational Perception Instructor: Alexander Stoytchev

The Kalman Filter (part 3) HCI/ComS 575X: Computational Perception Iowa State University, SPRING 2006 Copyright © 2006, Alexander Stoytchev February 20, 2006

Brown and Hwang (1992). Introduction to Random Signals and Applied Kalman Filtering. Ch. 5: The Discrete Kalman Filter.

Arthur Gelb, Joseph Kasper, Raymond Nash, Charles Price, and Arthur Sutherland (1974). Applied Optimal Estimation. MIT Press.

Let’s Start With a Demo: a Matlab program written by John Burnett.

A Simple Recursive Example
Problem statement: given the measurement sequence z_1, z_2, …, z_n, find the mean. [Brown and Hwang (1992)]

First Approach
1. Make the first measurement z_1. Store z_1 and estimate the mean as µ_1 = z_1.
2. Make the second measurement z_2. Store z_2 along with z_1 and estimate the mean as µ_2 = (z_1 + z_2)/2.
[Brown and Hwang (1992)]

First Approach (cont’d)
3. Make the third measurement z_3. Store z_3 along with z_1 and z_2 and estimate the mean as µ_3 = (z_1 + z_2 + z_3)/3.
[Brown and Hwang (1992)]

First Approach (cont’d)
n. Make the n-th measurement z_n. Store z_n along with z_1, z_2, …, z_{n-1} and estimate the mean as µ_n = (z_1 + z_2 + … + z_n)/n.
[Brown and Hwang (1992)]

Second Approach
1. Make the first measurement z_1. Compute the mean estimate as µ_1 = z_1. Store µ_1 and discard z_1.
[Brown and Hwang (1992)]

Second Approach (cont’d)
2. Make the second measurement z_2. Compute the estimate of the mean as a weighted sum of the previous estimate µ_1 and the current measurement z_2: µ_2 = (1/2)µ_1 + (1/2)z_2. Store µ_2 and discard z_2 and µ_1.
[Brown and Hwang (1992)]

Second Approach (cont’d)
3. Make the third measurement z_3. Compute the estimate of the mean as a weighted sum of the previous estimate µ_2 and the current measurement z_3: µ_3 = (2/3)µ_2 + (1/3)z_3. Store µ_3 and discard z_3 and µ_2.
[Brown and Hwang (1992)]

Second Approach (cont’d)
n. Make the n-th measurement z_n. Compute the estimate of the mean as a weighted sum of the previous estimate µ_{n-1} and the current measurement z_n: µ_n = ((n-1)/n)µ_{n-1} + (1/n)z_n. Store µ_n and discard z_n and µ_{n-1}.
[Brown and Hwang (1992)]

Comparison
Batch method: µ_n = (z_1 + z_2 + … + z_n)/n
Recursive method: µ_n = ((n-1)/n)µ_{n-1} + (1/n)z_n

Analysis
The second procedure gives the same result as the first. It uses the result from the previous step to obtain the estimate at the current step. The difference is that it does not need to keep the whole measurement sequence in memory; only the previous estimate and the count n are stored. [Brown and Hwang (1992)]

Second Approach (rewriting the general formula)
µ_n = ((n-1)/n)µ_{n-1} + (1/n)z_n
µ_n = µ_{n-1} + (1/n)(z_n - µ_{n-1})
Here µ_{n-1} is the old estimate, 1/n is the gain factor, and (z_n - µ_{n-1}) is the difference between the new reading and the old estimate.
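To make the equivalence concrete, here is a minimal Matlab sketch (not Burnett's demo program; the data and variable names are made up for illustration) that computes the mean of a noisy sequence both ways:

  % Batch vs. recursive estimation of the mean of a measurement sequence
  rng(0);                           % fix the random seed for reproducibility
  z = 5 + randn(100, 1);            % 100 noisy measurements of a constant (true value 5)

  mu_batch = sum(z) / numel(z);     % batch method: store all measurements, average once

  mu = 0;                           % recursive method: keep only the running estimate
  for n = 1:numel(z)
      mu = mu + (1/n) * (z(n) - mu);    % mu_n = mu_{n-1} + (1/n)(z_n - mu_{n-1})
  end

  fprintf('batch: %.6f   recursive: %.6f\n', mu_batch, mu);   % agree up to roundoff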

How should we combine the two measurements z_1 and z_2, with standard deviations σ_z1 and σ_z2? [Maybeck (1979)]

Calculating the new mean
µ = [σ_z2²/(σ_z1² + σ_z2²)] z_1 + [σ_z1²/(σ_z1² + σ_z2²)] z_2
Scaling factor 1 (the weight on z_1) and scaling factor 2 (the weight on z_2) sum to one. Why is the weight on z_1 not built from z_1's own variance? Each measurement is weighted by the variance of the other one, so the more certain measurement receives the larger weight.

Calculating the new variance of the combined estimate, given the two measurement variances σ_z1² and σ_z2². [Maybeck (1979)]

Calculating the new variance
For µ = S_1 z_1 + S_2 z_2, the scaling factors must be squared: the variance of a weighted sum of independent random variables is the sum of the individual variances weighted by the squared scaling factors,
σ² = S_1² σ_z1² + S_2² σ_z2²
Substituting the optimal scaling factors, the new variance is
σ² = (σ_z1² σ_z2²)/(σ_z1² + σ_z2²), or equivalently 1/σ² = 1/σ_z1² + 1/σ_z2²
Note that the new variance is smaller than either σ_z1² or σ_z2².
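As a sketch of this fusion rule (the measurement values and variances below are made up for illustration), assuming independent zero-mean measurement noise:

  % Combine two noisy measurements of the same quantity
  z1 = 10.3;  var1 = 4.0;           % measurement 1 and its variance (sigma_z1 squared)
  z2 =  9.8;  var2 = 1.0;           % measurement 2 and its variance (sigma_z2 squared)

  s1 = var2 / (var1 + var2);        % scaling factor 1 (weight on z1)
  s2 = var1 / (var1 + var2);        % scaling factor 2 (weight on z2)

  mu     = s1*z1 + s2*z2;                   % combined mean, pulled toward the better measurement
  newvar = (var1*var2) / (var1 + var2);     % combined variance: 1/newvar = 1/var1 + 1/var2

  fprintf('mu = %.3f, variance = %.3f\n', mu, newvar);   % variance < min(var1, var2)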

What makes these scaling factors special? Are there other ways to combine the two measurements? There are, but these factors minimize the error between the estimate and the true value of x: they are optimal in the least-squares sense.

Minimize the error

What is the minimum value?

Finding the Minimum Value
Y = 9x² - 50x + 50
dY/dx = 18x - 50 = 0
The minimum is obtained at x_min = 50/18 ≈ 2.78.
The minimum value is Y(x_min) = 9(50/18)² - 50(50/18) + 50 ≈ -19.44.
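The same answer can be checked numerically, for instance with Matlab's fminbnd (the search interval below is an arbitrary choice):

  % Numerical check of the worked minimization example
  Y = @(x) 9*x.^2 - 50*x + 50;
  [xmin, ymin] = fminbnd(Y, -10, 10);       % minimize Y on the interval [-10, 10]
  fprintf('xmin = %.4f (50/18 = %.4f), ymin = %.4f\n', xmin, 50/18, ymin);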

Start with two measurements of the same quantity x: z_1 = x + v_1 and z_2 = x + v_2, where v_1 and v_2 represent zero-mean noise with variances σ_1² and σ_2².

Formula for the estimation error
The new estimate is x̂ = S_1 z_1 + S_2 z_2. The error is e = x̂ - x.

Expected value of the error
If the estimate is unbiased, E[e] = 0 should hold. Since e = (S_1 + S_2 - 1)x + S_1 v_1 + S_2 v_2 and the noise terms have zero mean, this requires S_1 + S_2 = 1.

Find the Mean Square Error
With S_2 = 1 - S_1, the error reduces to e = S_1 v_1 + S_2 v_2, so the mean square error is E[e²] = ?

Mean Square Error
E[e²] = S_1² σ_1² + S_2² σ_2² (the cross term vanishes because v_1 and v_2 are uncorrelated and zero-mean).

Minimize the mean square error
Substitute S_2 = 1 - S_1 and set the derivative with respect to S_1 to zero:
d/dS_1 [S_1² σ_1² + (1 - S_1)² σ_2²] = 0

Finding S_1
2 S_1 σ_1² - 2(1 - S_1) σ_2² = 0. Therefore S_1 = σ_2²/(σ_1² + σ_2²).

Finding S_2
S_2 = 1 - S_1 = σ_1²/(σ_1² + σ_2²).

Finally we get what we wanted:
x̂ = [σ_2²/(σ_1² + σ_2²)] z_1 + [σ_1²/(σ_1² + σ_2²)] z_2
These are the same scaling factors used to combine the two measurements earlier.

Finding the new variance
σ² = E[e²] = S_1² σ_1² + S_2² σ_2², evaluated at the optimal scaling factors.

Formula for the new variance
σ² = (σ_1² σ_2²)/(σ_1² + σ_2²), or equivalently 1/σ² = 1/σ_1² + 1/σ_2².
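A small Monte Carlo sketch (illustrative values, not from the course materials) that checks both results: the empirical mean square error of the fused estimate matches σ_1²σ_2²/(σ_1² + σ_2²), and any other unbiased weighting does worse:

  % Monte Carlo check of the least-squares fusion result
  N    = 1e6;                        % number of trials
  x    = 3.0;                        % true (constant) value being measured
  var1 = 4.0;  var2 = 1.0;           % noise variances of the two measurements
  z1 = x + sqrt(var1)*randn(N, 1);   % measurements corrupted by zero-mean noise v1
  z2 = x + sqrt(var2)*randn(N, 1);   % measurements corrupted by zero-mean noise v2

  s1 = var2 / (var1 + var2);         % optimal scaling factors from the derivation
  s2 = var1 / (var1 + var2);
  xhat = s1*z1 + s2*z2;              % fused estimates

  fprintf('empirical MSE: %.4f   predicted: %.4f\n', ...
          mean((xhat - x).^2), var1*var2/(var1 + var2));   % both approx. 0.8

  s = 0.5;                           % any other unbiased weighting (s, 1-s) does worse
  fprintf('MSE with equal weights: %.4f\n', mean((s*z1 + (1-s)*z2 - x).^2));   % approx. 1.25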

New Topic: Particle Filters

Michael Isard and Andrew Blake (1998). “CONDENSATION -- conditional density propagation for visual tracking.” International Journal of Computer Vision, 29(1).

Movies from the CONDENSATION Web Page

More about this next time

THE END