Tracking with Linear Dynamic Models

Introduction

Tracking is the problem of generating an inference about the motion of an object given a sequence of images.

Applications:
- Motion capture
- Recognition from motion
- Surveillance
- Targeting

Tracking

Very general model:
- We assume there are moving objects, which have an underlying state X.
- There are measurements Y, some of which are functions of this state.
- There is a clock; at each tick the state changes, and we get a new observation.

Examples:
- The object is a ball; the state is 3D position plus velocity; the measurements are stereo pairs.
- The object is a person; the state is the body configuration; the measurements are frames; the clock is in the camera (30 fps).

Linear Dynamics

Tracking can be thought of as an inference problem:
- The moving object has some form of internal state, which is measured at each frame.
- We need to combine our measurements as efficiently as possible to estimate the object's state.
Assume both dynamics and measurements are linear (non-linearity introduces difficulties).

Three main issues

Simplifying Assumptions

Tracking as induction

- Assume data association is done (we'll talk about this later; it is a dangerous assumption).
- Do correction for the 0'th frame.
- Assume we have a corrected estimate for the i'th frame; show we can do prediction for frame i+1, then correction for frame i+1.

Base case

Induction step Given

Induction step

Linear dynamic models

Use the notation ~ to mean "has the pdf of"; N(a, b) is a normal distribution with mean a and covariance b. Then a linear dynamic model has the form of a normal dynamics model driving the state, plus a normal measurement model. This is much, much more general than it looks, and extremely powerful.
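The slide's equation is an image; here is a hedged reconstruction of the standard form in common textbook notation (the specific symbols are assumptions):

```latex
x_i \sim N(\mathcal{D}_i\, x_{i-1},\ \Sigma_{d_i})  % dynamics model
y_i \sim N(\mathcal{M}_i\, x_i,\ \Sigma_{m_i})      % measurement model
```

Here the dynamics matrix maps the old state to the expected new state, and the measurement matrix maps the state to the expected measurement.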

Examples: drifting points

- We assume that the new position of the point is the old one, plus noise.
- For the measurement model, we may not need to observe the whole state of the object. For example, for a point moving in 3D, at the 3k'th tick we see x, at the (3k+1)'th tick we see y, and at the (3k+2)'th tick we see z. In this case, we can still make decent estimates of all three coordinates at each tick.
- This property, which does not apply to every model, is called observability.

Examples

- Points moving with constant velocity
- Points moving with constant acceleration
- Periodic motion
- Etc.

Points moving with constant velocity

We have the position updated by the velocity at each tick, plus noise (in the slide's equations, the Greek letters denote the noise terms). Stack the position u and velocity v into a single state vector, which is the form we had above.
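The slide's equations are rendered as images, so here is a minimal sketch of the stacked constant-velocity model in code. The time step, initial state, and noise-free rollout are illustrative assumptions; a real tracker would also add the process noise.

```python
import numpy as np

dt = 1.0  # time step between ticks (assumed)

# Constant-velocity model: state x = (position u, velocity v).
D = np.array([[1.0, dt],
              [0.0, 1.0]])   # dynamics: u' = u + dt*v, v' = v
M = np.array([[1.0, 0.0]])   # measurement model: we observe position only

x = np.array([0.0, 2.0])     # start at position 0 with velocity 2
for _ in range(5):
    x = D @ x                # noise-free prediction, for illustration
print(x)                     # position advances by dt * velocity per tick
```

After five ticks the position is 10 and the velocity is unchanged, matching the "old position plus velocity" update.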

Points moving with constant acceleration

We have the velocity updated by the acceleration, and the position by the velocity, at each tick, plus noise (again, the Greek letters denote the noise terms). Stack the components into a single state vector, which is the form we had above.
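In the same spirit, a sketch of the constant-acceleration model; the (position, velocity, acceleration) state layout and the time step are assumptions:

```python
import numpy as np

dt = 1.0  # time step between ticks (assumed)

# Constant-acceleration model: state x = (position u, velocity v, acceleration a).
D = np.array([[1.0, dt, 0.0],
              [0.0, 1.0, dt],
              [0.0, 0.0, 1.0]])
M = np.array([[1.0, 0.0, 0.0]])  # again, we observe position only

x = np.array([0.0, 0.0, 1.0])    # at rest, with unit acceleration
for _ in range(3):
    x = D @ x                    # noise-free prediction, for illustration
print(x)                         # position grows quadratically in time
```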

(Figure: a constant velocity dynamic model; the plots show velocity and position.)

(Figure: a constant acceleration dynamic model; x-axis: position, y-axis: velocity.)

The Kalman Filter

Key ideas:
- Linear models interact uniquely well with Gaussian noise: make the prior Gaussian, and everything else stays Gaussian, so the calculations are easy.
- Gaussians are really easy to represent: once you know the mean and covariance, you're done.

The Kalman Filter in 1D

Dynamic model notation: we track a predicted mean (before the current measurement arrives) and a corrected mean (after the measurement is folded in).

Notation and A Few Identities

Prediction

Prediction for the 1D Kalman filter

The new state is obtained by multiplying the old state by a known constant and adding zero-mean noise. Therefore:
- The predicted mean for the new state is the constant times the mean for the old state.
- The predicted variance is the sum of the constant squared times the old state variance, and the noise variance.
This is because the old state is a normal random variable: multiplying a normal rv by a constant multiplies its mean by the constant and its variance by the square of the constant; adding zero-mean noise adds zero to the mean; and adding rv's adds their variances.
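The two rules above can be written directly as a sketch (the function and argument names are hypothetical):

```python
def predict_1d(mean_post, var_post, d, var_noise):
    """1D Kalman prediction: new state = d * old state + zero-mean noise."""
    mean_pred = d * mean_post                # constant times old mean
    var_pred = d * d * var_post + var_noise  # d^2 * old variance + noise variance
    return mean_pred, var_pred

print(predict_1d(2.0, 1.0, 3.0, 0.5))  # (6.0, 9.5)
```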

Correction

Correction for the 1D Kalman filter

Pattern match to the identities given in the book (basically, guess the integrals) to get the corrected mean and variance. Notice: if the measurement noise is small, we rely mainly on the measurement; if it is large, mainly on the prediction.
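A sketch of the corrected mean and variance for a scalar measurement y drawn from N(m*x, var_meas); the parameterization follows the 1D model above, and the function name is hypothetical:

```python
def correct_1d(mean_pred, var_pred, y, m, var_meas):
    """1D Kalman correction for a measurement y ~ N(m * x, var_meas)."""
    denom = var_meas + m * m * var_pred
    mean_post = (mean_pred * var_meas + m * y * var_pred) / denom
    var_post = (var_meas * var_pred) / denom
    return mean_post, var_post

# Small measurement noise: the posterior hugs the measurement.
print(correct_1d(0.0, 1.0, 5.0, 1.0, 1e-6))
# Large measurement noise: the posterior hugs the prediction.
print(correct_1d(0.0, 1.0, 5.0, 1.0, 1e6))
```

This makes the slide's observation concrete: the corrected mean is a precision-weighted average of the prediction and the measurement.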

In higher dimensions, the derivation follows the same lines, but isn't as easy. The expressions are given on the slide.

This is a figure from the book. The notation is a bit involved, but logical: we plot the state as open circles, measurements as x's, predicted means as *'s with three-standard-deviation bars, and corrected means as +'s with three-standard-deviation bars.

Smoothing

Idea: we don't have the best estimate of the state, because we haven't used the future measurements. So:
- Run two filters, one moving forward in time, the other backward.
- Now combine the state estimates.
The crucial point is that we can obtain a smoothed estimate by viewing the backward filter's prediction as yet another measurement for the forward filter, so we've already done the equations.
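The combination step amounts to multiplying two Gaussians, i.e. a precision-weighted average of the forward and backward estimates (a sketch; the function name is hypothetical):

```python
def combine(mean_f, var_f, mean_b, var_b):
    """Fuse forward and backward Gaussian state estimates."""
    var_s = 1.0 / (1.0 / var_f + 1.0 / var_b)           # precisions add
    mean_s = var_s * (mean_f / var_f + mean_b / var_b)  # precision-weighted mean
    return mean_s, var_s

print(combine(1.0, 2.0, 3.0, 2.0))  # (2.0, 1.0): equal variances -> simple average
```

Note that the smoothed variance is always smaller than either input variance, which is why the smoothed error bars in the figures below are so tight.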

Backward Filter

Forward-Backward Algorithm

Forward filter, using the notation as above; this is the top left of figure 17.5. The error bars are 1 standard deviation, not 3 (sorry!).

Backward filter; this is the top right of figure 17.5. The error bars are 1 standard deviation, not 3 (sorry!).

Smoothed estimate; this is the bottom of figure 17.5. Notice how good the state estimates are, despite the very lively noise. The error bars are 1 standard deviation, not 3 (sorry!).

Data Association

Nearest neighbors:
- Choose the measurement with the highest probability given the predicted state.
- Popular, but can lead to catastrophe.
Probabilistic data association:
- Combine measurements, weighting by probability given the predicted state.
- Gate using the predicted state.
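A gated nearest-neighbor rule in 1D might be sketched as follows (the helper, its gate of 3 standard deviations, and the scalar setting are all illustrative assumptions):

```python
import math

def nearest_neighbor(pred_mean, pred_var, measurements, gate=3.0):
    """Return the measurement closest to the predicted state, gated at
    `gate` standard deviations; None if nothing falls inside the gate."""
    best, best_d = None, gate
    for y in measurements:
        d = abs(y - pred_mean) / math.sqrt(pred_var)  # normalized distance
        if d < best_d:
            best, best_d = y, d
    return best

print(nearest_neighbor(0.0, 1.0, [2.5, -0.4, 10.0]))  # -0.4 (10.0 is gated out)
```

For a Gaussian predicted state, the smallest normalized distance is exactly the highest-probability measurement, which is the rule the slide describes.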

Applications

- Vehicle tracking
- Surveillance
- Human-computer interaction