© 2003 by Davi Geiger, Computer Vision, November 2003

L1.1 Tracking

We are given a contour with coordinates $\Gamma = \{x_1, x_2, \ldots, x_N\}$ at the initial frame, where the image is $I$. We are interested in tracking this contour through several image frames, say through $T$ image frames given by $I_1, I_2, \ldots, I_T$. We will denote each of these contours by $\Gamma_t$, and so $\Gamma = \Gamma_{t=1}$. We will track each of the coordinates of the initial contour, i.e., we will focus on the tracking of the $N$ initial coordinates. Thus, at any time $t$ the new contour is characterized by $\Gamma_t = \{x_1^t, x_2^t, \ldots, x_N^t\}$, and these coordinates need not be connected. To create a connected contour at each frame, one may fit a spline or use another form of linking of this sequence of $N$ coordinates.

We can make measurements $z_t$ at each image frame and accumulate them over time, denoting the accumulated measurements by $Z_t = \{z_1, z_2, \ldots, z_t\}$. In a Bayesian framework we attempt to obtain the probability
$$P(\Gamma_t \mid Z_t), \qquad (1)$$
that is, we ask: "In which state (coordinates) will the contour be at time $t$, given all the measurements $Z_t$?"
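A minimal sketch, not from the original slides, of how such a tracked contour might be represented in code: the $N$ contour points over $T$ frames stored as an array, with a simple linking step joining consecutive points. The NumPy representation, the array shapes, and the circular initial contour are all assumptions made for illustration.

```python
import numpy as np

# Hypothetical illustration: a tracked contour as N point coordinates over T frames.
T, N = 10, 50                          # number of frames and contour points (assumed)
contours = np.zeros((T, N, 2))         # contours[t, i] = (x, y) of point i at frame t

# Initial contour Gamma_1: a circle of radius 30 around (100, 100), for example.
angles = np.linspace(0.0, 2.0 * np.pi, N, endpoint=False)
contours[0] = np.stack([100.0 + 30.0 * np.cos(angles),
                        100.0 + 30.0 * np.sin(angles)], axis=1)

# The tracked points need not be connected; a simple linking step joins point i
# to point i+1 (wrapping around), while a spline fit would give a smooth curve.
segments = np.stack([contours[0], np.roll(contours[0], -1, axis=0)], axis=1)
```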
L1.2 Propagating Probability Density

An interesting expansion of (1) is given by
$$P(\Gamma_t \mid Z_t) \;\propto\; \underbrace{P(z_t \mid \Gamma_t)}_{\text{correct, or make measurements}} \; \underbrace{\int P(\Gamma_t \mid \Gamma_{t-1})\, P(\Gamma_{t-1} \mid Z_{t-1})\, d\Gamma_{t-1}}_{\text{predict}}. \qquad (2)$$

Assuming that these probabilities can be obtained, namely the measurement model $P(z_t \mid \Gamma_t)$ (correct) and the dynamics $P(\Gamma_t \mid \Gamma_{t-1})$ (predict), we then have a recursive method to estimate $P(\Gamma_t \mid Z_t)$ from $P(\Gamma_{t-1} \mid Z_{t-1})$.
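As a hedged illustration of recursion (2), the sketch below runs predict/correct steps of a Bayes filter over a discretized 1D state space; the grid, the Gaussian dynamics, and the measurement model are assumptions made for the example, not part of the original slides.

```python
import numpy as np

def gaussian(x, mean, sigma):
    return np.exp(-0.5 * ((x - mean) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

# Discretized 1D state space (assumed for illustration).
states = np.linspace(0.0, 100.0, 201)

def bayes_filter_step(prior, z, sigma_dyn=2.0, sigma_meas=3.0):
    """One step of equation (2): predict with P(x_t | x_{t-1}), correct with P(z_t | x_t)."""
    # Predict: integrate the dynamics against the previous posterior.
    transition = gaussian(states[:, None], states[None, :], sigma_dyn)  # P(x_t | x_{t-1})
    predicted = transition @ prior
    # Correct: multiply by the measurement likelihood and renormalize.
    likelihood = gaussian(z, states, sigma_meas)                        # P(z_t | x_t)
    posterior = likelihood * predicted
    return posterior / posterior.sum()

# Start from a broad prior and fold in one measurement z_1 = 42.
posterior = bayes_filter_step(np.full(states.shape, 1.0 / states.size), z=42.0)
```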
L1.3 Assumptions

- Independence assumption on data generation, i.e., the current state completely defines the probability of the measurements/data (often the previous data is also neglected):
  $$P(z_t \mid \Gamma_t, Z_{t-1}) = P(z_t \mid \Gamma_t).$$
- First-order Markov process, i.e., the current state of the system is completely described by the immediately previous state, and possibly the data (often the data is also neglected):
  $$P(\Gamma_t \mid \Gamma_{t-1}, \ldots, \Gamma_1, Z_{t-1}) = P(\Gamma_t \mid \Gamma_{t-1}).$$
- The remaining factor, $P(z_t \mid Z_{t-1})$, is a normalization constant that can always be computed by normalizing the recurrence equation (2).
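As a brief reconstruction (not verbatim from the slides) of how these assumptions yield recursion (2): applying Bayes' rule to $Z_t = \{Z_{t-1}, z_t\}$ and then marginalizing over the previous state,
$$P(\Gamma_t \mid Z_t) \;\propto\; P(z_t \mid \Gamma_t, Z_{t-1})\, P(\Gamma_t \mid Z_{t-1}) \;=\; P(z_t \mid \Gamma_t)\, P(\Gamma_t \mid Z_{t-1}),$$
$$P(\Gamma_t \mid Z_{t-1}) \;=\; \int P(\Gamma_t \mid \Gamma_{t-1}, Z_{t-1})\, P(\Gamma_{t-1} \mid Z_{t-1})\, d\Gamma_{t-1} \;=\; \int P(\Gamma_t \mid \Gamma_{t-1})\, P(\Gamma_{t-1} \mid Z_{t-1})\, d\Gamma_{t-1},$$
which together give equation (2).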
L1.4 Linear Dynamics Examples

- Random walk
- Points moving with constant velocity
- Points moving with constant acceleration
- Periodic motion
L1.5 Random Walk of a Point

Drift: the new position is the previous one plus noise,
$$x_t = x_{t-1} + d_t, \qquad d_t \sim N(0, \sigma_d^2),$$
so that $P(x_t \mid x_{t-1}) = N(x_t;\, x_{t-1}, \sigma_d^2)$.
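A small simulation sketch of this drift model; the step count and noise level are assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
T, sigma_d = 100, 1.0            # number of steps and drift noise (assumed)

# Random walk: x_t = x_{t-1} + noise, starting from x_0 = 0.
x = np.zeros(T)
for t in range(1, T):
    x[t] = x[t - 1] + rng.normal(0.0, sigma_d)
```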
L1.6 Point moving with constant velocity
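A hedged sketch of the standard constant-velocity state-space model this slide refers to (state = position and velocity); the time step, noise level, and initial state are assumptions.

```python
import numpy as np

dt, sigma = 1.0, 0.1                      # time step and process noise (assumed)
D = np.array([[1.0, dt],                  # x_t = x_{t-1} + dt * v_{t-1} + noise
              [0.0, 1.0]])                # v_t = v_{t-1}               + noise

rng = np.random.default_rng(1)
state = np.array([0.0, 2.0])              # initial position 0, velocity 2 (assumed)
trajectory = [state.copy()]
for _ in range(50):
    state = D @ state + rng.normal(0.0, sigma, size=2)
    trajectory.append(state.copy())
```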
L1.7 Point moving with constant acceleration
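A hedged sketch of the usual constant-acceleration model (state = position, velocity, acceleration); again, the time step and initial state are assumed for illustration.

```python
import numpy as np

dt = 1.0                                   # time step (assumed)
# State (position, velocity, acceleration); constant acceleration plus noise.
D = np.array([[1.0, dt,  0.5 * dt ** 2],
              [0.0, 1.0, dt           ],
              [0.0, 0.0, 1.0          ]])

state = np.array([0.0, 1.0, 0.5])          # x_0, v_0, a_0 (assumed)
for _ in range(10):
    state = D @ state                       # noiseless propagation for illustration
```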
L1.8 Point moving with periodic motion
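One common way to realize periodic motion as linear dynamics is a discretized harmonic oscillator; this particular formulation is an assumption, since the original slide equations are not in the transcript.

```python
import numpy as np

dt, omega = 0.1, 2.0 * np.pi                # time step and angular frequency (assumed)
# State (position, velocity); exact discretization of x'' = -omega^2 x.
D = np.array([[np.cos(omega * dt),          np.sin(omega * dt) / omega],
              [-omega * np.sin(omega * dt), np.cos(omega * dt)        ]])

state = np.array([1.0, 0.0])                # start at amplitude 1, zero velocity
positions = []
for _ in range(100):
    positions.append(state[0])
    state = D @ state
```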
L1.9 Tracking one point (dynamic programming like)

For a single point taking values on a discrete set of candidate positions, equation (2) is written as a sum over the previous positions,
$$P(x_t = i \mid Z_t) \;\propto\; P(z_t \mid x_t = i) \sum_{j} P(x_t = i \mid x_{t-1} = j)\, P(x_{t-1} = j \mid Z_{t-1}),$$
a recurrence over discrete states that can be evaluated frame by frame, much like dynamic programming.
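A sketch of this discrete recurrence in the spirit of a dynamic-programming forward pass; the candidate positions, transition model, likelihood, and measurement values are all assumptions for illustration.

```python
import numpy as np

def gaussian(x, mean, sigma):
    return np.exp(-0.5 * ((x - mean) / sigma) ** 2)

positions = np.arange(0, 100)                      # discrete candidate positions (assumed)
belief = np.full(positions.size, 1.0 / positions.size)

# Transition table P(x_t = i | x_{t-1} = j): stay near the previous position.
transition = gaussian(positions[:, None], positions[None, :], sigma=2.0)
transition /= transition.sum(axis=0, keepdims=True)

measurements = [10.0, 12.5, 14.0, 17.0]            # example observations (assumed)
for z in measurements:
    predicted = transition @ belief                 # sum over previous states j
    belief = gaussian(z, positions, sigma=3.0) * predicted
    belief /= belief.sum()

best = positions[np.argmax(belief)]                 # MAP estimate of the point's position
```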
L1.10 Kalman Filter (Special Case of a Linear System)

The Kalman filter simplifies the calculation of the probabilities by exploiting the good properties of Gaussian distributions (applied to $P(x_t \mid Z_t)$).

Reminder: from equation (2) we have
$$P(x_t \mid Z_t) \;\propto\; P(z_t \mid x_t) \int P(x_t \mid x_{t-1})\, P(x_{t-1} \mid Z_{t-1})\, dx_{t-1}.$$

1D case: for a one-dimensional point and Gaussian distributions, Kalman observed the recurrence: if $P(x_{t-1} \mid Z_{t-1})$ is a Gaussian $N(\mu_{t-1}, \sigma_{t-1}^2)$, then $P(x_t \mid Z_t)$ is again a Gaussian $N(\mu_t, \sigma_t^2)$, where the new mean and variance are computed in closed form from $\mu_{t-1}$, $\sigma_{t-1}^2$, and the new measurement $z_t$.
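A sketch of the resulting 1D recurrence, assuming linear-Gaussian dynamics $x_t = d\,x_{t-1} + \text{noise}(\sigma_d)$ and measurement $z_t = m\,x_t + \text{noise}(\sigma_m)$; these symbols are an assumed notation for illustration, not necessarily the slides' own.

```python
def kalman_1d_step(mu_prev, var_prev, z, d=1.0, sigma_d=1.0, m=1.0, sigma_m=1.0):
    """One predict/correct step of the 1D Kalman filter."""
    # Predict: propagate the Gaussian through the linear dynamics x_t = d * x_{t-1} + noise.
    mu_pred = d * mu_prev
    var_pred = d * d * var_prev + sigma_d ** 2
    # Correct: fold in the measurement z_t = m * x_t + noise.
    gain = var_pred * m / (m * m * var_pred + sigma_m ** 2)
    mu = mu_pred + gain * (z - m * mu_pred)
    var = (1.0 - gain * m) * var_pred
    return mu, var
```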
L1.11 Kalman Filter and Gaussian Properties

Proof: assume that the previous posterior $P(x_{t-1} \mid Z_{t-1})$, the dynamics $P(x_t \mid x_{t-1})$, and the measurement model $P(z_t \mid x_t)$ are all Gaussian; then the new posterior $P(x_t \mid Z_t)$ is again Gaussian.
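The two Gaussian properties such a proof typically rests on are sketched below; the notation ($d$, $\sigma_d$) follows the assumed 1D model above rather than the original slides. The product of two Gaussians in $x$ is, up to normalization, again a Gaussian,
$$N(x;\, \mu_1, \sigma_1^2)\, N(x;\, \mu_2, \sigma_2^2) \;\propto\; N\!\left(x;\; \frac{\mu_1 \sigma_2^2 + \mu_2 \sigma_1^2}{\sigma_1^2 + \sigma_2^2},\; \frac{\sigma_1^2 \sigma_2^2}{\sigma_1^2 + \sigma_2^2}\right),$$
and propagating a Gaussian through linear dynamics with Gaussian noise keeps it Gaussian,
$$\int N(x_t;\; d\,x_{t-1}, \sigma_d^2)\, N(x_{t-1};\; \mu, \sigma^2)\, dx_{t-1} \;=\; N(x_t;\; d\,\mu,\; \sigma_d^2 + d^2 \sigma^2).$$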
L1.12 Kalman Filter … (continuing the proof)
L1.13 Using Kalman Filter
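A self-contained usage sketch: filtering a noisy 1D track of a point with steady drift, using the same predict/correct update as above with $d = m = 1$. All numbers here are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulate a true 1D track and noisy measurements of it (illustrative values).
true_positions = np.cumsum(np.full(30, 1.5))        # steady drift of 1.5 per frame
measurements = true_positions + rng.normal(0.0, 2.0, size=30)

# Model: x_t = x_{t-1} + drift noise, z_t = x_t + measurement noise.
sigma_d, sigma_m = 1.5, 2.0
mu, var = measurements[0], sigma_m ** 2             # initialize from the first frame
estimates = [mu]
for z in measurements[1:]:
    # Predict.
    var += sigma_d ** 2
    # Correct.
    gain = var / (var + sigma_m ** 2)
    mu = mu + gain * (z - mu)
    var = (1.0 - gain) * var
    estimates.append(mu)
```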
L1.14 Kalman Filter (Generalization to N-D)
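A hedged sketch of the N-dimensional generalization, assuming dynamics $x_t = D\,x_{t-1} + \text{noise}(\Sigma_d)$ and measurement $z_t = M\,x_t + \text{noise}(\Sigma_m)$; as before, the symbol names are an assumed notation. The example at the bottom uses a 2D constant-velocity state with position-only measurements.

```python
import numpy as np

def kalman_step(mu_prev, P_prev, z, D, Sigma_d, M, Sigma_m):
    """One predict/correct step of the N-dimensional Kalman filter."""
    # Predict: propagate mean and covariance through the linear dynamics.
    mu_pred = D @ mu_prev
    P_pred = D @ P_prev @ D.T + Sigma_d
    # Correct: Kalman gain, then update with the measurement residual.
    S = M @ P_pred @ M.T + Sigma_m                 # innovation covariance
    K = P_pred @ M.T @ np.linalg.inv(S)            # Kalman gain
    mu = mu_pred + K @ (z - M @ mu_pred)
    P = (np.eye(P_pred.shape[0]) - K @ M) @ P_pred
    return mu, P

# Example: constant-velocity model in 2D (state = x, y, vx, vy), position measured.
dt = 1.0
D = np.array([[1, 0, dt, 0], [0, 1, 0, dt], [0, 0, 1, 0], [0, 0, 0, 1]], dtype=float)
M = np.array([[1, 0, 0, 0], [0, 1, 0, 0]], dtype=float)
mu, P = np.zeros(4), np.eye(4)
mu, P = kalman_step(mu, P, z=np.array([1.0, 0.5]), D=D,
                    Sigma_d=0.01 * np.eye(4), M=M, Sigma_m=0.1 * np.eye(2))
```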