Reducing Drift in Parametric Motion Tracking

1 Reducing Drift in Parametric Motion Tracking
Ali Rahimi, Louis-Philippe Morency, Trevor Darrell (Vision Interface Group, MIT AI Lab), with Rolf Hauer-Schmidt

2 Outline
Parametric motion tracking with vision: maximum likelihood, drift; a proposed drift-free solution that is hard to implement.
Bayesian tracking: posterior pose estimate; target dynamics, sensor model and dynamics.
Bayesian motion tracking without drift: using multiple base frames.

3 Pose Change Estimation With Vision
Pixels in one image can be mapped to the following image. The parameters of the mapping describe the motion of the object (e.g., translation, affine, rigid).
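As an illustrative sketch (NumPy; the function names are made up for this example, not taken from the talk), here is how translation and affine parameterizations map pixel coordinates from one frame to the next:

import numpy as np

def warp_translation(pts, delta):
    # Map Nx2 pixel coordinates by a 2-parameter translation (dx, dy).
    return pts + np.asarray(delta, dtype=float).reshape(1, 2)

def warp_affine(pts, params):
    # Map Nx2 pixel coordinates by a 6-parameter affine transform:
    # params = [a11, a12, a21, a22, tx, ty].
    A = np.asarray(params[:4], dtype=float).reshape(2, 2)
    t = np.asarray(params[4:], dtype=float).reshape(1, 2)
    return pts @ A.T + t

# Example: apply a small translation to a grid of pixel coordinates.
xs, ys = np.meshgrid(np.arange(4), np.arange(4))
pts = np.stack([xs.ravel(), ys.ravel()], axis=1).astype(float)
moved = warp_translation(pts, [1.5, -0.5])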

4 Pose Change Estimation With Vision
The current image is modeled as a warp of the previous image plus noise: y_t(x) = y_{t-1}(f(x; δ_t)) + ω(x), where y_{t-1} is the previous image, y_t the current image, f the pixel displacement function, δ_t the motion parameters for the pair, and ω white Gaussian noise (WGN) to account for other appearance changes.

5 Pose Change Estimation as ML
This can be rewritten as a likelihood over the parameters: p(y_t | y_{t-1}, δ_t) is a Gaussian whose mean is the warped previous image and whose covariance comes from the WGN term. Recovering the motion parameters using Maximum Likelihood means maximizing this likelihood over δ_t.

6 Pose Change Estimation as ML
Recovering the motion parameters using Maximum Likelihood is actually just least squares: δ_t* = argmin over δ of Σ_x [y_t(x) - y_{t-1}(f(x; δ))]^2.
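For a pure-translation warp, this least-squares problem can be solved from image gradients via the normal equations. A minimal sketch (NumPy; it assumes small motion and a textured image, and it is not the authors' implementation):

import numpy as np

def estimate_translation(prev, curr):
    # One least-squares (Gauss-Newton) step for a 2D translation between
    # two grayscale images, using the linearized brightness-constancy model.
    gy, gx = np.gradient(prev)                 # spatial gradients (rows, cols)
    gt = (curr - prev).ravel()                 # temporal difference
    J = np.stack([gx.ravel(), gy.ravel()], 1)  # sensitivity of intensity to (dx, dy)
    # Normal equations: (J^T J) delta = -J^T gt  (ML under white Gaussian noise)
    return np.linalg.solve(J.T @ J, -J.T @ gt)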

7 Old School Tracking as ML
Q: How do you use this for tracking? A: Accumulate pose changes to get the relative pose with respect to the first frame.

8 Old School Tracking Drifts!
Accumulating noisy pose changes incurs drift: the variance of the accumulated pose estimate grows linearly with the number of frames.
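A quick simulation (NumPy, purely illustrative) of this effect: summing independent Gaussian pose-change errors yields an accumulated pose error whose variance grows linearly with the number of frames.

import numpy as np

rng = np.random.default_rng(0)
n_frames, n_trials, noise_std = 200, 5000, 0.1

# Each frame-to-frame pose change is measured with independent Gaussian error.
errors = rng.normal(0.0, noise_std, size=(n_trials, n_frames))
accumulated = np.cumsum(errors, axis=1)     # pose error relative to the first frame

var = accumulated.var(axis=0)
print(var[49], var[99], var[199])           # roughly 50, 100, 200 times noise_std**2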

9 Drift Example: 2D motion

10 Reducing Drift: 1st Attempt
One solution: track with respect to the first frame. Problem: this limits the range of motion to be very small.

11 How to Reduce Drift Idea: Use some past frame as reference!

12 How to Reduce Drift Improvement: use multiple base frames.

13

14 Designing a Bayesian Tracker
Model what you know about the target. Target dynamics: p(s_{t+1} | s_t).
Model the sensor. Noise model: p(y_t | s_t), or sensor dynamics: p(y_t | s_t, y_{t-1}).
Use a graphical model to describe the independence structure of the system.
Use Bayes' rule to compute p(s_t | y_t, y_{t-1}, ...).

15 Modeling Target Dynamics
(Figure: example state trajectories over time for a stationary target, Markovian dynamics, and unpredictable motion.)

16 Linear Gaussian Target Dynamics
The next state depends only on the previous state: p(s_{t+1} | s_t). Written in linear form: s_{t+1} = A s_t + ω_t, with Gaussian noise ω_t. (The slide shows an example trajectory over time.)
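A minimal sketch of one such model, assuming (for illustration only, since the slide's example is a figure) a constant-velocity state [position, velocity] with Gaussian process noise:

import numpy as np

rng = np.random.default_rng(1)
dt = 1.0
A = np.array([[1.0, dt],
              [0.0, 1.0]])          # state s = [position, velocity]
Q = np.diag([1e-4, 1e-3])           # process-noise covariance

s = np.array([0.0, 0.1])
trajectory = [s.copy()]
for _ in range(100):
    s = A @ s + rng.multivariate_normal(np.zeros(2), Q)
    trajectory.append(s)
trajectory = np.array(trajectory)   # one sample path of s_{t+1} = A s_t + w_t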

17 Modeling Sensor Noise Sequence of iid observations:
Graphical-model structures shown as figures: a sequence of iid observations, a hidden Markov chain, and parameter estimation. The goal is to calculate the posterior over the hidden state given the observations.

18 Linear State Space Models
Assume a linear Gaussian sensor, y_t = C s_t + v_t, and target dynamics, s_{t+1} = A s_t + w_t. The sensor could observe location or velocity. The optimal posterior over s is obtained by the Kalman filter. Note that there is still drift if C_v (a velocity-only observation) is used, since velocity measurements constrain only pose changes, not absolute pose.
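A minimal Kalman filter step for this linear state-space model (NumPy sketch; A, Q, C, R are placeholders to be chosen by the modeler, not values from the talk):

import numpy as np

def kalman_step(mu, P, y, A, Q, C, R):
    # One predict/update cycle for s_{t+1} = A s_t + w,  y_t = C s_t + v,
    # with w ~ N(0, Q) and v ~ N(0, R).
    mu_pred = A @ mu                          # predict the state forward
    P_pred = A @ P @ A.T + Q
    S = C @ P_pred @ C.T + R                  # innovation covariance
    K = P_pred @ C.T @ np.linalg.inv(S)       # Kalman gain
    mu_new = mu_pred + K @ (y - C @ mu_pred)  # correct with the measurement
    P_new = (np.eye(len(mu)) - K @ C) @ P_pred
    return mu_new, P_new

With a velocity-only observation matrix C, measurements never constrain absolute position, so the position variance grows without bound; this is the drift noted above.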

19 Examples
A light, fast target; a heavy object; a compass; a thermometer in a dynamic environment; a range finder taking 4 measurements; a thermometer in a stationary environment. These are contrived and unnecessary with the LSS (linear state-space) model.

20 Sensor Model for the Differential Vision Tracker
Model the target dynamics: the state s_t includes the pose of the target; pick any useful, even uninformative, p(s_{t+1} | s_t). The sensor model is where the action is: the vision sensor takes two frames, y_t and y_{t-1}, and produces a pose-change estimate. Noise model: p(s_t | y_t, y_{t-1}).

21 The Sensor Model True poses generate images, which are observed by a sensor, which recovers pose changes:

22 The Sensor Model
We can model the image likelihood p(y_t | y_{t-1}, δ_t), which is a Gaussian with a known mean and covariance. But what we really need is the posterior over the pose change given the two images, p(δ_t | y_t, y_{t-1}).

23 Approximate Measurement Model
We know the likelihood; what we yearn for is the posterior. Assume appearance does not help you determine the pose CHANGE (i.e., a flat prior on the change). Then the likelihood has the same form as the posterior. But it's unnormalized! It's useless as a density.

24 Approximate Measurement Model
Approximate the posterior with a Gaussian instead:

25 Approximate Measurement Model
Approximate the posterior with a Gaussian: let the mean be the mode of the distribution (which is found by running the tracker), and have the Gaussian fit the curvature of the log posterior at that mode.
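A sketch of that fit, assuming the mode has already been found by running the tracker; the finite-difference Hessian here is only illustrative, not the authors' closed-form expression:

import numpy as np

def laplace_covariance(log_post, mode, eps=1e-4):
    # Covariance of the Gaussian approximation: inverse of the negative
    # Hessian (curvature) of the log posterior, evaluated at the mode.
    d = len(mode)
    H = np.zeros((d, d))
    for i in range(d):
        for j in range(d):
            ei = np.eye(d)[i] * eps
            ej = np.eye(d)[j] * eps
            H[i, j] = (log_post(mode + ei + ej) - log_post(mode + ei - ej)
                       - log_post(mode - ei + ej) + log_post(mode - ei - ej)) / (4 * eps ** 2)
    return np.linalg.inv(-H)

# Sanity check: for a Gaussian log posterior the approximation is exact.
Sigma = np.array([[2.0, 0.3], [0.3, 1.0]])
print(laplace_covariance(lambda x: -0.5 * x @ np.linalg.inv(Sigma) @ x, np.zeros(2)))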

26 Approximate Measurement Model
Gaussian approximation: scale the certainty by the amount of slop used to compensate for model error. The contribution of each pixel involves the sensitivity of the warping function to the motion parameters and the featurefulness of the pixel.
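A hedged sketch of how those per-pixel contributions could be accumulated into the inverse covariance (information) of the pose-change estimate; a pure-translation warp is assumed here purely for simplicity, so the warp sensitivity is the identity and featurefulness enters through the image-gradient outer product:

import numpy as np

def translation_information(prev, sigma=1.0):
    # Sum each pixel's contribution to the inverse covariance of a
    # translation estimate: warp sensitivity (identity here) times the
    # gradient outer product (featurefulness), scaled by the noise variance.
    gy, gx = np.gradient(prev)
    info = np.zeros((2, 2))
    for dx, dy in zip(gx.ravel(), gy.ravel()):
        g = np.array([dx, dy])          # image gradient at this pixel
        S = np.eye(2)                   # d(warp)/d(translation): identity
        info += (S.T @ np.outer(g, g) @ S) / sigma ** 2
    return info                         # pose-change covariance is roughly inv(info)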

27 Reducing Drift: Idea Improvement: use multiple base frames.

28 The Full Model
Having obtained a measurement model and assumed dynamics, marginalize out the images to obtain a model that relates the true poses directly to the pose-change estimates.

29 The Full Model Include redundant pose estimates:

30 Joint Truth and Measurement
Assume the measurements are Gaussian-corrupted truth. The joint distribution of the model is then Gaussian, and the posterior can be obtained by solving a sparse linear system.
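A small sketch (NumPy/SciPy, with scalar poses and made-up numbers) of that sparse solve: each Gaussian pose-change measurement, including the redundant links back to earlier base frames, adds a block to the information matrix, and the posterior mean comes from one sparse linear solve.

import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

T = 6
# Each measurement says: pose[j] - pose[i] = delta, with variance var.
measurements = [(0, 1, 1.00, 0.01), (1, 2, 1.10, 0.01), (2, 3, 0.90, 0.01),
                (3, 4, 1.00, 0.01), (4, 5, 1.00, 0.01),
                (0, 3, 3.05, 0.02), (0, 5, 5.10, 0.05)]  # redundant base-frame links

info = sp.lil_matrix((T, T))
rhs = np.zeros(T)
info[0, 0] += 1e6                          # anchor the first pose at zero
for i, j, delta, var in measurements:
    w = 1.0 / var
    info[i, i] += w
    info[j, j] += w
    info[i, j] -= w
    info[j, i] -= w
    rhs[j] += w * delta
    rhs[i] -= w * delta

poses = spla.spsolve(info.tocsc(), rhs)    # posterior mean over all poses
print(poses)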

31 Results 2D Tracker. 6 Degree of Freedom Tracker: “heads”. Egomotion.

32 2D Tracker: Drifting

33 2D Tracker: Drift-reduced

34 6 Degree of Freedom Tracker

35 Egomotion

36 Conclusion An intuitive way to reduce drift by using multiple base frames. An error model for motion tracking. Combining the error model with this intuition to compute posterior poses given pose-change estimates.

37 Future Work Make the framework online. Allow a wider range of motion models.

