
1 Computer Vision: Non-linear tracking. Marc Pollefeys, COMP 256. Some slides and illustrations from D. Forsyth, M. Isard, T. Darrell …

2 Tentative class schedule:
Jan 16/18 - Introduction
Jan 23/25 - Cameras / Radiometry
Jan 30/Feb 1 - Sources & Shadows / Color
Feb 6/8 - Linear filters & edges / Texture
Feb 13/15 - Multi-View Geometry / Stereo
Feb 20/22 - Optical flow / Project proposals
Feb 27/Mar 1 - Affine SfM / Projective SfM
Mar 6/8 - Camera Calibration / Segmentation
Mar 13/15 - Spring break
Mar 20/22 - Fitting / Prob. Segmentation
Mar 27/29 - Silhouettes and Photoconsistency / Linear tracking
Apr 3/5 - Project Update / Non-linear Tracking
Apr 10/12 - Object Recognition
Apr 17/19 - Range data
Apr 24/26 - Final project

3 Final project presentation. No further assignments; focus on the project. Final presentation: presentation and/or demo (your choice, but let me know). Short paper due April 22 by 23:59 (preferably LaTeX, IEEE proc. style). Final presentation/demo: April 24 and 26.

4 Bayes Filters. We want to estimate the system state from noisy observations:
System state dynamics: $p(x_t \mid x_{t-1})$
Observation dynamics: $p(z_t \mid x_t)$
We are interested in the belief, or posterior density: $Bel(x_t) = p(x_t \mid z_1, \dots, z_t)$

5 From the above, we can construct the two steps of the Bayes filter. Recall the "law of total probability" and "Bayes' rule":
Predict: $\overline{Bel}(x_t) = \int p(x_t \mid x_{t-1})\, Bel(x_{t-1})\, dx_{t-1}$
Update: $Bel(x_t) = \eta\, p(z_t \mid x_t)\, \overline{Bel}(x_t)$, with $\eta$ a normalizing constant

6 The predict and update steps rely on the Markov assumptions:
–the state summarizes the past: $p(x_t \mid x_{1:t-1}, z_{1:t-1}) = p(x_t \mid x_{t-1})$
–observations depend only on the current state: $p(z_t \mid x_{1:t}, z_{1:t-1}) = p(z_t \mid x_t)$

7 Bayes Filter. How to use it? What else do we need to know? Start from an initial belief $Bel(x_0)$, and supply a motion model $p(x_t \mid x_{t-1})$ and a perceptual model $p(z_t \mid x_t)$.
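A minimal sketch of one predict/update cycle for a discrete state space (all names are hypothetical; the motion and perceptual models are supplied by the user):

import numpy as np

def bayes_filter_step(belief, motion_model, z, perceptual_model):
    """One predict/update cycle of a discrete Bayes filter.

    belief           : belief[j] = Bel(x_{t-1} = j), sums to 1
    motion_model     : motion_model[i, j] = p(x_t = i | x_{t-1} = j)
    perceptual_model : perceptual_model(z, i) = p(z | x_t = i)
    """
    # Predict: law of total probability over the previous state
    predicted = motion_model @ belief
    # Update: Bayes' rule with the new observation z
    likelihood = np.array([perceptual_model(z, i) for i in range(len(belief))])
    posterior = likelihood * predicted
    return posterior / posterior.sum()  # eta: normalize to sum to 1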

8 Example 1. Step 0: initialization. Step 1: updating.

9 Example 1 (continued). Step 2: predicting. Step 3: updating. Step 4: predicting.

10 Several types of Bayes filters. They differ in how they represent probability densities:
–Kalman filter
–Multi-hypothesis filter
–Grid-based approach
–Topological approach
–Particle filter

11 Kalman Filter. Recall the general problem. Assumptions of the Kalman filter: linear state and observation dynamics with additive Gaussian noise. The belief of the Kalman filter is therefore a unimodal Gaussian. Advantage: computational efficiency. Disadvantage: the assumptions are too restrictive.
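A sketch of one Kalman cycle under these linear-Gaussian assumptions (A, C, Q, R are the assumed dynamics and observation parameters; all names are illustrative):

import numpy as np

def kalman_step(mu, Sigma, z, A, C, Q, R):
    """One predict/update cycle of a Kalman filter.

    Model: x_t = A x_{t-1} + w,  w ~ N(0, Q)
           z_t = C x_t + v,      v ~ N(0, R)
    The belief stays a single Gaussian N(mu, Sigma).
    """
    # Predict
    mu_bar = A @ mu
    Sigma_bar = A @ Sigma @ A.T + Q
    # Update via the Kalman gain
    S = C @ Sigma_bar @ C.T + R            # innovation covariance
    K = Sigma_bar @ C.T @ np.linalg.inv(S)
    mu_new = mu_bar + K @ (z - C @ mu_bar)
    Sigma_new = (np.eye(len(mu)) - K @ C) @ Sigma_bar
    return mu_new, Sigma_new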


14 Multi-hypothesis Tracking. The belief is a mixture of Gaussians. Each Gaussian hypothesis is tracked with its own Kalman filter. Weights are decided on the basis of how well each hypothesis predicts the sensor measurements (a sketch of this reweighting follows below).
Advantage:
–can represent multimodal beliefs
Disadvantages:
–computationally expensive
–difficult to decide on hypotheses
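A sketch of the reweighting step, assuming each hypothesis i has already run its Kalman predict/update and exposes its innovation nu_i = z - C mu_bar_i and innovation covariance S_i (names hypothetical):

import numpy as np

def update_hypothesis_weights(weights, innovations, innovation_covs):
    """Reweight Gaussian hypotheses by their measurement likelihood."""
    new_w = []
    for w, nu, S in zip(weights, innovations, innovation_covs):
        d = len(nu)
        # Gaussian likelihood of the innovation under N(0, S)
        lik = np.exp(-0.5 * nu @ np.linalg.solve(S, nu))
        lik /= np.sqrt((2 * np.pi) ** d * np.linalg.det(S))
        new_w.append(w * lik)
    new_w = np.array(new_w)
    return new_w / new_w.sum()  # renormalize over hypotheses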

15 Grid-based Approaches. Use a discrete, piecewise-constant representation of the belief: tessellate the environment into small patches, with each patch holding the belief that the object is in it.
Advantage:
–able to represent arbitrary distributions over the discrete state space
Disadvantage:
–the computational and space complexity required to keep the position grid in memory and update it
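A sketch on a 2-D position grid, assuming the motion model is a small displacement kernel whose entries sum to 1 (edges wrap around for brevity; all names hypothetical):

import numpy as np

def grid_filter_step(belief, kernel, likelihood):
    """Histogram filter step.

    belief, likelihood : H x W arrays (belief sums to 1)
    kernel             : small odd-sized array of displacement probabilities
    """
    kh, kw = kernel.shape
    predicted = np.zeros_like(belief)
    # Predict: spread each cell's mass according to the motion kernel
    for di in range(kh):
        for dj in range(kw):
            shifted = np.roll(belief, (di - kh // 2, dj - kw // 2), axis=(0, 1))
            predicted += kernel[di, dj] * shifted
    # Update: multiply by the measurement likelihood, renormalize
    posterior = predicted * likelihood
    return posterior / posterior.sum()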

16 Topological Approaches. A graph represents the state space:
–nodes represent the object's possible locations (e.g. a room)
–edges represent connectivity (e.g. a hallway)
Advantage:
–efficiency, because the state space is small
Disadvantage:
–coarseness of the representation
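A sketch with a hypothetical floor plan; the edge transition probabilities play the role of the motion model:

# Rooms as nodes; edges (hallways) carry transition probabilities.
transitions = {
    "office":  {"office": 0.7, "hallway": 0.3},
    "hallway": {"office": 0.2, "hallway": 0.3, "lab": 0.5},
    "lab":     {"hallway": 0.4, "lab": 0.6},
}

def topo_predict(belief):
    """Spread belief along graph edges (the predict step)."""
    new_belief = {node: 0.0 for node in transitions}
    for node, p in belief.items():
        for nxt, p_move in transitions[node].items():
            new_belief[nxt] += p * p_move
    return new_belief

# The update step is as in any Bayes filter: multiply each node's
# belief by p(z | node) and renormalize.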

17 Particle Filters. Also known as Sequential Monte Carlo methods. The belief is represented by a set of samples, or particles, $\{(s^{(i)}, \pi^{(i)})\}_{i=1}^N$, where the $\pi^{(i)}$ are nonnegative weights called importance factors. The updating procedure is sequential importance sampling with re-sampling.

18 Example 2: Particle Filter. Step 0: initialization; each particle has the same weight. Step 1: updating weights; weights are proportional to p(z|x).

19 Example 2: Particle Filter (continued). Particles are more concentrated in the region where the person is more likely to be. Step 2: predicting; predict the new locations of the particles. Step 3: updating weights; weights are proportional to p(z|x). Step 4: predicting; predict the new locations of the particles.

20 Comparing the particle filter with a Bayes filter with a known distribution. (Figure: the predicting and updating steps of Example 1 and Example 2 shown side by side.)

21 Comments on Particle Filters.
Advantages:
–able to represent arbitrary densities
–converge to the true posterior even for non-Gaussian, nonlinear systems
–efficient, in the sense that particles tend to focus on regions of high probability
Disadvantage:
–worst-case complexity grows exponentially in the number of dimensions

22 Particle Filtering in CV: Initial Particle Set. Particles at t = 0 are drawn from a wide prior because of the large initial uncertainty:
–a Gaussian with large covariance, or
–a uniform distribution
(from MacCormick & Blake, 1998)
The state includes shape & position; the prior is more constrained for shape.
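A sketch of the initialization for a position-only state (the numbers are arbitrary placeholders):

import numpy as np

N = 1000  # number of particles (hypothetical choice)

# Wide Gaussian prior over 2-D image position: large covariance
particles = np.random.multivariate_normal(
    mean=[160.0, 120.0], cov=np.diag([80.0**2, 60.0**2]), size=N)
weights = np.full(N, 1.0 / N)  # equal importance factors at t = 0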

23 Particle Filtering: Sampling. Normalize the N particle weights $\pi^{(1)}, \dots, \pi^{(N)}$ so that they sum to 1. Resample particles by picking randomly and uniformly in the [0, 1] range N times; this is analogous to spinning a roulette wheel with the arc-lengths of the bins equal to the particle weights (courtesy of D. Fox). Resampling adaptively focuses on promising areas of the state space.
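A sketch of roulette-wheel resampling via the cumulative weight distribution (names hypothetical):

import numpy as np

def resample(particles, weights):
    """Pick N particles with probability proportional to their weights."""
    cumulative = np.cumsum(weights / weights.sum())
    picks = np.random.uniform(size=len(particles))  # N spins of the wheel
    idx = np.searchsorted(cumulative, picks)
    idx = np.minimum(idx, len(particles) - 1)       # guard float round-off
    # The resampled set carries uniform weights again
    return particles[idx].copy(), np.full(len(particles), 1.0 / len(particles))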

24 Particle Filtering: Prediction. Update each particle using the generative form of the dynamics: a deterministic component (aka "drift") plus a random component (aka "diffusion"). Drift may be nonlinear (i.e., a different displacement for each particle). Each particle diffuses independently, typically modeled with a Gaussian.
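A sketch of the prediction step; drift is any user-supplied (possibly nonlinear) function of the state:

import numpy as np

def predict(particles, drift, diffusion_std):
    """Apply deterministic drift, then independent Gaussian diffusion."""
    drifted = np.array([drift(s) for s in particles])  # may differ per particle
    noise = np.random.normal(0.0, diffusion_std, size=drifted.shape)
    return drifted + noise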

25 Particle Filtering: Measurement. For each particle $s^{(i)}$, compute the new weight $\pi^{(i)}$ as the measurement likelihood: $\pi^{(i)} = P(z \mid s^{(i)})$. Enforcing plausibility: particles that represent impossible configurations are given 0 likelihood, e.g. positions outside of the image. (A snake measurement likelihood method; from MacCormick & Blake, 1998.)
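A sketch of the weighting step with a plausibility check; likelihood is a user-supplied P(z | s), and the first two state components are assumed to be image position:

import numpy as np

def measure(particles, z, likelihood, image_shape):
    """Weight particles by P(z | s); impossible states get weight 0."""
    H, W = image_shape
    weights = np.empty(len(particles))
    for i, s in enumerate(particles):
        x, y = s[0], s[1]
        inside = (0 <= x < W) and (0 <= y < H)   # plausibility check
        weights[i] = likelihood(z, s) if inside else 0.0
    total = weights.sum()
    # If every particle is implausible, fall back to uniform weights
    if total == 0:
        return np.full(len(particles), 1.0 / len(particles))
    return weights / total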

26 Particle Filtering Steps (aka CONDENSATION). One iteration: resample the old particle set (sampling occurs here), drift, diffuse, then measure against the measurement likelihood (from Isard & Blake, 1998).
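Putting the pieces together, one full iteration might look like the following sketch (drift and likelihood are user-supplied; all names hypothetical):

import numpy as np

def condensation_step(particles, weights, drift, diffusion_std, z, likelihood):
    """One CONDENSATION iteration: resample, drift, diffuse, measure."""
    N = len(particles)
    # Resample proportionally to the old weights (sampling occurs here)
    idx = np.searchsorted(np.cumsum(weights / weights.sum()),
                          np.random.uniform(size=N))
    particles = particles[np.minimum(idx, N - 1)]
    # Drift (deterministic), then diffuse (independent Gaussian noise)
    particles = np.array([drift(s) for s in particles])
    particles = particles + np.random.normal(0.0, diffusion_std, particles.shape)
    # Measure: new weights from the measurement likelihood
    weights = np.array([likelihood(z, s) for s in particles])
    return particles, weights / weights.sum()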

27 Particle Filtering Visualization (courtesy of M. Isard): a 1-D system; the red curve is the measurement likelihood.

28 CONDENSATION: Example State Posterior (from Isard & Blake, 1998). Note how the initial distribution "sharpens".

29 Example: Contour-based Head Template Tracking (courtesy of A. Blake).

30 Example: Recovering from Distraction (from Isard & Blake, 1998).

31 Obtaining a State Estimate. Note that no explicit state estimate is maintained, just a "cloud" of particles. An estimate at a particular time can be obtained by querying the current particle set. Some approaches:
–"mean" particle: a weighted sum of the particles, with confidence given by the inverse variance
–what we really want is a mode finder: the mean of the tallest peak
A sketch of both appears below.
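A sketch of both estimators; the mode finder here is a crude proxy (the single highest-weight particle) rather than the mean of the tallest peak:

import numpy as np

def mean_particle(particles, weights):
    """Weighted mean of the particle set, plus an inverse-variance confidence."""
    mean = np.average(particles, axis=0, weights=weights)
    var = np.average((particles - mean) ** 2, axis=0, weights=weights)
    return mean, 1.0 / var.sum()   # tighter cloud -> higher confidence

def mode_particle(particles, weights):
    """Crude mode estimate: the highest-weight particle."""
    return particles[np.argmax(weights)]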

32 CONDENSATION: Estimating Target State (from Isard & Blake, 1998). State samples (thickness proportional to weight) and the mean of the weighted state samples.

33 Computer Vision 33 More examples

34 Multi-Modal Posteriors. When there are multiple peaks in the posterior, the MAP estimate is just the tallest one. This is fine when one peak dominates, but when the peaks are of comparable heights we might sometimes pick the wrong one, and committing to just one possibility can lead to mistracking.
–We want a wider sense of the posterior distribution, to keep track of other good candidate states.
(Figure: multiple peaks in the measurement likelihood; adapted from [Hong, 1995].)

35 MCMC-based Particle Filter. Models interaction (a higher-dimensional state space). CNN video (Khan, Balch & Dellaert, PAMI 2005).

36 Computer Vision 36 Next class: recognition

