CASC Primer on Tracking
Sen-ching S. Cheung
March 26, 2004
Slide 2 of 18: An object tracking system
[Block diagram of the tracking pipeline: Sensor Data Processing, Gating Computations, Data Association, Track Maintenance, Prediction and Update.]
Slide 3 of 18: Outline
• Prediction and update
  – Tracking for point targets or segmented objects
  – Tracking for unsegmented objects (feature tracking): mean-shift tracking
• Data association in multi-object tracking
  – Global nearest neighbor
  – Joint Probabilistic Data Association
  – Multiple Hypothesis Tracking

Filter choice by state dynamics/measurement and noise model:

                        Linear               Non-linear
  Gaussian noise        Kalman Filter (KF)   Extended KF, Unscented KF
  Non-Gaussian noise    Particle Filter      Particle Filter
Slide 4 of 18: Prediction and update
• Assume single-object tracking.
• Given:
  – Track observations so far (e.g., positions): y_1, y_2, ..., y_t
  – New observations at time t+1: z_1, z_2, ..., z_N
• Goal: which z_i should become the new y_{t+1}?
• Answer: the maximum a posteriori (MAP) or the Bayesian (posterior-mean) estimate:
  y_{t+1} = argmax_{i=1,...,N} P(z_i | y_1, y_2, ..., y_t)
  or
  y_{t+1} = Σ_i z_i P(z_i | y_1, y_2, ..., y_t), with the probabilities normalized over the N candidates.
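A minimal sketch of the two estimates, assuming the predictive probabilities P(z_i | y_1,...,y_t) have already been evaluated (the function and argument names are illustrative):

```python
import numpy as np

def pick_observation(candidates, posterior, mode="map"):
    """Choose the new track point y_{t+1} from candidates z_1..z_N.

    candidates: array of shape (N, d), the z_i
    posterior:  array of shape (N,), P(z_i | y_1..y_t) over the candidates
    """
    w = posterior / posterior.sum()   # normalize over the N candidates
    if mode == "map":                 # maximum a posteriori choice
        return candidates[np.argmax(w)]
    return w @ candidates             # Bayesian (posterior-mean) estimate
```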
Slide 5 of 18: State-space model
• Key is to compute P(y_{t+1} = z | y_1, y_2, ..., y_t).
• Introduce a hidden state X_t:
  – Given:
    – Dynamics: P(X_{t+1} | X_t)
    – Measurement: P(Y_t | X_t)
  – Markov assumption: P(y_{t+1} | x_{t+1}, y_1, y_2, ..., y_t) = P(y_{t+1} | x_{t+1})
  – Why?
[Graphical model: a hidden chain X_{t-1} → X_t → X_{t+1} linked by P(X_{t+1} | X_t), with each X_t emitting an observation Y_t through P(Y_t | X_t).]
Slide 6 of 18: Prediction, time update, and measurement update
Answer: a simple recursion computes P(y_{t+1} = z | y_1, ..., y_t):

  P(x_t | y_1,...,y_t)  --time update-->  P(x_{t+1} | y_1,...,y_t)  --prediction-->  P(y_{t+1} = z | y_1,...,y_t)
  P(x_{t+1} | y_1,...,y_t)  --measurement update-->  P(x_{t+1} | y_1,...,y_{t+1})

Prediction:
  P(y_{t+1} = z | y_1,...,y_t) = ∫ P(y_{t+1} = z, x_{t+1} | y_1,...,y_t) dx_{t+1}
                               = ∫ P(y_{t+1} = z | x_{t+1}, y_1,...,y_t) P(x_{t+1} | y_1,...,y_t) dx_{t+1}
                               = ∫ P(y_{t+1} = z | x_{t+1}) P(x_{t+1} | y_1,...,y_t) dx_{t+1}

Time update:
  P(x_{t+1} | y_1,...,y_t) = ∫ P(x_{t+1}, x_t | y_1,...,y_t) dx_t = ∫ P(x_{t+1} | x_t) P(x_t | y_1,...,y_t) dx_t

Measurement update:
  P(x_{t+1} | y_1,...,y_{t+1}) = P(x_{t+1}, y_{t+1} | y_1,...,y_t) / P(y_{t+1} | y_1,...,y_t)
                               ∝ P(y_{t+1} | x_{t+1}) P(x_{t+1} | y_1,...,y_t)
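The recursion is easiest to see on a discretized state space, where the integrals become sums. A toy sketch, assuming a finite grid of K states (all names are illustrative):

```python
import numpy as np

def bayes_filter_step(belief, transition, likelihood):
    """One recursion step on a discretized state space.

    belief:      P(x_t | y_1..t), shape (K,)
    transition:  P(x_{t+1} | x_t), shape (K, K), rows indexed by x_t
    likelihood:  P(y_{t+1} | x_{t+1}), shape (K,)
    """
    # Time update: P(x_{t+1} | y_1..t) = sum over x_t of P(x_{t+1}|x_t) P(x_t|y_1..t)
    predicted = transition.T @ belief
    # Prediction: P(y_{t+1} | y_1..t) = sum over x_{t+1} of P(y_{t+1}|x_{t+1}) P(x_{t+1}|y_1..t)
    evidence = likelihood @ predicted
    # Measurement update: P(x_{t+1} | y_1..t+1) ∝ P(y_{t+1}|x_{t+1}) P(x_{t+1}|y_1..t)
    posterior = likelihood * predicted / evidence
    return posterior, evidence
```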
Slide 7 of 18: Kalman filter
• Linear system and Gaussian noise:
  x_{t+1} = A x_t + G w_t,  w_t ~ N(0, Q)   so   x_{t+1} | x_t ~ N(A x_t, G Q G^T)
  y_t = C x_t + v_t,  v_t ~ N(0, R)         so   y_t | x_t ~ N(C x_t, R)
• Time update:
  μ_{t+1|t} = E(x_{t+1} | y_1,...,y_t) = A μ_{t|t}
  Σ_{t+1|t} = Cov(x_{t+1} | y_1,...,y_t) = A Σ_{t|t} A^T + G Q G^T
• Prediction:
  E(y_{t+1} | y_1,...,y_t) = C μ_{t+1|t}
  Cov(y_{t+1} | y_1,...,y_t) = C Σ_{t+1|t} C^T + R
• Measurement update:
  μ_{t+1|t+1} = μ_{t+1|t} + K_{t+1} (y_{t+1} - C μ_{t+1|t})
  Σ_{t+1|t+1} = Σ_{t+1|t} - K_{t+1} C Σ_{t+1|t}
  where the Kalman gain is K_{t+1} = Σ_{t+1|t} C^T (C Σ_{t+1|t} C^T + R)^{-1}.
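The update equations transcribe directly into code. A minimal NumPy sketch, assuming all model matrices are given:

```python
import numpy as np

def kf_step(mu, Sigma, y, A, G, Q, C, R):
    """One Kalman filter cycle: time update, then measurement update.

    mu, Sigma: posterior mean and covariance at time t (mu_{t|t}, Sigma_{t|t})
    y:         new observation y_{t+1}
    """
    # Time update
    mu_pred = A @ mu                             # mu_{t+1|t}
    Sigma_pred = A @ Sigma @ A.T + G @ Q @ G.T   # Sigma_{t+1|t}
    # Prediction: measurement mean and innovation covariance
    y_pred = C @ mu_pred
    S = C @ Sigma_pred @ C.T + R
    # Kalman gain and measurement update
    K = Sigma_pred @ C.T @ np.linalg.inv(S)
    mu_new = mu_pred + K @ (y - y_pred)
    Sigma_new = Sigma_pred - K @ C @ Sigma_pred
    return mu_new, Sigma_new
```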
Slide 8 of 18: Simple example (constant velocity)
Dynamics model: the acceleration a_t is white noise with E(a_t) = 0 and E(a_t a_s) = k δ(s - t).

Kalman filter implementation:
  x_t = (p_t, p'_t)^T (position and velocity)
  A = [ 1  T ]   where T is the sampling period
      [ 0  1 ]
  G = I
  Q = k [ T^3/3  T^2/2 ]   (obtained by computing Cov(x_t, x_{t+1}))
        [ T^2/2  T     ]
  C = [1 0]; R depends on the measurement error.

Other types of models: Singer acceleration, constant acceleration, piecewise-constant Wiener-process acceleration, coordinated turn, etc.
Adaptation: run multiple KFs tied together by an HMM: Interacting Multiple Model (IMM) filtering.
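For concreteness, the constant-velocity matrices can be assembled as below and fed to the kf_step sketch from the previous slide (the function and parameter names come from that illustration, not from the slides):

```python
import numpy as np

def constant_velocity_model(T, k, r):
    """Build the constant-velocity model of this slide.

    T: sampling period
    k: white-noise acceleration intensity, E(a_t a_s) = k delta(s - t)
    r: measurement noise variance (R depends on the measurement error)
    """
    A = np.array([[1.0, T], [0.0, 1.0]])
    G = np.eye(2)
    Q = k * np.array([[T**3 / 3, T**2 / 2],
                      [T**2 / 2, T]])
    C = np.array([[1.0, 0.0]])
    R = np.array([[r]])
    return A, G, Q, C, R
```

Running kf_step with these matrices on a stream of position measurements gives a complete constant-velocity tracker.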
Slide 9 of 18: Non-linearity
• Non-linearity is possible in the measurement and/or the dynamics.
  Example measurement (bearing angle): y_t = tan^{-1}( x_t(2) / x_t(1) ) + v_t
• Idea: incorporate the non-linearity when computing the mean and covariance.
[Figure: the bearing y_t from the origin to the point (x_t(1), x_t(2)).]
Slide 10 of 18: Extended Kalman filter
• Taylor series expansion of the non-linear dynamics and/or measurement around the current estimate.
• Linearization: run the standard KF recursions with the Jacobians in place of A and C.
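As an illustration, here is a sketch of the EKF measurement update for the bearing measurement of slide 9, with the Jacobian worked out by hand (names are illustrative; the time update stays the linear one from slide 7):

```python
import numpy as np

def ekf_bearing_update(mu, Sigma, y, R):
    """EKF measurement update for y = atan2(x2, x1) + v.

    mu, Sigma: predicted state mean (x1, x2) and covariance
    y:         observed bearing; R: 1x1 measurement noise covariance
    """
    x1, x2 = mu[0], mu[1]
    r2 = x1**2 + x2**2
    # Jacobian of h(x) = atan2(x2, x1), evaluated at the predicted mean
    H = np.array([[-x2 / r2, x1 / r2]])
    y_pred = np.arctan2(x2, x1)
    S = H @ Sigma @ H.T + R          # innovation covariance
    K = Sigma @ H.T @ np.linalg.inv(S)
    mu_new = mu + (K * (y - y_pred)).ravel()
    Sigma_new = Sigma - K @ H @ Sigma
    return mu_new, Sigma_new
```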
Slide 11 of 18: Unscented Kalman filter
• Problems with the EKF: it needs the Jacobian matrix, and linearization propagates error.
• "It is easier to approximate a PDF than it is to approximate an arbitrary nonlinear function." - J. K. Uhlmann
1. Select a set of deterministic sigma points {s_i, w_i}, i = 1,...,N, such that
   (a) Σ_i w_i = 1,  (b) Σ_i w_i s_i = μ_x,  (c) Σ_i w_i (s_i - μ_x)(s_i - μ_x)^T = Σ_x.
   Example: points along the covariance contour,
   s_i = μ_x + (-1)^i [ ((N/2) Σ_x)^{1/2} ]_{⌈i/2⌉}  (the ⌈i/2⌉-th column of the matrix square root),  w_i = 1/N.
2. Map each sigma point: s_i → h(s_i).
3. Compute the "sample mean" Σ_i w_i h(s_i) and the sample covariance of the mapped points.
   Because the sigma set matches the true mean and covariance exactly, the result is in fact accurate up to the second-order terms of the Taylor expansion.
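A sketch of steps 1-3 for the symmetric sigma-point set described above (N = 2n points with weights 1/N; a Cholesky factor stands in for the matrix square root, which is one common choice rather than something the slide mandates):

```python
import numpy as np

def unscented_transform(mu, Sigma, h):
    """Propagate N(mu, Sigma) through a non-linear map h via sigma points."""
    n = mu.shape[0]
    N = 2 * n
    L = np.linalg.cholesky(0.5 * N * Sigma)   # square root of (N/2) Sigma
    # Sigma points: mu plus/minus each column of L, each with weight 1/N
    sigma_pts = np.vstack([mu + L[:, j] for j in range(n)] +
                          [mu - L[:, j] for j in range(n)])
    w = np.full(N, 1.0 / N)
    Y = np.array([h(s) for s in sigma_pts])   # mapped points h(s_i)
    y_mean = w @ Y                            # sample mean
    d = Y - y_mean
    y_cov = (w[:, None] * d).T @ d            # centered sample covariance
    return y_mean, y_cov
```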
Slide 12 of 18: What if the noise is also non-Gaussian?
• For example: colored noise from the wrong dynamics model, tracking through clutter, deformation, etc.
• First- and second-order statistics are no longer sufficient to characterize the posterior distribution.
• Answer: the particle filter
  – known as the Condensation algorithm in computer vision and as sequential Monte Carlo methods in statistics
  – uses Markov chain Monte Carlo (MCMC) methods in the time update, prediction, and measurement update
Slide 13 of 18: Particle sets
A particle is a pair of random variables: a state x and its weight π ≥ 0.
A particle set for a PDF f is an algorithm that generates pairs (x_i, π_i) such that, for any function g:
  lim_{N→∞} Σ_i π_i g(x_i) = E_f(g(x))   ("convergence in distribution")
[Figure: probability vs. state, with each particle drawn as an ellipse; x_i is the centroid of the ellipse and π_i its area.]
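In code, the defining property is just a weighted average. A small sketch, with the weights normalized in case they do not already sum to one:

```python
import numpy as np

def particle_expectation(x, pi, g):
    """Approximate E_f[g(x)] from a particle set {(x_i, pi_i)}."""
    w = pi / pi.sum()   # normalize the weights
    return sum(wi * g(xi) for wi, xi in zip(w, x))
```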
Slide 14 of 18: Operations on particles
• Idea: use particles {x_i, π_i}, i = 1,...,N, to represent P(x_t | y_1,...,y_t).
• Recall:
  – Time update: P(x_{t+1} | y_1,...,y_t) = ∫ P(x_{t+1} | x_t) P(x_t | y_1,...,y_t) dx_t   ("convolution")
  – Prediction: P(y_{t+1} = z | y_1,...,y_t) = ∫ P(y_{t+1} = z | x_{t+1}) P(x_{t+1} | y_1,...,y_t) dx_{t+1}   ("convolution")
  – Measurement: P(x_{t+1} | y_1,...,y_{t+1}) ∝ P(y_{t+1} | x_{t+1}) P(x_{t+1} | y_1,...,y_t)   ("multiplication")
• Assume we know how to evaluate, and draw random samples from, all of the model densities above (highlighted in red on the original slide).
• How do we "convolve" and "multiply" a set of particles with these functions? (See the next slide.)
Slide 15 of 18: Multiplication and convolution of particles
• Multiplication by q(x):
  x_i ← x_i;  π_i ← q(x_i) π_i
• Convolution with q(y | x):
  1. Resample {x_i, π_i}, i = 1,...,N, into a new, equally weighted particle set {x_i', π_i'}, i = 1,...,N.
  2. Propagate: replace each x_i' by a sample drawn from q(x | x_i'), keeping π_i' unchanged.
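Chaining the two operations gives one full filter cycle: convolve with the dynamics (resample, then propagate), then multiply by the likelihood. A bootstrap-style sketch, assuming sampling and likelihood routines are supplied (all names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def pf_step(x, pi, y, sample_dynamics, likelihood):
    """One particle filter cycle built from the two operations above.

    x:               particle states, shape (N, d)
    pi:              particle weights, shape (N,)
    sample_dynamics: draws x_{t+1} ~ P(. | x_t) for each particle
    likelihood:      evaluates P(y | x) for each particle
    """
    N = len(pi)
    # Convolution with P(x_{t+1} | x_t): resample, then propagate
    idx = rng.choice(N, size=N, p=pi / pi.sum())
    x = sample_dynamics(x[idx])
    pi = np.full(N, 1.0 / N)
    # Multiplication by P(y_{t+1} | x_{t+1}): reweight by the likelihood
    pi = pi * likelihood(x, y)
    pi = pi / pi.sum()
    return x, pi
```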
Slide 16 of 18: Why resampling?
• Without resampling, repeated reweighting drives most of the weights π_i toward 0 and the particle set degenerates; with resampling, the particles concentrate where the posterior has mass.
[Figure: the same posterior represented by particle sets without and with resampling.]
• There is a lot more to it:
  – Better resampling schemes: fewer x_i end up with identical values.
  – How many particles are needed? Monitor the effective sample size (see the sketch below).
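A common diagnostic is N_eff = 1 / Σ_i w_i², which equals N for uniform weights and 1 when a single particle carries all the mass. A small sketch (the resample-when-below-N/2 rule in the comment is a widespread heuristic, not something stated on the slide):

```python
import numpy as np

def effective_sample_size(pi):
    """Effective sample size of a weighted particle set.

    A common heuristic (an assumption here, not from the slides) is to
    resample only when this drops below N/2.
    """
    w = pi / pi.sum()
    return 1.0 / np.sum(w**2)
```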
Slide 17 of 18: What about object features?
• Simplest way: feature vector + point target. Maximize
  P(f(y_{t+1}) = f(z_i) | f(y_1), f(y_2),...,f(y_t)) · P(y_{t+1} = z_i | y_1, y_2,..., y_t)
  – Too many possible z_i if there is no foreground segmentation.
  – Occlusion.
• Mean-shift tracking: an iterative hill-climbing algorithm (sketched below).
  Let w_f(v; t) be the likelihood that pixel I_t(v) is part of the object with feature f, and let c_0 be the object centroid at time t.
  1. Compute the new centroid of the candidate object: c_1 = Σ_v v · w_f(v; t+1) / Σ_v w_f(v; t+1).
  2. Move the candidate object to c_1 and repeat.
[Figure: the object at time t with centroid c_0 and feature f; the candidate object at time t+1.]
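A sketch of the centroid iteration on a per-pixel likelihood map w_f(v; t+1), using a square candidate window and a simple convergence test (both are illustrative assumptions; the slides fix neither the window shape nor the stopping rule):

```python
import numpy as np

def mean_shift_track(weights, c_init, radius, n_iter=20, tol=0.5):
    """Iterate c_1 = sum_v v*w(v) / sum_v w(v) over a square window.

    weights: 2-D array, weights[r, c] = likelihood that pixel (r, c) is
             part of the object in the new frame
    c_init:  initial centroid (row, col) from the previous frame
    radius:  half-size of the square candidate window
    """
    c = np.asarray(c_init, dtype=float)
    n_rows, n_cols = weights.shape
    for _ in range(n_iter):
        # Clip the candidate window to the image
        r0 = int(max(c[0] - radius, 0)); r1 = int(min(c[0] + radius + 1, n_rows))
        c0 = int(max(c[1] - radius, 0)); c1 = int(min(c[1] + radius + 1, n_cols))
        win = weights[r0:r1, c0:c1]
        if win.sum() == 0:
            break                     # no object evidence under the window
        rr, cc = np.mgrid[r0:r1, c0:c1]
        # Weighted centroid of the window (the update from the slide)
        c_new = np.array([(rr * win).sum(), (cc * win).sum()]) / win.sum()
        if np.linalg.norm(c_new - c) < tol:
            return c_new              # converged
        c = c_new
    return c
```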
Slide 18 of 18: References

Basic Kalman filter:
1. Jordan, M. I. An Introduction to Probabilistic Graphical Models, in preparation. (ask me)
2. Forsyth, D. and J. Ponce (2003) Computer Vision: A Modern Approach. Prentice Hall. Chapter 17. (Sapphire)

A little outdated but encyclopedic on most aspects of tracking:
1. Blackman, S. and R. Popoli (1999) Design and Analysis of Modern Tracking Systems. Artech House Publishers.
2. Bar-Shalom, Y. and X.-R. Li (1993) Estimation and Tracking: Principles, Techniques, and Software. Artech House Publishers.

Unscented Kalman filter:
1. Julier, S. and J. K. Uhlmann (2004) "Unscented filtering and nonlinear estimation," Proceedings of the IEEE, vol. 92, no. 3, pp. 401-422.

Mean-shift tracking:
1. Cheng, Y. (1995) "Mean shift, mode seeking, and clustering," IEEE Trans. PAMI, vol. 17, no. 8, pp. 790-799.
2. Comaniciu, D. et al. (2000) "Real-time tracking of non-rigid objects using mean shift," CVPR, vol. 2, pp. 142-149.

More on particle filtering:
1. MacCormick, J. (2002) Stochastic Algorithms for Visual Tracking. Springer. (Sapphire)
2. Doucet, A. et al. (2001) Sequential Monte Carlo Methods in Practice. Springer. (Sapphire)
3. Hue, C. and J.-P. Le Cadre (2002) "Sequential Monte Carlo methods for multiple target tracking and data fusion," IEEE Trans. on Signal Processing, vol. 50, no. 2, pp. 309-325.
4. Djuric, P. M. et al. (2003) "Particle filtering," IEEE Signal Processing Magazine, vol. 20, no. 5, pp. 19-38.
5. See Spengler's reference (attached)