Joint Tracking of Features and Edges
Stan Birchfield and Shrinivas Pundlik, Clemson University

ABSTRACT

Sparse features have traditionally been tracked from frame to frame independently of one another. We propose a framework in which features are tracked jointly: combining ideas from Lucas-Kanade and Horn-Schunck, the estimated motion of a feature is influenced by the estimated motion of neighboring features. The approach also handles the problem of tracking edges in a unified way by estimating motion perpendicular to the edge, using the motion of neighboring features to resolve the aperture problem. Results are shown on several image sequences to demonstrate the improved results obtained by the approach.

LUCAS-KANADE AND HORN-SCHUNCK

Notation: I is the image, I_x and I_y are the image derivatives in the x and y directions, I_t is the temporal image derivative, and (u, v) is the pixel displacement in the x and y directions. The optic flow constraint equation is

    I_x u + I_y v + I_t = 0.

Lucas-Kanade (sparse feature tracking) assumes that the unknown displacement of a pixel is constant within some neighborhood, i.e., it finds the displacement of a small window centered around the pixel by minimizing

    E_LK(u, v) = K_ρ * [ (I_x u + I_y v + I_t)^2 ],

where K_ρ * denotes convolution with an integration window of size ρ. Differentiating with respect to u and v and setting the derivatives to zero leads to a linear system Z u = e, with

    Z = | K_ρ * (I_x I_x)   K_ρ * (I_x I_y) |        e = − | K_ρ * (I_x I_t) |
        | K_ρ * (I_x I_y)   K_ρ * (I_y I_y) |,             | K_ρ * (I_y I_t) |.

Horn-Schunck (dense optic flow) regularizes the unconstrained optic flow equation by imposing a global smoothness term. It computes global displacement functions u(x, y) and v(x, y) by minimizing

    E_HS(u, v) = ∫_Ω (I_x u + I_y v + I_t)^2 + λ (|∇u|^2 + |∇v|^2) dx dy,

where λ is the regularization parameter and Ω is the image domain. The minimum of the functional is found by solving the corresponding Euler-Lagrange equations, leading to

    I_x (I_x u + I_y v + I_t) − λ Δu = 0
    I_y (I_x u + I_y v + I_t) − λ Δv = 0.

JOINT TRACKING OF FEATURES

Joint tracking combines the algorithms of Lucas-Kanade and Horn-Schunck, i.e., it aggregates global information to improve the tracking of sparse feature points (cf. Bruhn et al., IJCV 2005). The joint Lucas-Kanade energy functional E_JLK sums, over the N features, a data term and a smoothness term weighted by a per-feature regularization parameter λ_i. Differentiating E_JLK with respect to the displacements (u_i, v_i) gives a 2N x 2N matrix equation whose (2i − 1)th and (2i)th rows form the 2 x 2 system solved for feature i in the algorithm below.

Algorithm: Standard Lucas-Kanade

For each feature i,
1. Initialize u_i ← (0, 0)^T
2. Set λ_i ← 0
3. For pyramid level n − 1 to 0 step −1,
   (a) Compute Z_i
   (b) Repeat until convergence:
       i.   Compute the difference I_t between the first image and the shifted second image:
            I_t(x, y) = I_1(x, y) − I_2(x + u_i, y + v_i)
       ii.  Compute e_i
       iii. Solve Z_i u′_i = e_i for the incremental motion u′_i
       iv.  Add the incremental motion to the overall estimate: u_i ← u_i + u′_i
   (c) Expand to the next level: u_i ← η u_i, where η is the pyramid scale factor

Algorithm: Joint Lucas-Kanade

For each feature i,
1. Initialize u_i ← (0, 0)^T
2. Initialize λ_i
For pyramid level n − 1 to 0 step −1,
1. For each feature i, compute Z_i
2. Repeat until convergence:
   (a) For each feature i,
       i.   Determine the expected displacement û_i from the neighboring features
       ii.  Compute the difference I_t between the first image and the shifted second image:
            I_t(x, y) = I_1(x, y) − I_2(x + u_i, y + v_i)
       iii. Compute e_i
       iv.  Solve Z_i u′_i = e_i for the incremental motion u′_i
       v.   Add the incremental motion to the overall estimate: u_i ← u_i + u′_i
3. Expand to the next level: u_i ← η u_i, where η is the pyramid scale factor
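The poster does not spell out the data term, the smoothness term, or the per-feature system, so the following is only a plausible reconstruction: assuming E_D(i) = Σ_{x ∈ W_i} (I_x u_i + I_y v_i + I_t)^2 over a window W_i around feature i, and E_S(i) = (u_i − û_i)^2 + (v_i − v̂_i)^2 with (û_i, v̂_i) the expected displacement computed from neighboring features, differentiation gives per-feature updates of the form (Z_i + λ_i I) u′_i = e_i + λ_i (û_i − u_i); in the pseudocode above, Z_i and e_i can then be read as already including these regularization terms. The Python/NumPy sketch below implements one pyramid level of the joint iteration under exactly these assumptions; the function name joint_lk_level, the Sobel derivatives, the nearest-pixel window shift, and the mean-of-neighbors û_i are illustrative choices, not the authors' implementation.

import numpy as np
from scipy import ndimage


def joint_lk_level(I1, I2, pts, u, lam, neighbors, win=7, iters=10):
    """One pyramid level of a joint Lucas-Kanade iteration (illustrative sketch).

    I1, I2    : grayscale float images
    pts       : (N, 2) integer feature positions (x, y), assumed to lie at
                least win//2 pixels away from the image border
    u         : (N, 2) float array of current displacement estimates,
                updated in place and returned
    lam       : (N,) regularization weights; all zeros gives standard LK
    neighbors : list of index arrays; neighbors[i] are the features whose
                motion influences feature i
    """
    # Spatial derivatives of the first image (Sobel approximation).
    Ix = ndimage.sobel(I1, axis=1, mode="nearest") / 8.0
    Iy = ndimage.sobel(I1, axis=0, mode="nearest") / 8.0
    r = win // 2

    for _ in range(iters):
        # Expected displacement of each feature, here simply the mean of its
        # neighbors' current estimates (isolated features fall back on themselves).
        u_hat = np.array([u[nbrs].mean(axis=0) if len(nbrs) > 0 else u[i]
                          for i, nbrs in enumerate(neighbors)])

        for i, (x, y) in enumerate(pts):
            ys, xs = np.mgrid[y - r:y + r + 1, x - r:x + r + 1]
            # Shift the window in the second image by the current (rounded)
            # estimate; a real tracker would interpolate subpixel positions.
            xs2 = np.clip(xs + int(round(u[i, 0])), 0, I2.shape[1] - 1)
            ys2 = np.clip(ys + int(round(u[i, 1])), 0, I2.shape[0] - 1)
            It = (I1[ys, xs] - I2[ys2, xs2]).ravel()
            gx = Ix[ys, xs].ravel()
            gy = Iy[ys, xs].ravel()

            # Gradient matrix Z_i and error vector e_i over the window.
            Z = np.array([[np.dot(gx, gx), np.dot(gx, gy)],
                          [np.dot(gx, gy), np.dot(gy, gy)]])
            e = np.array([np.dot(gx, It), np.dot(gy, It)])

            # Regularized per-feature system: the smoothness term pulls the
            # update toward the neighbors' expected motion.
            A = Z + lam[i] * np.eye(2)
            b = e + lam[i] * (u_hat[i] - u[i])
            u[i] += np.linalg.solve(A, b)
    return u

Setting every λ_i to zero decouples the features and recovers the standard Lucas-Kanade update of the first algorithm, which is why the same inner loop serves both pseudocode listings.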
EXPERIMENTAL RESULTS

[Figure: estimated flow fields on the Rubber Whale, Hydrangea, Venus, and Dimetrodon sequences for Standard LK (OpenCV) and Joint LK (our algorithm).]

The average angular error (AE) in degrees and the average endpoint error (EP) in pixels of the two algorithms:

                              Rubber Whale      Hydrangea        Venus            Dimetrodon
Algorithm                     AE      EP        AE      EP       AE      EP       AE      EP
Standard LK (OpenCV)          8.09    0.44      7.56    0.57     8.56    0.63     2.40    0.13
Joint LK (our algorithm)      4.32    0.13      6.13    0.45     4.66    0.25     1.34    0.08

[Figure: image, gradient magnitude, Standard LK result, and Joint LK result.] Algorithm comparison when the scene does not contain much texture, as is often the case in indoor man-made environments.

CONCLUSION

➢ Principles from Horn-Schunck can be used to track features together
➢ Results demonstrate improved tracking performance over standard Lucas-Kanade
➢ The aperture problem is overcome, so that features can be tracked in untextured regions
➢ Future work:
    ➢ applying robust penalty functions to prevent smoothing across motion discontinuities
    ➢ explicit modeling of occlusions
    ➢ interpolation of dense optical flow from sparse feature points
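A note on the error measures in the results table above: the poster does not define them, but AE and EP presumably refer to the standard angular and endpoint error measures used in optical flow benchmarks, computed per point from the estimated flow (u_e, v_e) and the ground-truth flow (u_g, v_g) and then averaged:

    AE = arccos( (u_e u_g + v_e v_g + 1) / ( sqrt(u_e^2 + v_e^2 + 1) · sqrt(u_g^2 + v_g^2 + 1) ) )   (reported in degrees)
    EP = sqrt( (u_e − u_g)^2 + (v_e − v_g)^2 )                                                       (reported in pixels)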

