The Measurement of Visual Motion. P. Anandan, Microsoft Research.

Direct Methods: methods for motion and/or shape estimation that recover the unknown parameters directly from measurable image quantities at each pixel in the image. Minimization step: direct methods use an error measure based on dense measurable image quantities; feature-based methods use an error measure based on distances between distinct features.

WHY BOTHER Visual Motion can be annoying –Camera instabilities, jitter –Measure it. Remove it. Visual Motion indicates dynamics in the scene –Moving objects, behavior –Track objects and analyze trajectories Visual Motion reveals spatial layout of the scene –Motion parallax

Video Enhancement Visual Motion can be annoying –Camera instabilities, jitter –Measure it. Remove it.

Temporal Information Visual Motion indicates dynamics in the scene –Moving objects, behavior –Track objects and analyze trajectories

Spatial Layout Visual Motion reveals spatial layout of the scene –Motion parallax Sprite Viewer Demo

Classes of Techniques. Feature-based methods: extract salient visual features (corners, textured areas) and track them over multiple frames; analyze the global pattern of motion vectors of these features; sparse motion fields, but possibly robust tracking; suitable especially when image motion is large (tens of pixels). Direct methods: directly recover image motion from spatio-temporal image brightness variations; global motion parameters are recovered directly without an intermediate feature-motion computation; dense motion fields, but more sensitive to appearance variations; suitable for video and when image motion is small (< 10 pixels). Our focus today: direct methods.

The Brightness Constraint. Brightness Constancy Equation; or, better still, minimize the matching error; linearizing (assuming small (u,v)):
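The equations on this slide appear as images in the transcript; the following is a standard reconstruction of the brightness constancy equation, the SSD error, and its linearization (the shorthand I_t = I - J is an assumption consistent with the later slides):

```latex
J(x,y) = I(x+u,\,y+v), \qquad
E(u,v) = \sum_{x,y}\bigl[I(x+u,\,y+v) - J(x,y)\bigr]^2

I(x+u,\,y+v) \approx I(x,y) + I_x u + I_y v
\quad\Longrightarrow\quad
E(u,v) \approx \sum_{x,y}\bigl[I_x u + I_y v + I_t\bigr]^2,
\qquad I_t \equiv I(x,y) - J(x,y)
```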

Gradient Constraint (or the Optical Flow Constraint). Minimizing the linearized error gives the gradient constraint, only one constraint for each pixel. In general the constraint cannot be satisfied exactly at every pixel; hence, the sum of squared residuals is minimized.

Aperture Problem and Normal Flow

The gradient constraint defines a line in the (u,v) plane. Normal Flow: the component of the flow along the image gradient.
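In symbols (the slide's figure shows the constraint line in the (u,v) plane), the gradient constraint and the normal flow it determines can be written as:

```latex
I_x u + I_y v + I_t = 0 \qquad\text{(a line in the } (u,v) \text{ plane)}

\mathbf{u}_\perp = -\frac{I_t}{\|\nabla I\|}\,\frac{\nabla I}{\|\nabla I\|}
\qquad\text{(normal flow: the component along the image gradient)}
```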

Local Patch Analysis

Combining Local Constraints. [Figure: several gradient-constraint lines in the (u,v) plane.]

Patch Translation. Assume a single velocity for all pixels within an image patch and minimize the summed error. On the LHS: the sum of the 2x2 outer-product tensors of the gradient vector.
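The minimization on this slide is shown as images; written out, setting the derivatives of the linearized SSD to zero over the patch gives the familiar 2x2 system (M is the summed outer-product tensor mentioned above):

```latex
\underbrace{\begin{bmatrix}\sum I_x^2 & \sum I_x I_y \\ \sum I_x I_y & \sum I_y^2\end{bmatrix}}_{M}
\begin{bmatrix}u \\ v\end{bmatrix}
= -\begin{bmatrix}\sum I_x I_t \\ \sum I_y I_t\end{bmatrix},
\qquad \text{sums taken over the patch}
```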

Singularities and the Aperture Problem. Algorithm: at each pixel, compute (u,v) by solving the 2x2 system above. M is singular if all gradient vectors point in the same direction, e.g., along an edge; of course, it is trivially singular if the summation is over a single pixel, i.e., only normal flow is available (the aperture problem). Corners and textured areas are OK.
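A minimal NumPy sketch of the per-patch solve and the singularity test; the function name and the eigenvalue threshold `tau` are illustrative choices, not taken from the talk.

```python
import numpy as np

def lucas_kanade_patch(Ix, Iy, It, tau=1e-2):
    """Solve M [u, v]^T = -b for one patch; return None if M is (near-)singular.

    Ix, Iy, It: spatial gradients and temporal difference over the patch (2-D arrays).
    tau: threshold on the smaller eigenvalue of M (illustrative value).
    """
    M = np.array([[np.sum(Ix * Ix), np.sum(Ix * Iy)],
                  [np.sum(Ix * Iy), np.sum(Iy * Iy)]])
    b = np.array([np.sum(Ix * It), np.sum(Iy * It)])

    # Aperture problem: if all gradients point the same way, the smaller
    # eigenvalue of M is ~0 and only normal flow is recoverable.
    eigvals = np.linalg.eigvalsh(M)  # ascending order
    if eigvals[0] < tau:
        return None  # reject (edge or homogeneous region)

    u, v = np.linalg.solve(M, -b)
    return u, v
```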

Iterative Refinement Estimate velocity at each pixel using one iteration of Lucas and Kanade estimation Warp one image toward the other using the estimated flow field (easier said than done) Refine estimate by repeating the process

Motion and Gradients. Consider a 1-D signal; assume it is a linear function of x. [Figure: intensity I versus x at t = 0 and t = 1.]

Iterative refinement. [Figure: successive 1-D refinement steps.] BUT!!

Limits of the gradient method: 1. Fails when the intensity structure within the window is poor. 2. Fails when the displacement is large (typical operating range is motion of about 1 pixel); linearization of brightness is suitable only for small displacements. Also, brightness is not strictly constant in images; this is actually less problematic than it appears, since we can pre-filter images to make them look similar.

Pyramids. Pyramids were introduced as a multi-resolution image computation paradigm in the early 1980s. The most popular pyramid is the Burt pyramid, which foreshadows wavelets. Two kinds of pyramids: low-pass ("Gaussian pyramid") and band-pass ("Laplacian pyramid").

Gaussian Pyramid Convolve image with a small Gaussian kernel –Typically 5x5 Subsample (decimate by 2) to get lower resolution image Repeat for more levels A sequence of low-pass filtered images
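A minimal sketch of the blur-and-subsample construction, assuming the 5-tap binomial kernel [1, 4, 6, 4, 1]/16 as the "small Gaussian"; function names are illustrative.

```python
import numpy as np
from scipy.ndimage import convolve1d

def gaussian_pyramid(image, levels=4):
    """Build a low-pass (Gaussian) pyramid by blur-and-subsample.

    Uses the separable 5-tap binomial kernel [1, 4, 6, 4, 1] / 16,
    a common choice for Burt-style pyramids.
    """
    kernel = np.array([1, 4, 6, 4, 1], dtype=float) / 16.0
    pyramid = [image.astype(float)]
    for _ in range(levels - 1):
        blurred = convolve1d(pyramid[-1], kernel, axis=0, mode='reflect')
        blurred = convolve1d(blurred, kernel, axis=1, mode='reflect')
        pyramid.append(blurred[::2, ::2])  # decimate by 2
    return pyramid
```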

Laplacian pyramid Laplacian as Difference of Gaussian Band-pass filtered images Highlights edges at different spatial scales For matching, this is less sensitive to image illumination changes But more noisy than using Gaussians
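A corresponding sketch of the band-pass levels as differences of Gaussian levels (each level minus an upsampled version of the next coarser one); again an illustrative implementation, not the talk's code.

```python
from scipy.ndimage import zoom

def laplacian_pyramid(gaussian_levels):
    """Band-pass (Laplacian) pyramid: each level is a Gaussian level minus the
    next coarser level upsampled back to its size (difference of Gaussians)."""
    laplacian = []
    for fine, coarse in zip(gaussian_levels[:-1], gaussian_levels[1:]):
        up = zoom(coarse, (fine.shape[0] / coarse.shape[0],
                           fine.shape[1] / coarse.shape[1]), order=1)
        laplacian.append(fine - up)
    laplacian.append(gaussian_levels[-1])  # keep the coarsest low-pass residual
    return laplacian
```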

Coarse-to-Fine Estimation. [Figure: pyramids of image I and image J; at each level, J is warped toward I (J_w), the flow estimate is refined, and the result is propagated to the next finer level. A displacement of u = 10 pixels at full resolution becomes u = 5, 2.5, 1.25 pixels at coarser levels ==> small u and v at the coarsest level.]
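A sketch of this coarse-to-fine loop. `estimate_flow(I, J)` stands for any single-level incremental estimator (e.g., the Lucas-Kanade solve above) and is an assumed helper; the warping and upsampling details are illustrative.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def warp_image(J, u, v):
    """Warp J by the flow field (u, v) using bilinear interpolation."""
    h, w = J.shape
    yy, xx = np.mgrid[0:h, 0:w]
    return map_coordinates(J, [yy + v, xx + u], order=1, mode='nearest')

def coarse_to_fine_flow(pyr_I, pyr_J, estimate_flow, iters=3):
    """Coarse-to-fine flow: start at the coarsest level, then upsample the
    estimate (doubling it) and refine at each finer level.

    pyr_I, pyr_J: Gaussian pyramids (finest level first).
    estimate_flow(I, J): single-level incremental estimator returning dense
    (du, dv) arrays; an assumed helper, not defined on the slides.
    """
    u = np.zeros_like(pyr_I[-1])
    v = np.zeros_like(pyr_I[-1])
    for I, J in zip(reversed(pyr_I), reversed(pyr_J)):
        # Upsample the flow from the previous (coarser) level and double it.
        if u.shape != I.shape:
            u = 2 * np.kron(u, np.ones((2, 2)))[:I.shape[0], :I.shape[1]]
            v = 2 * np.kron(v, np.ones((2, 2)))[:I.shape[0], :I.shape[1]]
        for _ in range(iters):               # iterative refinement: warp, re-estimate
            Jw = warp_image(J, u, v)
            du, dv = estimate_flow(I, Jw)
            u, v = u + du, v + dv
    return u, v
```

In practice the number of pyramid levels is chosen so that the residual motion at the coarsest level is well under a pixel, matching the operating range of the gradient method.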

Global Motion Models 2D Models: Affine Quadratic Planar projective transform (Homography) 3D Models: Instantaneous camera motion models Homography+epipole Plane+Parallax

Example: Affine Motion. Substituting into the B.C. equation, each pixel provides 1 linear constraint in 6 global unknowns (minimum 6 pixels necessary). Least-squares minimization (over all pixels):
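Written out (the slide's equations are images; this is the standard parameterization, with the coefficient ordering assumed):

```latex
u(x,y) = a_1 + a_2 x + a_3 y, \qquad v(x,y) = a_4 + a_5 x + a_6 y

I_x u + I_y v + I_t = 0
\;\;\Longrightarrow\;\;
\bigl[I_x,\; xI_x,\; yI_x,\; I_y,\; xI_y,\; yI_y\bigr]\,
(a_1, a_2, a_3, a_4, a_5, a_6)^{\mathsf T} = -I_t
```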

Other 2D Motion Models: Quadratic (an instantaneous approximation to planar motion); Projective (exact planar motion).

3D Motion Models: instantaneous camera motion (global camera-motion parameters plus a local per-pixel parameter); Homography+Epipole (global parameters plus a local per-pixel parameter); residual planar-parallax motion (global parameters plus a local per-pixel parameter).

Correlation and SSD. For larger displacements, do template matching: define a small area around a pixel as the template; match the template against each pixel within a search area in the next image; use a match measure such as correlation, normalized correlation, or sum-of-squared differences; choose the maximum (or minimum) as the match. Sub-pixel interpolation is also possible.
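A minimal sketch of exhaustive SSD matching over a discrete search range; names and the search parameterization are illustrative.

```python
import numpy as np

def ssd_match(template, image, corner, search_radius):
    """Exhaustive SSD template matching around 'corner' in 'image'.

    corner: (row, col) of the template's top-left corner in the first image.
    Returns the integer displacement (du, dv) with minimum sum-of-squared
    differences; sub-pixel refinement (e.g., fitting a quadratic to the SSD
    surface around the minimum) can be added on top.
    """
    th, tw = template.shape
    cy, cx = corner
    best, best_d = np.inf, (0, 0)
    for dv in range(-search_radius, search_radius + 1):
        for du in range(-search_radius, search_radius + 1):
            y0, x0 = cy + dv, cx + du
            if y0 < 0 or x0 < 0:
                continue  # displacement falls outside the image
            patch = image[y0:y0 + th, x0:x0 + tw]
            if patch.shape != template.shape:
                continue  # displacement falls outside the image
            ssd = np.sum((patch - template) ** 2)
            if ssd < best:
                best, best_d = ssd, (du, dv)
    return best_d
```

The shape of the resulting SSD surface, illustrated on the next three slides, indicates how well the displacement is constrained (textured area, edge, homogeneous area).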

SSD Surface – Textured area

SSD Surface -- Edge

SSD Surface – homogeneous area

Discrete Search vs. Gradient-Based Estimation. Consider image I translated by some displacement. The discrete search method simply searches for the best estimate. The gradient method linearizes the intensity function and solves for the estimate.

Uncertainty in Local Estimation. Consider image I translated by some displacement; the match error then characterizes the uncertainty of the local estimate. This assumes uniform priors on the velocity field.

Quadratic Approximation. When the displacements are small, after some fiddling around we can show that the error surface is locally quadratic in the displacement.

Posterior uncertainty. At edges the curvature matrix is singular, but just take the pseudo-inverse. Note that the error is always convex, since the matrix is positive semi-definite; i.e., even for occluded points and other false matches this is the case... which seems a bit odd!

Match plus confidence. Numerically compute the error for various displacements and search for the peak. Numerically fit a quadratic to the error around the peak to find a sub-pixel estimate and covariance. If the fitted curvature matrix has the wrong sign, it is a false match. Or, even better, if you can afford it, simply maintain a discrete sampling of the error surface.

Quadratic Approximation and Covariance Estimation

Choosing the Correlation Window Size. Small windows lead to more false matches. Large windows are better this way, but: neighboring flow vectors will be more correlated (since the template windows have more in common); flow resolution is also lower (same reason); and they are more expensive to compute. Another way to look at this: small windows are good for local search, more precise but less smooth; large windows are good for global search, less precise but more smooth.

Robust Estimation. Standard least-squares estimation gives too much influence to outlying points.

Robust Estimation Robust gradient constraint Robust SSD
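The formulas on this slide are images; a common form of both, assuming a robust penalty rho such as Geman-McClure (the specific choice of rho is an assumption, not taken from the slide):

```latex
E_{\text{grad}}(u,v) = \sum_{x,y} \rho\bigl(I_x u + I_y v + I_t\bigr),
\qquad
E_{\text{SSD}}(u,v) = \sum_{x,y} \rho\bigl(I(x+u,\,y+v) - J(x,y)\bigr)

\rho(r) = \frac{r^2}{\sigma^2 + r^2}
\qquad \text{(Geman--McClure, one common redescending choice)}
```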

Influence Functions and Redescending Estimators. GO TO THE BOARD.

Reweighted Least-Squares. Robust minimization (over all pixels), with an outlier measure based on the residual normal flow.
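The slide's formulas are images; a standard iteratively reweighted least-squares form, in which each pixel's weight comes from the influence function psi = rho' and small weights flag outliers:

```latex
\min_{\mathbf a}\; \sum_{x,y} w(x,y)\,\bigl(I_x u(\mathbf a) + I_y v(\mathbf a) + I_t\bigr)^2,
\qquad
w = \frac{\psi(r)}{r}\bigg|_{\,r \,=\, I_x u + I_y v + I_t}
```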

Outlier rejection. [Figure panels: Original, Outliers, Synthesized.]

Locking Property. [Figure panels: Original, Enhanced.]

Dense 3D Reconstruction (Plane+Parallax): the "block sequence" [Kumar-Anandan-Hanna '94]. [Figure panels: original sequence, plane-aligned sequence, recovered shape.]

Original sequence Plane-aligned sequence Recovered shape

Dense 3D Reconstruction. The Brightness Constancy constraint and the epipolar line (through the epipole): the intersection of the two line constraints uniquely defines the displacement.

Limitations Limited search range (up to 10% of the image size). Brightness constancy.

Multi-sensor Alignment Originals: IR and EO images

Direct Correlation Based Alignment

Example: Affine Motion. Substituting into the B.C. equation, each pixel provides 1 linear constraint in 6 global unknowns, with coefficient vector (I_x, xI_x, yI_x, I_y, xI_y, yI_y) (minimum 6 pixels necessary).

The Brightness Constraint. Brightness Constancy Equation; linearizing J (assuming small (u,v)). In practice (although not necessary) a further simplifying assumption is made.

Motion Models. The motion vector (u,v) has two unknowns (at each pixel), while the Brightness Constraint provides one equation. One approach is to use a regularizer (e.g., Horn and Schunck). Alternative: a global motion model constraint, e.g., affine, quadratic, or projective (planar) transform (2D models); instantaneous camera motion, homography+epipole, Plane+Parallax, etc. (3D models).
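For reference, the classic Horn-Schunck formulation adds a smoothness penalty on the flow field to the brightness constraint (lambda weights the regularizer):

```latex
E(u,v) = \iint \Bigl[\bigl(I_x u + I_y v + I_t\bigr)^2
 + \lambda\bigl(\|\nabla u\|^2 + \|\nabla v\|^2\bigr)\Bigr]\,dx\,dy
```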

Coarse-to-Fine Estimation. [Figure: pyramids are constructed for images I and J; at each level, J is warped toward I (J_w), the estimate is refined, and the result is propagated to the next finer level.]

Affine Motion Estimation. Each pixel provides one constraint; least-squares estimation: minimize the summed squared constraint residual.

Coarse-to-Fine Estimation. Problem: brightness linearization assumes small (u,v). Solution: iterative refinement and coarse-to-fine estimation within a pyramid.