Digital Visual Effects Yung-Yu Chuang


Motion estimation. Digital Visual Effects, Yung-Yu Chuang, with slides by Michael Black and P. Anandan.

Motion estimation Parametric motion (image alignment) Tracking Optical flow

Parametric motion: direct method for image stitching.

Tracking

Optical flow

Three assumptions Brightness consistency Spatial coherence Temporal persistence

Brightness consistency: image measurements (e.g. brightness) in a small region remain the same although their locations may change.

Spatial coherence Neighboring points in the scene typically belong to the same surface and hence typically have similar motions. Since they also project to nearby pixels in the image, we expect spatial coherence in image flow.

Temporal persistence The image motion of a surface patch changes gradually over time.

Image registration. Goal: register a template image T(x) and an input image I(x), where x = (x, y)^T; that is, warp I so that it matches T (I is warped, T stays fixed). Image alignment: I(x) and T(x) are two images. Tracking: T(x) is a small patch around a point p in the image at time t, and I(x) is the image at time t+1. Optical flow: T(x) and I(x) are patches of the images at t and t+1.

Simple approach (for translation): minimize the brightness difference E(u, v) = Σ_{x,y} [I(x+u, y+v) - T(x, y)]^2.

Simple SSD algorithm: for each offset (u, v), compute E(u, v); choose the (u, v) that minimizes E(u, v). Problems: not efficient; no sub-pixel accuracy.
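A minimal sketch of this brute-force SSD search, assuming NumPy; the template T, search image I, and search range are illustrative:

import numpy as np

def ssd_search(T, I, max_offset=16):
    # Brute-force SSD: try every integer offset (u, v) and keep the best one.
    # T: template patch, I: larger search image (both 2D float arrays).
    th, tw = T.shape
    best, best_uv = np.inf, (0, 0)
    for v in range(max_offset + 1):
        for u in range(max_offset + 1):
            window = I[v:v + th, u:u + tw]
            if window.shape != T.shape:
                continue
            E = np.sum((window - T) ** 2)  # E(u, v)
            if E < best:
                best, best_uv = E, (u, v)
    return best_uv  # integer offsets only: no sub-pixel accuracy

As the slide notes, the cost grows with the number of offsets times the patch size, and the answer is limited to integer offsets, which motivates the gradient-based Lucas-Kanade approach that follows.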

Lucas-Kanade algorithm

Newton's method. Root finding for f(x) = 0. Naive approach: march x along the axis and test for sign changes. How should the step Δx be chosen? (too small → slow; too large → may miss the root)

Newton’s method Root finding for f(x)=0

Newton's method. Root finding for f(x) = 0. Taylor expansion: f(x + Δx) ≈ f(x) + f'(x) Δx = 0, which gives Δx = -f(x) / f'(x).

Newton's method. Root finding for f(x) = 0 (figure: successive iterates x0, x1, x2 converging toward the root).

Newton's method. Pick an initial x = x0; iterate: compute Δx = -f(x)/f'(x) and update x by x + Δx; until convergence. Root finding is useful for optimization because minimizing g(x) amounts to finding a root of f(x) = g'(x) = 0.
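A small sketch of the iteration above; the tolerance and iteration cap are arbitrary choices:

def newton(f, fprime, x0, tol=1e-10, max_iter=50):
    # Newton's method for f(x) = 0: repeatedly apply x <- x - f(x)/f'(x).
    x = x0
    for _ in range(max_iter):
        dx = -f(x) / fprime(x)
        x += dx
        if abs(dx) < tol:
            break
    return x

# Example: minimize g(x) = (x - 3)^2 by finding the root of g'(x) = 2(x - 3).
root = newton(lambda x: 2 * (x - 3), lambda x: 2.0, x0=0.0)  # converges to 3.0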

Lucas-Kanade algorithm. Goal: minimize E(u, v) = Σ_{x,y} [I(x+u, y+v) - T(x, y)]^2 with a gradient-based (Gauss-Newton) iteration rather than exhaustive search.

Lucas-Kanade algorithm. Linearize about the current estimate: I(x+u+Δu, y+v+Δv) ≈ I(x+u, y+v) + I_x Δu + I_y Δv. Setting the derivative of the linearized E to zero yields a 2x2 linear system for (Δu, Δv).

Lucas-Kanade algorithm. Iterate: shift I(x, y) by the current (u, v); compute the gradient images I_x, I_y; compute the error image T(x, y) - I(x, y); compute the (approximated) Hessian matrix; solve the linear system for (Δu, Δv); update (u, v) by (u, v) + (Δu, Δv); until convergence.
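A sketch of this translational Lucas-Kanade iteration, assuming NumPy and SciPy (not the lecture's own code; the convergence threshold and iteration count are illustrative):

import numpy as np
from scipy import ndimage

def lucas_kanade_translation(T, I, num_iters=20):
    # Estimate a translation (u, v) such that I(x+u, y+v) ≈ T(x, y),
    # following the iteration on the slide above.
    u, v = 0.0, 0.0
    ys, xs = np.mgrid[0:T.shape[0], 0:T.shape[1]].astype(float)
    Iy, Ix = np.gradient(I)
    for _ in range(num_iters):
        # shift I (and its gradients) by the current (u, v)
        Iw  = ndimage.map_coordinates(I,  [ys + v, xs + u], order=1)
        Ixw = ndimage.map_coordinates(Ix, [ys + v, xs + u], order=1)
        Iyw = ndimage.map_coordinates(Iy, [ys + v, xs + u], order=1)
        err = T - Iw                                   # error image
        H = np.array([[np.sum(Ixw * Ixw), np.sum(Ixw * Iyw)],
                      [np.sum(Ixw * Iyw), np.sum(Iyw * Iyw)]])   # (approximated) Hessian
        b = np.array([np.sum(Ixw * err), np.sum(Iyw * err)])
        du, dv = np.linalg.solve(H, b)                 # solve the 2x2 linear system
        u, v = u + du, v + dv                          # update (u, v)
        if du * du + dv * dv < 1e-8:                   # until convergence
            break
    return u, v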

Parametric model (e.g. translation, affine). Our goal is to find the warp parameters p that minimize E(p) = Σ_x [I(W(x; p)) - T(x)]^2 for all x in T's domain.

Parametric model. At each iteration, minimize Σ_x [I(W(x; p + Δp)) - T(x)]^2 with respect to the parameter increment Δp.

Parametric model. First-order Taylor expansion: I(W(x; p + Δp)) ≈ I(W(x; p)) + ∇I (∂W/∂p) Δp, where I(W(x; p)) is the warped image, T(x) is the target image, ∇I is the image gradient, and ∂W/∂p is the Jacobian of the warp.

Jacobian matrix The Jacobian matrix is the matrix of all first-order partial derivatives of a vector-valued function.

Jacobian matrix. For a function F: R^n → R^m, the Jacobian is the m×n matrix with entries J_ij = ∂F_i/∂x_j.

Parametric model. Setting the derivative of the linearized error to zero gives Δp = H^{-1} Σ_x [∇I (∂W/∂p)]^T [T(x) - I(W(x; p))].

Jacobian of the warp. For example, for an affine warp with parameters p = (d_xx, d_yx, d_xy, d_yy, d_x, d_y), W(x; p) = ((1 + d_xx) x + d_xy y + d_x, d_yx x + (1 + d_yy) y + d_y), so ∂W/∂p = [[x, 0, y, 0, 1, 0], [0, x, 0, y, 0, 1]].

Parametric model. (Approximated) Hessian: H = Σ_x [∇I (∂W/∂p)]^T [∇I (∂W/∂p)].

Lucas-Kanade algorithm. Iterate: warp I with W(x; p); compute the error image T(x, y) - I(W(x; p)); compute the gradient image ∇I at W(x; p); evaluate the Jacobian ∂W/∂p at (x; p); compute the steepest-descent images ∇I (∂W/∂p); compute the Hessian H; solve for Δp; update p by p + Δp; until convergence.
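A sketch of one Gauss-Newton step of this parametric (affine) iteration, assuming NumPy/SciPy; the six-parameter layout mirrors the affine Jacobian above but is otherwise an illustrative choice, not the lecture's code:

import numpy as np
from scipy import ndimage

def lk_affine_step(T, I, p):
    # One forward-additive Lucas-Kanade step for an affine warp with
    # p = (d_xx, d_yx, d_xy, d_yy, d_x, d_y); T, I are 2D float images.
    h, w = T.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(float)
    xw = (1 + p[0]) * xs + p[2] * ys + p[4]              # warped x coordinates
    yw = p[1] * xs + (1 + p[3]) * ys + p[5]              # warped y coordinates
    Iw = ndimage.map_coordinates(I, [yw, xw], order=1)   # warp I with W(x; p)
    Iy, Ix = np.gradient(I)
    Ixw = ndimage.map_coordinates(Ix, [yw, xw], order=1)
    Iyw = ndimage.map_coordinates(Iy, [yw, xw], order=1)
    err = T - Iw                                         # error image
    # steepest-descent images: grad(I) * dW/dp with dW/dp = [[x,0,y,0,1,0],[0,x,0,y,0,1]]
    sd = np.stack([Ixw * xs, Iyw * xs, Ixw * ys, Iyw * ys, Ixw, Iyw], axis=-1)
    H = np.einsum('ijk,ijl->kl', sd, sd)                 # (approximated) Hessian, 6x6
    b = np.einsum('ijk,ij->k', sd, err)
    dp = np.linalg.solve(H, b)                           # solve for the increment
    return p + dp                                        # update p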

Coarse-to-fine strategy (figure): build image pyramids for I and J; starting at the coarsest level, warp J toward I using the current estimate (Jw), refine the motion, and propagate the refined estimate down to the next finer level.
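A minimal coarse-to-fine wrapper around the lucas_kanade_translation sketch above; the factor-of-two pyramid and three levels are assumptions:

from scipy import ndimage

def coarse_to_fine_translation(T, I, num_levels=3):
    # Build pyramids: each level is half the resolution of the one below.
    pyr_T, pyr_I = [T], [I]
    for _ in range(num_levels - 1):
        pyr_T.append(ndimage.zoom(pyr_T[-1], 0.5))
        pyr_I.append(ndimage.zoom(pyr_I[-1], 0.5))
    u = v = 0.0
    for lvl in reversed(range(num_levels)):                      # coarsest level first
        u, v = 2 * u, 2 * v                                      # scale estimate up to this level
        warped = ndimage.shift(pyr_I[lvl], (-v, -u), order=1)    # warp I by current estimate
        du, dv = lucas_kanade_translation(pyr_T[lvl], warped)    # refine at this level
        u, v = u + du, v + dv
    return u, v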

Application of image alignment

Direct vs feature-based. Direct methods use all of the image information and can be very accurate, but they depend on the fragile brightness constancy assumption, their iterative optimization requires initialization, and they are not robust to illumination changes and noisy images. In the early days, direct methods worked better. Feature-based methods are now more robust and potentially faster; even better, they can recognize panoramas without initialization.

Tracking

Tracking. A patch at I(x, y, t) moves by (u, v) to I(x+u, y+v, t+1).

Tracking. Brightness constancy: I(x, y, t) = I(x+u, y+v, t+1); a first-order Taylor expansion turns this into the optical flow constraint equation.

Optical flow constraint equation: I_x u + I_y v + I_t = 0 (one equation, two unknowns per pixel).

Multiple constraints

Area-based method Assume spatial smoothness

Area-based method. Assume spatial smoothness: all pixels in a small window share the same (u, v), so minimize Σ_window (I_x u + I_y v + I_t)^2, which leads to the 2x2 linear system [[ΣI_x^2, ΣI_xI_y], [ΣI_xI_y, ΣI_y^2]] (u, v)^T = -(ΣI_xI_t, ΣI_yI_t)^T.

Area-based method. The 2x2 matrix M = Σ ∇I ∇I^T must be invertible for the flow to be uniquely determined.

Area-based method The eigenvalues tell us about the local image structure. They also tell us how well we can estimate the flow in both directions. Link to Harris corner detector.
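A small sketch of this eigenvalue test on a patch's gradients; the threshold is an arbitrary illustrative value:

import numpy as np

def local_flow_quality(Ix, Iy, min_eig=0.01):
    # Structure tensor of a patch; its eigenvalues say how well flow is constrained.
    M = np.array([[np.sum(Ix * Ix), np.sum(Ix * Iy)],
                  [np.sum(Ix * Iy), np.sum(Iy * Iy)]])
    lam_small, lam_large = np.linalg.eigvalsh(M)        # ascending order
    if lam_small > min_eig:
        return "textured area: both flow components well constrained"
    elif lam_large > min_eig:
        return "edge: only the normal component is constrained (aperture problem)"
    else:
        return "homogeneous area: flow cannot be estimated"

The same smaller-eigenvalue criterion is what the Harris and Shi-Tomasi corner detectors threshold, which is the link mentioned on the slide.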

Textured area

Edge

Homogeneous area

KLT tracking. Select features by the smaller eigenvalue of the structure tensor (good features to track); monitor features by measuring dissimilarity between frames.
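A usage sketch of this select-then-track pipeline with OpenCV's implementation (not the lecture's KLT code; file names and parameter values are placeholders):

import cv2

prev = cv2.imread("frame0.png", cv2.IMREAD_GRAYSCALE)
curr = cv2.imread("frame1.png", cv2.IMREAD_GRAYSCALE)

# Select features: corners whose smaller structure-tensor eigenvalue is large (Shi-Tomasi).
pts = cv2.goodFeaturesToTrack(prev, maxCorners=500, qualityLevel=0.01, minDistance=7)

# Track them with pyramidal Lucas-Kanade; status flags features that were lost.
new_pts, status, err = cv2.calcOpticalFlowPyrLK(prev, curr, pts, None,
                                                winSize=(21, 21), maxLevel=3)
good_old = pts[status.ravel() == 1]
good_new = new_pts[status.ravel() == 1]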

Aperture problem

Aperture problem

Aperture problem

Demo for aperture problem http://www.sandlotscience.com/Distortions/Breathing_Square.htm http://www.sandlotscience.com/Ambiguous/Barberpole_Illusion.htm

Aperture problem. A larger window reduces the ambiguity, but it more easily violates the spatial smoothness assumption.

KLT tracking http://www.ces.clemson.edu/~stb/klt/

KLT tracking http://www.ces.clemson.edu/~stb/klt/

SIFT tracking (matching actually) Frame 0  Frame 10

SIFT tracking Frame 0  Frame 100

SIFT tracking Frame 0  Frame 200

KLT vs SIFT tracking. KLT has a larger accumulated error, partly because our KLT implementation does not use an affine transformation? SIFT is surprisingly robust. A combination of SIFT and KLT (example): http://www.frc.ri.cmu.edu/projects/buzzard/smalls/

Rotoscoping (invented by Max Fleischer, 1914); example frame from 1937.

Tracking for rotoscoping

Tracking for rotoscoping

Waking life (2001)

A Scanner Darkly (2006) was made with Rotoshop, a proprietary piece of software. Each minute of animation required 500 hours of work.

Optical flow

Single-motion assumption. Violated by: motion discontinuities, shadows, transparency, specular reflection, …

Multiple motion

Multiple motion

Simple problem: fit a line

Least-square fit

Least-square fit

Robust statistics Recover the best fit for the majority of the data Detect and reject outliers

Approach

Robust weighting: truncated quadratic (penalty rho and its derivative psi).

Robust weighting Geman & McClure

Robust estimation. Replace the quadratic penalty with a robust rho(·, σ); the influence function psi = ∂rho/∂x limits how much any single (outlier) residual contributes.
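A sketch of this robust-estimation idea applied to the earlier line-fitting example, using the Geman-McClure penalty from the previous slide; the sigma value and the iteratively reweighted least-squares scheme are illustrative choices:

import numpy as np

def geman_mcclure_rho(r, sigma):
    # Geman-McClure penalty: rho(r) = r^2 / (sigma^2 + r^2), bounded for large residuals.
    return r**2 / (sigma**2 + r**2)

def geman_mcclure_weight(r, sigma):
    # IRLS weight w(r) = psi(r)/r = 2*sigma^2 / (sigma^2 + r^2)^2: outliers get tiny weight.
    return 2 * sigma**2 / (sigma**2 + r**2)**2

def robust_line_fit(x, y, sigma=1.0, num_iters=20):
    # Fit y = a*x + b so the fit follows the majority of the data and rejects outliers.
    A = np.stack([x, np.ones_like(x)], axis=1)
    a, b = np.linalg.lstsq(A, y, rcond=None)[0]         # ordinary least squares to start
    for _ in range(num_iters):
        r = y - (a * x + b)
        sw = np.sqrt(geman_mcclure_weight(r, sigma))    # reweight by the robust influence
        a, b = np.linalg.lstsq(A * sw[:, None], y * sw, rcond=None)[0]
    return a, b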

Fragmented occlusion

Regularization and dense optical flow Neighboring points in the scene typically belong to the same surface and hence typically have similar motions. Since they also project to nearby pixels in the image, we expect spatial coherence in image flow.
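The lecture's robust, piecewise-smooth formulation (Black and Anandan) is more involved; as a minimal illustration of regularized dense flow, here is a sketch of the classic Horn-Schunck iteration, which couples the brightness constraint with a quadratic smoothness term (alpha and the iteration count are illustrative):

import numpy as np
from scipy import ndimage

def horn_schunck(I1, I2, alpha=10.0, num_iters=200):
    # Dense flow with a quadratic smoothness (regularization) term, Horn-Schunck style.
    # I1, I2: grayscale float images at t and t+1; alpha controls how strongly
    # neighboring flow vectors are pulled together.
    Iy, Ix = np.gradient(I1)
    It = I2 - I1
    u = np.zeros_like(I1)
    v = np.zeros_like(I1)
    k = np.array([[0, 0.25, 0], [0.25, 0, 0.25], [0, 0.25, 0]])   # local flow average
    for _ in range(num_iters):
        u_avg = ndimage.convolve(u, k)
        v_avg = ndimage.convolve(v, k)
        # Closed-form update from the Euler-Lagrange equations of the coupled energy.
        num = Ix * u_avg + Iy * v_avg + It
        den = alpha**2 + Ix**2 + Iy**2
        u = u_avg - Ix * num / den
        v = v_avg - Iy * num / den
    return u, v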

(Figure: input image and the estimated horizontal and vertical motion fields.)

Application of optical flow: video matching.

Input for the NPR algorithm

Brushes

Edge clipping

Gradient

Smooth gradient

Textured brush

Edge clipping

Temporal artifacts: frame-by-frame application of the NPR algorithm.

Temporal coherence

References
B.D. Lucas and T. Kanade, An Iterative Image Registration Technique with an Application to Stereo Vision, Proceedings of the 1981 DARPA Image Understanding Workshop, 1981, pp. 121-130.
J.R. Bergen, P. Anandan, K.J. Hanna and R. Hingorani, Hierarchical Model-Based Motion Estimation, ECCV 1992, pp. 237-252.
J. Shi and C. Tomasi, Good Features to Track, CVPR 1994, pp. 593-600.
M.J. Black and P. Anandan, The Robust Estimation of Multiple Motions: Parametric and Piecewise-Smooth Flow Fields, Computer Vision and Image Understanding, 1996, pp. 75-104.
S. Baker and I. Matthews, Lucas-Kanade 20 Years On: A Unifying Framework, International Journal of Computer Vision, 56(3), 2004, pp. 221-255.
P. Litwinowicz, Processing Images and Video for an Impressionist Effect, SIGGRAPH 1997.
A. Agarwala, A. Hertzmann, D. Salesin and S. Seitz, Keyframe-Based Tracking for Rotoscoping and Animation, SIGGRAPH 2004, pp. 584-591.