Motion and optical flow


Motion and optical flow Thurs Feb 2, 2017 Kristen Grauman UT Austin

Announcements A1 due tomorrow (Friday). Due to AAAI travel, office hours Tues Feb 7 are cancelled (available by appointment). Lecture Tues is ON as normal.

Last time Texture is a useful property that is often indicative of materials and other appearance cues. Texture representations attempt to summarize repeating patterns of local structure. Filter banks are useful to measure a redundant variety of structures in a local neighborhood, and the resulting feature spaces can be multi-dimensional. Neighborhood statistics can be exploited to “sample” or synthesize new texture regions (an example-based technique).

Today
Optical flow: estimating motion in video
Background subtraction

Video A video is a sequence of frames captured over time. Now our image data is a function of space (x, y) and time (t).

Uses of motion
Estimating 3D structure
Segmenting objects based on motion cues
Learning dynamical models
Recognizing events and activities
Improving video quality (motion stabilization)

Motion field The motion field is the projection of the 3D scene motion into the image

Motion parallax http://psych.hanover.edu/KRANTZ/MotionParallax/MotionParallax.html

Motion field + camera motion The length of the flow vectors is inversely proportional to the depth Z of the 3D point: points closer to the camera move more quickly across the image plane. Figure from Michael Black, Ph.D. Thesis
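One standard way to make this precise (not shown on the slide, but consistent with it): for a camera translating with velocity (T_x, T_y, T_z) and no rotation, a point at depth Z projecting to image location (x, y) with focal length f moves with image velocity

```latex
u \;=\; \frac{x\,T_z - f\,T_x}{Z}, \qquad
v \;=\; \frac{y\,T_z - f\,T_y}{Z}
```

so for a given camera motion the flow magnitude scales as 1/Z.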

Motion field + camera motion Example flow fields (figure panels): zoom out, zoom in, pan right to left.

Motion estimation techniques
Direct methods: directly recover image motion at each pixel from spatio-temporal image brightness variations. Dense motion fields, but sensitive to appearance variations. Suitable for video and when image motion is small.
Feature-based methods: extract visual features (corners, textured areas) and track them over multiple frames. Sparse motion fields, but more robust tracking. Suitable when image motion is large (10s of pixels).

Optical flow Definition: optical flow is the apparent motion of brightness patterns in the image. Ideally, optical flow would be the same as the motion field. Have to be careful: apparent motion can be caused by lighting changes without any actual motion.

Apparent motion != motion field Figure from Horn book

Problem definition: optical flow How to estimate pixel motion from image H to image I? Solve the pixel correspondence problem: given a pixel in H, look for nearby pixels of the same color in I.
Key assumptions:
Color constancy: a point in H looks the same in I. For grayscale images, this is brightness constancy.
Small motion: points do not move very far.
This is called the optical flow problem. Slide credit: Steve Seitz

Brightness constancy Figure by Michael Black

Optical flow constraints Let's look at these constraints more closely: brightness constancy (Q: what's the equation?) and small motion. Slide credit: Steve Seitz
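The two constraints can be written out explicitly; these are the standard forms, reconstructed here because the slide's equations were images and did not survive the transcript:

```latex
% Brightness constancy: a point keeps its brightness as it moves
I(x+u,\; y+v,\; t+1) \;=\; I(x,\; y,\; t)

% Small motion: the displacement (u, v) is small enough that a
% first-order Taylor expansion of I around (x, y) is accurate
I(x+u,\; y+v,\; t+1) \;\approx\; I(x,\; y,\; t+1) \;+\; I_x\,u \;+\; I_y\,v
```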

Optical flow equation Combining these two equations gives the constraint below. Slide credit: Steve Seitz
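Substituting the linearization into the brightness constancy equation (again reconstructed, since the slide showed it as an image):

```latex
0 \;=\; I(x+u,\; y+v,\; t+1) \;-\; I(x,\; y,\; t)
  \;\approx\; I_x\,u \;+\; I_y\,v \;+\; I_t
\quad\Longrightarrow\quad
\nabla I \cdot \begin{bmatrix} u \\ v \end{bmatrix} \;+\; I_t \;=\; 0
```

where I_t = I(x, y, t+1) - I(x, y, t) is the temporal derivative.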

Optical flow equation Q: how many unknowns and equations per pixel? A: two unknowns (u, v) but only one equation, so the constraint alone cannot determine the flow. Slide credit: Steve Seitz

The aperture problem Perceived motion

The aperture problem Actual motion

The barber pole illusion http://en.wikipedia.org/wiki/Barberpole_illusion

Solving the aperture problem How to get more equations for a pixel? Spatial coherence constraint: pretend the pixel’s neighbors have the same (u,v) Figure by Michael Black

Solving the aperture problem How to get more equations for a pixel? Spatial coherence constraint: pretend the pixel's neighbors have the same (u,v). If we use a 5x5 window, that gives us 25 equations per pixel (written out below). Slide credit: Steve Seitz
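Stacking the 25 constraints for the window pixels p_1, ..., p_25 gives an overdetermined linear system (reconstructed here; the slide's matrix was an image):

```latex
\underbrace{\begin{bmatrix}
I_x(\mathbf{p}_1) & I_y(\mathbf{p}_1)\\
\vdots & \vdots\\
I_x(\mathbf{p}_{25}) & I_y(\mathbf{p}_{25})
\end{bmatrix}}_{A}
\begin{bmatrix} u \\ v \end{bmatrix}
\;=\;
\underbrace{-\begin{bmatrix}
I_t(\mathbf{p}_1)\\ \vdots\\ I_t(\mathbf{p}_{25})
\end{bmatrix}}_{b}
```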

Solving the aperture problem Problem: we have more equations than unknowns. Solution: solve a least squares problem; the minimum least squares solution is given by the solution (in d) of the normal equations A^T A d = A^T b, where the summations are over all pixels in the K x K window. This technique was first proposed by Lucas & Kanade (1981). Slide credit: Steve Seitz
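A minimal NumPy sketch of this per-window least-squares solve (the gradient array names Ix, Iy, It are assumptions; the slides do not give an implementation):

```python
import numpy as np

def lucas_kanade_window(Ix, Iy, It):
    """Estimate the flow d = (u, v) for a single K x K window.

    Ix, Iy, It: K x K arrays of spatial and temporal derivatives
    of the image inside the window.
    """
    # Stack the per-pixel constraints into A (K^2 x 2) and b (K^2,)
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)
    b = -It.ravel()

    # Normal equations: (A^T A) d = A^T b
    AtA = A.T @ A
    Atb = A.T @ b

    # Conditioning check (see the next slides): both eigenvalues of
    # A^T A should be reasonably large and of similar magnitude.
    lam_small, lam_large = np.linalg.eigvalsh(AtA)  # ascending order
    if lam_small < 1e-6:
        return np.zeros(2)  # untextured or edge-like window: flow is ambiguous

    return np.linalg.solve(AtA, Atb)  # d = (u, v)
```

In practice the images are smoothed first and the estimate is refined coarse-to-fine over an image pyramid, so that the small-motion assumption holds.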

Conditions for solvability When is this solvable?
A^T A should be invertible.
A^T A should not be too small: the eigenvalues λ1 and λ2 of A^T A should not be very small.
A^T A should be well-conditioned: λ1/λ2 should not be too large (λ1 = larger eigenvalue).
Slide by Steve Seitz, UW

Edge: gradients very large or very small; large λ1, small λ2. Slide credit: Steve Seitz

Low-texture region: gradients have small magnitude; small λ1, small λ2. Slide credit: Steve Seitz

High-texture region: gradients are different, with large magnitudes; large λ1, large λ2. Slide credit: Steve Seitz

Example use of optical flow: facial animation http://www.fxguide.com/article333.html

Example use of optical flow: Motion Paint Use optical flow to track brush strokes, in order to animate them to follow underlying scene motion. http://www.fxguide.com/article333.html

Motion estimation techniques
Direct methods: directly recover image motion at each pixel from spatio-temporal image brightness variations. Dense motion fields, but sensitive to appearance variations. Suitable for video and when image motion is small.
Feature-based methods: extract visual features (corners, textured areas) and track them over multiple frames. Sparse motion fields, but more robust tracking. Suitable when image motion is large (10s of pixels).

Motion magnification Liu et al. SIGGRAPH 2005

Fun with flow https://www.youtube.com/watch?v=3YE5tff8pqg http://www.youtube.com/watch?v=TbJrc6QCeU0&feature=related http://www.youtube.com/watch?v=pckFacsIWg4

Today
Optical flow: estimating motion in video
Background subtraction

Video as an “Image Stack” Can look at video data as a spatio-temporal volume (intensity as a function of x, y, and time t). If the camera is stationary, each line through time corresponds to a single ray in space. Alyosha Efros, CMU
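A minimal sketch of this view, assuming OpenCV is available and "video.mp4" is a hypothetical input file: the frames are stacked into a (T, H, W) volume, a fixed (row, col) slice is one ray through time, and the median over the time axis gives the static-background image used on the following slides.

```python
import cv2
import numpy as np

cap = cv2.VideoCapture("video.mp4")  # hypothetical input file
frames = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    frames.append(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
cap.release()

volume = np.stack(frames)        # shape (T, H, W): time t, then space (y, x)
ray = volume[:, 100, 200]        # one pixel over time: a single ray if the camera is static
background = np.median(volume, axis=0).astype(np.uint8)  # median image
```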

Input Video Alyosha Efros, CMU

Average Image Alyosha Efros, CMU

Slide credit: Birgi Tamersoy

Background subtraction Simple techniques can do OK with a static camera, but it is hard to do perfectly. Widely used: traffic monitoring (counting vehicles, detecting and tracking vehicles and pedestrians), human action recognition (run, walk, jump, squat), human-computer interaction, object tracking.

Slide credit: Birgi Tamersoy

Slide credit: Birgi Tamersoy

Slide credit: Birgi Tamersoy

Slide credit: Birgi Tamersoy

Frame differences vs. background subtraction Toyama et al. 1999

Slide credit: Birgi Tamersoy

Average/Median Image Alyosha Efros, CMU

Background Subtraction: frame - background = foreground difference. Alyosha Efros, CMU

Pros and cons
Advantages: extremely easy to implement and use; all pretty fast; the background models need not be constant, they can change over time.
Disadvantages: the accuracy of frame differencing depends on object speed and frame rate; the median background model has relatively high memory requirements; a global threshold Th must be set.
When will this basic approach fail?
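Continuing the earlier sketch (the median background image and the threshold name Th come from the slides above; the code itself is an assumption, not the course's implementation):

```python
import cv2
import numpy as np

Th = 30  # hypothetical global threshold on absolute intensity difference

def subtract_background(frame_gray, background, Th):
    """Background subtraction: threshold |frame - background|."""
    diff = cv2.absdiff(frame_gray, background)
    _, mask = cv2.threshold(diff, Th, 255, cv2.THRESH_BINARY)
    return mask  # 255 where the pixel is likely foreground

def frame_difference(prev_gray, curr_gray, Th):
    """Frame differencing: compare consecutive frames instead of a background model."""
    diff = cv2.absdiff(prev_gray, curr_gray)
    return (diff > Th).astype(np.uint8) * 255
```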

Background mixture models Idea: model each background pixel with a mixture of Gaussians; update its parameters over time. Adaptive Background Mixture Models for Real-Time Tracking, 1999, Chris Stauffer & W.E.L. Grimson
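OpenCV ships a background subtractor in this family of methods (MOG2); a minimal usage sketch, not part of the original slides ("traffic.mp4" is a hypothetical input):

```python
import cv2

subtractor = cv2.createBackgroundSubtractorMOG2(
    history=500, varThreshold=16, detectShadows=True)

cap = cv2.VideoCapture("traffic.mp4")  # hypothetical video
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # apply() updates each pixel's Gaussian mixture and returns a foreground mask
    fg_mask = subtractor.apply(frame)
cap.release()
```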

So far: features and filters Transforming images; gradients, textures, edges, flow Slide credit: Kristen Grauman