COMP 9517 Computer Vision: Motion

Introduction
A changing scene may be observed via a sequence of images.

Introduction
Changes in an image sequence provide features for:
- detecting objects that are moving
- computing their trajectories
- computing the motion of the viewer in the world
- recognising objects based on their behaviours
- detecting and recognising activities

Applications
- Motion-based recognition: human identification based on gait, automatic object detection, etc.
- Automated surveillance: monitoring a scene to detect suspicious activities or unlikely events
- Video indexing: automatic annotation and retrieval of videos in multimedia databases
- Human-computer interaction: gesture recognition, eye-gaze tracking for data input to computers, etc.
- Traffic monitoring: real-time gathering of traffic statistics to direct traffic flow
- Vehicle navigation: video-based path planning and obstacle-avoidance capabilities

Motion Phenomena
- Still camera, single moving object, constant background
- Still camera, several moving objects, constant background
- Moving camera, relatively constant scene
- Moving camera, several moving objects

Image Subtraction
Detecting an object moving across a constant background:
- The forward and rear edges of the object advance only a few pixels per frame
- By subtracting the image I_t from the previous image I_{t-1}, these edges should be evident as the only pixels significantly different from zero

Image Subtraction: Change Detection
Input: images I_t and I_{t-Δ} (or a model image); an intensity threshold τ
Output: a binary image I_out; a set of bounding boxes B
1. For all pixels [r, c] in the input images, set I_out[r, c] = 1 if |I_t[r, c] - I_{t-Δ}[r, c]| > τ, and I_out[r, c] = 0 otherwise
2. Perform connected components extraction on I_out
3. Remove small regions, assuming they are noise
4. Perform a closing of I_out using a small disk to fuse neighbouring regions
5. Compute the bounding boxes of all remaining regions of changed pixels
6. Return I_out[r, c] and the bounding boxes B of the regions of changed pixels
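A minimal OpenCV/NumPy sketch of this change-detection procedure, assuming grayscale uint8 frames; the function name, threshold value and minimum region area are illustrative choices rather than values from the slides:

```python
import cv2
import numpy as np

def detect_change(img_t, img_prev, tau=25, min_area=50):
    """Change detection by image subtraction, following the procedure above (sketch)."""
    # Step 1: threshold the absolute frame difference
    diff = cv2.absdiff(img_t, img_prev)
    _, binary = cv2.threshold(diff, tau, 255, cv2.THRESH_BINARY)
    # Closing with a small disk fuses neighbouring regions
    disk = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    binary = cv2.morphologyEx(binary, cv2.MORPH_CLOSE, disk)
    # Step 2: connected components; small regions are assumed to be noise
    n, labels, stats, _ = cv2.connectedComponentsWithStats(binary)
    boxes = []
    for i in range(1, n):                      # label 0 is the background
        x, y, w, h, area = stats[i]
        if area >= min_area:
            boxes.append((x, y, w, h))
    return binary, boxes
```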

Image Subtraction
The steps:
1. Derive a background image from a set of video frames at the beginning of the video sequence

Image Subtraction
The steps (continued):
2. The background image is then subtracted from each subsequent frame to create a difference image

Image Subtraction
The steps (continued):
3. Enhance the difference image to fuse neighbouring regions and remove noise
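A small NumPy/OpenCV sketch of these three steps, assuming grayscale frames and using a median over the first frames as the background model; the frame count and threshold are illustrative assumptions:

```python
import cv2
import numpy as np

def subtract_background(frames, n_background=30, tau=25):
    """Median background model plus per-frame subtraction and clean-up (sketch)."""
    # Step 1: derive a background image from frames at the start of the sequence
    background = np.median(np.stack(frames[:n_background]), axis=0).astype(np.uint8)
    disk = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    masks = []
    for frame in frames[n_background:]:
        # Step 2: subtract the background image to create a difference image
        diff = cv2.absdiff(frame, background)
        _, mask = cv2.threshold(diff, tau, 255, cv2.THRESH_BINARY)
        # Step 3: enhance - close to fuse neighbouring regions, open to remove noise
        mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, disk)
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, disk)
        masks.append(mask)
    return background, masks
```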

Motion Vector
- Motion field: a 2-D array of 2-D vectors representing the motion of 3-D scene points
- A motion vector in the image represents the displacement of the image of a moving 3-D point
  - Tail at time t and head at time t+Δ
  - Instantaneous velocity estimate at time t

Motion Vector
Two assumptions:
- The intensity of a 3-D scene point and that of its neighbours remains nearly constant during the time interval
- The intensity differences observed along the images of the edges of objects are nearly constant during the time interval
Image flow: the motion field computed under the assumption that image intensity near corresponding points is relatively constant
Two methods for computing image flow:
- Sparse: point correspondence-based method
- Dense: spatial & temporal gradient-based method

Point Correspondence-based Method
A sparse motion field can be computed by identifying pairs of points that correspond in two images taken at times t_i and t_i+Δ
Two steps:
1. Detect interesting points
   - Corner points
   - Centroids of persistent moving regions
2. Search for corresponding points

Point Correspondence-based Method
Detect interesting points:
- Detect corner points
  - Kirsch edge operator
  - Frei-Chen ripple operator
- Interest operator
  - Computes intensity variance in the vertical, horizontal and diagonal directions
  - A pixel is an interest point if the minimum of these four variances exceeds a threshold

Point Correspondence-based Method
Finding interesting points of a given input image:

procedure detect_corner_points(I, V) {
  for (r = 0 to MaxRow - 1)
    for (c = 0 to MaxCol - 1)
      if (I[r,c] is a border pixel) break;
      else if (interest_operator(I, r, c, w) >= t)
        add [(r,c), (r,c)] to set V;
}

procedure interest_operator(I, r, c, w) {
  v1 = variance of intensity of horizontal pixels I[r,c-w] ... I[r,c+w];
  v2 = variance of intensity of vertical pixels I[r-w,c] ... I[r+w,c];
  v3 = variance of intensity of diagonal pixels I[r-w,c-w] ... I[r+w,c+w];
  v4 = variance of intensity of diagonal pixels I[r-w,c+w] ... I[r+w,c-w];
  return min(v1, v2, v3, v4);
}
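A NumPy sketch of the interest operator and corner detection above; the window half-width w and threshold t are illustrative defaults, not values from the slides:

```python
import numpy as np

def interest_operator(I, r, c, w):
    """Minimum intensity variance over the horizontal, vertical and two diagonal lines through (r, c)."""
    d = np.arange(-w, w + 1)
    v1 = I[r, c + d].var()        # horizontal
    v2 = I[r + d, c].var()        # vertical
    v3 = I[r + d, c + d].var()    # main diagonal
    v4 = I[r + d, c - d].var()    # anti-diagonal
    return min(v1, v2, v3, v4)

def detect_corner_points(I, w=2, t=50.0):
    """Collect pixels whose minimum directional variance exceeds the threshold t."""
    I = I.astype(np.float64)
    V = []
    rows, cols = I.shape
    for r in range(w, rows - w):          # skip border pixels
        for c in range(w, cols - w):
            if interest_operator(I, r, c, w) >= t:
                V.append(((r, c), (r, c)))
    return V
```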

Point Correspondence-based Method
Search for corresponding points:
Given an interesting point P_j from I1, we take its neighbourhood in I1 and find the best correlating neighbourhood in I2, under the assumption that the amount of movement is limited
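A sketch of this correlation search using normalised cross-correlation over a limited search window; the neighbourhood half-width and search radius are illustrative assumptions:

```python
import numpy as np

def find_correspondence(I1, I2, p, w=5, search=10):
    """Best-correlating neighbourhood in I2 for the point p detected in I1 (sketch)."""
    r, c = p
    template = I1[r - w:r + w + 1, c - w:c + w + 1].astype(np.float64)
    template = (template - template.mean()) / (template.std() + 1e-8)
    best_score, best_point = -np.inf, p
    for dr in range(-search, search + 1):           # limited-movement assumption
        for dc in range(-search, search + 1):
            rr, cc = r + dr, c + dc
            if rr - w < 0 or cc - w < 0 or rr + w >= I2.shape[0] or cc + w >= I2.shape[1]:
                continue                            # candidate window falls outside the image
            patch = I2[rr - w:rr + w + 1, cc - w:cc + w + 1].astype(np.float64)
            patch = (patch - patch.mean()) / (patch.std() + 1e-8)
            score = (template * patch).mean()       # normalised cross-correlation
            if score > best_score:
                best_score, best_point = score, (rr, cc)
    return best_point, best_score
```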

Spatial & Temporal Gradient-based Method
Assumptions:
- The object reflectivity and the illumination of the object do not change during the time interval
- The distance of the object from the camera or light sources does not vary significantly over this interval
- Each small intensity neighbourhood N_{x,y} at time t1 is observed in some shifted position N_{x+Δx,y+Δy} at time t2
These assumptions may not hold exactly in practice, but they provide a useful approximation for computation

Spatial & Temporal Gradient-based Method
Optical flow equation
Taylor series: f(x + Δx) = f(x) + f'(x)Δx + higher-order terms
Multivariable version:
f(x + Δx, y + Δy, t + Δt) = f(x, y, t) + (∂f/∂x)Δx + (∂f/∂y)Δy + (∂f/∂t)Δt + higher-order terms   (1)
Assuming the optical flow vector V = [Δx, Δy] carries the intensity neighbourhood N1(x, y) at time t1 to an identical intensity neighbourhood N2(x + Δx, y + Δy) at time t2 leads to
f(x + Δx, y + Δy, t + Δt) = f(x, y, t)   (2)

Spatial & Temporal Gradient-based Method
Optical flow equation
By combining (1) and (2) and ignoring the higher-order terms, a linear constraint can be derived:
(∂f/∂x)Δx + (∂f/∂y)Δy + (∂f/∂t)Δt = 0
Dividing by Δt gives the optical flow equation
(∂f/∂x)Vx + (∂f/∂y)Vy + ∂f/∂t = 0
where V = (Vx, Vy) is the velocity, or optical flow, of f(x, y, t)

Spatial & Temporal Gradient-based Method
The optical flow equation provides a constraint that can be applied at every pixel position
However, the optical flow equation alone does not give a unique solution, so further constraints are required
Example: applying the optical flow equation to a group of adjacent pixels and assuming that all of them have the same velocity reduces the optical flow computation to solving a linear system by the least-squares method
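A minimal NumPy sketch of this least-squares approach (essentially the Lucas-Kanade method) for a single pixel neighbourhood; the window size and the simple finite-difference derivatives are illustrative assumptions:

```python
import numpy as np

def lucas_kanade_at(I1, I2, r, c, w=7):
    """Least-squares optical flow for the (2w+1)x(2w+1) neighbourhood centred at (r, c)."""
    I1 = I1.astype(np.float64)
    I2 = I2.astype(np.float64)
    # Spatial and temporal derivatives (central differences and a frame difference)
    fy, fx = np.gradient(I1)          # np.gradient returns derivatives along rows, then columns
    ft = I2 - I1
    sl = (slice(r - w, r + w + 1), slice(c - w, c + w + 1))
    A = np.stack([fx[sl].ravel(), fy[sl].ravel()], axis=1)   # one constraint per pixel
    b = -ft[sl].ravel()
    # Solve A [Vx, Vy]^T = b in the least-squares sense
    v, *_ = np.linalg.lstsq(A, b, rcond=None)
    return v  # (Vx, Vy)
```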

References and Acknowledgements
- Shapiro and Stockman 2001, Chapter 9
- Forsyth and Ponce 2003, Chapter 5
- Szeliski 2010
Images drawn from the above references unless otherwise mentioned