Recovering observer motion

Analysis of Motion: Recovering observer motion

Recovering 3D observer motion & layout (FOE: focus of expansion)

Application: automated driving systems, e.g. the DARPA Grand Challenge. https://www.wired.com/story/darpa-grand-urban-challenge-self-driving-car/

Observer motion problem. From image motion, compute:
- observer translation (Tx, Ty, Tz)
- observer rotation (Rx, Ry, Rz)
- depth at every location Z(x, y)

Human perception of heading (Warren & colleagues). Task, illustrated in birds'-eye view: is the observer heading to the left or right of a target on the horizon? Human accuracy: 1°-2° of visual arc.

If the observer just translates, the directions of the velocity vectors intersect at the FOE, which is the heading point. But… this simple strategy doesn't work if the observer also rotates.

Observer translation + rotation: in one condition the display simulates observer translation while the observer rotates their eyes; in another, the display simulates translation + rotation directly. Observers still recover heading with high accuracy!

Observer motion problem, revisited. The observer undergoes both translation + rotation. From image motion, compute:
- observer translation (Tx, Ty, Tz)
- observer rotation (Rx, Ry, Rz)
- depth at every location Z(x, y)

Equations of observer motion
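The equations themselves are not reproduced in this transcript. For reference, the standard perspective motion-field equations of Longuet-Higgins & Prazdny (focal length 1, image coordinates x = X/Z, y = Y/Z), written in the (Tx, Ty, Tz), (Rx, Ry, Rz) notation above, are presumably what this slide shows:

V_x = \frac{-T_x + x T_z}{Z} + \left[ R_x\, x y - R_y (x^2 + 1) + R_z\, y \right]

V_y = \frac{-T_y + y T_z}{Z} + \left[ R_x (y^2 + 1) - R_y\, x y - R_z\, x \right]

Two properties matter for what follows: only the translational term depends on depth Z, and scaling the translation and all depths by the same factor leaves the field unchanged, so only the direction of translation (the heading) and relative depth can be recovered.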

Translational component of velocity. Where is the FOE? x = ?, y = ?
Example 1: Tx = Ty = 0, Tz = 1, Z = 10 everywhere. Vx = ?, Vy = ? Sketch the velocity field.
Example 2: Tx = Ty = 2, Tz = 1, Z = 10 everywhere.
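A worked solution under the standard equations above (not part of the original slide): the translational component is V_x = (-T_x + x T_z)/Z, V_y = (-T_y + y T_z)/Z, which vanishes at the FOE, (x, y) = (T_x/T_z, T_y/T_z).

Example 1: FOE at (0, 0); V_x = x/10, V_y = y/10, a radial expansion pattern centered on the image origin.

Example 2: FOE at (2, 2); V_x = (x - 2)/10, V_y = (y - 2)/10, the same expansion pattern radiating from (2, 2).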

Longuet-Higgins & Prazdny: along a depth discontinuity, velocity differences depend only on the observer's translation, and they point to the focus of expansion.
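A one-line check, assuming the motion-field equations above: two points on either side of the discontinuity share (nearly) the same image position (x, y) but have different depths Z_1 and Z_2. The rotational term depends only on (x, y), so it cancels in the difference:

\Delta V = \left( -T_x + x T_z,\; -T_y + y T_z \right) \left( \frac{1}{Z_1} - \frac{1}{Z_2} \right) = T_z \left( x - \frac{T_x}{T_z},\; y - \frac{T_y}{T_z} \right) \left( \frac{1}{Z_1} - \frac{1}{Z_2} \right),

a vector parallel to the line joining (x, y) to the FOE at (T_x/T_z, T_y/T_z).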

Rieger & Lawton's algorithm:
(1) At each image location, compute the distribution of velocity differences within a neighborhood. [Figure: appearance of sample distributions]
(2) Find points with a strongly oriented distribution and compute the dominant direction.
(3) Compute the focus of expansion from the intersection of the dominant directions.
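A minimal NumPy sketch of this procedure (not the authors' code; the function name, neighborhood radius, and anisotropy threshold are illustrative assumptions), taking a dense velocity field and returning an estimated FOE:

```python
import numpy as np

def estimate_foe(vx, vy, xs, ys, radius=3, anisotropy_thresh=10.0):
    """Estimate the focus of expansion from a dense velocity field.

    vx, vy : 2-D arrays of image velocities.
    xs, ys : 2-D arrays of the corresponding image coordinates.
    """
    rows, cols = vx.shape
    A, b = [], []                        # one line constraint per accepted point
    for i in range(radius, rows - radius):
        for j in range(radius, cols - radius):
            # (1) velocity differences within the neighborhood of (i, j)
            dvx = vx[i-radius:i+radius+1, j-radius:j+radius+1] - vx[i, j]
            dvy = vy[i-radius:i+radius+1, j-radius:j+radius+1] - vy[i, j]
            D = np.stack([dvx.ravel(), dvy.ravel()], axis=1)
            # (2) keep only strongly oriented difference distributions
            evals, evecs = np.linalg.eigh(D.T @ D)   # 2x2 scatter matrix
            if evals[1] < 1e-12 or evals[1] / max(evals[0], 1e-12) < anisotropy_thresh:
                continue
            d = evecs[:, 1]              # dominant direction of the differences
            n = np.array([-d[1], d[0]])  # normal to that direction
            # (3) the FOE lies on the line through (x, y) along d:  n . foe = n . p
            p = np.array([xs[i, j], ys[i, j]])
            A.append(n)
            b.append(n @ p)
    # least-squares intersection of all the dominant-direction lines
    foe, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return foe                           # estimated (x, y) of the FOE
```

Step (2) is implemented here by an eigen-decomposition of the 2x2 scatter matrix of local velocity differences: a strongly oriented distribution has one eigenvalue much larger than the other, and the corresponding eigenvector gives the dominant direction used in step (3).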