Lecture 10 Causal Estimation of 3D Structure and Motion


MASKS © 2004 Invitation to 3D vision

VISION as a SENSOR: enabling a machine to INTERACT with the environment.
- NEED to estimate relative 3D MOTION and 3D SHAPE (driven by the TASK)
- REAL-TIME, CAUSAL processing
- Representation of SHAPE (only supportive of the representation of motion): POINT-FEATURES

TRADEOFFS

                  SFM               CORRESPONDENCE
LARGE BASELINE    EASY              HARD/IMPOSSIBLE    (GLOBAL)
SMALL BASELINE    HARD/IMPOSSIBLE   EASY               (LOCAL)

WHAT DOES IT TAKE? INTEGRATE visual information OVER TIME. Beware OCCLUSIONS!

Setup and notation

State: 3D points X^i (static scene), pose g(t) in SE(3), velocity xi(t).
Measurements: projections y^i(t) = pi( g(t) X^i ) + n^i(t).

Temporal evolution (acceleration modeled as noise):
    g(t+1)   = exp( xi_hat(t) ) g(t)
    xi(t+1)  = xi(t) + w(t)
    X^i(t+1) = X^i(t)

Structure and motion as a filtering problem
- Given measurements of the "output" (feature point positions)
- Given modeling assumptions about the "input" (acceleration = noise)
- Estimate the "state" (3D structure, pose, velocity)
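As a concrete sketch of this state-space view, the snippet below writes the dynamics with the unknown acceleration absorbed into process noise and the output map as a calibrated perspective projection. This is illustrative only; the function names and the simple position/velocity state are assumptions, not the lecture's exact parameterization.

```python
import numpy as np

def f_dynamics(p, v, dt, accel_noise):
    """State transition: constant-velocity model where the unmeasured
    "input" (acceleration) enters as process noise."""
    p_next = p + v * dt          # position integrates velocity
    v_next = v + accel_noise     # velocity performs a random walk
    return p_next, v_next

def h_projection(X):
    """Output map: calibrated perspective projection of a 3D point."""
    return X[:2] / X[2]          # (X/Z, Y/Z) on the image plane

# With zero acceleration noise the velocity stays constant:
p, v = np.zeros(3), np.array([1.0, 0.0, 0.0])
p1, v1 = f_dynamics(p, v, dt=0.1, accel_noise=np.zeros(3))
```

The filter then alternates propagating (p, v) through `f_dynamics` and correcting against the measured projections from `h_projection`.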

Difficulties …
- Model is non-linear (output map = projection)
- State-space is non-linear (SE(3))
- Noise: need to specify what we mean by "estimate"
- Even without noise, the model is not observable!

Observability
- An equivalence class of state-space trajectories generates the same measurements (gauge transformation).
- Fix, e.g., the direction of 3 points and the depth of one point.
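The best-known member of this equivalence class is global scale: multiplying a point's coordinates (and, for a whole scene, the translation) by the same factor leaves every projection unchanged, which is why one depth must be fixed by convention. A toy illustration, assuming a calibrated perspective projection:

```python
import numpy as np

def project(X):
    """Calibrated perspective projection (X/Z, Y/Z)."""
    return X[:2] / X[2]

X = np.array([1.0, 2.0, 4.0])
alpha = 3.0
# The scaled point produces exactly the same measurement, so scale is
# unobservable and must be fixed (e.g., set the depth of one point to 1).
same = np.allclose(project(X), project(alpha * X))
```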

Local coordinatization of the state space

Minimal realization
Now it looks very much like a standard nonlinear state-space model:
    x(t+1) = f( x(t) ) + w(t)    (state: structure, pose, velocity)
    y(t)   = h( x(t) ) + n(t)    (output: projected feature points)
And we are looking for (a point statistic of) the conditional density p( x(t) | y(0), …, y(t) ).

EKF vs. particle filter?
- For a single rigid body / static scene, expect a unimodal posterior.
- No need to estimate the entire density; a point estimate suffices.
- A robust (M-) version of the EKF works well in practice …
- … and in real time for a few hundred feature points.
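A minimal sketch of what an M- (robust) EKF update could look like, assuming a diagonal measurement-noise covariance R. The Huber weighting and the noise-inflation scheme below are illustrative choices, not the lecture's exact filter:

```python
import numpy as np

def huber_weight(r, sigma, k=1.345):
    """Huber M-estimator weight: 1 for small residuals, k*sigma/|r| beyond."""
    a = abs(r) / sigma
    return 1.0 if a <= k else k / a

def robust_ekf_update(x, P, z, h, H, R):
    """EKF measurement update where outlying innovation components are
    down-weighted by inflating their (assumed diagonal) noise variance."""
    y = z - h(x)                                   # innovation
    S = H @ P @ H.T + R                            # innovation covariance
    sig = np.sqrt(np.diag(S))
    w = np.array([huber_weight(yi, si) for yi, si in zip(y, sig)])
    R_rob = np.diag(np.diag(R) / np.maximum(w, 1e-6))
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R_rob)
    x_new = x + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new

# A measurement that agrees with the prediction leaves the state unchanged:
x = np.array([1.0, 2.0])
P = np.eye(2)
H = np.eye(2)
R = 0.1 * np.eye(2)
x_new, P_new = robust_ekf_update(x, P, x.copy(), lambda s: s, H, R)
```

For small residuals the weights are 1 and this reduces to the standard EKF update; a mismatched feature only inflates its own noise term rather than corrupting the whole state.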

In practice …
- Adding/removing features (subfilters)
- Multiple motions/outliers (M-filter, innovation tests)
- Tracking drift (reset with wide-baseline matching)
- Switching the reference features (hard! causes unavoidable global drift)
- Global registration (maintain a DB of lost features)
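The innovation tests mentioned above can be as simple as a chi-square gate on the normalized innovation squared (NIS). The 95% bound of 5.99 for a 2-D image measurement is a common, illustrative threshold, not a value prescribed by the lecture:

```python
import numpy as np

def innovation_gate(y, S, thresh=5.99):
    """Accept a feature's measurement only if its normalized innovation
    squared y' S^-1 y falls under a chi-square bound (5.99 ~ 95%, 2 DOF).
    Features that fail repeatedly are outliers or belong to another motion."""
    nis = float(y @ np.linalg.solve(S, y))
    return nis <= thresh

# A small innovation passes the gate, a large one is rejected:
S = np.eye(2)
ok = innovation_gate(np.array([0.1, 0.1]), S)
bad = innovation_gate(np.array([3.0, 3.0]), S)
```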
