Adam Rachmielowski 615 Project: Real-time monocular vision-based SLAM.


Overview
– SFM and SLAM
– Extended Kalman filter
– Visual SLAM details
– Results
– Next

Estimating structure and motion: factorization [Tomasi & Kanade ’92]
– Batch method
– Efficient
– Originally for the affine camera
– Missing data?
– Finite camera [Sturm & Triggs]
– W = MX: the measurement matrix factors into motion and structure
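The rank-3 factorization W = MX can be computed with an SVD. A minimal sketch for the affine case (variable names are illustrative, not from the original slides):

```python
import numpy as np

def factorize_affine(W):
    """Rank-3 factorization W ~ M X of a 2F x N measurement matrix
    (Tomasi-Kanade style). Rows stack image coordinates of N points
    tracked over F frames."""
    W = W - W.mean(axis=1, keepdims=True)   # translate the centroid to the origin
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    M = U[:, :3] * np.sqrt(s[:3])           # motion (affine cameras), 2F x 3
    X = np.sqrt(s[:3])[:, None] * Vt[:3]    # structure, 3 x N (up to an affine ambiguity)
    return M, X
```

The recovered M and X are only defined up to an invertible 3x3 ambiguity; metric upgrade needs additional constraints.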

Estimating structure and motion: reconstruction from N views [Hartley & Zisserman ’00]
– Multiview geometric entities and algorithms described by Faugeras, Hartley, Zisserman, and others
– Minimize global error with bundle adjustment
– Can be used sequentially
– Upgrade to Euclidean with auto-calibration
– Pipeline: image points x → fundamental matrix F → projection matrices P → structure X

SLAM: Simultaneous Localisation And Mapping
– Estimate the robot’s pose and map feature positions
– Probabilistic framework maintains the current estimate and its uncertainty (covariance)
– Update based on measurements and a motion model
– Many systems use odometry and active sensors as measurement devices, and limited motion models

Vision-based SLAM: the camera is the measurement device
– Trinocular: 3D measurements by triangulation
– Offline [Ayache, Faugeras ’89]
– Real-time with SIFT features [Se, Lowe, Little ’01]
– Real-time monocular [Chiuso et al. ’00]
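Triangulation from two calibrated views can be sketched with the standard linear (DLT) method; this is a generic illustration, not the specific algorithm of any cited system:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one point seen in two views.
    P1, P2: 3x4 projection matrices; x1, x2: 2D image points."""
    # each view contributes two homogeneous linear constraints on X
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]                  # null vector of A = homogeneous 3D point
    return X[:3] / X[3]
```

A trinocular rig simply stacks two more rows per extra view, which over-determines the point and averages out measurement noise.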

Kalman filter [Swerling ’58] [Welch, Bishop ’01]
– Estimates the state of a dynamic system
– Integrates noisy measurements to give an optimal estimate
– Assumes Gaussian noise
– Assumes a first-order Markov process

KF: key variables
– x_k: estimate of the state at time k
– P_k: error covariance (estimate uncertainty)
– F: state transition function
– z_k: measurement; H: maps state to measurement
– Q, R: process and measurement noise covariances

KF: Two-phase estimation — Predict
– Predicted state: x_k^- = F x_{k-1}
– Predicted covariance: P_k^- = F P_{k-1} F^T + Q

KF: Two-phase estimation — Update
– Innovation: v_k = z_k − H x_k^-
– Innovation covariance: S_k = H P_k^- H^T + R
– Kalman gain: K_k = P_k^- H^T S_k^{-1}
– State: x_k = x_k^- + K_k v_k
– Covariance: P_k = (I − K_k H) P_k^-
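The two phases are a few lines of linear algebra. A minimal sketch (standard KF equations; the constant-velocity example in the usage note is illustrative):

```python
import numpy as np

def kf_predict(x, P, F, Q):
    """Predict: propagate state and covariance through the linear process model."""
    x = F @ x
    P = F @ P @ F.T + Q
    return x, P

def kf_update(x, P, z, H, R):
    """Update: fold a noisy measurement z into the estimate."""
    nu = z - H @ x                      # innovation
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x = x + K @ nu
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P
```

For a 1-D constant-velocity state [position, velocity], F = [[1, dt], [0, 1]] and H = [[1, 0]] (only position is observed); each cycle is one `kf_predict` followed by one `kf_update`.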

EKF: Extended Kalman filter
– Allows non-linear functions (F, H)
– Apply the functions themselves to the state
– Apply their Jacobians to the covariances
– Linearize the functions around the current estimate
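The change from the linear filter is small: the state goes through the non-linear functions, the covariance through their Jacobians. A sketch (argument names are illustrative):

```python
import numpy as np

def ekf_step(x, P, z, f, F_jac, h, H_jac, Q, R):
    """One EKF cycle with non-linear process f and measurement h;
    F_jac and H_jac return their Jacobians at a given state."""
    # predict: state through f, covariance through the Jacobian of f
    F = F_jac(x)                        # linearize around the current estimate
    x = f(x)
    P = F @ P @ F.T + Q
    # update: linearize h around the predicted state
    H = H_jac(x)
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - h(x))
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P
```

Because the linearization is only valid near the current estimate, large prediction errors (fast accelerations, bad initialization) degrade the EKF more than the linear KF.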

Visual SLAM details [Davison ’03]
– State representation: x, P
– Process model F (motion)
– Measurement model H (projection)
– State update
– System initialization
– Adding and removing features

State representation
– Scene structure (feature points)
– Depth from a reference image [Azarbayejani, Pentland ’95]
– x, y, z coordinates
– Camera: pose and motion

State estimate vector
– Points y_i
– Camera x_v
– 6-DOF pose
– Constant-velocity motion model
– Acceleration modeled as noise
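A sketch of the constant-velocity prediction for the camera part of the state, assuming Davison’s 13-parameter layout [r(3), q(4), v(3), w(3)] (position, orientation quaternion, linear and angular velocity); the function names are illustrative:

```python
import numpy as np

def quat_mul(a, b):
    """Hamilton product of two quaternions [w, x, y, z]."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def predict_camera(xv, dt):
    """Constant-velocity prediction: position drifts by v*dt, orientation
    rotates by w*dt (first-order quaternion update), velocities persist."""
    r, q, v, w = xv[:3], xv[3:7], xv[7:10], xv[10:13]
    r = r + v * dt
    dq = np.concatenate([[1.0], 0.5 * w * dt])   # small-angle rotation
    q = quat_mul(q, dq)
    q /= np.linalg.norm(q)                       # keep the quaternion unit-length
    return np.concatenate([r, q, v, w])
```

Unmodeled acceleration enters as process noise on v and w, which is what lets the filter track motion that deviates from constant velocity.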

Covariance matrix
– Covariance blocks
– P_xx: camera parameters
– P_{y_i y_i}: point i
– Off-diagonal blocks represent correlation between estimates

Process model
– Points don’t move: y_k = y_{k-1}
– Add velocity and acceleration to the current camera parameters
– Covariance updated using the Jacobian

Measurement model
– H models projection of the predicted points by the predicted camera
– Innovation covariance S_i guides the feature-match search
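The 2x2 innovation covariance S_i = H_i P H_i^T + R is an image-plane uncertainty, so its eigen-decomposition gives the axes of the search ellipse directly. A sketch (the n-sigma convention is illustrative):

```python
import numpy as np

def search_ellipse(S, n_sigma=3.0):
    """Semi-axes and orientation of the n-sigma search ellipse implied by a
    2x2 innovation covariance S (pixel uncertainty of a predicted feature)."""
    evals, evecs = np.linalg.eigh(S)              # eigenvalues ascending
    half_axes = n_sigma * np.sqrt(evals)          # semi-axis lengths in pixels
    angle = np.arctan2(evecs[1, 1], evecs[0, 1])  # orientation of the major axis
    return half_axes, angle
```

Restricting template matching to this ellipse is what keeps the per-frame cost low: confident predictions yield tiny search regions, uncertain ones larger regions.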

Making measurements / update
– Project the innovation covariance to a search ellipse
– Warp the feature template based on the camera and point prediction
– If the viewing angle is good, match to get a measurement
– Compute the Kalman gain and update the state and covariance

System initialization
– Need an initial estimate and covariance: from a calibration object, or from SFM
– Process covariance
– Small: small searches, but can only handle small accelerations
– Large: can handle big accelerations, but needs many measurements
– Measurement covariance
– Function of the matching method (camera resolution)

Adding and removing features [Davison ’03]
– Add: select a salient feature in the desired region; search along the epipolar line
– Remove: if matching repeatedly fails

Preliminary results: simulation [implemented with Birkbeck]
– Behaves according to the model
– Initial estimate of the camera and 4 key points is the true value plus a small amount of noise
– Initial estimate of the other points is the true value plus significant noise
– Initial covariance is a scaled identity

Simulation

Adding points

Simulation with visibility

Next
– Real images (video sequence): feature matching, tracking, SIFT features?
– Real-time issues
– Postponement [Davison ’01]
– Loop closing
– Davison’s system automatically corrects if a feature becomes visible again and is correctly measured, but…
– Prevent drift by incorporating explicit loop closing [Newman, Ho ’05]

References