Daniel Shepard and Todd Humphreys


High-Precision Globally-Referenced Position and Attitude via a Fusion of Visual SLAM, Carrier-Phase-Based GPS, and Inertial Measurements
Daniel Shepard and Todd Humphreys
2014 IEEE/ION PLANS Conference, Monterey, CA | May 8, 2014

Overview
- Globally-Referenced Visual SLAM
- Motivating Application: Augmented Reality
- Estimation Architecture
- Bundle Adjustment (BA)
- Simulation Results for BA

Stand-Alone Visual SLAM
Produces high-precision estimates of:
- Camera motion (with ambiguous scale for monocular SLAM)
- A map of the environment
Limited in application due to the lack of a global reference
[1] G. Klein and D. Murray, "Parallel tracking and mapping for small AR workspaces," in 6th IEEE and ACM International Symposium on Mixed and Augmented Reality, IEEE, 2007, pp. 225–234.

Visual SLAM with Fiducial Markers
- Globally-referenced solution if the fiducial markers are globally referenced
- Requires substantial infrastructure and/or mapping effort
Example: Microsoft's augmented-reality maps (TED 2010 [2])
[2] B. Agüera y Arcas, "Blaise Aguera y Arcas demos augmented-reality maps," TED, Feb. 2010, http://www.ted.com/talks/blaise aguera.html.

Can globally-referenced position and attitude (pose) be recovered from combining visual SLAM and GPS?

Observability of Visual SLAM + GPS

GPS positions     | Translation | Rotation                                   | Scale
------------------|-------------|--------------------------------------------|------
None              | no          | no                                         | no
1                 | yes         | no                                         | no
2                 | yes         | partial (rotation about baseline unresolved) | yes
3 (non-collinear) | yes         | yes                                        | yes

Combined Visual SLAM and CDGPS
- CDGPS anchors visual SLAM to a global reference frame
- An IMU can be added to improve dynamic performance (not required!)
- Can be made inexpensive
- Requires little infrastructure
- Very accurate!

Motivating Application: Augmented Reality
Augmenting a live view of the world with computer-generated sensory input to enhance one's current perception of reality [3]
Current applications are limited by the lack of accurate global pose
Potential uses in:
- Construction
- Real estate
- Gaming
- Social media
[3] M. Graham, M. Zook, and A. Boulton, "Augmented reality in urban places: contested content and the duplicity of code," Transactions of the Institute of British Geographers.

Estimation Architecture Motivation
Sensors:
- Camera
- Two GPS antennas (reference and mobile)
- IMU
How can the information from these sensors best be combined to estimate the camera pose and a map of the environment?
- Real-time operation
- Computational burden vs. precision

Sensor Fusion Approach
Tighter coupling = higher precision, but increased computational burden
(Diagram: four configurations fusing IMU, visual SLAM, and CDGPS, ranging from loosely to tightly coupled.)

The Optimal Estimator

IMU only for Pose Propagation

Tightly-Coupled Architecture

Loosely-Coupled Architecture

Hybrid Batch/Sequential Estimator
Only geographically diverse frames (keyframes) are included in the batch estimator
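One simple way such keyframe gating could work is a distance threshold on camera position; this is an illustrative sketch (the rule and the function name `select_keyframes` are assumptions, not taken from the paper):

```python
import numpy as np

def select_keyframes(positions, min_spacing=0.25):
    """Keep only geographically diverse frames: a frame becomes a keyframe
    when it lies at least min_spacing (meters, assumed) from the most
    recently accepted keyframe. Returns the indices of the keyframes."""
    keys = [0]  # the first frame always starts the map
    for i, p in enumerate(positions[1:], start=1):
        step = np.linalg.norm(np.asarray(p) - np.asarray(positions[keys[-1]]))
        if step >= min_spacing:
            keys.append(i)
    return keys
```

With frames every few centimeters, this keeps the batch problem small while preserving geometric diversity among the retained poses.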

Bundle Adjustment State and Measurements
State vector:
$\boldsymbol{X}_{BA} = \begin{bmatrix} \boldsymbol{c} \\ \boldsymbol{p} \end{bmatrix}, \quad \boldsymbol{c} = \begin{bmatrix} \cdots & \boldsymbol{x}_{G}^{C_i\,T} & \boldsymbol{q}_{G}^{C_i\,T} & \cdots \end{bmatrix}^T, \quad \boldsymbol{p} = \begin{bmatrix} \cdots & \boldsymbol{x}_{G}^{p_j\,T} & \cdots \end{bmatrix}^T$
Measurement models:
CDGPS positions:
$\boldsymbol{x}_{G}^{A_i} = \boldsymbol{h}_x\!\left(\boldsymbol{x}_{G}^{C_i}, \boldsymbol{q}_{G}^{C_i}\right) + \boldsymbol{w}_{x_i} = \boldsymbol{x}_{G}^{C_i} + R\!\left(\boldsymbol{q}_{G}^{C_i}\right)\boldsymbol{x}_{C}^{A} + \boldsymbol{w}_{x_i}$
Image feature measurements:
$\boldsymbol{s}_{I_i}^{p_j} = \boldsymbol{h}_s\!\left(\boldsymbol{x}_{C_i}^{p_j}\right) + \boldsymbol{w}_{I_i}^{p_j} = \begin{bmatrix} x_{C_i}^{p_j}/z_{C_i}^{p_j} & y_{C_i}^{p_j}/z_{C_i}^{p_j} \end{bmatrix}^T + \boldsymbol{w}_{I_i}^{p_j}$
where
$\boldsymbol{x}_{C_i}^{p_j} = \begin{bmatrix} x_{C_i}^{p_j} & y_{C_i}^{p_j} & z_{C_i}^{p_j} \end{bmatrix}^T = R\!\left(\boldsymbol{q}_{G}^{C_i}\right)^T \left(\boldsymbol{x}_{G}^{p_j} - \boldsymbol{x}_{G}^{C_i}\right)$
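The two measurement models can be sketched directly in code; this is a minimal numpy illustration (the helper names `quat_to_rot`, `h_x`, and `h_s` are assumptions for this sketch):

```python
import numpy as np

def quat_to_rot(q):
    """Rotation matrix R(q) for a unit quaternion q = [w, x, y, z]."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

def h_x(x_GC, q_GC, x_CA):
    """CDGPS antenna position model: camera position plus the rotated
    camera-to-antenna lever arm x_CA."""
    return x_GC + quat_to_rot(q_GC) @ x_CA

def h_s(x_GP, x_GC, q_GC):
    """Normalized image coordinates of point p_j seen from camera i:
    rotate the point into the camera frame, then project perspectively."""
    x_C = quat_to_rot(q_GC).T @ (x_GP - x_GC)  # point in camera frame
    return x_C[:2] / x_C[2]                    # [x/z, y/z]
```

For example, with an identity attitude a point two meters straight ahead projects to the image center, `[0, 0]`.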

Bundle Adjustment Cost Minimization
- Weighted least-squares cost function
- Employs robust weight functions to handle outliers
$\underset{\boldsymbol{X}_{BA}}{\operatorname{argmin}}\; \frac{1}{2} \sum_{i=1}^{N} \left[ \left\| \Delta\boldsymbol{x}_{G}^{A_i} \right\|^2 + \sum_{j=1}^{M} w_V\!\left(\left\| \Delta\boldsymbol{s}_{I_i}^{p_j} \right\|\right) \left\| \Delta\boldsymbol{s}_{I_i}^{p_j} \right\|^2 \right]$
$\Delta\boldsymbol{x}_{G}^{A_i} = R_{\boldsymbol{x}_{G}^{A_i}}^{-1/2} \left( \bar{\boldsymbol{x}}_{G}^{A_i} - \boldsymbol{x}_{G}^{A_i} \right), \quad \Delta\boldsymbol{s}_{I_i}^{p_j} = R_{\boldsymbol{s}_{I_i}^{p_j}}^{-1/2} \left( \bar{\boldsymbol{s}}_{I_i}^{p_j} - \boldsymbol{s}_{I_i}^{p_j} \right)$
(bars denote measured values)
- Sparse Levenberg-Marquardt algorithm
- Computational complexity linear in the number of point features, but cubic in the number of keyframes
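The slide does not name the robust weight function, so the sketch below assumes a Huber-style weight (with the conventional tuning constant k = 1.345) purely for illustration; `huber_weight` and `ba_cost` are hypothetical names:

```python
import numpy as np

def huber_weight(r, k=1.345):
    """Illustrative robust weight w_V(||r||): unity for small residuals,
    decaying as k/||r|| beyond the threshold, which caps outlier influence."""
    n = np.linalg.norm(r)
    return 1.0 if n <= k else k / n

def ba_cost(dx_list, ds_list):
    """Weighted least-squares cost: whitened CDGPS residuals dx_i plus
    robustly weighted whitened image residuals ds (all pre-whitened by
    R^{-1/2}, as in the definitions above)."""
    c = sum(float(dx @ dx) for dx in dx_list)
    c += sum(huber_weight(ds) * float(ds @ ds) for ds in ds_list)
    return 0.5 * c
```

A whitened image residual of norm 10 contributes only 0.5 · (1.345/10) · 100 ≈ 6.7 to the cost instead of 50, which is how the robust weight suppresses outlying feature matches.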

Bundle Adjustment Initialization
- Initialize BA based on the stand-alone visual SLAM solution and CDGPS positions
- Determine the similarity transform relating the coordinate systems
$\underset{\boldsymbol{x}_{G}^{V},\, \boldsymbol{q}_{G}^{V},\, s}{\operatorname{argmin}}\; \frac{1}{2} \sum_{i=1}^{N} \left\| \boldsymbol{x}_{G}^{A_i} - \boldsymbol{x}_{G}^{V} - R\!\left(\boldsymbol{q}_{G}^{V}\right) \left[ s\, \boldsymbol{x}_{V}^{C_i} + R\!\left(\boldsymbol{q}_{V}^{C_i}\right) \boldsymbol{x}_{C}^{A} \right] \right\|^2$
Generalized form of Horn's transform [4]:
- Rotation: the rotation that best aligns deviations from the mean camera position
- Scale: a ratio of metrics describing the spread of the camera positions
- Translation: the difference in mean antenna position
[4] B. K. P. Horn, "Closed-form solution of absolute orientation using unit quaternions," JOSA A, vol. 4, no. 4, pp. 629–642, 1987.
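A minimal sketch of this alignment idea, with the lever-arm term omitted for brevity and an SVD-based rotation fit standing in for Horn's quaternion solution (`fit_similarity` is a hypothetical name; the scale and translation steps mirror the bullets above):

```python
import numpy as np

def fit_similarity(P_v, P_g):
    """Closed-form similarity transform (R, s, t) with P_g ≈ s R P_v + t,
    for N x 3 arrays of corresponding positions in the vision frame (P_v)
    and global frame (P_g). Sketch only: the lever arm x_CA is ignored."""
    mu_v, mu_g = P_v.mean(axis=0), P_g.mean(axis=0)
    Dv, Dg = P_v - mu_v, P_g - mu_g            # deviations from the means
    U, _, Vt = np.linalg.svd(Dg.T @ Dv)        # align the deviation clouds
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])  # no reflections
    R = U @ D @ Vt                             # best-aligning rotation
    s = np.sqrt((Dg**2).sum() / (Dv**2).sum()) # ratio of position spreads
    t = mu_g - s * R @ mu_v                    # difference of means
    return R, s, t
```

Given exact correspondences generated by a known rotation, scale, and translation, this recovers all three quantities; with noisy CDGPS positions it gives the least-squares initial guess fed into BA.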

Simulation Scenario for BA
Simulations investigating estimability are included in the paper
Hallway simulation:
- Measurement errors: 2 cm std for CDGPS, 1 pixel std for vision
- Keyframes every 0.25 m
- 242 keyframes
- 1310 point features
Three scenarios:
- GPS available
- GPS lost when the hallway is entered
- GPS reacquired when the hallway is exited
(Figure: simulated trajectory through a hallway with waypoints A, B, C, and D.)

Simulation Results for BA

Summary
- Hybrid batch/sequential estimator for loosely-coupled visual SLAM and CDGPS, with an IMU for state propagation; compared to the optimal estimator
- Outlined the algorithm for BA (batch)
- Presented a novel technique for initialization of BA
- BA simulations: demonstrated positioning accuracy of ~1 cm and attitude accuracy of ~0.1° in areas of GPS availability; attained slow drift during GPS unavailability (0.4% drift over 50 m)

Navigation Filter
State vector:
$\boldsymbol{X}_{F} = \begin{bmatrix} \boldsymbol{x}_{G}^{C\,T} & \boldsymbol{v}_{G}^{C\,T} & \boldsymbol{b}_{B}^{f\,T} & \boldsymbol{q}_{G}^{C\,T} & \boldsymbol{b}_{B}^{\omega\,T} \end{bmatrix}^T$
Propagation step:
- Standard EKF propagation step using accelerometer and gyro measurements
- Accelerometer and gyro biases modeled as first-order Gauss-Markov processes
- More information in the paper
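A first-order Gauss-Markov bias can be propagated in discrete time as sketched below; the time constant and steady-state sigma are assumed illustrative values, not parameters from the paper:

```python
import numpy as np

def propagate_bias(b, dt, tau=3600.0, sigma=1e-4, rng=None):
    """Discrete-time first-order Gauss-Markov propagation of an IMU bias:
    b_{k+1} = exp(-dt/tau) * b_k + w_k, where the driving-noise variance
    is chosen so the process has steady-state standard deviation sigma.
    Pass rng=None for the deterministic (mean) propagation."""
    phi = np.exp(-dt / tau)                 # state transition scalar
    q = sigma**2 * (1.0 - phi**2)           # discrete driving-noise variance
    w = rng.standard_normal(np.shape(b)) * np.sqrt(q) if rng is not None else 0.0
    return phi * np.asarray(b) + w
```

With dt = 0 the bias is unchanged, and over one time constant the mean decays by a factor of e, which is the "slow random walk toward zero" behavior that distinguishes this model from a pure random-walk bias.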

Navigation Filter (cont.)
Measurement update step:
- Image feature measurements from all non-keyframes
- Temporarily augment the state with point feature positions
- Prior from the map produced by BA
- Must ignore cross-covariances ⇒ filter inconsistency
- Similar block-diagonal structure in the normal equations as BA
$\begin{bmatrix} U_F & W_F \\ W_F^T & V_F \end{bmatrix} \begin{bmatrix} \delta\boldsymbol{X}_F \\ \delta\boldsymbol{p} \end{bmatrix} = \begin{bmatrix} \boldsymbol{\epsilon}_F \\ \boldsymbol{\epsilon}_p \end{bmatrix} \;\Rightarrow\; \begin{bmatrix} U_F - W_F V_F^{-1} W_F^T & 0 \\ W_F^T & V_F \end{bmatrix} \begin{bmatrix} \delta\boldsymbol{X}_F \\ \delta\boldsymbol{p} \end{bmatrix} = \begin{bmatrix} I & -W_F V_F^{-1} \\ 0 & I \end{bmatrix} \begin{bmatrix} \boldsymbol{\epsilon}_F \\ \boldsymbol{\epsilon}_p \end{bmatrix}$
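Exploiting this block structure amounts to a Schur-complement solve: eliminate the (block-diagonal, cheap-to-invert) point-feature block first, then back-substitute. A minimal dense numpy sketch (in practice $V_F$ would be inverted block-by-block; `solve_schur` is a hypothetical name):

```python
import numpy as np

def solve_schur(U, W, V, eps_F, eps_p):
    """Solve [[U, W], [W.T, V]] [dX; dp] = [eps_F; eps_p] by eliminating
    the point-feature block V, as in the reduced system above."""
    Vinv = np.linalg.inv(V)
    S = U - W @ Vinv @ W.T                            # Schur complement of V
    dX = np.linalg.solve(S, eps_F - W @ Vinv @ eps_p) # filter-state update
    dp = Vinv @ (eps_p - W.T @ dX)                    # back-substitute points
    return dX, dp
```

The result matches a direct solve of the full system, but the expensive dense factorization is confined to the small filter-state block.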

Simulation Results for BA (cont.)