1 Motion estimation from image and inertial measurements Dennis Strelow and Sanjiv Singh Carnegie Mellon University

2 Introduction (1) micro air vehicle (MAV) navigation AeroVironment Black Widow, AeroVironment Microbat

3 Introduction (2) Mars rover navigation Mars Exploration Rovers (MER), Hyperion

4 Introduction (3) robotic search and rescue RHex; Center for Robot-Assisted Search and Rescue, U. of South Florida

5 Introduction (4) NASA ISS personal satellite assistant

6 Introduction (5) Each of these requires: six-degree-of-freedom motion estimation; in unknown environments; without GPS or other absolute positioning; over the long term …and in many cases… small, light, and cheap sensors

7 Introduction (6) If we adopt a camera as our sensor, estimate… the vehicle (i.e., sensor) motion …and as a by-product: the sparse structure of the environment

8 Introduction (7) One paradigm for estimating the motion and sparse structure uses two steps: tracking: track image features through the image sequence; estimation: find 6 DOF camera positions and 3D point positions consistent with the image features

9 Introduction (8)

10 Introduction (8) Lucas-Kanade
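A minimal sketch of this tracking step using OpenCV's pyramidal Lucas-Kanade implementation (illustrative only; the talk names Lucas-Kanade but does not specify the tracker's details, and the parameters here are arbitrary):

```python
# Track corner features through a grayscale image sequence with
# pyramidal Lucas-Kanade -- an illustrative sketch.
import cv2

def track_sequence(frames):
    prev = frames[0]
    pts = cv2.goodFeaturesToTrack(prev, maxCorners=200,
                                  qualityLevel=0.01, minDistance=10)
    tracks = [pts.reshape(-1, 2)]
    for frame in frames[1:]:
        nxt, status, _err = cv2.calcOpticalFlowPyrLK(prev, frame, pts, None)
        keep = status.ravel() == 1          # drop features that were lost
        pts = nxt[keep].reshape(-1, 1, 2)
        tracks.append(pts.reshape(-1, 2))
        prev = frame
    return tracks                           # surviving feature positions per frame
```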

11 Introduction (9)

12 Introduction (10)

13 Introduction (10) bundle adjustment, a.k.a. nonlinear shape-from-motion

14 Introduction (11) This example sequence is benign: small number of images; large number of easily tracked image features; all points visible in all frames …and the estimated motion and scene structure are still ambiguous

15 Introduction (12) Sequences from navigation applications are much harder: long observation sequences; a small number of features, which may be poorly tracked; each feature visible in only a small section of the image stream

16 Introduction (13) We have been working on both tracking and estimation. This talk: mostly estimation

17 Introduction (14)

18 Introduction (15) Image measurements only: often the only option. Image and inertial measurements: can disambiguate image-only estimates; more complex: requires more calibration and estimation of additional unknowns

19 Introduction (16) Batch estimation: uses all of the observations at once; all observations must be available before computation begins. Online estimation: observations are incorporated as they arrive; suitable for long or “infinite” sequences

20 Introduction (17) Conventional images: images from standard cameras; modeled as pinhole projection with radial distortion. Omnidirectional images: images that result from combining a conventional camera with a convex mirror; require a more complex projection model
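A minimal sketch of the conventional model named above, pinhole projection with polynomial radial distortion (the parameterization and names are illustrative, not the calibration model used in the talk):

```python
# Pinhole projection with two-term polynomial radial distortion --
# an illustrative parameterization of the "conventional" camera model.
import numpy as np

def project_pinhole_radial(X_cam, fx, fy, cx, cy, k1, k2):
    x, y = X_cam[0] / X_cam[2], X_cam[1] / X_cam[2]   # normalized image coords
    r2 = x * x + y * y
    d = 1.0 + k1 * r2 + k2 * r2 * r2                  # radial distortion factor
    return np.array([fx * d * x + cx, fy * d * y + cy])
```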

21 Introduction (18)

22 Introduction (19)

23 Outline Background: image-only batch estimation Image-and-inertial batch estimation Online estimation Experiments Future work Relevance to the personal satellite assistant

24 Background: image-only batch estimation (1)

25 Background: image-only batch estimation (1) Tracking provides an image location x_ij for each point j that appears in image i

26 Background: image-only batch estimation (2) Suppose we also have estimates of: the camera rotation ρ_i and translation t_i at the time of each image; the three-dimensional position X_j of each tracked point. Then the reprojections are:
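The reprojection expression itself appears as an image on the slide and is not preserved in the transcript; a standard form consistent with the quantities above (the exact world-to-camera convention is an assumption here) is

$\hat{x}_{ij} = \pi\big(R(\rho_i)\,X_j + t_i\big)$

where $R(\rho_i)$ is the rotation matrix corresponding to $\rho_i$ and $\pi$ is the camera projection model.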

27 Background: image-only batch estimation (3)

28 Background: image-only batch estimation (4)

29 Background: image-only batch estimation (5) So, minimize the total reprojection error with respect to all the ρ_i, t_i, X_j. “Bundle adjustment”: use iterative nonlinear minimization on this error
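The minimized expression appears as an image on the slide; given the reprojection above, it is presumably the summed squared reprojection error

$E_\text{image} = \sum_{i,j} \big\| x_{ij} - \pi\big(R(\rho_i)\,X_j + t_i\big) \big\|^2$

A minimal sketch of this minimization as generic nonlinear least squares, assuming SciPy, an axis-angle rotation parameterization, and illustrative helper names (a practical bundle adjuster would also exploit the sparse structure of the Jacobian):

```python
# Bundle adjustment as generic nonlinear least squares -- an illustrative
# sketch, not the authors' implementation.
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def reprojection_residuals(params, n_cams, n_pts, observations, project):
    """observations: (i, j, uv) triples from tracking; project: pi(X_cam)."""
    poses = params[:6 * n_cams].reshape(n_cams, 6)    # [rho | t] per image
    points = params[6 * n_cams:].reshape(n_pts, 3)    # X_j per tracked point
    res = []
    for i, j, uv in observations:
        R = Rotation.from_rotvec(poses[i, :3]).as_matrix()
        res.append(project(R @ points[j] + poses[i, 3:]) - uv)
    return np.concatenate(res)

# Iterative nonlinear minimization from an initial estimate x0:
# result = least_squares(reprojection_residuals, x0,
#                        args=(n_cams, n_pts, observations, project))
```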

30 Background: image-only batch estimation (6) Bundle adjustment pros: any projection model can be used; some missing points can be tolerated; can be extended. Bundle adjustment cons: requires some initial estimate; slow

31 Background: image-only batch estimation (7) Batch estimation cons: all observations must be in hand before estimation begins. Image-only estimation cons: sensitivity to mistracking; cannot recover global scale

32 Outline Background: image-only batch estimation Image-and-inertial batch estimation Online estimation Experiments Future work Relevance to the personal satellite assistant

33 Image-and-inertial batch estimation (1)

35 Image-and-inertial batch estimation (2) Image and inertial measurements are highly complementary. Inertial measurements can: establish the global scale; provide robustness when there are too few features, when features are infinitely far away, or when features are in an “accidental” configuration

36 Image-and-inertial batch estimation (3) Image measurements can: reduce inertial integration drift; separate the effects of rotation, gravity, acceleration, and bias in the inertial readings

37 Image-and-inertial batch estimation (4) Gyro measurements: ω′, ω: measured and actual angular velocity; b_ω: gyro bias; n: Gaussian noise
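The measurement equation appears as an image on the slide; from the variable list it is presumably the standard additive model

$\omega'(\tau) = \omega(\tau) + b_\omega + n$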

38 Image-and-inertial batch estimation (5) Accelerometer measurements: a′, a: measured and actual acceleration; g: gravity vector; b_a: accelerometer bias; n: Gaussian noise
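Again the equation is an image; a common accelerometer model consistent with the listed variables is

$a'(\tau) = R\big(\rho(\tau)\big)\,\big(a(\tau) - g\big) + b_a + n$

where the rotation into the sensor frame and the sign convention on the gravity term depend on frame definitions not preserved in the transcript.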

39 Image-and-inertial batch estimation (6) Minimizes a combined error:
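The combined error appears as an image; from the two slides that follow, it is presumably

$E = E_\text{image} + E_\text{inertial}$

possibly with a relative weighting between the two terms.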

40 Image-and-inertial batch estimation (7) Image term E_image is the same as before

41 Image-and-inertial batch estimation (8) Inertial error term E_inertial is:
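The expression for $E_\text{inertial}$ appears as an image on the slide; a plausible reconstruction, using the integration notation $I(\tau_{i-1}, \tau_i, \ldots)$ introduced on the next slide, penalizes the mismatch between the states predicted by integrating the inertial measurements between consecutive image times and the corresponding estimated states:

$E_\text{inertial} = \sum_i \big\| I_\rho(\tau_{i-1}, \tau_i, \ldots, \rho_{i-1}) - \rho_i \big\|^2 + \big\| I_v(\tau_{i-1}, \tau_i, \ldots, v_{i-1}) - v_i \big\|^2 + \big\| I_t(\tau_{i-1}, \tau_i, \ldots, t_{i-1}) - t_i \big\|^2$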

44 Image-and-inertial batch estimation (9) [Diagram: translation vs. time; t_{i-1} = t(τ_{i-1}) and t_i = t(τ_i) are the camera translations at the times τ_{i-1} and τ_i of images i - 1 and i; I(τ_{i-1}, τ_i, …, t_{i-1}) is the translation at τ_i predicted by integrating the inertial measurements forward from τ_{i-1}.]

45 Image-and-inertial batch estimation (10) [Diagram: translation vs. time, with the image times τ_0, τ_1, …, τ_{f-1} marked along the trajectory.]

46 Image-and-inertial batch estimation (11)

48 Image-and-inertial batch estimation (12) I_t(τ_{i-1}, τ_i, …, t_{i-1}) depends on: τ_{i-1}, τ_i (known); all inertial measurements for times τ_{i-1} < τ < τ_i (known); ρ_{i-1}, t_{i-1}; camera linear velocities v_i; g; b_ω, b_a
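As a concrete illustration of this dependence list, a minimal Euler-integration sketch of I_t (helper names and frame conventions are assumptions; this is not the authors' integrator):

```python
import numpy as np

def integrate_translation(t_prev, v_prev, rho_to_R, rhos, accels, taus, g, b_a):
    """Predict the translation at tau_i from the state at tau_{i-1} by Euler
    integration of the accelerometer samples taken in between."""
    t, v = t_prev.astype(float), v_prev.astype(float)
    for k in range(len(taus) - 1):
        dt = taus[k + 1] - taus[k]
        # undo the bias, rotate into the world frame, remove gravity
        a_world = rho_to_R(rhos[k]).T @ (accels[k] - b_a) - g
        t = t + v * dt + 0.5 * a_world * dt * dt
        v = v + a_world * dt
    return t, v   # predicted translation and linear velocity at tau_i
```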

49 Image-and-inertial batch estimation (13) The combined error function (image-and-inertial) is then minimized with respect to: ρ_i, t_i, X_j

50 Image-and-inertial batch estimation (14) The combined error function (image and inertial) is then minimized with respect to: ρ_i, t_i, X_j, v_i, g, b_ω, b_a

51 Image-and-inertial batch estimation (15)

52 Image-and-inertial batch estimation (16) accurate motion estimates even when the image-only and inertial-only estimates are poor

53 Image-and-inertial batch estimation (17) In addition: recovers good gravity values, even with poor initialization; global scale often recovered with less than 5% error

54 Outline Background: image-only batch estimation Image-and-inertial batch estimation Online estimation image measurements only image and inertial measurements Experiments Future work Relevance to the personal satellite assistant

55 Online algorithms (1)

56 Online algorithms (2) Batch estimation: uses all of the observations at once; all observations must be available before computation begins. Online estimation: observations are incorporated as they arrive; suitable for long or “infinite” sequences

57 Online algorithms (3) Our online algorithms are iterated extended Kalman filters (IEKFs). Some components of the IEKF when applied to vehicle motion: state estimate distribution (Gaussian: mean and covariance); motion model; measurement model

58 Online algorithms (4) [Flowchart: initialization consumes timestamped observations 1, …, k_init and produces a prior distribution on the unknowns; time propagation to timestamp k_init + 1 is followed by a measurement update with observation k_init + 1, each yielding a new distribution on the unknowns; the propagate/update cycle repeats for each subsequent timestamp and observation i.]
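A generic extended Kalman filter skeleton for this propagate/update cycle (a minimal sketch with assumed inputs; the iterated EKF additionally relinearizes the measurement model about each corrected estimate and repeats the update, which is omitted here):

```python
import numpy as np

def ekf_propagate(x, P, f, F, Q):
    """Time propagation: motion model f with Jacobian F, process noise Q."""
    return f(x), F @ P @ F.T + Q

def ekf_update(x, P, z, h, H, R):
    """Measurement update: measurement model h with Jacobian H, noise R."""
    S = H @ P @ H.T + R                      # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
    x_new = x + K @ (z - h(x))               # corrected mean
    P_new = (np.eye(len(x)) - K @ H) @ P     # corrected covariance
    return x_new, P_new
```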

59 Online estimation (5): image-only For the image-only online algorithm, the unknowns (state) are: ρ(τ), t(τ); currently visible point positions X_j

60 Online estimation (6): image-only, cont. Initialization: apply the batch image-only algorithm to observations 1, …, k_init to obtain the prior distribution on the unknowns

61 Online estimation (7): image-only, cont. Time propagation to timestamp i: assume that ρ, t are perturbed by Gaussian noise; assume that the X_j are unchanged

62 Online estimation (8): image-only, cont. Measurement update with observation i: to update given an image measurement z, use the reprojection equation from the batch section

63 Online estimation (9): image-only, cont. For points that become visible after online operation has begun… …adapt Smith, Self, and Cheeseman’s method for SLAM
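A hedged sketch of that augmentation: the new point's initialization is linearized, and the covariance is expanded so the new block is correlated with the existing state and carries the initializing measurement noise (function and variable names are assumptions):

```python
import numpy as np

def augment_state(x, P, X_new, G_x, R_init):
    """Append a newly visible point X_new, initialized from the current
    state and measurement; G_x is the Jacobian of the initialization with
    respect to x, and R_init the initialization noise mapped into point
    coordinates (after Smith, Self, and Cheeseman's stochastic map)."""
    x_aug = np.concatenate([x, X_new])
    P_aug = np.block([
        [P,        P @ G_x.T],
        [G_x @ P,  G_x @ P @ G_x.T + R_init],
    ])
    return x_aug, P_aug
```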

64 Online estimation (10): image-and-inertial [Flowchart: the same initialization / time propagation / measurement update cycle as in the image-only case, now over the image-and-inertial unknowns.]

65 Online estimation (11): image-and-inertial, cont. Augmented state includes: ρ(τ), t(τ); currently visible point positions X_j

66 Online estimation (11): image-and-inertial, cont. Augmented state includes: ρ(τ), t(τ); currently visible point positions X_j; v(τ); g; b_ω, b_a

67 Online estimation (11): image-and-inertial, cont. Augmented state includes: ρ(τ), t(τ); currently visible point positions X_j; v(τ); g; b_ω, b_a; camera system angular velocity ω(τ); world system linear acceleration a(τ)

68 Online estimation (12): image-and-inertial, cont. Initialization: apply the batch image-and-inertial algorithm to observations 1, …, k_init

69 Online estimation (13): image-and-inertial, cont. Time propagation to timestamp i: assume that ω, a are perturbed by Gaussian noise; assume that X_j, g, b_ω, b_a are unchanged

70 Online estimation (14): image-and-inertial, cont. Measurement update with observation i: use the inertial sensor models given earlier

71 Outline Background: image-only batch estimation Image and inertial batch estimation Online estimation Experiments Hyperion CMU crane Future work Relevance to the personal satellite assistant

72 Experiments (1): Hyperion

73 Experiments (2): Hyperion, cont.

74 Experiments (3): Hyperion, cont.

75 Experiments (4): Hyperion, cont.

76 Experiments (4): Hyperion, cont.

77 Experiments (5): Hyperion, cont.

78 Experiments (6): Hyperion, cont.

79 Experiments (7): CMU crane Crane capable of translating a platform through x, y, and z, through a workspace of about 10 × 10 × 5 m

80 Experiments (8): CMU crane, cont. [Plot: (x, y) translation ground truth; x translation (meters) vs. y translation (meters).]

81 Experiments (9): CMU crane, cont. [Plot: z translation ground truth; z (m) vs. time.] No change in rotation

82 Experiments (10): CMU crane, cont.

83 Experiments (11): CMU crane, cont. Hard sequence: each image contains an average of 56.0 points; each point appears in an average of 62.3 images (4.4% of the sequence). Image-and-inertial online algorithm applied; 40 images used in batch initialization

84 Experiments (12): CMU crane, cont.

85 Experiments (13): CMU crane, cont. Estimated z camera translations

86 Experiments (14): CMU crane, cont. 6 DOF errors, after scaled rigid alignment: rotation: 0.14 radians average; translation: 31.5 cm average (0.9% of distance traveled); global scale error: −3.4%

87 Outline Background: image-only batch estimation Image and inertial batch estimation Online estimation Experiments Future work Relevance to the personal satellite assistant

88 Future work (1) For long sequences, no single external point is always visible. Estimated positions will drift due to… random and gross observation errors; modeling errors; suboptimality in online estimation

89 Future work (2)

90 Future work (3)

91 Future work (4) Need to incorporate some advances from the SLAM community to deal with this issue… …in particular, reacquiring revisited features

92 Future work (5) Two issues: (1) recognizing revisited features; (2) exploiting revisited features in the estimation. Lowe’s SIFT features are a good candidate for (1)
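A minimal sketch of (1) using OpenCV's SIFT implementation and Lowe's ratio test (illustrative; not the system described in the talk):

```python
import cv2

def match_revisited_features(img_now, img_past, ratio=0.75):
    """Match SIFT features between the current image and an earlier one,
    keeping only matches that pass the ratio test."""
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(img_now, None)
    kp2, des2 = sift.detectAndCompute(img_past, None)
    pairs = cv2.BFMatcher(cv2.NORM_L2).knnMatch(des1, des2, k=2)
    good = [p[0] for p in pairs
            if len(p) == 2 and p[0].distance < ratio * p[1].distance]
    return kp1, kp2, good
```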

93

94 Outline Background: image-only batch estimation Image and inertial batch estimation Online estimation Experiments Future work Relevance to the personal satellite assistant

95 Relevance to personal satellite assistant (1) The PSA localization problem is the same as ours in some essential ways… image and inertial data only – no GPS; full 6 DOF required; no fiducials; [known] gravity not required for orientation; long-term operation

96 Relevance to personal satellite assistant (2) We have not emphasized: significant independent motion in the environment

97 Relevance to personal satellite assistant (3) Some opportunities specific to the PSA: independent fields of view; stereo views; visual appearance of the station: dynamic, but not completely unknown a priori

98 Thanks! Pointers to: these slides, related work by others, related publications by our group, feature tracking movies, and VRML models at: