1 1 Motion estimation from image and inertial measurements Dennis Strelow and Sanjiv Singh Carnegie Mellon University

2 2 Introduction (1) micro air vehicle (MAV) navigation: AeroVironment Black Widow, AeroVironment Microbat

3 3 Introduction (2) Mars rover navigation: Mars Exploration Rovers (MER), Hyperion

4 4 Introduction (3) robotic search and rescue: RHex; Center for Robot-Assisted Search and Rescue, U. of South Florida

5 5 Introduction (4) NASA ISS personal satellite assistant

6 6 Introduction (5) Each of these requires: six-degree-of-freedom motion; in unknown environments; without GPS or other absolute positioning; over the long term… and, in many cases, small, light, and cheap sensors

7 7 Introduction (6) If we adopt a camera as our sensor, we estimate… the vehicle (i.e., sensor) motion… and, as a by-product, the sparse structure of the environment

8 8 Introduction (7) One paradigm for estimating the motion and sparse structure uses two steps: (1) tracking: track image features through the image sequence; (2) estimation: find 6 DOF camera positions and 3D point positions consistent with the image features

9 9 Introduction (8)

10 10 Introduction (8) Lucas-Kanade

11 11 Introduction (9)

12 12 Introduction (10)

13 13 Introduction (10) bundle adjustment, a.k.a. nonlinear shape-from-motion

14 14 Introduction (11) This example sequence is benign: small number of images; large number of easily tracked image features; all points visible in all frames… and the estimated motion and scene structure are still ambiguous

15 15 Introduction (12) Sequences from navigation applications are much harder: long observation sequences; a small number of features, which may be poorly tracked; each feature visible in only a small section of the image stream

16 16 Introduction (13) We have been working on both tracking and estimation. This talk: mostly estimation

17 17 Introduction (14)

18 18 Introduction (15) Image measurements only: often the only option. Image and inertial measurements: can disambiguate image-only estimates; more complex: requires more calibration, estimation of additional unknowns

19 19 Introduction (16) Batch estimation: uses all of the observations at once; all observations must be available before computation begins. Online estimation: observations are incorporated as they arrive; suitable for long or “infinite” sequences

20 20 Introduction (17) Conventional images: images from standard cameras; modeled as pinhole projection with radial distortion. Omnidirectional images: images that result from combining a conventional camera with a convex mirror; require a more complex projection model

21 21 Introduction (18)

22 22 Introduction (19)

23 23 Outline: Background (image-only batch estimation); Image-and-inertial batch estimation; Online estimation; Experiments; Future work; Relevance to the personal satellite assistant

24 24 Background: image-only batch estimation (1)

25 25 Background: image-only batch estimation (1) Tracking provides an image location x_ij for each point j that appears in image i

26 26 Background: image-only batch estimation (2) Suppose we also have estimates of: the camera rotation ρ_i and translation t_i at the time of each image; the three-dimensional point positions X_j of each tracked point. Then the reprojections are:
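
The projection equation on this slide is an image and is not reproduced in the transcript. As a hedged reconstruction (not copied from the slide), for a pinhole camera with focal length f, and ignoring the radial distortion mentioned elsewhere in the deck, the reprojection has the form:

```latex
% Hedged reconstruction: pinhole reprojection of point X_j into image i,
% with R(\rho_i) the camera rotation and t_i the camera translation.
\hat{x}_{ij} = \pi\bigl(R(\rho_i)\,X_j + t_i\bigr),
\qquad
\pi\bigl([u,\,v,\,w]^{\top}\bigr) = \Bigl[\,f\,\tfrac{u}{w},\;\; f\,\tfrac{v}{w}\,\Bigr]^{\top}
```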

27 27 Background: image-only batch estimation (3)

28 28 Background: image-only batch estimation (4)

29 29 Background: image-only batch estimation (5) So, minimize the reprojection error with respect to all the ρ_i, t_i, X_j. “Bundle adjustment”: use iterative nonlinear minimization on this error
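
The objective on the slide is likewise an image; the standard bundle adjustment error it refers to is, up to per-observation weighting, the summed squared reprojection error:

```latex
% Assumed form of the image-only bundle adjustment objective.
E_{\text{image}} = \sum_{i,j} \bigl\| \pi\bigl(R(\rho_i)\,X_j + t_i\bigr) - x_{ij} \bigr\|^2
```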

30 30 Background: image-only batch estimation (6) Bundle adjustment pros: any projection model can be used; some missing points can be tolerated; can be extended. Bundle adjustment cons: requires some initial estimate; slow
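
To make the batch step concrete, here is a minimal sketch of image-only bundle adjustment with a generic nonlinear least-squares solver. It is not the authors' implementation: the function names are invented, the camera is an ideal pinhole with a single focal length f, there is no radial distortion, and every point is assumed visible in every image.

```python
# Hedged sketch of image-only bundle adjustment (illustrative only).
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def reproject(rho, t, X, f):
    """Project world points X (Nx3) into a camera with rotation vector rho,
    translation t, and focal length f (pinhole, no distortion)."""
    Xc = Rotation.from_rotvec(rho).apply(X) + t      # world -> camera
    return f * Xc[:, :2] / Xc[:, 2:3]                # perspective divide

def residuals(params, x_obs, n_images, n_points, f):
    """Stack reprojection errors for all (image, point) pairs."""
    poses = params[:6 * n_images].reshape(n_images, 6)
    X = params[6 * n_images:].reshape(n_points, 3)
    errs = []
    for i in range(n_images):
        rho, t = poses[i, :3], poses[i, 3:]
        errs.append(reproject(rho, t, X, f) - x_obs[i])  # x_obs[i]: Nx2
    return np.concatenate(errs).ravel()

def bundle_adjust(x_obs, init_poses, init_X, f=500.0):
    """x_obs: (n_images, n_points, 2) tracked feature locations.
    init_poses: (n_images, 6) rotation-vector + translation initial guesses."""
    n_images, n_points, _ = x_obs.shape
    x0 = np.concatenate([init_poses.ravel(), init_X.ravel()])
    sol = least_squares(residuals, x0, args=(x_obs, n_images, n_points, f))
    poses = sol.x[:6 * n_images].reshape(n_images, 6)
    X = sol.x[6 * n_images:].reshape(n_points, 3)
    return poses, X
```

A realistic implementation would also exploit the sparse structure of the problem; scipy's least_squares can be given a Jacobian sparsity pattern (jac_sparsity) for exactly this reason.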

31 31 Background: image-only batch estimation (7) Batch estimation cons: all observations must be in hand before estimation begins. Image-only estimation cons: sensitivity to mistracking; cannot recover global scale

32 32 Outline: Background (image-only batch estimation); Image-and-inertial batch estimation; Online estimation; Experiments; Future work; Relevance to the personal satellite assistant

33 33 Image-and-inertial batch estimation (1)

34 34 Image-and-inertial batch estimation (2) Image and inertial measurements are highly complementary

35 35 Image-and-inertial batch estimation (2) Image and inertial measurements are highly complementary. Inertial measurements can: establish the global scale; provide robustness if there are too few features, features infinitely far away, or features in an “accidental” configuration

36 36 Image-and-inertial batch estimation (3) Image measurements can: reduce inertial integration drift; separate the effects of rotation, gravity, acceleration, and bias in inertial readings

37 37 Image-and-inertial batch estimation (4) Gyro measurements: ω’, ω: measured and actual angular velocity; b_ω: gyro bias; n: Gaussian noise
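
The measurement equation itself is an image on the slide; with the variables defined above, the usual gyro model (an assumed reconstruction) is:

```latex
% Assumed form of the gyro measurement model described on the slide.
\omega'(\tau) = \omega(\tau) + b_{\omega} + n
```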

38 38 Image-and-inertial batch estimation (5) Accelerometer measurements: a’, a: measured and actual acceleration; g: gravity vector; b_a: accelerometer bias; n: Gaussian noise
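
Again the equation is an image; one common accelerometer model consistent with the variables listed, where A(ρ(τ)) denotes the world-to-sensor rotation (my notation, not the slide's), is:

```latex
% Assumed form of the accelerometer measurement model; A(\rho) is the
% world-to-sensor rotation and g the world-frame gravity vector.
a'(\tau) = A\bigl(\rho(\tau)\bigr)\bigl(a(\tau) - g\bigr) + b_{a} + n
```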

39 39 Image-and-inertial batch estimation (6) Minimizes a combined error:
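
The combined objective shown on the slide is not in the transcript; up to relative weighting it combines the two terms discussed on the next slides:

```latex
% Assumed form of the combined batch objective.
E = E_{\text{image}} + E_{\text{inertial}}
```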

40 40 Image-and-inertial batch estimation (7) Image term E_image is the same as before

41 41 Image-and-inertial batch estimation (8) Inertial error term E_inertial is:

42 42 Image-and-inertial batch estimation (8) Inertial error term E_inertial is:

43 43 Image-and-inertial batch estimation (8) Inertial error term E_inertial is:
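
The expression for E_inertial on these slides is an image. Consistent with the integration-based construction on the following slides, a hedged reconstruction penalizes the difference between each estimated state and the state predicted by integrating the inertial measurements from the previous image time (per-term weights omitted):

```latex
% Hedged reconstruction of the inertial error term: I_rho, I_v, I_t denote the
% rotation, velocity, and translation predicted by integrating the inertial
% measurements from \tau_{i-1} to \tau_i, starting from the estimates at i-1.
E_{\text{inertial}} = \sum_{i} \Bigl(
  \bigl\| I_{\rho}(\tau_{i-1}, \tau_i, \ldots, \rho_{i-1}) - \rho_i \bigr\|^2
+ \bigl\| I_{v}(\tau_{i-1}, \tau_i, \ldots, v_{i-1}) - v_i \bigr\|^2
+ \bigl\| I_{t}(\tau_{i-1}, \tau_i, \ldots, t_{i-1}) - t_i \bigr\|^2 \Bigr)
```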

44 44 Image-and-inertial batch estimation (9) [Figure: translation vs. time between τ_{i-1} (the time of image i-1) and τ_i (the time of image i); starting from t_{i-1} = t(τ_{i-1}), the inertial integration I(τ_{i-1}, τ_i, …, t_{i-1}) predicts t_i = t(τ_i).]

45 45 Image-and-inertial batch estimation (10) [Figure: translation vs. time over the full sequence, with image times τ_0, τ_1, τ_2, …, τ_{f-1}.]

46 46 Image-and-inertial batch estimation (11)

47 47 Image-and-inertial batch estimation (11)

48 48 Image-and-inertial batch estimation (12) I_t(τ_{i-1}, τ_i, …, t_{i-1}) depends on: τ_{i-1}, τ_i (known); all inertial measurements for times τ_{i-1} < τ < τ_i (known); ρ_{i-1}, t_{i-1}; camera linear velocities v_i; g; b_ω, b_a
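
The following is a hedged sketch (not the authors' code) of the kind of integration I_t performs: starting from the rotation, translation, and velocity at one image time, integrate the bias-corrected gyro and accelerometer samples up to the next image time. Euler integration and the frame conventions used here are simplifying assumptions.

```python
# Illustrative inertial integration between two image timestamps.
import numpy as np
from scipy.spatial.transform import Rotation

def integrate_translation(tau_prev, tau_cur, imu_samples,
                          rho, t, v, g, b_omega, b_a):
    """imu_samples: list of (tau, omega_meas, a_meas) with
    tau_prev < tau <= tau_cur; returns predicted (rho, t, v) at tau_cur."""
    R = Rotation.from_rotvec(rho)            # sensor-to-world rotation (assumed)
    last_tau = tau_prev
    for tau, omega_meas, a_meas in imu_samples:
        dt = tau - last_tau
        omega = omega_meas - b_omega         # bias-corrected angular velocity
        a_world = R.apply(a_meas - b_a) + g  # world-frame acceleration
        t = t + v * dt + 0.5 * a_world * dt**2
        v = v + a_world * dt
        R = R * Rotation.from_rotvec(omega * dt)   # update orientation
        last_tau = tau
    return R.as_rotvec(), t, v
```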

49 49 Image-and-inertial batch estimation (13) The combined error function (image-and-inertial) is then minimized with respect to: ρ_i, t_i; X_j

50 50 Image-and-inertial batch estimation (14) The combined error function (image and inertial) is then minimized with respect to: ρ_i, t_i; X_j; v_i; g; b_ω, b_a

51 51 Image-and-inertial batch estimation (15)

52 52 Image-and-inertial batch estimation (16) accurate motions even when image-only and inertial-only motions are poor

53 53 Image-and-inertial batch estimation (17) In addition: recovers good gravity values, even with poor initialization; global scale often recovered with less than 5% error

54 54 Outline: Background (image-only batch estimation); Image-and-inertial batch estimation; Online estimation (image measurements only; image and inertial measurements); Experiments; Future work; Relevance to the personal satellite assistant

55 55 Online algorithms (1)

56 56 Online algorithms (2) Batch estimation: uses all of the observations at once; all observations must be available before computation begins. Online estimation: observations are incorporated as they arrive; suitable for long or “infinite” sequences

57 57 Online algorithms (3) Our online algorithms are iterated extended Kalman filters (IEKFs). Some components of the IEKF when applied to vehicle motion: state estimate distribution (Gaussian: mean and covariance); motion model; measurement model

58 58 Online algorithms (4) [Diagram: initialization converts timestamped observations 1, …, k_init into a prior distribution on the unknowns; then, for each subsequent observation (k_init + 1, …, i, …), time propagation advances the distribution to the observation's timestamp and a measurement update incorporates the observation, each producing a new distribution on the unknowns.]
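
The loop in the diagram can be sketched generically. The following is a minimal, assumed illustration of an iterated EKF's time propagation and measurement update; it is not the authors' implementation, and the function names, interfaces, and additive-noise assumptions are my own.

```python
# Generic iterated EKF propagate/update skeleton (illustrative only).
import numpy as np

def propagate(mean, cov, f, F, Q, dt):
    """Advance the Gaussian state estimate to the next timestamp.
    f: motion model, F: its Jacobian at `mean`, Q: process noise rate."""
    mean = f(mean, dt)
    Phi = F(mean, dt)
    cov = Phi @ cov @ Phi.T + Q * dt
    return mean, cov

def update(mean, cov, z, h, H, R, n_iter=3):
    """Iterated measurement update: relinearize h about the latest estimate.
    z: observation, h: measurement model, H: its Jacobian, R: noise covariance."""
    x0 = mean.copy()                          # prior mean
    for _ in range(n_iter):
        Hx = H(mean)
        S = Hx @ cov @ Hx.T + R
        K = cov @ Hx.T @ np.linalg.inv(S)
        mean = x0 + K @ (z - h(mean) - Hx @ (x0 - mean))
    cov = (np.eye(len(mean)) - K @ Hx) @ cov  # covariance after final iteration
    return mean, cov
```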

59 59 Online estimation (5): image-only For the image-only online algorithm, the unknowns (state) are: ρ(τ), t(τ); currently visible point positions X_j

60 60 Online estimation (6): image-only, cont. Initialization: apply the batch image-only algorithm to observations 1, …, k_init to obtain the prior distribution on the unknowns

61 61 Online estimation (7): image-only, cont. Time propagation to timestamp i: assume that ρ, t are perturbed by Gaussian noise; assume that the X_j are unchanged

62 62 Online estimation (8): image-only, cont. Measurement update with observation i: to update given an image measurement z, use the reprojection equation:
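
The equation itself is an image on the slide; under the same pinhole convention assumed earlier, the measurement model has the form:

```latex
% Assumed form of the image measurement model for the filter update.
z = \pi\bigl(R(\rho(\tau))\,X_j + t(\tau)\bigr) + n
```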

63 63 Online estimation (9): image-only, cont. For points that become visible after online operation has begun… …adapt Smith, Self, and Cheeseman’s method for SLAM
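
A hedged sketch of what such state augmentation looks like, in the spirit of Smith, Self, and Cheeseman (details and names are illustrative, not taken from the talk): the new point's mean, covariance, and cross-covariance with the existing state are derived from the current estimate and the point's first measurement.

```python
# Illustrative state augmentation for a newly visible point.
import numpy as np

def augment_state(mean, cov, z_new, R_new, g_init, G_x, G_z):
    """g_init(mean, z_new) -> initial 3D point position (3,);
    G_x, G_z: Jacobians of g_init w.r.t. the state and the measurement."""
    p0 = g_init(mean, z_new)
    Gx = G_x(mean, z_new)
    Gz = G_z(mean, z_new)
    cross = Gx @ cov                              # new-point / old-state covariance
    p_cov = Gx @ cov @ Gx.T + Gz @ R_new @ Gz.T   # new-point covariance
    mean = np.concatenate([mean, p0])
    cov = np.block([[cov, cross.T],
                    [cross, p_cov]])
    return mean, cov
```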

64 64 Online estimation (10): image-and-inertial [Diagram: the same initialization / time propagation / measurement update pipeline as on the earlier slide, now driven by image and inertial observations.]

65 65 Online estimation (11): image-and-inertial, cont. Augmented state includes: ρ(τ), t(τ); currently visible point positions X_j

66 66 Online estimation (11): image-and-inertial, cont. Augmented state includes: ρ(τ), t(τ); currently visible point positions X_j; v(τ); g; b_ω, b_a

67 67 Online estimation (11): image-and-inertial, cont. Augmented state includes: ρ(τ), t(τ); currently visible point positions X_j; v(τ); g; b_ω, b_a; camera system angular velocity ω(τ); world system linear acceleration a(τ)

68 68 Online estimation (12): image-and-inertial, cont. Initialization: apply the batch image-and-inertial algorithm to observations 1, …, k_init to obtain the prior distribution on the unknowns

69 69 Online estimation (13): image-and-inertial, cont. Time propagation to timestamp i: assume that ω, a are perturbed by Gaussian noise; assume that X_j, g, b_ω, b_a are unchanged

70 70 Online estimation (14): image-and-inertial, cont. Measurement update with observation i: use the inertial sensor model we’ve seen:
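
The models are the ones written out earlier; restated as filter measurement equations (an assumed reconstruction, with A(ρ) again denoting the world-to-sensor rotation):

```latex
% Gyro and accelerometer observations as filter measurement equations.
z_{\omega} = \omega(\tau) + b_{\omega} + n,
\qquad
z_{a} = A\bigl(\rho(\tau)\bigr)\bigl(a(\tau) - g\bigr) + b_{a} + n
```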

71 71 Outline: Background (image-only batch estimation); Image-and-inertial batch estimation; Online estimation; Experiments (Hyperion; CMU crane); Future work; Relevance to the personal satellite assistant

72 72 Experiments (1): Hyperion

73 73 Experiments (2): Hyperion, cont.

74 74 Experiments (3): Hyperion, cont.

75 75 Experiments (4): Hyperion, cont.

76 76 Experiments (4): Hyperion, cont.

77 77 Experiments (5): Hyperion, cont.

78 78 Experiments (6): Hyperion, cont.

79 79 Experiments (7): CMU crane Crane capable of translating a platform through x, y, and z, through a workspace of about 10 x 10 x 5 m

80 80 Experiments (8): CMU crane, cont. [Plot: (x, y) translation ground truth; x and y translation axes in meters, spanning roughly -3.0 to 3.0.]

81 81 Experiments (9): CMU crane, cont. [Plot: z translation ground truth vs. time, z roughly between 3.5 and 4.5 m.] No change in rotation

82 82 Experiments (10): CMU crane, cont.

83 83 Experiments (11): CMU crane, cont. Hard sequence: each image contains an average of 56.0 points; each point appears in an average of 62.3 images (4.4% of the sequence). Image-and-inertial online algorithm applied; 40 images used in batch initialization

84 84 Experiments (12): CMU crane, cont.

85 85 Experiments (13): CMU crane, cont. Estimated z camera translations

86 86 Experiments (14): CMU crane, cont. 6 DOF errors, after scaled rigid alignment: rotation: 0.14 radians average; translation: 31.5 cm average (0.9% of distance traveled); global scale error: -3.4%

87 87 Outline: Background (image-only batch estimation); Image-and-inertial batch estimation; Online estimation; Experiments; Future work; Relevance to the personal satellite assistant

88 88 Future work (1) For long sequences, no single external point is always visible. Estimated positions will drift due to: random and gross observation errors; modeling errors; suboptimality in online estimation

89 89 Future work (2)

90 90 Future work (3)

91 91 Future work (4) Need to incorporate some advances from the SLAM community to deal with this issue… …in particular, reacquiring revisited features

92 92 Future work (5) Two issues: (1) recognizing revisited features; (2) exploiting revisited features in the estimation. Lowe’s SIFT features are a good candidate for (1)

93 93

94 94 Outline: Background (image-only batch estimation); Image-and-inertial batch estimation; Online estimation; Experiments; Future work; Relevance to the personal satellite assistant

95 95 Relevance to personal satellite assistant (1) The PSA localization problem is the same as ours in some essential ways: image and inertial data only – no GPS; full 6 DOF required; no fiducials; [known] gravity not required for orientation; long-term operation

96 96 Relevance to personal satellite assistant (2) We have not emphasized: significant independent motion in the environment

97 97 Relevance to personal satellite assistant (3) Some opportunities specific to the PSA: independent fields of view; stereo views; the visual appearance of the station is dynamic, but not completely unknown a priori

98 98 Thanks! Pointers to these slides, related work by others, related publications by our group, feature tracking movies, and VRML models at: http://www.cs.cmu.edu/~dstrelow/psa

