1 Motion estimation from image and inertial measurements. Dennis Strelow and Sanjiv Singh.
2 On the web Related materials: these slides, related papers, movies, and VRML models at: http://www.cs.cmu.edu/~dstrelow/northrop
3 Introduction (1) micro air vehicle (MAV) navigation: AeroVironment Black Widow, AeroVironment Microbat
4 Introduction (2) Mars rover navigation: Mars Exploration Rovers (MER), Hyperion
5 Introduction (3) robotic search and rescue: RHex, Center for Robot-Assisted Search and Rescue, U. of South Florida
6 Introduction (4) NASA ISS personal satellite assistant
7 Introduction (5) Each of these problems requires: estimation of 6 DOF motion, in unknown environments, without GPS or other absolute positioning, over the long term. …and some of the problems require: small, light, and cheap sensors.
8 Introduction (6) Monocular, image-based motion estimation is a good candidate. In particular, simultaneous estimation of multiframe motion and sparse scene structure is the most promising approach.
9 Outline: Image-based motion estimation; Improving estimation; Improving feature tracking; Reacquisition
10 Outline: Image-based motion estimation (refresher, difficulties); Improving estimation; Improving feature tracking; Reacquisition
11 Image-based motion estimation: refresher (1) A two-step process is typical… First, sparse feature tracking. Inputs: raw images. Outputs: projections.
12 Image-based motion estimation: refresher (2)
13 Image-based motion estimation: refresher (3) Second, estimation. Input: projections from the tracker. Outputs: 6 DOF camera position at the time of each image; 3D position of each tracked point.
14 Image-based motion estimation: refresher (4)
15 Image-based motion estimation: refresher (5) Algorithms exist. For tracking: Lucas-Kanade (Lucas and Kanade, 1981).
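As a hedged illustration of the tracking step, here is a minimal sketch using OpenCV's pyramidal Lucas-Kanade implementation (cv2.calcOpticalFlowPyrLK); the window size, pyramid depth, and termination criteria are illustrative choices, not values from the talk.

import cv2

def track_lk(prev_gray, next_gray, prev_pts):
    """Track prev_pts (Nx1x2 float32 corners) from prev_gray into next_gray."""
    next_pts, status, err = cv2.calcOpticalFlowPyrLK(
        prev_gray, next_gray, prev_pts, None,
        winSize=(21, 21), maxLevel=3,  # pyramid levels help with larger motions
        criteria=(cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 30, 0.01))
    ok = status.ravel() == 1           # keep only features that converged
    return prev_pts[ok], next_pts[ok]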
16 Image-based motion estimation: refresher (6) For estimation: SVD-based factorization (Tomasi and Kanade, 1992); bundle adjustment (various, 1950s); Kalman filtering (Broida and Chellappa, 1990); variable state dimension filter (McLauchlan, 1996).
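For the estimation step, here is a minimal sketch of bundle adjustment as nonlinear least squares on the reprojection error, assuming a calibrated pinhole camera with intrinsics K and axis-angle rotation vectors; the variable names and the use of scipy.optimize.least_squares are illustrative, not the implementations cited above.

import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def reproject(rvecs, tvecs, points3d, cam_idx, pt_idx, K):
    """Project each observed 3-D point into the camera that observed it."""
    R = Rotation.from_rotvec(rvecs[cam_idx])            # one rotation per observation
    p_cam = R.apply(points3d[pt_idx]) + tvecs[cam_idx]  # world -> camera frame
    return (K @ (p_cam / p_cam[:, 2:3]).T).T[:, :2]     # pinhole projection

def residuals(x, n_cams, n_pts, cam_idx, pt_idx, obs_uv, K):
    """Residuals for least_squares: predicted minus tracked projections."""
    rvecs = x[:3 * n_cams].reshape(n_cams, 3)
    tvecs = x[3 * n_cams:6 * n_cams].reshape(n_cams, 3)
    points3d = x[6 * n_cams:].reshape(n_pts, 3)
    return (reproject(rvecs, tvecs, points3d, cam_idx, pt_idx, K) - obs_uv).ravel()

# x0 stacks initial camera poses and point positions; obs_uv holds the 2-D
# projections from the tracker, one row per observation:
# sol = least_squares(residuals, x0, args=(n_cams, n_pts, cam_idx, pt_idx, obs_uv, K))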
17 Image-based motion estimation: difficulties (1) So, the problem is solved?
18 Image-based motion estimation: difficulties (2) If so, where are the automatic systems for estimating motion from images in unknown environments?
19 Image-based motion estimation: difficulties (3) …and for automatically modeling rooms, buildings, and cities from a handheld camera?
20 Image-based motion estimation: difficulties (4) The estimation step can be very sensitive to: incorrect or insufficient image feature tracking; camera modeling and calibration errors; outlier detection thresholds; sequences with degenerate camera motions.
21 Image-based motion estimation: difficulties (5) …and for recursive methods in particular: poor prior assumptions on the motion; poor approximations in state error modeling.
22 Image-based motion estimation: difficulties (6) 151 images, 23 points
23 Image-based motion estimation: difficulties (7)
24 Outline: Image-based motion estimation; Improving estimation (overview, image and inertial measurements); Improving feature tracking; Reacquisition
25 Improving estimation: overview
26 Improving estimation: overview
27 Improving estimation: image and inertial (1) Image and inertial measurements are highly complementary. Inertial measurements can: resolve the ambiguities in image-only estimates; establish the global scale.
28 Improving estimation: image and inertial (2) Image measurements can: reduce the drift in integrating inertial measurements; distinguish between rotation, gravity, acceleration, bias, and noise in accelerometer readings.
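To see why image measurements are needed, here is a toy strapdown-integration step: gyro bias, accelerometer bias, and gravity all enter the double integration, so position error grows roughly quadratically with time without correction. The symbols (R, v, p, b_g, b_a) are generic notation assumed for this sketch, not the authors'.

import numpy as np

GRAVITY = np.array([0.0, 0.0, -9.81])

def imu_step(R, v, p, gyro, accel, b_g, b_a, dt):
    """Propagate orientation R (3x3), velocity v, and position p by one IMU sample."""
    w = gyro - b_g                                   # bias-corrected angular rate
    angle = np.linalg.norm(w) * dt
    if angle > 1e-12:                                # rotation update via Rodrigues' formula
        k = w / np.linalg.norm(w)
        K = np.array([[0, -k[2], k[1]], [k[2], 0, -k[0]], [-k[1], k[0], 0]])
        R = R @ (np.eye(3) + np.sin(angle) * K + (1 - np.cos(angle)) * K @ K)
    a_world = R @ (accel - b_a) + GRAVITY            # remove bias, rotate, add gravity back
    v = v + a_world * dt                             # first integration
    p = p + v * dt + 0.5 * a_world * dt ** 2         # second integration: errors grow ~t^2
    return R, v, p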
29 Improving estimation: image and inertial (3)
30 Improving estimation: image and inertial (4)
31 Improving estimation: image and inertial (5) Other examples: global scale typically within 5%; better convergence than image-only estimation.
32 Improving estimation: image and inertial (6) Many more details in: Dennis Strelow and Sanjiv Singh. Motion estimation from image and inertial measurements. International Journal of Robotics Research, to appear.
33 Outline: Image-based motion estimation; Improving estimation; Improving feature tracking (Lucas-Kanade, Lucas-Kanade and real sequences, the "smalls" tracker); Reacquisition
34 Improving feature tracking: Lucas-Kanade (1) Lucas-Kanade has been the go-to feature tracker for shape-from-motion: iteratively minimize the intensity matching error… …with respect to the feature's position in the new image.
35 Improving feature tracking: Lucas-Kanade (2) Additional heuristics used to apply Lucas-Kanade to shape-from-motion (task → heuristic): choose features to track → high image texture; detect mistracking or occlusion → convergence and matching error; handle large motions → image pyramid.
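A hedged sketch of two of these heuristics using OpenCV calls as stand-ins: texture-based feature selection with cv2.goodFeaturesToTrack, and a forward-backward tracking check in place of the convergence and matching-error test; the thresholds are illustrative.

import cv2
import numpy as np

def select_features(gray, max_feats=300):
    # "high image texture": strong corner response (minimum-eigenvalue criterion)
    return cv2.goodFeaturesToTrack(gray, maxCorners=max_feats,
                                   qualityLevel=0.01, minDistance=10)

def forward_backward_check(prev_gray, next_gray, prev_pts, fb_thresh=1.0):
    # Track forward, then backward; a large round-trip error signals mistracking
    nxt, st1, _ = cv2.calcOpticalFlowPyrLK(prev_gray, next_gray, prev_pts, None)
    back, st2, _ = cv2.calcOpticalFlowPyrLK(next_gray, prev_gray, nxt, None)
    fb_err = np.linalg.norm((prev_pts - back).reshape(-1, 2), axis=1)
    ok = (st1.ravel() == 1) & (st2.ravel() == 1) & (fb_err < fb_thresh)
    return nxt, ok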
36 Improving feature tracking: Lucas-Kanade (3) Lucas-Kanade advantages: fast; subpixel resolution; can handle some large motions well; uses general minimization, so easily extendible.
37 Improving feature tracking: Lucas-Kanade (4) 0.1-pixel average reprojection error!
38 Improving feature tracking: Lucas-Kanade and real sequences (1) But Lucas-Kanade performs poorly on many real sequences…
39 Improving feature tracking: Lucas-Kanade and real sequences (2) …and image-based motion estimation can be sensitive to errors in feature tracking
40 Improving feature tracking: Lucas-Kanade and real sequences (3)
41 Improving feature tracking: Lucas-Kanade and real sequences (4)
42 Improving feature tracking: Lucas-Kanade and real sequences (5)
43 Improving feature tracking: Lucas-Kanade and real sequences (6) Why does Lucas-Kanade perform poorly on many real sequences? The heuristics are poor, and the features are tracked independently. (Recall the heuristics, task → heuristic: choose features to track → high image texture; detect mistracking or occlusion → convergence and matching error; handle large motions → image pyramid.)
44 Improving feature tracking: the "smalls" tracker (1) smalls is a new feature tracker for shape-from-motion and similar applications: it eliminates the heuristics normally used with Lucas-Kanade and enforces the rigid scene constraint.
45 Improving feature tracking: the “smalls” tracker (2) Leonard Smalls; tracker, manhunter
46 Improving feature tracking: the "smalls" tracker (3) Block diagram: features → epipolar geometry → 1-D matching along epipolar lines → geometric mistracking detection → feature death and birth → output to 6 DOF estimation.
47 Improving feature tracking: the "smalls" tracker (3) Block diagram, with SIFT features as the input: SIFT features → epipolar geometry → 1-D matching along epipolar lines → geometric mistracking detection → feature death and birth → output to 6 DOF estimation.
48 Improving feature tracking: the "smalls" tracker (4) SIFT keypoints (Lowe, IJCV 2004): image interest points that can be extracted despite large changes in viewpoint, to subpixel accuracy. A keypoint's feature vectors in two images usually match.
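A minimal sketch of SIFT keypoint extraction and descriptor matching with Lowe's ratio test, using OpenCV's SIFT implementation; the ratio threshold is an illustrative choice.

import cv2

def sift_matches(img1, img2, ratio=0.75):
    sift = cv2.SIFT_create()
    kp1, desc1 = sift.detectAndCompute(img1, None)   # keypoints + 128-D descriptors
    kp2, desc2 = sift.detectAndCompute(img2, None)
    pairs = cv2.BFMatcher(cv2.NORM_L2).knnMatch(desc1, desc2, k=2)
    # Ratio test: keep matches whose best distance clearly beats the second best
    good = [p[0] for p in pairs if len(p) == 2 and p[0].distance < ratio * p[1].distance]
    return kp1, kp2, good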
49 Improving feature tracking: the "smalls" tracker (5) Epipolar geometry between adjacent images is determined using… SIFT extraction and matching, two-frame bundle adjustment, and RANSAC.
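As a simplified stand-in for two-frame bundle adjustment plus RANSAC, the sketch below recovers the relative pose, and hence the epipolar geometry, from matched points using OpenCV's essential-matrix RANSAC; known intrinsics K are assumed.

import cv2

def two_frame_geometry(pts1, pts2, K):
    """pts1, pts2: Nx2 float arrays of matched image points; K: 3x3 intrinsics."""
    E, inliers = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC,
                                      prob=0.999, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=inliers)
    # R, t define the epipolar lines along which features are then matched
    return R, t, inliers.ravel().astype(bool)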
50 Improving feature tracking: the "smalls" tracker (6) The search for new feature locations is constrained to epipolar lines: 1. initial position from nearby SIFT matches; 2. discrete SSD search (e.g., 60 pixels); 3. 1-D Lucas-Kanade refines the match.
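A sketch of the discrete 1-D SSD search along an epipolar line (step 2); the patch size and the +/- 60 pixel search range are illustrative, and the 1-D Lucas-Kanade refinement of step 3 is omitted.

import numpy as np

def patch(img, x, y, half=5):
    return img[int(y) - half:int(y) + half + 1,
               int(x) - half:int(x) + half + 1].astype(float)

def epipolar_ssd_search(img, template, line_pt, line_dir, search_range=60):
    """line_pt: initial guess on the epipolar line (e.g., from nearby SIFT matches);
    line_dir: unit direction of the line; template: patch around the feature."""
    best, best_ssd = None, np.inf
    for s in range(-search_range, search_range + 1):   # discrete 1-D search
        x, y = line_pt + s * line_dir
        cand = patch(img, x, y)
        if cand.shape != template.shape:
            continue                                    # candidate fell off the image
        ssd = np.sum((cand - template) ** 2)
        if ssd < best_ssd:
            best, best_ssd = np.array([x, y]), ssd
    return best, best_ssd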
51 Improving feature tracking: the "smalls" tracker (7) Mistracked or occluded features are detected using geometric consistency between triples of images: three-frame bundle adjustment and RANSAC.
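A simplified stand-in for the three-frame consistency test: triangulate a feature from two frames, reproject it into the third, and flag it as mistracked if the reprojection error is large; the projection matrices and the threshold are assumptions of this sketch.

import cv2
import numpy as np

def three_frame_outlier(P1, P2, P3, uv1, uv2, uv3, thresh=2.0):
    """P1..P3: 3x4 float projection matrices; uv1..uv3: 2-vectors for one feature."""
    X_h = cv2.triangulatePoints(P1, P2, uv1.reshape(2, 1), uv2.reshape(2, 1))
    X = (X_h[:3] / X_h[3]).ravel()                  # homogeneous -> Euclidean
    proj = P3 @ np.append(X, 1.0)                   # reproject into the third frame
    err = np.linalg.norm(proj[:2] / proj[2] - uv3)
    return err > thresh                             # large error -> mistracked or occluded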
52 Improving feature tracking: the "smalls" tracker (8) After tracking in each image: features are pruned to maintain a minimum separation; new features are selected in those parts of the image not already covered.
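A hedged sketch of this "feature death and birth" step: enforce a minimum separation among surviving features, then mask the regions they cover and select new features only outside the mask; the separation and feature count are illustrative.

import cv2
import numpy as np

def prune_and_reseed(gray, pts, min_sep=15, max_feats=300):
    kept = []
    for p in pts.reshape(-1, 2):
        if all(np.linalg.norm(p - q) >= min_sep for q in kept):
            kept.append(p)                                # keep well-separated features
    mask = np.full(gray.shape, 255, dtype=np.uint8)
    for p in kept:
        cv2.circle(mask, (int(p[0]), int(p[1])), min_sep, 0, -1)   # block covered areas
    new = cv2.goodFeaturesToTrack(gray, maxCorners=max(max_feats - len(kept), 1),
                                  qualityLevel=0.01, minDistance=min_sep, mask=mask)
    if new is not None:
        kept.extend(new.reshape(-1, 2))                   # "birth" of new features
    return np.array(kept, dtype=np.float32)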
53 Improving feature tracking: the “smalls” tracker (9)
54 Improving feature tracking: the “smalls” tracker (10)
55 Improving feature tracking: the “smalls” tracker (11)
56 Improving feature tracking: the “smalls” tracker (12)
57 Outline: Image-based motion estimation; Improving image-based motion estimation; Improving feature tracking; Reacquisition
58 Reacquisition (1) Image-based motion estimates from any system will drift given sufficient time: if the features we see are always changing, and if we don't recognize when we've revisited a location.
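One way to sketch reacquisition, offered as an assumption rather than the talk's method: match the current frame's SIFT descriptors against descriptors stored for earlier keyframes, and treat a keyframe with many ratio-test matches as a revisited location.

import cv2

def best_keyframe(desc_now, keyframe_descs, ratio=0.75, min_matches=20):
    """keyframe_descs: dict mapping keyframe id -> stored SIFT descriptors."""
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    best_id, best_count = None, 0
    for kf_id, desc_kf in keyframe_descs.items():
        pairs = matcher.knnMatch(desc_now, desc_kf, k=2)
        count = sum(1 for p in pairs
                    if len(p) == 2 and p[0].distance < ratio * p[1].distance)
        if count > best_count:
            best_id, best_count = kf_id, count
    return best_id if best_count >= min_matches else None   # None: no revisit detected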
59 Reacquisition (2)
60 Reacquisition (3)
61 Thanks! Related materials: these slides, related papers, movies, and VRML models at: http://www.cs.cmu.edu/~dstrelow/northrop