1
3D SLAM for Omni-directional Camera
Yuttana Suttasupa. Advisor: Asst. Prof. Attawith Sudsang
2
Introduction Localization Mapping SLAM
Localization: the robot can estimate its location with respect to landmarks in an environment. Mapping: the robot can reconstruct the positions of landmarks that it encounters in an environment. SLAM: the robot builds a map and localizes itself simultaneously while traversing an unknown environment.
3
The Problem: propose a SLAM method for a hand-held omni-directional camera. The omni-directional camera moves freely in an unknown indoor environment without knowledge of the camera motion model. Only bearing data from omni-images are used, with no initialization information needed. Reconstruct the 3D camera path and a 3D environment map (landmark-based).
4
The Problem. Input: a captured image sequence from an omni-directional camera.
5
The Problem. Output: a camera state (3D position and direction) and an environment map (3D landmark positions).
6
Outline Introduction Omni-directional Camera EKF-SLAM
Introduction to Problem Solution to Problem Image Processing SLAM Features and reference frames management Experimental Results Result Evaluation
7
Omni-directional Camera
Our omni-directional camera: two parabolic mirrors; CCD camera with 640× Hz; 360° horizontal field of view; -5° to 65° vertical field of view.
8
Omni-directional Camera
Normal camera vs. omni-directional camera. (Speaker note: the motivation for using an omni camera – others have not done this.)
9
Omni camera Calibration
Find a mapping function from the 2D image to the 3D object, using the Omnidirectional Camera Calibration Toolbox (Scaramuzza et al., 2006).
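The toolbox itself is a MATLAB package; below is a minimal Python sketch of the kind of back-projection it calibrates, assuming the Scaramuzza polynomial model. The polynomial coefficients and image centre are hypothetical placeholders, not the calibrated values.

```python
# Minimal sketch: back-project an omni-image pixel to a 3D viewing ray, assuming the
# polynomial model produced by the Scaramuzza calibration toolbox.
import numpy as np

A = [-180.0, 0.0, 1.2e-3, -1.5e-6, 2.0e-9]   # hypothetical polynomial coefficients a0..a4
CX, CY = 320.0, 240.0                        # hypothetical image centre (pixels)

def pixel_to_ray(u, v):
    """Map a 2D image point to a unit 3D viewing ray."""
    x, y = u - CX, v - CY
    rho = np.hypot(x, y)                              # radial distance from the centre
    z = sum(a * rho**i for i, a in enumerate(A))      # f(rho) gives the z component
    ray = np.array([x, y, z], dtype=float)
    return ray / np.linalg.norm(ray)
```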
10
Outline Introduction Omni-directional Camera EKF-SLAM
Introduction to Problem Solution to Problem Image Processing SLAM Features and reference frames management Experimental Results Result Evaluation
11
EKF SLAM: use an extended Kalman filter to solve the SLAM problem.
Assume the robot position and map probability distributions are Gaussian. Predict the robot position and landmark distributions using a robot motion model, then correct the distributions using an observation model.
12
EKF SLAM: distribution representation and initial state.
The robot probability distribution is represented by a state vector and a state covariance. At the initial state, assume a robot position distribution with some initial value.
13
EKF SLAM: prediction step. Use a robot motion model to predict the robot position, giving the predicted state and predicted estimate covariance. (Speaker note: details are in the thesis; do we know the motion model or not?)
14
EKF SLAM: correction step.
Use an observation model to update the robot position and landmark positions from landmark measurements: compute the innovation residual from the observation model, the innovation covariance, and the optimal Kalman gain, then form the updated state estimate and updated estimate covariance.
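The equations on this slide did not survive the transcript; below is a minimal numpy sketch of the standard EKF prediction and correction equations that these labels refer to, with the motion model f and Jacobian F, observation model h and Jacobian H, and noise covariances Q and R supplied by the caller.

```python
# Minimal sketch of the standard EKF predict/correct equations (numpy).
import numpy as np

def ekf_predict(x, P, f, F, Q):
    x_pred = f(x)                               # predicted state
    P_pred = F @ P @ F.T + Q                    # predicted estimate covariance
    return x_pred, P_pred

def ekf_correct(x_pred, P_pred, z, h, H, R):
    y = z - h(x_pred)                           # innovation (measurement residual)
    S = H @ P_pred @ H.T + R                    # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)         # optimal Kalman gain
    x = x_pred + K @ y                          # updated state estimate
    P = (np.eye(len(x_pred)) - K @ H) @ P_pred  # updated estimate covariance
    return x, P
```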
15
Outline Introduction Omni-directional Camera EKF-SLAM
Introduction to Problem Solution to Problem Image Processing SLAM Features and reference frames management Experimental Results Result Evaluation
16
Introduction to Problem
Feature detection problem: how does a computer recognize objects from an image? Feature association problem: how can we find feature relations between two images?
17
Introduction to Problem
Observability problem: the camera provides only bearing data; how can we estimate a high-dimensional state from low-dimensional measurements? (Figure: camera observing a landmark – how far away is it?)
18
Outline Introduction Omni-directional Camera EKF-SLAM
Introduction to Problem Solution to Problem Image Processing SLAM Features and reference frames management Experimental Results Result Evaluation
19
Solution to Problem: the proposed algorithm consists of 3 steps.
Image processing: detect features, find feature associations, and calculate feature measurements. SLAM: apply the measurement data to SLAM. Features and reference frames management: add and remove features and reference frames from the SLAM state.
20
Solution to Problem: system coordinates.
(Figure: world frame, camera frame, reference frames, and a landmark.)
21
Solution to Problem: SLAM state. Camera state – represents the camera frame.
Reference frame states – represent the reference frames. Landmark states – represent landmark positions.
22
Outline Introduction Omni-directional Camera EKF-SLAM
Introduction to Problem Solution to Problem Image Processing SLAM Features and reference frames management Experimental Results Result Evaluation
23
Image Processing. Input: an image from the omni-directional camera and the old SLAM state.
Output: feature measurements and feature associations.
24
Image Processing: feature detection (for new features).
Use point features; find corners in the image using the Harris corner detector.
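A minimal sketch of Harris-based corner detection with OpenCV; the detector parameters below are assumptions, not the thesis settings.

```python
# Minimal sketch: detect new point features with the Harris corner detector (OpenCV).
import cv2

def detect_corners(gray_image, max_corners=100):
    corners = cv2.goodFeaturesToTrack(
        gray_image, maxCorners=max_corners, qualityLevel=0.01,
        minDistance=10, useHarrisDetector=True, k=0.04)
    return [] if corners is None else corners.reshape(-1, 2)
```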
25
Image Processing: feature associations.
A feature association describes which landmark the feature in the current image is associated with. Find the relation between the current image and old features in an old image: use optical flow to track features, then use template matching to refine each feature position.
26
Image Processing: feature associations – feature tracking.
Track features from the previous image to obtain the current feature positions, using pyramidal Lucas-Kanade optical flow.
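A minimal sketch of this tracking step with OpenCV's pyramidal Lucas-Kanade implementation; the window size and pyramid depth are assumptions.

```python
# Minimal sketch: track features between consecutive images with pyramidal LK flow.
import cv2
import numpy as np

def track_features(prev_gray, curr_gray, prev_pts):
    prev_pts = np.float32(prev_pts).reshape(-1, 1, 2)
    curr_pts, status, err = cv2.calcOpticalFlowPyrLK(
        prev_gray, curr_gray, prev_pts, None, winSize=(21, 21), maxLevel=3)
    ok = status.ravel() == 1                      # keep only successfully tracked features
    return prev_pts[ok].reshape(-1, 2), curr_pts[ok].reshape(-1, 2)
```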
27
Image Processing: feature associations – feature position refinement.
Tracking features with optical flow may cause feature drift; use pyramid template matching to correct each feature position. (Figure: search region, feature patch, current image with a drifted feature, and the result after refinement using template matching.)
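A minimal sketch of this refinement with OpenCV template matching over a small search region; the patch and search-region sizes are assumptions.

```python
# Minimal sketch: refine a drifted feature position by template matching in a local window.
import cv2

def refine_feature(curr_gray, patch, drifted_xy, search_radius=15):
    x, y = int(drifted_xy[0]), int(drifted_xy[1])
    ph, pw = patch.shape[:2]
    x0 = max(x - search_radius - pw // 2, 0)
    y0 = max(y - search_radius - ph // 2, 0)
    region = curr_gray[y0:y0 + 2 * search_radius + ph, x0:x0 + 2 * search_radius + pw]
    if region.shape[0] < ph or region.shape[1] < pw:
        return drifted_xy                          # too close to the border to refine
    scores = cv2.matchTemplate(region, patch, cv2.TM_CCOEFF_NORMED)
    _, _, _, best = cv2.minMaxLoc(scores)          # location of the best match
    return (x0 + best[0] + pw / 2.0, y0 + best[1] + ph / 2.0)
```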
28
Image Processing: feature associations – feature position refinement.
Select a patch from a reference image. The patch rotation and scale may not match the current image, so a transform function may need to be applied to the patch. (Figure: the reference-image patch does not match; after the transform it matches the current image.)
29
Image Processing: feature associations – feature position refinement.
Find the transform function by projecting a 3D patch created from the current image onto the reference image. (Figure: 3D patch on the image sphere, current image, and reference image.)
30
Image Processing: find the transform function.
Projecting every patch pixel may lead to a computational cost problem, so use a perspective transform as the transform function instead; only 4 projected points are needed to calculate it. (Figure: real distortion vs. perspective distortion.)
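A minimal sketch of approximating the patch warp with an OpenCV perspective transform estimated from the 4 projected corner points; the corner coordinates and patch size here are placeholders.

```python
# Minimal sketch: estimate a perspective transform from 4 projected corners and warp
# the patch, instead of projecting every patch pixel.
import cv2
import numpy as np

def warp_patch(reference_image, corners_in_reference, patch_size=15):
    src = np.float32(corners_in_reference)               # 4 projected patch corners
    dst = np.float32([[0, 0], [patch_size - 1, 0],
                      [patch_size - 1, patch_size - 1], [0, patch_size - 1]])
    M = cv2.getPerspectiveTransform(src, dst)             # 3x3 perspective transform
    return cv2.warpPerspective(reference_image, M, (patch_size, patch_size))
```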
31
Image Processing Feature associations – example
32
Image Processing: feature measurements.
Use feature points in the omni-image as measurement data. Each feature point must be converted into a bearing-only measurement in the form of yaw and pitch angles. (Figure: landmark ray r in the x, y, z camera frame. Speaker note: also include the figure with rx and ry.)
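A minimal sketch of converting a calibrated 3D viewing ray into the yaw/pitch bearing measurement; the axis convention (yaw in the x-y plane, pitch as elevation) is an assumption.

```python
# Minimal sketch: 3D viewing ray -> bearing-only measurement (yaw, pitch).
import numpy as np

def ray_to_bearing(ray):
    rx, ry, rz = ray
    yaw = np.arctan2(ry, rx)                   # angle in the x-y plane
    pitch = np.arctan2(rz, np.hypot(rx, ry))   # elevation above the x-y plane
    return yaw, pitch
```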
33
Outline Introduction Omni-directional Camera EKF-SLAM
Introduction to Problem Solution to Problem Image Processing SLAM Features and reference frames management Experimental Results Result Evaluation
34
Simultaneous localization and mapping (SLAM)
Use EKF SLAM to estimate the camera state, reference frame states, and landmark states. Prediction: determine how the camera moves (find the state transition model, i.e. the camera motion model). Correction: determine how a landmark is measured (find the observation model).
35
Simultaneous localization and mapping (SLAM)
Input: measurement data from the omni-image. Output: the estimated SLAM state (camera state, reference frame states, and landmark states).
36
Simultaneous localization and mapping (SLAM)
Prediction: determine how the camera moves. But the camera motion is unpredictable, so assume the camera can move freely in any direction with some limited velocity. (Figure: before prediction vs. after prediction.)
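One simple way to realise the bounded-velocity assumption in code is a random-walk prediction whose process noise limits the per-frame motion; this is an illustrative numpy sketch, with an assumed state layout and assumed noise magnitudes, not necessarily the thesis model.

```python
# Minimal sketch: prediction with no motion model (random walk with bounded step).
import numpy as np

def predict_free_motion(x, P, camera_idx, sigma_pos=0.05, sigma_rot=0.05):
    # camera_idx: indices of the 6-DoF camera pose block (assumption about the layout).
    # The state is unchanged; only the camera-block uncertainty grows.
    Q = np.zeros_like(P)
    Q[np.ix_(camera_idx, camera_idx)] = np.diag(
        [sigma_pos**2] * 3 + [sigma_rot**2] * 3)
    return x.copy(), P + Q
```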
37
Simultaneous localization and mapping (SLAM)
Correction: use a batch of measurements (including the current measurement data and old measurement data at the reference frames) to update the SLAM state. (Figure: a landmark observed from the current camera and from reference frames.)
38
Simultaneous localization and mapping (SLAM)
Correction: the measurement data for landmark i and the observation model for each measurement, where y'landmark is the landmark position expressed in the X coordinate frame.
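A minimal sketch of a bearing observation model of this kind, assuming the observing frame (camera or reference frame) is parameterised by a position and a rotation matrix and the measurement is a yaw/pitch pair; the axis conventions are assumptions.

```python
# Minimal sketch: bearing observation model for one landmark in one observing frame.
import numpy as np

def observe(frame_position, frame_rotation, landmark_world):
    # Transform the landmark from world coordinates into the observing frame.
    y_local = frame_rotation.T @ (landmark_world - frame_position)
    yaw = np.arctan2(y_local[1], y_local[0])
    pitch = np.arctan2(y_local[2], np.hypot(y_local[0], y_local[1]))
    return np.array([yaw, pitch])
```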
39
Simultaneous localization and mapping (SLAM)
The correction step can be separated into 2 parts: camera and reference frames correction, and landmarks correction.
40
Simultaneous localization and mapping (SLAM)
Camera and reference frames correction: assume that the measurement data measures the landmark positions accurately, so the correction affects only the camera state and reference frame states. (Figure: before correction vs. after correction.)
41
Simultaneous localization and mapping (SLAM)
Landmarks correction: assume that the camera state is accurate, so the correction affects only the landmark states. (Figure: before correction vs. after correction.)
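One illustrative way to implement a correction that updates only part of the state is to zero the Kalman-gain rows of the block held fixed and use the Joseph-form covariance update, which stays valid for a suboptimal gain. This is a sketch of that idea under those assumptions, not the thesis derivation.

```python
# Minimal sketch: EKF correction that updates only the states listed in update_idx.
import numpy as np

def correct_block(x, P, z, h, H, R, update_idx):
    y = z - h(x)                                   # innovation
    S = H @ P @ H.T + R                            # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)                 # Kalman gain
    keep = np.zeros((len(x), 1))
    keep[update_idx] = 1.0
    K = K * keep                                   # fixed states get zero gain
    I_KH = np.eye(len(x)) - K @ H
    P_new = I_KH @ P @ I_KH.T + K @ R @ K.T        # Joseph form, valid for any gain
    return x + K @ y, P_new

# Camera/reference-frame correction: update_idx = camera and reference-frame indices.
# Landmarks correction: update_idx = landmark indices.
```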
42
Outline Introduction Omni-directional Camera EKF-SLAM
Introduction to Problem Solution to Problem Image Processing SLAM Features and reference frames management Experimental Results Result Evaluation
43
Features and reference frames management
Remove features when the feature point is out of the image bounds, or when the landmark position is not accurate enough and the feature of this landmark is out of bounds.
44
Features and reference frames management
Add features. Add new features: use the Harris corner detector to detect new features, and add new features when we have a new reference frame. Add old features: consider that an old landmark may appear in the omni-image again. (Speaker note: when do we add a reference frame?)
45
Features and reference frames management
Add new features: add the new landmarks to the SLAM state, estimating each landmark position by assuming a large variance for the range data. (Speaker note: when do we add a reference frame?)
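A minimal sketch of initialising a landmark from a single bearing under this large-range-variance assumption; the nominal depth and variance values are placeholders.

```python
# Minimal sketch: initialise a new landmark along its bearing ray with large depth variance.
import numpy as np

def init_landmark(cam_position, ray_world, nominal_depth=3.0,
                  sigma_depth=5.0, sigma_lateral=0.1):
    ray_world = ray_world / np.linalg.norm(ray_world)
    landmark = cam_position + nominal_depth * ray_world       # point along the bearing ray
    d = ray_world.reshape(3, 1)
    along = d @ d.T                                           # projector onto the ray
    across = np.eye(3) - along                                # projector across the ray
    cov = sigma_depth**2 * along + sigma_lateral**2 * across  # large range uncertainty
    return landmark, cov
```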
46
Features and reference frames management
Add old features: project an old landmark into the current image and check whether the feature is available in the image using template matching. (Figure: landmark and feature on the image sphere.)
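A minimal sketch of this re-acquisition check with OpenCV template matching; the predicted pixel position is assumed to come from projecting the landmark through the calibration model, and the patch size, search radius, and acceptance threshold are assumptions.

```python
# Minimal sketch: check whether an old landmark's feature is visible again.
import cv2

MATCH_THRESHOLD = 0.8   # hypothetical acceptance threshold for normalized correlation

def try_readd_landmark(curr_gray, stored_patch, predicted_uv, search_radius=20):
    u, v = int(predicted_uv[0]), int(predicted_uv[1])
    ph, pw = stored_patch.shape[:2]
    y0, x0 = max(v - search_radius, 0), max(u - search_radius, 0)
    region = curr_gray[y0:y0 + 2 * search_radius + ph, x0:x0 + 2 * search_radius + pw]
    if region.shape[0] < ph or region.shape[1] < pw:
        return None                                    # predicted position is out of bounds
    scores = cv2.matchTemplate(region, stored_patch, cv2.TM_CCOEFF_NORMED)
    _, best_score, _, best_loc = cv2.minMaxLoc(scores)
    if best_score < MATCH_THRESHOLD:
        return None                                    # feature not found in the image
    return (x0 + best_loc[0] + pw / 2.0, y0 + best_loc[1] + ph / 2.0)
```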
47
Features and reference frames management
Add a reference frame when there is no suitable reference frame for feature tracking, or when the number of landmarks falls below some threshold; select the current camera state as the new reference frame.
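A minimal sketch of this rule; the threshold and the "suitable reference" test are placeholders. When the check succeeds, the current camera state would be copied into the SLAM state as a new reference frame.

```python
# Minimal sketch: decide whether to add a new reference frame.
MIN_TRACKED_LANDMARKS = 15   # hypothetical threshold

def should_add_reference_frame(has_suitable_reference: bool, tracked_landmarks: int) -> bool:
    return (not has_suitable_reference) or tracked_landmarks < MIN_TRACKED_LANDMARKS
```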
48
Outline Introduction Omni-directional Camera EKF-SLAM
Introduction to Problem Solution to Problem Image Processing SLAM Features and reference frames management Experimental Results Result Evaluation
49
Experimental Results
50
Experimental Results
51
Experimental Results
52
Experimental Results
53
Experimental Results
54
Experimental Results
55
Outline Introduction Omni-directional Camera EKF-SLAM
Introduction to Problem Solution to Problem Image Processing SLAM Features and reference frames management Experimental Results Result Evaluation
56
Result Evaluation: localization evaluation (2D and 3D) and mapping evaluation.
57
Result Evaluation 2D Localization Evaluation
Use a Wiimote as a bird's-eye-view camera to detect an IR point mounted on the omni camera while a mobile robot carries it across a 2D plane. (Figure: IR point.)
58
Result Evaluation 2D Localization Evaluation
59
Result Evaluation 2D Localization Evaluation
60
Result Evaluation 2D Localization Evaluation
61
Result Evaluation 3D Localization Evaluation
Use a Wiimote attached to the omni-directional camera to localize the 3D camera position relative to a reference IR board.
62
Result Evaluation 3D Localization Evaluation
63
Result Evaluation 3D Localization Evaluation
64
Result Evaluation 3D Localization Evaluation
65
Result Evaluation Mapping Evaluation
Compare the mapping result with an environment of known structure.
66
Result Evaluation Mapping Evaluation
67
Conclusion. Summary: our algorithm can localize the camera position and build up a map in 3D using only omni-camera images, while the omni-directional camera moves freely in an unknown indoor environment without knowledge of the camera motion model. Evaluation: the results show the correspondence of the localization and mapping outcomes with the ground truth.
68
Thank you
69
Statistics by location (Conf. room / Corridor / Stairway):
  Max feature count:        38 / 36 / 28
  Min feature count:        14 / 13 / 10
  Avg. feature count:       (values missing)
  Max landmark count:       43, 65 (one value missing)
  Min landmark count:       25 (other values missing)
  Avg. landmark count:      (values missing)
  Max time per frame (ms):  (values missing)
  Min time per frame (ms):  19.657 (other values missing)
  Avg. time per frame (ms): (values missing)
  Frame count:              804 / 1157 / 747
70
Visual SLAM for 3D Large-Scale Seabed Acquisition Employing Underwater Vehicles
71
Featureless Vehicle-Based Visual SLAM with a Consumer Camera
72
Scan-SLAM: Combining EKF-SLAM and Scan Correlation
73
The Problem.
Input: a captured image sequence from an omni-directional camera. Output: a camera state (3D position and direction) and an environment map (3D landmark positions).
74
Omni camera Calibration
75
Omni camera Calibration
76
Image Processing: feature detection (for new features).
Use point features; find corners in the image using the Harris corner detector.
77
Simultaneous localization and mapping (SLAM)
Prediction: determine how the camera moves. But the camera motion is unpredictable, so assume the camera can move freely in any direction with some limited velocity, then compute the predicted state and predicted estimate covariance.
78
Simultaneous localization and mapping (SLAM)
Correction: the measurement data for landmark i and the observation model for landmark i, where y' is produced by the transform function that transforms a landmark position (y) from world coordinates into the reference coordinate frame (x).
79
Simultaneous localization and mapping (SLAM)
Camera and reference frames correction: assume that the measurement data measures the landmark positions accurately, so the correction affects only the camera state and reference frame states. The slide lists the innovation (measurement residual), innovation covariance, optimal Kalman gain, updated state estimate, updated estimate covariance, and the Jacobian matrices of the observation function h evaluated at the predicted state.
80
Simultaneous localization and mapping (SLAM)
Landmarks correction: assume that the camera state is accurate, so the correction affects only the landmark states. The slide lists the innovation (measurement residual), innovation covariance, optimal Kalman gain, updated state estimate, updated estimate covariance, and the Jacobian matrices of the observation function h evaluated at the predicted state.
81
Solution to Problem: SLAM state. Camera state – represents the camera frame.
Reference frame states – represent the reference frames. Landmark states – represent landmark positions.
82
Features and reference frames management
Select a reference frame for feature tracking. (Figure: reference frame and camera frame axes x, y, z.)
83
Features and reference frames management
Select a reference frame for updating the SLAM state. (Figure: reference frame and camera frame axes x, y, z.)
84
(Figure: feature ray from the camera to a landmark, with yaw and pitch angles in the camera's x, y, z frame.)