3D SLAM for Omni-directional Camera Yuttana Suttasupa Advisor: Asst. Prof. Attawith Sudsang
Introduction Localization: the robot can estimate its location with respect to landmarks in an environment Mapping: the robot can reconstruct the positions of landmarks that it encounters in an environment SLAM: the robot builds up a map and localizes itself simultaneously while traversing an unknown environment
The Problem Propose a SLAM method for a hand-held omni-directional camera The camera moves freely in an unknown indoor environment with no camera motion model Uses only bearing data from omni-images and needs no initialization information Reconstructs the 3D camera path and a 3D landmark-based environment map
The Problem Input a captured image sequence from an omni-directional camera
The Problem Output a camera state - 3D position and orientation an environment map - 3D landmark positions
Outline Introduction Omni-directional Camera EKF-SLAM Introduction to Problem Solution to Problem Image Processing SLAM Features and reference frames management Experimental Results Result Evaluation
Omni-directional Camera Our omni-directional camera Two parabolic mirrors CCD camera with 640×480 pixels at 29.97 Hz 360° horizontal field of view -5° to 65° vertical field of view
Omni-directional Camera Normal camera vs. omni-directional camera (Presenter note: motivation for using an omni camera; others have not done this)
Omni camera Calibration Find the mapping function from 2D image points to 3D rays Using the Omnidirectional Camera Calibration Toolbox (Scaramuzza et al., 2006)
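As a hedged sketch (not the toolbox's actual API), the pixel-to-ray mapping recovered by such a calibration can be approximated with the Scaramuzza polynomial model; the image center and coefficients below are made-up values:

```python
import numpy as np

def pixel_to_ray(u, v, center, coeffs):
    """Back-project an image pixel onto a 3D viewing ray using the
    Scaramuzza polynomial model (center and coeffs are hypothetical)."""
    # Offset from the image center of the omni lens
    x = u - center[0]
    y = v - center[1]
    rho = np.hypot(x, y)
    # z = a0 + a2*rho^2 + a3*rho^3 + a4*rho^4  (a1 = 0 in this model)
    z = coeffs[0] + coeffs[2] * rho**2 + coeffs[3] * rho**3 + coeffs[4] * rho**4
    ray = np.array([x, y, z])
    return ray / np.linalg.norm(ray)  # unit-length viewing ray

# Example with made-up calibration values
ray = pixel_to_ray(400.0, 300.0, center=(320.0, 240.0),
                   coeffs=[-160.0, 0.0, 1e-3, 0.0, 0.0])
```

In the real system the center and coefficients come from the calibration toolbox; the unit ray is what the later bearing-only measurement step consumes.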
Outline Introduction Omni-directional Camera EKF-SLAM Introduction to Problem Solution to Problem Image Processing SLAM Features and reference frames management Experimental Results Result Evaluation
EKF SLAM Using an extended Kalman filter to solve the SLAM problem Assume the robot position and map probability distributions are Gaussian Predict the robot position and landmark distributions using a robot motion model Correct the distributions using an observation model
EKF SLAM The distribution representation The state vector and the state covariance together represent the robot probability distribution Initial state: assume the robot position distribution starts with some given value
EKF SLAM Prediction step Using a robot motion model to predict the robot position Predicted state: x(k|k-1) = f(x(k-1|k-1)) Predicted estimate covariance: P(k|k-1) = F P(k-1|k-1) F^T + Q (Presenter note: details are in the thesis; is the motion model known?)
EKF SLAM Correction step Using an observation model h to update the robot position and landmark positions from a landmark measurement z Innovation residual: y = z - h(x(k|k-1)) Innovation covariance: S = H P(k|k-1) H^T + R Optimal Kalman gain: K = P(k|k-1) H^T S^-1 Updated state estimate: x(k|k) = x(k|k-1) + K y Updated estimate covariance: P(k|k) = (I - K H) P(k|k-1)
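The EKF predict and correct equations can be sketched generically as follows (a minimal dense-matrix version, not the thesis implementation; the 1D demo values are arbitrary):

```python
import numpy as np

def ekf_predict(x, P, f, F, Q):
    """Prediction: push the state through motion model f (Jacobian F),
    inflating the covariance by process noise Q."""
    return f(x), F @ P @ F.T + Q

def ekf_update(x, P, z, h, H, R):
    """Correction: fuse measurement z via observation model h
    (Jacobian H, noise R) using the standard Kalman gain."""
    y = z - h(x)                          # innovation residual
    S = H @ P @ H.T + R                   # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)        # optimal Kalman gain
    x_new = x + K @ y                     # updated state estimate
    P_new = (np.eye(len(x)) - K @ H) @ P  # updated estimate covariance
    return x_new, P_new

# 1D demo: prior at 0 with unit variance, measurement at 1 with unit noise;
# the posterior mean lands halfway, at 0.5
x_new, P_new = ekf_update(np.array([0.0]), np.array([[1.0]]),
                          z=np.array([1.0]), h=lambda s: s,
                          H=np.eye(1), R=np.eye(1))
```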
Outline Introduction Omni-directional Camera EKF-SLAM Introduction to Problem Solution to Problem Image Processing SLAM Features and reference frames management Experimental Results Result Evaluation
Introduction to Problem Feature detection problem: how can a computer recognize objects in an image? Feature association problem: how can we find feature correspondences between two images?
Introduction to Problem Observability problem The camera gives only bearing-only data How can we estimate a high-dimensional state from low-dimensional measurements? (Figure: a camera observing a landmark; how far away is it?)
Outline Introduction Omni-directional Camera EKF-SLAM Introduction to Problem Solution to Problem Image Processing SLAM Features and reference frames management Experimental Results Result Evaluation
Solution to Problem The proposed algorithm consists of 3 steps Image processing: detect features, find feature associations, calculate feature measurements SLAM: apply the measurement data to SLAM Features and reference frames management: add and remove features from the SLAM state; add and remove reference frames from the SLAM state
Solution to Problem System coordinates World frame, camera frame, and reference frames (Figure: a landmark and the camera frame, world frame, and a reference frame)
Solution to Problem SLAM State Camera state – represent camera frame Reference frame states – represent reference frames Landmark states – represent landmark positions
Outline Introduction Omni-directional Camera EKF-SLAM Introduction to Problem Solution to Problem Image Processing SLAM Features and reference frames management Experimental Results Result Evaluation
Image Processing Input: an image from the omni-directional camera and the old SLAM state Output: feature measurements and feature associations
Image Processing Feature detection (for new features) Using point features Find corners in the image using the Harris corner detector
Image Processing Feature associations Describe which landmark each feature in the current image is associated with Find the relation between the current image and old features in a previous image Using optical flow to track features Using template matching to refine feature positions
Image Processing Feature associations – feature tracking Track features from the previous image to get the current feature positions Using pyramidal Lucas-Kanade optical flow
Image Processing Feature associations – feature position refinement Tracking features with optical flow may cause feature drift Using pyramid template matching to correct the feature positions (Figure: search region and feature patch; the current image with a drifted feature, and the result after refinement using template matching)
Image Processing Feature associations – feature position refinement Select a patch from a reference image Patch rotation and scale may not match, so a transform function may need to be applied to the patch (Figure: no match against the reference image; match in the current image after transformation)
Image Processing Feature associations – feature position refinement Find the transform function by projecting a 3D patch created from the current image onto the reference image (Figure: 3D patch on the image sphere between the current image and the reference image)
Image Processing Find the transform function Projecting every patch pixel may be too computationally expensive Use a perspective transform as the transform function instead Only 4 projected points are needed to calculate a perspective transform (Figure: real distortion vs. perspective distortion)
Image Processing Feature associations – example
Image Processing Feature measurements Using feature points in the omni-image as measurement data Feature points must be converted into bearing-only measurements in the form of yaw and pitch angles (Figure: a landmark and its ray r in the x-y-z axes; presenter note: also include a figure showing rx and ry)
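Converting a calibrated viewing ray into the yaw/pitch bearing measurement is a small trigonometric step; a sketch (the axis conventions are assumed):

```python
import numpy as np

def ray_to_bearing(ray):
    """Turn a 3D viewing ray (rx, ry, rz) into a bearing-only (yaw, pitch)
    pair: yaw is the angle in the x-y plane, pitch the elevation out of it."""
    rx, ry, rz = ray
    yaw = np.arctan2(ry, rx)
    pitch = np.arctan2(rz, np.hypot(rx, ry))
    return yaw, pitch

# A ray rising at 45 degrees, halfway between the x and y axes:
# both angles come out as pi/4
yaw, pitch = ray_to_bearing([1.0, 1.0, np.sqrt(2.0)])
```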
Outline Introduction Omni-directional Camera EKF-SLAM Introduction to Problem Solution to Problem Image Processing SLAM Features and reference frames management Experimental Results Result Evaluation
Simultaneous localization and mapping (SLAM) Using EKF SLAM to estimate the camera state, reference frame states and landmark states Prediction: determine how the camera moves; find the state transition model (camera motion model) Correction: determine how landmarks are measured; find the observation model
Simultaneous localization and mapping (SLAM) Input Measurement data from the omni-image Output Estimated SLAM state Camera state Reference frame states Landmark states
Simultaneous localization and mapping (SLAM) Prediction Determine how the camera moves But the camera motion is unpredictable Assume the camera can move freely in any direction with bounded velocity (Figure: before and after prediction)
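One common way to encode "moves freely with bounded velocity" is a constant-velocity model with large acceleration noise; a sketch (the state layout and noise magnitude are assumptions, not the thesis's exact model):

```python
import numpy as np

dt = 1.0 / 29.97  # frame period of the camera
q_acc = 0.5       # hypothetical acceleration noise (m/s^2)

# State: [px, py, pz, vx, vy, vz]; F integrates velocity into position
F = np.eye(6)
F[:3, 3:] = dt * np.eye(3)

# Process noise grows with dt, letting the filter absorb unmodeled motion
G = np.vstack([0.5 * dt**2 * np.eye(3), dt * np.eye(3)])
Q = q_acc**2 * (G @ G.T)

x = np.array([0.0, 0.0, 0.0, 0.3, 0.0, 0.0])  # moving along x at 0.3 m/s
P = np.eye(6) * 0.01
x_pred = F @ x               # position advances by v * dt
P_pred = F @ P @ F.T + Q     # uncertainty inflates every frame
```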
Simultaneous localization and mapping (SLAM) Correction Using a batch of measurements (current measurement data together with old measurement data at the reference frames) to update the SLAM state (Figure: a landmark observed from the current camera and from reference frames)
Simultaneous localization and mapping (SLAM) Correction Measurement data for landmark i Observation model for each measurement, where y' is the landmark position expressed in the coordinate frame of X
Simultaneous localization and mapping (SLAM) The correction step can be separated into 2 parts Camera and reference frames correction Landmarks correction
Simultaneous localization and mapping (SLAM) Camera and reference frames correction Assume the measurement data measures landmark positions accurately The correction then affects only the camera state and reference frame states (Figure: before and after correction)
Simultaneous localization and mapping (SLAM) Landmarks correction Assume the camera state is accurate The correction then affects only the landmark states (Figure: before and after correction)
Outline Introduction Omni-directional Camera EKF-SLAM Introduction to Problem Solution to Problem Image Processing SLAM Features and reference frames management Experimental Results Result Evaluation
Features and reference frames management Remove features when The feature point is out of the image bounds The landmark position is not accurate enough (Figure: the feature of this landmark is out of bounds)
Features and reference frames management Add features Add new features: use the Harris corner detector to detect new features; add new features when we have a new reference frame Add old features: an old landmark may appear in the omni image again (Presenter note: when should a reference frame be added?)
Features and reference frames management Add new features Add new landmarks to the SLAM state Estimate each landmark position by assuming a large variance for the range data
Features and reference frames management Add old features Project an old landmark onto the current image Check whether the feature is present in the image using template matching (Figure: the landmark projected through the image sphere to a feature)
Features and reference frames management Add a reference frame When no reference frame is suitable for feature tracking When the number of landmarks falls below some threshold Select the current camera state as the new reference frame
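The two conditions above reduce to a small decision rule; a sketch with a hypothetical landmark-count threshold:

```python
def should_add_reference_frame(tracked_landmarks, min_landmarks=15,
                               suitable_reference_exists=True):
    """Decide whether to promote the current camera state to a reference
    frame (min_landmarks=15 is a hypothetical threshold, not the thesis's).
    Add one when no existing reference suits tracking, or when too few
    landmarks remain tracked."""
    if not suitable_reference_exists:
        return True
    return len(tracked_landmarks) < min_landmarks
```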
Outline Introduction Omni-directional Camera EKF-SLAM Introduction to Problem Solution to Problem Image Processing SLAM Features and reference frames management Experimental Results Result Evaluation
Experimental Results
Outline Introduction Omni-directional Camera EKF-SLAM Introduction to Problem Solution to Problem Image Processing SLAM Features and reference frames management Experimental Results Result Evaluation
Result Evaluation Localization evaluation: 2D localization evaluation and 3D localization evaluation Mapping evaluation
Result Evaluation 2D Localization Evaluation Using a Wiimote as a bird's-eye-view camera Detect an IR point mounted on the omni camera while a mobile robot traverses a 2D plane (Figure: IR point)
Result Evaluation 3D Localization Evaluation Using a Wiimote attached to the omni-directional camera to localize the 3D camera position relative to a reference IR board
Result Evaluation Mapping Evaluation Compare the mapping result with an environment of known structure
Conclusion Summary Our algorithm can localize the camera position and build a 3D map using only omni-camera images The omni-directional camera can move freely in an unknown indoor environment without a camera motion model Evaluation The results show that the localization and mapping outcomes correspond with the ground truth
Thank you
Statistic
                            Conf. room    Corridor    Stairway
Max feature count                   38          36          28
Min feature count                   14          13          10
Avg. feature count             23.9279     20.4866     13.7657
Max landmark count                  43          65           -
Min landmark count                  25           -           -
Avg. landmark count            33.1654     45.0207     19.7684
Max time per frame (ms)        1019.73     1424.87     703.188
Min time per frame (ms)        34.9818      19.657     15.1698
Avg. time per frame (ms)       160.925     479.858     172.845
Frame count                        804        1157         747
Visual SLAM for 3D Large-Scale Seabed Acquisition Employing Underwater Vehicles
Featureless Vehicle-Based Visual SLAM with a Consumer Camera
Scan-SLAM: Combining EKF-SLAM and Scan Correlation
Simultaneous localization and mapping (SLAM) Prediction Determine how the camera moves But the camera motion is unpredictable Assume the camera can move freely in any direction with bounded velocity Predicted state: x(k|k-1) = f(x(k-1|k-1)) Predicted estimate covariance: P(k|k-1) = F P(k-1|k-1) F^T + Q
Simultaneous localization and mapping (SLAM) Correction Measurement data for landmark i Observation model for landmark i, where y' is given by a transform function that transforms the landmark position y from the world coordinate frame to the reference coordinate frame x
Simultaneous localization and mapping (SLAM) Camera and reference frames correction Assume the measurement data measures landmark positions accurately, so the correction affects only the camera state and reference frame states Innovation (measurement residual): y = z - h(x) Innovation covariance: S = H P H^T + R Optimal Kalman gain: K = P H^T S^-1 Updated state estimate: x' = x + K y Updated estimate covariance: P' = (I - K H) P where H is the Jacobian matrix of the observation function h with respect to the camera and reference frame states, evaluated at the current estimate
Simultaneous localization and mapping (SLAM) Landmarks correction Assume the camera state is accurate, so the correction affects only the landmark states Innovation (measurement residual): y = z - h(x) Innovation covariance: S = H P H^T + R Optimal Kalman gain: K = P H^T S^-1 Updated state estimate: x' = x + K y Updated estimate covariance: P' = (I - K H) P where H is the Jacobian matrix of the observation function h with respect to the landmark states, evaluated at the current estimate
Features and reference frames management Select a reference frame for feature tracking (Figure: x-y-z axes of the reference frame and the camera frame)
Features and reference frames management Select a reference frame for updating the SLAM state (Figure: x-y-z axes of the reference frame and the camera frame)
(Figure: the feature ray from the camera to a landmark, with yaw and pitch angles in the camera's x-y-z frame)