3D pose by Kinect -- Reading the work "Point-Plane SLAM for Hand-Held 3D Sensors" (a beta version). A tutorial by KH Wong. 3d kinect tracking v.5d (Beta)



Overview This tutorial is based on the paper [1] and discusses various problems and techniques in it. [1] Yuichi Taguchi, Yong-Dian Jian, Srikumar Ramalingam, and Chen Feng, "Point-Plane SLAM for Hand-Held 3D Sensors", Proc. IEEE International Conference on Robotics and Automation (ICRA), 2013.

Introduction The work uses 3-D points and planes, obtained by a range sensor (the Kinect), as features for 3-D pose estimation and reconstruction, to achieve SLAM (Simultaneous Localization and Mapping).
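Before any points or planes can be used as features, the Kinect depth image has to be back-projected into 3-D points. As a minimal sketch (the pinhole intrinsics fx, fy, cx, cy below are hypothetical placeholder values, not the paper's calibration):

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth image (metres) into an N x 3 point cloud
    using the pinhole model: X = (u-cx)*Z/fx, Y = (v-cy)*Z/fy."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]   # drop pixels with no depth reading

# toy example: a flat wall 2 m away, tiny 4x4 image, made-up intrinsics
depth = np.full((4, 4), 2.0)
pts = depth_to_points(depth, fx=525.0, fy=525.0, cx=1.5, cy=1.5)
print(pts.shape)  # (16, 3)
```

Plane features would then be fitted to subsets of such points; the sketch only covers the point back-projection step.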

Problem definition To be written later; see the paper.

Derivation I have a problem in understanding eq. (13); however, I can derive the result from eq. (12) to eq. (15) directly. The steps are shown below. Problem and notation: there are i = 1,..,N 3-D points, first at the landmark positions and then moved to the measured positions. In eq. (12):
– P_i^l = i-th landmark 3-D point (= model point) position
– P_{i,k}^m = i-th measured 3-D point at time k
– We want to find the pose (R = rotation, t_k = translation) at time k such that P_i^l = R*P_{i,k}^m + t_k
Finding R, t_k is the same as minimizing sum_i || P_i^l – (R*P_{i,k}^m + t_k) ||^2 (eq. 12)
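For noise-free correspondences, the minimizer of the eq. (12) cost also has a closed-form solution via SVD (the Kabsch method). This is not the Newton iteration discussed later in these slides, but it is a useful sanity check that eq. (12) really pins down R and t; a minimal sketch:

```python
import numpy as np

def fit_pose(P_l, P_m):
    """Least-squares R, t minimising sum_i ||P_l_i - (R P_m_i + t)||^2
    (the cost of eq. (12)) via the closed-form SVD / Kabsch method."""
    mu_l, mu_m = P_l.mean(0), P_m.mean(0)
    H = (P_m - mu_m).T @ (P_l - mu_l)      # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # the det correction keeps R a proper rotation (no reflection)
    D = np.diag([1, 1, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = mu_l - R @ mu_m
    return R, t

rng = np.random.default_rng(0)
P_m = rng.standard_normal((10, 3))        # fabricated measured points
a = 0.3                                    # ground-truth rotation about z
R_true = np.array([[np.cos(a), -np.sin(a), 0],
                   [np.sin(a),  np.cos(a), 0],
                   [0, 0, 1]])
t_true = np.array([0.5, -0.2, 1.0])
P_l = P_m @ R_true.T + t_true              # landmark points P_l = R P_m + t
R, t = fit_pose(P_l, P_m)
print(np.allclose(R, R_true), np.allclose(t, t_true))  # True True
```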

Continue (slides 6-9: the derivation steps from eq. (12) to eq. (15) appear as equation images in the original slides)

Iterative formula using Newton's method
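The iterative formula itself appears as an image in the original slide. As a reconstruction (an assumption; the slide image may differ in detail), a Newton-style least-squares update on the parameter vector p for the cost of eq. (12) takes the form:

```latex
r_i(p) = P_i^{l} - \bigl(R(p)\,P_{i,k}^{m} + t(p)\bigr), \qquad
J = \frac{\partial r}{\partial p}, \qquad
p_{n+1} = p_n - \bigl(J^{\top}J\bigr)^{-1} J^{\top}\, r(p_n)
```

Here r stacks the residuals of all N points and p collects the pose parameters.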

Newton's method: Algorithm

Known problem I do not understand why [x,y,z]' needs to be guessed and calculated, because it should be given: the landmark [x,y,z]' is given by the Kinect measurement at k = 1, and the measured [x,y,z]' at time k should also be given by the Kinect sensor. So I think the system is not j = 1,..,9 but j = 1,..,6, meaning the parameters to be guessed are (tx, ty, tz, θx, θy, θz), excluding (x,y,z). We should test this idea.
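The 6-parameter idea can be tested numerically: fabricate a known pose, keep the 3-D points fixed as given data, and check that a Newton-style iteration over only (tx,ty,tz,θx,θy,θz) recovers the pose. A minimal sketch with a numerical Jacobian (the Rz·Ry·Rx Euler convention is an assumption; the slides do not fix which convention the paper uses):

```python
import numpy as np

def rot(thx, thy, thz):
    # R = Rz @ Ry @ Rx -- one common Euler convention (an assumption)
    cx, sx = np.cos(thx), np.sin(thx)
    cy, sy = np.cos(thy), np.sin(thy)
    cz, sz = np.cos(thz), np.sin(thz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def residual(p, P_l, P_m):
    # stacked residuals  P_l_i - (R(p) P_m_i + t)  of eq. (12)
    t, th = p[:3], p[3:]
    return (P_l - (P_m @ rot(*th).T + t)).ravel()

def solve_pose(P_l, P_m, iters=15, h=1e-6):
    # Newton-style update p <- p - (J^T J)^{-1} J^T r over the SIX
    # unknowns (tx,ty,tz,thx,thy,thz); (x,y,z) stay fixed as data
    p = np.zeros(6)
    for _ in range(iters):
        r = residual(p, P_l, P_m)
        J = np.empty((r.size, 6))
        for j in range(6):             # forward-difference Jacobian
            dp = np.zeros(6)
            dp[j] = h
            J[:, j] = (residual(p + dp, P_l, P_m) - r) / h
        p = p - np.linalg.solve(J.T @ J, J.T @ r)
    return p

# test: a fabricated ground-truth pose should be recovered exactly
rng = np.random.default_rng(1)
P_m = rng.standard_normal((20, 3))
p_true = np.array([0.4, -0.1, 0.8, 0.1, -0.2, 0.3])
P_l = P_m @ rot(*p_true[3:]).T + p_true[:3]
p_est = solve_pose(P_l, P_m)
print(np.allclose(p_est, p_true, atol=1e-5))  # True
```

Since the six parameters are recovered from the given point data alone, the points themselves never need to be part of the unknown vector, which supports the j = 1,..,6 reading above.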