Image Processing and Computer Vision, Chapter 10: Pose estimation by the iterative method (restart at week 10). Pose estimation V4h3.



Overview
- Define the terms
- Define structure from motion (SFM)
- Methods for SFM
- Define pose estimation, and why we need to study it
- Newton's method
- Iterative algorithm for pose estimation

Define the terms
- 3D model: X_i = [X, Y, Z]^T, where i = feature index = 1, 2, ..., n features. X can be found by manual measurement, e.g. X_{i=1} = [102, 18, 23]^T, X_{i=2} = [92, 126, 209]^T.
- Pose: θ_t is the rotation (R) and translation (T) of the object at time t, where θ_t = {R_3x3, T_3x1}_t.
- Image point: q_{i,t} = [u, v]^T_{i,t} is the image point of the i-th 3D feature at time t, where u = horizontal image position and v = vertical image position.

What is structure from motion (SFM)?
SFM recovers the 3D model X_i (where i = feature index = 1, 2, ..., n features) and the pose at each time from a sequence of images taken at times t = 1, 2, 3, ... (see figure).

Methods of structure from motion (SFM): 3D reconstruction from N frames
- Factorization (linear, fast, but not too accurate).
- Bundle adjustment (BA) (slower but more accurate); it can use the factorization result as the first guess.
  - Non-linear iterative methods are more accurate than linear methods, but they require a first guess (e.g. from factorization).
  - There are many different implementations, but the concept is the same.
  - Two-step bundle adjustment (a special form of BA):
    - Iterative pose estimation
    - Iterative structure reconstruction

Motivation
To understand bundle adjustment for 3-D model structure reconstruction from N frames, we first need to understand pose estimation.
Pose estimation problem definition: there are n features in a known 3D object, and we take m pictures of the object from different views.
Input:
- The n model points of the object.
- An image sequence I_1, I_2, ..., I_m; each image has n image feature points.
Output (structure = model, and motion = pose):
- The pose (R = rotation, T = translation) of the object in 3-D for each image.
- The model of the object (X, Y, Z of each of the n feature points on the object).

Example: bundle adjustment 3D reconstruction (see also the Grand Canyon demo and the flask robot demo).

The iterative SFM: alternating (two-step) bundle adjustment
Break the system down into two phases:
- SFM1: find-pose phase
- SFM2: find-model phase
Initialize the first guess of the model:
- The first guess is a flat model perpendicular to the optical axis at a distance Zinit away (e.g. Zinit = 0.5 meters, or any reasonable guess).
Iterate while the error (Err) is not small:
{
  SFM1: find-pose phase
  SFM2: find-model phase
  Evaluate the measurement error (Err), or check whether the model and pose have stabilized.
}
A minimal sketch of this alternation loop follows below.
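A minimal MATLAB sketch of the alternation, assuming hypothetical helper functions findPose (SFM1), findStructure (SFM2) and reprojectionError that are not defined in these notes; the array shapes are illustrative assumptions.

function [model, poses] = twoStepBA(q, f, Zinit)
% q: 2 x n x m tracked image points, f: focal length, Zinit: first-guess depth.
[~, n, m] = size(q);
% First guess: a flat model at depth Zinit, obtained by back-projecting frame 1
% (u = f*X/Z  =>  X = u*Z/f, and similarly for Y).
model = [q(:,:,1) * Zinit / f; Zinit * ones(1, n)];          % 3 x n
poses = repmat(struct('R', eye(3), 'T', [0; 0; 0]), m, 1);
err = inf;
while err > 1e-6
    for t = 1:m                                              % SFM1: find pose
        poses(t) = findPose(model, q(:,:,t), f, poses(t));   % hypothetical helper
    end
    model = findStructure(poses, q, f, model);               % SFM2: find model
    err = reprojectionError(model, poses, q, f);             % hypothetical helper
end
end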

Define SFM1: the pose estimation algorithm (assume the model is found or given)
- Use KLT (the FeatureDetector interface or lkdemo.c in OpenCV) to obtain features in [u, v]^T.
- There are t = 1, 2, ..., τ image frames, so there are τ poses: θ_{t=1} = {R, T}_{t=1}, θ_{t=2} = {R, T}_{t=2}, ..., θ_{t=τ} = {R, T}_{t=τ}.
- Given: the focal length f and one model M_i = [X, Y, Z]_i, with i = 1, ..., N features.
- Initialize the first guess of the model: a flat model perpendicular to the optical axis at a distance Zinit away (e.g. Zinit = 0.5 meters, or any reasonable guess).
- For (t = 1; t <= τ; t++)
  { for every time frame t, use all N features and run SFM1 once;
    so SFM1 (find the pose θ_t) runs τ times here. }
- After the above loop, the poses θ_{t=1} = {R, T}_{t=1}, θ_{t=2} = {R, T}_{t=2}, ..., θ_{t=τ} = {R, T}_{t=τ} are found.

Define SFM2: the structure estimation algorithm (assume the poses are found or given)
To be discussed in the next chapter (bundle adjustment). It is similar to pose estimation:
- In pose estimation, the model is known and the pose is unknown.
- Here (model finding by the iterative method), the pose is assumed known and the model is unknown.
- The ideas of the two algorithms are similar.

SFM1: find the pose (R_3x3, T_3x1)_t. One image (taken at time t) is enough for finding the pose at time t, if the model is known.

Problem setting (figure).

Pose estimation problem definition (given the model points and one image at time t, find the pose)
- There are N 3-D feature points on the model; the relative positions of the 3D features are known through measurements.
- At time t (t = 1, ..., m) there are N image features {q_{i=1,...,N}}_t.
- Assume the correspondences are known for all i = 1, ..., N; that is, (X, Y, Z)_{i=1,...,N} corresponds to {q_{i=1,...,N}}_t.
- The target is to find R, T from {q_{i=1,...,N}}_t. Only one image at time t is needed.
(Figure: the N model points [X_i, Y_i, Z_i]^T are transformed by R (rotation) and T (translation) to [X'_i, Y'_i, Z'_i]^T in world coordinates.)
One possible data layout for this problem is sketched below.
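For concreteness, one possible data layout is sketched here; the array shapes and the example cube coordinates are illustrative assumptions, not prescribed by the notes.

% Illustrative data layout for the pose estimation problem at one time t.
N      = 4;
model  = [0 10 10  0;     % X (cm)  e.g. four corners of a 10 cm cube face
          0  0 10 10;     % Y (cm)
          0  0  0  0];    % Z (cm)
q_t    = zeros(2, N);     % measured [u; v] of each feature in the frame at t
pose_t = struct('R', eye(3), 'T', zeros(3, 1));   % unknown, to be estimated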

Example of pose estimation
- We know the 3-D positions of the features on this box (e.g. the 4 corners shown, corners of a 10 cm cube).
- In the image at time t, we know the correspondences: which corners appear in the image and their image positions (image correspondences).
- We can find R, T from this image at time t1.
(Figure: image points q_{i=1,t=0}, q_{i=2,t=0} at time t = 0 and q_{i=1,t=t1}, q_{i=2,t=t1} at time t = t1, related by R, T.)

Exercise 0: pose estimation and image correspondence
(a) What are the input and output of a pose estimation algorithm?
(b) How many images are enough for pose estimation?
(c) Estimate the correspondences and fill in the blanks.
(Figure: the images at time t = t0 and t = t1 with features q1, q2, q3, q4, and a table of their (u, v) positions at t = t0 and t = t1 to be filled in.)

Exercise 1: Newton's method (an iterative method)
An iterative method for finding the solution of a non-linear system.
Exercise 1: find sqrt(5), i.e. solve the non-linear function
  f(x) = x^2 - 5 = 0.
By the Taylor series (first-order approximation),
  f(x) = f(x0) + f'(x0)*(x - x0) ≈ 0,
and f'(x0) = 2*x0, so
  f(x) = f(x0) + 2*x0*(x - x0) ≈ 0.
First guess: x0 = 2.
  0 ≈ f(x0) + f'(x0)*(x - x0)
  [0 - f(x0)] / f'(x0) ≈ (x - x0) = Δx
  [0 - (x0^2 - 5)] / (2*x0) = Δx
Take x0 = 2: [0 - (2^2 - 5)] / (2*2) = Δx, so Δx = 1/4.
Since Δx ≈ (x - x0), where x = new guess and x0 = old guess,
  1/4 ≈ x - 2, so x ≈ 2.25.
That means the next guess is x ≈ 2.25.
Exercise: complete the steps to find the solution. For your reference, sqrt(5) = 2.2360679... (by calculator). A minimal MATLAB sketch of this iteration follows below.
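A minimal MATLAB sketch of the iteration; the function, its derivative and the starting guess x0 = 2 are exactly those used in the exercise.

% Newton's method for f(x) = x^2 - 5 = 0 (finding sqrt(5)).
f  = @(x) x.^2 - 5;        % the non-linear function
df = @(x) 2*x;             % its derivative
x  = 2;                    % first guess x0
for k = 1:5
    dx = (0 - f(x)) / df(x);   % delta_x = [0 - f(x0)] / f'(x0)
    x  = x + dx;               % new guess = old guess + delta_x
    fprintf('iteration %d: x = %.10f\n', k, x);
end
% converges to sqrt(5) = 2.2360679...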

The main idea of Newton's method
We saw this formula before:
  f(x) = f(x0) + f'(x0)*(x - x0)   ... (i)
From f(x) = f(x0) + f'(x0)*(x - x0) ≈ 0:
  0 ≈ f(x0) + f'(x0)*(x - x0)
  0 - f(x0) = f'(x0)*(x - x0)
  [0 - f(x0)] / f'(x0) = Δx = (x - x0)
We can compute Δx = [0 - f(x0)] / f'(x0); since Δx = (x - x0), x = x0 + Δx.
That means: x_new_guess = x0 (old guess) + Δx.

Pose estimation in 3D (figure: the camera center, the 3D object at t = t0 and at t = t1, and their image points q_{i=1,t0}, q_{i=2,t0}, q_{i=1,t1}, q_{i=2,t1}).

Exercise 2
From 3-D P = [X, Y, Z]^T to 2-D q = [u, v]^T: image projection. Here you relate the pose (R, T) and the model points (X, Y, Z)_i to the image point (u, v).
(Figure: the camera center, the 3D object, its image, and the image points q_{i=1,t}, q_{i=2,t}.)
A small numerical sketch of this projection is given below.
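For reference, the pinhole projection used in the appendix code is u = f*XYZ(1)/XYZ(3), v = f*XYZ(2)/XYZ(3). The sketch below projects one model point under a guessed pose, reusing the numbers from Exercise 7b; the R = Rz*Ry*Rx arrangement is an illustrative assumption (see rot_syms.m in the appendix for the alternatives).

% Minimal sketch: project one model point into the image under a guessed pose.
f  = 788;                       % focal length
M  = [100; 200; 300];           % model point [X; Y; Z]
ax = 0.1; ay = 0.2; az = 0.3;   % guessed rotation angles (radians)
T  = [1111; 2222; 3333];        % guessed translation
Rz = [cos(az) sin(az) 0; -sin(az) cos(az) 0; 0 0 1];
Ry = [cos(ay) 0 -sin(ay); 0 1 0; sin(ay) 0 cos(ay)];
Rx = [1 0 0; 0 cos(ax) sin(ax); 0 -sin(ax) cos(ax)];
P  = Rz*Ry*Rx*M + T;            % model point in camera coordinates
u  = f*P(1)/P(3)                % guessed horizontal image position
v  = f*P(2)/P(3)                % guessed vertical image position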

Exercise 3

Using the Taylor series.

Exercise 4

(continued)

(continued)

Find the partial derivatives.



Similarly, we have the results for v_i. Exercise 5: complete the proofs on the right-hand side.

Exercise 6

(continued) Important idea: the roles of the terms are
- Measured: the observed image points u, v.
- Guessed: the quantities with hats; when the guessed pose θ is given, the guessed u, v are found from equations 5(a), 5(b).
- To be found: the pose increments Δ(θ1, θ2, θ3, T1, T2, T3).
- Calculated from the guessed (θ1, θ2, θ3, T1, T2, T3): the Jacobian J (2x6 per feature).

Exercise 7
(7a) Referring to the previous notes, write down the answers.
(7b) Rewrite equation 7(b) if the measured u_i = 132, v_i = 215, X_i = 100, Y_i = 200, Z_i = 300, the focal length is 788, and the guessed pose θ_t is [0.1, 0.2, 0.3, 1111, 2222, 3333]^T (angles in radians). After you substitute the above values into equation 7(b), which unknowns are left?

From equation 7(b): at time t there are N features. The formulas apply to one frame at time t, with features i = 1, 2, ..., N. For each time t, θ_t is found; SFM1 will run τ times, and each run is independent.

The Jacobian (J) can also be written as follows (J can be found when the guessed pose and the model M are given). A numerical sketch of evaluating J at a guessed pose is given below.
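Besides the closed-form expressions (derived symbolically by TestJacobian in the appendix), J can be cross-checked numerically by finite differences around the guessed pose. A minimal sketch, assuming a hypothetical helper project(p, M, f) that returns the projected [u; v] of model point M under the 6-element pose vector p = [θ1; θ2; θ3; T1; T2; T3].

function J = numericJacobian(p, M, f)
% Approximate the 2x6 Jacobian of one feature's (u,v) with respect to the pose
% vector p by central differences. project() is a hypothetical helper.
J = zeros(2, 6);
h = 1e-6;                          % perturbation step
for k = 1:6
    dp      = zeros(6, 1);
    dp(k)   = h;
    J(:, k) = (project(p + dp, M, f) - project(p - dp, M, f)) / (2*h);
end
end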

Define the terms for the iterative algorithm.

Recall: the main idea of Newton's method
We saw this formula before: f(x) = f(x0) + f'(x0)*(x - x0)   ... (i)
From f(x) = f(x0) + f'(x0)*(x - x0) ≈ 0:
  0 - f(x0) = f'(x0)*(x - x0)   ... (ii)
  [0 - f(x0)] / f'(x0) = Δx = (x - x0)
We can compute Δx = [0 - f(x0)] / f'(x0); since Δx = (x - x0), x = x0 + Δx, i.e. x_new_guess = x0 (old guess) + Δx.
In our pose estimation algorithm, x becomes the pose θ:
  E = J*Δθ, so J^(-1)*E = Δθ. This is similar to (ii): J^(-1) plays the role of 1/f'(x0), E plays the role of [0 - f(x0)] (E = [u_measured - u_guessed], and similarly for v), and Δθ plays the role of Δx.
  J^(-1)*E = Δθ = θ_(k+1, new guess) - θ_(k, old guess)
Use J^(-1)*E = Δθ to find Δθ; since Δθ = θ_(k+1, new guess) - θ_(k, old guess),
  θ_(k+1, new guess) = θ_(k, old guess) + Δθ (see the next slide; a least-squares sketch of this update follows below).
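Because J is 2N x 6 when all N features of the frame are stacked, the increment is obtained as a least-squares solution rather than by a literal inverse. A minimal sketch of one update step, assuming the stacked J, the residual E and the current guess theta are already available.

function theta = updatePose(theta, J, E)
% One Newton-style pose update: J is 2N x 6, E = [u_measured - u_guessed; ...]
% is 2N x 1, theta is the current 6 x 1 pose guess.
dtheta = J \ E;            % least-squares solution of E = J*dtheta
theta  = theta + dtheta;   % theta_(k+1) = theta_(k) + dtheta
end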

The iterative algorithm (SFM1)
SFM1: this algorithm finds the pose θ_t.
The formulas apply to one frame at time t, with features i = 1, 2, ..., N. For each time t, θ_t is found; SFM1 will run τ times, and each run is independent. In the end, the poses θ_{t=1} = {R, T}_{t=1}, θ_{t=2} = {R, T}_{t=2}, ..., θ_{t=τ} = {R, T}_{t=τ} are found. A minimal sketch of one run of SFM1 is given below.
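A minimal MATLAB sketch of one run of SFM1, under the same assumptions as above; projectAll and stackJacobian are hypothetical helpers that apply the projection equations 5(a)/5(b) and the per-feature 2x6 Jacobian to all N features (their row ordering must match).

function theta = sfm1(model, q_meas, f, theta0)
% Iterative pose estimation for one frame.
% model: 3 x N known model points; q_meas: 2 x N measured image points;
% f: focal length; theta0: initial 6 x 1 pose guess [angles; translation].
theta = theta0;
for iter = 1:100
    q_guess = projectAll(theta, model, f);       % guessed [u; v] for all features
    E       = q_meas(:) - q_guess(:);            % residual, 2N x 1
    J       = stackJacobian(theta, model, f);    % 2N x 6 Jacobian at the guess
    dtheta  = J \ E;                             % least-squares update
    theta   = theta + dtheta;
    if norm(dtheta) < 1e-8, break; end           % stop when the update is small
end
end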

Summary: we studied the iterative algorithm for pose estimation.

Appendix

rot_syms.m: use the MATLAB symbolic processor to show various possible arrangements of the rotation matrix

% rot_syms.m
% show various arrangements of the R rotation matrix, khw
syms an_x an_y an_z real
Rz=[cos(an_z) sin(an_z) 0; -sin(an_z) cos(an_z) 0; 0 0 1];
Ry=[cos(an_y) 0 -sin(an_y); 0 1 0; sin(an_y) 0 cos(an_y)];
Rx=[1 0 0; 0 cos(an_x) sin(an_x); 0 -sin(an_x) cos(an_x)];
Rxyz= Rz*Ry*Rx            %do x first, then y, then z
transpose_Rxyz= (Rz*Ry*Rx)'
inverse_Rxyz= inv(Rz*Ry*Rx)
Rzyx= Rx*Ry*Rz            %do z first, then y, then x
transpose_Rzyx= (Rx*Ry*Rz)'  %do z first, then y, then x
inverse_Rzyx= inv(Rx*Ry*Rz)  %do z first, then y, then x
%ANOTHER SET; IN THIS SET rx=Rx', ry=Ry', rz=Rz'
rz=Rz';
ry=Ry';
rx=Rx';
rxyz= rz*ry*rx            %do x first, then y, then z
transpose_rxyz= (rz*ry*rx)'
inverse_rxyz= inv(rz*ry*rx)
rzyx= rx*ry*rz            %do z first, then y, then x
transpose_rzyx= (rx*ry*rz)'  %do z first, then y, then x
inverse_Rzyx= inv(rx*ry*rz)  %do z first, then y, then x (reuses the name inverse_Rzyx)

Output of rot_syms.m (page 1)

>> rot_syms
Rxyz =
[ cos(an_y)*cos(an_z), cos(an_x)*sin(an_z) + cos(an_z)*sin(an_x)*sin(an_y), sin(an_x)*sin(an_z) - cos(an_x)*cos(an_z)*sin(an_y)]
[ -cos(an_y)*sin(an_z), cos(an_x)*cos(an_z) - sin(an_x)*sin(an_y)*sin(an_z), cos(an_z)*sin(an_x) + cos(an_x)*sin(an_y)*sin(an_z)]
[ sin(an_y), -cos(an_y)*sin(an_x), cos(an_x)*cos(an_y)]

transpose_Rxyz =
[ cos(an_y)*cos(an_z), -cos(an_y)*sin(an_z), sin(an_y)]
[ cos(an_x)*sin(an_z) + cos(an_z)*sin(an_x)*sin(an_y), cos(an_x)*cos(an_z) - sin(an_x)*sin(an_y)*sin(an_z), -cos(an_y)*sin(an_x)]
[ sin(an_x)*sin(an_z) - cos(an_x)*cos(an_z)*sin(an_y), cos(an_z)*sin(an_x) + cos(an_x)*sin(an_y)*sin(an_z), cos(an_x)*cos(an_y)]

inverse_Rxyz =
[ cos(an_y)*cos(an_z), -cos(an_y)*sin(an_z), sin(an_y)]
[ cos(an_x)*sin(an_z) + cos(an_z)*sin(an_x)*sin(an_y), cos(an_x)*cos(an_z) - sin(an_x)*sin(an_y)*sin(an_z), -cos(an_y)*sin(an_x)]
[ sin(an_x)*sin(an_z) - cos(an_x)*cos(an_z)*sin(an_y), cos(an_z)*sin(an_x) + cos(an_x)*sin(an_y)*sin(an_z), cos(an_x)*cos(an_y)]

Rzyx =
[ cos(an_y)*cos(an_z), cos(an_y)*sin(an_z), -sin(an_y)]
[ cos(an_z)*sin(an_x)*sin(an_y) - cos(an_x)*sin(an_z), cos(an_x)*cos(an_z) + sin(an_x)*sin(an_y)*sin(an_z), cos(an_y)*sin(an_x)]
[ sin(an_x)*sin(an_z) + cos(an_x)*cos(an_z)*sin(an_y), cos(an_x)*sin(an_y)*sin(an_z) - cos(an_z)*sin(an_x), cos(an_x)*cos(an_y)]

transpose_Rzyx =
[ cos(an_y)*cos(an_z), cos(an_z)*sin(an_x)*sin(an_y) - cos(an_x)*sin(an_z), sin(an_x)*sin(an_z) + cos(an_x)*cos(an_z)*sin(an_y)]
[ cos(an_y)*sin(an_z), cos(an_x)*cos(an_z) + sin(an_x)*sin(an_y)*sin(an_z), cos(an_x)*sin(an_y)*sin(an_z) - cos(an_z)*sin(an_x)]
[ -sin(an_y), cos(an_y)*sin(an_x), cos(an_x)*cos(an_y)]

inverse_Rzyx =
[ cos(an_y)*cos(an_z), cos(an_z)*sin(an_x)*sin(an_y) - cos(an_x)*sin(an_z), sin(an_x)*sin(an_z) + cos(an_x)*cos(an_z)*sin(an_y)]
[ cos(an_y)*sin(an_z), cos(an_x)*cos(an_z) + sin(an_x)*sin(an_y)*sin(an_z), cos(an_x)*sin(an_y)*sin(an_z) - cos(an_z)*sin(an_x)]
[ -sin(an_y), cos(an_y)*sin(an_x), cos(an_x)*cos(an_y)]

Output of rot_syms.m (page 2)

rxyz =
[ cos(an_y)*cos(an_z), cos(an_z)*sin(an_x)*sin(an_y) - cos(an_x)*sin(an_z), sin(an_x)*sin(an_z) + cos(an_x)*cos(an_z)*sin(an_y)]
[ cos(an_y)*sin(an_z), cos(an_x)*cos(an_z) + sin(an_x)*sin(an_y)*sin(an_z), cos(an_x)*sin(an_y)*sin(an_z) - cos(an_z)*sin(an_x)]
[ -sin(an_y), cos(an_y)*sin(an_x), cos(an_x)*cos(an_y)]

transpose_rxyz =
[ cos(an_y)*cos(an_z), cos(an_y)*sin(an_z), -sin(an_y)]
[ cos(an_z)*sin(an_x)*sin(an_y) - cos(an_x)*sin(an_z), cos(an_x)*cos(an_z) + sin(an_x)*sin(an_y)*sin(an_z), cos(an_y)*sin(an_x)]
[ sin(an_x)*sin(an_z) + cos(an_x)*cos(an_z)*sin(an_y), cos(an_x)*sin(an_y)*sin(an_z) - cos(an_z)*sin(an_x), cos(an_x)*cos(an_y)]

inverse_rxyz =
[ cos(an_y)*cos(an_z), cos(an_y)*sin(an_z), -sin(an_y)]
[ cos(an_z)*sin(an_x)*sin(an_y) - cos(an_x)*sin(an_z), cos(an_x)*cos(an_z) + sin(an_x)*sin(an_y)*sin(an_z), cos(an_y)*sin(an_x)]
[ sin(an_x)*sin(an_z) + cos(an_x)*cos(an_z)*sin(an_y), cos(an_x)*sin(an_y)*sin(an_z) - cos(an_z)*sin(an_x), cos(an_x)*cos(an_y)]

rzyx =
[ cos(an_y)*cos(an_z), -cos(an_y)*sin(an_z), sin(an_y)]
[ cos(an_x)*sin(an_z) + cos(an_z)*sin(an_x)*sin(an_y), cos(an_x)*cos(an_z) - sin(an_x)*sin(an_y)*sin(an_z), -cos(an_y)*sin(an_x)]
[ sin(an_x)*sin(an_z) - cos(an_x)*cos(an_z)*sin(an_y), cos(an_z)*sin(an_x) + cos(an_x)*sin(an_y)*sin(an_z), cos(an_x)*cos(an_y)]

transpose_rzyx =
[ cos(an_y)*cos(an_z), cos(an_x)*sin(an_z) + cos(an_z)*sin(an_x)*sin(an_y), sin(an_x)*sin(an_z) - cos(an_x)*cos(an_z)*sin(an_y)]
[ -cos(an_y)*sin(an_z), cos(an_x)*cos(an_z) - sin(an_x)*sin(an_y)*sin(an_z), cos(an_z)*sin(an_x) + cos(an_x)*sin(an_y)*sin(an_z)]
[ sin(an_y), -cos(an_y)*sin(an_x), cos(an_x)*cos(an_y)]

inverse_Rzyx =
[ cos(an_y)*cos(an_z), cos(an_x)*sin(an_z) + cos(an_z)*sin(an_x)*sin(an_y), sin(an_x)*sin(an_z) - cos(an_x)*cos(an_z)*sin(an_y)]
[ -cos(an_y)*sin(an_z), cos(an_x)*cos(an_z) - sin(an_x)*sin(an_y)*sin(an_z), cos(an_z)*sin(an_x) + cos(an_x)*sin(an_y)*sin(an_z)]
[ sin(an_y), -cos(an_y)*sin(an_x), cos(an_x)*cos(an_y)]

Rotation matrix (see the rot_syms.m output in the appendix for the full expressions). A small numerical check is sketched below.
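Whatever angle arrangement is chosen (see rot_syms.m), the resulting R should be orthonormal with determinant 1. A small numerical check, with arbitrary assumed test angles.

% Build R = Rz*Ry*Rx numerically and verify it is a proper rotation matrix.
ax = 0.1; ay = 0.2; az = 0.3;       % arbitrary test angles (radians)
Rz = [cos(az) sin(az) 0; -sin(az) cos(az) 0; 0 0 1];
Ry = [cos(ay) 0 -sin(ay); 0 1 0; sin(ay) 0 cos(ay)];
Rx = [1 0 0; 0 cos(ax) sin(ax); 0 -sin(ax) cos(ax)];
R  = Rz*Ry*Rx;
disp(norm(R'*R - eye(3)));          % should be close to 0 (orthonormal)
disp(det(R));                       % should be close to 1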

Matlab for pose Jacobian matrix (full)

%******************** feb 2013 *** for extended lowe ******************************
function TestJacobian
% Try to solve the differentiation equations without simplification
clc, clear;
disp('TestJacobian');
syms a b c;   % yaw (around x axis), pitch (around y), roll (around z) respectively
syms f X Y Z T1 T2 T3;
F = [ %u
      f*((cos(b)*cos(c)*X - cos(b)*sin(c)*Y + sin(b)*Z + T1)...
      /((-cos(a)*sin(b)*cos(c) + sin(a)*sin(c))*X...
      + (cos(a)*sin(b)*sin(c) + sin(a)*cos(c))*Y + cos(a)*cos(b)*Z + T3));
      %v (includes the focal length f, consistent with v = f*Y'/Z')
      f*(((sin(a)*sin(b)*cos(c) + cos(a)*sin(c))*X...
      + (-sin(a)*sin(b)*sin(c) + cos(a)*cos(c))*Y - sin(a)*cos(b)*Z + T2)...
      /((-cos(a)*sin(b)*cos(c) + sin(a)*sin(c))*X + (cos(a)*sin(b)*sin(c)...
      + sin(a)*cos(c))*Y + cos(a)*cos(b)*Z + T3))]
V = [a,b,c];
Fjaco = jacobian(F,V);
disp('Fjaco =');
disp(Fjaco);
size(Fjaco)
Fjaco(1,1)
%************************

Matlab for pose Jacobian matrix (approximation)

'===test jacobian for chang,wong ieee_mm 2 pass lowe= for lowe212.m======'
%use twist (small) angle approximation.
clear, clc;
syms R dR M TT XYZ ZZ x y z f u v a1 a2 a3 t1 t2 t3 aa1 aa2 aa3 tt1 tt2 tt3
R=[1 -aa3 aa2; aa3 1 -aa1; -aa2 aa1 1];
dR=[1 -a3 a2; a3 1 -a1; -a2 a1 1];
M=[x;y;z];
TT=[tt1;tt2;tt3];
dt=[t1;t2;t3]
XYZ=dR*R*M+TT+dt; % R is a matrix multiplication transform
u=f*XYZ(1)/XYZ(3);
v=f*XYZ(2)/XYZ(3);
ja=jacobian([u ;v],[a1 a2 a3])

Alternative form: Jacobian for chang,wong ieee_mm 2-pass lowe

'==========test jacobian for chang,wong ieee_mm 2 pass, for lowe212.m===='
clear
% a1=yaw, a2=pitch, a3=roll,
% t1=translation in x, t2=translation in y, t3=translation in z,
syms R dR M TT XYZ ZZ x y z f u v a1 a2 a3 t1 t2 t3 aa1 aa2 aa3 tt1 tt2 tt3
R=[1 -aa3 aa2; aa3 1 -aa1; -aa2 aa1 1];
dR=[1 -a3 a2; a3 1 -a1; -a2 a1 1];
M=[x;y;z];
TT=[tt1;tt2;tt3];
dt=[t1;t2;t3]
% XX=(dR.*R)*M+TT; %not correct, because R is a matrix multiplication transform
XYZ=dR*R*M+TT+dt; %correct, because R is a matrix multiplication transform
% XX=(dR+R)*M+TT; %not correct, because R is not an addition transform
u=f*XYZ(1)/XYZ(3);
v=f*XYZ(2)/XYZ(3);
%diff (u,a3)
%diff (v,a3)
ja=jacobian([u ;v],[a1 a2 a3])
jt=jacobian([u ;v],[t1 t2 t3])

Answer 0: Exercise 0, pose estimation and image correspondence
(a) What are the input and output of a pose estimation algorithm?
Answer (a): Inputs: the model of the object, i.e. (X, Y, Z)_i of all feature points i = 1, ..., n. Output: R_3x3, T_3x1 of the object at time t.
(b) How many images are enough for pose estimation?
Answer (b): One.
(c) Estimate the correspondences and fill in the blanks (see the figure and table of Exercise 0).

Answer 1: Newton's method
An iterative method for finding the solution of a non-linear system.
Exercise 1: find sqrt(5), i.e. solve f(x) = x^2 - 5 = 0. (sqrt(5) = 2.2360679... by calculator.)
- Taylor series (by definition): f(x) = f(x0) + f'(x0)*(x - x0) ≈ 0
- f'(x0) = 2*x0, so f(x) = f(x0) + 2*x0*(x - x0) ≈ 0
Guess x0 = 2.25:
  f(x) = f(x0) + 2*x0*(x - x0) = (x0^2 - 5) + 2*x0*(x - x0) ≈ 0
  (5.0625 - 5) + 2*2.25*(x - 2.25) ≈ 0
  0.0625 + 4.5*(x - 2.25) ≈ 0
  x = ((4.5*2.25) - 0.0625)/4.5 ≈ 2.2361
This is a temporary solution, but it is already good enough: ||previous solution - current solution|| = ||2.25 - 2.2361|| ≈ 0.0139 (small enough). Continue if needed; otherwise the solution is sqrt(5) ≈ 2.2361.

Answer 2 for Exercise 2: image projection.

Answer 3 for Exercise 3.

Answer 4 for Exercise 4.

Answer 5 (left as a student exercise): similarly, we have the results for v_i. Complete the proofs for the right-hand side.

Answer 6 for Exercise 6.

Answer 7a for Exercise 7a: check the previous notes.

Answer 7b

Answer 8 for Exercise 8.