UAV pose estimation using POSIT algorithm
Chayatat Ratanasawanya, Min He
July 21, 2010
Overview
- The goal
- Experimental setup
- POSIT algorithm
- Homogeneous transformation
- Optitrack system
- Result calculation
- Results
- Conclusion
- Questions/comments
Experimental goal
- To determine the pose of the UAV from images taken by the on-board camera.
- The POSIT algorithm is used as part of the process.
- The calculated pose is compared to the reading from the Optitrack system.
Test setup
Test setup
- Move the Q-ball around the test area to 17 different locations; the Q-ball pose is different in each location.
- For each location, take a picture using the on-board camera and record the Optitrack pose reading.
- Process the pictures offline using the POSIT algorithm.
- Calculate the Q-ball pose using homogeneous transformations and inverse kinematics.
- Compare the results to the Optitrack readings.
POSIT algorithm
- Developers: Daniel DeMenthon & Philip David (University of Maryland).
- The algorithm determines the pose of an object relative to the camera from a set of 2D image points.
- Inputs: image coordinates of at least 4 non-coplanar feature points, the 3D world coordinates of the same points, and the camera intrinsic parameters.
- Outputs: the rotation matrix and the translation of the object with respect to the camera.
- Reference: http://www.cfar.umd.edu/~daniel/SoftPOSIT.txt
A minimal code sketch of this input/output contract follows.
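The sketch below illustrates the same inputs and outputs in Python with OpenCV; cv2.solvePnP (EPnP variant) is used here only as a stand-in for POSIT, and all point coordinates and camera parameters are placeholders, not values from this experiment.

```python
# Minimal sketch of the POSIT input/output contract using OpenCV.
# cv2.solvePnP (EPnP) stands in for POSIT; all numbers are placeholders.
import numpy as np
import cv2

# >= 4 non-coplanar feature points of the object, in the object frame [cm]
model_points = np.array([[0.0, 0.0, 0.0],
                         [10.0, 0.0, 0.0],
                         [0.0, 10.0, 0.0],
                         [0.0, 0.0, 10.0]], dtype=np.float64)

# Image coordinates of the same points from the on-board camera [pixels]
image_points = np.array([[320.0, 240.0],
                         [402.0, 236.0],
                         [318.0, 158.0],
                         [330.0, 248.0]], dtype=np.float64)

# Camera intrinsic parameters (placeholder focal length and principal point)
K = np.array([[700.0,   0.0, 320.0],
              [  0.0, 700.0, 240.0],
              [  0.0,   0.0,   1.0]])
dist = np.zeros(4)  # assume no lens distortion

ok, rvec, tvec = cv2.solvePnP(model_points, image_points, K, dist,
                              flags=cv2.SOLVEPNP_EPNP)
R, _ = cv2.Rodrigues(rvec)  # rotation matrix of the object w.r.t. the camera
# tvec is the translation of the object w.r.t. the camera
```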
Homogeneous transformation
A homogeneous transformation is a matrix that describes how one coordinate frame is related to another. It is used to convert the location of a point between two frames.
[Diagram: frame A and frame C, offset by a translation (dx, dy, dz).]
Homogeneous transformation
- Multiplication (chaining frames): CTA = CTB · BTA
- Inverse (reversing the relationship): ATB = (BTA)^-1
[Diagram: frames A, B, and C, with BTA relating frame A to frame B and CTB relating frame B to frame C.]
A short code sketch of both operations follows.
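For concreteness, a minimal numpy sketch of these two operations, using placeholder 4x4 matrices and the frame names from the slides:

```python
# Minimal numpy sketch: composing and inverting homogeneous transformations.
import numpy as np

def make_T(R, d):
    """Build a 4x4 homogeneous transformation from rotation R and translation d."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = d
    return T

B_T_A = make_T(np.eye(3), [1.0, 2.0, 3.0])   # placeholder: frame A expressed in frame B
C_T_B = make_T(np.eye(3), [0.0, -1.0, 0.5])  # placeholder: frame B expressed in frame C

# Multiplication chains frames: CTA = CTB * BTA
C_T_A = C_T_B @ B_T_A

# Inverse reverses the relationship: ATB = (BTA)^-1
A_T_B = np.linalg.inv(B_T_A)

# Converting a point given in frame A into frame C
p_A = np.array([0.1, 0.2, 0.3, 1.0])  # homogeneous coordinates
p_C = C_T_A @ p_A
```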
Forward kinematics
The process of deriving the transformation matrix from a known transformation (rotation and translation) between two frames.
[Diagram: frame A rotated by ψ and θ and translated by (dx, dy, dz) to frame C.]
A short sketch follows.
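A minimal sketch of this step; the Z-Y-X rotation order (yaw ψ, pitch θ, roll φ) is an assumed convention, not taken from the original slide:

```python
# Minimal sketch of forward kinematics: build the 4x4 transformation matrix
# from known rotation angles and a known translation. The Z-Y-X rotation
# order (yaw psi, pitch theta, roll phi) is an assumed convention.
import numpy as np

def forward_kinematics(psi, theta, phi, d):
    cz, sz = np.cos(psi), np.sin(psi)
    cy, sy = np.cos(theta), np.sin(theta)
    cx, sx = np.cos(phi), np.sin(phi)
    Rz = np.array([[cz, -sz, 0.0], [sz, cz, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cy, 0.0, sy], [0.0, 1.0, 0.0], [-sy, 0.0, cy]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cx, -sx], [0.0, sx, cx]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx          # combined rotation
    T[:3, 3] = d                      # translation (dx, dy, dz)
    return T

# Example: yaw 30 deg, pitch 10 deg, no roll, translated by (1.0, 0.5, 2.0)
T = forward_kinematics(np.radians(30), np.radians(10), 0.0, [1.0, 0.5, 2.0])
```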
Inverse kinematics
The process of deriving the transformation (rotation and translation) between two frames from a known transformation matrix.
[Diagram: the transformation matrix gives the translation directly; the inverse kinematics formulas give the rotation angles.]
Inverse kinematics formulas
[Equation slide: formulas for the rotation angles ψ and θ (and roll) in terms of the elements of the rotation part of the transformation matrix; one common convention is sketched below.]
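The original slide's own formulas are not reproduced here; the sketch below shows one standard extraction, assuming the same Z-Y-X (yaw-pitch-roll) convention as the forward-kinematics sketch above:

```python
# Minimal sketch of the inverse-kinematics step: recover translation and
# yaw/pitch/roll from a 4x4 homogeneous transformation (Z-Y-X convention,
# angles in radians).
import numpy as np

def inverse_kinematics(T):
    d = T[:3, 3]                       # translation
    R = T[:3, :3]
    psi = np.arctan2(R[1, 0], R[0, 0])                          # yaw
    theta = np.arctan2(-R[2, 0], np.hypot(R[2, 1], R[2, 2]))    # pitch
    phi = np.arctan2(R[2, 1], R[2, 2])                          # roll
    return d, psi, theta, phi
```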
Optitrack system
- A motion capture system.
- It tracks the movement of IR reflectors attached to an object in the workspace using six IR cameras.
- The origin of the workspace (world) coordinates has to be set during system calibration.
- Point cloud mode: gives the x, y, z coordinates of each individual reflector in a group.
- Trackable mode: gives the pose of an object defined by a group of reflectors.
Q-ball trackable object
Result calculation
[Diagram series: four coordinate frames are involved — the world frame W, the box frame B, the on-board camera frame C, and the Q-ball frame Q. POSIT provides CTB, the pose of the box with respect to the camera; WTB, the pose of the box in the world frame, is known; QTC relates the camera frame to the Q-ball frame and is also known. Combining these transformations gives the Q-ball pose in the world frame, and the inverse kinematics formulas then yield its translation and rotation angles, as sketched below.]
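A numpy sketch of the chain; the composition order below follows one reading of the slide notation (XTY taken as the pose of frame Y expressed in frame X) and is an assumption, with identity matrices standing in for the real transforms:

```python
# Hedged sketch of the result-calculation chain (placeholder transforms):
#   C_T_B : box pose w.r.t. the camera (from POSIT)
#   W_T_B : box pose in the world frame (known)
#   Q_T_C : camera pose w.r.t. the Q-ball frame (known)
import numpy as np

C_T_B = np.eye(4)
W_T_B = np.eye(4)
Q_T_C = np.eye(4)

# Assumed composition: W_T_Q = W_T_B * (C_T_B)^-1 * (Q_T_C)^-1
W_T_Q = W_T_B @ np.linalg.inv(C_T_B) @ np.linalg.inv(Q_T_C)

# Inverse kinematics on W_T_Q then yields the translation and the
# roll/yaw/pitch angles that are compared with the Optitrack reading.
translation = W_T_Q[:3, 3]
```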
Results
Optitrack measurements of the Q-ball pose compared with the Q-ball pose calculated using the POSIT algorithm. Each group lists x (cm), y (cm), z (cm), roll (deg), yaw (deg), pitch (deg).

Exp | Optitrack measurement                            | POSIT calculation
  1 | -55.92, 78.27, -33.51, -6.432, -11.43, -8.805    | -56.9388, 88.6125, -31.7526, -2.1859, -11.8127, -12.1935
  2 | -46.26, 63.91, -32.31, -1, -9.369, -0.6592       | -46.1600, 71.5917, -31.5440, 0.8890, -8.4284, -3.5944
  3 | -66.03, 66.32, 17.57, -10.48, -20.46, 0.028      | -63.2006, 51.1865, 15.7156, -9.7964, -18.8214, 7.8211
  4 | 3.569, 103.7, -40.82, -5.253, 0.423, -11.02      | 18.7217, 109.0871, -47.2341, -6.7481, 5.8695, -14.7075
  5 | 32.92, 103.2, -54.8, -1.326, 20.29, -11.55       | 29.4440, 109.7598, -55.9636, -5.4079, 17.9145, -15.6576
  6 | 61.71, 103.1, -65.01, 12.11, 31.45, -12.75       | 62.8314, 109.1026, -70.7933, 3.4818, 34.5913, -12.1920
  7 | 82.7, 102.2, -70.74, 13.1, 41.74, -13.25         | 85.1300, 104.3310, -78.3997, 5.1808, 47.0439, -6.5138
  8 | 23.1, 103.5, -40.13, 9.587, 16.73, -12.26        | 34.8409, 108.5184, -48.2129, 4.2641, 23.8152, -12.6185
  9 | -53.57, 92.41, -52.75, -5.341, -16.65, -12.19    | -49.9633, 98.3089, -54.4165, -0.7202, -15.7037, -14.3379
 10 | -46.95, 92.62, -72.83, -1.814, -25.09, -13.02    | -45.4508, 86.6950, -71.7959, 3.3547, -23.8781, -10.2785
 11 | -41.74, 92.17, -65.05, 1.685, -10.55, -10.03     | -38.9600, 98.1515, -65.5449, 3.8103, -8.2466, -13.2455
 12 | -62.84, 103.6, -9.212, 1.721, -10.9, -11.78      | -68.0221, 105.1375, -11.2639, 5.5375, -11.9621, -13.3388
 13 | -59.93, 100.9, -2.383, -0.6124, -5.947, 0.5845   | -66.7789, 103.5136, 5.9820, -1.1543, -7.3512, 0.6478
 14 | -54.06, 66.81, -5.23, 6.3, -9.559, 11.55         | -53.8803, 83.6447, -2.1709, 3.7949, -9.1901, 5.5188
 15 | -60.56, 76.76, -18.05, 0.1576, -24.6, 1.029      | -60.5642, 76.0576, -18.4691, -0.3815, -24.2316, 1.3763
 16 | -53.01, 76.78, -26.84, -0.9526, -23.04, 0.681    | -56.0662, 83.8747, -28.5519, 1.6016, -23.9035, -2.3402
 17 | -43.46, 79.25, -40.21, -8.653, -10.15, -9.22     | -46.0110, 87.4031, -41.7292, -4.6202, -12.5088, -11.3917
Results: Translational DOF
Results: Rotational DOF
Results: Error
- Max. error x = 15 cm; max. error y = 17 cm; max. error z = 8 cm.
- Max. error roll = 8.5 deg; max. error yaw = 7 deg; max. error pitch = 8 deg.
Sources of error:
- Optitrack measurement accuracy of 4 cm.
- The imaginary c.g. of the Q-ball trackable object does not correspond exactly to the c.g. of the Q-ball used to define QTC.
Conclusion
- The POSIT algorithm can be used to estimate the pose of the UAV offline.
- A 3D object of known dimensions must be in the scene.
- At least 4 non-coplanar feature points must be visible in the image.
- The position of the object in the world frame must be known (transformation WTB).
Summary
- Experimental goal and setup
- POSIT algorithm
- Results of POSIT & homogeneous transformation
- Optitrack system
- How to calculate the result
- Test results
Thank you