1
3D Sensing. Topics: 3D Shape from X; Perspective Geometry; Camera Model; Camera Calibration; General Stereo; Triangulation; 3D Reconstruction.
2
3D Shape from X: shading, silhouette, texture, and motion (mainly research); stereo and light striping (used in practice).
3
Structured Light. 3D data can also be derived using a single camera and a light source that can produce stripe(s) on the 3D object. [Figure: a light source casts a light stripe across the object, which is viewed by the camera.]
4
Structured Light 3D Computation. With the camera at the origin (0,0,0), focal length f, baseline b between camera and light source, and the stripe plane at angle θ to the x axis, an image point (x´, y´, f) yields the 3D point (x, y, z):

    [x y z] = ( b / (f cot θ − x´) ) [x´ y´ f]
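The triangulation formula above can be sketched as a small function (a minimal sketch; the parameter names and sign conventions are my assumptions, not from the original slides):

```python
import math

def stripe_point(xp, yp, f, b, theta):
    """Triangulate a 3D point from a single light-stripe image point.

    (xp, yp) are image-plane coordinates, f is the focal length,
    b is the camera-to-projector baseline, and theta is the angle
    of the stripe plane.  Implements:
        [x y z] = b / (f*cot(theta) - xp) * [xp yp f]
    """
    scale = b / (f / math.tan(theta) - xp)
    return [scale * xp, scale * yp, scale * f]
```

For example, with f = b = 1 and theta = 45 degrees, the image point (0.5, 0) triangulates to a point at depth z = 2.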
5
Depth from Multiple Light Stripes What are these objects?
6
Our (former) system: 4-camera light-striping stereo, consisting of a projector, a rotation table, four cameras, and the 3D object being scanned.
7
Review: The Camera Model. How do we get an image point IP from a world point P?

    [ s·IPr ]   [ c11 c12 c13 c14 ] [ Px ]
    [ s·IPc ] = [ c21 c22 c23 c24 ] [ Py ]
    [   s   ]   [ c31 c32 c33  1  ] [ Pz ]
                                    [  1 ]

image point = camera matrix C × world point. What’s in C?
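As a quick illustration of applying C (the numbers below are made up for illustration, not from the slides): multiply the homogeneous world point by the 3×4 camera matrix, then divide out the scale factor s:

```python
import numpy as np

# Hypothetical 3x4 camera matrix C (with c34 normalized to 1,
# as on the slide); here a trivial projection for illustration.
C = np.array([[1.0, 0.0, 0.0, 0.0],
              [0.0, 1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 1.0]])

P = np.array([2.0, 4.0, 2.0, 1.0])   # world point, homogeneous coords

s_ip = C @ P                # (s*IPr, s*IPc, s)
ip = s_ip[:2] / s_ip[2]     # divide out s to get (IPr, IPc)
```

Here s = Pz + 1 = 3, so the image point is (2/3, 4/3).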
8
Review: The camera model handles the rigid body transformation from world coordinates to camera coordinates plus the perspective transformation to image coordinates.

1. CP = T R WP
2. FP = π(f) CP

    [ s·FPx ]   [ 1 0  0  0 ] [ CPx ]
    [ s·FPy ] = [ 0 1  0  0 ] [ CPy ]
    [ s·FPz ]   [ 0 0  1  0 ] [ CPz ]
    [   s   ]   [ 0 0 1/f 0 ] [  1  ]

(the perspective transformation takes a 3D point in camera coordinates to an image point). Why is there not a scale factor here?
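Multiplying out the matrix suggests the answer to the slide's closing question: the transformation produces the scale factor itself, since the bottom row gives s = CPz / f. A sketch of the algebra:

```latex
s = \frac{CP_z}{f}
\quad\Longrightarrow\quad
FP_x = \frac{f \cdot CP_x}{CP_z}, \qquad
FP_y = \frac{f \cdot CP_y}{CP_z}, \qquad
FP_z = f
```

which is exactly the familiar perspective projection, with no free scale left over.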
9
Camera Calibration. In order to work in 3D, we need to know the parameters of the particular camera setup. Solving for the camera parameters is called calibration. Intrinsic parameters are properties of the camera device itself; extrinsic parameters describe where the camera sits in the world (the pose of the camera frame C relative to the world frame W).
10
Intrinsic Parameters
- principal point (u0, v0)
- scale factors (dx, dy)
- aspect ratio distortion factor
- focal length f
- lens distortion factor (models radial lens distortion)
11
Extrinsic Parameters
translation parameters t = [tx ty tz]
rotation matrix

    R = [ r11 r12 r13 0 ]
        [ r21 r22 r23 0 ]
        [ r31 r32 r33 0 ]
        [  0   0   0  1 ]

Are there really nine parameters?
12
Calibration Object The idea is to snap images at different depths and get a lot of 2D-3D point correspondences.
13
The Tsai Procedure. The Tsai procedure, developed by Roger Tsai at IBM Research, is among the most widely used calibration methods. Several images are taken of the calibration object, yielding point correspondences at different distances. Tsai’s algorithm requires n > 5 correspondences {((xi, yi, zi), (ui, vi)) | i = 1,…,n} between 3D points and (real) image points.
14
In this* version of Tsai’s algorithm, the real-valued (u,v) are computed from their pixel positions (r,c):

    u = s · dx · (c − u0)        v = −dy · (r − v0)

where
- (u0, v0) is the center of the image
- dx and dy are the center-to-center (real) distances between pixels, which come from the camera’s specs
- s is a scale factor learned from previous trials

* This version is for single-plane calibration.
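This pixel-to-real conversion is a one-liner per coordinate; a sketch with made-up placeholder specs (the numbers below are not any real camera's values):

```python
# Hypothetical camera specs, for illustration only.
u0, v0 = 320.0, 240.0   # image center (u0, v0), in pixels
dx, dy = 0.01, 0.01     # center-to-center pixel spacing, from camera specs
s = 1.0                 # scale factor learned from previous trials (assumed 1)

def pixel_to_real(r, c):
    """Convert a pixel position (row r, column c) to real-valued
    image coordinates (u, v), per the single-plane Tsai setup."""
    u = s * dx * (c - u0)
    v = -dy * (r - v0)
    return u, v
```

With these numbers, the pixel at row 240, column 420 maps to (u, v) = (1.0, 0.0).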
15
Tsai’s Procedure. 1. Given the n point correspondences ((xi, yi, zi), (ui, vi)), compute the matrix A with rows ai:

    ai = (vi·xi, vi·yi, −ui·xi, −ui·yi, vi)

These are known quantities, which will be used to solve for intermediate values, which will in turn be used to solve for the parameters sought.
16
Intermediate Unknowns. 2. The vector of unknowns is μ = (μ1, μ2, μ3, μ4, μ5):

    μ1 = r11/ty   μ2 = r12/ty   μ3 = r21/ty   μ4 = r22/ty   μ5 = tx/ty

where the r’s and t’s are unknown rotation and translation parameters.
3. Let vector b = (u1, u2, …, un) contain the u image coordinates.
4. Solve the system of linear equations A·μ = b for the unknown parameter vector μ.
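Steps 1–4 can be sketched as a single least-squares solve (a minimal sketch assuming the single-plane case, where every z_i = 0 and is therefore unused; the function and variable names are mine):

```python
import numpy as np

def solve_mu(points_3d, points_2d):
    """Build A and b from n >= 5 correspondences ((x,y,z),(u,v)) and
    solve A @ mu = b by least squares for the five intermediate
    unknowns (r11/ty, r12/ty, r21/ty, r22/ty, tx/ty)."""
    rows, b = [], []
    for (x, y, _z), (u, v) in zip(points_3d, points_2d):
        rows.append([v * x, v * y, -u * x, -u * y, v])
        b.append(u)
    mu, *_ = np.linalg.lstsq(np.array(rows), np.array(b), rcond=None)
    return mu
```

Each row encodes the constraint u·(r21·x + r22·y + ty) = v·(r11·x + r12·y + tx) divided through by ty, which is why exactly these five ratios are recoverable from a linear system.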
17
Use μ to solve for ty, tx, and 4 rotation parameters. 5. Let U = μ1² + μ2² + μ3² + μ4². Use U to calculate ty².
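The formula itself appeared only as an image on the original slide; reconstructed from Tsai's paper (from memory, so treat this as a hedged sketch), with U = μ1² + μ2² + μ3² + μ4², the step is:

```latex
t_y^2 = \frac{U - \sqrt{U^2 - 4\,(\mu_1\mu_4 - \mu_2\mu_3)^2}}
             {2\,(\mu_1\mu_4 - \mu_2\mu_3)^2}
\qquad \text{when } \mu_1\mu_4 - \mu_2\mu_3 \neq 0
```

In the degenerate case μ1μ4 − μ2μ3 = 0, ty² = 1/(μi² + μj²), using the two μ's from a row or column of the 2×2 array [μ1 μ2; μ3 μ4] that are not both zero.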
18
6. Try the positive square root ty = (ty²)^(1/2) and use it to compute translation and rotation parameters:

    r11 = μ1·ty   r12 = μ2·ty   r21 = μ3·ty   r22 = μ4·ty   tx = μ5·ty

Now we know 2 translation parameters and 4 rotation parameters... except…
19
Determine true sign of t y and compute remaining rotation parameters. 7. Select an object point P whose image coordinates (u,v) are far from the image center. 8. Use P’s coordinates and the translation and rotation parameters so far to estimate the image point that corresponds to P. If its coordinates have the same signs as (u,v), then keep t y, else negate it. 9. Use the first 4 rotation parameters to calculate the remaining 5.
20
Calculating the remaining 5 rotation parameters:
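The computation on this slide survived only as an image; a reconstruction of the standard step (my wording, hedged): since the first two rows of R are unit vectors, their third entries follow from normality, and the third row is the cross product of the first two:

```latex
r_{13} = \sqrt{1 - r_{11}^2 - r_{12}^2}, \qquad
r_{23} = -\operatorname{sgn}(r_{11}r_{21} + r_{12}r_{22})\,
         \sqrt{1 - r_{21}^2 - r_{22}^2}
```
```latex
(r_{31},\, r_{32},\, r_{33})
  = (r_{11},\, r_{12},\, r_{13}) \times (r_{21},\, r_{22},\, r_{23})
```

The sign of r23 comes from row orthogonality, which requires r11·r21 + r12·r22 + r13·r23 = 0.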
21
Solve another linear system. 10. We have tx and ty and the 9 rotation parameters. The next step is to find tz and f. Form a matrix A´ whose rows are:

    ai´ = (r21·xi + r22·yi + ty, −vi)

and a vector b´ whose rows are:

    bi´ = (r31·xi + r32·yi) · vi

11. Solve A´·w = b´ for w = (f, tz).
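Steps 10–11 can be sketched like so (a sketch for the single-plane case with z_i = 0; I write the second entry of each row as −v_i so that the system matches v = f·Yc/Zc, and the names are mine):

```python
import numpy as np

def solve_f_tz(points_3d, points_2d, R, ty):
    """Build A' and b' from the correspondences and the already-known
    rotation R and translation ty, then solve A' @ (f, tz) = b'
    by least squares.  Each row encodes
        f*(r21*x + r22*y + ty) - v*tz = (r31*x + r32*y)*v,
    which is v = f*Yc/Zc rearranged (single-plane case, z = 0)."""
    rows, b = [], []
    for (x, y, _z), (_u, v) in zip(points_3d, points_2d):
        rows.append([R[1][0] * x + R[1][1] * y + ty, -v])
        b.append((R[2][0] * x + R[2][1] * y) * v)
    (f, tz), *_ = np.linalg.lstsq(np.array(rows), np.array(b), rcond=None)
    return f, tz
```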
22
Almost there. 12. If f is negative, change signs (see text). 13. Compute the lens distortion factor and improve the estimates for f and tz by solving a nonlinear system of equations via nonlinear regression. 14. All parameters have now been computed; use them in 3D data acquisition systems.
23
We use them for general stereo. [Figure: a world point P projects to P1 = (r1, c1) in the first image and P2 = (r2, c2) in the second; e1 and e2 are the epipoles, B the baseline, C a camera center.]
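Once both cameras are calibrated (each reduced to a 3×4 matrix as in the camera-model review), a matched pair of image points can be triangulated back to the world point. This is a sketch using the standard linear (DLT) formulation, not necessarily the exact method of the course text:

```python
import numpy as np

def triangulate(C1, C2, p1, p2):
    """Linear triangulation: given two 3x4 camera matrices C1, C2 and
    matching image points p1 = (r1, c1), p2 = (r2, c2), stack the
    constraints r*(C[2]@P) = C[0]@P and c*(C[2]@P) = C[1]@P and take
    the null vector of the stacked system via SVD."""
    A = []
    for C, (r, c) in ((C1, p1), (C2, p2)):
        A.append(r * C[2] - C[0])
        A.append(c * C[2] - C[1])
    _, _, Vt = np.linalg.svd(np.array(A))
    X = Vt[-1]                 # homogeneous solution
    return X[:3] / X[3]        # back to Euclidean coordinates
```

With noisy correspondences the two viewing rays do not intersect exactly; the SVD gives the algebraic least-squares compromise.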
24
Now we can build 3D models from real objects.
30
[Figure: an image plane at depth d; the image point (u, v, d) back-projects toward a 3D point (x, y, z) that lies OUTSIDE one of many cubes.] 3D space is made up of many cubes.