Jan. 12, 1998. CS 260 Computer Graphics, Winter 1999. Craig M. Wittenbrink. Lecture 3: Camera Calibration and Views.


Slide 1: CS 260 Computer Graphics, Lecture 3: Camera Calibration and Views. Craig M. Wittenbrink.

Slide 2: Overview
- Tracking applications
- Tracking and calibration (videos)
- Camera models
- Tsai camera calibration
- Project 1 discussion
- Conclusions

Slide 3: Tracking Applications
- Virtual reality
- Augmented reality
- Stereo/CrystalEyes displays
- Multiuser virtual workbench
- 3D mouse
- Animation through motion capture
- Gait analysis
- Image-based rendering
(Images copyright Polhemus 1998; Bouguet/CS Caltech 1998; Derby Gait Lab 1998)

Slide 4: Applications for Tracking: Scanning
- Polhemus Handheld Laser Scanner (HLS)
- Scans by time-of-flight light sensing to create a 3D model
- Magnetic tracking of the position and orientation of the wand
(Figures: handheld laser scanner; VRML surface model. Copyright Polhemus 1998)

Slide 5: Applications for Tracking: Camera Calibration
- Jean-Yves Bouguet, "3D Photography with Weak Structured Lighting," presentation given at Intel, Santa Clara, Dec. 5
- Uses corners of squares to solve for camera intrinsic and extrinsic parameters
- Camera distortion can then be removed from images
(copyright Bouguet/CS Caltech 1998)

Slide 6: Applications for Tracking: Gait Analysis
- Motion capture is used for medical analysis, sports training, and natural-looking animation
- Derby Gait Analysis Laboratory uses the 3D Elite System from BTS, Milan
- Infrared tracking of highly reflective markers, with multiple cameras
- Used to diagnose and plan treatment
(copyright Derby Gait Lab 1998)

Slide 7: Applications for Tracking: Virtual Studio
- InterSense inertial tracker uses optical gyros and accelerometers
- Similar to the technology in flight guidance control systems
- Coupled with magnetic, ultrasonic, or infrared tracking (IS-900CT)
- Kalman filtering fuses the sensor inputs
(copyright InterSense 1998)

Slide 8: InterSense IS-900CT System
- Solid-state inertial measurement: angular rate of rotation and linear acceleration in 3 axes
- Drift removed by ultrasonic time-of-flight distance measurements
- Infrared triggers the beacons
(copyright InterSense 1998)

Slide 9: Tracking Videos

Slide 10: Camera Models
- Watt and Watt
- OpenGL
- Tsai

Slide 11: Camera Models
- Watt and Watt, pages 7-8 (Figure 1.4)
- Center of projection, C
- View direction, N
- View up vector, V
- Positive x axis, U
(Figure: world coordinates and camera coordinates defined by C, N, V, U)

Slide 12: OpenGL Camera Model
- Default camera placement in world coordinates: at the origin, pointing down the negative z axis
- Specified through the modelview, projection, and viewport matrices
(Figure: world coordinates with the camera view direction)

Slide 13: Tsai Camera Model
- Distortion due to a non-pinhole camera
- Fig. 1, page 326, from the paper: Roger Y. Tsai, "A Versatile Camera Calibration ...", IEEE Journal of Robotics and Automation, Vol. RA-3, No. 4, Aug. 1987
(Figure: image plane, world coordinates, camera coordinates, effective focal length)

Slide 14: Tsai Camera Model
- The Tsai camera looks up the positive z axis
- OpenGL looks down the negative z axis (or is left-handed)
(Figure: OpenGL camera model versus Tsai camera model)

Slide 15: Camera Calibration
Two good sources:
- Tsai, R.Y., "A Versatile Camera Calibration Technique for High-Accuracy 3D Machine Vision Metrology Using Off-the-Shelf TV Cameras and Lenses," IEEE Journal of Robotics and Automation, Vol. RA-3, No. 4, 1987. Code available.
- Heikkila, Janne and Olli Silven, "A Four-step Camera Calibration Procedure with Implicit Image Correction," Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'97), San Juan, Puerto Rico, June 1997. Matlab toolbox available.

Slide 16: Camera Calibration (cont.)
- Basic idea: use known 3D world-coordinate points to solve for the camera's transformation and distortion
- Solve for two types of parameters:
  - Intrinsic parameters: internal camera optical characteristics
  - Extrinsic parameters: 3D position and orientation of the camera

Slide 17: Camera Calibration: Motivation
- Infer 3D location from 2D images
- Detect features such as laser spots, corners of equipment, structured light
- Determine the ray in 3D space on which the object point must lie
- Applicable to many applications (as seen previously)
- Tsai's approach is (1) autonomous, (2) accurate, (3) reasonably efficient

Slide 18: Tsai's Review of Techniques
1. Full-scale nonlinear optimization
   - Advantage: accuracy comparable to Tsai's
   - Disadvantage: not automatic, compute intensive
2. Computing the perspective transformation matrix first, using linear equation solving
   - Advantage: linear optimization
   - Disadvantage: lens distortion not included, error prone
3. Two-plane method
   - Advantage: linear optimization
   - Disadvantage: more unknowns
4. Geometric technique
   - Advantage: linear optimization
   - Disadvantage: the focal length must be provided

Slide 19: Tsai Experimental Setup
- Use a CCD camera
- Acquire a grayscale image and threshold it
- Link edge points to extract boundary edges
- Compute across edges to get the true edge; fit straight lines and solve for corners
- Letraset sheet positioned on a steel block
- Find corners

Slide 20: Tsai Transformations
Four transformations, all encapsulated in the following parameters:
- Rotation: Rx, Ry, Rz
- Translation: Tx, Ty, Tz
- Focal length: f
- Image center: Cx, Cy
- Scale factor: sx
- Radial distortion: kappa1
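The eleven parameters above can be grouped into a small record, split into the extrinsic and intrinsic sets described on slide 16. This is only a sketch; the field names follow the slide, not any particular calibration codebase.

```python
from dataclasses import dataclass

@dataclass
class TsaiParams:
    # Extrinsic parameters: pose of the world frame relative to the camera.
    Rx: float  # rotation angles (radians)
    Ry: float
    Rz: float
    Tx: float  # translation
    Ty: float
    Tz: float
    # Intrinsic parameters: internal camera optical characteristics.
    f: float       # effective focal length
    Cx: float      # image center (pixels)
    Cy: float
    sx: float      # horizontal scale factor
    kappa1: float  # first radial distortion coefficient
```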

Slide 21: Tsai's Approach, Camera Model
Four steps in the transformation (Fig. 2 from the paper):
3D world coordinates -> (rigid body transformation) -> 3D camera coordinates -> (projection matrix) -> ideal undistorted image coordinates -> (radial lens distortion) -> distorted image coordinates -> (viewport transform) -> computer image coordinates

Slide 22: Tsai's Approach, Camera Model (cont.)
Four steps in the transformation. (Figure: the world-to-camera coordinate equations, with world and camera coordinate frames.)

Slide 23: Step 1: Rigid Body Transformation
- 3D world coordinates to the 3D camera coordinate system
- A rotation followed by a translation
- Similar to the OpenGL model transformation
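Step 1 can be sketched in NumPy. The rotation is built from the three angles Rx, Ry, Rz; the slide does not fix the axis order, so the x-then-y-then-z composition here is an assumption.

```python
import numpy as np

def rotation_matrix(rx, ry, rz):
    """Rotation about x, then y, then z (one common convention; the
    slides do not fix the order, so this is an assumption)."""
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def world_to_camera(p_world, R, T):
    # Rigid body transformation: rotation followed by translation.
    return R @ p_world + T
```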

Slide 24: Step 2: Perspective Projection
- 3D camera coordinates to ideal undistorted image coordinates
- Involves the effective focal length f and perspective foreshortening
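Step 2 is the pinhole projection by similar triangles through the center of projection:

```python
def project(p_cam, f):
    """Camera coordinates (x, y, z) to ideal undistorted image
    coordinates (Xu, Yu) with effective focal length f:
    Xu = f*x/z, Yu = f*y/z (perspective foreshortening by 1/z)."""
    x, y, z = p_cam
    return f * x / z, f * y / z
```

Doubling the depth z halves the projected coordinates, which is the foreshortening the slide refers to.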

Slide 25: Step 3: Radial Distortion
- (Xd, Yd) is the distorted, or true, image coordinate on the image plane
- The distortion is calculated by Xu = Xd (1 + kappa1 r^2), Yu = Yd (1 + kappa1 r^2), with r^2 = Xd^2 + Yd^2
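With Tsai's single-coefficient model, going from distorted to undistorted coordinates is closed-form, while the reverse direction has no closed form and needs iteration. A sketch, assuming kappa1 is small enough for the fixed-point iteration to converge:

```python
def undistort(Xd, Yd, kappa1):
    # Distorted -> ideal undistorted image coordinates (closed form).
    r2 = Xd * Xd + Yd * Yd
    s = 1.0 + kappa1 * r2
    return Xd * s, Yd * s

def distort(Xu, Yu, kappa1, iters=20):
    # Ideal -> distorted: no closed form, so use fixed-point iteration
    # starting from the undistorted point (assumes mild distortion).
    Xd, Yd = Xu, Yu
    for _ in range(iters):
        r2 = Xd * Xd + Yd * Yd
        Xd = Xu / (1.0 + kappa1 * r2)
        Yd = Yu / (1.0 + kappa1 * r2)
    return Xd, Yd
```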

Slide 26: Step 4: Viewport Calculation
- Calculate the actual discrete pixel address location within the image, with the given centering (Cx, Cy)
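Step 4 maps distorted image-plane coordinates to pixel addresses using the image center and scale factor from slide 20. The pixel pitches dx and dy are assumptions here (the slide names only the centering); with coordinates already in pixel units they are 1.0.

```python
def to_pixels(Xd, Yd, Cx, Cy, sx, dx=1.0, dy=1.0):
    """Distorted image coordinates -> computer image (pixel) address.
    sx is the horizontal scale factor; dx, dy are effective pixel
    pitches (assumed parameters, 1.0 if already in pixel units)."""
    Xf = sx * Xd / dx + Cx
    Yf = Yd / dy + Cy
    return Xf, Yf
```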

Slide 27: Tsai Transformation Operations
World-space coordinates to computer image-space coordinates:
world coordinates -> (Step 1: rigid body) -> camera coordinates -> (Step 2: projection) -> undistorted image coordinates -> (Step 3: distortion) -> distorted image coordinates

Slide 28: Tsai's Model versus OpenGL
Tsai (Fig. 2 from the paper):
3D world coordinates -> (rigid body transformation) -> 3D camera coordinates -> (projection matrix) -> ideal undistorted image coordinates -> (radial lens distortion) -> distorted image coordinates -> (viewport transform) -> computer image coordinates
OpenGL:
object coordinates -> (Model-View matrix) -> eye coordinates -> (projection matrix) -> clip coordinates -> (perspective division) -> normalized device coordinates -> (viewport transformation) -> window coordinates

Slide 29: OpenGL Camera Model
- glTranslatef(0.0, 0.0, -5.0)
- Can be seen as either the camera moving or the object moving; it only modifies the Model-View matrix
(Figure: object placed at z = -5.0)
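The effect of glTranslatef(0.0, 0.0, -5.0) on the Model-View matrix can be sketched in NumPy. Whether you read the result as the object moving to z = -5 or the camera backing away by 5 units is purely an interpretation, since only the one matrix changes:

```python
import numpy as np

def translate(tx, ty, tz):
    # 4x4 translation matrix, column-vector convention as in the
    # OpenGL fixed-function pipeline.
    M = np.eye(4)
    M[:3, 3] = [tx, ty, tz]
    return M

# Start from an identity Model-View matrix, then glTranslatef(0, 0, -5).
modelview = np.eye(4) @ translate(0.0, 0.0, -5.0)

# The object-space origin now sits at z = -5 in eye coordinates.
origin_eye = modelview @ np.array([0.0, 0.0, 0.0, 1.0])
```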

Slide 30: OpenGL Transformation Operations
Normalized device coordinates from object coordinates:
object coordinates -> (Model-View matrix) -> eye coordinates -> (projection matrix) -> clip coordinates -> (perspective division) -> normalized device coordinates

Slide 31: OpenGL Transformation Operations (cont.)
- Calculation of window coordinates from normalized device coordinates
- x and y are scaled and offset by the window center, width, and height
- The depth factor and offset are computed from zNear and zFar

Slide 32: OpenGL Transformation Operations (cont.)
- The viewport is set by glViewport(GLint x, GLint y, GLsizei w, GLsizei h)
- The z depth factor and offset (zNear, zFar) are set by glDepthRange(GLclampd n, GLclampd f)
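The mapping those two calls configure can be written out directly; this follows the standard OpenGL viewport and depth-range equations:

```python
def ndc_to_window(xd, yd, zd, x, y, w, h, zNear=0.0, zFar=1.0):
    """Map normalized device coordinates in [-1, 1] to window
    coordinates, per glViewport(x, y, w, h) and
    glDepthRange(zNear, zFar)."""
    xw = (xd + 1.0) * (w / 2.0) + x
    yw = (yd + 1.0) * (h / 2.0) + y
    zw = ((zFar - zNear) / 2.0) * zd + (zFar + zNear) / 2.0
    return xw, yw, zw
```

For example, the NDC origin maps to the center of a 640x480 viewport with depth 0.5 under the default depth range.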

Slide 33: How the Calibration Code Works
- Approach: use radial alignment as a constraint
- The constraint is a function of only the relative rotation and translation between the camera and the calibration points
(Figure: world and camera coordinates; Oi, Pd, and Pu lie on the same line)
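The radial alignment constraint rests on the fact that radial distortion moves a point only along the ray from the image center, so the distorted point Pd and the undistorted point Pu are collinear with the center. A quick check of that property under the Step 3 distortion model:

```python
def radially_aligned(Xd, Yd, Xu, Yu, tol=1e-9):
    # The 2D cross product is zero iff (Xd, Yd) and (Xu, Yu) are
    # collinear with the origin (taken here as the image center).
    return abs(Xd * Yu - Yd * Xu) < tol

# Under Xu = Xd*(1 + kappa1*r^2), alignment holds for any kappa1,
# which is why the constraint is independent of the distortion.
Xd, Yd, kappa1 = 0.3, 0.4, 0.25
s = 1.0 + kappa1 * (Xd * Xd + Yd * Yd)
assert radially_aligned(Xd, Yd, Xd * s, Yd * s)
```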

Slide 34: Project 1: Set the Tsai View in OpenGL
- Description of files
- Data set description
- No glyphs shown, because of occlusion
- G01bsmall.gif/ppm: output from my solution

Slide 35: Project 1: File Definitions
(Figure: data-flow diagram relating the 35 mm camera, image processing, G01.ppm, G01.corr, Tsai calibration, G01.cpcc, the Project 1 view, the output image, and an error comparison (cmp). Pink blocks show what is to be implemented.)

Slide 36: Project 1: File Definitions (cont.)
- G01.cor: correspondence file; xw, yw, zw, Xu, Yu for all visible glyph locations. The images have been warped to remove radial distortion.
- G01.cpcc: view parameters from Tsai: Rx, Ry, Rz, Tx, Ty, Tz, f, Cx, Cy, sx, kappa1
- Code is provided to read *.cor and *.cpcc
- Computed errors are on the order of a pixel (~1)
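A sketch of the error check the slide describes. The reader here assumes one whitespace-separated "xw yw zw Xu Yu" record per line, which may not match the provided .cor reader exactly, and the error metric is a plain mean Euclidean pixel distance:

```python
def read_cor(path):
    """Parse a correspondence file with one 'xw yw zw Xu Yu' record
    per line (assumed layout; real reader code comes with the project)."""
    points = []
    with open(path) as fh:
        for line in fh:
            fields = line.split()
            if len(fields) == 5:
                xw, yw, zw, Xu, Yu = map(float, fields)
                points.append(((xw, yw, zw), (Xu, Yu)))
    return points

def mean_pixel_error(predicted, measured):
    # Mean Euclidean distance, in pixels, between projected and
    # measured glyph locations; a good calibration gives ~1 pixel.
    dists = [((px - mx) ** 2 + (py - my) ** 2) ** 0.5
             for (px, py), (mx, my) in zip(predicted, measured)]
    return sum(dists) / len(dists)
```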

Slide 37: Project 1 (cont.)
- 4 file sets are given, at different viewpoints
- The original images are very large (1536x1024, subsampled 4x in each dimension from Kodak Photo CD resolution)
- A default view is provided with gluLookAt; you could compute errors with that, and they will be very large
- Read the README; run the demo scripts RUN and RUN2

Slide 38: Project 1 (cont.)
- Actual world-coordinate description: planes are drawn in the provided code at the locations of the poster, and the glyphs are drawn too
(Figure: world coordinates with corner points (x=587, y=588, z=-824), (x=587, y=588, z=0), (x=-825, y=-706, z=0); delta x = 1412, delta y = 1294, delta z = 824)

Slide 39: Conclusions
- Tracking has wide application in 3D graphics
- Camera calibration is important for your first project, and specifying the view is essential in 3D graphics as well
- Tsai's parameters were described
- Project 1 was reviewed
- Reference: Tsai, R.Y., "A Versatile Camera Calibration Technique for High-Accuracy 3D Machine Vision Metrology Using Off-the-Shelf TV Cameras and Lenses," IEEE Journal of Robotics and Automation, Vol. RA-3, No. 4, 1987
- Next time: assigned reading, Szeliski's slides on image-based rendering