Simple Calibration of Non-overlapping Cameras with a Mirror


Simple Calibration of Non-overlapping Cameras with a Mirror
Ram Krishan Kumar1, Adrian Ilie1, Jan-Michael Frahm1, Marc Pollefeys1,2
1Department of Computer Science, UNC Chapel Hill, USA; 2ETH Zurich, Switzerland
CVPR, Alaska, June 2008

Motivation Courtesy: Microsoft Research

Motivation Surveillance: Camera 1 and Camera 2 are non-overlapping cameras pointing in different directions. Since we want to cover the maximum area, the cameras generally have minimal overlap in their fields of view (FOVs).

Motivation 3D reconstruction: UrbanScape cameras: cameras with minimal overlap

Motivation Panorama stitching: the cameras have minimal overlap in their views; to stitch the panoramas, we need to know the calibration of each camera. Courtesy: www.ptgrey.com

Motivation (Only 4 of 6 images shown here) Courtesy: Microsoft Research

Previous Work Single camera calibration: fixed 3D geometry, Tsai (1987); plane-based approach, Zhang (2000). Multiple images of a checkerboard pattern, assumed to lie at Z = 0, are observed.

Previous Work Single camera calibration: fixed 3D geometry, Tsai (1987); plane-based approach, Zhang (2000). Yields both internal and external camera parameters.

Previous Work Multi-camera environment: calibration board with a 3D laser pointer, Kitahara et al. (2001); all cameras observe a common dominant plane (e.g. the ground) and track objects moving in this plane, Lee et al. (2000); automatic calibration yielding complete camera projections using only a laser pointer, Svoboda et al. (2005); camera network calibration from dynamic silhouettes, Sinha et al. (2004). All of these methods rely on an overlap in the fields of view (FOVs) of the cameras and cannot be reliably used when there is no overlap.

Previous Work Pose computation of an object without a direct view, Sturm et al. (2006): relies on computing the mirror plane.

Proposed Approach (Figure: two mirror positions and the calibration pattern.)

Using a Planar Mirror A real camera observing the mirrored point X' is equivalent to a mirrored camera observing the real point X itself. (Figure: real camera pose C images x'; mirrored camera pose C' images x; the mirroring flips the coordinate frame from right-handed to left-handed.)
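This mirror equivalence can be sketched numerically. The following is only an illustration of the geometry (the method itself never needs the mirror parameters): assuming a mirror plane {X : n.X = d} with unit normal n, the Householder reflection D = I - 2 n n^T maps the real pose (R, C) to the mirrored pose (D R, D C + 2 d n), and the real camera viewing the reflected point X' gets exactly the same camera coordinates as the mirrored camera viewing X.

```python
import numpy as np

def mirrored_pose(R, C, n, d):
    """Pose of the virtual camera behind a planar mirror {X : n.X = d}, |n| = 1.

    D = I - 2 n n^T is the Householder reflection. The mirrored camera has
    rotation columns r_k' = D r_k and center C' = D C + 2 d n; note that
    det(D R) = -1, i.e. the frame flips from right- to left-handed (RHS to LHS).
    """
    D = np.eye(3) - 2.0 * np.outer(n, n)
    return D @ R, D @ C + 2.0 * d * n

# Mirror is the plane z = 2; real camera sits at z = -1 looking along +z.
n, d = np.array([0.0, 0.0, 1.0]), 2.0
R, C = np.eye(3), np.array([0.0, 0.0, -1.0])

X = np.array([0.5, 0.3, 0.8])                               # real point X
X_m = (np.eye(3) - 2.0 * np.outer(n, n)) @ X + 2.0 * d * n  # its reflection X'
R_m, C_m = mirrored_pose(R, C, n, d)

# Real camera observing X' == mirrored camera observing X (same camera coords).
p_real = R.T @ (X_m - C)
p_mirror = R_m.T @ (X - C_m)
```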

Proposed Approach (Figure: real camera pose C images point X on the calibration pattern at x' via the mirror; the mirrored camera pose C' images X directly at x.) Moving the mirror to different positions yields a family of mirrored camera poses, each observing the calibration pattern.

Proposed Approach This reduces to a standard calibration problem for the family of mirrored camera poses: use any standard technique that gives the extrinsic camera parameters in addition to the internal camera parameters.

Recovering Internal Parameters A two-stage process. STAGE 1: Internal calibration. The image point in the mirrored view coincides with the real image point (x = x'), so the intrinsic parameters and radial distortion of the mirrored camera are the same as those of the real camera.

Proposed Approach A two-stage process. STAGE 2: External camera calibration. (Figure: real camera pose with center C and rotation columns r1, r2, r3; mirrored camera pose with center C' and columns r1', r2', r3'; the vector C - C' crosses the mirror.)

Recovery of External Parameters Each mirror position gives 3 non-linear constraints between the real pose (C; r1, r2, r3) and the mirrored pose (C'; r1', r2', r3'):

<r_k + r_k', C' - C> = 0 for k = 1, 2, 3

or equivalently (C' - C)^T (r_k' + r_k) = 0, which expands to

C'^T r_k' + C'^T r_k - C^T r_k' - C^T r_k = 0 for k = 1, 2, 3.

The mirrored pose (C'; r_k') is known from calibrating the mirrored views; the product C^T r_k of unknowns makes the constraints non-linear.

Recovery of External Parameters Each mirror position generates 3 non-linear constraints. Unknowns: r1, r2, r3, C (12). Equations: 3 constraints per mirror position + 6 orthonormality constraints of the rotation matrix.

Recovery of External Parameters Linearize C'^T r_k' + C'^T r_k - C^T r_k' - C^T r_k = 0 (k = 1, 2, 3) by introducing the variables s_k = C^T r_k. Number of unknowns: 12 + 3 (s1, s2, s3) = 15. Each mirror position contributes 3 linear equations, so at least 5 images are needed to solve for the camera center and rotation matrix linearly.
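The linear step can be sketched as follows. This is a synthetic reconstruction under my own naming, not the authors' code: generate a ground-truth pose, simulate several known mirrored poses via exact reflections, stack the 3 equations per mirror position in the 15 unknowns (r1, r2, r3, C, s1, s2, s3), and solve by least squares.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ground-truth real camera: rotation R = [r1 r2 r3] and center C.
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
if np.linalg.det(Q) < 0:
    Q[:, 0] *= -1.0
R_true, C_true = Q, rng.normal(size=3)

# Mirrored poses for several mirror positions (in practice these come from
# calibrating each mirrored view of the pattern; here they are simulated).
poses = []
for _ in range(7):
    n = rng.normal(size=3)
    n /= np.linalg.norm(n)
    d = rng.uniform(1.0, 3.0)
    D = np.eye(3) - 2.0 * np.outer(n, n)
    poses.append((D @ R_true, D @ C_true + 2.0 * d * n))   # (R'_j, C'_j)

# One row per (mirror j, axis k):
#   C'_j . r_k  -  r'_{k,j} . C  -  s_k  =  -C'_j . r'_{k,j}
# Unknown vector x = [r1, r2, r3, C, s1, s2, s3]  (15 entries).
A = np.zeros((3 * len(poses), 15))
b = np.zeros(3 * len(poses))
for j, (Rm, Cm) in enumerate(poses):
    for k in range(3):
        row = 3 * j + k
        A[row, 3 * k:3 * k + 3] = Cm       # coefficients of r_k
        A[row, 9:12] = -Rm[:, k]           # coefficients of C
        A[row, 12 + k] = -1.0              # coefficient of s_k
        b[row] = -Cm @ Rm[:, k]

x, *_ = np.linalg.lstsq(A, b, rcond=None)
R_est = np.column_stack([x[0:3], x[3:6], x[6:9]])
C_est = x[9:12]
```

With noise-free data the least-squares solution recovers the true center and rotation; with real measurements it serves as the initial estimate that is then refined.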

Recovery of External Parameters Once the external camera parameters are obtained, we apply bundle adjustment to minimize the reprojection error, enforcing that r1, r2, r3 constitute a valid rotation matrix R = [r1 r2 r3].
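One standard way to enforce a valid rotation (the slide does not specify the exact mechanism, so this is an assumed choice) is to project the linear estimate to the nearest rotation matrix in the Frobenius norm via SVD before running bundle adjustment:

```python
import numpy as np

def nearest_rotation(M):
    """Nearest rotation to a 3x3 matrix in Frobenius norm (via SVD)."""
    U, _, Vt = np.linalg.svd(M)
    if np.linalg.det(U @ Vt) < 0:   # guard against returning a reflection
        U[:, -1] *= -1.0
    return U @ Vt

# Example: a rotation about z perturbed by noise is snapped back onto SO(3).
theta = 0.3
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0, 0.0, 1.0]])
R_fixed = nearest_rotation(Rz + 0.01 * np.ones((3, 3)))
```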

Experiments Five randomly generated mirror positions which enable the camera to view the calibration pattern. Plot: error in the recovered camera center vs. noise level in pixels.

Experiments Five randomly generated mirror positions which enable the camera to view the calibration pattern. Plot: error in the recovered rotation vs. noise level in pixels.

Evaluation on Real Data Experimental setup with a checkerboard pattern kept on the ground; Ladybug cameras.

Evaluation on Real Data Camera 1

Evaluation on Real Data Camera 2

Evaluation on Real Data Camera 3

Evaluation on Real Data Camera 4

Evaluation on Real Data Camera 5

Evaluation on Real Data Camera 6

Evaluation on Real Data Top View: Initial estimate of the recovered camera poses

Evaluation on Real Data Top view: recovered camera poses after bundle adjustment.

Evaluation on Real Data Result: 37.3 cm, 35.1 cm, 37.6 cm, 36.2 cm, 34.7 cm (recovered); actual radius: 37.5 cm.

Summary Using a plane mirror to calibrate a network of cameras. The cameras need not see the calibration object directly. Knowledge about the mirror parameters is not required!

Practical Considerations The calibration object must be sufficiently big so that it occupies a significant portion of the image. Any other calibration object, and any other calibration technique that gives both intrinsic and extrinsic parameters, can be used.

Acknowledgements We gratefully acknowledge the partial support of the IARPA VACE program, an NSF CAREER award (IIS 0237533), and a Packard Fellowship for Science and Technology. Software at: http://www.cs.unc.edu/~ramkris/MirrorCameraCalib.html

Questions

Take Away Ideas