Surveillance and Security

Project: an industrial site monitored by two cameras.

Camera Calibration Thomas Flamant

Goal: camera calibration yields the camera matrix, which is then used for reconstruction and delivered to WP3 as get_mapped.m.

Camera calibration (camera resectioning) is the process of finding the true parameters of the camera that produced a given photograph or video. Two building blocks are needed: the pinhole camera model and the camera matrix, which represents the camera parameters.

Pinhole camera model: describes the mathematical relationship between the coordinates of a 3D point and its projection onto the image plane of an ideal pinhole camera.

Camera matrix: describes the mapping of a pinhole camera from 3D points in the world to 2D points in an image.

$$\begin{pmatrix} x_{pixel} \\ y_{pixel} \\ 1 \end{pmatrix} = \underbrace{???}_{\text{intrinsic parameters}} \cdot \underbrace{???}_{\text{extrinsic parameters}} \cdot \begin{pmatrix} x_{world} \\ y_{world} \\ z_{world} \\ 1 \end{pmatrix}$$

Intrinsic parameters (intrinsic calibration): used to convert between the coordinates of the image plane and the actual pixel coordinates. They depend on the camera hardware and consist of:
- focal length $f_x$, $f_y$
- camera centre $c_x$, $c_y$
- lens distortion (barrel, pincushion, mustache)

$$A = \begin{pmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{pmatrix}$$

Extrinsic parameters (extrinsic calibration): used to describe the relative position and orientation of the camera with respect to a world coordinate system. They do not depend on the camera hardware and consist of:
- a 3x3 rotation matrix $R$
- a 3x1 translation vector $t$

$$R = \begin{pmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \\ r_{31} & r_{32} & r_{33} \end{pmatrix}, \qquad t = \begin{pmatrix} t_x \\ t_y \\ t_z \end{pmatrix}$$

Camera matrix: the mapping is the product of the intrinsic parameter matrix $A$ and the extrinsic parameter matrix $[R \mid t]$.

$$\begin{pmatrix} x_{pixel} \\ y_{pixel} \\ 1 \end{pmatrix} = A \cdot [R \mid t] \cdot \begin{pmatrix} x_{world} \\ y_{world} \\ z_{world} \\ 1 \end{pmatrix}$$

Written out in full:

$$\begin{pmatrix} x_{pixel} \\ y_{pixel} \\ 1 \end{pmatrix} = \underbrace{\begin{pmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{pmatrix}}_{\text{intrinsic parameters}} \cdot \underbrace{\begin{pmatrix} r_{11} & r_{12} & r_{13} & t_1 \\ r_{21} & r_{22} & r_{23} & t_2 \\ r_{31} & r_{32} & r_{33} & t_3 \end{pmatrix}}_{\text{extrinsic parameters}} \cdot \begin{pmatrix} x_{world} \\ y_{world} \\ z_{world} \\ 1 \end{pmatrix}$$
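As a quick illustration, a minimal MATLAB sketch of this projection; all numbers below are placeholders, not the calibration results of this project, and the homogeneous result is normalised by its third component to obtain pixel coordinates:

```matlab
% Minimal sketch: project a 3D world point to pixel coordinates
% with an intrinsic matrix A and extrinsic parameters [R | t].
% All values are illustrative placeholders.
A = [1500    0  640;   % [fx 0 cx; 0 fy cy; 0 0 1]
        0 1500  360;
        0    0    1];
R = eye(3);            % placeholder rotation
t = [0; 0; 5000];      % placeholder translation

X = [100; 200; 0; 1];  % homogeneous world point [x; y; z; 1]
p = A * [R t] * X;     % 3x1 homogeneous image point
x_pixel = p(1) / p(3); % normalise by the third component
y_pixel = p(2) / p(3);
```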

In this project, the intrinsic parameter matrix is estimated with the Camera Calibration Toolbox for Matlab (Bouguet) and the extrinsic parameters with POSIT.

Camera Calibration Toolbox for Matlab (Bouguet)
Input: 20+ images of a planar checkerboard.
Output: intrinsic parameters.
Corner extraction can be done automatically with the RADOCC Camera Calibration Toolbox (CornerFinder.m).

Input used here: 30 images of each camera, with checkerboard squares of 28 mm (example calibration images of Camera 1 and Camera 2 shown on the slides).

Output: intrinsic parameters of each camera.

$$A_{cam1} = \begin{pmatrix} 1458.4 & 0 & 686.5 \\ 0 & 1456.9 & 350.1 \\ 0 & 0 & 1 \end{pmatrix}, \qquad A_{cam2} = \begin{pmatrix} 2067.7 & 0 & 678.8 \\ 0 & 2068.2 & 340.5 \\ 0 & 0 & 1 \end{pmatrix}$$

POSIT is a fast iterative algorithm for finding the pose (rotation and translation) of an object or scene with respect to a camera. It requires points of the object given in some object coordinate system, visible and recognizable in the camera image, so that corresponding image points and object points can be listed in the same order.

[rotation, translation] = Posit(imagePoints, objectPoints, focalLength, center)

POSIT input:
- nbPts: 4+ non-coplanar feature points of the object
- imagePoints: matrix of size nbPts x 2
- objectPoints: matrix of size nbPts x 3
- focalLength: focal length of the camera in pixels
- center: row vector with the coordinates of the image centre
(The slide shows the selected feature points in the Camera 2 image.)

POSIT output:
- rotation: 3 x 3 rotation matrix of the scene with respect to the camera
- translation: 3 x 1 translation vector from the projection centre of the camera to the FIRST point in the list of object points
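A hedged usage sketch of the call above; the object and image points are hypothetical, while the focal length and image centre are taken from the Camera 1 intrinsic calibration:

```matlab
% Illustrative call of the POSIT routine with the signature shown above.
% The four non-coplanar object points and their image projections are hypothetical.
objectPoints = [   0    0    0;    % one point per row (nbPts x 3), object coordinate system
                1000    0    0;
                   0 1000    0;
                   0    0  500];
imagePoints  = [686 350;           % matching pixel coordinates (nbPts x 2), same order
                912 361;
                701 552;
                690 281];
focalLength  = 1458.4;             % focal length in pixels (from the intrinsic calibration)
center       = [686.5 350.1];      % image centre [cx cy]

[rotation, translation] = Posit(imagePoints, objectPoints, focalLength, center);
```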

Output: extrinsic parameters of each camera.

$$R_{cam1} = \begin{pmatrix} 1.2454 & -0.3342 & -0.0195 \\ -0.0904 & -0.4501 & -0.6249 \\ 0.2001 & 0.7800 & -0.5908 \end{pmatrix}, \qquad t_{cam1} = \begin{pmatrix} 1051 \\ 2830 \\ 32268 \end{pmatrix}$$

$$R_{cam2} = \begin{pmatrix} 1.3088 & 0.0228 & -0.0423 \\ -0.0242 & -0.3105 & -0.6971 \\ -0.0290 & 0.9135 & -0.4058 \end{pmatrix}, \qquad t_{cam2} = \begin{pmatrix} 357 \\ 3823 \\ 26989 \end{pmatrix}$$

Filling in the calibration results gives the full camera matrix of each camera.

Camera 1:
$$\begin{pmatrix} x_{pixel} \\ y_{pixel} \\ 1 \end{pmatrix} = \begin{pmatrix} 1458.4 & 0 & 686.5 \\ 0 & 1456.9 & 350.1 \\ 0 & 0 & 1 \end{pmatrix} \begin{pmatrix} 1.2454 & -0.3342 & -0.0195 & 1051 \\ -0.0904 & -0.4501 & -0.6249 & 2830 \\ 0.2001 & 0.7800 & -0.5908 & 32268 \end{pmatrix} \begin{pmatrix} x_{world} \\ y_{world} \\ z_{world} \\ 1 \end{pmatrix}$$

Camera 2:
$$\begin{pmatrix} x_{pixel} \\ y_{pixel} \\ 1 \end{pmatrix} = \begin{pmatrix} 2067.7 & 0 & 678.8 \\ 0 & 2068.2 & 340.5 \\ 0 & 0 & 1 \end{pmatrix} \begin{pmatrix} 1.3088 & 0.0228 & -0.0423 & 357 \\ -0.0242 & -0.3105 & -0.6971 & 3823 \\ -0.0290 & 0.9135 & -0.4058 & 26989 \end{pmatrix} \begin{pmatrix} x_{world} \\ y_{world} \\ z_{world} \\ 1 \end{pmatrix}$$
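Putting both results together, a small MATLAB sketch assembling the full 3x4 camera matrices; the values are copied from the slides above:

```matlab
% Assemble the 3x4 camera matrices P = A * [R t] of both cameras
% from the intrinsic (Bouguet) and extrinsic (POSIT) results above.
A1 = [1458.4 0 686.5; 0 1456.9 350.1; 0 0 1];
R1 = [ 1.2454 -0.3342 -0.0195;
      -0.0904 -0.4501 -0.6249;
       0.2001  0.7800 -0.5908];
t1 = [1051; 2830; 32268];

A2 = [2067.7 0 678.8; 0 2068.2 340.5; 0 0 1];
R2 = [ 1.3088  0.0228 -0.0423;
      -0.0242 -0.3105 -0.6971;
      -0.0290  0.9135 -0.4058];
t2 = [357; 3823; 26989];

P1 = A1 * [R1 t1];   % camera matrix of camera 1
P2 = A2 * [R2 t2];   % camera matrix of camera 2
```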

Goal (recap): calibration has produced the camera matrices; the next step is reconstruction and the get_mapped.m function for WP3.

Reconstruction can be done in two ways: 3D reconstruction (triangulation) or 2D reconstruction (calibration matrix).

3D reconstruction (triangulation). Epipolar geometry: when two cameras view a 3D scene from two distinct positions, there are a number of geometric relations between the 3D points and their projections onto the 2D images that lead to constraints between the image points.

Triangulation is the process of determining a point in 3D space given its projections onto two or more images. To solve this problem, it is necessary to know the parameters of the camera projection function from 3D to 2D for the cameras involved, represented by the camera matrices. A sketch of one standard way to do this follows below.
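A minimal sketch of linear (DLT) triangulation using the two camera matrices P1 and P2 assembled earlier; this is a generic textbook method and not necessarily the exact routine used in the project:

```matlab
function X_world = triangulate_point(P1, x1, y1, P2, x2, y2)
% Linear (DLT) triangulation: recover a 3D point from its pixel
% coordinates (x1,y1) and (x2,y2) in two views with 3x4 camera
% matrices P1 and P2. Generic sketch, not the project's own code.
A = [ x1 * P1(3,:) - P1(1,:);
      y1 * P1(3,:) - P1(2,:);
      x2 * P2(3,:) - P2(1,:);
      y2 * P2(3,:) - P2(2,:) ];
[~, ~, V] = svd(A);          % least-squares solution is the last right singular vector
X = V(:, end);
X_world = X(1:3) / X(4);     % de-homogenise to [x; y; z]
end
```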

2D reconstruction (calibration matrix). The general mapping depends on 3 world dimensions, which cannot all be recovered from a single image point:

$$\begin{pmatrix} x_{pixel} \\ y_{pixel} \\ 1 \end{pmatrix} = \begin{pmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{pmatrix} \begin{pmatrix} r_{11} & r_{12} & r_{13} & t_1 \\ r_{21} & r_{22} & r_{23} & t_2 \\ r_{31} & r_{32} & r_{33} & t_3 \end{pmatrix} \begin{pmatrix} x_{world} \\ y_{world} \\ z_{world} \\ 1 \end{pmatrix}$$

Only 2 world dimensions are actually needed: if the point of interest is assumed to lie in a known world plane, chosen as $z_{world} = 0$ (for example the ground plane of the site), the third column of the extrinsic matrix no longer contributes and the mapping reduces to an invertible 3x3 matrix.

With $z_{world} = 0$:

$$\begin{pmatrix} x_{pixel} \\ y_{pixel} \\ 1 \end{pmatrix} = \begin{pmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{pmatrix} \begin{pmatrix} r_{11} & r_{12} & t_1 \\ r_{21} & r_{22} & t_2 \\ r_{31} & r_{32} & t_3 \end{pmatrix} \begin{pmatrix} x_{world} \\ y_{world} \\ 1 \end{pmatrix}$$
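In MATLAB terms, this reduced mapping is the intrinsic matrix times the extrinsic matrix with its third column removed; a sketch reusing A1, R1 and t1 from the camera-matrix sketch above, with a hypothetical world point:

```matlab
% Reduced 3x3 mapping for points in the plane z_world = 0:
% drop the third column of [R t] so that the matrix becomes invertible.
H1 = A1 * [R1(:, 1:2) t1];        % camera 1: world plane -> pixels
H2 = A2 * [R2(:, 1:2) t2];        % camera 2: world plane -> pixels

p = H1 * [500; 1200; 1];          % hypothetical point in the z_world = 0 plane
x_pixel = p(1) / p(3);            % normalise the homogeneous result
y_pixel = p(2) / p(3);
```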

Inverting this 3x3 product maps pixel coordinates back to world coordinates (the result is homogeneous, so the first two components are divided by the third):

$$\begin{pmatrix} x_{world} \\ y_{world} \\ 1 \end{pmatrix} = \left[ \begin{pmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{pmatrix} \begin{pmatrix} r_{11} & r_{12} & t_1 \\ r_{21} & r_{22} & t_2 \\ r_{31} & r_{32} & t_3 \end{pmatrix} \right]^{-1} \begin{pmatrix} x_{pixel} \\ y_{pixel} \\ 1 \end{pmatrix}$$

This inverse mapping is implemented in the function get_mapped.m, which is delivered to WP3:

function [x_world, y_world] = get_mapped(camera_nr, x_pixel, y_pixel)
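A minimal sketch of what get_mapped.m could look like under these assumptions; the stored 3x3 matrices are placeholders and the real WP3 function may organise the calibration data differently:

```matlab
function [x_world, y_world] = get_mapped(camera_nr, x_pixel, y_pixel)
% Map a pixel coordinate of camera 'camera_nr' back to world coordinates
% in the z_world = 0 plane, using the inverted 3x3 calibration matrix.
% Sketch only: H1 and H2 stand for A * [r1 r2 t] of camera 1 and camera 2,
% filled in here with identity placeholders instead of the real results.
H1 = eye(3);                       % placeholder for camera 1
H2 = eye(3);                       % placeholder for camera 2
if camera_nr == 1
    H = H1;
else
    H = H2;
end
w = H \ [x_pixel; y_pixel; 1];     % apply the inverse mapping
x_world = w(1) / w(3);             % normalise the homogeneous result
y_world = w(2) / w(3);
end
```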

Motion detection Hannes Van De Vreken

Person detection Jan Heuninck

Demonstration