Stereo Ranging with Verging Cameras. Based on the paper by E. Krotkov, K. Henriksen, and R. Kories.

The system description Two cameras mounted on a platform can translate horizontally and vertically, and rotate left-right and up-down. Motors mounted on each lens adjust the focal length, focusing distance, and aperture diameter. Further, the two cameras can verge by rotating toward each other (converging) or away from each other (diverging). The travel from minimum to maximum vergence angle is approximately 6 degrees, covering 90,000 motor steps. Potential advantages of vergence include increasing the field of view common to the two cameras and constraining the stereo disparity.
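The travel figures above imply a step-to-angle conversion for the vergence motor. A minimal sketch, assuming a linear mapping between motor steps and vergence angle (the real mechanism may be nonlinear; the function name is my own):

```python
# Hypothetical helper: convert vergence-motor steps to an angle, assuming
# (per the slides) ~6 degrees of travel over 90,000 motor steps and a
# linear step-to-angle mapping, which is a modelling assumption here.
DEGREES_OF_TRAVEL = 6.0
MOTOR_STEPS_OF_TRAVEL = 90_000

def steps_to_vergence_deg(steps: int) -> float:
    """Map a motor position in steps to a vergence angle in degrees."""
    return steps * DEGREES_OF_TRAVEL / MOTOR_STEPS_OF_TRAVEL

print(steps_to_vergence_deg(45_000))  # mid-travel -> 3.0 degrees
```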

Model Each lens is modeled as a pinhole. The lens centers are separated by a baseline distance b, and both lenses have focal length f. Each camera is associated with a reference frame, L or R, with origin at the lens center and Z axis coincident with the optical axis, positive in the direction of the scene. We define a Cyclopean reference frame W with origin at the midpoint of the baseline, Z axis normal to the baseline, and X axis coincident with the baseline.
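The frames just defined support a simple ray-intersection view of verged stereo. A minimal triangulation sketch in the X-Z plane of W, under the stated pinhole model; the sign conventions (image x grows toward the other camera and adds to the inward vergence angle) are my assumption, not taken from the paper:

```python
import math

def range_from_vergence(b, f, theta_l, theta_r, x_l, x_r):
    """Depth Z in the Cyclopean frame W for a point imaged at
    x-coordinates x_l, x_r by two pinhole cameras with baseline b,
    focal length f, and inward vergence angles theta_l, theta_r (rad).
    Each camera's ray makes angle (vergence + atan(x/f)) with W's Z
    axis; intersecting the two rays in the X-Z plane gives
    Z = b / (tan(phi_l) + tan(phi_r))."""
    phi_l = theta_l + math.atan2(x_l, f)  # left ray angle from Z axis
    phi_r = theta_r + math.atan2(x_r, f)  # right ray angle from Z axis
    return b / (math.tan(phi_l) + math.tan(phi_r))

# Sanity check at the fixation point: both image coordinates zero and
# symmetric vergence theta recover the classic Z = (b/2) / tan(theta).
theta = math.radians(3.0)
print(range_from_vergence(0.1, 0.01, theta, theta, 0.0, 0.0))
```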

Vergence mechanism

Reference Frames

The problem

Computing Range

Perspective transformation yields the image coordinates

Derivation for Z

The distance

Relationship of parameters to observation

Indirect measurements

Getting f and baseline from measurements

Constraint on offset

Parameter identification

Parameter Identification Second, we acquire disparity data from the scene viewed with different vergence angles. We servo the vergence motor to N different positions. At each position we digitize a stereo pair of images and identify conjugate image points for each of the M objects. The outcome of this procedure is M x N conjugate point pairs and the associated vergence motor positions V.
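The acquisition loop above can be sketched as follows. The motor and image-processing interfaces (servo_vergence, grab_stereo_pair, find_conjugate_points) are hypothetical stand-ins; only the shape of the output matters: for M objects and N vergence positions we collect an M x N table of conjugate pairs, one column per motor position V:

```python
# Illustrative sketch of the data-acquisition procedure; the three
# callables are hypothetical hardware/vision interfaces, not part of
# the original system's API.
def acquire_disparity_data(vergence_positions, num_objects,
                           servo_vergence, grab_stereo_pair,
                           find_conjugate_points):
    table = []                        # N columns of M (x_l, x_r) pairs
    for v in vergence_positions:
        servo_vergence(v)             # move to the next vergence position
        left, right = grab_stereo_pair()
        pairs = find_conjugate_points(left, right, num_objects)
        table.append({"motor_position": v, "conjugates": pairs})
    return table
```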

Parameter Identification Third, we search for the least-squares values of the baseline and the offset.
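A minimal sketch of such a least-squares search, under two assumptions of mine: symmetric vergence with fixation range Z = (b/2) / tan(theta), and a linear step-to-angle map theta = k * (motor_pos - offset) using the slides' 6-degree / 90,000-step travel. A brute-force grid search stands in for whatever optimizer the authors actually used:

```python
import math

STEP_TO_RAD = math.radians(6.0) / 90_000  # from the slides' travel figures

def predicted_range(b, v0, motor_pos):
    """Fixation distance Z = (b/2)/tan(theta) with
    theta = STEP_TO_RAD * (motor_pos - v0); both the linear step map
    and symmetric vergence are modelling assumptions for this sketch."""
    theta = STEP_TO_RAD * (motor_pos - v0)
    return (b / 2.0) / math.tan(theta)

def fit_baseline_offset(motor_positions, true_ranges, b_grid, v0_grid):
    """Brute-force least-squares search over candidate (b, v0) pairs."""
    best = None
    for b in b_grid:
        for v0 in v0_grid:
            err = sum((predicted_range(b, v0, v) - z) ** 2
                      for v, z in zip(motor_positions, true_ranges))
            if best is None or err < best[0]:
                best = (err, b, v0)
    return best[1], best[2]

# Synthetic sanity check: generate ranges with b = 0.10 m and
# v0 = 500 steps, then recover both from the grid.
b_true, v0_true = 0.10, 500.0
motor = [10_000, 20_000, 40_000, 60_000]
ranges = [predicted_range(b_true, v0_true, v) for v in motor]
b_hat, v0_hat = fit_baseline_offset(
    motor, ranges,
    b_grid=[0.08 + 0.01 * i for i in range(5)],
    v0_grid=[0.0, 250.0, 500.0, 750.0])
print(b_hat, v0_hat)  # should recover ~0.10 and 500.0
```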

Experimental Results

Figure explanation We identified conjugate image points for each of 7 objects at 9 vergence positions. For each vergence position, the distances to the 7 objects were measured. The outcome is 7 x 9 conjugate points and the associated vergence motor positions.

Distance versus disparity

Vergence Changes Baseline Distance

This figure illustrates the case in which the center of rotation does not coincide with the lens center.