Omnidirectional Stereo Vision

Presentation transcript:

Omnidirectional Stereo Vision
Capstone 2004, Lecture 18
Zhigang Zhu
Computer Science Department, The City College, CUNY
zhu@cs.ccny.cuny.edu
http://www-cs.engr.ccny.cuny.edu/~zhu/

Acknowledgements
Collaborators at UMass: Edward Riseman, Allen Hanson, Deepak Karuppiah, Howard Schultz, ...
Supported by: NSF Environmental Monitoring, DARPA/ITO Mobile Autonomous Robot S/W, China NSF Scene Modeling
Paper (with references): http://www-cs.engr.ccny.cuny.edu/~zhu/zOmniStereo01.pdf

The Class of Omnistereo (omnidirectional stereo vision)
Omnidirectional Vision: how to look
- Viewer-centered: outward looking
- Object-centered: inward looking
Omnistereo Vision: how many viewpoints
- Binocular/N-ocular: a few (2 or more), fixed
- Circular projection: many, inside a small area
- Dynamic omnistereo: a few, but configurable
- Object-centered: many, in a large space

Important Issues of Omnistereo
What this lecture is about:
- Omnistereo imaging: principles for sensor designs
- Epipolar geometry for correspondence
- Depth error characterization, in both direction and distance
Other important issues not in this talk:
- Sensor designs
- Calibration methods
- Correspondence algorithms

Omni Imaging & Representation
Omnidirectional (panoramic) imaging:
- Catadioptric camera (single effective viewpoint): ParaVision by RemoteReality, PAL, and many more
- Image mosaicing: rotating camera, translating camera, arbitrary motion
Omnidirectional representation:
- Cylindrical representation
- Spherical representation

Panoramic Annular Lens (PAL)
Panoramic camera: the Panoramic Annular Lens (PAL), by Pal Greguss.

Panoramic Mosaics from a Rotating Camera (ICMCS99)

Cylindrical Panorama
(Figure: 1st frame -> connecting frame -> conic mosaic -> head-tail stitching -> panorama.)

Cylindrical Projection
Image projection (φ, v) of a 3D point P(X, Y, Z).
(Figure: cylindrical image around the vertical axis Y, centered at O; D is the horizontal distance from O to P.)
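A minimal numeric sketch of this projection (an illustration only; it assumes the azimuth φ is measured around the vertical Y axis in the X-Z plane, that the cylinder has radius f, and the function name is ours, not the slides'):

```python
import math

def cylindrical_project(X, Y, Z, f=1.0):
    """Project a 3D point P(X, Y, Z) onto a cylindrical image of radius f.

    Returns (phi, v): phi is the azimuth around the vertical axis,
    v the height on the cylinder (perspective in the vertical direction).
    """
    D = math.hypot(X, Z)        # horizontal distance from the axis to P
    phi = math.atan2(X, Z)      # azimuth angle
    v = f * Y / D               # vertical image coordinate
    return phi, v

# Example: a point 5 m away (D = 5) and 1 m above the camera height
print(cylindrical_project(3.0, 1.0, 4.0))
```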

Binocular / N-Ocular Omnistereo
A few fixed viewpoints; three configurations:
- Horizontally-aligned binocular (H-Bi) omnistereo
- Vertically-aligned binocular (V-Bi) omnistereo
- N-ocular omnistereo (trinocular case)
Issues:
- Distance error across the full 360 degrees of directions
- Distance error versus distance
- Epipolar geometry

H-Bi Omnistereo: depth error
From an image pair {(φ1, v1), (φ2, v2)} to a 3D point P(X, Y, Z) by triangulation.
- B: fixed baseline
- Δφ: horizontal disparity (vergence angle)
Depth error:
δD = B (cos φ2 sin Δφ δφ2 - sin φ2 cos Δφ δΔφ) / sin²Δφ = B sin(Δφ + φ2) / sin²Δφ · δφ = D² cos Δφ / (B sin φ2) · δφ ≤ D² / (B sin φ2) · δφ
Instead of working through the above, consider how the error equation is derived for binocular planar stereo: Z = FB/dx, so δZ = (Z²/(FB)) δ(dx). Why not δZ = -(FB/dx²) δ(dx)? Because dx is itself a function of F and B! (A worked version follows this slide.)
Consequences:
- Depth accuracy is non-isotropic; maximum vergence only when φ2 = 90°
- Does not make full use of the 360° viewing
- Depth error proportional to depth² / baseline
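For reference, here is the planar binocular stereo derivation the slide alludes to (a short sketch; F is the focal length, B the baseline, dx the disparity):

```latex
Z = \frac{FB}{d_x},
\qquad
\frac{\partial Z}{\partial d_x} = -\frac{FB}{d_x^{2}}
 = -\frac{FB}{(FB/Z)^{2}}
 = -\frac{Z^{2}}{FB}
\;\Rightarrow\;
|\delta Z| = \frac{Z^{2}}{FB}\,|\delta d_x| .
```

Writing the sensitivity as Z²/(FB) rather than FB/dx² is what makes comparisons across different F and B meaningful, because dx itself depends on F and B.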

H-Bi Omnistereo: singularity case
Zero vergence angle when φ1 = φ2 = 0° or 180° (directions along the baseline, i.e. the epipoles).
- Distance Ratio Method
- Visible epipoles: the image of each camera center can be visible in the other panorama
- Vertical disparity and vertical epipolar lines

H-Bi Omnistereo: Epipolar geometry
Given a point (φ2, v2), search for the corresponding (φ1, v1).
- The epipolar curves are sine curves in the non-singular cases
- The epipolar lines run along the v direction in the singular cases (the triangulation singularities / depth-blind spots at φ1 = 180° and 360°)
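A small numeric sketch of this search (not the original implementation): sample depths along the ray defined by (φ2, v2), project every candidate point into panorama 1 with the cylindrical model sketched earlier, and collect the resulting (φ1, v1) curve. Camera 1 at the origin and camera 2 at (B, 0, 0) are assumptions for the illustration.

```python
import math

def ray_direction(phi, v, f=1.0):
    """Direction of the viewing ray for cylindrical pixel (phi, v)."""
    return (f * math.sin(phi), v, f * math.cos(phi))

def epipolar_curve(phi2, v2, B=1.0, f=1.0, num=30):
    """Trace the epipolar curve of pixel (phi2, v2) of panorama 2 in panorama 1."""
    dx, dy, dz = ray_direction(phi2, v2, f)
    curve = []
    for k in range(num):
        t = 0.5 * 1.3 ** k                        # sampled depth along the ray
        X, Y, Z = B + t * dx, t * dy, t * dz      # candidate 3D point
        D = math.hypot(X, Z)
        curve.append((math.atan2(X, Z), f * Y / D))  # its (phi1, v1) in panorama 1
    return curve

for phi1, v1 in epipolar_curve(math.radians(30), 0.1)[:5]:
    print(round(math.degrees(phi1), 1), round(v1, 3))
```

Plotting v1 against φ1 for many sampled depths shows the sine-like shape of the curve; when φ2 points along the baseline the curve degenerates into a vertical line, as stated above.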

V-Bi Omnistereo
From an image pair {(φ1, v1), (φ2, v2)} to a 3D point P(X, Y, Z):
- Bv: vertical baseline; Δv: vertical disparity (same as perspective stereo)
- Depth accuracy is isotropic in all directions
- Depth error proportional to the square of distance
- Epipolar lines are simply vertical lines
- But NO stereo viewing without 3D reconstruction
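Under the cylindrical model sketched earlier, the vertical-baseline case works out just like ordinary perspective stereo (a sketch; f is the cylinder radius, D the horizontal distance to P, and camera 2 is assumed to sit a height Bv above camera 1):

```latex
v_1 = \frac{fY}{D},\qquad
v_2 = \frac{f\,(Y - B_v)}{D}
\;\Rightarrow\;
\Delta v = v_1 - v_2 = \frac{f B_v}{D},\qquad
D = \frac{f B_v}{\Delta v},\qquad
|\delta D| = \frac{D^{2}}{f B_v}\,|\delta(\Delta v)| .
```

The relation contains no azimuth φ, so the error is the same in every direction and grows with D², as listed above.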

N-Ocular Omnistereo
Why more viewpoints? Every point of the 360° FOV around the center of the sensor triangle can be covered by at least two pairs of rays from different cameras with good triangulation.
- Depth accuracy is still not isotropic, but is more uniform across directions
- One pair of stereo matches can be verified using the second pair
- However, there is no gain in epipolar geometry

Circular Projection Omnistereo
Many viewpoints on a viewing circle.
Omnivergent stereo (Shum et al., ICCV99): every point in the scene is imaged by two cameras that are vergent on that point with maximum vergence angle, so stereo recovery yields isotropic depth resolution in all directions.
Solutions: circular projection / concentric mosaics
- A single off-center rotating camera (Peleg CVPR99, Shum ICCV99)
- Full optical design (Peleg PAMI 2000)
- My catadioptric omnistereo rig

Circular Projection: principle
Many viewpoints on a viewing circle: a virtual camera moving on the viewing circle captures two sets of rays on a plane tangent to the viewing circle, the left-eye rays in the clockwise direction and the right-eye rays in the counterclockwise direction.
(Figure: case 1, an omni sensor; case 2, two 1D sensors.)

Circular Projection: geometry
Maximum vergence angles for the left and right rays.
- P: 3D space point
- r: radius of the viewing circle
- φ1, φ2: viewing directions of the left and right rays
- φ: vergence angle (angular disparity)
- B: baseline length (< 2r)
- D: distance |OP|
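With both rays tangent to the viewing circle, the triangulation can be written out directly (a sketch using the symbols above):

```latex
\sin\frac{\phi}{2} = \frac{r}{D}
\;\Rightarrow\;
D = \frac{r}{\sin(\phi/2)},\qquad
B = 2r\cos\frac{\phi}{2} < 2r,\qquad
|\delta D| = \frac{D^{2}\cos(\phi/2)}{2r}\,|\delta\phi|
 \le \frac{D^{2}}{2r}\,|\delta\phi| .
```

The bound D²/(2r) holds in every viewing direction, which is the isotropy property claimed on the next slide; the radius r (hence the limited baseline) is what caps the accuracy.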

Circular Projection: properties
Depth estimation is isotropic:
- Same depth error in all directions; makes full use of the 360° viewing
Depth error proportional to depth²/baseline:
- Same as H-Bi omnistereo, but with a limited baseline (B < 2r)
Horizontal epipolar lines:
- Superior to H-Bi omnistereo when a single viewing circle is used for the left and right omni-images
Extension to concentric mosaics with viewing circles of different radii?

Circular Projection: Implementation
Cameras: single? multiple? standard? special?
Requirements: two sets of rays 180° apart.
Methods:
1. Two rectilinear cameras
2. An omnidirectional camera
Question: can we do it with a single rectilinear camera?

Circular Projection: Implementation (1)
Single-camera approach:
- Rotate a rectilinear camera off its optical center (the optical center follows a circular path of radius R around the rotation axis)
- Take two columns with angular distance 2β << 180°
- The viewing circle is smaller than the circular path of the optical center
- Like stretching your arm out: the camera viewfinder may be too far from your eyes
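A geometric note on why the viewing circle is smaller than the optical-center path (a sketch under the assumptions that the camera looks radially outward and the two selected columns correspond to rays at ±β off the optical axis): each such ray leaves the optical center, at distance R from the rotation axis, at angle β to the radial direction, so its distance from the axis, i.e. the radius of the circle all such rays are tangent to, is

```latex
r = R\,\sin\beta \;\le\; R .
```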

Circular Projection: Implementation (2)
Catadioptric approach:
- Rotate a pair of mirrors with a camera about the camera's optical center (the rotation axis passes through the optical center; the mirror pair acts as two "virtual" cameras OL and OR)
- Look outward at the scene through two slit windows
- Larger viewing circle, since the mirrors enlarge the viewing angle
- The camera viewfinder is right in front of your eyes

Dynamic Omnistereo
A few viewpoints, moving freely (OmniVision 2000).
Requirements:
- An optimal configuration for any given point in the world
- The ability to change the vergence angle and the baseline freely
Issues:
- Dynamic calibration
- View planning
(Figure: two cameras observing a target, with the baseline between the two images.)

Dynamic Omnistereo: depth error
Question 1: vergence angle
- Use the maximum vergence angle (φ2 = 90°)
Question 2: baseline
- The larger the better? Consider the error in estimating the baseline itself.

Dynamic Omnistereo: mutual calibration
The sensors (PAL 1 and PAL 2) serve as each other's calibration targets:
- Make use of the visible epipoles
- Known target geometry: the cylindrical body of the moving platform (radius Rc)
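A sketch of how the known cylindrical body can fix the baseline (assuming α denotes the half-angle that the other robot's cylinder of radius Rc subtends in the panorama; the two tangent rays to the cylinder then give):

```latex
\sin\alpha = \frac{R_c}{B}
\;\Rightarrow\;
B = \frac{R_c}{\sin\alpha} .
```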

Mutual calibration and human tracking: an example
Pano 1: image of the 2nd robot, plus images of a person. Pano 2: image of the 1st robot.
Results: B = 180 cm, D1 = 359 cm, D2 = 208 cm

Dynamic Omnistereo: Optimal view
- Baseline error is proportional to B²: a larger baseline means an even larger baseline error
- The overall distance error is minimized with the "best" baseline and the maximum vergence angle
- The distance error with the optimal configuration is proportional to D^1.5
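One way to see where the D^1.5 behavior comes from (a sketch with unspecified constants a and k): model the overall distance error as a triangulation term growing as D²/B plus the effect of the baseline error, which is proportional to B² and enters D roughly through the factor D/B, then minimize over B.

```latex
E(B) \approx a\,\frac{D^{2}}{B} + k\,D\,B,
\qquad
\frac{dE}{dB} = -a\frac{D^{2}}{B^{2}} + kD = 0
\;\Rightarrow\;
B^{*} = \sqrt{\tfrac{a}{k}\,D},
\qquad
E(B^{*}) = 2\sqrt{ak}\;D^{1.5} .
```

So the best baseline grows like the square root of the distance, and the resulting error like D^1.5.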

Dynamic Omnistereo: Optimal view application
Track a single target with two robots:
- One stationary, one moving
- Omnistereo head with reconfigurable vergence and baseline
(Figure: robot O1 fixed; robot O2 moves from O2(1) to O2(2) as the target moves from T(1) to T(2).)

Dynamic Omnistereo: error simulation
Student project in the spring of 2003 (Java applet):
http://www-cs.engr.ccny.cuny.edu/~zhu/omnistereo/simulation/

Comparisons
Four cases:
- Fixed-viewpoint omnistereo
- One fixed, one circular projection
- Both circular projection
- Dynamic omnistereo
Java interactive simulations: http://www-cs.engr.ccny.cuny.edu/~zhu/omnistereo/errormaps/

Object-Centered OmniStereo
- Looking inward rather than looking outward
- Modeling objects rather than scenes
- Many viewpoints over a large space
(Figure: inward rotation around an object, e.g. modeling a building; outward rotation and translation over the Earth plane, e.g. modeling the Earth.)

Omni modeling of an object
Inward-looking rotation:
- Many viewpoints over a large circle
- Circular projection, with the viewing circle inside the object
- Can rotate a (small) object (e.g., a human) instead of moving the camera

Omni modeling of the earth
Modeling the earth:
- Airplane flying along great circles, taking the leading and trailing edges of each frame
- Data amount: about 10^17 pixels at 10 cm² per pixel, or 10^15 pixels at 1 m² per pixel (10^12 = 1 Tera = 1000 Giga)
Modeling a small area:
- Rotation can be approximated as translation
- Parallel-perspective stereo mosaics; virtual fly-through over the Earth plane
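A quick order-of-magnitude check of these pixel counts (using Earth's surface area 4πR² with R ≈ 6.4 × 10^6 m):

```latex
4\pi R^{2} \approx 5.1\times 10^{14}\ \mathrm{m^{2}}
\;\Rightarrow\;
\approx 5\times 10^{14}\ \text{pixels at } 1\ \mathrm{m^{2}/pixel},
\qquad
\approx 5\times 10^{17}\ \text{pixels at } 10\ \mathrm{cm^{2}}\ (10^{-3}\ \mathrm{m^{2}})\ \text{per pixel}.
```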

Parallel-perspective stereo mosaics
Ideal model: the sensor motion is a 1D translation with nadir view, giving two "virtual" pushbroom cameras (a "left" mosaic and a "right" mosaic built from the sensor's image plane).
Real applications:
- Airborne cameras (UMass, Sarnoff, ...)
- Ground vehicles (Tsinghua, Osaka)
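A toy sketch of how the two parallel-perspective mosaics can be assembled under the ideal model (1D translation, nadir view). The column offset d, the placement rule, and all names here are illustrative assumptions, not the original pipeline:

```python
import numpy as np

def build_stereo_mosaics(frames, positions, d=40, pixels_per_meter=10.0):
    """Assemble 'left' and 'right' parallel-perspective mosaics.

    frames:    list of H x W x 3 images taken while translating along y
    positions: camera y-positions in meters, one per frame
    d:         column offset (pixels) from the image center; the forward
               column feeds one mosaic, the backward column the other
    """
    H, W, _ = frames[0].shape
    cx = W // 2
    rows = int(max(positions) * pixels_per_meter) + 1
    left = np.zeros((rows, H, 3), dtype=frames[0].dtype)
    right = np.zeros_like(left)
    for img, y in zip(frames, positions):
        row = int(y * pixels_per_meter)   # strip placed by camera position, not by image matching
        left[row] = img[:, cx + d]        # forward-looking column
        right[row] = img[:, cx - d]       # backward-looking column
    return left, right

# Usage with synthetic data: 100 frames, camera advancing 0.1 m per frame
frames = [np.random.randint(0, 255, (240, 320, 3), dtype=np.uint8) for _ in range(100)]
positions = [0.1 * i for i in range(100)]
left, right = build_stereo_mosaics(frames, positions)
print(left.shape, right.shape)
```

Each mosaic is perspective across-track (along each strip) but parallel-projection along the flight direction, which is what makes the pair a pushbroom-style stereo pair with a large FOV.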

Re-Organizing the images…
Stereo pair with large FOVs and adaptive baselines.

Recovering Depth from Mosaics
Parallel-perspective stereo mosaics:
- Depth accuracy independent of depth (in theory)
- Two views of P(X, Y, Z) from different perspectives
- Adaptive baseline (displacement), fixed disparity!
(Figure: flying height H; ground truth from a laser profiler and GPS/IMU.)
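A sketch of the ideal-model depth relation (assuming focal length F in pixels, column offset ±d as in the mosaic-construction sketch above, and height Z of the camera above the point): the point is captured into the left mosaic when the camera is at y_L and into the right mosaic when it is at y_R, so

```latex
y_R - y_L = \frac{2d}{F}\,Z
\;\Rightarrow\;
Z = \frac{F\,(y_R - y_L)}{2d} .
```

The in-image disparity is fixed at 2d while the baseline y_R - y_L adapts to the depth, and the sensitivity ∂Z/∂(y_R - y_L) = F/(2d) does not depend on Z, which is the "depth accuracy independent of depth" property claimed above.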

Stereo mosaics of Amazon rain forest
A 166-frame telephoto video sequence -> 7056 x 944 mosaics (left mosaic, right mosaic, depth map).

Stereo viewing Red: Right view; Blue/Green: Left view

Accuracy of 3D from stereo mosaics (ICCV01, VideoReg01)
- Adaptive baselines and fixed disparity: uniform depth resolution in theory, and accuracy proportional to depth in practice
- The 3D recovery accuracy of parallel-perspective stereo mosaics is comparable to that of a perspective stereo with an optimal baseline

Conclusions

Config | Viewpoints | Epipolar geometry | Error in direction | Error in distance
Binocular | 2, fixed | sine curve | non-isotropic | ∝ D²/B
Dynamic | 2, free | | optimal for target | ∝ D^1.5
VCP (viewer-centered circular projection) | many, small circle | horizontal line | isotropic | ∝ D²/2r
OCP (object-centered circular projection) | many, large circle | | |
PPP (parallel-perspective) | many, on a line | | uniform everywhere | uniform, or ∝ D