More Mosaic Madness © Jeffrey Martin (jeffrey-martin.com)

More Mosaic Madness © Jeffrey Martin (jeffrey-martin.com) CS194: Image Manipulation & Computational Photography Alexei Efros, UC Berkeley, Fall 2018 with a lot of slides stolen from Steve Seitz and Rick Szeliski

Spherical Panoramas All light rays through a point form a panorama See also: Times Square New Year's Eve http://www.panoramas.dk/fullscreen3/f1.html The panorama is totally captured in a 2D array -- P(θ,φ) Where is the geometry???

Homography A: Projective – a mapping between any two PPs with the same center of projection: a rectangle maps to an arbitrary quadrilateral, parallel lines aren't preserved, but straight lines must be; same as project, rotate, reproject; called a Homography. (Figure: planes PP1 and PP2, with H mapping p to p'.) To apply a homography H: compute p' = Hp (regular matrix multiply), then convert p' from homogeneous to image coordinates.
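As a concrete illustration of those two steps, here is a minimal numpy sketch (the function name and the (N, 2) point layout are my own choices, not from the slides):

```python
import numpy as np

def apply_homography(H, pts):
    """Apply a 3x3 homography H to an (N, 2) array of image points."""
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])  # lift to homogeneous coords
    mapped = (H @ pts_h.T).T                          # p' = H p (regular matrix multiply)
    return mapped[:, :2] / mapped[:, 2:3]             # divide out w to get image coords
```

The division by the third coordinate is exactly the "convert from homogeneous to image coordinates" step on the slide.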

Rotational Mosaics Can we say something more about rotational mosaics? i.e. can we further constrain our H?

3D → 2D Perspective Projection (Figure: a camera with focal length f and intrinsics K projects the 3D point (Xc,Yc,Zc) to the image point u; uc is the principal point.)

3D Rotation Model Projection equations: project from the image to a 3D ray, (x0,y0,z0) = (u0-uc, v0-vc, f); rotate the ray by the camera motion, (x1,y1,z1) = R01 (x0,y0,z0); project back into the new (source) image, (u1,v1) = (f·x1/z1 + uc, f·y1/z1 + vc). Therefore: our homography has only 3, 4, or 5 DOF, depending on whether the focal length is known, the same, or different. This makes image registration much better behaved.
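Composing those three steps gives the rotational homography H = K R K⁻¹. A small sketch of that composition, assuming the simple intrinsics implied by the slide (square pixels, no skew, principal point (uc, vc)):

```python
import numpy as np

def rotational_homography(R, f, uc, vc):
    """Homography induced by a pure camera rotation R: H = K R K^-1."""
    K = np.array([[f, 0.0, uc],
                  [0.0, f, vc],
                  [0.0, 0.0, 1.0]])   # projects ray (x, y, z) to (f x/z + uc, f y/z + vc)
    return K @ R @ np.linalg.inv(K)   # unproject, rotate, reproject
```

With f known, only the 3 rotation parameters remain free, which is the 3-DOF case the slide mentions.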

Pairwise alignment Procrustes Algorithm [Golub & Van Loan]: given two sets of matching points, compute R such that p_i' = R p_i, where the 3D rays are p_i = N(x_i, y_i, z_i) = N(u_i - uc, v_i - vc, f). Form A = Σ_i p_i p_i'ᵀ = (Σ_i p_i p_iᵀ) Rᵀ; with the SVD A = U S Vᵀ = (U S Uᵀ) Rᵀ, we get Vᵀ = Uᵀ Rᵀ, so R = V Uᵀ.
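A minimal numpy sketch of that recipe, with the rays stacked as rows of (N, 3) arrays; the determinant check at the end is a standard practical guard against reflections and is not on the slide:

```python
import numpy as np

def procrustes_rotation(p, p_prime):
    """Rotation R with p_prime[i] ≈ R @ p[i], for (N, 3) arrays of unit rays."""
    A = p.T @ p_prime                 # A = sum_i p_i p_i'^T  (3x3)
    U, S, Vt = np.linalg.svd(A)
    R = Vt.T @ U.T                    # R = V U^T, as on the slide
    if np.linalg.det(R) < 0:          # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R
```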

Rotation about vertical axis What if our camera rotates on a tripod? What’s the structure of H?

Do we have to project onto a plane? (Figure: the mosaic projection plane PP.)

Full Panoramas What if you want a 360° field of view? (Figure: the mosaic projection cylinder.)

Cylindrical projection Map the 3D point (X,Y,Z) onto the unit cylinder, convert to cylindrical coordinates, then convert to cylindrical image coordinates. (Figures: unit cylinder, unwrapped cylinder, cylindrical image.)
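A compact sketch of that mapping, using the usual conventions (θ measured around the vertical axis, f in pixels, (xc, yc) the cylindrical image center); these conventions are assumptions, since the slide gives the formulas only in its figure:

```python
import numpy as np

def cylindrical_image_coords(X, Y, Z, f, xc, yc):
    """Map a 3D point onto the unit cylinder and then into the cylindrical image."""
    theta = np.arctan2(X, Z)              # angle around the cylinder axis
    h = Y / np.sqrt(X**2 + Z**2)          # height on the unit cylinder
    return f * theta + xc, f * h + yc     # scale by f, shift to the image center
```

In practice the warp is done the other way around: loop over cylindrical pixels, convert (θ, h) back to a ray (sinθ, h, cosθ), and sample the source image there, which is the inverse mapping shown a couple of slides later.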

Cylindrical Projection (figure)

Inverse Cylindrical projection (Figure: the cylindrical pixel (θ, h) maps back to the 3D ray (X,Y,Z) = (sinθ, h, cosθ).)

Cylindrical panoramas Steps: reproject each image onto a cylinder; blend; output the resulting mosaic.

Cylindrical image stitching What if you don’t know the camera rotation? Solve for the camera rotations Note that a rotation of the camera is a translation of the cylinder!

Assembling the panorama Stitch pairs together, blend, then crop

Problem: Drift Vertical error accumulation: small (vertical) errors accumulate over time; apply a correction so that they sum to 0 (for a 360° panorama). Horizontal error accumulation: can reuse the first/last image to find the right panorama radius.
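One simple way to apply the "sum to 0" correction for a closed 360° sequence is to subtract an equal share of the accumulated drift from each pairwise vertical shift; this per-pair distribution is my own minimal reading of the slide:

```python
import numpy as np

def correct_vertical_drift(ty):
    """Redistribute accumulated vertical drift over a closed 360° panorama.

    ty: per-pair vertical translations around the loop; for a full panorama
    they should sum to zero, so subtract an equal share of the residual.
    """
    ty = np.asarray(ty, dtype=float)
    return ty - ty.sum() / len(ty)
```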

Full-view (360°) panoramas

Spherical projection Map the 3D point (X,Y,Z) onto the unit sphere, convert to spherical coordinates, then convert to spherical image coordinates (scaling by f). (Figures: unit sphere, unwrapped sphere, spherical image.)
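The spherical analogue of the cylindrical sketch above; the axis convention (Y up, φ the latitude) is an assumption, since the slide gives the formulas only in its figure:

```python
import numpy as np

def spherical_image_coords(X, Y, Z, f, xc, yc):
    """Map a 3D point onto the unit sphere and then into the spherical image."""
    theta = np.arctan2(X, Z)                        # longitude around the vertical axis
    phi = np.arctan2(Y, np.sqrt(X**2 + Z**2))       # latitude above the horizon
    return f * theta + xc, f * phi + yc
```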

Spherical Projection (figure)

Inverse Spherical projection (Figure: the spherical pixel (θ, φ) maps back to the 3D ray (x,y,z) = (sinθ cosφ, cosθ cosφ, sinφ).)

3D rotation Rotate the image before placing it on the unrolled sphere: apply p → R p to each ray. (Figure: the same spherical geometry as above, with the rays rotated by R.)

Full-view Panorama (Figure: several reprojected images composited into one panorama.)

Other projections are possible You can stitch on the plane and then warp the resulting panorama What’s the limitation here? Or, you can use these as stitching surfaces But there is a catch…

Cylindrical reprojection Focal length – the dirty secret… (Figure: a 384x300 image and its top-down view, reprojected with f = 180 pixels, f = 280, and f = 380.) It is desirable for the global motion model we need to recover to be a translation, which has only two parameters. It turns out that for pure panning motion, if we convert two images to their cylindrical maps with a known focal length, the relationship between them can be represented by a translation. The figure shows how cylindrical maps are constructed with different f's; note how straight lines become curved. Similarly, we can map an image to its longitude/latitude spherical coordinates if f is given.

What’s your focal length, buddy? Focal length is (highly!) camera dependent. You can get a rough estimate by measuring the FOV; use the EXIF data tag (might not give the right thing); use several images together and try to find the f that would make them match; use a known 3D object and its projection to solve for f; etc. There are other camera parameters too: optical center, non-square pixels, lens distortion, etc.
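Two of those rough estimates are easy to write down; the functions below are illustrative sketches (the names and the assumption of a horizontal FOV / horizontal sensor width are mine):

```python
import numpy as np

def focal_from_fov(fov_degrees, image_width_px):
    """Rough focal length in pixels from a measured horizontal field of view."""
    return (image_width_px / 2) / np.tan(np.radians(fov_degrees) / 2)

def focal_from_exif(focal_mm, sensor_width_mm, image_width_px):
    """Focal length in pixels from the EXIF focal length and the sensor width."""
    return focal_mm / sensor_width_mm * image_width_px

# e.g. a 60° FOV on a 1280-pixel-wide image gives f ≈ 1108 pixels
```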

Distortion Radial distortion of the image, caused by imperfect lenses; deviations are most noticeable for rays that pass through the edge of the lens. (Figure: no distortion, pincushion, barrel.)

Radial distortion Correct for the “bending” in wide field-of-view lenses: use a radial distortion model in place of the normal projection.
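One common polynomial form of a radial distortion model (the specific model on the slide isn't reproduced here), applied to normalized image coordinates, with k1 and k2 the lens-specific coefficients:

```python
def radially_distort(x, y, k1, k2):
    """Apply a two-term polynomial radial distortion to normalized coordinates."""
    r2 = x**2 + y**2                    # squared distance from the optical center
    scale = 1 + k1 * r2 + k2 * r2**2
    return x * scale, y * scale
```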

Polar Projection Extreme “bending” in ultra-wide fields of view

Camera calibration Determine camera parameters from known 3D points or calibration object(s). Internal or intrinsic parameters, such as focal length, optical center, and aspect ratio: what kind of camera? External or extrinsic (pose) parameters: where is the camera in world coordinates? (World coordinates make sense for multiple cameras / multiple images.) How can we do this?

Approach 1: solve for the projection matrix. Place a known object in the scene, identify correspondences between image and scene, and compute the mapping from scene to image.

Direct linear calibration Solve for the projection matrix P using least squares (just like in homework). Advantages: all specifics of the camera are summarized in one matrix; can predict where any world point will map to in the image. Disadvantages: doesn't tell us about particular parameters; mixes up internal and external parameters; pose specific: move the camera and everything breaks.
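A minimal sketch of the least-squares solve via the direct linear transform, using the standard two-rows-per-correspondence construction and taking the last right singular vector (function name and input layout are mine):

```python
import numpy as np

def dlt_projection_matrix(world_pts, image_pts):
    """Estimate the 3x4 projection matrix P from 3D-2D correspondences (>= 6)."""
    rows = []
    for (X, Y, Z), (u, v) in zip(world_pts, image_pts):
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u*X, -u*Y, -u*Z, -u])
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -v*X, -v*Y, -v*Z, -v])
    A = np.array(rows)
    _, _, Vt = np.linalg.svd(A)        # homogeneous least squares: the solution is
    return Vt[-1].reshape(3, 4)        # the last right singular vector, reshaped to P
```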

Approach 2: solve for parameters. A camera is described by several parameters: the translation T of the optical center from the origin of world coordinates and the rotation R of the image plane (the "extrinsics"), and the focal length f, principal point (x'c, y'c), and pixel size (sx, sy) (the "intrinsics"). Projection equation: the projection matrix models the cumulative effect of all parameters, and it is useful to decompose it into a series of operations: intrinsics, a canonical (identity) projection, rotation, and translation. Solve using non-linear optimization.
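One common way to write the decomposition the slide names (up to sign conventions for f) is:

```latex
\Pi =
\underbrace{\begin{bmatrix} -f s_x & 0 & x'_c \\ 0 & -f s_y & y'_c \\ 0 & 0 & 1 \end{bmatrix}}_{\text{intrinsics}}
\underbrace{\begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix}}_{\text{projection (identity)}}
\underbrace{\begin{bmatrix} \mathbf{R} & \mathbf{0} \\ \mathbf{0}^{\top} & 1 \end{bmatrix}}_{\text{rotation}}
\underbrace{\begin{bmatrix} \mathbf{I} & -\mathbf{T} \\ \mathbf{0}^{\top} & 1 \end{bmatrix}}_{\text{translation}}
```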

Multi-plane calibration Images courtesy Jean-Yves Bouguet, Intel Corp. Advantages: only requires a plane; don't have to know positions/orientations; good code available online! Intel's OpenCV library: http://www.intel.com/research/mrl/research/opencv/ Matlab version by Jean-Yves Bouguet: http://www.vision.caltech.edu/bouguetj/calib_doc/index.html Zhengyou Zhang's web site: http://research.microsoft.com/~zhang/Calib/
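In current OpenCV the same multi-plane (checkerboard) calibration is a few calls; this is a hedged sketch, with the file names and the 9x6 inner-corner pattern as placeholder assumptions:

```python
import cv2
import numpy as np

pattern = (9, 6)                                   # assumed inner-corner count
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

obj_points, img_points = [], []
for fname in ["calib_01.jpg", "calib_02.jpg"]:     # placeholder image names
    gray = cv2.cvtColor(cv2.imread(fname), cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_points.append(objp)                    # plane points in the plane's own frame
        img_points.append(corners)                 # detected corners in the image

# Recovers the intrinsic matrix K, distortion coefficients, and per-view pose.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
```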

Setting alpha: simple averaging Alpha = .5 in overlap region

Setting alpha: center seam Distance transform (MATLAB bwdist): alpha = logical(dtrans1 > dtrans2)
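A Python analogue of that MATLAB recipe, assuming boolean masks marking where each warped image has valid pixels (the function name and mask convention are mine):

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def center_seam_alpha(mask1, mask2):
    """Binary alpha: keep, at each pixel, the image whose interior is closer."""
    d1 = distance_transform_edt(mask1)   # distance to each image's boundary
    d2 = distance_transform_edt(mask2)
    return d1 > d2                       # alpha for image 1; image 2 gets the rest
```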

Setting alpha: blurred seam Distance transform, with the binary seam alpha blurred.

Simplification: Two-band Blending Brown & Lowe, 2003 Only use two bands: high frequency and low frequency. Blend the low frequencies smoothly; blend the high frequencies with no smoothing: use a binary alpha.
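A minimal sketch of that idea, using a Gaussian blur as the band split (the σ = 2 split and the use of OpenCV's GaussianBlur are my assumptions; the images are float arrays and the alphas are shaped to broadcast against them, e.g. (H, W, 1)):

```python
import cv2
import numpy as np

def two_band_blend(im1, im2, alpha_soft, alpha_binary, sigma=2.0):
    """Blend low frequencies with a smooth alpha, high frequencies with a binary one."""
    low1 = cv2.GaussianBlur(im1, (0, 0), sigma)      # low-frequency band of each image
    low2 = cv2.GaussianBlur(im2, (0, 0), sigma)
    high1, high2 = im1 - low1, im2 - low2            # high-frequency residuals
    low = alpha_soft * low1 + (1 - alpha_soft) * low2
    high = np.where(alpha_binary, high1, high2)      # hard seam for the high band
    return low + high
```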

2-band “Laplacian Stack” Blending Low frequency (λ > 2 pixels) High frequency (λ < 2 pixels)

Linear Blending

2-band Blending