RANSAC and mosaic wrap-up cs195g: Computational Photography James Hays, Brown, Spring 2010 © Jeffrey Martin (jeffrey-martin.com) most slides from Steve Seitz, Rick Szeliski, and Alexei A. Efros

Feature matching ?

Exhaustive search: for each feature in one image, look at all the features in the other image(s).
Hashing: compute a short descriptor from each feature vector, or hash longer descriptors (randomly).
Nearest neighbor techniques: kd-trees and their variants (a small matching sketch follows below).
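Not from the lecture materials: a minimal sketch of the nearest-neighbor strategy, assuming the descriptors are stored as NumPy arrays `desc1` (N×D) and `desc2` (M×D); the function name `match_features` is just illustrative.

```python
import numpy as np
from scipy.spatial import cKDTree

def match_features(desc1, desc2):
    """For each descriptor (row) of desc1, return the index of and the
    Euclidean distance to its nearest neighbor in desc2."""
    tree = cKDTree(desc2)               # build the kd-tree once
    dist, idx = tree.query(desc1, k=1)  # exact 1-NN query per descriptor
    return idx, dist
```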

What about outliers?

Feature-space outlier rejection: let's not match all features, but only those that have “similar enough” matches. How can we do it? SSD(patch1, patch2) < threshold. But how do we set the threshold?

Feature-space outlier rejection, a better way [Lowe, 1999]:
- 1-NN: SSD of the closest match
- 2-NN: SSD of the second-closest match
- Look at how much better 1-NN is than 2-NN, e.g. the ratio 1-NN/2-NN. That is, is our best match so much better than the rest? (A sketch of this ratio test follows below.)
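A sketch of the ratio test under the same assumptions as above (kd-tree over `desc2`); note it uses Euclidean distances rather than SSD, which only changes the effective threshold, and the ratio value 0.8 is an illustrative choice, not the lecture's.

```python
import numpy as np
from scipy.spatial import cKDTree

def ratio_test(desc1, desc2, ratio=0.8):
    """Keep only matches whose closest (1-NN) distance is much smaller
    than the second-closest (2-NN) distance."""
    tree = cKDTree(desc2)
    dist, idx = tree.query(desc1, k=2)       # two nearest neighbors per query
    keep = dist[:, 0] < ratio * dist[:, 1]   # the 1-NN vs 2-NN test
    return np.flatnonzero(keep), idx[keep, 0]
```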

Feature-space outlier rejection: can we now compute H from the blue points? No! Still too many outliers… What can we do?

Matching features What do we do about the “bad” matches?

RANdom SAmple Consensus Select one match, count inliers

Least squares fit Find “average” translation vector

RANSAC for estimating homography. RANSAC loop:
1. Select four feature pairs (at random)
2. Compute the homography H (exact)
3. Compute inliers: pairs where SSD(p_i', H p_i) < ε
4. Keep the largest set of inliers
5. Re-compute a least-squares H estimate on all of the inliers
(A compact sketch of this loop follows below.)
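A self-contained sketch of the loop above, assuming matched point arrays `pts1`, `pts2` of shape (N, 2); the DLT solver, the 3-pixel threshold, and the iteration count are illustrative choices, not the course's reference implementation.

```python
import numpy as np

def fit_homography(pts1, pts2):
    """Direct linear transform: solve A h = 0 for the 9 entries of H via SVD."""
    A = []
    for (x, y), (u, v) in zip(pts1, pts2):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.asarray(A))
    H = Vt[-1].reshape(3, 3)          # right singular vector of smallest sigma
    return H / H[2, 2]

def apply_h(H, pts):
    """Apply H to (N, 2) points using homogeneous coordinates."""
    p = np.c_[pts, np.ones(len(pts))] @ H.T
    return p[:, :2] / p[:, 2:3]

def ransac_homography(pts1, pts2, iters=1000, eps=3.0, seed=0):
    rng = np.random.default_rng(seed)
    best = np.zeros(len(pts1), dtype=bool)
    for _ in range(iters):
        sample = rng.choice(len(pts1), 4, replace=False)       # 1. four random pairs
        H = fit_homography(pts1[sample], pts2[sample])          # 2. exact H
        err = np.sum((apply_h(H, pts1) - pts2) ** 2, axis=1)    # squared reprojection error
        inliers = err < eps ** 2                                # 3. SSD(p', Hp) below threshold
        if inliers.sum() > best.sum():                          # 4. keep largest inlier set
            best = inliers
    return fit_homography(pts1[best], pts2[best]), best         # 5. refit on all inliers
```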

RANSAC The key idea is not just that there are more inliers than outliers, but that the outliers are wrong in different ways.

RANSAC

Mosaic odds and ends

Do we have to project onto a plane? [figure: mosaic projection plane (PP)]

Full panoramas: what if you want a 360° field of view? Project the mosaic onto a cylinder instead of a plane.

Cylindrical projection: map the 3D point (X, Y, Z) onto the unit cylinder, (sin θ, h, cos θ) = (X, Y, Z) / sqrt(X² + Z²), i.e. convert to cylindrical coordinates (θ, h); then convert to cylindrical image coordinates (x, y) = (f θ + x_c, f h + y_c). [figure: unit cylinder unwrapped into the cylindrical image]
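A direct transcription of the mapping above into code, assuming f is in pixels and (xc, yc) is the image center; the function name is illustrative.

```python
import numpy as np

def to_cylindrical(X, Y, Z, f, xc, yc):
    """Map 3D points onto the unit cylinder, then to cylindrical image coords."""
    theta = np.arctan2(X, Z)            # angle around the cylinder axis
    h = Y / np.sqrt(X**2 + Z**2)        # height on the unit cylinder
    return f * theta + xc, f * h + yc   # cylindrical image coordinates
```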

Cylindrical projection [figure only]

Inverse cylindrical projection: map a cylindrical image point back to a ray on the unit cylinder, (X, Y, Z) = (sin θ, h, cos θ), with θ = (x - x_c)/f and h = (y - y_c)/f.

Cylindrical panoramas, steps:
- Reproject each image onto a cylinder
- Blend
- Output the resulting mosaic

Cylindrical image stitching: what if you don't know the camera rotation? Solve for the camera rotations. Note that a rotation of the camera is a translation of the cylinder!

Assembling the panorama Stitch pairs together, blend, then crop

Problem: drift.
- Vertical error accumulation: small (vertical) errors accumulate over time; apply a correction so that the sum is 0 (for a 360° pan), as in the sketch below.
- Horizontal error accumulation: can reuse the first/last image to find the right panorama radius.
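A tiny sketch of the vertical fix, assuming `dy` holds the estimated vertical offsets between consecutive images of a full 360° pan (the names and setup are hypothetical).

```python
import numpy as np

def remove_vertical_drift(dy):
    """Spread the accumulated vertical error evenly so the offsets sum to zero."""
    dy = np.asarray(dy, dtype=float)
    return dy - dy.sum() / len(dy)   # subtract the mean residual from each offset
```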

Full-view (360°) panoramas

Other projections are possible. You can stitch on the plane and then warp the resulting panorama. What's the limitation here? Or, you can use these (cylinder, sphere) as stitching surfaces. But there is a catch…

Focal length – the dirty secret… [figure: cylindrical reprojection of a 384×300 image at f = 180, 280, and 380 pixels, top-down view]

What's your focal length, buddy? Focal length is (highly!) camera dependent.
- Can get a rough estimate by measuring the FOV: f ≈ (w/2) / tan(FOV/2), with w the image width in pixels (see the sketch below)
- Can use the EXIF data tag (might not give the right thing)
- Can use several images together and try to find the f that would make them match
- Can use a known 3D object and its projection to solve for f
- Etc.
There are other camera parameters too: optical center, non-square pixels, lens distortion, etc.
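The FOV estimate above as a one-liner (a back-of-the-envelope sketch, not a calibration; the example numbers are made up).

```python
import math

def focal_from_fov(image_width_px, fov_degrees):
    """Focal length in pixels such that the image width spans the given FOV."""
    return (image_width_px / 2) / math.tan(math.radians(fov_degrees) / 2)

print(focal_from_fov(1600, 60))   # e.g. ~1386 pixels for a 1600-px-wide, 60-degree FOV image
```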

Spherical projection: map the 3D point (X, Y, Z) onto the unit sphere, i.e. convert to spherical coordinates (θ, φ); then convert to spherical image coordinates (x, y) = (f θ + x_c, f φ + y_c). [figure: unit sphere unwrapped into the spherical image]
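The spherical analogue of the cylindrical sketch above, under the same assumptions (f in pixels, (xc, yc) the image center, Y-up axes; the slide's exact axis convention may differ).

```python
import numpy as np

def to_spherical(X, Y, Z, f, xc, yc):
    """Map 3D points onto the unit sphere, then to spherical image coords."""
    theta = np.arctan2(X, Z)                    # longitude around the vertical axis
    phi = np.arctan2(Y, np.sqrt(X**2 + Z**2))   # latitude above the horizon
    return f * theta + xc, f * phi + yc
```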

Spherical projection [figure only]

Inverse spherical projection: map a spherical image point back to a ray on the unit sphere, (x, y, z) = (sin θ cos φ, cos θ cos φ, sin φ).

3D rotation: rotate the image before placing it on the unrolled sphere, p̄ = R p, with p = (sin θ cos φ, cos θ cos φ, sin φ) as above.

Full-view Panorama

Distortion: radial distortion of the image, caused by imperfect lenses. Deviations are most noticeable for rays that pass through the edge of the lens. [figure: no distortion, pincushion, barrel]

Radial distortion: correct for the “bending” in wide field of view lenses by using a distorted projection, e.g. x' = x(1 + κ₁r² + κ₂r⁴), y' = y(1 + κ₁r² + κ₂r⁴), instead of the normal projection.
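A sketch of that two-coefficient polynomial model, applied to normalized image coordinates (the coefficient names k1, k2 mirror the κ₁, κ₂ above).

```python
import numpy as np

def radial_distort(x, y, k1, k2):
    """Apply the polynomial radial distortion model to normalized coords."""
    r2 = x**2 + y**2
    scale = 1 + k1 * r2 + k2 * r2**2   # the "bending" factor grows with radius
    return x * scale, y * scale
```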

Polar Projection Extreme “bending” in ultra-wide fields of view

Camera calibration: determine camera parameters from known 3D points or calibration object(s).
1. Internal or intrinsic parameters, such as focal length, optical center, aspect ratio: what kind of camera?
2. External or extrinsic (pose) parameters: where is the camera in world coordinates? World coordinates make sense for multiple cameras / multiple images.
How can we do this?

Approach 1: solve for the projection matrix. Place a known object in the scene, identify correspondences between image and scene, and compute the mapping from scene to image.

Direct linear calibration: solve for the projection matrix Π using least-squares (just like in homework).
Advantages:
- All specifics of the camera are summarized in one matrix
- Can predict where any world point will map to in the image
Disadvantages:
- Doesn't tell us about particular parameters
- Mixes up internal and external parameters: pose specific (move the camera and everything breaks)
(A DLT sketch follows below.)
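A sketch of the direct linear solve, assuming `Xw` is an (N, 3) array of known 3D points and `x_img` the corresponding (N, 2) image points, with N ≥ 6; not the homework's reference code.

```python
import numpy as np

def calibrate_dlt(Xw, x_img):
    """Solve for the 3x4 projection matrix (up to scale) from 3D-2D pairs."""
    A = []
    for (X, Y, Z), (u, v) in zip(Xw, x_img):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A))
    return Vt[-1].reshape(3, 4)   # least-squares null vector; 11 DOF -> at least 6 points
```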

Projection equation: the projection matrix Π models the cumulative effect of all parameters. It is useful to decompose it into a series of operations, Π = (intrinsics)(projection)(rotation)(translation), which collapses to the familiar form Π = K [R | t] (with t the translation expressed in camera coordinates).

Approach 2: solve for parameters. A camera is described by several parameters:
- Translation T of the optical center from the origin of world coordinates
- Rotation R of the image plane
- Focal length f, principal point (x'_c, y'_c), pixel size (s_x, s_y)
T and R are the “extrinsics”; f, the principal point, and the pixel size are the “intrinsics.” Solve for them using non-linear optimization.
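A small bookkeeping sketch of "approach 2": composing Π from named intrinsics and extrinsics. Sign and axis conventions vary between texts, so treat the exact K below as illustrative rather than the lecture's.

```python
import numpy as np

def projection_matrix(f, xc, yc, sx, sy, R, t):
    """Compose a 3x4 projection matrix from intrinsic and extrinsic parameters."""
    K = np.array([[f * sx, 0.0,    xc],
                  [0.0,    f * sy, yc],
                  [0.0,    0.0,    1.0]])      # intrinsics
    Rt = np.hstack([R, t.reshape(3, 1)])       # extrinsics [R | t]
    return K @ Rt
```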

Multi-plane calibration (images courtesy Jean-Yves Bouguet, Intel Corp.)
Advantages:
- Only requires a plane
- Don't have to know positions/orientations
- Good code available online: Intel's OpenCV library, the Matlab toolbox by Jean-Yves Bouguet, and Zhengyou Zhang's web site
(A minimal OpenCV sketch follows below.)
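A minimal checkerboard-calibration sketch using OpenCV, in the spirit of the toolboxes above; the 9×6 pattern size and the `calib_*.jpg` filenames are made-up placeholders.

```python
import glob
import cv2
import numpy as np

pattern = (9, 6)   # inner corners per checkerboard row/column (assumed)
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

obj_pts, img_pts = [], []
for fname in glob.glob("calib_*.jpg"):              # hypothetical image names
    gray = cv2.cvtColor(cv2.imread(fname), cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_pts.append(objp)                        # planar points (Z = 0)
        img_pts.append(corners)                     # detected image corners

# Reprojection error, intrinsic matrix K, distortion coefficients, and
# per-view extrinsics (rotation and translation vectors).
err, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_pts, img_pts, gray.shape[::-1], None, None)
```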

Final Project

Mainstream computational photography Photoshop CS5 was just released

Mainstream computational photography: Photoshop CS5 was just released, with many new features directly related to this course:
- Image morphing
- Image matting
- Image completion
- HDR: registration and tone mapping
- Painterly rendering