1
Automatic Panoramic Image Stitching using Local Features
Matthew Brown and David Lowe, University of British Columbia
2
Introduction Are you getting the whole picture? –Compact Camera FOV = 50 x 35°
3
Introduction Are you getting the whole picture? –Compact Camera FOV = 50 x 35° –Human FOV = 200 x 135°
4
Introduction Are you getting the whole picture? –Compact Camera FOV = 50 x 35° –Human FOV = 200 x 135° –Panoramic Mosaic = 360 x 180°
5
Why Use Local Features?
6
1D Rotations (θ) –Ordering → matching images
7
Why Use Local Features? 1D Rotations (θ) –Ordering → matching images
8
Why Use Local Features? 1D Rotations (θ) –Ordering → matching images
9
Why Use Local Features? 1D Rotations (θ) –Ordering → matching images 2D Rotations (θ, φ) –Ordering → matching images
10
Why Use Local Features? 1D Rotations (θ) –Ordering → matching images 2D Rotations (θ, φ) –Ordering → matching images
11
Why Use Local Features? 1D Rotations (θ) –Ordering → matching images 2D Rotations (θ, φ) –Ordering → matching images
12
Recognising Panoramas [ Brown and Lowe ICCV 2003, IJCV 2007 ]
13
Automatic Stitching: Feature Matching, Image Matching, Image Alignment, Rendering, Results, Conclusions
14
Automatic Stitching: Feature Matching (SIFT Features, Nearest Neighbour Matching), Image Matching, Image Alignment, Rendering, Results, Conclusions
15
Invariant Features: Schmid & Mohr 1997, Lowe 1999, Baumberg 2000, Tuytelaars & Van Gool 2000, Mikolajczyk & Schmid 2001, Brown & Lowe 2002, Matas et al. 2002, Schaffalitzky & Zisserman 2002
16
SIFT Features
Invariant Features
–Establish invariant frame
  Maxima/minima of scale-space DoG → x, y, s
  Maximum of distribution of local gradients → orientation
–Form descriptor vector
  Histogram of smoothed local gradients, 128 dimensions
SIFT features are…
–Geometrically invariant to similarity transforms, some robustness to affine change
–Photometrically invariant to affine changes in intensity
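A minimal sketch of this step, assuming OpenCV's SIFT implementation (the function name and image path are illustrative):

```python
import cv2

def extract_sift(image_path):
    """Detect SIFT keypoints and compute their 128-dimensional descriptors."""
    img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    sift = cv2.SIFT_create()
    # Each keypoint carries position (x, y), scale and dominant gradient
    # orientation, i.e. the invariant frame; descriptors is an N x 128 array.
    keypoints, descriptors = sift.detectAndCompute(img, None)
    return keypoints, descriptors
```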
17
Automatic Stitching: Feature Matching (SIFT Features, Nearest Neighbour Matching), Image Matching, Image Alignment, Rendering, Results, Conclusions
18
Nearest Neighbour Matching
Use a k-d tree
–k-d tree recursively bi-partitions the data at the mean in the dimension of maximum variance
–Approximate nearest neighbours found in O(n log n)
Find the k-NN for each feature
–k = number of overlapping images (we use k = 4)
[ Beis & Lowe 1997, Nene & Nayar 1997, Gray & Moore 2000, Shakhnarovich 2003 ]
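A minimal sketch of the k-NN lookup, using SciPy's cKDTree as a stand-in for the best-bin-first k-d tree above; its split rule differs from the mean / maximum-variance partition on the slide, and k = 4 follows the talk:

```python
import numpy as np
from scipy.spatial import cKDTree

def knn_feature_matches(query_desc, all_desc, k=4):
    """Find the k approximate nearest neighbours of every query descriptor
    among the descriptors from all other images (k ~ number of images that
    may overlap a given image; the talk uses k = 4)."""
    tree = cKDTree(all_desc)
    # eps > 0 allows approximate answers, trading a little accuracy for speed
    distances, indices = tree.query(query_desc, k=k, eps=0.1)
    return distances, indices
```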
19
K-d tree
21
Automatic Stitching: Feature Matching, Image Matching (Motion Model, RANSAC), Image Alignment, Rendering, Results, Conclusions
22
2D Motion Models: Linear (affine), Homography
23
Homography for Rotation
Projection equation: ũ_i ~ K_i [R_i | t_i] X
Setting t = 0 (camera rotating about its optical centre) → pairwise homographies: for an image pair, ũ_i ~ H_ij ũ_j, where H_ij = K_i R_i R_j^T K_j^{-1}
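To make the formula concrete, a small NumPy sketch; the focal length, rotation angle and zero principal point are illustrative assumptions:

```python
import numpy as np

def homography_from_rotation(K_i, R_i, K_j, R_j):
    """Pairwise homography for a camera rotating about its optical centre:
    H_ij = K_i R_i R_j^T K_j^{-1}, mapping homogeneous pixels of image j
    into image i (t = 0 in the projection equation)."""
    return K_i @ R_i @ R_j.T @ np.linalg.inv(K_j)

def warp_point(H, x, y):
    """Apply a homography to a pixel and de-homogenise."""
    p = H @ np.array([x, y, 1.0])
    return p[0] / p[2], p[1] / p[2]

# Illustrative example: both cameras share focal length f = 800, image j is
# the reference, image i is rotated 15 degrees about the vertical axis.
f = 800.0
K = np.array([[f, 0, 0], [0, f, 0], [0, 0, 1.0]])
theta = np.deg2rad(15)
R_i = np.array([[np.cos(theta), 0, np.sin(theta)],
                [0, 1, 0],
                [-np.sin(theta), 0, np.cos(theta)]])
H = homography_from_rotation(K, R_i, K, np.eye(3))
print(warp_point(H, 0.0, 0.0))
```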
24
Automatic Stitching: Feature Matching, Image Matching (Motion Model, RANSAC), Image Alignment, Rendering, Results, Conclusions
25
RANSAC for Homography
28
RANSAC: 1D Line Fitting (least squares line)
29
RANSAC: 1D Line Fitting
30
RANSAC: 1D Line Fitting (RANSAC line)
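A minimal RANSAC loop for the line-fitting illustration above; the stitching pipeline runs the same loop but samples four feature correspondences per iteration and fits a homography (the threshold and iteration count here are illustrative):

```python
import numpy as np

def ransac_line(points, n_iters=500, inlier_thresh=1.0, rng=None):
    """Fit y = m*x + c to an N x 2 array of points with RANSAC."""
    rng = np.random.default_rng() if rng is None else rng
    best_inliers = np.zeros(len(points), dtype=bool)
    for _ in range(n_iters):
        # 1. minimal sample: two points define a candidate line
        i, j = rng.choice(len(points), size=2, replace=False)
        (x1, y1), (x2, y2) = points[i], points[j]
        if abs(x2 - x1) < 1e-12:
            continue
        m = (y2 - y1) / (x2 - x1)
        c = y1 - m * x1
        # 2. count inliers: points within inlier_thresh (vertical distance)
        residuals = np.abs(points[:, 1] - (m * points[:, 0] + c))
        inliers = residuals < inlier_thresh
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # 3. refit by least squares on the inliers of the best hypothesis
    m, c = np.polyfit(points[best_inliers, 0], points[best_inliers, 1], deg=1)
    return m, c, best_inliers
```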
31
Match Verification
32
Finding the panoramas
36
Automatic Stitching: Feature Matching, Image Matching, Image Alignment (Bundle Adjustment, Automatic Straightening), Rendering, Results, Conclusions
37
Motion Model Revisited
Recall our image motion model: parameterise each camera by a rotation and focal length
38
Bundle Adjustment
Sum of squared projection errors:
e = Σ_{i=1..n} Σ_{j ∈ I(i)} Σ_{k ∈ F(i,j)} h(r_ij^k)
–n = number of images
–I(i) = set of image matches to image i
–F(i, j) = set of feature matches between images i, j
–r_ij^k = residual of the k-th feature match between images i, j
–h = Huber (robust) error function
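A minimal sketch of the robust objective above; sigma and the residual layout are illustrative assumptions, and the optimisation over rotations and focal lengths is not shown:

```python
import numpy as np

def huber(r, sigma=2.0):
    """Huber robust error: quadratic for |r| < sigma, linear beyond,
    so outlier matches do not dominate the sum."""
    r = np.abs(r)
    return np.where(r < sigma, r**2, 2 * sigma * r - sigma**2)

def total_error(residuals):
    """e = sum_i sum_{j in I(i)} sum_{k in F(i,j)} h(r_ij^k).
    residuals: dict {(i, j): array of per-match residual distances}."""
    return sum(huber(r).sum() for r in residuals.values())
```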
39
Bundle Adjustment Adjust rotation, focal length of each image to minimise error in matched features
40
Bundle Adjustment Adjust rotation, focal length of each image to minimise error in matched features
41
Automatic Stitching: Feature Matching, Image Matching, Image Alignment (Bundle Adjustment, Automatic Straightening), Rendering, Results, Conclusions
42
Automatic Straightening
43
Heuristic: the user does not twist the camera relative to the horizon → the up-vector is perpendicular to the plane of the camera x vectors
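A minimal sketch of this heuristic, assuming world-to-camera rotation matrices so that the camera x axis in world coordinates is the first row of each R_i; the up-vector is the direction most nearly perpendicular to all camera x axes:

```python
import numpy as np

def estimate_up_vector(rotations):
    """rotations: list of 3x3 world-to-camera rotations R_i.
    Returns the unit up-vector: the eigenvector of sum_i x_i x_i^T with the
    smallest eigenvalue, i.e. the direction most orthogonal to every camera
    x axis."""
    X = np.stack([R[0, :] for R in rotations])   # N x 3 camera x axes (world frame)
    scatter = X.T @ X                            # 3 x 3 scatter matrix
    eigvals, eigvecs = np.linalg.eigh(scatter)   # eigenvalues in ascending order
    u = eigvecs[:, 0]
    return u / np.linalg.norm(u)
```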
44
Automatic Straightening
45
Automatic Stitching: Feature Matching, Image Matching, Image Alignment, Rendering, Results, Conclusions
46
Gain Compensation (no gain compensation)
47
Gain Compensation
–Single gain parameter g_i for each image
–(Better solution = HDR [ Debevec 1997 ])
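A minimal sketch of solving for one gain per image, assuming a quadratic error over the mean intensities of each overlap plus a prior that keeps gains near 1 (the overlaps data layout and the sigma values are illustrative):

```python
import numpy as np

def solve_gains(n_images, overlaps, sigma_n=10.0, sigma_g=0.1):
    """Solve for one gain g_i per image by minimising, over overlapping pairs,
    N_ij * (g_i * Ibar_ij - g_j * Ibar_ji)^2 / sigma_n^2 plus a prior
    N_ij * ((1 - g_i)^2 + (1 - g_j)^2) / sigma_g^2.
    The objective is quadratic in g, so it reduces to one linear solve.
    overlaps: dict {(i, j): (N_ij, Ibar_ij, Ibar_ji)} with N_ij the number of
    overlapping pixels and Ibar_ij the mean intensity of image i in the overlap."""
    A = np.zeros((n_images, n_images))
    b = np.zeros(n_images)
    for (i, j), (N, Ii, Ij) in overlaps.items():
        # data term: gradient of N*(g_i*Ii - g_j*Ij)^2 / sigma_n^2
        A[i, i] += N * Ii * Ii / sigma_n**2
        A[j, j] += N * Ij * Ij / sigma_n**2
        A[i, j] -= N * Ii * Ij / sigma_n**2
        A[j, i] -= N * Ii * Ij / sigma_n**2
        # prior term: pulls both gains towards 1
        A[i, i] += N / sigma_g**2
        A[j, j] += N / sigma_g**2
        b[i] += N / sigma_g**2
        b[j] += N / sigma_g**2
    return np.linalg.solve(A, b)
```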
48
Multi-band Blending (no blending)
49
Multi-band Blending Linear blending –Each pixel is a weighted sum
50
Multi-band Blending Multi-band blending –Each pixel is a weighted sum (for each band)
51
Multi-band Blending: linear blending vs multi-band blending [ Burt & Adelson 1983 ]
52
2-band Blending
53
2-band Blending: low frequency (wavelengths > 2 pixels), high frequency (< 2 pixels)
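A simplified sketch of 2-band blending for pre-aligned images: a Gaussian blur at roughly the 2-pixel scale splits each image into bands, the low band is blended with smooth normalised weights and the high band is taken from the image with the maximum weight (the paper smooths winner-take-all weights rather than normalising raw ones):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def two_band_blend(images, weights, sigma=2.0):
    """images:  list of H x W float arrays already warped into the panorama frame
    weights: list of H x W non-negative weight maps (e.g. distance to image edge)
    Returns the blended H x W composite."""
    low = [gaussian_filter(im, sigma) for im in images]      # low-frequency band
    high = [im - lo for im, lo in zip(images, low)]          # high-frequency band
    W = np.stack(weights)
    W_sum = W.sum(axis=0) + 1e-8                             # avoid divide-by-zero
    smooth_w = W / W_sum                                     # normalised smooth weights
    hard_w = (W == W.max(axis=0, keepdims=True)).astype(float)  # winner-take-all
    blended_low = sum(w * lo for w, lo in zip(smooth_w, low))
    blended_high = sum(w * hi for w, hi in zip(hard_w, high))
    return blended_low + blended_high
```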
54
Seam Selection
–(simple) Choose the image with max “weight”
–(better) …also minimise error on seams [ Agarwala et al. SIGGRAPH 2004 ]
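A tiny sketch of the simple max-weight rule; the better variant on the slide would additionally optimise the seam location, e.g. with graph-cut photomontage:

```python
import numpy as np

def select_seams(weights):
    """weights: N x H x W stack of per-image weight maps over the panorama.
    Returns an H x W label map assigning each output pixel to the image
    with the maximum weight there (the 'simple' rule on the slide)."""
    return np.argmax(weights, axis=0)
```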
55
Automatic Stitching: Feature Matching, Image Matching, Image Alignment, Rendering, Results, Conclusions
56
Demo
57
Evaluation: 200+ test sequences…
58
Ground Truth
–Real: stitch “by hand”
–Synthetic: sample virtual camera views (figure labels: stitched panorama, synthetic camera views)
59
Error function
–Compare test stitch with ground truth (ground truth vs test stitch)
–Evaluation function: sum of pairwise projection errors with respect to the ground truth
60
Results
–Testing performance (image scale)
–Tuning parameters (Huber sigma)
61
Conclusions
Image Stitching using Local Features
–cf. “direct methods”: fast, robust
–2D stitching is a recognition problem
Multi-Image Matching Framework
–Local features, RANSAC, bundle adjustment, blending
Future Work
–Camera model += radial distortion, camera translation, scene motion, vignetting, HDR, flash…
–Full 3D case, e.g. Photo Tourism [ Snavely et al. SIGGRAPH 2006 ]
http://research.microsoft.com/~brown