Published by Brook Owens. Modified over 9 years ago.
1 Image-Based Rendering: An Overview
2 Photographs: we have tools that acquire and tools that display photographs at a convincing quality level
11 Photographs: we have had tools that acquire and tools that display photographs at a convincing quality level for almost 100 years now
12 Sergei Mikhailovich Prokudin-Gorskii. A Settler's Family, ca. 1907-1915.
13 Sergei Mikhailovich Prokudin-Gorskii. Tea Factory in Chakva. Chinese Foreman Lau-Dzhen-Dzhau. ca. 1907-1915.
14 Sergei Mikhailovich Prokudin-Gorskii. The Emir of Bukhara, 1911.
15 RGB in the early 1900s
16 Plenoptic function: defines all the rays
– through any point in space (x, y, z)
– with any orientation (θ, φ)
– over all wavelengths (λ)
– at any given moment in time (t)
17 IBR summary: representations of the plenoptic function, ordered from explicit to implicit geometric models – texture mapping, panoramas, view morphing, 3D image warping, ray databases
18 Lightfield – Lumigraph approach [Levoy96, Gortler96]: take all the photographs you will ever need to display; the model becomes a database of rays, and rendering becomes database querying
19 Overview
Introduction
Lightfield – Lumigraph
– definition
– construction
– compression
21 From 7D to 4D: static scene, so t is constant; λ approximated with RGB; consider only the convex hull of the objects, so the position of the origin along the ray does not matter
22 4D Lightfield / Lumigraph
23 Discrete 4D Lightfield
24 Lightfield: set of images with COPs on regular grid
25 Or: a Lightfield is the set of images of a point seen at various angles
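The discrete two-plane parameterization above can be sketched in code. This is a minimal illustration, not the papers' implementation: the array sizes are arbitrary, and a real system would interpolate quadrilinearly rather than snap to the nearest stored ray.

```python
import numpy as np

# Hypothetical discrete lightfield: 8x8 camera plane (s, t),
# 16x16 image plane (u, v), RGB samples.
S, T, U, V = 8, 8, 16, 16
lightfield = np.zeros((S, T, U, V, 3), dtype=np.float32)

def query_ray(s, t, u, v):
    """Nearest-neighbor lookup: snap the ray's two-plane coordinates
    (s, t) and (u, v) to the closest stored sample.
    Rendering a new view = one such database query per pixel."""
    si = int(round(np.clip(s, 0, S - 1)))
    ti = int(round(np.clip(t, 0, T - 1)))
    ui = int(round(np.clip(u, 0, U - 1)))
    vi = int(round(np.clip(v, 0, V - 1)))
    return lightfield[si, ti, ui, vi]
```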
26 Depth correction of rays
27 Overview
Introduction
Lightfield – Lumigraph
– definition
– construction
– compression
29 Construction from dense set of photographs
30 Construction from a sparse set of photographs: acquisition stage, camera positions, blue screening, space carving
31 Filling in gaps using the pull-push algorithm
Pull phase: low-res levels are created; gaps shrink
Push phase: gaps at high-res levels are filled using the low-res levels
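The two phases above can be sketched as a 1-D toy version. This is a hedged illustration: the function name, the averaging weights, and the nearest-sample upsampling are my own simplifications of the Gortler et al. scheme.

```python
import numpy as np

def pull_push(image, valid):
    """Fill gaps (valid == False) in a 1-D signal with pull-push:
    pull builds coarser levels by averaging only valid samples,
    push fills remaining gaps at each finer level from the level below."""
    levels = [(image.astype(float), valid.astype(float))]
    # Pull: halve resolution, averaging valid samples so gaps shrink.
    while levels[-1][0].size > 1:
        img, w = levels[-1]
        if img.size % 2:  # pad to even length by repeating the last sample
            img = np.append(img, img[-1]); w = np.append(w, w[-1])
        w2 = w[0::2] + w[1::2]
        img2 = np.where(w2 > 0,
                        (img[0::2] * w[0::2] + img[1::2] * w[1::2])
                        / np.maximum(w2, 1e-9), 0.0)
        levels.append((img2, np.minimum(w2, 1.0)))
    # Push: fill gaps at each finer level from the coarser level.
    for k in range(len(levels) - 2, -1, -1):
        img, w = levels[k]
        up = np.repeat(levels[k + 1][0], 2)[: img.size]
        levels[k] = (np.where(w > 0, img, up), np.ones_like(w))
    return levels[0][0]
```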
32 Overview
Introduction
Lightfield – Lumigraph
– definition
– construction
– compression
34 Compression
Large size uncompressed: 1.125 GB
– 32x32 (s, t) x 256x256 (u, v) x 6 faces x 3 B
Compression:
– JPEG + MPEG (200:1, down to 6 MB)
– or vector quantization + entropy encoding
35 Vector Quantization (VQ)
Principle
– codebook made of codewords
– replace each actual word with the closest codeword
Implementation
– training on a representative set of words to derive the best codebook
– compression: replace each word with the index of the closest codeword
– decompression: retrieve the indexed codeword from the codebook
36 Lightfield compression using VQ
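The train/compress/decompress steps can be sketched with a toy k-means codebook trainer. All names and parameters here are illustrative assumptions, not the Levoy–Gortler implementation (which also entropy-encodes the indices).

```python
import numpy as np

def train_codebook(words, k, iters=10, seed=0):
    """Toy k-means training on a representative set of words (vectors)."""
    rng = np.random.default_rng(seed)
    codebook = words[rng.choice(len(words), size=k, replace=False)].astype(float)
    for _ in range(iters):
        # Assign each word to its nearest codeword.
        d = ((words[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
        nearest = d.argmin(1)
        # Move each codeword to the mean of its assigned words.
        for c in range(k):
            if (nearest == c).any():
                codebook[c] = words[nearest == c].mean(0)
    return codebook

def compress(words, codebook):
    """Replace each word by the index of its closest codeword."""
    d = ((words[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
    return d.argmin(1)

def decompress(indices, codebook):
    """Retrieve the indexed codewords from the codebook."""
    return codebook[indices]
```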
37 View morphing
38 Motivation – rendering from images
Given: a left image and a right image
Create intermediate images – simulates camera movement [Seitz96]
39 Previous work
Panoramas ([Chen95], etc.): the user can look in any direction at a few given locations
Image morphing ([Wolberg90], [Beier92], etc.): linearly interpolated intermediate positions of features; input: two images and correspondences; output: metamorphosis of one image into the other as a sequence of intermediate images
40 Previous work limitations
Panoramas ([Chen95], etc.): no camera translations allowed
Image morphing ([Wolberg90], [Beier92], etc.): not shape-preserving – an image morph also morphs the object itself; to simulate rendering with morphing, the object should stay rigid while the camera moves
41 Overview
Introduction
Image morphing
View morphing
– image pre-warping
– image morphing
– image post-warping
43 Image morphing: 1. Correspondences
47 Image morphing: 1. Correspondences; 2. Linear interpolation – P0 (frame 0) moves through Pk (frame k) to Pn (frame n)
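The linear interpolation step can be written directly. A trivial sketch (function and argument names are my own): every feature correspondence moves along the straight line from its position in frame 0 to its position in frame n.

```python
import numpy as np

def morph_point(p0, pn, k, n):
    """Position of a feature point at frame k of n: linear interpolation
    P0 -> Pn. Plain image morphing moves every correspondence this way,
    which is why it is not shape-preserving for a moving camera."""
    t = k / n
    return (1 - t) * np.asarray(p0, float) + t * np.asarray(pn, float)
```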
48 Image morphing is not shape-preserving
49 Early IBR research: Soft Watch at the Moment of First Explosion – Salvador Dalí, 1954
50 Overview
Introduction
Image morphing
View morphing
– image pre-warping
– image morphing
– image post-warping
52 View morphing: a shape-preserving morph. Three-step algorithm:
1. Prewarp the first and last images to parallel views
2. Image morph between the prewarped images
3. Postwarp to the interpolated view
53 Step 1: prewarp to parallel views
Parallel views: same image plane; image plane parallel to the segment connecting the two centers of projection
Prewarp: compute the parallel views I0p and Inp; rotate I0 and In to the parallel views; prewarp the correspondences (P0, Pn) -> (P0p, Pnp)
54 Step 2: morph parallel images
Shape preserving; use the prewarped correspondences; interpolate Ck from C0 and Cn
55 Step 3: postwarping
Postwarp the morphed image: create the intermediate view (Ck is known; interpolate the view direction and tilt), then rotate the morphed image to the intermediate view
56 View morphing is shape-preserving
57 Overview
Introduction
Image morphing
View morphing, more details
– image pre-warping
– image morphing
– image post-warping
58 Step 1: prewarp to parallel views
Parallel views:
– use C0Cn for x (the ap vector)
– use (a0 x b0) x (an x bn) as y (-bp)
– pick ap and bp to resemble a0 and b0 as much as possible
– use the same pixel size
– use a wider field of view
59 Step 1: prewarp to parallel views
Prewarping using texture mapping:
– create a polygon for the image plane
– consider it texture-mapped with the image itself
– render the "scene" from the prewarped view
– if you take this path you will have to implement clipping with the COP plane
– you have texture mapping already
Alternative – prewarping using reprojection of rays: look up every ray of the prewarped view in the original view
60 Step 1: prewarp to parallel views
Prewarping the correspondences: for every correspondence pair (P0, Pn), project P0 on I0p to compute P0p, and project Pn on Inp to compute Pnp; the prewarped correspondence is (P0p, Pnp)
61 Step 2: morph parallel images
Image morphing:
– use the prewarped correspondences to compute a correspondence for all pixels in I0p
– linearly interpolate I0p to the intermediate positions
– useful observation: corresponding pixels lie on the same line in the prewarped views
– preventing holes: use a larger footprint (e.g. 2x2), or linearly interpolate between consecutive samples, or postprocess the morphed image, replacing background pixels with neighboring values
– visibility artifacts: for collisions of samples, z-buffer on disparity; for holes, morph Inp to Ikp or use additional views
62 Step 3: postwarping
Create the intermediate view: Ck is known; the current view direction is a linear interpolation of the start and end view directions; the current up vector is a linear interpolation of the start and end up vectors
Rotate the morphed image to the intermediate view – same as prewarping
63 Image-Based Rendering by Warping
64 Overview
Introduction
Depth extraction methods
Reconstruction for IBRW
Visibility without depth
Sample selection
65 Overview
Introduction
– comparison to other IBR methods
– 3D warping equation
– reconstruction
66 IBR by Warping (IBRW): images enhanced with per-pixel depth [McMillan95]
67 IBR by Warping (IBRW): 1/w1 is also called the generalized disparity; another notation is δ(u1, v1)
68 IBR by Warping (IBRW)
69 3D warping equations
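The equations themselves did not survive this transcript. As a hedged reconstruction (McMillan's formulation, using this deck's notation for the generalized disparity): with P1, P2 the 3x3 camera matrices, C1, C2 the centers of projection, and δ the generalized disparity of a pixel x̄1 in the reference image, the warp to image 2 is, up to homogeneous scale,

```latex
% 3D warping equation (McMillan's form; \doteq = equality up to scale)
\bar{x}_2 \;\doteq\; P_2^{-1} P_1 \,\bar{x}_1
          \;+\; \delta(\bar{x}_1)\, P_2^{-1}\left(\dot{C}_1 - \dot{C}_2\right)
```

Dividing through by the third (homogeneous) coordinate yields the pixel location (u2, v2), which is why per-pixel depth (via δ) is all that must be added to the reference image.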
70 A complete IBR method
Façade system: (coarse) geometric model needed
Panoramas: viewer confined to the center of the panorama
View morphing: correspondences needed
IBRW: rendering to arbitrary new views
71 DeltaSphere – depth & color acquisition device, Lars Nyland et al.
76 Reconstructing by splatting
Estimate the shape and size of the footprint of each warped sample – expensive to do accurately; lower image quality if crudely approximated
Samples are z-buffered
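A crude sketch of the idea under stated assumptions: square, fixed-size footprints (a real system estimates the footprint per sample) and a per-pixel z-buffer so that nearer samples win. All names are illustrative.

```python
import numpy as np

def splat(points, colors, w, h, footprint=2):
    """Forward-map warped samples into a w x h image with square splats.
    points: (N, 3) array of (x, y, depth); nearer samples (smaller depth)
    win via the z-buffer. A 2x2 footprint reduces holes at the cost of blur."""
    image = np.zeros((h, w, 3))
    zbuf = np.full((h, w), np.inf)
    for (x, y, z), c in zip(points, colors):
        x0, y0 = int(x), int(y)
        for dy in range(footprint):
            for dx in range(footprint):
                px, py = x0 + dx, y0 + dy
                if 0 <= px < w and 0 <= py < h and z < zbuf[py, px]:
                    zbuf[py, px] = z
                    image[py, px] = c
    return image
```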
77 Overview
Introduction
Depth extraction methods
Reconstruction for IBRW
Visibility without depth
Sample selection
78 Finding depth
79 Overview
Depth from stereo
Depth from structured light
Depth from focus / defocus
Laser rangefinders
81 Depth from stereo
Two cameras with known parameters; infer the 3D location of a point seen in both images
Subproblem – correspondences: for a point seen in the left image, find its projection in the right image
82 Depth from stereo: déjà-vu math
The unknowns are w1 and w2; the system is overconstrained; the (u2, v2) coordinates of a point seen at (u1, v1) are constrained to an epipolar line
83 Epipolar line
C1, C2, and P1 define a plane; P2 will be on that plane; P2 is also on image plane 2, so P2 will be on the line defined by the intersection of the two planes
84 Search for correspondences on the epipolar line
Reduces the dimensionality of the search space: walk along the epipolar segment rather than searching the entire image
85 Parallel views: the preferred stereo configuration – epipolar lines are horizontal, easy to search
86 Parallel views: limit the search to the epipolar segment from u2 = u1 (P infinitely far away) to 0 (P close)
87 Depth precision analysis
1/z is linear in the disparity (u1 – u2), so depth resolution is better for nearby objects; it is important to determine correspondences with subpixel accuracy
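For the parallel configuration the relation is z = f·b/d, with focal length f (in pixels), baseline b, and disparity d = u1 − u2. A small sketch (symbol names are mine) that also illustrates the precision claim: the same one-pixel disparity error costs far more depth accuracy for distant points.

```python
def depth_from_disparity(f, b, d):
    """Parallel-view stereo: depth z from disparity d = u1 - u2,
    focal length f (pixels) and baseline b (world units).
    1/z = d / (f * b), i.e. 1/z is linear in the disparity."""
    return f * b / d
```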
88 Overview
Depth from stereo
Depth from structured light
Depth from focus / defocus
Laser rangefinders
90 Depth from stereo problem: correspondences are difficult to find
Structured light approach: replace one camera with a projector; project easily detectable patterns; establishing correspondences becomes a lot easier
91 Depth from structured light
C1 is a projector; it projects a pattern centered at (u1, v1); the pattern center hits the scene at P; camera C2 sees the pattern at (u2, v2), which is easy to find; the 3D location of P is determined
94 Depth from structured light challenges
Associated with using projectors: expensive, cannot be used outdoors, not portable
Difficult to identify the pattern: I found a corner, but which corner is it?
Invasive – changes the color of the scene; one could use invisible (IR) light
95 Overview
Depth from stereo
Depth from structured light
Depth from focus / defocus
Laser rangefinders
97 Depth of field
Thin lenses: rays through the lens center (C) do not change direction; rays parallel to the optical axis go through the focal point (F')
98 Depth of field
For a given focal length, only objects at a certain depth are in focus
99 Out of focus
When the object is at a different depth, one point projects to several locations in the image: an out-of-focus, blurred image
100 Focusing
Move the lens to focus at the new depth; the relationship between focus and depth can be exploited to extract depth
101 Determine z for points in focus
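The slide's derivation is only figure residue in this transcript. The underlying relation is the thin-lens equation; as a hedged sketch using the slide's labels (assuming a is the lens-to-object distance, i.e. the depth z, h_i the lens-to-image-plane distance, and f the focal length):

```latex
% Thin-lens equation: a point at depth z = a is in focus exactly when
\frac{1}{a} + \frac{1}{h_i} = \frac{1}{f}
\quad\Longrightarrow\quad
z = a = \frac{f\, h_i}{h_i - f}
```

So once the lens position h_i that brings a point into focus is known, its depth follows directly.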
102 Depth from defocus
Take images of the scene with various camera parameters; measuring the defocus variation, infer the range to objects; does not need to find the best focusing plane for each object
Examples by Shree Nayar, Columbia U
103 Overview
Depth from stereo
Depth from structured light
Depth from focus / defocus
Laser rangefinders
105 Laser rangefinders
Send a laser beam to measure the distance – like RADAR, measures time of flight
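The time-of-flight relation is simple to state in code (a sketch; the factor of 2 accounts for the pulse traveling to the surface and back):

```python
# Speed of light in vacuum, m/s.
C = 299_792_458.0

def range_from_tof(t_seconds):
    """Distance to the surface from the round-trip time of a laser pulse:
    d = c * t / 2."""
    return C * t_seconds / 2.0
```

At these speeds a 20 ns round trip already corresponds to about 3 m, which is why rangefinders need very precise timing.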
106 DeltaSphere – depth & color acquisition device, Lars Nyland et al.; courtesy 3rd Tech Inc.
107 300° x 300° panorama – this is the reflected light
108 300° x 300° panorama – this is the range data
109 courtesy 3rd Tech Inc. – spherical range panoramas, planar re-projection
110 courtesy 3rd Tech Inc. – Jeep, one scan
112 courtesy 3rd Tech Inc. – complete Jeep model