Graphics II 91.547: Image-Based Rendering, Session 11
A Rendering Taxonomy
The Plenoptic Function

"... the pencil of rays visible from any point in space, at any time, and over any range of wavelengths."

Given a set of discrete samples (complete or incomplete) from the plenoptic function, the goal of image-based rendering is to generate a continuous representation of that function.
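The quoted definition is commonly written as a seven-dimensional function (this notation follows Adelson and Bergen's formulation; the slide itself gives only the prose definition):

```latex
P = P(\theta, \phi, \lambda, t, V_x, V_y, V_z)
```

where $(V_x, V_y, V_z)$ is the viewing position, $(\theta, \phi)$ the viewing direction, $\lambda$ the wavelength, and $t$ the time. Image-based rendering methods differ mainly in which of these dimensions they fix or discretize.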
Movie Map (Lippman 1980)
[Diagram: viewer position, find nearest sample, movie storage, output image]
Taxonomy of "Virtual Camera" Movement (Chen et al. 1995)
- Camera rotation
  - Camera fixed at a particular location
  - Three rotational degrees of freedom: pitch (up and down), yaw (about the vertical axis), roll (about the camera axis)
- Object rotation
  - Camera always pointing at the center of the object
  - Viewpoint constrained to move over the surface of a sphere
  - Three angular degrees of freedom
- Camera movement
  - Viewpoint unconstrained
  - Viewing direction unconstrained
Environment Maps
Map geometries: cube, sphere, cylinder
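A cube environment map is indexed by direction alone: the face is chosen by the dominant axis of the direction vector, and the remaining two components give the in-face coordinates. A minimal sketch (the function name and the face/axis sign conventions, which roughly follow OpenGL's, are my own assumptions):

```python
def cubemap_lookup(d):
    """Map a 3-D direction to a cube-map face name and (u, v) in [0, 1]^2."""
    x, y, z = d
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:      # X-axis dominates
        face, ma, sc, tc = ("+x", ax, -z, -y) if x > 0 else ("-x", ax, z, -y)
    elif ay >= az:                 # Y-axis dominates
        face, ma, sc, tc = ("+y", ay, x, z) if y > 0 else ("-y", ay, x, -z)
    else:                          # Z-axis dominates
        face, ma, sc, tc = ("+z", az, x, -y) if z > 0 else ("-z", az, -x, -y)
    # Project onto the unit face and rescale from [-1, 1] to [0, 1].
    u = 0.5 * (sc / ma + 1.0)
    v = 0.5 * (tc / ma + 1.0)
    return face, u, v
```

Sphere and cylinder maps replace the face selection with a direct angular parameterization, trading the cube's uniform per-face sampling for a single continuous image.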
QuickTime VR (Chen 1995)
Panorama: 2500 x 768 = 1.9 M pixels; at 3 B/pixel = 5.8 MB raw; with 10:1 compression, roughly 0.6 MB per panorama
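The storage budget above is plain arithmetic, worth verifying (the scraped slide showed the same figures with inflated units, G pixels and GB):

```python
pixels = 2500 * 768          # panorama resolution: 1,920,000 ~ 1.9 M pixels
raw_bytes = pixels * 3       # 24-bit RGB: 5,760,000 B ~ 5.8 MB
compressed = raw_bytes / 10  # 10:1 compression: 576,000 B ~ 0.6 MB
```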
Image Distortion from a Cylindrical Environment Map
[Diagram: projection plane, cylindrical environment map, and pre-warped projection onto the plane]
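The distortion arises because a planar perspective view samples the cylinder non-uniformly: columns map to angles, but rows must be rescaled by the distance to the projection plane. A sketch of the per-pixel mapping a display-time warp has to evaluate (function name, argument layout, and the centered pixel convention are my own assumptions, not the slide's):

```python
import math

def perspective_to_cylinder(px, py, width, height, fov_x, pan=0.0):
    """For a pixel (px, py) of a planar perspective view, return the
    cylindrical-map coordinates (theta, h) to sample: theta is the
    angle around the cylinder, h the height scaled by inverse distance."""
    # Distance from the cylinder axis to the projection plane, chosen
    # so the plane spans the requested horizontal field of view.
    f = (width / 2) / math.tan(fov_x / 2)
    x = px - width / 2            # horizontal offset on the plane
    y = py - height / 2           # vertical offset on the plane
    theta = pan + math.atan2(x, f)   # column angle on the cylinder
    h = y / math.hypot(x, f)         # row height, foreshortened
    return theta, h
```

Because `h` divides by the varying distance `hypot(x, f)`, straight vertical edges in the scene stay straight in the perspective view even though they curve in the cylindrical map.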
QuickTime VR: Image Warping for a Correct Perspective View
QuickTime VR Panoramic Display Process
[Pipeline: compressed tiles on CD-ROM or hard disk; compressed-tile cache in main memory; visible tiles decompressed; visible region warped into an offscreen buffer; display window]
QuickTime VR: Accomplishing (Limited) Camera Motion
Accomplishing Camera Motion: Greene & Kass (1993 Apple technical report)
- Regular 3-D lattice of cubic environment maps
- Each environment map is a z-buffered rendering from a discrete viewpoint
- The image from a new viewpoint is generated by re-sampling the environment maps
- Re-sampling involves rendering the pixels of the environment maps as 3-D polygons from the new viewpoint
- Rendering time is proportional to the environment-map resolution but independent of scene complexity
- Not suitable for real-time walkthrough performance on typical desktop computers (especially in 1993!)
Alternative Approach: Work Entirely in Image Space
- A sequence of images from closely spaced viewpoints is highly coherent
- Depends on the ability to establish a pixel-by-pixel correspondence between adjacent images
  - Can be computed if range data and camera parameters are known (true for rendered images)
  - For natural images, there are several techniques, including manual user intervention
- The pairwise correspondence between two images can be stored as a pair of morph maps
  - Bi-directional maps are required because of possible many-to-one and one-to-many pixel correspondences
- Can be represented by a graph data structure in which nodes are images and arcs are bi-directional morph maps
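The graph-of-images structure in the last bullet can be sketched directly; the class and field names here are my own illustration, not from the slides:

```python
from dataclasses import dataclass, field

@dataclass
class MorphMap:
    """Per-pixel offsets warping pixels of a source image toward a
    destination image (one direction only)."""
    offsets: list  # offsets[y][x] = (dx, dy) for each source pixel

@dataclass
class ViewGraph:
    """Nodes are images; each arc stores a *pair* of morph maps, one per
    direction, since correspondences may be many-to-one or one-to-many."""
    images: dict = field(default_factory=dict)  # node id -> image data
    arcs: dict = field(default_factory=dict)    # (src id, dst id) -> MorphMap

    def connect(self, a, b, map_ab, map_ba):
        # Store both directions so either image can serve as the source.
        self.arcs[(a, b)] = map_ab
        self.arcs[(b, a)] = map_ba
```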
N-Dimensional Graph Data Structure
[Diagram: nodes are images; arcs are bi-directional morph maps]
Simple View Interpolation
[Diagram: reference image 1 and reference image 2, with corresponding pixels linked by morph maps]
Image Overlap (Image Folding)
[Diagram: points P1 and P2 of the reference view map to the same pixel of the interpolated view]
Image Holes (Image Stretching)
[Diagram: adjacent points P1 and P2 of the reference view spread apart in the interpolated view, leaving a hole]
Example of a Hole Region
[Figure: viewpoint 1 and viewpoint 2]
Example of a Hole Region: Minimized by Closely Spaced Viewpoints
[Figure: viewpoint 1 and viewpoint 2]
Source Image Viewed from a Camera Moved to the Right
[Figure: reference view 1 and reference view 2]
Offset Vectors for Camera Motion
[Diagram: a morph map visualized as per-pixel offset vectors]
Locus of Morph-Map Vectors for Motion Parallel to the Image Plane and the Floor
Distortion of Intermediate Images with a Linear Warp
[Figure: the linear path of one feature]
Morphing Parallel Views
[Figure: reference image and interpolated image]
View Interpolation: The Algorithm
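The numbered steps on this slide did not survive extraction. As described in Chen and Williams's view-interpolation work, the algorithm forward-maps each reference pixel along its morph-map offset scaled by an interpolation parameter t, resolves overlaps (folds), and leaves holes to be filled from the other reference image or by neighbor interpolation. A minimal single-image sketch under those assumptions (the real algorithm resolves folds by Z-buffer ordering; this sketch just takes the last writer):

```python
def interpolate_view(src, offsets, t, bg=None):
    """Forward-warp src by t * per-pixel offset, for 0 <= t <= 1.
    Pixels that receive no source pixel remain holes (value bg)."""
    h, w = len(src), len(src[0])
    out = [[bg] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            dx, dy = offsets[y][x]
            # Move the pixel a fraction t of the way along its offset.
            nx, ny = round(x + t * dx), round(y + t * dy)
            if 0 <= nx < w and 0 <= ny < h:
                out[ny][nx] = src[y][x]  # last writer wins on folds
    return out
```

With t = 0 this reproduces the first reference image and with t = 1 it approximates the second; intermediate t gives the in-between views of the following example slides.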
Example 1 of Calculated Intermediate Images
[Figure: reference image 1, intermediate views, reference image 2]
Example 2 of Calculated Intermediate Images
[Figure: reference image 1, reference image 2, interpolated image]
Multiple-Center-of-Projection Images (Rademacher & Bishop 1998)
- Information from a set of viewpoints is stored in a single image
- Features
  - Greater connectivity information compared with collections of standard images
  - Greater flexibility in the acquisition of image-based datasets, e.g. sampling different portions of the scene at different resolutions
Multiple-Center-of-Projection Images: Definition
A multiple-center-of-projection image consists of a two-dimensional image and a parameterized set of cameras meeting the following conditions:
- The cameras must lie on either a continuous curve or a continuous surface
- Each pixel is acquired by a single camera
- Viewing rays vary continuously across neighboring pixels
- Two neighboring pixels must correspond either to the same camera or to neighboring cameras
- Each pixel contains range information
MCOP Image
Strip Camera Used for Capture of Real MCOP Images
Camera Path in Capturing an MCOP Image of a Castle
Image Plane for Camera Motion
Resulting 1000 x 500 MCOP Image
Reprojection
Camera model, stored per column i:
- C_i: center of projection
- O_i: vector from C_i to the image-plane origin
- U_i: horizontal axis of the viewing plane
- V_i: vertical axis of the viewing plane
- Disparity d: distance from C_i to the pixel's image-plane point, divided by the distance from C_i to the pixel's world-space point

Reprojection formula (reconstructed from these definitions): the vector from C_i to a pixel's image-plane point, scaled by 1/d, reaches the pixel's world-space point P. For the pixel at image coordinates (u, v):

P = C_i + (O_i + u U_i + v V_i) / d
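The reprojection step is a few vector operations per pixel; a sketch (the function name and the convention that u, v are offsets along the stored axes are my own assumptions):

```python
def reproject_pixel(C, O, U, V, u, v, disparity):
    """World-space point for an MCOP pixel at (u, v) in the column whose
    camera is (C, O, U, V). By the disparity definition, dividing the
    center-to-image-plane vector by the disparity scales it out to the
    pixel's world-space point."""
    # Vector from the center of projection to the pixel's image-plane point.
    r = tuple(O[k] + u * U[k] + v * V[k] for k in range(3))
    # Scale by 1/disparity to reach the world-space point.
    return tuple(C[k] + r[k] / disparity for k in range(3))
```

For example, a pixel whose image-plane point lies 1 unit from C with disparity 0.5 reprojects to a world point 2 units from C along the same ray.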
View of the Castle Reconstructed from the MCOP Image
Another View of the Castle Reconstructed from the MCOP Image
Lumigraphs
- Lumigraph = a representation of the light resulting from a scene
- A limited data representation of the plenoptic function
- Generated from multiple images and camera "poses"
- Rendering: Image = Lumigraph + Camera Model
- Special case of the 4D light field (Levoy & Hanrahan)
What Is a Lumigraph?
For all points on the surrounding surface, and for all directions: the color intensity of the ray.
Assumption: the viewer is outside a convex hull containing the objects.
Parameterization of the Lumigraph
Images from Steven Gortler, SIGGRAPH 1999
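The standard lumigraph/light-field parameterization identifies a ray by its intersections with two parallel planes: (s, t) on one plane and (u, v) on the other, giving the 4-tuple that indexes the stored radiance. A sketch of that mapping (the function name and the choice of axis-aligned planes at fixed z are my own illustrative assumptions):

```python
def ray_to_st_uv(origin, direction, z_st=0.0, z_uv=1.0):
    """Two-plane parameterization: intersect a ray with the (s, t)
    plane at z = z_st and the (u, v) plane at z = z_uv; the 4-tuple
    (s, t, u, v) indexes the lumigraph. Assumes direction[2] != 0."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    a = (z_st - oz) / dz   # ray parameter at the (s, t) plane
    b = (z_uv - oz) / dz   # ray parameter at the (u, v) plane
    return (ox + a * dx, oy + a * dy, ox + b * dx, oy + b * dy)
```

Rendering a new view then amounts to evaluating this mapping for every screen pixel's ray and looking up (or interpolating) the stored sample nearest to (s, t, u, v).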
Building the Lumigraph
Approximating the Lumigraph with Discrete Samples
Views of a Light Field (Lumigraph)
Levoy & Hanrahan, "Light Field Rendering," Computer Graphics (SIGGRAPH 1996)