Image-Based Rendering


1 Image-Based Rendering

2 3D Scene = Shape + Shading
Source: Leonard McMillan, UNC-CH

3 Modeling is Hard!
Good 3D geometric models take a lot of work. Getting correct material properties is even harder: R, G, B, Ka, Kd, Ks, and the specular exponent for the Phong model; full BRDFs; subsurface reflectance; diffraction; etc.
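As a reminder of how many parameters even this simple local model involves, here is the standard Phong shading equation (a textbook formulation, not copied from the slides):

```latex
I = k_a I_a + \sum_{\ell \in \text{lights}} I_\ell \left[ k_d\,(\mathbf{N}\cdot\mathbf{L}_\ell) + k_s\,(\mathbf{R}_\ell\cdot\mathbf{V})^{n} \right]
```

Here k_a, k_d, k_s are the ambient, diffuse, and specular coefficients (per color channel), n is the specular exponent, N the surface normal, L the light direction, R its mirror reflection, and V the view direction.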

4 How to Create 3D Content
AutoCAD: used for architecture (buildings). 3D Studio Max, Blender, etc. Maya is a major production tool used in movie studios. Problems? It takes an artist, and it is still hard to make the result look real!

5 QuickTime VR

6 3D Photography
Can building 3D models be as easy as taking 2D photos? How do we digitize the massive collections held in various museums? QuickTime VR object movies. 3D scans: the Cyberware scanner, the Digital Michelangelo project.

7 Image-Based Rendering
Can we build 3D content directly from photographs? How does this differ from computer vision? Can we make the objects look more real? How does this differ from texture mapping?

8 Top-Level Survey
3D Graphics:
- Geometry- or surface-based rendering & modeling
- Sample-based graphics: image-based rendering & modeling; volume rendering

9 Traditional Computer Graphics
Input: geometry, material properties (color, reflectance, etc.), and lighting. The 3D graphics pipeline then performs transformation (& lighting) followed by rasterization.
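A minimal sketch of the transform stage under a simple pinhole model (the function name and matrices are illustrative, not a specific API):

```python
import numpy as np

def project_vertex(v_world, M_view, M_proj, width, height):
    """Transform stage: world -> camera -> clip -> pixel coordinates."""
    v = np.append(v_world, 1.0)            # homogeneous coordinates
    v_clip = M_proj @ (M_view @ v)         # lighting would be computed here too
    v_ndc = v_clip[:3] / v_clip[3]         # perspective divide
    # Viewport mapping: NDC in [-1, 1] -> pixel coordinates
    x = (v_ndc[0] + 1) * 0.5 * width
    y = (1 - v_ndc[1]) * 0.5 * height
    return x, y, v_ndc[2]                  # depth kept for the Z-buffer

# Rasterization would then fill the triangles formed by the projected
# vertices, interpolating color and depth across each one.
```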

10 Role of Images
Used as textures, or as input to computer vision methods in order to recover 3D models that then feed the usual 3D graphics pipeline.

11 Image-Based Rendering
To bypass the 3D models altogether: image-based rendering goes from input images to new views directly, replacing both the vision step and the 3D graphics pipeline.

12 Image-Based Rendering
Input: regular images or "depth images." No 3D model is constructed. Example: 3D warping.

13 3D Warping: Another Example
The reading room of the UNC CS department. The source images contain a depth at each pixel, obtained from a laser range finder.

14 Why IBR?

              Geometry      IBR
Modeling      Difficult     Easy
Complexity    #triangles    #pixels
Fidelity      Synthetic     Acquired

Problems of triangle-based graphics: it always starts from scratch, and it ends up with millions of sub-pixel triangles.

15 Why Is It Possible?
5D plenoptic function: Color = f(x, y, z, θ, φ), where (x, y, z) defines the viewpoint and (θ, φ) defines the view direction.
4D light field / Lumigraph: Color = f(u, v, s, t), where (u, v) defines the viewpoint and (s, t) defines the pixel coordinates. Picture source: Leonard McMillan.
The 5D function is hard to visualize but easy to understand by looking at its parameters. Purpose: sampling and reconstruction. Redundancy? In free space the color is the same along a ray, so the 4D parameterization is more compact. Another benefit of the 4D parameterization: each image is simply a collection of samples.
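A minimal sketch of a two-plane light field lookup, assuming the samples are stored in a dense array (the array layout and names are illustrative):

```python
import numpy as np

def lightfield_lookup(L, u, v, s, t):
    """Color of the ray through (u, v) on the camera plane and (s, t)
    on the image plane, by nearest-neighbor lookup.
    L has shape (U, V, S, T, 3); (u, v, s, t) are normalized to [0, 1]."""
    U, V, S, T, _ = L.shape
    iu = int(np.clip(round(u * (U - 1)), 0, U - 1))
    iv = int(np.clip(round(v * (V - 1)), 0, V - 1))
    js = int(np.clip(round(s * (S - 1)), 0, S - 1))
    jt = int(np.clip(round(t * (T - 1)), 0, T - 1))
    return L[iu, iv, js, jt]
```

A new view is rendered by intersecting each viewing ray with the two planes and looking up (in practice, quadrilinearly interpolating) the stored samples.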

16 3D Image Warping
Each pixel in the source image has coordinates (u1, v1), a depth value d1, and a color. The warping equation is applied to each pixel:

u2 = (a·u1 + b·v1 + c + d·d1) / (i·u1 + j·v1 + k + l·d1)
v2 = (e·u1 + f·v1 + g + h·d1) / (i·u1 + j·v1 + k + l·d1)

where the coefficients a through l are fixed for a given pair of views. Rendering time = O(#pixels): because the equation is evaluated once per pixel, the rendering time is proportional to the number of pixels in the source images.
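A minimal sketch of the warp as a forward mapping over the source image, assuming the twelve coefficients have already been derived from the two camera poses (all names are illustrative):

```python
import numpy as np

def warp_image(src_color, src_depth, coeffs, out_w, out_h):
    """Forward-map every source pixel into the destination view.
    coeffs = (a, b, c, d, e, f, g, h, i, j, k, l), fixed per view pair."""
    a, b, c, d, e, f, g, h, i, j, k, l = coeffs
    dst = np.zeros((out_h, out_w, 3), dtype=src_color.dtype)
    rows, cols = src_depth.shape
    for v1 in range(rows):
        for u1 in range(cols):
            d1 = src_depth[v1, u1]
            w = i * u1 + j * v1 + k + l * d1      # common denominator
            if w <= 0:                            # behind the camera
                continue
            u2 = (a * u1 + b * v1 + c + d * d1) / w
            v2 = (e * u1 + f * v1 + g + h * d1) / w
            x, y = int(round(u2)), int(round(v2))
            if 0 <= x < out_w and 0 <= y < out_h:
                dst[y, x] = src_color[v1, u1]     # one pixel, no splatting
    return dst
```

Visibility still has to be resolved, either with a Z-buffer or with the occlusion-compatible traversal order discussed below.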

17 (Diagram: a source pixel at (u1, v1) is mapped to destination coordinates (u2, v2).)

18

19 Occlusion Compatible Order
(McMillan's back-to-front traversal: by enumerating source pixels in an order determined by the epipole, nearer surfaces are warped after farther ones and simply overwrite them, so visibility is resolved without a Z-buffer.)

20 Artifacts of 3D Image Warping
Surfaces that were occluded in the source images (disocclusion). Non-uniform sampling (an example on the next slide).
There are two typical artifacts associated with image warping. One is the disocclusion artifact, such as the areas of blue background color next to the teapots. The other is caused by non-uniform sampling of the surfaces; in this example, the cracks on the teapots and on the floor. We can work around the non-uniform sampling problem by reconstructing the surfaces, for example by applying over-sized splats. However, we cannot remove the disocclusion artifacts without knowing what is behind the teapots.

21

22 Reconstruction
Solving it by meshing? Must know the connectivity between the points.
Solving it by splatting? The splat size is determined by the viewing distance, and the splat shape by the surface normal (see the sketch below).
References: LDI [SG98], Surfel [SG2000], Surface Splatting [SG2001].
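A minimal sketch of round splats with a Z-test, as a stand-in for the normal-oriented splats mentioned above (the size heuristic and all names are illustrative):

```python
def splat(dst, dst_z, x, y, color, depth, base_radius=2.0):
    """Draw one over-sized splat to fill gaps between warped samples.
    Simple heuristic: nearer samples (smaller depth) get bigger splats.
    dst_z must be initialized to +infinity before the first splat."""
    r = max(1, int(round(base_radius / max(depth, 1e-6))))
    h, w = dst_z.shape
    for yy in range(max(0, y - r), min(h, y + r + 1)):
        for xx in range(max(0, x - r), min(w, x + r + 1)):
            inside = (xx - x) ** 2 + (yy - y) ** 2 <= r * r
            if inside and depth < dst_z[yy, xx]:  # Z-test: closer wins
                dst[yy, xx] = color
                dst_z[yy, xx] = depth
```

In the warper sketched earlier, each warped pixel would call splat() instead of writing a single destination pixel.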

23 Meshing Artifacts
Picture source: Leonard McMillan

24 Using Multiple Source Images

25 Layered Depth Image (LDI)
Shade et al., SIGGRAPH 98. Merges multiple source images by allowing multiple pixels at the same pixel location. A regular depth image stores only one layer per pixel; the Layered Depth Image is similar, except that many layers of pixels may be stored at the same pixel location, which is what makes it suitable for merging multiple depth images.
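A minimal sketch of the data structure, keeping each pixel's layers sorted front-to-back (class and field names are illustrative, not from the paper):

```python
from dataclasses import dataclass, field

@dataclass
class LayeredPixel:
    """All surface samples along one viewing ray, nearest first."""
    layers: list = field(default_factory=list)   # [(depth, (r, g, b)), ...]

    def insert(self, depth, color):
        self.layers.append((depth, color))
        self.layers.sort(key=lambda sample: sample[0])

class LDI:
    """A 2D grid of layered pixels; merging another depth image just
    inserts each of its warped samples into the matching cell."""
    def __init__(self, width, height):
        self.grid = [[LayeredPixel() for _ in range(width)]
                     for _ in range(height)]
```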

26 IBR Survey
Image-Based Rendering & Modeling:
- Light Field Rendering (or Lumigraph): Stanford / Microsoft
- 3D Warping (or Plenoptic Modeling): UNC-Chapel Hill

27 Light Field & Lumigraph

28 Images as 4D Samples
Consider each image pixel a sample of the 4D light field.

29 Does It Matter Where We Place the Planes?
Yes! The Lumigraph performs depth correction: approximate scene geometry is used to adjust which samples are blended, so rays are matched at the actual surface rather than at the focal plane.

30 Concentric Mosaics
Hold a camera on a stick, then sweep a circle. The viewpoint is constrained to a 2D plane, reducing the 4D light field to a 3D subspace.

31 Surface Light Field May be considered a compression scheme for light field data. 3D geometry required.

32 LYTRO Light Field Camera

33 Further Reading on Light Field
See Marc Levoy's IEEE Computer 2006 article.

34 Camera Array

35 Image-Based Lighting: Light Stage
Take photos under a single point light at various positions. Trivial questions: how do we produce new images under two point lights? An area light? An environment light (captured by a light probe)? (See the sketch below.)
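The answer hinges on the linearity of light transport: the image under any lighting is a weighted sum of the single-light photos. A minimal sketch (array names are illustrative):

```python
import numpy as np

def relight(basis_images, weights):
    """Relight a scene captured in a light stage.
    basis_images: array of shape (N, H, W, 3), one photo per point light.
    weights: N per-light weights, e.g. intensities read off a light
    probe for an environment light, or two 1s for two point lights."""
    weights = np.asarray(weights, dtype=np.float64)
    return np.tensordot(weights, basis_images, axes=1)  # sum_i w_i * I_i

# Two point lights: a weight vector with two nonzero entries.
# Area/environment light: many small weights sampled over directions.
```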

36

