Image-Based Rendering

3D Scene = Shape + Shading Source: Leonard McMillan, UNC-CH

Modeling is Hard! Good 3D geometric models need a lot of work. Getting correct material properties is even harder: R, G, B, Ka, Kd, Ks, and specularity for the Phong model; full BRDFs; subsurface reflectance, diffraction, etc.

How to Create 3D Content AutoCAD: used for architecture (buildings). 3D Studio Max, Blender, etc. Maya is a major production tool used in movie studios. Problems? It takes an artist, and it is still hard to make the result look real!

QuickTime VR

3D Photography Can building 3D models be as easy as taking 2D photos? How do we digitize the massive assets in various museums? QuickTime VR object movies. 3D scans: Cyberware scanner, Digital Michelangelo. Source: www.cyberware.com

Image-Based Rendering Can we build 3D content from photographs directly? Difference from computer vision? Can we make the objects look more real? Difference from texture mapping?

Top-Level Survey 3D graphics splits into geometry- or surface-based rendering & modeling and sample-based graphics; sample-based graphics in turn includes image-based rendering & modeling and volume rendering.

Traditional Computer Graphics Input: geometry, material properties (color, reflectance, etc.), and lighting. The 3D graphics pipeline applies transformation (& lighting), then rasterization.

Role of Images Used as textures, or as input to computer vision methods in order to recover the 3D models.

Image-Based Rendering Bypasses the 3D models altogether.

Image-Based Rendering Input: regular images or “depth images.” No 3D model is constructed. Example: 3D warping.

3D Warping: Another Example Reading room of the UNC CS department. The source images contain a depth at each pixel, obtained from a laser range finder.

Why IBR? Comparing geometry-based graphics with IBR: modeling is difficult for geometry but easy for IBR; complexity scales with #triangles for geometry but with #pixels for IBR; fidelity is synthetic for geometry but acquired for IBR. Problems of triangle-based graphics: it always starts from scratch, and scenes may need millions of sub-pixel triangles.

Why is It Possible? 5D Plenoptic Function: Color = f(x, y, z, θ, φ), where (x, y, z) defines the viewpoint and (θ, φ) defines the view direction. 4D Light Field/Lumigraph: Color = f(u, v, s, t), where (u, v) defines the viewpoint and (s, t) defines the pixel coordinate. Picture source: Leonard McMillan. The 5D function is hard to visualize, but easier to understand by looking at its parameters. Purpose: sampling and reconstruction. Redundancy? The same color is observed along a ray, so the 4D form is more compact. Another benefit of the 4D parameterization: each image is a collection of samples.

3D Image Warping Each pixel in the source image has coordinates (u1, v1), depth d1, and a color. The warping equation is applied to each pixel: u2 = (a·u1 + b·v1 + c + d·d1) / (i·u1 + j·v1 + k + l·d1), v2 = (e·u1 + f·v1 + g + h·d1) / (i·u1 + j·v1 + k + l·d1), where the coefficients a through l are fixed for a given pair of views. Rendering time = O(#pixels). To render the output image by 3D image warping, we apply the warping equation, shown here on the slide, to each pixel of the source image. Because the equation is evaluated once per pixel, the rendering time is proportional to the number of pixels in the source images.
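The warping equation above can be sketched directly as code. This is a minimal illustration, assuming the 3×4 coefficient matrix M (rows [a b c d], [e f g h], [i j k l]) has already been derived from the source and destination camera poses; the matrix name and function name are illustrative, not from any specific paper.

```python
import numpy as np

def warp_pixels(u1, v1, d1, M):
    """Apply the 3D warping equation to (arrays of) source pixels.

    M is the fixed 3x4 coefficient matrix [[a,b,c,d],[e,f,g,h],[i,j,k,l]]
    for one source/destination view pair. u1, v1 are source pixel
    coordinates; d1 is the per-pixel depth term.
    """
    num_u = M[0, 0] * u1 + M[0, 1] * v1 + M[0, 2] + M[0, 3] * d1
    num_v = M[1, 0] * u1 + M[1, 1] * v1 + M[1, 2] + M[1, 3] * d1
    den   = M[2, 0] * u1 + M[2, 1] * v1 + M[2, 2] + M[2, 3] * d1
    # One rational expression per pixel: O(#pixels) rendering time.
    return num_u / den, num_v / den
```

Because the same twelve coefficients serve every pixel, the whole source image can be warped with a few vectorized NumPy operations.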

(Figure: a source pixel at (u1, v1) maps to a destination pixel at (u2, v2).)

Occlusion Compatible Order

Artifacts of 3D Image Warping Surfaces that were occluded in the source images. Non-uniform sampling (an example is on the next slide). There are two typical artifacts associated with image warping. One is the disocclusion artifact, such as the areas of blue background color next to the teapots. The other is caused by non-uniform sampling of the surfaces; in this example, the cracks on the teapots and on the floor. We can work around the non-uniform sampling problem by reconstructing the surfaces, for example by applying over-sized splats. However, we cannot remove the disocclusion artifacts without knowing what is behind the teapots.

Reconstruction Solving it by meshing? Must know the connectivity between points. Solving it by splatting? Splat size is determined by viewing distance; splat shape is determined by the surface normal. References: LDI [SIGGRAPH 98], Surfels [SIGGRAPH 2000], Surface Splatting [SIGGRAPH 2001].
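The two splatting rules above (size from viewing distance, shape from the surface normal) can be sketched as follows. This is an illustrative simplification, not the EWA splatting math from the cited papers; the parameter names are assumptions.

```python
import numpy as np

def splat_radius(world_radius, depth, focal_length):
    """Screen-space splat radius: shrinks with viewing distance
    because of the perspective division by depth."""
    return focal_length * world_radius / depth

def splat_axis_ratio(normal, view_dir):
    """Splat shape: a surface seen at a grazing angle projects to a
    narrower ellipse. Returns the minor/major axis ratio, which is
    |cos| of the angle between the surface normal and view direction."""
    n = np.asarray(normal, float)
    v = np.asarray(view_dir, float)
    n /= np.linalg.norm(n)
    v /= np.linalg.norm(v)
    return abs(float(n @ v))
```

A head-on surface gives an axis ratio of 1 (a circular splat); a grazing surface gives a ratio near 0 (a thin ellipse).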

Meshing Artifacts Picture source: Leonard McMillan

Using Multiple Source Images

Layered Depth Image (LDI) Shade et al., SIGGRAPH 98. Merge multiple source images. Allow multiple pixels at the same pixel location. A regular depth image has only one layer at each pixel; the Layered Depth Image is similar, except that many layers of pixels may be stored at the same pixel location, so it can be used to merge multiple depth images.
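The LDI idea (many depth layers per pixel location) can be sketched as a small data structure. The class and method names below are illustrative, not the paper's API.

```python
from dataclasses import dataclass, field

@dataclass
class LDISample:
    depth: float
    color: tuple  # (r, g, b)

@dataclass
class LayeredDepthImage:
    """Each (x, y) location stores a list of depth/color layers,
    instead of the single sample of a regular depth image."""
    width: int
    height: int
    layers: dict = field(default_factory=dict)  # (x, y) -> [LDISample, ...]

    def insert(self, x, y, depth, color):
        """Merge one warped source sample, keeping layers sorted
        front-to-back so the nearest surface is first."""
        cell = self.layers.setdefault((x, y), [])
        cell.append(LDISample(depth, color))
        cell.sort(key=lambda s: s.depth)

    def front(self, x, y):
        """Nearest (visible) layer at this location, or None."""
        cell = self.layers.get((x, y))
        return cell[0] if cell else None
```

Warping several source depth images into the same LDI simply calls `insert` repeatedly; occluded layers survive as back layers and become visible when the viewpoint moves.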

IBR Survey Two main families of image-based rendering & modeling: Light Field Rendering (or Lumigraph) from Stanford/Microsoft, and 3D Warping (or Plenoptic Modeling) from UNC-Chapel Hill.

Light Field & Lumigraph

Images as 4D Samples Consider each image pixel a sample of the 4D light field.
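Treating pixels as 4D samples makes rendering a lookup: the color along a ray is f(u, v, s, t). A minimal sketch, assuming the light field is stored as one image per (u, v) camera position on a regular grid; a real renderer would interpolate quadrilinearly over all four axes, but this version just rounds to the nearest sample.

```python
import numpy as np

def lightfield_sample(images, u, v, s, t):
    """Nearest-neighbor lookup into a two-plane light field.

    images: array of shape (U, V, S, T, 3) -- one image per (u, v)
    camera position, with pixels indexed by (s, t).
    """
    ui, vi = int(round(u)), int(round(v))
    si, ti = int(round(s)), int(round(t))
    return images[ui, vi, si, ti]
```

Novel views are then synthesized purely by resampling this 4D table, with no geometry at all.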

Does it Matter Where We Place the Planes? Yes! Depth correction in Lumigraphs:

Concentric Mosaic Hold a camera on a stick, then sweep it in a circle. The viewpoint is constrained to a 2D plane, reducing the 4D light field to a 3D subspace.

Surface Light Field May be considered a compression scheme for light field data. 3D geometry required.

LYTRO Light Field Camera See https://www.lytro.com/science_inside

Further Reading on Light Field See Marc Levoy’s IEEE Computer 2006 article at: http://graphics.stanford.edu/papers/lfphoto/levoy-lfphoto-ieee06.pdf

Camera Array Source: http://sabia.tic.udc.es/gc/Contenidos%20adicionales/trabajos/Peliculas/FX/ej3.html

Image-Based Lighting -- Light Stage Take photos under a single point light at various positions. Question: how do we produce new images under two point lights? An area light? An environment light (captured by a light probe)?
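The answer to all three questions rests on one fact: reflected light is linear in illumination, so any new lighting condition is a weighted sum of the single-light photos. A minimal sketch; the function name and array layout are assumptions for illustration.

```python
import numpy as np

def relight(basis_images, weights):
    """Relight a scene from light-stage data.

    basis_images: array (N, H, W, 3), one photo per point-light position.
    weights: length-N light intensities for the new condition --
    two point lights mean two nonzero weights; an area or environment
    light means weights sampled from the light probe.
    """
    w = np.asarray(weights, dtype=float)
    # Linear combination over the light axis (valid because image
    # formation is linear in the illumination).
    return np.tensordot(w, basis_images, axes=1)
```

For an environment light, the weights come from integrating the light-probe image over the solid angle covered by each light-stage position.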