CS 395: Adv. Computer Graphics Image-Based Modeling and Rendering Jack Tumblin

GOAL: First-Class Primitive
Want images as 'first-class' primitives
– Useful as BOTH input and output
– Convert to/from traditional scene descriptions
Want to mix real & synthetic scenes freely
Want to extend photography
– Easily capture scene: shape, movement, surface/BRDF, lighting, ...
– Modify & render the captured scene data
"You can't always get what you want" (Mick Jagger, 1968)

Back To Basics: Scene & Image
Light + 3D Scene: illumination, shape, movement, surface BRDF, ...
2D Image: collection of rays through a point
Image plane I(x,y): Position (x, y), Angle (θ, φ)

Trad. Computer Graphics
Light + 3D Scene: illumination, shape, movement, surface BRDF, ...
→ 2D Image: collection of rays through a point; image plane I(x,y), Position (x, y), Angle (θ, φ)
Result: reduced, incomplete information

Trad. Computer Vision
2D Image: collection of rays through a point; image plane I(x,y), Position (x, y), Angle (θ, φ)
→ Light + 3D Scene: illumination, shape, movement, surface BRDF, ...
TOUGH! 'Ill-posed': needs many simplifications and external knowledge.

Plenoptic Function (Adelson, Bergen '91)
For a given scene, describe:
– ALL rays through
– ALL pixels, of
– ALL cameras, at
– ALL wavelengths, at
– ALL times
F(x, y, z, θ, φ, λ, t): the "Eyeballs Everywhere" function (7-D!)
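To make the seven parameters concrete, here is a minimal Python sketch of a plenoptic function; the toy emissive sphere, its spectrum, and its time variation are invented for illustration and are not part of the lecture.

```python
import numpy as np

def plenoptic(x, y, z, theta, phi, wavelength, t):
    """Radiance seen from position (x, y, z), looking along direction
    (theta, phi), at a given wavelength [nm] and time [s]."""
    # Viewing direction as a unit vector (spherical -> Cartesian).
    d = np.array([np.sin(theta) * np.cos(phi),
                  np.sin(theta) * np.sin(phi),
                  np.cos(theta)])
    # Hypothetical scene: a unit-radius emissive sphere at the origin.
    o = np.array([x, y, z], dtype=float)
    b = np.dot(d, o)
    disc = b * b - (np.dot(o, o) - 1.0)      # ray-sphere intersection test
    if disc < 0.0:
        return 0.0                           # ray misses the sphere
    s = -b - np.sqrt(disc)                   # distance to the nearest hit
    if s < 0.0:
        return 0.0                           # sphere lies behind the viewpoint
    # Invented emission: peaks near 550 nm and pulses slowly over time.
    return np.exp(-((wavelength - 550.0) / 80.0) ** 2) * (0.5 + 0.5 * np.cos(t))

# One sample of F: stand at (0, 0, 5), look back toward the origin.
print(plenoptic(0.0, 0.0, 5.0, np.pi, 0.0, 550.0, 0.0))   # -> 1.0
```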

A Big Plenoptic Question
An image traps only a partial scene description ...
– Computer Vision problem: 3D -> 2D
– Image point ↔ scene surface point (usually)
– Occlusion hides some scene surfaces
– (BRDF * irradiance) tough to split apart!
Does the plenoptic fcn. contain the full scene?
– It is an exhaustive record of all image rays
– Even the SIMPLEST scene is huge and redundant
– It holds the 'consequences' of all possible renderings

'Scene' causes Light Field
Light field: holds all outgoing light rays
Shape, position, movement, BRDF, texture, scattering; emitted light; reflected & scattered light ...
Scene modulates outgoing light; the light field captures it all.
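As a concrete data structure, a captured light field is often stored in a two-plane (u, v, s, t) parameterization. The sketch below, with an assumed 9x9 grid of 64x64 views and a simple nearest-neighbour lookup, shows how one outgoing ray indexes into such a table; it is illustrative, not a specific published format.

```python
import numpy as np

# Hypothetical captured data: a 9x9 grid of 64x64 RGB views.
lightfield = np.zeros((9, 9, 64, 64, 3), dtype=np.float32)

def sample_ray(u, v, s, t):
    """Radiance of the ray crossing the camera plane at (u, v) and the
    image plane at (s, t); all four coordinates lie in [0, 1]."""
    nu, nv, ns, nt, _ = lightfield.shape
    iu = int(round(u * (nu - 1)))   # nearest stored camera position
    iv = int(round(v * (nv - 1)))
    js = int(round(s * (ns - 1)))   # nearest stored pixel
    jt = int(round(t * (nt - 1)))
    return lightfield[iu, iv, js, jt]

print(sample_ray(0.5, 0.5, 0.25, 0.75))   # one ray's RGB radiance (all zeros here)
```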

A Big Plenoptic Question (cont.)
An image traps only a partial scene description:
– Many-to-one map: 3D -> 2D
– Occlusion hides some scene features
– (BRDF * irradiance) tough to split!
– Limited resolution
Does the plenoptic fcn. contain the full scene? NO!
Two options for IBMR methods:
– Find a limited subset of scene info, or
– Use MORE than plenoptic-function data (vary lights, etc.)

8-to-10-Dimensional Ideal?
Light field (4D) + light sources (4D) + time + ...
Shape, position, movement, BRDF, texture, scattering; emitted light; reflected & scattered light ...

It gets worse ...
A 'circular problem': shape, BRDF, irradiance, and surface normal all depend on one another.
PLUS: depth-of-focus, sampling, indirect illumination ...
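For reference (this equation is not on the slide), the standard local reflectance equation shows why the problem is circular: a pixel records only the product of BRDF, incident light, and the normal-dependent cosine inside the integral, so recovering any one factor requires knowing the others.

```latex
L_o(x, \omega_o) \;=\; \int_{\Omega} f_r(x, \omega_i, \omega_o)\, L_i(x, \omega_i)\, \big(\mathbf{n}(x)\cdot\omega_i\big)\, d\omega_i
```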

Practical IBMR
What useful partial solutions are possible?
– Texture Maps++
– Image(s) + Depth (3D shell)
– Estimating depth & silhouettes
– 'Light Probe' measures real-world light
– Light control measures BRDF
– Hybrids: BTF, stitching, ...

Texture Maps ++
Re-use rendering results: 'impostors', 'billboards', '3D sprites'
– Render a portion of the scene as a texture
– Apply it to a mesh, or to a plane ⊥ to the view direction from the C.O.P.
– Replace it if the eyepoint changes too much
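A minimal sketch of the "replace if the eyepoint changes too much" test: keep the cached impostor while the new eyepoint stays within an angular tolerance of the eyepoint used to render it. The 5-degree tolerance and the function name are assumptions for illustration.

```python
import numpy as np

def impostor_valid(capture_eye, current_eye, impostor_center, max_angle_deg=5.0):
    """True if an impostor rendered from capture_eye is still acceptable
    when viewed from current_eye (both relative to the impostor's centre)."""
    v0 = np.asarray(capture_eye, float) - np.asarray(impostor_center, float)
    v1 = np.asarray(current_eye, float) - np.asarray(impostor_center, float)
    cos_a = np.dot(v0, v1) / (np.linalg.norm(v0) * np.linalg.norm(v1))
    angle = np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))
    return angle <= max_angle_deg

# Re-render the billboard only when the view has swung too far.
print(impostor_valid([0, 0, 10], [0.5, 0, 10], [0, 0, 0]))   # ~2.9 degrees -> True
```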

Images + Depth
1 Image + Depth: a 'thin shell'
– Reprojection (well known); Z-buffers can help
– McMillan '95: 4-way raster ensures depth order
– Problems: 'holes', occlusion, matching
Multiple Images:
– LDI, LDI trees for multiresolution
Limitations:
– Presumes a diffuse-only environment
– Depth capture is tough: laser TOF reflectometer, manual scanner, structured light, or ...
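A minimal sketch of the reprojection step for a single image + depth map: unproject each pixel using its depth, transform into a new camera, and project again. The intrinsics K and relative pose (R, t) are made-up example values; hole filling and McMillan-style occlusion-compatible traversal are omitted.

```python
import numpy as np

def reproject(depth, K, R, t):
    """Map every pixel (u, v) of the source view to (u', v') in the new view."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    pix = np.stack([u, v, np.ones_like(u)], axis=-1).reshape(-1, 3).T  # 3 x N
    rays = np.linalg.inv(K) @ pix                  # rays at unit depth
    pts = rays * depth.reshape(1, -1)              # 3-D points in the source camera
    pts_new = R @ pts + t.reshape(3, 1)            # into the new camera frame
    proj = K @ pts_new
    uv_new = proj[:2] / proj[2]                    # perspective divide
    return uv_new.T.reshape(h, w, 2)

# Toy example: constant depth, identity rotation, small sideways translation.
K = np.array([[500.0, 0.0, 64.0], [0.0, 500.0, 64.0], [0.0, 0.0, 1.0]])
warped = reproject(np.full((128, 128), 2.0), K, np.eye(3), np.array([0.1, 0.0, 0.0]))
print(warped[64, 64])   # the centre pixel lands at (89, 64) in the new view
```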

Shape Problems: Correspondence
Can you find ray intersections? Or ray depth?
Ray colors might not match for non-diffuse materials (BRDF).

Estimating Depth & Silhouettes
Mildly new IBMR methods can help:
– Sparse, manual image correspondences (Debevec, Seitz)
– Video sequences with camera motion tracking
– Image (silhouette)-based visual hulls, 'voxel carving' (VIDEO!)
Mostly a classic computer vision problem:
– Epipolar geometry: reduces the search for correspondences
– Global & local tracking & alignment methods ...
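A minimal sketch of how epipolar geometry shrinks the correspondence search: given a fundamental matrix F (assumed already estimated), a point in image 1 maps to a line in image 2, so matching only needs to scan along that line. The example F below is the idealized matrix of a rectified stereo pair (pure horizontal camera shift).

```python
import numpy as np

def epipolar_line(F, x1):
    """Line (a, b, c) with a*u + b*v + c = 0 in image 2, corresponding to
    the pixel x1 = (u, v) in image 1."""
    l = F @ np.array([x1[0], x1[1], 1.0])
    return l / np.linalg.norm(l[:2])        # normalize so distances are in pixels

def point_line_distance(l, x2):
    """Pixel distance from a candidate match x2 to the epipolar line l."""
    return abs(l[0] * x2[0] + l[1] * x2[1] + l[2])

F = np.array([[0.0, 0.0,  0.0],
              [0.0, 0.0, -1.0],
              [0.0, 1.0,  0.0]])            # rectified pair: epipolar lines are scanlines
l = epipolar_line(F, (120.0, 80.0))         # the line v = 80 in image 2
print(point_line_distance(l, (130.0, 80.0)))  # 0.0: candidate lies on the line
```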

Light Probe: Irradiance Estimate
– Place a mirrored ball in the scene
– Photograph it (careful: high-contrast image!)
– Map position on the sphere to incoming angle, intensity to irradiance
– Repeat where illumination changes greatly (in shadows, etc.)
Uses:
– Mixing real & synthetic objects (Ward 96)
– Separating reflectance & illumination (Yu 97)
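A minimal sketch of the "position on sphere -> incoming angle" mapping: for an approximately orthographic photo of a mirrored ball, a pixel's offset from the ball's centre gives the sphere normal, and a mirror reflection of the viewing ray gives the direction the recorded light arrived from. HDR capture and the intensity-to-irradiance calibration are assumed to be handled separately.

```python
import numpy as np

def probe_pixel_to_light_dir(px, py):
    """px, py in [-1, 1]: pixel offset from the ball centre, divided by the
    ball radius. Returns the unit direction the reflected light came from
    (camera looks along -z toward the ball)."""
    r2 = px * px + py * py
    if r2 > 1.0:
        raise ValueError("pixel lies outside the mirrored ball")
    n = np.array([px, py, np.sqrt(1.0 - r2)])   # front-facing sphere normal
    d = np.array([0.0, 0.0, -1.0])              # viewing ray direction
    return d - 2.0 * np.dot(d, n) * n           # mirror reflection of the view ray

print(probe_pixel_to_light_dir(0.0, 0.0))   # ball centre reflects the camera: [0, 0, 1]
```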

Light Control Methods
Form estimates of surface properties (BRDF vs. position) by moving the camera, the light source, or both:
– Carefully control the incoming light direction (light stages, whirling banks of lights, etc.)
– Establish surface geometry (before or during capture)
– Sort pixels by incoming/outgoing surface angle
– Scattered-data interpolation to get the BRDF
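A minimal sketch of the final step above, scattered-data interpolation of sparse BRDF measurements: each sample pairs incoming/outgoing directions with a measured reflectance, and a query is answered by inverse-distance weighting over the angular gaps. The sample values and the weighting scheme are illustrative assumptions.

```python
import numpy as np

def interpolate_brdf(samples, wi, wo, eps=1e-6):
    """samples: list of (wi, wo, value) with unit direction vectors.
    Returns an inverse-distance-weighted estimate at the query pair (wi, wo)."""
    wi, wo = np.asarray(wi, float), np.asarray(wo, float)
    num, den = 0.0, 0.0
    for si, so, val in samples:
        # Distance between direction pairs = angular gap in wi plus gap in wo.
        d = np.arccos(np.clip(np.dot(si, wi), -1.0, 1.0)) \
          + np.arccos(np.clip(np.dot(so, wo), -1.0, 1.0))
        w = 1.0 / (d + eps)
        num += w * val
        den += w
    return num / den

up = np.array([0.0, 0.0, 1.0])
grazing = np.array([1.0, 0.0, 0.0])
samples = [(up, up, 0.8), (grazing, up, 0.2)]                  # two toy measurements
print(interpolate_brdf(samples, [0.0, 0.7071, 0.7071], up))    # ~0.6, a blend of both
```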

Conclusion
– Very active area
– Heavy overlap with computer vision: careful not to re-invent & re-name!
– Compute-intensive, but easily parallel; applies graphics hardware to broader problems.