CS395/495, Spring 2004. IBMR: Image-Based Modeling and Rendering. Introduction. Jack Tumblin.

Admin: How this course works
Refer to the class website (soon): 3spr_IBMR/IBMRsyllabus2004.htm
Tasks:
–Reading, lectures, class participation, projects
Evaluation:
–Progressive programming project
–Take-home midterm
–In-class project demo; no final exam

GOAL: First-Class Primitives
Want images as ‘first-class’ primitives:
–Useful as BOTH input and output
–Convert to/from traditional scene descriptions
Want to mix real & synthetic scenes freely.
Want to extend photography:
–Easily capture scene: shape, movement, surface/BRDF, lighting, …
–Modify & render the captured scene data
--BUT-- images hold only PARTIAL scene information.
“You can’t always get what you want” (Mick Jagger, 1968)

Back To Basics: Scene & Image
Light + 3D scene: illumination, shape, movement, surface BRDF, …
Image plane I(x,y): position (x,y), angle (θ, φ)
2D image: a collection of rays through a point

Trad. Computer Graphics
Light + 3D scene (illumination, shape, movement, surface BRDF, …) → 2D image: a collection of rays through a point. Image plane I(x,y), position (x,y), angle (θ, φ).
Reduced, incomplete information.

Trad. Computer Vision
From a 2D image (a collection of rays through a point: image plane I(x,y), position (x,y), angle (θ, φ)) back to Light + 3D scene: illumination, shape, movement, surface BRDF, …
!TOUGH! ‘ILL-POSED’: many simplifications and much external knowledge required.

IBMR Goal: Bidirectional Rendering
Both forward and ‘inverse’ rendering!
(Diagram) Traditional computer graphics: ‘3D scene’ description (camera pose, camera view geometry, scene illumination, object shape & position, surface reflectance, transparency, …) → ‘optical’ description → 2D display image(s). IBMR: 2D display image(s) → back to the scene description (?! new research ?)

OLDEST IBR: Shadow Maps (1984)
Fast shadows from Z-buffer hardware:
1) Make the “Shadow Map”:
–Render the image seen from the light source, BUT
–Keep ONLY the Z-buffer values (depth)
2) Render the scene from the eyepoint:
–Pixel + Z depth gives the 3D position of the surface
–Project that 3D position into the shadow-map image
–If the shadow-map depth < the 3D depth, SHADOW!
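As a concrete illustration of the step-2 comparison (a minimal Python/NumPy sketch of my own, not code from the course; the light_view_proj matrix, the bias value, and the array layout are assumptions):

```python
import numpy as np

def in_shadow(world_pos, shadow_map, light_view_proj, bias=1e-3):
    """Classic shadow-map test: project a world-space point into the light's
    view, look up the depth stored in pass 1, and compare."""
    # Project the 3D point into the light's clip space (homogeneous coords).
    p = light_view_proj @ np.append(world_pos, 1.0)
    p = p[:3] / p[3]                              # perspective divide -> NDC in [-1, 1]

    # Convert NDC x,y to shadow-map texel coordinates.
    h, w = shadow_map.shape
    u = int((p[0] * 0.5 + 0.5) * (w - 1))
    v = int((p[1] * 0.5 + 0.5) * (h - 1))
    if not (0 <= u < w and 0 <= v < h):
        return False                              # outside the light's frustum

    stored_depth = shadow_map[v, u]               # nearest surface seen by the light
    point_depth = p[2] * 0.5 + 0.5                # NDC z -> [0, 1]
    return stored_depth + bias < point_depth      # something closer blocks the light
```

The bias term guards against self-shadowing (“shadow acne”) caused by depth quantization.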

Early IBR: QuickTime VR (Chen & Williams ’93)
1) Four planar images → one cylindrical panorama: re-sampling required!
–Planar pixels: equal distances on the x,y image plane (angle varies as tan⁻¹)
–Cylinder pixels: horizontally, equal angles θ around the cylinder; vertically, equal distances in y (tan⁻¹)
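A minimal sketch of that resampling (my own illustration, assuming a pinhole source image with focal length f in pixels and nearest-neighbor lookup; real implementations interpolate):

```python
import numpy as np

def plane_to_cylinder(img, f):
    """Resample a planar image onto a cylindrical strip of radius f.
    Output columns are equal ANGLE steps; rows are equal distances on the
    cylinder. Each output sample looks back into the plane at
    x = f*tan(theta), y = h/cos(theta)."""
    H, W = img.shape[:2]
    cx, cy = W / 2.0, H / 2.0
    theta_max = np.arctan((W / 2.0) / f)          # half field of view
    out = np.zeros_like(img)
    for col in range(W):
        theta = (col / (W - 1) - 0.5) * 2.0 * theta_max
        for row in range(H):
            h = row - cy                          # height on the cylinder
            x = f * np.tan(theta)                 # planar coordinates of the same ray
            y = h / np.cos(theta)
            u, v = int(round(x + cx)), int(round(y + cy))
            if 0 <= u < W and 0 <= v < H:
                out[row, col] = img[v, u]
    return out
```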

Early IBR: QuickTime VR (Chen & Williams ’93)
1) Four planar images → one cylindrical panorama (input/output example images)

Early IBR: QuickTime VR (Chen & Williams ’93)
2) Windowing and horizontal-only reprojection (input/output example images)

View Interpolation: How?
But what if no depth is available?
–Traditional stereo disparity map: pixel-by-pixel search for correspondence

View Interpolation: How?
–Store depth at each pixel and reproject (sketched below)
–Or use a coarse or simple 3D model
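A minimal forward-warping sketch of the depth-reprojection idea (my own illustration; the intrinsics K and relative pose (R, t) are assumed inputs, and a real renderer would also splat and fill holes):

```python
import numpy as np

def reproject(color, depth, K, R, t):
    """Forward-warp an image with per-pixel depth to a new viewpoint.
    color: HxWx3, depth: HxW (z-depth), K: 3x3 intrinsics,
    (R, t): pose of the new camera relative to the original.
    Depths resolve overlaps via a simple z-buffer: nearer surfaces win."""
    H, W = depth.shape
    out = np.zeros_like(color)
    zbuf = np.full((H, W), np.inf)
    Kinv = np.linalg.inv(K)
    for v in range(H):
        for u in range(W):
            X = depth[v, u] * (Kinv @ np.array([u, v, 1.0]))   # back-project to 3D
            Xn = R @ X + t                                     # into the new camera frame
            if Xn[2] <= 0:
                continue                                       # behind the new camera
            p = K @ Xn
            un, vn = int(round(p[0] / p[2])), int(round(p[1] / p[2]))
            if 0 <= un < W and 0 <= vn < H and Xn[2] < zbuf[vn, un]:
                zbuf[vn, un] = Xn[2]                           # keep the closest surface
                out[vn, un] = color[v, u]
    return out
```

Holes appear wherever the new view exposes surfaces the original image never saw; depth resolves the opposite problem, where several source pixels land on the same target pixel.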

Plenoptic Array: ‘The Matrix Effect’
–Brute force! A simple arc, line, or ring array of cameras
–Synchronized shutters
–Warp/blend between images to change viewpoint on a ‘time-frozen’ scene

Plenoptic Function (Adelson & Bergen ’91)
For a given scene, describe ALL rays through:
–ALL pixels, of
–ALL cameras, at
–ALL wavelengths,
–ALL times
F(x, y, z, θ, φ, λ, t): the ‘eyeballs everywhere’ function (5-D in position and direction, plus 2 more dimensions for wavelength and time!)
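Purely as an illustration of what even a crudely discretized plenoptic function looks like as data (every resolution and name below is my own assumption, not from the slides):

```python
import numpy as np

# Brutally coarse, purely illustrative sample counts.
NX = NY = NZ = 8            # viewing positions in a unit cube
NTHETA, NPHI = 16, 32       # viewing directions
NLAMBDA, NT = 3, 1          # RGB stand-in for wavelength, a single time step

plenoptic = np.zeros((NX, NY, NZ, NTHETA, NPHI, NLAMBDA, NT), dtype=np.float32)

def sample(x, y, z, theta, phi, lam, t=0):
    """Nearest-sample lookup: what does an 'eyeball' at (x, y, z) in [0,1]^3,
    looking along direction (theta, phi), see in wavelength band lam at time t?"""
    ix = int(round(x * (NX - 1)))
    iy = int(round(y * (NY - 1)))
    iz = int(round(z * (NZ - 1)))
    ith = int(round(theta / np.pi * (NTHETA - 1)))
    iph = int(round(phi / (2 * np.pi) * (NPHI - 1)))
    return plenoptic[ix, iy, iz, ith, iph, lam, t]

print(plenoptic.size)        # 786,432 samples even at this toy resolution
```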

Seitz: ‘View Morphing’, SIGGRAPH ’96
1) Manually set some correspondence points (eye corners, etc.)
2) Pre-warp and post-warp to match points in 3D
3) Reproject for virtual cameras
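A toy numeric illustration of step 2 (my own sketch; the pre-warp and post-warp are assumed to happen outside this snippet, and the point coordinates are made up):

```python
import numpy as np

def morph_points(pts0, pts1, s):
    """Step 2 in miniature: after the pre-warp has made the two views parallel,
    corresponding points are linearly interpolated toward the in-between view
    (s = 0 gives view 0, s = 1 gives view 1). The post-warp that maps the
    result to the desired virtual camera is not shown here."""
    return (1.0 - s) * np.asarray(pts0) + s * np.asarray(pts1)

# A matched eye-corner at (120, 80) in one pre-warped view and (140, 82)
# in the other lands at (130, 81) in the halfway view.
print(morph_points([[120.0, 80.0]], [[140.0, 82.0]], 0.5))   # [[130.  81.]]
```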

Seitz: ‘View Morphing’, SIGGRAPH ’96 (result image slides)

‘Scene’ Causes Light Field
–The light field holds all outgoing light rays
–The scene: shape, position, movement, BRDF, texture, scattering; emitted light; reflected and scattered light, …
–Cameras capture only a subset of these rays.

? Can we recover Shape ?
–Can you find ray intersections? Or ray depth?
–Ray colors might not match for non-diffuse materials (BRDF).

? Can we recover Surface Material ?
–Can you find ray intersections? Or ray depth?
–Ray colors might not match for non-diffuse materials (BRDF).

Hey, wait…
–The light field describes light LEAVING the enclosing surface…
–? Isn’t there a complementary ‘light field’ for the light ENTERING the surface?
–*YES*: ‘Image-Based Lighting’, too! (sketched below)
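One minimal sketch of that idea (my own illustration, not a method from the slides): treat a captured lat-long environment map as the entering light field and integrate it, cosine-weighted, to shade a diffuse point:

```python
import numpy as np

def diffuse_from_env(normal, env_map):
    """Integrate a lat-long environment map (rows = elevation 0..pi from 'up',
    cols = azimuth 0..2*pi), cosine-weighted against the surface normal.
    Returns the approximate diffuse irradiance at that point."""
    H, W, _ = env_map.shape
    total = np.zeros(3)
    for i in range(H):
        theta = (i + 0.5) / H * np.pi
        sin_t = np.sin(theta)                      # solid-angle weighting
        for j in range(W):
            phi = (j + 0.5) / W * 2.0 * np.pi
            d = np.array([np.sin(theta) * np.cos(phi),
                          np.cos(theta),
                          np.sin(theta) * np.sin(phi)])   # incoming direction
            cos_n = max(0.0, float(d @ np.asarray(normal)))
            total += env_map[i, j] * cos_n * sin_t
    return total * (np.pi / H) * (2.0 * np.pi / W)         # Riemann sum over the sphere
```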

‘Full 8-D Light Field’ (10-D, actually, with wavelength and time)
Cleaner formulation:
–Orthographic camera, positioned on a sphere around the object/scene
–Orthographic projector (a ‘laser brick’), positioned on a sphere around the object/scene
–(plus wavelength and time)
F(x_c, y_c, θ_c, φ_c, x_l, y_l, θ_l, φ_l, λ, t), where c indexes the camera and l the projector

! Complete ! Geometry, lighting, BRDF, … are NOT REQUIRED.
Preposterously huge (?!?! an 8-D function sampled at image resolution !?!?), but hugely redundant:
–Wavelength → RGB triplets, ignore time, and restrict eyepoint movement: maybe ~3 or 4-D?
–Very similar images: use warping rules? (SIGG 2002)
–Exploit movie storage/compression methods?
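To make “preposterously huge” and the usual 4-D reduction concrete (a back-of-envelope sketch with made-up sample counts):

```python
import numpy as np

# Back-of-envelope size of the full 8-D field at a very modest 32 samples per
# dimension, stored as RGB byte triplets (all counts are illustrative guesses).
full_bytes = 32 ** 8 * 3
print(f"{full_bytes / 2**40:.1f} TiB")           # 3.0 TiB -- and 32 samples/dim is tiny

# The usual reduction: wavelength -> RGB, drop time, restrict the eyepoint to
# one side of the scene. What remains is a 4-D light field L(u, v, s, t): each
# ray is named by a point (u, v) on a camera plane and (s, t) on a focal plane
# (the two-plane parameterization of Levoy & Hanrahan / the Lumigraph).
U = V = 16                                        # camera-plane samples
S = T = 256                                       # focal-plane (image) samples
light_field = np.zeros((U, V, S, T, 3), dtype=np.uint8)
print(f"{light_field.nbytes / 2**20:.0f} MiB")    # 48 MiB: tractable

def ray_color(u, v, s, t):
    """Nearest-sample lookup of the ray through (u, v) and (s, t)."""
    return light_field[u, v, s, t]
```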

IBR-Motivating Opinions
“Computer Graphics: Hard”
–Complex! Geometry, texture, lighting, shadows, compositing, BRDF, interreflections, etc., etc., …
–Irregular! Visibility, topology, the rendering equation, …
–Isolated! Tough to use real objects in CGI
–Slow! Compute-bound, off-line only, …
“Digital Imaging: Easy”
–Simple! More quality? Just pump more pixels!
–Regular! Vectorized, compressible, pipelined, …
–Accessible! Use real OR synthetic (CGI) images!
–Fast! Scalable, image reuse, a path to interactivity, …

Practical IBMR
What useful partial solutions are possible?
–Texture maps++: panoramas, environment maps
–Image(s) + depth (a 3D shell)
–Estimating depth & recovering silhouettes
–‘Light probe’ measures real-world light
–Structured light to recover surfaces…
–Hybrids: BTF, stitching, …

Conclusion
–Heavy overlap with computer vision: be careful not to re-invent & re-name!
–Elegant geometry is at the heart of it all, even surface reflectance, illumination, etc.
–THUS: we’ll dive into geometry; all the rest is built on it!