Image-Based Lighting cs129: Computational Photography James Hays, Brown, Spring 2011 © Eirik Holmøyvik Slides from Alexei Efros and Paul Debevec adapted by Klaus Mueller, added some slides from Jian Huang, UTK

Inserting Synthetic Objects Why does this look so bad? Wrong orientation Wrong lighting No shadows

Solutions Wrong camera orientation: estimate the correct camera orientation and render the object – requires camera calibration to do it right Lighting & shadows: estimate (eyeball) all the light sources in the scene and simulate them in your virtual rendering But what happens if lighting is complex? Extended light sources, mutual illumination, etc.

Environment Maps Simple solution for shiny objects Models complex lighting as a panoramic image i.e. amount of radiance coming in from each direction A plenoptic function!!!

Environment Mapping (figure: viewer v, surface normal n, reflected ray r, reflective surface, environment texture image) A projector function converts the reflection vector (x, y, z) into texture image coordinates (u, v) Reflected ray: r = 2(n·v)n − v Texture is transferred in the direction of the reflected ray from the environment map onto the object What is in the map?

What approximations are made? The map should contain a view of the world with the point of interest on the object as the Center of Projection We can’t store a separate map for each point, so one map is used with the COP at the center of the object Introduces distortions in the reflection, but we usually don’t notice Distortions are minimized for a small object in a large room The object will not reflect itself!

Environment Maps The environment map may take various forms: cubic mapping, spherical mapping, or others The form describes the shape of the surface on which the map “resides” and determines how the map is generated and how it is indexed

Cubic Mapping The map resides on the surfaces of a cube around the object Typically, align the faces of the cube with the coordinate axes To generate the map: For each face of the cube, render the world from the center of the object with the cube face as the image plane –Rendering can be arbitrarily complex (it’s off-line) Or, take 6 photos of a real environment with a camera in the object’s position To use the map: Index the R ray into the correct cube face Compute texture coordinates
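
As a rough sketch of the six renders this involves, the snippet below lists one common set of per-face view directions (the conventions and the render_view callback are my own illustration, not prescribed by the slide):

```python
# Six camera setups, one per cube face: 90-degree field of view, camera at the
# object's center, looking down each coordinate axis.
cube_faces = {
    "+x": {"forward": (1, 0, 0),  "up": (0, 1, 0)},
    "-x": {"forward": (-1, 0, 0), "up": (0, 1, 0)},
    "+y": {"forward": (0, 1, 0),  "up": (0, 0, -1)},
    "-y": {"forward": (0, -1, 0), "up": (0, 0, 1)},
    "+z": {"forward": (0, 0, 1),  "up": (0, 1, 0)},
    "-z": {"forward": (0, 0, -1), "up": (0, 1, 0)},
}

def render_cube_map(render_view, center):
    """render_view is assumed to be a user-supplied callback:
    (center, forward, up, fov_degrees) -> image."""
    return {face: render_view(center, f["forward"], f["up"], 90.0)
            for face, f in cube_faces.items()}
```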

Cubic Map Example

Calculating the Reflection Vector Normal vector of the surface: N Incident ray: I Reflection ray: R N, I, R all normalized R = I − 2N(N·I) The texture coordinate is based on the reflection vector, assuming the origin of the vector is always at the center of the cube environment map
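
As a concrete illustration of the reflection formula above, a minimal numpy sketch (the function and variable names are my own, not from the slides):

```python
import numpy as np

def reflect(incident, normal):
    """Reflect a normalized incident direction about a normalized surface normal:
    R = I - 2 * N * (N . I)."""
    return incident - 2.0 * normal * np.dot(normal, incident)

# Example: a ray travelling straight down hits a floor whose normal points up.
I = np.array([0.0, -1.0, 0.0])
N = np.array([0.0, 1.0, 0.0])
print(reflect(I, N))   # -> [0. 1. 0.]
```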

Indexing Cubic Maps Assume you have R, the cube’s faces are aligned with the coordinate axes, and each face has texture coordinates in [0,1]x[0,1] How do you decide which face to use? Use the reflection vector coordinate with the largest magnitude: (0.3, 0.2, 0.8) → the face in the +z direction

Indexing Cubic Maps How do you decide which texture coordinates to use? Divide by the coordinate with the largest magnitude, giving values in [-1,1], then remap to [0,1]: (0.3, 0.2, 0.8) → ((0.3/0.8 + 1)*0.5, (0.2/0.8 + 1)*0.5) = (0.6875, 0.625)
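
The face selection and remapping on the last two slides can be sketched as follows (a simplified illustration with my own sign/orientation conventions; real APIs such as OpenGL fix a specific per-face layout):

```python
import numpy as np

def index_cube_map(R):
    """Pick the cube face from the largest-magnitude component of R,
    then remap the other two components from [-1, 1] to [0, 1]."""
    axis = int(np.argmax(np.abs(R)))               # 0 = x, 1 = y, 2 = z
    sign = '+' if R[axis] > 0 else '-'
    major = abs(float(R[axis]))
    s, t = [float(R[i]) / major for i in range(3) if i != axis]
    u, v = (s + 1) * 0.5, (t + 1) * 0.5            # [-1, 1] -> [0, 1]
    return sign + 'xyz'[axis], (u, v)

print(index_cube_map(np.array([0.3, 0.2, 0.8])))
# selects the +z face with (u, v) ≈ (0.6875, 0.625), matching the slide's example
```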

A Sphere Map A mapping between the reflection vector and a circular texture Prepare an image/texture in which the environment is mapped onto a sphere

Sphere Mapping To generate the map: take a photograph of a shiny sphere, or map a cubic environment map onto a sphere For synthetic scenes, you can use ray tracing

Sphere Map Compute the reflection vector at the surface of the object Find the corresponding texture on the sphere map Use the texture to color the surface of the object

Indexing Sphere Maps Given the reflection vector R = (Rx, Ry, Rz), the location on the spherical map is u = Rx / (2·sqrt(Rx² + Ry² + (Rz+1)²)) + 1/2, v = Ry / (2·sqrt(Rx² + Ry² + (Rz+1)²)) + 1/2 (derived on the next slide)

Indexing Sphere Maps The normal vector is the sum of the reflection vector and the eye vector (0, 0, 1) Normalizing gives n = (Rx, Ry, Rz+1) / sqrt(Rx² + Ry² + (Rz+1)²) If the normal lies on a sphere of radius 1, its x, y coordinates are also its location on the sphere map Finally, convert them to the range [0,1]
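
A minimal sketch of this sphere-map indexing, assuming the conventional eye vector (0, 0, 1) (the function name is my own):

```python
import numpy as np

def sphere_map_uv(R):
    """Map a reflection vector R to sphere-map coordinates (u, v) in [0, 1],
    assuming the eye looks along (0, 0, 1)."""
    n = R + np.array([0.0, 0.0, 1.0])     # normal = reflection vector + eye vector
    n = n / np.linalg.norm(n)             # put it on the unit sphere
    u = n[0] / 2.0 + 0.5                  # sphere x, y -> [0, 1]
    v = n[1] / 2.0 + 0.5
    return float(u), float(v)

print(sphere_map_uv(np.array([0.0, 0.0, 1.0])))
# mirror-back reflection maps to the center of the sphere map: (0.5, 0.5)
```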

Non-linear Mapping Problems: Highly non-uniform sampling Highly non-linear mapping Linear interpolation of texture coordinates picks up the wrong texture pixels Do per-pixel sampling or use high-resolution polygons Can only view from one direction (figure: correct vs. linear interpolation)

Example

What about real scenes? From Flight of the Navigator

What about real scenes? From Terminator 2

Real environment maps We can use photographs to capture environment maps How do we deal with light sources? Sun, lights, etc.? They are much, much brighter than the rest of the environment Use High Dynamic Range photography, of course! Several ways to acquire environment maps: Stitching mosaics Fisheye lens Mirrored balls

Stitching HDR mosaics

Scanning Panoramic Cameras Pros: very high res (10K x 7K+) Full sphere in one scan – no stitching Good dynamic range, some are HDR Issues: More expensive Scans take a while Companies: Panoscan, Sphereon

Fisheye Images

Mirrored Sphere

Sources of Mirrored Balls  2-inch chrome balls ~ $20 ea.  McMaster-Carr Supply Company  6-12 inch large gazing balls  Baker’s Lawn Ornaments  Hollow Spheres, 2in – 4in  Dube Juggling Equipment  FAQ on

Calibrating Mirrored Sphere Reflectivity => 59% reflective
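
With the reflectivity measured, the probe image can be scaled to report true scene radiance. A minimal sketch, assuming a linear HDR probe image loaded as a float array (the file name and variables are hypothetical):

```python
import numpy as np

reflectivity = 0.59                    # measured reflectivity of this chrome ball
probe = np.load("probe_hdr.npy")       # hypothetical linear HDR probe image, float32
scene_radiance = probe / reflectivity  # undo the ~41% loss at the mirror surface
```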

Real-World HDR Lighting Environments Lighting Environments from the Light Probe Image Gallery: Funston Beach Uffizi Gallery Eucalyptus Grove Grace Cathedral

Acquiring the Light Probe

Assembling the Light Probe

Not just shiny… We have captured a true radiance map We can treat it as an extended (e.g. spherical) light source Can use Global Illumination to simulate light transport in the scene So, all objects (not just shiny ones) can be lit What’s the limitation?

Illumination Results Rendered with Greg Larson’s RADIANCE synthetic imaging system

Comparison: Radiance map versus single image

Putting it all together Synthetic Objects + Real light!

CG Objects Illuminated by a Traditional CG Light Source

Illuminating Objects using Measurements of Real Light (figure: object, light) The environment is assigned a “glow” material property in Greg Ward’s RADIANCE system

Paul Debevec. A Tutorial on Image-Based Lighting. IEEE Computer Graphics and Applications, Jan/Feb 2002.

Rendering with Natural Light SIGGRAPH 98 Electronic Theater

RNL Environment mapped onto interior of large cube

MOVIE!

Illuminating a Small Scene

We can now illuminate synthetic objects with real light. How do we add synthetic objects to a real scene?

Real Scene Example Goal: place synthetic objects on table

Light Probe / Calibration Grid

Modeling the Scene (figure: real scene → light-based model)

The Light-Based Room Model

Modeling the Scene (figure: real scene → light-based model + local scene + synthetic objects)

The Lighting Computation synthetic objects (known BRDF) distant scene (light-based, unknown BRDF) local scene (estimated BRDF)

Rendering into the Scene Background Plate

Rendering into the Scene Objects and Local Scene matched to Scene

Differential Rendering Local scene w/o objects, illuminated by model

Differential Rendering (2) Difference in local scene: (local scene rendered with objects) − (local scene rendered without objects)

Differential Rendering Add the difference to the background plate – Final Result
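
The compositing on these differential-rendering slides can be summarized in a short sketch, assuming three linear (HDR) images as float arrays (the function and array names are my own):

```python
import numpy as np

def differential_render(background, with_objects, without_objects):
    """Final = background + (rendered with objects - rendered without objects).
    background: photograph of the real scene (the background plate)
    with_objects: local scene + synthetic objects, lit by the light probe
    without_objects: local scene alone, lit by the light probe
    The difference carries the shadows and interreflections the objects add."""
    return np.clip(background + (with_objects - without_objects), 0.0, None)
```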

High-Dynamic Range Photography Standard digital images typically represent only a small fraction of the dynamic range (the ratio between the dimmest and brightest regions accurately represented) present in most real-world lighting environments Bright lights saturate the pixel colour We need a high dynamic range image, which can be produced by combining images taken at different shutter speeds
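
A minimal sketch of merging a bracketed exposure stack into an HDR radiance map, assuming the images are already linearized (i.e., the camera response curve has been removed; the names and the simple hat weighting are my own simplification of the Debevec and Malik approach):

```python
import numpy as np

def merge_hdr(images, exposure_times):
    """Combine linearized LDR exposures (values in [0, 1]) into one HDR radiance map.
    Each pixel's radiance estimate (pixel / exposure_time) is averaged with a
    hat weight that downweights under- and over-exposed values."""
    numerator = np.zeros_like(images[0], dtype=np.float64)
    denominator = np.zeros_like(images[0], dtype=np.float64)
    for img, t in zip(images, exposure_times):
        w = 1.0 - np.abs(2.0 * img - 1.0)        # hat weight: 1 at mid-gray, 0 at 0 and 1
        numerator += w * (img / t)
        denominator += w
    return numerator / np.maximum(denominator, 1e-8)

# Usage with an exposure series like the one shown later (2 s ... 1/8000 s):
# hdr = merge_hdr(stack, [2, 1/4, 1/30, 1/250, 1/2000, 1/8000])
```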

High-Dynamic Range Photography

HDR images

Image-Based Lighting in Fiat Lux Paul Debevec, Tim Hawkins, Westley Sarokin, H. P. Duiker, Christine Cheng, Tal Garfinkel, Jenny Huang SIGGRAPH 99 Electronic Theater

HDR Image Series 2 sec 1/4 sec 1/30 sec 1/250 sec 1/2000 sec 1/8000 sec

Stp1 Panorama

Assembled Panorama

Light Probe Images

Capturing a Spatially-Varying Lighting Environment

The Movie