1 Image-Based Lighting cs129: Computational Photography James Hays, Brown, Spring 2011 © Eirik Holmøyvik Slides from Alexei Efros and Paul Debevec, adapted by Klaus Mueller, with some slides from Jian Huang, UTK

2 Inserting Synthetic Objects Why does this look so bad? Wrong orientation Wrong lighting No shadows

3 Solutions Wrong Camera Orientation Estimate the correct camera orientation and render the object –Requires camera calibration to do it right Lighting & Shadows Estimate (eyeball) all the light sources in the scene and simulate them in your virtual rendering But what happens if lighting is complex? Extended light sources, mutual illumination, etc.

4 Environment Maps Simple solution for shiny objects Models complex lighting as a panoramic image i.e. amount of radiance coming in from each direction A plenoptic function!!!

5 reflective surface viewer environment texture image v n r projector function converts reflection vector (x, y, z) to texture image (u, v) Environment Mapping Reflected ray: r=2(n·v)n-v Texture is transferred in the direction of the reflected ray from the environment map onto the object What is in the map?

6 What approximations are made? The map should contain a view of the world with the point of interest on the object as the Center of Projection We can’t store a separate map for each point, so one map is used with the COP at the center of the object Introduces distortions in the reflection, but we usually don’t notice Distortions are minimized for a small object in a large room The object will not reflect itself!

7 Environment Maps The environment map may take various forms: Cubic mapping Spherical mapping other Describes the shape of the surface on which the map “resides” Determines how the map is generated and how it is indexed

8 Cubic Mapping The map resides on the surfaces of a cube around the object Typically, align the faces of the cube with the coordinate axes To generate the map: For each face of the cube, render the world from the center of the object with the cube face as the image plane –Rendering can be arbitrarily complex (it’s off-line) Or, take 6 photos of a real environment with a camera in the object’s position To use the map: Index the R ray into the correct cube face Compute texture coordinates

9 Cubic Map Example

10 Calculating the Reflection Vector Normal vector of the surface: N Incident ray: I Reflection ray: R N, I, R all normalized R = I − 2N(N·I) The texture coordinate is based on the reflection vector, assuming the origin of the vector is always at the center of the cube environment map
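The reflection formula on this slide can be sketched in a few lines of Python (a minimal sketch, not from the slides; the function name and use of NumPy are my own):

```python
import numpy as np

def reflect(incident, normal):
    """Reflect an incident ray about a surface normal: R = I - 2N(N.I).

    `normal` is re-normalized defensively; `incident` points toward
    the surface, matching the slide's convention.
    """
    normal = normal / np.linalg.norm(normal)
    return incident - 2.0 * normal * np.dot(normal, incident)

# A ray heading straight down onto an upward-facing floor bounces straight up.
R = reflect(np.array([0.0, -1.0, 0.0]), np.array([0.0, 1.0, 0.0]))
```

Note the sign convention: slide 5 writes r = 2(n·v)n − v with v pointing *toward* the viewer, while this slide's I points *at* the surface; the two formulas agree once v = −I.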

11 Indexing Cubic Maps Assume you have R, the cube’s faces are aligned with the coordinate axes, and texture coordinates are in [0,1]x[0,1] How do you decide which face to use? Use the reflection vector coordinate with the largest magnitude (0.3, 0.2, 0.8) → face in +z direction

12 Indexing Cubic Maps How do you decide which texture coordinates to use? Divide by the coordinate with the largest magnitude, giving values in [-1,1], then remap them to [0,1]. (0.3, 0.2, 0.8) → ((0.3/0.8 + 1)*0.5, (0.2/0.8 + 1)*0.5) = (0.6875, 0.625)
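The two indexing steps above (pick the face, then remap the remaining coordinates) can be sketched as follows. This is a simplified illustration of the slide's arithmetic; real cube-map APIs such as OpenGL additionally flip s/t per face, which is omitted here:

```python
def index_cube_map(R):
    """Index a cube environment map from a reflection vector R = (x, y, z).

    Face = axis of the largest-magnitude component; texture coords =
    the other two components divided by that magnitude, remapped
    from [-1, 1] to [0, 1].
    """
    axis = max(range(3), key=lambda i: abs(R[i]))   # dominant axis
    face = ('+' if R[axis] > 0 else '-') + 'xyz'[axis]
    m = abs(R[axis])
    s, t = (R[i] / m for i in range(3) if i != axis)  # now in [-1, 1]
    return face, ((s + 1.0) * 0.5, (t + 1.0) * 0.5)   # remap to [0, 1]

# The slide's example: (0.3, 0.2, 0.8) lands on the +z face at (0.6875, 0.625).
face, (u, v) = index_cube_map((0.3, 0.2, 0.8))
```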

13 A Sphere Map A mapping between the reflection vector and a circular texture Prepare an image/texture in which the environment is mapped onto a sphere

14 Sphere Mapping To generate the map: Take a photograph of a shiny sphere Map a cubic environment map onto a sphere For synthetic scenes, you can use ray tracing

15 Sphere Map Compute the reflection vector at the surface of the object Find the corresponding texture on the sphere map Use the texture to color the surface of the object

16 Indexing Sphere Maps Given the reflection vector R = (Rx, Ry, Rz), find the corresponding (u, v) on the sphere map

17 Indexing Sphere Maps The normal vector is the sum of the reflection vector and the eye vector E = (0, 0, 1): N = (R + E)/|R + E| If the normal is on a sphere of radius 1, its x, y coordinates are also a location on the sphere map Finally, convert them to the range [0,1]: with m = 2·sqrt(Rx² + Ry² + (Rz + 1)²), u = Rx/m + 1/2 and v = Ry/m + 1/2
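The lookup above (the standard sphere-map formula, as used by OpenGL's GL_SPHERE_MAP) can be written directly; the function name is my own:

```python
import math

def sphere_map_uv(R):
    """Sphere-map texture lookup for reflection vector R = (Rx, Ry, Rz).

    Adds the eye vector (0, 0, 1), normalizes, and remaps the resulting
    unit normal's x, y from [-1, 1] to [0, 1]:
        m = 2 * sqrt(Rx^2 + Ry^2 + (Rz + 1)^2)
        u = Rx/m + 1/2,  v = Ry/m + 1/2
    """
    rx, ry, rz = R
    m = 2.0 * math.sqrt(rx * rx + ry * ry + (rz + 1.0) ** 2)
    return rx / m + 0.5, ry / m + 0.5

# A reflection straight back at the viewer maps to the center of the disk.
u, v = sphere_map_uv((0.0, 0.0, 1.0))
```

The singularity at R = (0, 0, −1), where m = 0, is the "can only view from one direction" problem the next slide mentions: the entire silhouette of the sphere maps to that single direction.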

18 Non-linear Mapping Problems: Highly non-uniform sampling Highly non-linear mapping Linear interpolation of texture coordinates picks up the wrong texture pixels Do per-pixel sampling or use high-resolution polygons Can only view from one direction (figure: correct vs. linear interpolation)

19 Example

20 What about real scenes? From Flight of the Navigator

21 What about real scenes? from Terminator 2

22 Real environment maps We can use photographs to capture environment maps How do we deal with light sources? Sun, lights, etc.? They are much, much brighter than the rest of the environment Use High Dynamic Range photography, of course! Several ways to acquire environment maps: Stitching mosaics Fisheye lens Mirrored balls

23 Stitching HDR mosaics http://www.gregdowning.com/HDRI/stitched/

24 Scanning Panoramic Cameras Pros: very high res (10K x 7K+) Full sphere in one scan – no stitching Good dynamic range, some are HDR Issues: More expensive Scans take a while Companies: Panoscan, Sphereon

25 Fisheye Images

26 Mirrored Sphere

27

28

29 Sources of Mirrored Balls 2-inch chrome balls ~ $20 ea. – McMaster-Carr Supply Company www.mcmaster.com 6-12 inch large gazing balls – Baker’s Lawn Ornaments www.bakerslawnorn.com Hollow spheres, 2in – 4in – Dube Juggling Equipment www.dube.com FAQ on www.debevec.org/HDRShop/

30 Calibrating Mirrored Sphere Reflectivity 0.34 / 0.58 => 59% reflective
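The slide's calibration is a simple ratio: photograph the same diffuse patch directly and via the mirrored ball, and divide the two pixel values. A sketch (the variable names and the assumption that 0.34 and 0.58 are linear pixel values of the same patch are mine):

```python
# Hypothetical linearized pixel values of one diffuse patch:
# seen directly in the scene, and seen reflected in the mirrored ball.
direct_value = 0.58
reflected_value = 0.34

# The ball dims everything it reflects by its reflectivity, so the
# ratio recovers it; divide probe images by this to compensate.
reflectivity = reflected_value / direct_value
print(f"{reflectivity:.0%} reflective")
```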

31 Real-World HDR Lighting Environments Lighting Environments from the Light Probe Image Gallery: http://www.debevec.org/Probes/ Funston Beach Uffizi Gallery Eucalyptus Grove Grace Cathedral

32 Acquiring the Light Probe

33 Assembling the Light Probe

34 Not just shiny… We have captured a true radiance map We can treat it as an extended (e.g. spherical) light source Can use global illumination to simulate light transport in the scene So all objects (not just shiny ones) can be lit What’s the limitation?

35 Illumination Results Rendered with Greg Ward Larson’s RADIANCE synthetic imaging system

36 Comparison: Radiance map versus single image

37 Putting it all together Synthetic objects + real light!

38 CG Objects Illuminated by a Traditional CG Light Source

39 Illuminating Objects using Measurements of Real Light Object Light http://radsite.lbl.gov/radiance/ Environment assigned “glow” material property in Greg Ward’s RADIANCE system.

40 Paul Debevec. A Tutorial on Image-Based Lighting. IEEE Computer Graphics and Applications, Jan/Feb 2002.

41 Rendering with Natural Light SIGGRAPH 98 Electronic Theater

42 RNL Environment mapped onto interior of large cube

43 MOVIE!

44 Illuminating a Small Scene

45

46 We can now illuminate synthetic objects with real light. How do we add synthetic objects to a real scene?

47 Real Scene Example Goal: place synthetic objects on table

48 Light Probe / Calibration Grid

49 real scene Modeling the Scene light-based model

50 The Light-Based Room Model

51 real scene Modeling the Scene synthetic objects light-based model local scene

52 The Lighting Computation synthetic objects (known BRDF) distant scene (light-based, unknown BRDF) local scene (estimated BRDF)

53 Rendering into the Scene Background Plate

54 Rendering into the Scene Objects and Local Scene matched to Scene

55 Differential Rendering Local scene w/o objects, illuminated by model

56 Differential Rendering (2) Difference in local scene: (local scene with objects) minus (local scene without objects)

57 Differential Rendering Add the differences to the background image: Final Result
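The three differential-rendering steps from these slides can be sketched as array arithmetic. This is an illustrative sketch assuming linear HDR images as NumPy arrays; the function and argument names are my own:

```python
import numpy as np

def differential_render(background, with_objects, without_objects, object_mask):
    """Debevec-style differential rendering.

    background      : the real photograph (background plate)
    with_objects    : rendered local scene + synthetic objects
    without_objects : rendered local scene alone, lit by the light-based model
    object_mask     : boolean mask of pixels covered by synthetic objects

    The render difference captures only what the objects *change*
    (shadows, interreflections); adding it to the real plate avoids
    errors from the estimated local-scene BRDF.
    """
    difference = with_objects - without_objects     # shadows, caustics, bounce
    result = background + difference                # apply to the real photo
    result[object_mask] = with_objects[object_mask] # objects come straight from the render
    return np.clip(result, 0.0, None)               # radiance stays non-negative

# Tiny 2x2 example: one object pixel, one shadowed pixel.
bg = np.full((2, 2), 0.5)
no_obj = np.full((2, 2), 0.4)
with_obj = no_obj.copy()
with_obj[0, 0] = 0.9   # synthetic object
with_obj[1, 1] = 0.3   # its shadow darkens the local scene
mask = np.zeros((2, 2), dtype=bool)
mask[0, 0] = True
out = differential_render(bg, with_obj, no_obj, mask)
```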

58

59 High-Dynamic Range Photography Standard digital images typically represent only a small fraction of the dynamic range (the ratio between the dimmest and brightest regions accurately represented) present in most real-world lighting environments. Bright light saturates the pixel colour. We need a high dynamic range image, which can be produced by combining images taken at different shutter speeds.
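The "combine images of different shutter speeds" step can be sketched as a weighted average of per-exposure radiance estimates. This is a naive version assuming a linear camera response, values in [0, 1], and a simple hat weight; the full Debevec–Malik method also recovers the nonlinear response curve, which is omitted here:

```python
import numpy as np

def merge_hdr(images, shutter_times):
    """Naive HDR merge from exposures with known shutter times.

    Each exposure's radiance estimate is pixel / shutter_time
    (linear response assumed). A hat weight, peaking at mid-gray,
    discounts under- and over-exposed (saturated) pixels.
    """
    num = np.zeros_like(np.asarray(images[0], dtype=np.float64))
    den = np.zeros_like(num)
    for im, t in zip(images, shutter_times):
        im = np.asarray(im, dtype=np.float64)
        w = 1.0 - np.abs(2.0 * im - 1.0)   # 0 at black/white, 1 at mid-gray
        num += w * im / t
        den += w
    return num / np.maximum(den, 1e-8)     # avoid divide-by-zero where all saturate

# Two exposures of the same pixel: both imply a radiance of 0.5.
hdr = merge_hdr([np.array([0.25]), np.array([0.5])], [0.5, 1.0])
```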

60 High-Dynamic Range Photography

61 HDR images

62

63 Image-Based Lighting in Fiat Lux Paul Debevec, Tim Hawkins, Westley Sarokin, H. P. Duiker, Christine Cheng, Tal Garfinkel, Jenny Huang SIGGRAPH 99 Electronic Theater

64

65 HDR Image Series 2 sec 1/4 sec 1/30 sec 1/250 sec 1/2000 sec 1/8000 sec

66 Stp1 Panorama

67 Assembled Panorama

68 Light Probe Images

69 Capturing a Spatially-Varying Lighting Environment

70 The Movie

