CS 395/495-25: Spring 2004 IBMR: Measuring Lights, Materials, Lenses and more Jack Tumblin


1 CS 395/495-25: Spring 2004 IBMR: Measuring Lights, Materials, Lenses and more Jack Tumblin jet@cs.northwestern.edu

2 Recall: An Image Is… 2D Image: A map of light intensities Light + 3D Scene: Illumination, shape, movement, surface BRDF,… Position(x,y) A ‘Camera’: ?What are ALL the possibilities?

3 A Planar Projection Image Is… Image Plane I(x,y) Angle(θ, φ) Position(x,y) 2D Image: Collection of rays through a point Light + 3D Scene: Illumination, shape, movement, surface BRDF,… ‘Center of Projection’ (P3 or P2 Origin)

4 Image Making: Pinhole → Thin Lens Interactive Thin Lens Demo (search ‘physlet optical bench’): http://www.swgc.mun.ca/physics/physlets/opticalbench.html From this geometry (for next time): Can you derive the Thin Lens Law?
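As a quick numeric sketch of the thin-lens law 1/f = 1/s_o + 1/s_i (the helper name and the 50 mm / 2 m sample values below are illustrative, not from the slides):

def thin_lens_image_distance(f, s_o):
    # Thin-lens law: 1/f = 1/s_o + 1/s_i, so 1/s_i = 1/f - 1/s_o.
    # An object at the focal distance (s_o == f) images at infinity.
    if s_o == f:
        return float('inf')
    return 1.0 / (1.0 / f - 1.0 / s_o)

# Example: a 50 mm lens focused on an object 2 m away forms the image ~51.3 mm behind the lens.
print(thin_lens_image_distance(0.050, 2.0))   # ~0.05128 (meters)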

5 Incident Light Measurement Flux W = power, Watts, # photons/sec Uniform, point-source light: flux on a patch of surface falls with distance² E = Watts/r² (Figure: point source at distance r from the patch)

6 Light Measurement Flux W = power, Watts, # photons/sec Irradiance E: flux arriving per unit area (regardless of direction) E = Watts/area = dW/dA But direction makes a big difference when computing E...

7 Foreshortening Effect: cos(θ) Larger incident angle θi spreads the same flux over a larger area flux per unit area becomes W·cos(θi) / area Foreshortening geometry imposes an angular term cos(θi) on energy transfer (Figure: circular ‘bundle’ of incident rays, flux W, arriving at angle θi)
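A small numeric sketch combining the 1/r² falloff with the cos(θi) foreshortening term (the 4π isotropic-source normalization is an added assumption, not stated on the slides):

import math

def point_source_irradiance(power_watts, distance_m, incidence_deg):
    # Irradiance from an isotropic point source: E = W / (4*pi*r^2) * cos(theta_i).
    theta_i = math.radians(incidence_deg)
    return power_watts / (4.0 * math.pi * distance_m ** 2) * math.cos(theta_i)

# Same 100 W source at 2 m: light arriving at 60 degrees delivers half the irradiance of normal incidence.
print(point_source_irradiance(100.0, 2.0, 0.0))    # ~1.989 W/m^2
print(point_source_irradiance(100.0, 2.0, 60.0))   # ~0.995 W/m^2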

8 Irradiance E To find irradiance at a point on a surface: Find flux from each (point?) light source, Weight flux by its direction: cos(θi), Add all light sources; or more precisely, integrate over the entire hemisphere Ω. Defines Radiance L: L = (Watts/area) / sr (sr = steradians; solid angle; = surface area on unit sphere)

9 Radiance L But for distributed (non-point) light sources? Integrate flux over the entire hemisphere Ω. But what are the units of what we integrate? Radiance L: L = (Watts/area) / sr (sr = steradians; solid angle; = surface area on unit sphere)
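As a sanity check on that hemisphere integral (my own illustration, using a radiance that is constant in direction and a simple midpoint-rule sum): integrating L·cos(θ) over the hemisphere should give E = π·L.

import math

def irradiance_from_uniform_radiance(L, n=100000):
    # E = integral of L(theta, phi) * cos(theta) dω over the hemisphere, with dω = sin(theta) dtheta dphi.
    # With L constant in direction, the phi integral contributes a factor of 2*pi.
    d_theta = (math.pi / 2.0) / n
    total = 0.0
    for i in range(n):
        theta = (i + 0.5) * d_theta
        total += L * math.cos(theta) * math.sin(theta) * d_theta
    return 2.0 * math.pi * total

print(irradiance_from_uniform_radiance(1.0))   # ~3.14159..., i.e. pi * L
print(math.pi)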

10 Lighting Invariants Why doesn’t surface intensity change with distance? We know point-source flux drops with distance: 1/r² We know the surface is made of infinitesimal point sources... (Figure: camera viewing a point light, ‘intensity’: 1/r²; viewing a surface, ‘intensity’: constant (?!?!))

11 Lighting Invariants Why doesn’t surface intensity change with distance? Because camera pixels measure Radiance, not flux! –pixel value ∝ flux·cos(θ) / sr –‘good lens’ design: cos(θ) term vanishes. Vignetting = residual error. Pixel’s size in sr is fixed: –Point source fits in one pixel: 1/r² –Viewed surface area grows by r², cancels the 1/r² flux falloff (Figure: camera; light ‘intensity’: 1/r²; surface ‘intensity’: constant (?!?!))
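A toy numeric sketch of that cancellation (the pixel solid angle, aperture area, and source values below are invented purely for illustration):

import math

OMEGA = 1e-6     # solid angle one pixel subtends (sr), illustrative
APERTURE = 1e-4  # lens aperture area (m^2), illustrative

def point_source_pixel_value(source_watts, r):
    # The whole point source fits inside one pixel; collected power falls as 1/r^2.
    return source_watts / (4.0 * math.pi * r ** 2) * APERTURE

def surface_pixel_value(surface_radiance, r):
    viewed_area = OMEGA * r ** 2                              # surface patch seen by the pixel grows as r^2
    per_area_at_lens = surface_radiance * APERTURE / r ** 2   # each point's contribution falls as 1/r^2
    return per_area_at_lens * viewed_area                     # r^2 cancels: = L * OMEGA * APERTURE

for r in (1.0, 2.0, 4.0):
    print(r, point_source_pixel_value(100.0, r), surface_pixel_value(50.0, r))
# The point-source reading drops 4x per doubling of distance; the surface reading stays constant.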

12 Lighting Invariants Radiance Images are LINEAR: α·(Radiance caused by Light 1) + β·(Radiance caused by Light 2) = Radiance caused by (α·Light 1 + β·Light 2) http://www.sgi.com/grafica/synth/index.html (Figure: image lit by Light 1 + image lit by Light 2 = image lit by both)

13 Lighting Invariants Light is Linear: α·(Radiance caused by Light 1) + β·(Radiance caused by Light 2) = Radiance caused by (α·Light 1 + β·Light 2) http://www.sgi.com/grafica/synth/index.html Allows ‘negative’ light! (Figure: image lit by both lights – image lit by Light 2 = image lit by Light 1 alone)
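A minimal numpy sketch of that superposition, assuming the images are stored as linear radiance values rather than gamma-encoded pixels (the random arrays stand in for real captures):

import numpy as np

# One radiance image per light source, captured with only that light on.
radiance_light1 = np.random.rand(480, 640).astype(np.float64)
radiance_light2 = np.random.rand(480, 640).astype(np.float64)

# Superposition: any weighted mix of the lights is the same weighted mix of the images.
alpha, beta = 0.7, 1.5
combined = alpha * radiance_light1 + beta * radiance_light2

# 'Negative' light: subtracting a light's image removes its contribution exactly.
recovered_light1 = (combined - beta * radiance_light2) / alpha
print(np.allclose(recovered_light1, radiance_light1))   # True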

14 Point-wise Light Reflection Given: –Infinitesimal surface patch dA, –illuminated by irradiance amount E –from just one direction (θi, φi) How should we measure the returned light? Ans: by emitted RADIANCE measured for all outgoing directions (measured on the surface of the hemisphere Ω) (Figure: hemisphere Ω over patch dA, incoming direction (θi, φi))

15 Point-wise Light Reflection: BRDF Bidirectional Reflectance Distribution Function Fr(θi, φi, θe, φe) = Le(θe, φe) / Ei(θi, φi) Still a ratio of (outgoing/incoming) light, but BRDF: Ratio of outgoing RADIANCE in one direction Le(θe, φe) that results from incoming IRRADIANCE in one direction Ei(θi, φi) Units are tricky: BRDF = Fr = Le / Ei (Figure: hemisphere Ω over patch dA, incoming (θi, φi) with Ei, outgoing Le)

16 Point-wise Light Reflection: BRDF Bidirectional Reflectance Distribution Function Fr(θi, φi, θe, φe) = Le(θe, φe) / Ei(θi, φi) Still a ratio of (outgoing/incoming) light, but BRDF: Ratio of outgoing RADIANCE in one direction Le(θe, φe) that results from incoming IRRADIANCE in one direction Ei(θi, φi) Units are tricky: BRDF = Fr = Le / Ei = (Watts/area/sr) / (Watts/area) (Figure: hemisphere Ω over patch dA, incoming (θi, φi) with Ei, outgoing Le)

17 Point-wise Light Reflection: BRDF Bidirectional Reflectance Distribution Function Fr(θi, φi, θe, φe) = Le(θe, φe) / Ei(θi, φi) Still a ratio of (outgoing/incoming) light, but BRDF: Ratio of outgoing RADIANCE in one direction Le(θe, φe) that results from incoming IRRADIANCE in one direction Ei(θi, φi) Units are tricky: BRDF = Fr = Le / Ei = (Watts/area/sr) / (Watts/area) = 1/sr (Figure: hemisphere Ω over patch dA, incoming (θi, φi) with Ei, outgoing Le)

18 Point-wise Light Reflection: BRDF Bidirectional Reflectance Distribution Function Fr(θi, φi, θe, φe) = Le(θe, φe) / Ei(θi, φi), with units of 1/sr ‘Bidirectional’ because the value is the SAME if we swap in and out directions: (θe, φe) ↔ (θi, φi) Important Property! aka ‘Helmholtz Reciprocity’ BRDF results from the surface’s microscopic structure... Still only an approximation: ignores subsurface scattering... (Figure: hemisphere Ω over patch dA, incoming (θi, φi) with Ei, outgoing Le)
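A tiny sketch of the BRDF as that ratio (the Lambertian model, the 0.8 albedo, and the helper names are illustrative choices, not from the slides):

import math

def lambertian_brdf(albedo):
    # A perfectly diffuse BRDF is a constant, albedo / pi, with units 1/sr.
    return lambda theta_i, phi_i, theta_e, phi_e: albedo / math.pi

def exitant_radiance(f_r, E_i, theta_i, phi_i, theta_e, phi_e):
    # The defining ratio from the slide, rearranged: L_e = F_r * E_i  (Watts/area/sr).
    return f_r(theta_i, phi_i, theta_e, phi_e) * E_i

f_r = lambertian_brdf(0.8)
# Helmholtz reciprocity: swapping incoming and outgoing directions leaves F_r unchanged.
print(math.isclose(f_r(0.3, 1.0, 1.1, 2.0), f_r(1.1, 2.0, 0.3, 1.0)))   # True
print(exitant_radiance(f_r, 100.0, 0.3, 1.0, 1.1, 2.0))                 # ~25.46 W/(m^2 sr)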

19 Scattering Difficulties: For many surfaces, single-point BRDFs do not exist Example: leaf structure Angles depend on refractive index, scattering, cell wall structures, etc. Depends on total area of cell wall interfaces (Figure: leaf cross-section; hemisphere Ω over patch dA, incoming (θi, φi) with Ei, outgoing Le)

20 Subsurface Scattering Models Classical: Kubelka-Munk (1930s, for paint; many proprietary variants) CG approach: Hanrahan & Krueger (1990s) More recent: ‘dipole model’ (2001, Jensen) (Figures: Marble BSSRDF vs. Marble BRDF)

21 Subsurface Scattering Models Classical: Kubelka-Munk (1930s, for paint; many proprietary variants) CG approach: Hanrahan & Krueger (1990s) More recent: ‘dipole model’ (2001, Jensen) (Figures: Skin BSSRDF (approximated) vs. Skin BRDF (measured))

22 BSSRDF Model Approximates the scattering result as embedded point sources below a BRDF surface: BSSRDF: “A Practical Model for Subsurface Light Transport”, Henrik Wann Jensen, Steve Marschner, Marc Levoy, Pat Hanrahan, SIGGRAPH ’01 (online)

23 BSSRDF Model Embedded point sources below a BRDF surface Ray-based, tested, Physically-Measurable Model ?Useful as a predictive model for IBMR data? (Figure: Wann Jensen et al., 2001)

24 Summary: Light Measurement Flux W = power, Watts, # photons/sec Irradiance E = Watts/area = dW/dA Radiance L = (Watts/area)/sr = (dW/dA)/sr BRDF: Measure EMITTED radiance that results from INCOMING irradiance from just one direction: BRDF = Fr = Le / Ei = (Watts/area/sr) / (Watts/area) = 1/sr

25 IBMR Tools Digital Light Input: –Light meter: measure visible irradiance E (some have a plastic ‘dome’ to ensure accurate foreshortening) –Camera: pixels measure Radiance Li; flux arriving at the lens from one (narrow solid) angle Digital Light Output: –Luminaires: point lights, extended (area) sources –Emissive Surfaces: CRT, LCD surface –Projectors: laser dot, stripe, scan; video display Light Modifiers (Digital?): –Calibration objects, shadow sources, etc. –Lenses, diffusers, filters, reflectors, collimators... –?Where are the BRDF displays / printers?

26 Two Big Missing Pieces Computer-controlled BRDF. –Can we really do without it? –Are cameras and projectors enough to ‘import the visible world’ into our computers? BRDF is not enough: –Subsurface scattering is a crucial aspect of photographed images –?How can we model it? measure it? use it?

27 More help: GREAT explanation of BRDF: www.cs.huji.ac.il/~danix/advanced/RenderingEq.pdf Some questions about measuring light:

28 END

29 Projects? (due Tues May 25!) Let’s discuss them… IBMR---May 13,2004:

30 Summary: Light Measurement Flux W = power, Watts, # photons/sec Irradiance E = Watts/area = dW/dA Radiance L = (Watts/area)/sr = (dW/dA)/sr BRDF: Measure EMITTED radiance that results from INCOMING irradiance from just one direction: BRDF = Fr = Le / Ei = (Watts/area/sr) / (Watts/area) = 1/sr

31 IBMR: Measure, Create, Modify Light How can we measure ‘rays’ of light? Light Sources? Scattered rays? etc. Shape, Position, Movement, BRDF, Texture, Scattering; Emitted Light; Reflected, Scattered Light… Cameras capture a subset of these rays. Digital light sources (Projectors) can produce a subset of these rays.

32 ‘Scene’ modifies Set of Light Rays What measures light rays in, out of scene?

33 Measure Light LEAVING a Scene? Towards a camera?...

34 Measure Light LEAVING a Scene? Towards a camera: Radiance. Light Field Images measure Radiance L(x,y)

35 Measure light ENTERING a Scene? from a (collection of) point sources at infinity?

36 Measure light ENTERING a Scene? from a (collection of) point sources at infinity? ‘Light Map’ Images (texture map light source) describe Irradiance E(x,y)

37 Measure light ENTERING a Scene? leaving a video projector lens? Radiance L ‘Reversed’ Camera: emits Radiance L(x,y)

38 Measure light ENTERING a Scene? from a video projector?—Leaving Lens: Irradiance E Radiance L

39 Cleaner Formulation: –Orthographic camera, positioned on a sphere around the object/scene –Orthographic projector, positioned on a sphere around the object/scene –(and wavelength and time) F(xc, yc, θc, φc, xl, yl, θl, φl, λ, t) ‘Full 8-D Light Field’ (10-D, actually: time, λ) (Figure: camera and projector on the surrounding sphere)

40 Summary: Light Measurement Flux W = power, Watts, # photons/sec Irradiance E = Watts/area = dW/dA Radiance L = (Watts/area)/sr = (dW/dA)/sr BRDF: Measure EMITTED radiance that results from INCOMING irradiance from just one direction: BRDF = Fr = Le / Ei = (Watts/area/sr) / (Watts/area) = 1/sr Lenses map radiance to the image plane (x,y): THUS: Pixel (x,y) must measure Radiance L at (x,y). Well, not exactly; there are distortions!…

41 What do Photos Measure? What We Want What We Get

42

43 Film Response: (digital cameras, video cards too!) approximately linear, but ONLY on log-log axes.

44 Two Key parameters: m == scale == exposure; γ == gamma == ‘contrastyness’
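A sketch of that two-parameter model (reading the slide as Z = m·L^γ, i.e. a straight line of slope γ on log-log axes; the specific form and the sample values are my interpretation, not spelled out on the slide):

def film_response(L, m=100.0, gamma=0.6):
    # log Z = log m + gamma * log L: a straight line on log-log axes.
    # m slides the line up and down (exposure); gamma sets its slope ('contrastyness').
    return m * (L ** gamma)

# Doubling exposure scales every output; raising gamma stretches contrast between radiances.
for L in (0.01, 0.1, 1.0, 10.0):
    print(L, film_response(L), film_response(L, m=200.0), film_response(L, gamma=1.0))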

45 Problem: Map Scene to Display Domain of Human Vision: from ~10^-6 to ~10^+8 cd/m² Range of Typical Displays: from ~1 to ~100 cd/m² (Figure: luminance axis from starlight, moonlight, office light, daylight, to flashbulb, spanning 10^-6 to 10^+8 cd/m², against a 0 to 255 display range)

46

47

48 High-Contrast Image Capture? An open problem! (esp. for video...) Direct (expensive) solution: –Flying Spot Radiometer: brute-force instrument; costly, slow, delicate –Novel Image Sensors: line-scan cameras, logarithmic CMOS circuits, cooled detectors, rate-based detectors... Most widely used idea: multiple exposures Elegant paper (Debevec 1997) describes how (on class website)
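A simplified sketch of the multiple-exposure idea (assuming the camera response is already known and linear, which sidesteps the recovery problem Debevec's paper actually solves; the hat weighting and the bracketing times are illustrative):

import numpy as np

def merge_exposures(images, exposure_times):
    # images: list of float arrays in [0, 1] (linearized); exposure_times: seconds per shot.
    # Weight mid-range pixels most, divide out each exposure, and average the estimates.
    num = np.zeros_like(images[0], dtype=np.float64)
    den = np.zeros_like(images[0], dtype=np.float64)
    for img, t in zip(images, exposure_times):
        w = 1.0 - np.abs(2.0 * img - 1.0)     # 0 at the clipped ends, 1 at mid-gray
        num += w * (img / t)                  # this shot's estimate of scene radiance
        den += w
    return num / np.maximum(den, 1e-8)

# Example with three bracketed shots, roughly one stop apart:
# radiance = merge_exposures([im_short, im_mid, im_long], [1/250, 1/125, 1/60])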

49 Use Overlapped Exposure Values (Figure: luminance axis from starlight, moonlight, office light, daylight, to flashbulb)

50 (Figure: luminance axis from starlight, moonlight, office light, daylight, to flashbulb)

51 Use Overlapped Exposure Values (Figure: luminance axis from starlight, moonlight, office light, daylight, to flashbulb)

52 Use Overlapped Exposure Values What is the camera response curve? And what are the pixel radiances? (See Debevec, SIGGRAPH 1997.) (Figure: luminance axis from starlight to flashbulb; unknown response curve f(logL))

53 Debevec’97 Method STEP 1: --number the images ‘j’, --pick fixed spots (xi, yi) that sample the scene’s radiance values logLi well (Figure: seven exposures j = 0...6; plot of Pixel Value Z vs. logLi with unknown response curve f(logL))

54 Debevec’97 Method STEP 2: --Collect pixel values Zij (from location i, image j) --(All of them sample the response curve f(logL)…) (Figure: seven exposures j = 0...6; plot of Pixel Value Z vs. logLi)

55 Debevec’97 Method In image Ij, ‘exposure’ changed by log(2^j) = j*log(2) In image Ij, the pixel at (xi, yi) has known pixel value Zij and unknown radiance logLi Film response curve: f(logL) = Z; we know many samples of it: f(log(Li · 2^j)) = Zij, or more simply: f(logLi + j*C) = Zij (Figure: plot of Pixel Value Z vs. logL; samples Zij for j = 0...6; unknown curve f(logL))

56 Debevec’97 Method: It’s another Null-Space Problem… In image Ij, the pixel at (xi, yi) has known pixel value Zij and unknown radiance logLi: f(logLi + j*C) = Zij How do we find f() and logLi? TRICK: Use f() as a scale factor for each pixel value Zij: fij * (logLi + j*C) – Zij = 0 (Figure: plot of Pixel Value Z vs. logLi; samples Zij; curve f(logLi – j*C); scale factors fij)
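A minimal least-squares sketch of that joint solve, patterned after the 'gsolve' system in Debevec & Malik's paper but stripped down (no smoothness term and no hat weighting, both of which the real method includes; without them the system can be rank-deficient for pixel levels that never occur in the samples):

import numpy as np

def recover_response(Z, log_dt, n_levels=256):
    # Z[i, j]: integer pixel value at sample spot i in exposure j (0..n_levels-1).
    # Unknowns: g(0..n_levels-1), the log inverse response, and logE_i per spot.
    # Each sample gives one equation: g(Z_ij) - logE_i = log_dt[j].
    n_spots, n_images = Z.shape
    A = np.zeros((n_spots * n_images + 1, n_levels + n_spots))
    b = np.zeros(A.shape[0])
    row = 0
    for i in range(n_spots):
        for j in range(n_images):
            A[row, Z[i, j]] = 1.0            # +g(Z_ij)
            A[row, n_levels + i] = -1.0      # -logE_i
            b[row] = log_dt[j]
            row += 1
    A[row, n_levels // 2] = 1.0              # pin g(mid-gray) = 0 to fix the arbitrary overall scale
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x[:n_levels], x[n_levels:]        # response curve g over 0..255, and logE per sample spot

# Usage sketch: g, logE = recover_response(Z_samples, np.log([1/250, 1/125, 1/60, 1/30, 1/15]))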

57 Debevec’97 Method: It’s another Null-Space Problem… fij * (logLi + j*C) – Zij = 0 Wait, wait, wait. We have TWO unknowns? NEXT WEEK: Read Debevec’97; we’ll explain how we solve this! (Figure: plot of Pixel Value Z vs. logLi; samples Zij; scale factors fij)

58 Camera Abilities / Limitations Nonlinear Intensity Response: S-shaped (on log-log axes) Low-Contrast Devices: Noise limited (~500:1) Varied Spectral Response: RGB1 != RGB2... Color Sensing Strategies: –3-chip cameras: best, but expensive! –Mosaic sensor: trades resolution for color Nonuniform sensitivity & geometry –Lens limitations (vignetting, radial distortion, bloom/scatter, uneven focus, ...) –CCD Sensor geometry: VERY exact, repeatable

59 Display Abilities / Limitations Nonlinear Intensity Response: S-shaped Low-Contrast Devices –scattering usually sets upper bounds –Best Contrast: laser projectors, some DLP devices, specialized devices... Varied Spectral Response: RGB1 != RGB2... Color Reproducing Strategies: varied... Nonuniform sensitivity & geometry: –CRTs: e-beam cos(θ), distortion, focus, convergence... –LCDs, DLPs: VERY exact (but pixels die, etc.)

60 Light Modifiers? Discuss! Low-Contrast BRDF ‘Devices’ to measure light? –‘Light Probe’ mirror sphere BRDF = ? –Diffuse reflectances limited to about 0.02 to 0.95 –Diffractive materials: complex BRDF may be useful... –(Transmissive LCDs?) ?Can you name more? PRECISELY Linear ‘Response’ to light... BRDFs are fixed ratios; no intensity dependence! Smudge or nick may modify BRDF drastically Shadows? Precision? Inter-reflections? PRECISE input/output symmetry --BUT-- Scattering WITHIN the material can be trouble...

61 What is the complete IBMR toolset? Camera(s) + light probe, etc. → arbitrary Radiance meter. Sphere of Projectors/CRTs → arbitrary Irradiance source. Some (as yet unknown) device → arbitrary BRDF / light ray modifier. Is our toolset complete? Have we spanned the IBMR problem?...

62 Missing the most important tool… Human Visual System. –the receiver/user for MOST IBMR data. –Eye is a very poor light meter, but very good at sensing BRDF and (some) shape. –Eye senses change; integration is used to estimate the world –Eye permits tradeoffs of geometry vs. surface appearance –Eye permits selective radiance distortions, especially to illumination:

63

64 Details Everywhere; segmented partial-ordering of intensities. Local changes matter. Absolute intensities don’t matter much, but boundaries, shading, & CHANGES do. ---WANTED:--- visually important information in machine-readable form. Picture: Copy Appearance

65 Visible Light Measurement ‘Visible Light’ = what our eyes can perceive; –narrow-band electromagnetic energy: λ ≈ 400-700 nm (nm = 10^-9 meter), <1 octave (honey bees: 3-4 ‘octaves’ ?chords?) Not uniformly visible vs. wavelength λ: –Equiluminant Curve defines ‘luminance’ vs. wavelength –eyes sense spectral CHANGES well, but not wavelength –Metamerism http://www.yorku.ca/eye/photopik.htm

66 Visual Appearance Measurement Measurement of Light—easy. Perception?—hard. –‘Color’ == crudely perceived wavelength spectrum –3 sensed dimensions from spectra. –CIE-standard X,Y,Z color spectra: linear coord. system for spectra that spans all perceivable colors –Projective! luminance = Y; chromaticity (x,y) = (X/(X+Y+Z), Y/(X+Y+Z)) –NOT perceptually uniform... (MacAdam’s ellipses...) Many standard texts and tutorials on color –Good: http://www.colourware.co.uk/cpfaq.htm –Good: http://www.yorku.ca/eye/toc.htm –Watt & Watt pg 277-281
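A small sketch of that projective step, using the standard CIE convention (Y carries luminance; the D65 white-point numbers below are just a familiar test value, not from the slides):

def xyz_to_xyY(X, Y, Z):
    # Chromaticity projects (X, Y, Z) onto the plane X + Y + Z = 1;
    # keep (x, y) and carry luminance separately as Y.
    s = X + Y + Z
    if s == 0.0:
        return 0.0, 0.0, 0.0
    return X / s, Y / s, Y

# D65 white point (approximate XYZ): chromaticity comes out near (0.3127, 0.3290).
print(xyz_to_xyY(95.047, 100.0, 108.883))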

67 END

