
Reflectance and Texture of Real-World Surfaces KRISTIN J. DANA Columbia University BRAM VAN GINNEKEN Utrecht University SHREE K. NAYAR Columbia University JAN J. KOENDERINK Utrecht University ACM Transactions on Graphics, Vol. 18, No. 1, January 1999

Overview
–Introduce BRDF and BTF
–BTF Texture Gathering Technique
–CUReT Database
–BTF Applications
–Future Work
–Pretty Pictures

Bidirectional Reflectance Distribution Function (BRDF) Nicodemus [1970] and Nicodemus et al. [1977] Coarse scale level –local surface variations are subpixel –local intensity is uniform Bidirectional: 1.Camera angle 2.Light angle “Objects look different when viewed from different angles and when illuminated from different directions”
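As a minimal sketch of what a BRDF computes (using the view-independent Lambertian model for simplicity; real BRDFs vary with both the camera and light angles, which is exactly what makes them "bidirectional"):

```python
import numpy as np

def lambertian_brdf(albedo):
    """Constant BRDF of an ideal diffuse surface: albedo / pi per steradian."""
    return albedo / np.pi

def radiance(brdf_value, irradiance, light_dir, normal):
    """Reflected radiance = BRDF * incident irradiance * cos(incidence angle)."""
    cos_theta = max(np.dot(light_dir, normal), 0.0)
    return brdf_value * irradiance * cos_theta

n = np.array([0.0, 0.0, 1.0])   # surface normal
l = np.array([0.0, 0.0, 1.0])   # light from directly above
L = radiance(lambertian_brdf(0.5), 1.0, l, n)
```

Tilting the light away from the normal reduces `cos_theta` and hence the reflected radiance, even though the Lambertian BRDF value itself never changes.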

Bidirectional Texture Function (BTF) Fine scale level –surface variations give rise to local intensity variations Bidirectional: 1.Camera angle 2.Light angle “Objects look different when viewed from different angles and when illuminated from different directions”

BRDF vs. BTF
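The contrast can be made concrete in code: a BRDF returns a single value per (view, light) pair, whereas a BTF returns an entire image. The containers and nearest-neighbor lookup below are illustrative assumptions, not the paper's interface:

```python
import numpy as np

def brdf(view, light):
    # one scalar for the whole (flat-looking) surface patch;
    # a Lambertian constant is used here as a placeholder
    return 0.5 / np.pi

def btf(view, light, measurements):
    """Return the measured image nearest to the requested configuration.
    `measurements` maps (view, light) direction pairs to H x W images."""
    key = min(measurements,
              key=lambda k: np.linalg.norm(np.asarray(k[0]) - view)
                          + np.linalg.norm(np.asarray(k[1]) - light))
    return measurements[key]

imgs = {((0, 0, 1), (0, 0, 1)): np.ones((480, 640)),
        ((1, 0, 0), (0, 0, 1)): np.zeros((480, 640))}
img = btf(np.array([0.1, 0.0, 1.0]), np.array([0.0, 0.0, 1.0]), imgs)
```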

Why do we need BTFs? Traditional 2-D texture synthesis and texture mapping do not take into account the change in texture appearance as the viewing and illumination directions change –A single digital image of a rough surface is mapped onto a 3-D object, and the appearance of roughness is usually lost or distorted
Bump mapping [Blinn 1977, 1978] preserves some of the appearance of roughness –knowledge of the surface shape is required –shadows cast by the local surface relief are not rendered
Ray tracing can be used –the exact geometry of the surface must be known –high computational cost
Solid texturing combines volumetric texture synthesis with volume-rendering techniques –computationally intensive –applicable to only a limited variety of textures
BTF database –“potential exists for 3-D texturing algorithms using images, without the need for a volumetric texture model or surface synthesis procedure”

BTF: Where do we start? BRDF databases already exist Employ new techniques to create a BTF database Pull together: –Robot –Lamp –PC –Photometer –Video camera

Texture Gathering Technique Fixed light source –Halogen bulb with a Fresnel lens (single-beam focusing) Camera moves through 7 positions –22.5°, 45°, 67.5°, 90°, 112.5°, 135°, 157.5° from the light source Texture sample moves through multiple orientations –Robot arm orients the sample normal along vertices of a quarter-sphere facing the light source

Texture Gathering Technique At each camera position, the texture is captured with its normal along quarter-sphere vertices Not all vertices are captured at each position –At position 7, only a few normals are actually visible to the camera Quarter-Sphere Orientations: Camera Positions

Texture Gathering Technique The sample lies in the x_s–y_s plane with its global normal pointing in the direction of z_s Each circular marker represents a distinct illumination direction For each of these illumination directions, the sample is imaged from seven viewing directions Quarter-Sphere Orientations: Illumination Directions
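The rig geometry described on the last few slides can be sketched as follows; the 22.5° camera spacing comes from the slides, while the regular grid of sample normals is only an assumption (the actual CUReT vertex set differs):

```python
import numpy as np

# Seven camera positions, 22.5 degrees apart, measured from the light direction.
camera_angles_deg = [22.5 * k for k in range(1, 8)]   # 22.5 ... 157.5

def quarter_sphere_normals(n_theta=4, n_phi=4):
    """Hypothetical regular sampling of sample-normal orientations over a
    quarter sphere; z_s points along the global surface normal."""
    normals = []
    for theta in np.linspace(0.0, np.pi / 2, n_theta):      # tilt from zenith
        for phi in np.linspace(0.0, np.pi / 2, n_phi):      # quarter of azimuth
            normals.append((np.sin(theta) * np.cos(phi),
                            np.sin(theta) * np.sin(phi),
                            np.cos(theta)))
    return normals
```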

Texture Gathering Technique For textures that have grids or grains, measurements are repeated after rotating the sample about z_s by either 45° or 90°, depending on the structure of the anisotropy Examples: –Linen (square grid) rotated 45° –Corduroy (vertical lines) rotated 90° Special Case: Anisotropic Textures

Texture Gathering Technique Relate radiance to pixel values A Kodak standard card is imaged for every sample measured Letting r denote the total radiance and p the average pixel value, a linear relationship was found Data with significant pixel underflow (pixel values near 0) or overflow (pixel values near 255) were not used Control Considerations
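The linear calibration can be sketched with a least-squares fit; the (p, r) pairs below are made-up stand-ins for the Kodak-card measurements, not values from the paper:

```python
import numpy as np

# Hypothetical calibration data: average pixel value p of the standard card
# vs. measured total radiance r.
p = np.array([30.0, 80.0, 130.0, 180.0, 220.0])
r = np.array([0.12, 0.33, 0.54, 0.74, 0.91])

# Discard underflowed / saturated measurements before fitting.
valid = (p > 5) & (p < 250)

# Least-squares fit of the linear model r ~ a * p + b.
a, b = np.polyfit(p[valid], r[valid], 1)

def pixel_to_radiance(pixel_value):
    return a * pixel_value + b
```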

End Product 205 images for each sample 640 × 480 pixels 24 bits per pixel (8 bits per RGB channel) Database total: over 14,000 images (61 samples, 205 measurements per sample, plus 205 additional measurements for each anisotropic sample) CUReT Database (table: number of images per camera position, totaling 205)
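A quick sanity check on the slide's totals: 61 samples × 205 measurements gives 12,505 images, so "over 14,000" implies at least eight additional full measurement sets for anisotropic samples (the exact number of anisotropic samples is not stated on the slide):

```python
import math

base = 61 * 205                                 # core protocol: 12,505 images
extra_sets = math.ceil((14000 - base) / 205)    # extra 205-image sets needed
```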

Columbia-Utrecht Reflectance and Texture Database (CUReT)

BTF Applications Top row –Two images of “plaster_b” with different illumination and viewing directions Bottom row –Spatial spectrum of “plaster_b” with zero frequency at the center and brighter regions corresponding to higher magnitudes –Notice the orientation change: a change of illumination direction changes the shadow direction Computer vision: –Texture-recognition algorithms are often based on the spectral content of image textures –BTF should be considered for recognition of real-world surfaces Sample 11: “plaster_b”
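The spectra on this slide can be reproduced in spirit with a centered 2-D DFT; the striped toy texture below is an assumption standing in for “plaster_b”:

```python
import numpy as np

def centered_spectrum(image):
    """Magnitude of the 2-D DFT with the zero-frequency term at the center,
    as in the slide's spectrum plots."""
    return np.abs(np.fft.fftshift(np.fft.fft2(image)))

# Toy oriented texture: vertical stripes put spectral energy along the
# horizontal frequency axis; rotating the stripes rotates the spectrum,
# mirroring how a shadow-direction change rotates a real texture's spectrum.
x = np.arange(64)
stripes = np.tile(np.sin(2 * np.pi * x / 8), (64, 1))
spec = centered_spectrum(stripes)
```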

BTF Applications The BTF texture-gathering technique also allows easy gathering of BRDF data –Pros: Simple system Simultaneously gathers BRDF and BTF measurements –Cons: Not as accurate as traditional BRDF measurement systems

Future Work Synthesizing Bidirectional Texture Functions for Real-World Surfaces Xinguo Liu, Yizhou Yu, Heung-Yeung Shum 3-step approach to synthetically generate BTFs 1.Recover approximate 3-D geometry of surface details using a shape-from-shading approach 2.Generate a novel version of the geometric details with the same statistical properties as the sample surface 3.Use an “appearance preserving procedure” to synthesize novel images under various viewing/lighting settings, defining a novel BTF

Show me some BTF pictures!!! 13 images per sample are used from the database collection of 205 –1 image of the frontal view –12 oblique views Three pixels are averaged at the section borders to reduce the appearance of seams
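The seam-reduction step can be sketched as follows; the three-pixel border width comes from the slide, while the overlap arithmetic is an assumed interpretation:

```python
import numpy as np

def blend_horizontal(left, right, border=3):
    """Join two equally tall patches, averaging a `border`-wide overlap
    where the sections meet to soften the visible seam."""
    overlap = (left[:, -border:] + right[:, :border]) / 2.0
    return np.hstack([left[:, :-border], overlap, right[:, border:]])

a = np.full((4, 6), 100.0)   # toy "texture section" values
b = np.full((4, 6), 200.0)
joined = blend_horizontal(a, b)
```

The three blended columns take the midpoint value, so the intensity step at the seam is halved instead of abrupt.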

Pretty Pictures Traditional 2-D texture-mapping BTF 3-D texture-mapping Sample 11 (plaster)

Pretty Pictures Traditional 2-D texture-mapping BTF 3-D texture-mapping Sample 8 (pebbles)

Pretty Pictures Traditional 2-D texture-mapping BTF 3-D texture-mapping Sample 45 (concrete)

Pretty Pictures Traditional 2-D texture-mapping BTF 3-D texture-mapping Sample 28 (crumpled paper)

Pretty Pictures Traditional 2-D texture-mapping BTF 3-D texture-mapping Sample 19 (plush rug)

Pretty Pictures Traditional 2-D texture-mapping BTF 3-D texture-mapping Sample 56 (wood) (anisotropic)
