Reflectance Map: Photometric Stereo and Shape from Shading

Reflectance Map: Photometric Stereo and Shape from Shading
Image Brightness
Radiometry
Image Formation
Bidirectional Reflectance Distribution Function
Surface Orientation
The Reflectance Map
Shading in Images
Photometric Stereo
Shape from Shading

Introduction We examine the photometric stereo method for recovering the orientation of surface patches from a number of images taken under different lighting conditions. The photometric stereo method is simple to implement, but requires control of the lighting. Shape from shading is the more difficult problem of recovering surface shape from a single image.

Introduction We need to know something about radiometry. We have to learn how image irradiance depends on scene radiance. The detailed dependence of surface reflection on the geometry of incident and emitted rays is given by the bidirectional reflectance distribution function (BRDF). The reflectance map can be derived from that function and the distribution of light sources.

Image Brightness The image of a three-dimensional object depends on its shape, its reflectance properties, and the distribution of light sources.

Radiometry Irradiance: The amount of light falling on a surface is called the irradiance. It is the power per unit area (W·m⁻², watts per square meter) incident on the surface. Radiance: The amount of light radiated from a surface is called the radiance. It is the power per unit area per unit solid angle (W·m⁻²·sr⁻¹, watts per square meter per steradian) emitted from the surface.

Radiometry The solid angle ω subtended by a small patch is proportional to its area A and to the cosine of its angle of inclination θ; it is inversely proportional to the square of its distance R from the origin: ω = A cos θ / R², where θ is the angle between the surface normal of the patch and the line connecting the patch to the origin. Image brightness is determined by the amount of energy an imaging system receives per unit apparent area.
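
A quick worked example (numbers chosen purely for illustration): a patch of area A = 1 cm² = 10⁻⁴ m² at distance R = 1 m, inclined at θ = 60°, subtends ω = A cos θ / R² = 10⁻⁴ × 0.5 / 1² = 5 × 10⁻⁵ sr.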

Image Formation Consider a lens of diameter d at a distance f from the image plane. Let a patch on the surface of the object have area δO, while the corresponding image patch has area δI. Suppose that the ray from the object patch to the center of the lens makes an angle α with the optical axis, and that there is an angle θ between this ray and the surface normal of the patch. The object patch is at a distance −z from the lens, measured along the optical axis. The ratio of the area of the object patch to that of the image patch is determined by the distances of these patches from the lens and by foreshortening.

Image Formation The solid angle of the cone of rays leading to the patch on the object is equal to the solid angle of the cone of rays leading to the corresponding patch in the image. The apparent area of the image patch as seen from the center of the lens is δI cos α, while the distance of this patch from the center of the lens is f/cos α.

Image Formation From these, the solid angle subtended by the image patch at the center of the lens is δI cos α / (f/cos α)². Similarly, the solid angle of the patch on the object as seen from the lens is δO cos θ / (z/cos α)². If these two solid angles are to be equal, we must have δO/δI = (z/f)² · (cos α / cos θ).
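
Carrying the standard thin-lens derivation one step further (a sketch of the usual result, assuming a lens of diameter d and ignoring losses in the optics): the power collected by the lens from the object patch, divided by the area of the image patch, gives the image irradiance equation

E = L · (π/4) · (d/f)² · cos⁴α,

so image irradiance E is proportional to scene radiance L, with a cos⁴α falloff away from the optical axis.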

Bidirectional Reflectance Distribution Function-BRDF Scene radiance depends on the amount of light that falls on a surface and the fraction of the incident light that is reflected. The radiance of a surface will generally depend on the direction from which it is viewed as well as on the direction from which it is illuminated.

Bidirectional Reflectance Distribution Function-BRDF Directions can be described by specifying the angle θ between a ray and the normal, and the angle φ between the perpendicular projection of the ray onto the surface and a reference line on the surface. We can describe these directions in terms of a local coordinate system. The direction of incident and emitted light rays can thus be specified in a local coordinate system using the polar angle θ and the azimuth φ.

Bidirectional Reflectance Distribution Function-BRDF The bidirectional reflectance distribution function is the ratio of the radiance of the surface patch as viewed from the direction (θe, φe) to the irradiance resulting from illumination from the direction (θi, φi).

Bidirectional Reflectance Distribution Function-BRDF Let the amount of light falling on the surface from the direction (θi, φi) (the irradiance) be δE(θi, φi). Let the brightness of the surface as seen from the direction (θe, φe) (the radiance) be δL(θe, φe). The BRDF is simply the ratio of radiance to irradiance: f(θi, φi; θe, φe) = δL(θe, φe) / δE(θi, φi). Fortunately, for many surfaces the radiance is not altered if the surface is rotated about the surface normal. In this case, the BRDF depends only on the difference φe − φi, not on φe and φi separately.
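
As a simple worked example of a BRDF: for an ideal Lambertian surface with albedo ρ, the BRDF is constant,

f(θi, φi; θe, φe) = ρ / π,

independent of both the viewing and the illumination direction; the factor 1/π ensures that, integrated over the hemisphere of emission directions, the surface reflects exactly the fraction ρ of the incident light.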

Surface Orientation A smooth surface has a tangent plane at every point. The surface normal, a unit vector perpendicular to the tangent plane, is used to specify the orientation of this plane. The normal vector has two degrees of freedom, since it is a vector with three components and one constraint: the sum of the squares of the components must equal one.

Surface Orientation A portion of a surface can be described by its perpendicular distance −z from the lens; this distance depends on the lateral displacement (x, y). The surface can therefore be conveniently described in terms of its perpendicular distance −z(x, y) from some reference plane parallel to the image plane.

Surface Orientation The surface normal is perpendicular to all lines in the tangent plane of the surface. As a result, it can be found by taking the cross-product of any two (nonparallel) vectors in the tangent plane. Consider taking a small step δx in the x-direction starting from a given point (x, y); the corresponding change in height is δz = (∂z/∂x) δx.

Surface Orientation We use the abbreviations p and q for the first partial derivatives of z with respect to x and y, respectively. Thus p is the slope of the surface measured in the x-direction, while q is the slope in the y-direction.

Surface Orientation A line parallel to the vector rx = (1, 0, p)T lies in the tangent plane at (x, y). Similarly, a line parallel to ry = (0, 1, q)T also lies in the tangent plane. A surface normal can be found by taking the cross-product of these two vectors: N = rx × ry = (−p, −q, 1)T. Appropriately enough, (p, q) is called the gradient of the surface, since its components, p = ∂z/∂x and q = ∂z/∂y, are the slopes of the surface in the x- and y-directions, respectively.

Surface Orientation The unit surface normal is just n = (−p, −q, 1)T / sqrt(1 + p² + q²). The unit view vector v from the object to the lens is (0, 0, 1)T. We can calculate the angle θe between the surface normal and the direction to the lens by taking the dot product of the two unit vectors: cos θe = n · v = 1 / sqrt(1 + p² + q²).
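
These relations are easy to evaluate numerically. Below is a minimal sketch, assuming a depth map z sampled on a regular grid and using numpy.gradient to estimate the partial derivatives; the function name and parameters are illustrative only.

```python
import numpy as np

def surface_orientation(z, dx=1.0, dy=1.0):
    """Gradient (p, q), unit normals, and cos(theta_e) for a depth map z(x, y).

    z is a 2-D array indexed as z[y, x]; dx and dy are the grid spacings.
    """
    # numpy.gradient returns derivatives along axis 0 (y) first, then axis 1 (x)
    dz_dy, dz_dx = np.gradient(z, dy, dx)
    p, q = dz_dx, dz_dy                          # p = dz/dx, q = dz/dy
    norm = np.sqrt(1.0 + p**2 + q**2)
    # Unit surface normal n = (-p, -q, 1) / sqrt(1 + p^2 + q^2)
    n = np.dstack((-p, -q, np.ones_like(p))) / norm[..., None]
    cos_theta_e = 1.0 / norm                     # n . v with view vector v = (0, 0, 1)
    return p, q, n, cos_theta_e
```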

The Reflectance Map The reflectance map makes explicit the relationship between surface orientation and brightness. It encodes information about surface reflectance properties and light-source distributions. Consider a single distant source with irradiance E illuminating a Lambertian surface. The scene radiance is L = (E/π) cos θi, where θi is the angle between the surface normal and the direction toward the source.

The Reflectance Map Taking the dot product of the corresponding unit vectors, we obtain cos θi = (1 + p·ps + q·qs) / (sqrt(1 + p² + q²) · sqrt(1 + ps² + qs²)), where (ps, qs) is the gradient of a surface patch facing the source, so that the unit vector toward the source is (−ps, −qs, 1)T / sqrt(1 + ps² + qs²). The result, normalized so that its maximum value is one, is called the reflectance map, denoted R(p, q). The reflectance map depends on the properties of the surface material of the object and on the distribution of light sources. Note that radiance cannot be negative, so we should impose the restriction 0 ≤ θi ≤ π/2; the radiance is taken to be zero for values of θi outside this range.
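
A minimal sketch of this reflectance map in code, assuming unit albedo, a single distant source specified by its gradient (ps, qs), and the normalization described above (names are illustrative):

```python
import numpy as np

def lambertian_reflectance_map(p, q, ps, qs):
    """Normalized Lambertian reflectance map R(p, q): cos(theta_i), clamped at zero."""
    cos_theta_i = (1.0 + p * ps + q * qs) / (
        np.sqrt(1.0 + p**2 + q**2) * np.sqrt(1.0 + ps**2 + qs**2))
    # Radiance cannot be negative: patches facing away from the source are dark
    return np.maximum(cos_theta_i, 0.0)
```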

Photometric Stereo Surface orientation can usually be determined uniquely at some special points, such as those where the brightness is a maximum or a minimum of R(p, q). For a Lambertian surface, for example, R(p, q) = 1 only when θi = 0, that is, when (p, q) = (ps, qs). In general, however, the mapping from brightness to surface orientation cannot be unique, since brightness has only one degree of freedom while orientation has two. To recover surface orientation locally, we must introduce additional information: to determine the two unknowns p and q, we need two equations.

Photometric Stereo Two images, taken with different lighting, will yield two equations for each image point: R1(p, q) = E1 and R2(p, q) = E2. If these equations are linear and independent, there will be a unique solution for p and q. Suppose, for example, that both reflectance maps are linear functions of p and q.

Photometric Stereo Then the two linear equations can be solved directly for p and q, provided p1/q1 ≠ p2/q2 (that is, provided the two equations are independent). Thus a unique solution can be obtained for the surface orientation at each point, given two registered images taken under different lighting conditions. This is an illustration of the method of photometric stereo.
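
A minimal sketch of the two-image case, assuming (as one common choice of linear model) reflectance maps of the form R_i(p, q) = (1 + p·p_i + q·q_i) / sqrt(1 + p_i² + q_i²); the function name and arguments are illustrative.

```python
import numpy as np

def photometric_stereo_two_images(E1, E2, p1, q1, p2, q2):
    """Recover (p, q) at one pixel from brightnesses E1, E2 under two light sources.

    Assumes linear reflectance maps R_i(p, q) = (1 + p*p_i + q*q_i) / r_i,
    where (p_i, q_i) is the gradient describing source i and r_i = sqrt(1 + p_i^2 + q_i^2).
    """
    r1 = np.sqrt(1.0 + p1**2 + q1**2)
    r2 = np.sqrt(1.0 + p2**2 + q2**2)
    # Each image gives one linear equation:  p*p_i + q*q_i = E_i*r_i - 1
    A = np.array([[p1, q1],
                  [p2, q2]])
    b = np.array([E1 * r1 - 1.0, E2 * r2 - 1.0])
    # A unique solution exists provided p1/q1 != p2/q2, i.e. det(A) != 0
    p, q = np.linalg.solve(A, b)
    return p, q
```

With more than two light sources, the same idea extends to a per-pixel least-squares solve, which also makes the recovery of the surface normal more robust to noise.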