1
Texture, optics and shading
By Zhi Liu
2
Outline Texture, Optics, Shading
3
Texture What is texture?
A measure of the variation of the intensity of a surface, quantifying properties such as smoothness, coarseness and regularity. It's often used as a region descriptor in image analysis and computer vision. Source: The Free On-line Dictionary of Computing, © Denis Howe
4
Texture Texture is characterized by the spatial distribution of gray levels in a neighborhood. The resolution at which an image is observed determines how its texture is perceived.
5
Example Observing a tiled floor from a large distance, we see only an aggregate pattern.
When the same scene is observed from a closer distance, the individual tiles become visible. This difference motivates the following definition of texture.
6
Texture definition We can define texture as repeating patterns of local variations in image intensity which are too fine to be distinguished as separate objects at the observed resolution. A connected set of pixels satisfying a given gray-level property which occurs repeatedly in an image region constitutes a textured region.
7
Texture example Presentation schedule for EE6358 Computer Vision, Summer 2004
Binary image processing---Amin---June 14, 2004
Region segmentation---Nayan---June 14, 2004
Edge detection---Actul---June 21, 2004
Contouring---Ashish---June 21, 2004
Texture, optics & shading---Liu---June 28, 2004
Color and depth---Amin---June 28, 2004
Curves and surfaces---Jeff---July 5, 2004
Dynamic vision---Rob---July 5, 2004
8
Texture analysis Three primary issues:
texture classification
texture segmentation
shape recovery from texture
9
Texture classification
What is it about? Identifying which of a given set of texture classes a given textured region belongs to. A small example: a particular region in an aerial image may belong to agricultural land, a forest region, or an urban area.
10
Texture classification
It is assumed that the boundaries between regions have already been determined. Properties such as gray-level co-occurrence, contrast, entropy, and homogeneity are used.
11
Texture classification
Micro textures: when the texture primitives are small, statistical methods are useful. Macro textures: when the texture primitives are large, we must first determine the shape and properties of the basic primitives and then the rules that govern their placement.
12
Texture segmentation Concerned with automatically determining the boundaries between the various textured regions in an image. Most statistical methods for computing texture features do not provide accurate measures unless the computations are limited to a single texture region.
13
Shape recovery Image-plane variations such as density, size, and orientation are the cues exploited by shape-from-texture algorithms. For example, the texture gradient determines the orientation of the surface.
14
Statistical methods of texture analysis
Because texture is a spatial property, a one-dimensional histogram cannot characterize it.
15
Measures
Gray-level co-occurrence matrix
Autocorrelation function
We focus on the first measure.
16
Gray-level co-occurrence matrix
Gray-level co-occurrence matrix P[i, j]: counts how often pairs of gray levels i and j occur at pixel pairs separated by the displacement vector d = (dx, dy).
17
Gray-level co-occurrence matrix
The elements of P[i, j] are normalized by dividing each entry by the total number of pixel pairs. The normalized P[i, j] is then treated as a probability mass function, since its entries sum to 1. Another example shows how the matrix captures the spatial distribution of gray levels.
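The counting and normalization described above can be sketched in NumPy; the function name and the toy 4x4 image below are illustrative, not from the slides:

```python
import numpy as np

def cooccurrence(img, d=(0, 1), levels=4):
    """Count pairs of gray levels separated by displacement d = (dy, dx),
    then normalize so the entries form a probability mass function."""
    dy, dx = d
    P = np.zeros((levels, levels), dtype=float)
    rows, cols = img.shape
    for y in range(rows):
        for x in range(cols):
            y2, x2 = y + dy, x + dx
            if 0 <= y2 < rows and 0 <= x2 < cols:
                P[img[y, x], img[y2, x2]] += 1
    return P / P.sum()

# A toy image with four 2x2 blocks of constant gray level.
img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [2, 2, 3, 3],
                [2, 2, 3, 3]])
P = cooccurrence(img, d=(0, 1))   # horizontal neighbor pairs
```

Because the blocks are uniform, the mass concentrates on the diagonal entries P[i, i] and on the few cross-block pairs, illustrating how the matrix captures the spatial distribution of gray levels.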
19
Gray-level co-occurrence matrix
Entropy measures the randomness of the gray-level distribution. Other parameters:
20
Parameters Energy: measures the number of repeated pairs.
Contrast: measures the local contrast of the image.
Homogeneity: essentially tells us the opposite of contrast.
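The formulas for these descriptors were on the original slides as images; the following sketch uses the commonly cited definitions over a normalized co-occurrence matrix P (an assumption, since the slide equations are lost):

```python
import numpy as np

def glcm_features(P):
    """Standard texture descriptors of a normalized co-occurrence matrix P[i, j]."""
    i, j = np.indices(P.shape)
    energy = np.sum(P ** 2)                             # high when pairs repeat
    entropy = -np.sum(P[P > 0] * np.log2(P[P > 0]))     # randomness, in bits
    contrast = np.sum((i - j) ** 2 * P)                 # local intensity variation
    homogeneity = np.sum(P / (1 + np.abs(i - j)))       # tends opposite to contrast
    return energy, entropy, contrast, homogeneity
```

On a perfectly uniform 4x4 matrix the entropy reaches its maximum of 4 bits while the energy drops to its minimum of 1/16, matching the intuition that entropy measures randomness and energy measures repetition.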
21
Autocorrelation
22
Autocorrelation The autocorrelation function can be used to detect repetitive patterns and is a good measure of the fineness or coarseness of a texture:
Coarse texture: the function drops off slowly.
Fine texture: the function drops off rapidly.
Regular texture: the function has peaks and valleys, and for images comprising repetitive texture patterns it exhibits periodic behavior with a period equal to the spacing between adjacent texture primitives.
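As a sketch of this measure (not from the slides), the autocorrelation of an image can be computed via the FFT; note this gives the circular autocorrelation:

```python
import numpy as np

def autocorrelation(img):
    """Normalized circular autocorrelation of an image via the FFT
    (Wiener-Khinchin): autocorrelation = inverse FFT of the power spectrum."""
    f = np.fft.fft2(img - img.mean())
    ac = np.fft.ifft2(f * np.conj(f)).real
    ac /= ac[0, 0]                 # normalize so zero shift has value 1
    return np.fft.fftshift(ac)     # move zero shift to the array center
```

For a perfectly regular texture of vertical stripes with period 2, the function oscillates between +1 and -1 with the same period, exactly the periodic behavior described above.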
23
Structural analysis of ordered texture
Used when the primitives are large enough to be individually segmented and described. First segment the primitives (e.g., the discs, using a simple method such as connected-component labeling), then determine the regular structure. Morphological methods are used when the image is corrupted by noise or other non-repeating random patterns.
24
Example
25
Model-based Methods for texture analysis
Model-based methods construct a model of the image; the model has a set of parameters which determine the properties of the texture. The challenge of this method is estimating those parameters from the observed image.
26
Markov random field Discrete Gauss-Markov random field model:
The gray level of any pixel is modeled as a linear combination of the gray levels of its neighbors plus an additive noise term: f[i, j] = Σ(k,l) h[k, l] f[i − k, j − l] + n[i, j], where the weights h[k, l] are the model parameters.
27
Markov random fields( Contd.)
The procedure for building the model: compute the parameters from the given image texture using the least-squares method. The estimated parameters are then compared with those of the known texture classes to determine the class of the particular texture being analyzed.
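The least-squares step above can be sketched as follows; the neighborhood choice and function name are illustrative assumptions, not from the slides:

```python
import numpy as np

def estimate_gmrf(img, neighbors=((0, 1), (0, -1), (1, 0), (-1, 0))):
    """Least-squares estimate of the weights h[k, l] in the model
    f[i, j] ~ sum over (k, l) of h[k, l] * f[i+k, j+l].
    Interior pixels only; offsets are assumed to lie within +/-1."""
    rows, cols = img.shape
    A, b = [], []
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            A.append([img[i + k, j + l] for k, l in neighbors])
            b.append(img[i, j])
    # lstsq returns the minimum-norm least-squares solution
    h, *_ = np.linalg.lstsq(np.array(A, float), np.array(b, float), rcond=None)
    return h
```

On a planar ramp image f[i, j] = i + j, every pixel is exactly the average of its four neighbors, so the estimated weights come out near 0.25 each.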
28
Contd. When the patterns forming a texture are self-similar at different scales, fractal-based models may be used. What is self-similarity? A pattern is self-similar if it is composed of N copies of itself, each scaled down by a ratio r. Such a texture is characterized by its fractal dimension D, given by D = log N / log(1/r).
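The similarity-dimension formula above can be checked on familiar shapes (the examples are standard illustrations, not from the slides):

```python
import math

def fractal_dimension(N, r):
    """Similarity dimension D = log N / log(1/r) of a pattern made of
    N copies of itself, each scaled down by the ratio r."""
    return math.log(N) / math.log(1 / r)

fractal_dimension(4, 0.5)  # a filled square: 4 half-size copies, D = 2
fractal_dimension(3, 0.5)  # Sierpinski triangle: 3 half-size copies, D < 2
```

A smooth surface patch gives an integer dimension, while textures with structure at many scales give fractional D, which is why D serves as a texture descriptor.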
29
Shape from Texture We can use the variations in the size, shape, and density of texture primitives to estimate surface shape and orientation, recovering 3-D information from a 2-D image.
30
example
31
example
32
What the example tells us: the surface is not parallel to the image plane.
The size of the ellipses decreases---a density gradient. The aspect ratio (of the minor and major diameters of the ellipse) does not remain constant---an aspect-ratio gradient.
33
Ratio at the point (0, 0) Let z be the distance of the disc from the camera center, f the focal length of the camera, α the slant angle, and d the diameter of the disc. At the image center the projected major diameter is f d / z and the minor diameter is f d cos α / z, so the aspect ratio is cos α.
34
Ratio at point (0,y’) (0,y’) = ( 1 – tanθtanα )
(0,y’) = cosα(1-tanθtanα) (1-tanθtanα)
35
Optics Machine vision relies on the pinhole camera model:
it models the geometry of perspective projection but omits depth of field; the view volume is an infinite pyramid.
36
Lens Equation If we define the distance from the optical origin to the image plane as z′, the focal length as f, and the distance from the optical origin to the point in the scene as z, we have the thin-lens equation 1/z + 1/z′ = 1/f.
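Solving the thin-lens equation for the image distance gives a one-line sketch (the distances-positive-on-each-side sign convention is an assumption; conventions vary by text):

```python
def image_distance(z, f):
    """Image distance z' from the thin-lens equation 1/z + 1/z' = 1/f.
    Assumes z and f in the same units and z > f (real image)."""
    return 1.0 / (1.0 / f - 1.0 / z)

image_distance(1000.0, 50.0)  # a scene 1 m away, 50 mm lens: z' just over f
```

As z grows toward infinity, z′ approaches f, which is why distant scenes focus near the focal plane.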
37
Image resolution For most machine vision applications, it is determined by the interplay between pixel spacing and depth of field The space between pixel is A, then the resolution limit is 2A since that is the separation between features that allows two features to be perceived as distinct.
38
Image resolution The resolution of film:
a typical spacing between film grains is 5 μm. The human visual system resolves about one minute of arc, roughly 0.3 mrad. Example: at an arm's length of 40 cm, the smallest resolvable detail is about 0.3 × 10⁻³ × 40 cm = 120 μm.
39
Depth of Field Depth of field is the range of distances that produce acceptably focused images. When a scene point is out of focus, its image is a blur circle of diameter b, which depends on the diameter d of the lens. The near plane is the minimum distance in acceptable focus and the far plane is the maximum.
40
Depth of field So the depth of field D is the difference between the far-plane and near-plane distances.
41
Exposure Exposure is the power per square meter multiplied by time.
The parameter that controls exposure is called the f-stop, given by F = f/d, where f is the focal length and d is the aperture diameter.
42
Shading Shading describes how light reflects from surfaces; from shading we can estimate the shape of a surface.
43
Shading Imaging is the process that maps intensities from points in a scene onto an image plane. Image irradiance is defined as the power per unit area of radiant energy falling on the image plane. Radiance is outgoing; irradiance is incoming.
44
Image irradiance Irradiance is determined by the energy radiated from the corresponding point. We trace the ray back to the surface patch from which the ray was emitted and understand how the light is reflected by surface patch.
45
Image irradiance Two factors determine the radiance reflected by a patch of scene surface:
the illumination falling on the patch, and the fraction of the incident illumination that is reflected by the surface patch. Consider an infinitesimal patch of surface in a scene.
46
Bidirectional reflectance distribution function ( BRDF )
f(θi, φi, θe, φe): the ratio of the radiance reflected in the direction (θe, φe) to the irradiance arriving from the direction (θi, φi), where θ is the polar angle and φ the azimuth. For a surface with no preferred direction (rotational symmetry about the normal), the BRDF depends only on the difference of the azimuths: f(θi, φi, θe, φe) = f(θi, θe, φi − φe).
47
Illumination The total irradiance of the surface patch is obtained by integrating the incident radiance over the hemisphere of incoming directions.
The radiance reflected by the surface is then obtained by weighting the incident illumination by the BRDF.
48
Illumination For a Lambertian surface illuminated by a point source: L(θe, φe) = (I0 / π) cos θs, where θs is the angle between the surface normal and the source direction.
49
Reflectance Lambertian reflectance, specular reflectance, and combinations of the two.
A combined Lambertian and specular BRDF can be written f(θi, φi, θe, φe) = η/π + (1 − η) δ(θe − θi) δ(φe − φi − π) / (sin θi cos θi), where η is the fraction of light reflected diffusely and δ is the unit impulse.
50
Surface Orientation The surface normal of a patch is related to the surface gradient (p, q) = (∂z/∂x, ∂z/∂y): the normal is proportional to (p, q, −1). This means that p and q are the changes in depth z corresponding to unit displacements in x and y, respectively.
51
Surface orientation So the coordinates of a point on a scene surface can be denoted by the image-plane coordinates x and y, and any function or property of the scene surface can be specified in terms of p and q: p = p(x, y), q = q(x, y).
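Turning a gradient (p, q) into a unit surface normal can be sketched as follows; the sign convention n ∝ (p, q, −1) is an assumption, since conventions differ between texts:

```python
import numpy as np

def unit_normal(p, q):
    """Unit surface normal from the surface gradient (p, q) = (dz/dx, dz/dy),
    using the convention that the normal is proportional to (p, q, -1)."""
    n = np.array([p, q, -1.0])
    return n / np.linalg.norm(n)
```

A patch facing the viewer has zero gradient, so its unit normal points straight along the negative z axis.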
52
The reflectance map The combination of scene illumination, surface reflectance, and the representation of surface orientation in viewer-centered coordinates is called the reflectance map. It specifies the brightness of a surface patch at a particular orientation for a given distribution of illumination and surface material.
53
The reflectance map E( x, y ) = R( p, q )
which means that the irradiance (brightness) at point (x, y) in the image plane equals the reflectance-map value for the surface orientation (p, q) of the corresponding point on the scene surface.
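For a Lambertian surface this map has a well-known closed form; the sketch below uses the standard expression with the source direction given by its gradient (ps, qs) (the function name is illustrative):

```python
import numpy as np

def lambertian_R(p, q, ps, qs):
    """Lambertian reflectance map R(p, q): the cosine of the angle between
    the surface normal (p, q, -1) and the source direction (ps, qs, -1),
    clamped at zero for self-shadowed orientations."""
    cos = (1 + p * ps + q * qs) / (
        np.sqrt(1 + p**2 + q**2) * np.sqrt(1 + ps**2 + qs**2))
    return np.maximum(cos, 0.0)
```

R reaches its maximum of 1 exactly when the surface orientation matches the source direction, i.e. (p, q) = (ps, qs), which is the brightest patch in a shaded image.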
54
Shape from shading For fixed illumination and imaging conditions, changes in surface orientation translate into corresponding changes in image intensity. The inverse problem, recovering surface orientation from image intensity, is shape from shading.
55
Reference www.cis.rit.edu/~rlc8222/vision/text.html
Project Title: Analysis of Textured Regions Based on Gray Scale Co-occurrence Matrices: Classification of Tissues Using Texture Information