
1 Introduction to Computer Vision Image Texture Analysis
Lecture 12

2 A few examples: morphological processing for background illumination estimation; optical character recognition. Roger S. Gaborski

3 Image with nonlinear illumination
Original Image Thresholded with graythresh

4 Obtain Estimate of Background
background = imopen(I, strel('disk',15)); %GRAYSCALE
figure, imshow(background, [])
figure, surf(double(background(1:8:end,1:8:end))), zlim([0 1]);
Roger S. Gaborski

5 %subtract background estimate from original image
I2 = I - background;
figure, imshow(I2), title('Image with background removed')
level = graythresh(I2);
bw = im2bw(I2, level);
figure, imshow(bw), title('threshold')
Roger S. Gaborski
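
Putting slides 3-5 together, a minimal end-to-end sketch (the file name, grayscale conversion, and variable names are assumptions, not the lecture's actual script):

I = imread('text_image.png');               % placeholder file name (assumed)
if size(I,3) == 3, I = rgb2gray(I); end     % grayscale conversion assumed
I = im2double(I);
background = imopen(I, strel('disk',15));   % estimate the slowly varying background
I2 = I - background;                        % remove the nonuniform illumination
level = graythresh(I2);                     % Otsu threshold on the corrected image
bw = im2bw(I2, level);
figure, imshow(bw), title('threshold after background removal')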

6 Comparison Original Threshold Background Removal - Threshold
Roger S. Gaborski

7 Optical Character Recognition
After segmenting a character we still need to recognize the character. How do we determine if a matrix of pixels represents an ‘A’, ‘B’, etc? Roger S. Gaborski

8 Roger S. Gaborski

9 Roger S. Gaborski

10 Approach: select a line of text, segment each letter, recognize each letter as 'A', 'B', 'C', etc. Roger S. Gaborski

11 Select line 3. Samples of segmented individual letters in line 3: Roger S. Gaborski

12 We need labeled samples of each potential letter to compare to the unknown.
Take the product of the unknown character and each labeled character and determine which labeled character is the closest match. Roger S. Gaborski

13 %Load Database of characters (samples of known characters)
load charDB.mat
whos char
  Name    Size    Bytes    Class     Attributes
  char      x               double
EACH ROW IS A VECTORIZED CHARACTER BITMAP Roger S. Gaborski

14 BasicOCR.m - CODE SOMETHING LIKE THIS:
cc = ['A' 'B' 'C' 'D' 'E' 'F' 'G' 'H' 'I' 'J' 'K' 'L' 'M' 'N' 'O' ...
      'P' 'Q' 'R' 'S' 'T' 'U' 'V' 'W' 'X' 'Y' 'Z'];
% First, convert the matrix of the text character to a row vector t
for j = 1:26
    score(j) = sum(t .* charR(j,:));
end
ind(i) = find(score == max(score));
fprintf('Recognized Text %s, \n', cc(ind))
OUTPUT: Recognized Text HANSPETERBISCHOF, Roger S. Gaborski
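
For reference, a cleaned-up, self-contained sketch of the same dot-product matching idea; letters (a cell array of segmented, template-sized character bitmaps) and charR (the 26-row template matrix from slide 13) are assumed names and may differ from the lecture's BasicOCR.m:

load charDB.mat                          % assumed to provide charR (26 x N templates)
cc = 'A':'Z';
recognized = blanks(numel(letters));
for i = 1:numel(letters)
    t = double(letters{i}(:))';          % vectorize the unknown character bitmap
    score = zeros(1, 26);
    for j = 1:26
        score(j) = sum(t .* charR(j,:)); % product with each labeled template
    end
    [bestScore, ind] = max(score);       % closest match wins
    recognized(i) = cc(ind);
end
fprintf('Recognized Text %s\n', recognized);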

15 How can I segment this image?
Assumption: uniformity of intensities in local image region Roger S. Gaborski University of Bonn

16 What is Texture? Roger S. Gaborski University of Bonn

17 Roger S. Gaborski

18 Edge Detection, Histogram, Threshold (graythresh) Roger S. Gaborski

19 Roger S. Gaborski

20 Roger S. Gaborski

21 Roger S. Gaborski

22 >> lev = graythresh(I)
lev = 0.5647
>> figure, imshow(I<lev) Roger S. Gaborski

23 What is Texture? No formal definition.
There is significant variation in intensity levels between nearby pixels. Variations of intensities form certain repetitive patterns (homogeneous at some spatial scale). The local image statistics are constant or slowly varying. Human visual system: textures are perceived as homogeneous regions, even though textures do not have uniform intensity. The definition is formed by different people depending on the particular application, and there is no generally agreed upon definition. Roger S. Gaborski

24 Texture
Apparent homogeneous regions, e.g. a brick wall or sand on a beach: in both cases the HVS will interpret areas of sand or bricks as a 'region' in an image, but close inspection will reveal strong variations in pixel intensity. Roger S. Gaborski

25 Texture is a property of a 'group of pixels'/area; a single pixel does not have texture. It is scale dependent: at different scales texture takes on different properties. It involves a large number of (if not countless) primitive objects; if the objects are few, then a group of countable objects is perceived instead of texture. It involves the spatial distribution of intensities: 2D histograms, co-occurrence matrices. Roger S. Gaborski

26 Scale Dependency
Scale is important - consider sand. Close up: 'small rocks, sharp edges'; farther away: 'rough looking surface', then 'smoother'; far away: 'one object, brown/tan color'. Roger S. Gaborski

27 Terms (Properties) Used to Describe Texture
Coarseness, roughness, direction, frequency, uniformity, density. How would you describe dog fur, cat fur, grass, wood grain, pebbles, cloth, steel? Roger S. Gaborski

28 “The object has a fine grain and a smooth surface”
Can we define these terms precisely in order to develop a computer vision recognition algorithm? Roger S. Gaborski

29 Features
Tone - based on pixel intensity in the texture primitive. Structure - spatial relationships between primitives. A pixel can be characterized by the tonal/structural properties of the group of pixels it belongs to. Roger S. Gaborski

30 Tonal: average intensity, maximum intensity, minimum intensity, size, shape.
Spatial relationship of primitives: random, pair-wise dependent. Roger S. Gaborski

31 Artificial Texture Roger S. Gaborski

32 Artificial Texture: segmenting into regions based on texture Roger S. Gaborski

33 Color Can Play an Important Role in Texture Roger S. Gaborski

34 Color Can Play an Important Role in Texture Roger S. Gaborski

35 Statistical and Structural Texture
Consider a brick wall. Statistical pattern: the close-up pattern within the bricks. Structural (syntactic) pattern: the brick pattern on the previous slides can be represented by a grammar, such as ababab. Roger S. Gaborski

36 Most current research focuses on statistical texture
Edge density is a simple texture measure: edges per unit distance. Segment objects based on edge density. HOW DO WE ESTIMATE EDGE DENSITY?? Roger S. Gaborski

37 Move a window across the image and count the number of edges in the window.
ISSUE - window size? How large should the window be? What are the tradeoffs? How does window size affect the accuracy of segmentation? Segment objects based on edge density. Roger S. Gaborski

38 Move a window across the image and count the number of edges in the window.
ISSUE - window size? How large should the window be? Large enough to get a good estimate of edge density. What are the tradeoffs? Larger windows result in larger overlap between textures. How does window size affect the accuracy of segmentation? Smaller windows result in better region segmentation accuracy, but a poorer estimate of edge density. Segment objects based on edge density. Roger S. Gaborski

39 Average Edge Density Algorithm
Smooth image to remove noise. Detect edges by thresholding the image. Count edges in an n x n window and assign the count to the window. Feature vector: [gray level value, edge density]. Segment the image using the feature vector. Roger S. Gaborski
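
A sketch of this algorithm, assuming imGray is a double grayscale image; the Gaussian/Sobel choices and variable names are mine, not the lecture's:

n = 15;                                               % window size (a tradeoff, see slide 38)
h = fspecial('gaussian', [5 5], 1.0);
Ismooth = imfilter(imGray, h, 'replicate');           % 1. smooth to remove noise
E = edge(Ismooth, 'sobel');                           % 2. binary edge map
edgeDensity = conv2(double(E), ones(n)/n^2, 'same');  % 3-4. edge count per n x n window
features = cat(3, imGray, edgeDensity);               % 5. feature vector [gray level, edge density]
% 6. segment using the feature vectors, e.g. by clustering them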

40 Run Length Coding Statistics
Runs of 'similar' gray level pixels. Measure runs in the directions 0, 45, 90, 135 degrees.
Y(L, LEV, d), where Y is the number of runs of length L, with gray level value LEV, in direction d. (Example image shown on the slide.) Roger S. Gaborski

41 Example image (gray levels 1-3) with its run-length tables for d = 0 and 45 degrees: rows are gray level LEV (1-3), columns are run length L (1-4). Roger S. Gaborski

42 The same example image and run-length tables for d = 0 and 45 degrees, with the counts filled in. Roger S. Gaborski

43 Run Length Coding
For gray level images with 8 bits, 256 shades of gray give 256 rows; a 1024x1024 image gives up to 1024 columns. Reduce the size of the matrix by quantizing: instead of 256 shades of gray, quantize each 8 levels into one, resulting in 256/8 = 32 rows. Quantize runs into ranges: runs 1-8 go in the first column, 9-16 in the second, and so on, resulting in 128 columns. Roger S. Gaborski
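
A minimal sketch of building the 0-degree run-length matrix for an image with integer gray levels 1..numLevels (my own illustration, not lecture code; the quantization described above can be applied afterwards):

function Y = runLengthMatrix0(I, numLevels, maxRun)
% Y(LEV, L) counts runs of length L of gray level LEV along rows (d = 0 degrees).
Y = zeros(numLevels, maxRun);
[rows, cols] = size(I);
for r = 1:rows
    c = 1;
    while c <= cols
        lev = I(r, c);
        runLen = 1;
        while c + runLen <= cols && I(r, c + runLen) == lev
            runLen = runLen + 1;
        end
        Y(lev, min(runLen, maxRun)) = Y(lev, min(runLen, maxRun)) + 1;
        c = c + runLen;              % jump to the start of the next run
    end
end
end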

44 Gray Level Co-occurrence Matrix, P[i,j]
Specify a displacement vector d = (dx, dy). Count all pairs of pixels separated by d having gray level values i and j. Formally:
P(i, j) = |{ ((x1, y1), (x2, y2)) : (x2, y2) = (x1 + dx, y1 + dy), I(x1, y1) = i, I(x2, y2) = j }| Roger S. Gaborski
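
A brute-force sketch of this definition for an image with integer gray levels 0..G-1 and displacement d = (dx, dy); the function and variable names are mine, and indices are shifted by +1 because MATLAB arrays are 1-based:

function P = glcmByCounting(I, dx, dy, G)
P = zeros(G, G);
[rows, cols] = size(I);
for y1 = 1:rows
    for x1 = 1:cols
        x2 = x1 + dx;  y2 = y1 + dy;
        if x2 >= 1 && x2 <= cols && y2 >= 1 && y2 <= rows
            i = I(y1, x1);  j = I(y2, x2);
            P(i+1, j+1) = P(i+1, j+1) + 1;   % count the pair (i, j)
        end
    end
end
% optional: P = P / sum(P(:));  to normalize by the number of pairs counted
end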

45 Gray Level Co-occurrence Matrix
Consider a simple image with gray level values 0, 1, 2. Let d = (1,1): one pixel right, one pixel down. Roger S. Gaborski

46 Count all pairs of pixels in which the first pixel has value i and the second pixel, displaced by d, has value j: increment P(1,0), P(2,1), etc. Roger S. Gaborski

47 Co-occurrence Matrix, P[i,j]: there are 16 pairs, so normalize P(i, j) by 16. Roger S. Gaborski

48 Uniform Texture, d=(1,1). Let Black = 1, White = 0. P[i,j]: P(0,0) = ? Roger S. Gaborski

49 Uniform Texture, d=(1,1). Let Black = 1, White = 0. P(0,0) = 24 Roger S. Gaborski

50 Uniform Texture, d=(1,0). Let Black = 1, White = 0. P[i,j]: P(0,0) = ? Roger S. Gaborski

51 Uniform Texture, d=(1,0). Let Black = 1, White = 0. P(0,0) = 0 Roger S. Gaborski

52 Randomly Distributed Texture
What if the black and white pixels were randomly distributed? What will matrix P look like? With no preferred set of gray level pairs, matrix P will have an approximately uniform population. Roger S. Gaborski

53 Co-occurrence Features
Gray Level Co-occurrence Matrices (GLCM). Typically GLCMs are calculated at four different angles: 0, 45, 90 and 135 degrees. For each angle, different distances can be used: d = 1, 2, 3, etc. The size of the GLCM of an 8-bit image is 256x256 (2^8 gray levels); quantizing the image will result in smaller matrices, e.g. a 6-bit image will result in 64x64 matrices. 14 features can be calculated from each GLCM; the features are used for texture calculations. Roger S. Gaborski

54 Co-occurrence Features
P(ga, gb, d, t): ga is the gray level of pixel 'a', gb is the gray level of pixel 'b', d is the distance, and t is the angle (0, 45, 90, 135). In many applications the transitions ga to gb and gb to ga are both counted. This results in symmetric GLCMs: P(0,0,1,0) then results in an entry of 2 for the '0 0' entry. Roger S. Gaborski
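
In MATLAB's graycomatrix this corresponds to the 'Symmetric' option (a usage sketch, not from the slides):

glcmSym = graycomatrix(I, 'Offset', [0 1], 'Symmetric', true);  % counts ga->gb and gb->ga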

55 Co-occurrence Features
The data in the GLCM are used to derive the features, not the original image data. How do we interpret the contrast equation, Contrast = sum over i,j of (i-j)^2 P(i,j)? Roger S. Gaborski

56 Co-occurrence Features
Contrast measures the local variations in the gray-level co-occurrence matrix. How do we interpret the contrast equation? The (i-j)^2 term is a weighting factor: values along the diagonal (i=j) are multiplied by zero; these values represent adjacent image pixels that do not have a gray level difference. Entries further away from the diagonal represent pixels that have a greater gray level difference, that is, more contrast, and are multiplied by a larger weighting factor. Roger S. Gaborski

57 Co-occurrence Features
Dissimilarity = sum over i,j of |i-j| P(i,j). Dissimilarity is similar to contrast, except that the weights increase linearly. Roger S. Gaborski

58 Co-occurrence Features
Inverse Difference Moment (IDM) = sum over i,j of P(i,j) / (1 + (i-j)^2). IDM has smaller values for images with high contrast and larger values for images with low contrast. Roger S. Gaborski

59 Co-occurrence Features
Angular Second Moment (ASM) = sum over i,j of P(i,j)^2; it measures orderliness: how regular or orderly the pixel values are in the window. Energy is the square root of ASM. Entropy = -sum over i,j of P(i,j) ln P(i,j), where ln(0) is taken as 0. Roger S. Gaborski
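
A sketch computing these features directly from a normalized GLCM; the formulas follow the descriptions on slides 55-59 and the variable names are mine:

glcm = graycomatrix(I, 'Offset', [0 1], 'Symmetric', true);
p = glcm / sum(glcm(:));                         % normalize to probabilities
[ii, jj] = ndgrid(1:size(p,1), 1:size(p,2));     % ii = row index i, jj = column index j
contrastVal      = sum(sum((ii - jj).^2 .* p));
dissimilarityVal = sum(sum(abs(ii - jj) .* p));
idmVal           = sum(sum(p ./ (1 + (ii - jj).^2)));   % inverse difference moment
asmVal           = sum(sum(p.^2));                      % angular second moment
energyVal        = sqrt(asmVal);
entropyVal       = -sum(p(p > 0) .* log(p(p > 0)));     % ln(0) treated as 0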

60 Matlab Texture Filter Functions
rangefilt - calculates the local range of an image.
stdfilt - calculates the local standard deviation of an image.
entropyfilt - calculates the local entropy of a grayscale image; entropy is a statistical measure of randomness. Roger S. Gaborski

61 rangefilt A = Symmetrical Padding max = 4, min = 1, range = 3 Roger S. Gaborski

62 rangefilt Results (3x3) A = 1 3 5 5 2 4 3 4 2 6 8 7 3 5 4 6 2 7 2 2
>> R = rangefilt(A) R = Roger S. Gaborski

63 rangefilt Results (5x5) A = 1 3 5 5 2 4 3 4 2 6 8 7 3 5 4 6 2 7 2 2
>> R = rangefilt(A, ones(5)) R = Roger S. Gaborski

64 Original image Roger S. Gaborski

65 Imfilt = rangefilt(Im);
figure, imshow(Imfilt, []), title('Image by rangefilt') Roger S. Gaborski

66 figure, imshow(Imfilt, []), title('Image by stdfilt')
Imfilt = stdfilt(Im); figure, imshow(Imfilt, []), title('Image by stdfilt') Roger S. Gaborski

67 Imfilt = entropyfilt(Im);
figure, imshow(Imfilt, []), title('Image by entropyfilt') Roger S. Gaborski

68 Matlab function: graycomatrix
Computes GLCM of an image glcm = graycomatrix(I) analyzes pairs of horizontally adjacent pixels in a scaled version of I. If I is a binary image, it is scaled to 2 levels. If I is an intensity image, it is scaled to 8 levels. [glcm, SI] = graycomatrix(...) returns the scaled image used to calculate GLCM. The values in SI are between 1 and 'NumLevels'. Roger S. Gaborski

69 Parameters
'Offset' determines the number of co-occurrence matrices generated. Offsets is a q x 2 matrix; each row has the form [row_offset, col_offset], where row_offset specifies the number of rows between the pixel of interest and its neighbor, and col_offset specifies the number of columns between the pixel of interest and its neighbor. Roger S. Gaborski

70 Offset [0,1] specifies the neighbor one column to the right.
Angle   Offset
0       [0 D]
45      [-D D]
90      [-D 0]
135     [-D -D] Roger S. Gaborski

71 Orientation of offset: the figure illustrates the array offset = [0 1; -1 1; -1 0; -1 -1], i.e. 0 degrees: [0,1], 45: [-1,1], 90: [-1,0], 135: [-1,-1]. Roger S. Gaborski
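
For example, all four directions at distance D can be requested in one call (a usage sketch):

D = 1;
offsets = [0 D; -D D; -D 0; -D -D];          % 0, 45, 90, 135 degrees
glcms = graycomatrix(I, 'Offset', offsets);  % returns one GLCM per row of offsets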

72 Intensity Image mat2gray Convert matrix to intensity image.
I = mat2gray(A,[AMIN AMAX]) converts the matrix A to the intensity image I. The returned matrix I contains values in the range 0.0 (black) to 1.0 (white).

73 graycomatrix Example (from textbook, p. 649)
>> f = [ 1 1 7 5 3 2; ... ]
f =
Need to convert to an intensity image [0,1]. Roger S. Gaborski

74 >> fm = mat2gray(f) fm = 0 0 0.8571 0.5714 0.2857 0.1429
Roger S. Gaborski

75 Quantize to 8 Levels IS = Roger S. Gaborski

76 >> offsets = [0 1];
>> [GS, IS] = graycomatrix(fm, 'NumLevels', 8, 'Offset', offsets)
GS = (see next page) Roger S. Gaborski

77 GS = IS = Roger S. Gaborski

78 'GrayLimits': a two-element vector, [low high], that specifies how the grayscale values in I are linearly scaled into gray levels. Grayscale values less than or equal to low are scaled to 1; grayscale values greater than or equal to high are scaled to NumLevels. If GrayLimits is set to [], graycomatrix uses the minimum and maximum grayscale values in the image as limits, [min(I(:)) max(I(:))].
>> [GS, IS] = graycomatrix(f,'NumLevels', 8, 'Offset', offsets, 'G',[]) Roger S. Gaborski

79 >> [GS, IS] = graycomatrix(f,'NumLevels', 8, 'Offset', offsets, 'G',[])
>> I = rand(5) I = Roger S. Gaborski

80 >> [GS, IS] = graycomatrix(f,'NumLevels', 8, 'Offset', offsets, 'G',[])
IS = Roger S. Gaborski

81 >> [GS, IS] = graycomatrix(f,'NumLevels', 4, 'Offset', offsets, 'G',[])
GS =
IS =
1 1 4 3 2 1
3 1 3 1 1 3
... (the original image quantized to 4 levels) Roger S. Gaborski

82 Texture feature formulas
Energy: provides the sum of squared elements in the GLCM (square root of ASM).
Entropy: measures the uncertainty (variation) of the image.
Contrast: measures the local variations in the gray-level co-occurrence matrix.
Homogeneity: measures the closeness of the distribution of elements in the GLCM to the GLCM diagonal. Roger S. Gaborski

83 glcms = graycomatrix(Im, 'NumLevels', 256, 'G', []);
stats = graycoprops(glcms, {'Contrast','Correlation','Homogeneity'});
figure, plot([stats.Correlation]);
title('Texture Correlation as a function of offset');
xlabel('Horizontal Offset'); ylabel('Correlation') Roger S. Gaborski

84 Texture Measurement: quantize 256 gray levels to 32; take a data window of 31x31 or 15x15; compute GLCM0, GLCM45, GLCM90, GLCM135; compute a feature for each matrix (ENERGY, ENTROPY, CONTRAST, etc.); generate a feature matrix for each (see the sketch below). Roger S. Gaborski
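
A hedged sketch of that per-window computation; the window size, 32 levels, and four directions follow the slide, while the loop structure and variable names are my own (imGray is assumed to be a double image in [0,1], and a practical implementation would be vectorized for speed):

win = 31;  half = floor(win/2);
offsets = [0 1; -1 1; -1 0; -1 -1];              % 0, 45, 90, 135 degrees
[rows, cols] = size(imGray);
contrastMap = zeros(rows, cols);
for r = half+1 : rows-half
    for c = half+1 : cols-half
        w = imGray(r-half:r+half, c-half:c+half);
        g = graycomatrix(w, 'NumLevels', 32, 'GrayLimits', [0 1], 'Offset', offsets);
        s = graycoprops(g, {'Contrast'});
        contrastMap(r, c) = mean(s.Contrast);    % average the feature over the 4 GLCMs
    end
end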

85 image Ideal map Roger S. Gaborski

86 Classmaps generated using the 3 best CO feature images
Roger S. Gaborski

87 Classmaps generated using the 7 best CO feature images. The 31x31 window produces the best results, but with large errors at borders. Roger S. Gaborski

88 Laws' Texture Energy Features
Use texture energy for segmentation. General idea: energy measured within textured regions of an image will produce different values for each texture, providing a means for segmentation. Two-part process: generate 2D kernels from 5 basis vectors; convolve images with the kernels. Roger S. Gaborski

89 Laws' Kernel Generation
Level  L5 = [  1  4  6  4  1 ]
Edge   E5 = [ -1 -2  0  2  1 ]
Spot   S5 = [ -1  0  2  0 -1 ]
Wave   W5 = [ -1  2  0 -2  1 ]
Ripple R5 = [  1 -4  6 -4  1 ]
To generate kernels, multiply one vector by the transpose of itself or another vector, e.g.:
L5E5 = [ 1 4 6 4 1 ]' * [ -1 -2 0 2 1 ] =
  -1  -2   0   2   1
  -4  -8   0   8   4
  -6 -12   0  12   6
  -4  -8   0   8   4
  -1  -2   0   2   1
25 2D kernels are possible, but only 24 are used; L5L5 is sensitive to mean brightness values and is not used. Roger S. Gaborski
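
A sketch of the two-part process: build the 25 kernels from the basis vectors, convolve, and measure local energy (here the mean absolute response over a 15x15 window, one common variant); variable names are mine:

L5 = [ 1  4  6  4  1];  E5 = [-1 -2  0  2  1];  S5 = [-1  0  2  0 -1];
W5 = [-1  2  0 -2  1];  R5 = [ 1 -4  6 -4  1];
basis = [L5; E5; S5; W5; R5];
energyMaps = cell(5, 5);
for a = 1:5
    for b = 1:5
        kernel = basis(a,:)' * basis(b,:);                             % 5x5 2D kernel
        response = conv2(imGray, kernel, 'same');                      % convolve with the image
        energyMaps{a,b} = conv2(abs(response), ones(15)/225, 'same');  % local energy
    end
end
% energyMaps{1,1} (L5L5) tracks mean brightness and is normally discarded.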

90 Roger S. Gaborski

91 Roger S. Gaborski

92 Roger S. Gaborski

93 textureExample.m Reads in image Converts to double and grayscale
Create energy kernels Convolve with image Create data ‘cube’ Roger S. Gaborski

94 stone_building.jpg Roger S. Gaborski

95 Roger S. Gaborski

96 Roger S. Gaborski

97 Roger S. Gaborski

98 Test 2 Roger S. Gaborski

99 Roger S. Gaborski

100 Roger S. Gaborski

101 Scale
How will scale affect energy measurements? Reduce the image to one quarter size: imGraySm = imresize(imGray, 0.25, 'bicubic'); Roger S. Gaborski

102 Data 'cube'
>> data = cat(3, im(:,:,1), im(:,:,2), im(:,:,3), imL5R5, imR5E5);
>> figure, imshow(data(:,:,1:3))
>> data_value = data(7,12,:)
data_value(:,:,1) = 142
data_value(:,:,2) = 166
data_value(:,:,3) = 194
data_value(:,:,4) = 22
data_value(:,:,5) = 10
Roger S. Gaborski

103 Fractal Dimension
The Hurst coefficient can be used to calculate the fractal dimension of a surface, and the fractal dimension can be interpreted as a measure of texture. Consider the 5-pixel-wide neighborhood (13 pixels) with pixel classes a, b, c, d:
Pixel class   Number   Distance from center
a             1        0 (center)
b             4        1
c             4        1.414
d             4        2
Roger S. Gaborski

104 Fractal Dimension Algorithm
Lay the mask over the original image. Examine the pixels in each of the classes and record the brightest and darkest for each class. The pixel brightness difference (range) for each pixel class is used to generate the Hurst plot: use a least squares fit to construct a ln(distance) vs ln(range) plot. The slope of this line is the Hurst coefficient for the specific pixel. Roger S. Gaborski
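
A per-pixel sketch of this algorithm for the 13-pixel neighborhood above, assuming imGray is a double grayscale image; using the cumulative brightness range out to each distance class is one common interpretation, and the variable names are mine:

dists = [1, sqrt(2), 2];                            % distances of classes b, c, d
[rows, cols] = size(imGray);
H = zeros(rows, cols);                              % Hurst coefficient per pixel
for r = 3:rows-2
    for c = 3:cols-2
        w = imGray(r-2:r+2, c-2:c+2);
        classB = [w(2,3) w(4,3) w(3,2) w(3,4)];     % distance 1
        classC = [w(2,2) w(2,4) w(4,2) w(4,4)];     % distance sqrt(2)
        classD = [w(1,3) w(5,3) w(3,1) w(3,5)];     % distance 2
        groups = {classB, classC, classD};
        vals = w(3,3);                              % start from the center pixel
        ranges = zeros(1, 3);
        for k = 1:3
            vals = [vals groups{k}];
            ranges(k) = max(vals) - min(vals);      % brightest minus darkest so far
        end
        ok = ranges > 0;
        if nnz(ok) >= 2
            pfit = polyfit(log(dists(ok)), log(ranges(ok)), 1);
            H(r, c) = pfit(1);                      % slope = Hurst coefficient
        end
    end
end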

