Digital Image Processing
Lecture 2: Intensity Transformations and Spatial Filtering
Second Semester, Azad University, Islamshahr Branch
Lecture Notes:
- Basic Relationships Between Pixels and Introduction to the Mathematical Tools
- Some Basic Intensity Transformation Functions
- Histogram Processing
- Fundamentals of Spatial Filtering
- Smoothing Spatial Filters
- Sharpening Spatial Filters
- Combining Spatial Enhancement Methods
- Using Fuzzy Techniques for Intensity Transformations
Basic Relationships Between Pixels
1. Neighbors of a pixel p:
- the 4-neighbors of p, denoted by N4(p)
- the 4 diagonal neighbors of p, denoted by ND(p)
- the 8-neighbors of p, denoted by N8(p)
4-neighbors of a pixel
The 4-neighbors of a pixel p at (x, y), denoted by N4(p), are its horizontal and vertical neighbors: f(x, y-1) (top), f(x-1, y) (left), f(x+1, y) (right), and f(x, y+1) (bottom). (Diagram omitted.)
4 diagonal neighbors of a pixel
The diagonal neighbors of a pixel p at (x, y), denoted by ND(p), are f(x-1, y-1) (top-left), f(x+1, y-1) (top-right), f(x-1, y+1) (bottom-left), and f(x+1, y+1) (bottom-right). (Diagram omitted.)
8-neighbors of a pixel
The 8-neighbors of a pixel p at (x, y), denoted by N8(p), are the union of its 4-neighbors and its diagonal neighbors: the pixels at (x-1, y-1), (x, y-1), (x+1, y-1), (x-1, y), (x+1, y), (x-1, y+1), (x, y+1), and (x+1, y+1). (Diagram omitted.)
Basic Relationships Between Pixels: 2. Connectivity
Let V be the set of intensity values used to define connectivity. There are three types of connectivity:
1. 4-connectivity: two pixels p and q with values in V are 4-connected if q is in the set N4(p).
2. 8-connectivity: two pixels p and q with values in V are 8-connected if q is in the set N8(p).
3. m-connectivity (mixed connectivity): two pixels p and q with values in V are m-connected if q is in N4(p), or if q is in ND(p) and the set N4(p) ∩ N4(q) has no pixels whose values are in V.
3. Path
A (digital) path (or curve) from pixel p with coordinates (x0, y0) to pixel q with coordinates (xn, yn) is a sequence of distinct pixels with coordinates (x0, y0), (x1, y1), ..., (xn, yn), where (xi, yi) and (xi-1, yi-1) are connected for 1 ≤ i ≤ n. Here n is the length of the path. If (x0, y0) = (xn, yn), the path is a closed path. We can define 4-, 8-, and m-paths based on the type of connectivity used.
Examples: Adjacency and Path
With V = {1, 2}, the same arrangement of pixels can yield different paths under 8-connectivity and m-connectivity. (Figures omitted.)
4. Connected in S
Let S represent a subset of pixels in an image. Two pixels p with coordinates (x0, y0) and q with coordinates (xn, yn) are said to be connected in S if there exists a path (x0, y0), (x1, y1), ..., (xn, yn) between them consisting entirely of pixels in S.
Let S represent a subset of pixels in an image. For every pixel p in S, the set of pixels in S that are connected to p is called a connected component of S. If S has only one connected component, then S is called a connected set. We call R a region of the image if R is a connected set. Two regions Ri and Rj are said to be adjacent if their union forms a connected set. Regions that are not adjacent are said to be disjoint.
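The definitions above lead directly to connected-component labeling. The following is a minimal sketch (not from the slides), using breadth-first search over either 4- or 8-connectivity; the function name and the list-of-rows image representation are my own choices.

```python
from collections import deque

def connected_components(img, V, conn4=True):
    """Label the connected components of pixels whose values lie in V,
    using 4-connectivity (conn4=True) or 8-connectivity (conn4=False).
    img is a list of rows; returns (component count, label grid)."""
    h, w = len(img), len(img[0])
    if conn4:
        offs = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    else:
        offs = [(dy, dx) for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                if (dy, dx) != (0, 0)]
    labels = [[0] * w for _ in range(h)]
    count = 0
    for y in range(h):
        for x in range(w):
            if img[y][x] in V and labels[y][x] == 0:
                count += 1                      # start a new component
                q = deque([(y, x)])
                labels[y][x] = count
                while q:                        # flood-fill the component
                    cy, cx = q.popleft()
                    for dy, dx in offs:
                        ny, nx = cy + dy, cx + dx
                        if (0 <= ny < h and 0 <= nx < w
                                and img[ny][nx] in V and labels[ny][nx] == 0):
                            labels[ny][nx] = count
                            q.append((ny, nx))
    return count, labels
```

The same pixel set can split into more components under 4-connectivity than under 8-connectivity, since diagonal links are ignored.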
Boundary (or border)
The boundary of a region R is the set of pixels in the region that have one or more neighbors that are not in R. If R happens to be an entire image, then its boundary is defined as the set of pixels in the first and last rows and columns of the image.
Foreground and background
Suppose an image contains K disjoint regions R_k, k = 1, 2, ..., K. Let R_u denote the union of all K regions, and let (R_u)^c denote its complement. All the points in R_u are called the foreground; all the points in (R_u)^c are called the background.
Distance Measures
Given pixels p, q, and z with coordinates (x, y), (s, t), and (u, v) respectively, a distance function D has the following properties:
1. D(p, q) ≥ 0 (D(p, q) = 0 iff p = q)
2. D(p, q) = D(q, p)
3. D(p, z) ≤ D(p, q) + D(q, z)
Distance Measures
The following are different distance measures:
a. Euclidean distance: De(p, q) = [(x - s)^2 + (y - t)^2]^(1/2)
b. City-block distance: D4(p, q) = |x - s| + |y - t|
c. Chessboard distance: D8(p, q) = max(|x - s|, |y - t|)
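The three distance measures can be written in a few lines. This is a small illustration of the formulas above (not from the slides); the function names are my own.

```python
import math

def d_e(p, q):
    """Euclidean distance De(p, q)."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def d_4(p, q):
    """City-block distance D4(p, q)."""
    return abs(p[0] - q[0]) + abs(p[1] - q[1])

def d_8(p, q):
    """Chessboard distance D8(p, q)."""
    return max(abs(p[0] - q[0]), abs(p[1] - q[1]))
```

For any pair of pixels, D8 ≤ De ≤ D4, which matches the shapes of their unit "disks" (diamond, circle, square).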
Introduction to Mathematical Tools in DIP
Array vs. Matrix Operations
An array product is carried out element by element, while a matrix product follows the rules of matrix multiplication. Unless stated otherwise, operations between images in this course are array operations.
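The distinction is easy to see numerically. A minimal NumPy illustration (not from the slides):

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])

elementwise = A * B   # array (Hadamard) product: multiply matching entries
matmul = A @ B        # matrix product: rows of A dotted with columns of B
```

With these inputs, the array product is [[5, 12], [21, 32]], while the matrix product is [[19, 22], [43, 50]].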
Linear vs. Nonlinear Operations
H is said to be a linear operator if H(a f + b g) = a H(f) + b H(g) for any two images f and g and any two scalars a and b; H is said to be a nonlinear operator if it does not meet this qualification.
Arithmetic Operations
Arithmetic operations between images are array operations. The four arithmetic operations are denoted as:
s(x, y) = f(x, y) + g(x, y)
d(x, y) = f(x, y) - g(x, y)
p(x, y) = f(x, y) × g(x, y)
v(x, y) = f(x, y) ÷ g(x, y)
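One practical detail when applying these operations to 8-bit images is overflow: sums can exceed 255 and differences can go negative. A minimal sketch (not from the slides), working in a wider integer type and clipping back to the display range:

```python
import numpy as np

f = np.array([[200, 100], [50, 0]], dtype=np.uint8)
g = np.array([[100, 100], [100, 100]], dtype=np.uint8)

# Compute in int16 so intermediate values are not wrapped, then clip to [0, 255].
s = np.clip(f.astype(np.int16) + g, 0, 255).astype(np.uint8)  # sum image
d = np.clip(f.astype(np.int16) - g, 0, 255).astype(np.uint8)  # difference image
```

Here 200 + 100 saturates to 255 and 50 - 100 saturates to 0, rather than wrapping around as raw uint8 arithmetic would.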
Example: Arithmetic Operations (Subtraction). (Figures omitted.)
Example: Image Multiplication. (Figures omitted.)
Spatial Operations
1. Single-pixel operations: alter the values of an image's pixels based only on their intensity (for example, the negative of an image).
2. Neighborhood operations: the value of each output pixel is determined from the values of the pixels in a neighborhood of the corresponding input pixel (for example, local averaging).
Introduction to Mathematical Tools in DIP
Image Transforms
Given a transform T(u, v), the original image f(x, y) can be recovered using the inverse transformation of T(u, v).
Example: Image Denoising Using the FFT. (Figures omitted.)
Intensity Transformation and Spatial Filtering
Spatial Domain Processes
Spatial domain processes can be denoted by g(x, y) = T[f(x, y)], where f(x, y) is the input image, g(x, y) is the output image, and T is an operator on f defined over a neighborhood of the point (x, y). (Figure omitted.)
1. Intensity Transformation Functions (Point Processing)
When the neighborhood is of size 1 × 1, the output depends only on the intensity at the point, and T becomes an intensity transformation function s = T(r), where r and s denote the intensities of f and g at that point.
Example 1: Image Negatives, with s = L - 1 - r. (Figures omitted.)
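The negative transformation is a one-line point operation. A minimal sketch (not from the slides); the function name is my own, and the computation is done in a wider type to avoid unsigned wrap-around:

```python
import numpy as np

def negative(img, L=256):
    """Image negative: s = L - 1 - r, applied to every pixel."""
    return (L - 1 - img.astype(np.int32)).astype(img.dtype)

img = np.array([[0, 64], [128, 255]], dtype=np.uint8)
neg = negative(img)
```

Applying the transformation twice returns the original image, since s = L - 1 - (L - 1 - r) = r.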
Example 2: Log Transformations, with s = c log(1 + r).
The log function has an important characteristic: it compresses the dynamic range of images (for example, a Fourier spectrum). (Figures omitted.)
Example 3: Power-Law (Gamma) Transformations, with s = c r^γ.
If γ < 1, the function maps a narrow range of dark input values into a wider range of output values; if γ > 1, the function has the opposite effect.
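The log and gamma transformations can both be sketched as point operations on normalized intensities. A minimal illustration (not from the slides); the function names and the choice of scaling constants are my own.

```python
import numpy as np

def log_transform(img, L=256):
    """s = c * log(1 + r), with c chosen so the output spans [0, L-1]."""
    c = (L - 1) / np.log(L)
    return c * np.log1p(img.astype(np.float64))

def gamma_transform(img, gamma, L=256):
    """s = c * r**gamma, with intensities normalized to [0, 1] first."""
    r = img.astype(np.float64) / (L - 1)
    return (L - 1) * r ** gamma
```

With γ < 1 a dark pixel such as 64 is mapped well above 64 (the dark range is stretched), while γ > 1 pushes it toward black.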
Examples 3-1 and 3-2: Gamma Correction with γ < 1; Example 3-3: Gamma Correction with γ > 1. (Figures omitted.)
1. Intensity Transformation Functions (Piecewise-Linear Transformations)
Contrast stretching: a process that expands the range of intensity levels in an image so that it spans the full intensity range of the recording medium or display device.
Intensity-level slicing: highlighting a specific range of intensities in an image is often of interest.
Contrast Stretching. (Figures omitted.)
Intensity-level Slicing
Can be implemented in two ways:
1. Display one value (e.g., white) for all intensities in the range of interest and another value (e.g., black) for all other intensities.
2. Brighten (or darken) the desired range of intensities and leave all other intensity levels unchanged.
Example: Intensity-level Slicing
Highlight the major blood vessels and study the shape of the flow of the contrast medium (to detect blockages, etc.). (Figures omitted.)
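Both slicing variants described above fit in a few lines. A minimal sketch (not from the slides); the function names are my own.

```python
import numpy as np

def slice_binary(img, lo, hi):
    """Variant 1: white for the range of interest, black elsewhere."""
    return np.where((img >= lo) & (img <= hi), 255, 0).astype(np.uint8)

def slice_preserve(img, lo, hi, value=255):
    """Variant 2: brighten the range of interest, leave the rest unchanged."""
    out = img.copy()
    out[(img >= lo) & (img <= hi)] = value
    return out
```

The first variant produces a binary mask of the sliced range; the second keeps the background context intact, which is often preferable in medical images like the angiogram example.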
Histogram Processing:
- Histogram Equalization
- Histogram Matching
- Local Histogram Processing
- Using Histogram Statistics for Image Enhancement
Histogram Processing
The histogram of a digital image with intensity levels in the range [0, L-1] is the discrete function h(r_k) = n_k, where r_k is the k-th intensity value and n_k is the number of pixels with intensity r_k. The normalized histogram is p(r_k) = n_k / MN for an M × N image.
Some Typical Histograms
The shape of a histogram provides useful information for contrast enhancement: a dark image has a histogram concentrated at the low end of the intensity scale, and a bright image at the high end; a low-contrast image has a narrow histogram, while a high-contrast image has a histogram covering a broad range of intensities. (Figures omitted.)
Histogram Equalization
What is histogram equalization? Basic idea: find a map such that the histogram of the modified (equalized) image is flat (uniform). Let us assume for the moment that the input image to be enhanced has continuous gray values, with r = 0 representing black and r = 1 representing white. We need to design a gray-value transformation s = T(r), based on the histogram of the input image, that will enhance the image.
T(r) is assumed to be:
1) a monotonically increasing function for 0 ≤ r ≤ 1 (preserves the order from black to white);
2) a map of [0, 1] into [0, 1] (preserves the range of allowed gray values).
Key motivation: transforming a random variable by its cumulative distribution function (CDF) yields a uniform distribution. We consider the gray values in the input and output images as random variables in the interval [0, 1]. Let p_in(r) and p_out(s) denote the probability densities of the gray values in the input and output images. We consider the transformation s = T(r) = ∫(from 0 to r) p_in(w) dw. Thus, using a transformation function equal to the CDF of the input gray values r, we obtain an image with uniform gray values.
How to implement histogram equalization?
Step 1: For an image with discrete gray values, compute the normalized histogram p(r_k) = n_k / n, where L is the total number of gray levels, n_k is the number of pixels with gray value r_k, and n is the total number of pixels in the image.
Step 2: Based on the CDF, compute the discrete version of the previous transformation: s_k = T(r_k) = Σ(from j = 0 to k) n_j / n, for k = 0, 1, ..., L-1.
Example 1: Consider an 8-level 64 × 64 image with gray values (0, 1, ..., 7). The normalized gray values are (0, 1/7, 2/7, ..., 1). The normalized histogram is given below. (Table omitted.)
(Tables listing each gray value, its pixel count, the fraction of pixels, and the resulting equalization transformation are omitted.)
Example 1 (continued): Notice that there are only five distinct gray levels, (1/7, 3/7, 5/7, 6/7, 1), in the output image. We relabel them as (s0, s1, s2, s3, s4). With this transformation, the output image will have the histogram shown next.
Example 1 (output): histogram of the equalized image. (Figure omitted.)
Example 2: before and after histogram equalization. (Figures omitted.)
Is histogram equalization always good?
No
Histogram Matching (Histogram Specification)
Generate a processed image that has a specified histogram.
Histogram Matching (Procedure)
1. Obtain p_r(r) from the input image and then obtain the values of s (the scaled histogram-equalized values).
2. Use the specified PDF and obtain the transformation function G(z).
3. Map from s to z: for each s, find the z satisfying G(z) = s.
Histogram Matching (example)
Assuming continuous intensity values, suppose that an image has the intensity PDF p_r(r) = 2r / (L-1)^2 for 0 ≤ r ≤ L-1 (and 0 otherwise). Find the transformation function that will produce an image whose intensity PDF is p_z(z) = 3z^2 / (L-1)^3 for 0 ≤ z ≤ L-1 (and 0 otherwise).
Solution: the histogram equalization transformation for the input image is s = T(r) = ∫(from 0 to r) p_r(w) dw = r^2 / (L-1). The histogram equalization transformation for the specified histogram is G(z) = ∫(from 0 to z) p_z(w) dw = z^3 / (L-1)^2. The transformation function is then z = G^(-1)(s) = [(L-1)^2 s]^(1/3) = [(L-1) r^2]^(1/3).
Histogram Matching (Discrete Cases)
1. Obtain p_r(r_j) from the input image, compute the values s_k, and round them to the integer range [0, L-1].
2. Use the specified PDF to obtain the transformation function G(z_q), rounding its values to the integer range [0, L-1].
3. Map each s_k to the smallest z_q such that G(z_q) ≥ s_k.
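The discrete procedure above can be sketched compactly. A minimal NumPy illustration (not from the slides); the function name is my own, and step 3's inverse mapping is done with a sorted search over the rounded G values.

```python
import numpy as np

def match_histogram(img, target_pdf, L=256):
    """Map input intensities so the output histogram approximates target_pdf."""
    hist = np.bincount(img.ravel(), minlength=L)
    s = np.round((L - 1) * np.cumsum(hist) / img.size)   # equalize the input
    G = np.round((L - 1) * np.cumsum(target_pdf))        # equalize the target
    # Step 3: for each s_k, the smallest z_q with G(z_q) >= s_k.
    z = np.searchsorted(G, s)
    return z[img]
```

As a sanity check, matching a uniform 4-level image to a PDF concentrated on the two highest levels should produce only those levels in the output.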
Example: Histogram Matching
Suppose that a 3-bit image (L = 8) of size 64 × 64 pixels (MN = 4096) has the intensity distribution shown in one table and a specified histogram given in a second table (both omitted). Find the histogram-matching transformation function and produce the output image with the specified histogram.
First obtain the scaled histogram-equalized values s_k, then compute all the values of the transformation function G(z_q); finally, map each s_k to the corresponding z_q. (Worked tables and figures omitted.)
Local Histogram Processing
1. Define a neighborhood and move its center from pixel to pixel.
2. At each location, compute the histogram of the points in the neighborhood.
3. Obtain either a histogram equalization or a histogram specification transformation function.
4. Map the intensity of the pixel centered in the neighborhood.
5. Move to the next location and repeat the procedure.
Using Histogram Statistics for Image Enhancement
Two statistics derived from the histogram are useful for enhancement: the average intensity (mean) m = Σ r_k p(r_k), and the variance σ^2 = Σ (r_k - m)^2 p(r_k), which measures contrast. (Example figures omitted.)
Spatial Filtering: Introduction
Some neighborhood operations work with the values of the image pixels in the neighborhood and the corresponding values of a sub-image that has the same dimensions as the neighborhood. The sub-image is called a filter, mask, kernel, template, or window. The values in a filter sub-image are referred to as coefficients rather than pixels. These neighborhood operations consist of:
1. Defining a center point, (x, y).
2. Performing an operation that involves only the pixels in a predefined neighborhood about that center point.
3. Letting the result of that operation be the "response" of the process at that point.
4. Repeating the process for every point in the image.
The process of moving the center point creates new neighborhoods, one for each pixel in the input image. The two principal terms used to identify this operation are neighborhood processing or spatial filtering, with the second term being more common. If the computations performed on the pixels of the neighborhoods are linear, the operation is called linear spatial filtering; otherwise it is called nonlinear spatial filtering.
The mechanics of linear spatial filtering with a 3 × 3 mask: the response at (x, y) is
R = w(-1,-1) f(x-1, y-1) + w(-1,0) f(x-1, y) + ... + w(0,0) f(x, y) + ... + w(1,0) f(x+1, y) + w(1,1) f(x+1, y+1)
There are two closely related concepts that must be understood clearly when performing linear spatial filtering. One is correlation; the other is convolution. Correlation is the process of passing the mask w by the image array f. Mechanically, convolution is the same process, except that w is rotated by 180 degrees prior to passing it by f.
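The 180-degree rotation is the only difference between the two operations, which is easy to demonstrate with an impulse image. A minimal sketch (not from the slides), using zero padding; the function names are my own.

```python
import numpy as np

def correlate2d(f, w):
    """Correlation: slide the mask w over f (zero padding) and sum products."""
    kh, kw = w.shape
    ph, pw = kh // 2, kw // 2
    fp = np.pad(f, ((ph, ph), (pw, pw)))
    out = np.zeros(f.shape)
    for y in range(f.shape[0]):
        for x in range(f.shape[1]):
            out[y, x] = np.sum(fp[y:y + kh, x:x + kw] * w)
    return out

def convolve2d(f, w):
    """Convolution: correlation with w rotated by 180 degrees."""
    return correlate2d(f, w[::-1, ::-1])
```

Correlating an impulse with an asymmetric mask yields the mask rotated by 180 degrees, while convolution reproduces the mask itself; this is the classic way to tell the two apart.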
(Illustrations of correlation vs. convolution omitted.)
Spatial Filtering: Smoothing Spatial Filters
Smoothing filters are used for blurring and for noise reduction. Blurring is used to remove small details and to bridge small gaps in lines or curves. Smoothing spatial filters include linear filters and nonlinear filters. The output (response) of a smoothing linear spatial filter is simply the average of the pixels contained in the neighborhood of the filter mask.
Smoothing Spatial Filters (Examples). (Figures omitted.)
Spatial Filtering: Smoothing Spatial Filters (Example)
Using a spatial averaging filter blends small objects with the background, so larger objects become easier to detect.
Spatial Filtering: Order-Statistic (Nonlinear) Filters
These filters are nonlinear and are based on ordering (ranking) the pixels contained in the filter mask, replacing the value of the center pixel with the value determined by the ranking result. Examples include the median filter, max filter, and min filter. The median filter provides excellent noise-reduction capabilities with considerably less blurring than a linear smoothing filter of similar size.
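The median filter described above can be sketched directly. A minimal illustration (not from the slides); the function name is my own, and edges are handled by replicate padding.

```python
import numpy as np

def median_filter(img, k=3):
    """Replace each pixel by the median of its k x k neighborhood
    (edge pixels handled by replicate padding)."""
    p = k // 2
    fp = np.pad(img, p, mode='edge')
    out = np.empty_like(img)
    for y in range(img.shape[0]):
        for x in range(img.shape[1]):
            out[y, x] = np.median(fp[y:y + k, x:x + k])
    return out
```

A single salt-noise pixel in an otherwise flat region is removed completely, since it never reaches the middle of the ranked neighborhood; an averaging filter of the same size would instead smear it into its neighbors.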
Order-Statistic Filters (Example). (Figures omitted.)
Spatial Filtering: Sharpening Spatial Filters
The principal objective of sharpening is to highlight transitions in intensity. Sharpening filters are based on first and second derivatives.
A first derivative: 1) must be zero in flat segments; 2) must be nonzero at the onset of a gray-level step or ramp; 3) must be nonzero along ramps.
A second derivative: 1) must be zero in flat areas; 2) must be nonzero at the onset and end of a gray-level step or ramp; 3) must be zero along ramps of constant slope.
Spatial Filtering: Sharpening Spatial Filters
Comparing the responses of first- and second-order derivatives:
1) First-order derivatives produce thicker edges.
2) Second-order derivatives have a stronger response to fine detail, such as thin lines and isolated points.
3) First-order derivatives generally have a stronger response to a gray-level step.
4) Second-order derivatives produce a double response at step changes in gray level.
In general, the second derivative is better suited than the first derivative for image enhancement. The principal use of the first derivative is edge extraction.
Spatial Filtering: Sharpening Spatial Filters (Laplace Operator)
The simplest isotropic second-order derivative operator is the Laplacian; for a function (image) f(x, y), ∇²f = ∂²f/∂x² + ∂²f/∂y². In discrete form, ∇²f(x, y) = f(x+1, y) + f(x-1, y) + f(x, y+1) + f(x, y-1) - 4 f(x, y).
The discrete Laplacian is commonly implemented with the 3 × 3 mask [[0, 1, 0], [1, -4, 1], [0, 1, 0]] or its 8-neighbor variant. The basic way to sharpen an image using the Laplacian is g(x, y) = f(x, y) + c ∇²f(x, y), where c = -1 if the center coefficient of the mask is negative and c = 1 if it is positive.
Spatial Filtering: Unsharp Masking and High-Boost Filtering
A process for sharpening images that consists of subtracting an unsharp (smoothed) version of an image from the original image. It consists of the following steps:
1. Blur the original image.
2. Subtract the blurred image from the original (the difference is called the mask).
3. Add the mask to the original.
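The three steps above can be sketched as follows. A minimal illustration (not from the slides); the function names are my own, a simple box blur stands in for the smoothing step, and the weight k generalizes step 3 (k = 1 gives unsharp masking; k > 1 gives high-boost filtering).

```python
import numpy as np

def box_blur(img, size=3):
    """Step 1: blur with a size x size averaging window (replicate padding)."""
    p = size // 2
    fp = np.pad(img.astype(float), p, mode='edge')
    out = np.zeros(img.shape)
    for y in range(img.shape[0]):
        for x in range(img.shape[1]):
            out[y, x] = fp[y:y + size, x:x + size].mean()
    return out

def unsharp_mask(img, k=1.0):
    """Steps 2 and 3: mask = f - blur(f); g = f + k * mask."""
    f = img.astype(float)
    mask = f - box_blur(f)
    return np.clip(f + k * mask, 0, 255)
```

As with the Laplacian, flat regions are unchanged (the mask is zero there) and step edges are over- and undershot, sharpening their appearance.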
Homework 2:
1. Take a gray-level image, compute its histogram, and plot it. Calculate the mean, variance, and histogram of your image. Show the histogram as a bar plot next to your image (use the subplot command). Indicate the size of your image.
2. A 4 × 4 image is given (table omitted). The image is transformed using the point transform shown. Find the pixel values of the output image.
Homework 3:
1-a) Equalize the histogram of the 8 × 8 image below (image omitted). The image has gray levels 0, 1, ..., 7.
1-b) Show the original image and the histogram-equalized image.
2-a) Perform histogram equalization given the following histogram (table omitted; r = gray level, n = number of occurrences).
2-b) Perform histogram specification of the previous histogram using the specified histogram shown in the following table (table omitted; r = gray level, p = probability of occurrence).