Digital Image Processing


Digital Image Processing Lecture 2 Intensity Transformations and Spatial Filtering Second Semester Azad University Islamshar Branch Y.ebrahimdoost@iiau.ac.ir

Lecture Notes Basic Relationships Between Pixels, Introduction to the Mathematical Tools, Some Basic Intensity Transformation Functions, Histogram Processing, Fundamentals of Spatial Filtering, Smoothing Spatial Filtering, Sharpening Spatial Filtering, Combining Spatial Enhancement Methods, Using Fuzzy Techniques for Intensity Transformations

Basic Relationships Between Pixels 1. Neighbors of a pixel 4-neighbors of p, denoted by N4(p); 4 diagonal neighbors of p, denoted by ND(p); 8-neighbors of p, denoted by N8(p)

4-neighbors of a pixel The 4-neighbors of a pixel p at (x, y), denoted N4(p), are its horizontal and vertical neighbors: f(x, y-1) (top), f(x-1, y) (left), f(x+1, y) (right), and f(x, y+1) (bottom). (In the slide figure, f(x, y) is the red circle.)

4 diagonal neighbors of a pixel The diagonal neighbors of p, denoted ND(p): f(x-1, y-1) (top-left), f(x+1, y-1) (top-right), f(x-1, y+1) (bottom-left), and f(x+1, y+1) (bottom-right). (In the slide figure, f(x, y) is the yellow circle.)

8-neighbors of a pixel The 8-neighbors of p, denoted N8(p), are the union of the 4-neighbors and the diagonal neighbors: (x-1, y-1), (x, y-1), (x+1, y-1), (x-1, y), (x+1, y), (x-1, y+1), (x, y+1), (x+1, y+1). (In the slide figure, f(x, y) is the yellow circle.)
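The three neighborhood sets can be sketched in a few lines of Python; the coordinate convention follows the slides, and the helper names are ours:

```python
def n4(x, y):
    """4-neighbors N4(p): horizontal and vertical neighbors of (x, y)."""
    return {(x, y - 1), (x - 1, y), (x + 1, y), (x, y + 1)}

def nd(x, y):
    """Diagonal neighbors ND(p)."""
    return {(x - 1, y - 1), (x + 1, y - 1), (x - 1, y + 1), (x + 1, y + 1)}

def n8(x, y):
    """8-neighbors N8(p): the union of N4(p) and ND(p)."""
    return n4(x, y) | nd(x, y)
```

Note that the center pixel (x, y) itself belongs to none of the three sets.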

Basic Relationships Between Pixels 2. Connectivity Let V be the set of intensity values used to define connectivity. There are three types of connectivity: 1. 4-connectivity: two pixels p and q with values in V are 4-connected if q is in the set N4(p).

Basic Relationships Between Pixels 2. Connectivity 2. 8-connectivity: two pixels p and q with values in V are 8-connected if q is in the set N8(p). 3. m-connectivity (mixed): two pixels p and q with values in V are m-connected if (i) q is in N4(p), or (ii) q is in ND(p) and the set N4(p) ∩ N4(q) has no pixels whose values are in V.

Basic Relationships Between Pixels 3. Path A (digital) path (or curve) from pixel p with coordinates (x0, y0) to pixel q with coordinates (xn, yn) is a sequence of distinct pixels with coordinates (x0, y0), (x1, y1), …, (xn, yn), where (xi, yi) and (xi-1, yi-1) are connected for 1 ≤ i ≤ n. Here n is the length of the path. If (x0, y0) = (xn, yn), the path is a closed path. We can define 4-, 8-, and m-paths based on the type of connectivity used.

Examples: Adjacency and Path V = {1, 2}. The same 3×3 array is shown three times:

0 1 1
0 2 0
0 0 1

(the second and third copies of the array illustrate a path between the 1-valued pixels under 8-connectivity and m-connectivity, respectively)

Examples: Adjacency and Path V = {1, 2}

Basic Relationships Between Pixels 4. Connected in S Let S represent a subset of pixels in an image. Two pixels p with coordinates (x0, y0) and q with coordinates (xn, yn) are said to be connected in S if there exists a path (x0, y0), (x1, y1), …, (xn, yn) where every pixel (xi, yi) of the path is in S.

Basic Relationships Between Pixels Let S represent a subset of pixels in an image. For every pixel p in S, the set of pixels in S that are connected to p is called a connected component of S. If S has only one connected component, then S is called a connected set. We call R a region of the image if R is a connected set. Two regions, Ri and Rj, are said to be adjacent if their union forms a connected set. Regions that are not adjacent are said to be disjoint.

Basic Relationships Between Pixels Boundary (or border) The boundary of a region R is the set of pixels in the region that have one or more neighbors that are not in R. If R happens to be an entire image, then its boundary is defined as the set of pixels in the first and last rows and columns of the image. Foreground and background An image contains K disjoint regions Rk, k = 1, 2, …, K. Let Ru denote the union of all the K regions, and let Ru^c denote its complement. All the points in Ru are called the foreground; all the points in Ru^c are called the background.

Basic Relationships Between Pixels Distance Measures Given pixels p, q and z with coordinates (x, y), (s, t), (u, v) respectively, the distance function D has the following properties: D(p, q) ≥ 0 [D(p, q) = 0 iff p = q] D(p, q) = D(q, p) D(p, z) ≤ D(p, q) + D(q, z)

Basic Relationships Between Pixels Distance Measures The following are the different distance measures: a. Euclidean distance: De(p, q) = [(x − s)² + (y − t)²]^(1/2) b. City-block distance: D4(p, q) = |x − s| + |y − t| c. Chessboard distance: D8(p, q) = max(|x − s|, |y − t|)
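A minimal sketch of the three distance measures, with pixels given as (x, y) tuples:

```python
def d_e(p, q):
    """Euclidean distance De(p, q)."""
    return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

def d4(p, q):
    """City-block distance D4(p, q)."""
    return abs(p[0] - q[0]) + abs(p[1] - q[1])

def d8(p, q):
    """Chessboard distance D8(p, q)."""
    return max(abs(p[0] - q[0]), abs(p[1] - q[1]))
```

For any pair of pixels, D8 ≤ De ≤ D4, since the chessboard metric counts only the longer axis and the city-block metric counts both.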

Introduction to Mathematical Tools in DIP Array vs. Matrix Operation Array product (carried out elementwise, pixel by pixel) Matrix product (follows the rules of matrix multiplication)

Introduction to Mathematical Tools in DIP Linear vs. Nonlinear Operation H is said to be a linear operator if H[a f1(x, y) + b f2(x, y)] = a H[f1(x, y)] + b H[f2(x, y)] for all images f1, f2 and all constants a, b; H is said to be a nonlinear operator if it does not meet this condition.

Introduction to Mathematical Tools in DIP Arithmetic Operations Arithmetic operations between images are array operations. The four arithmetic operations are denoted as s(x,y) = f(x,y) + g(x,y) d(x,y) = f(x,y) – g(x,y) p(x,y) = f(x,y) × g(x,y) v(x,y) = f(x,y) ÷ g(x,y)
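A NumPy sketch of these array (elementwise) operations; the sample images are hypothetical, and the cast to int avoids uint8 wraparound:

```python
import numpy as np

f = np.array([[100, 200], [50, 250]], dtype=np.uint8)  # hypothetical image f(x,y)
g = np.array([[40, 210], [60, 50]], dtype=np.uint8)    # hypothetical image g(x,y)

# Widen to a signed type first: uint8 arithmetic wraps around on over/underflow.
s = f.astype(int) + g.astype(int)   # s(x,y) = f + g
d = f.astype(int) - g.astype(int)   # d(x,y) = f - g  (e.g. change detection)
p = f.astype(int) * g.astype(int)   # p(x,y) = f * g  (e.g. masking/shading)

# Clip back into the displayable range [0, 255] before storing as uint8:
s8 = np.clip(s, 0, 255).astype(np.uint8)
```

Division v(x, y) = f ÷ g works the same way, with the extra care of avoiding zero-valued pixels in g.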

Example: Arithmetic Operations (Subtraction)

Example : Image Multiplication

Spatial Operation 1. Single-pixel operations Alter the values of an image’s individual pixels based on their intensity (for example, the negative of an image).

Spatial Operation 2. Neighborhood operations

Introduction to Mathematical Tools in DIP Image Transform Given T(u, v), the original image f(x, y) can be recovered using the inverse transformation of T(u, v).

Image Transform

Example: Image Denoising Using the FFT

Intensity Transformation and Spatial Filtering

Spatial Domain Process

1. Intensity Transformation Function (Point Processing)

Example 1 : Image Negatives

Example 1 : Image Negatives
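The negative transformation s = (L - 1) - r can be sketched in one line of NumPy (the sample values are hypothetical):

```python
import numpy as np

L = 256                                                # number of gray levels
r = np.array([[0, 100], [200, 255]], dtype=np.uint8)   # hypothetical input image
s = (L - 1) - r                                        # negative: s = (L-1) - r
```

Black (0) maps to white (255) and vice versa, which is useful for enhancing white or gray detail embedded in dark regions.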

Example 2 : Log Transformations

Example 2 : Log Transformations The log function has an important characteristic: it compresses the dynamic range of images with large variations in pixel values (for example, a Fourier spectrum).
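A sketch of the log transformation s = c·log(1 + r), with the constant c chosen so that the maximum input level maps to the maximum output level:

```python
import numpy as np

def log_transform(r, L=256):
    """s = c * log(1 + r), with c = (L-1)/log(L) so that r = L-1 maps to L-1."""
    c = (L - 1) / np.log(L)
    return c * np.log1p(r.astype(float))
```

Low intensities are spread out while high intensities are compressed, which is what makes the faint detail of a Fourier spectrum visible.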

Example 3 : Power-Law (Gamma) Transformations If γ < 1, the function maps a narrow range of dark input values into a wider range of output values. If γ > 1, the function has the opposite effect.
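A sketch of the power-law transformation s = c·r^γ on intensities normalized to [0, 1], with c = 1:

```python
import numpy as np

def gamma_transform(r, gamma, L=256):
    """s = c * r**gamma with c = 1, computed on intensities normalized to [0, 1]."""
    rn = r.astype(float) / (L - 1)   # normalize to [0, 1]
    return (L - 1) * rn ** gamma     # apply the power law, rescale to [0, L-1]
```

With γ < 1 a dark midtone is pushed up (the image brightens); with γ > 1 it is pushed down. The endpoints 0 and L-1 are fixed for any γ.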

Example 3 - 1 : Gamma Correction (γ <1 )

Example 3 - 2 : Gamma Correction (γ <1 )

Example 3 - 3 : Gamma Correction (γ >1 )

1. Intensity Transformation Functions (Piecewise-Linear Transformations) Contrast Stretching — a process that expands the range of intensity levels in an image so that it spans the full intensity range of the recording medium or display device. Intensity-level Slicing — highlighting a specific range of intensities in an image is often of interest.

Contrast Stretching

Intensity-level Slicing Can be implemented: 1. By displaying as one value (e.g., white) all intensities in the range of interest, and as another value (e.g., black) all other intensities. 2. By brightening (or darkening) the desired range of intensities and leaving all other intensity levels unchanged.
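Both implementations can be sketched as follows (the threshold values in the test are hypothetical):

```python
import numpy as np

def slice_binary(img, lo, hi):
    """Variant 1: white for the range of interest, black everywhere else."""
    return np.where((img >= lo) & (img <= hi), 255, 0)

def slice_preserve(img, lo, hi):
    """Variant 2: brighten the range of interest, leave other levels unchanged."""
    return np.where((img >= lo) & (img <= hi), 255, img)
```

Variant 1 produces a binary image; variant 2 keeps the overall tonality of the image while still highlighting the range of interest.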

Example: Intensity-level Slicing Highlight the major blood vessels and study the shape of the flow of the contrast medium (to detect blockages, etc.)

Histogram Processing Histogram Equalization Histogram Matching Local Histogram Processing Using Histogram Statistics for Image Enhancement

Histogram Processing

Some Typical Histograms The shape of a histogram provides useful information for contrast enhancement. Dark image

Some Typical Histograms Bright image Low contrast image

Some Typical Histograms High contrast image

Histogram Equalization What is histogram equalization? Basic idea: find a map such that the histogram of the modified (equalized) image is flat (uniform). Let us assume for the moment that the input image to be enhanced has continuous gray values, with r = 0 representing black and r = 1 representing white. We need to design a gray-value transformation s = T(r), based on the histogram of the input image, which will enhance the image.

Histogram Equalization T(r) is assumed to be: 1) a monotonically increasing function for 0 ≤ r ≤ 1 (preserves the order from black to white); 2) a map of [0, 1] into [0, 1] (preserves the range of allowed gray values).

Histogram Equalization Key motivation: transforming a random variable by its own cumulative distribution function (CDF) yields a uniform distribution. We consider the gray values in the input and output images as random variables in the interval [0, 1]. Let p_in(r) and p_out(s) denote the probability densities of the gray values in the input and output images. We consider the transformation s = T(r) = ∫₀ʳ p_in(w) dw. Thus, using a transformation function equal to the CDF of the input gray values r, we can obtain an image with uniform gray values.

Histogram Equalization How to implement histogram equalization? Step 1: For an image with discrete gray values, compute p_r(r_k) = n_k / n, where L is the total number of gray levels, n_k is the number of pixels with gray value r_k, and n is the total number of pixels in the image. Step 2: Based on the CDF, compute the discrete version of the previous transformation: s_k = T(r_k) = Σ_{j=0}^{k} p_r(r_j) = Σ_{j=0}^{k} n_j / n, k = 0, 1, …, L−1.
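The two steps can be sketched in the integer-output variant, where s_k is scaled to 0…L-1 rather than left in [0, 1]:

```python
import numpy as np

def equalize(img, L=256):
    """Discrete histogram equalization: s_k = round((L-1) * CDF(r_k))."""
    hist = np.bincount(img.ravel(), minlength=L)    # n_k for each gray level
    cdf = np.cumsum(hist) / img.size                # sum of n_j / n up to k
    lut = np.round((L - 1) * cdf).astype(np.uint8)  # s_k, rounded to [0, L-1]
    return lut[img]                                 # apply as a lookup table
```

Because the mapping is a lookup table indexed by gray level, it costs O(L) to build and a single pass over the image to apply.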

Histogram Equalization Example 1: Consider an 8-level 64 x 64 image with gray values (0, 1, …,7). The normalized gray values are (0, 1/7, 2/7, …, 1). The normalized histogram is given below:

Histogram Equalization Example 1: Normalized gray value Fraction of # pixels Gray value # pixels

Histogram Equalization Example 1:

Histogram Equalization Example 1: Notice that there are only five distinct gray levels (1/7, 3/7, 5/7, 6/7, 1) in the output image. We will relabel them as (s0, s1, s2, s3, s4). With this transformation, the output image will have the histogram shown on the next slide.

Histogram Equalization Example 1: Histogram of output image # pixels Gray values

Histogram Equalization Example 2: before after

Is histogram equalization always good? No

Histogram Matching (histogram specification) Generate a processed image that has a specified histogram

Histogram Matching (histogram specification)

Histogram Matching (Procedure) Obtain pr(r) from the input image and then obtain the values of s (the scaled histogram-equalized values). Use the specified PDF to obtain the transformation function G(z). Map from s to z.

Histogram Matching (example) Assuming continuous intensity values, suppose that an image has the intensity PDF Find the transformation function that will produce an image whose intensity PDF is

Histogram Matching (example) Find the histogram equalization transformation for the input image Find the histogram equalization transformation for the specified histogram The transformation function

Histogram Matching (Discrete Cases) Obtain pr(rj) from the input image and then obtain the values of sk, round the value to the integer range [0, L-1]. Use the specified PDF and obtain the transformation function G(zq), round the value to the integer range [0, L-1]. Mapping from sk to zq
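The discrete procedure can be sketched as follows; target_pdf is a hypothetical length-L array of probabilities summing to 1, and ties in the s-to-z mapping resolve to the smallest z:

```python
import numpy as np

def match_histogram(img, target_pdf, L=256):
    """Map img so that its histogram approximates target_pdf."""
    hist = np.bincount(img.ravel(), minlength=L)
    s = np.round((L - 1) * np.cumsum(hist) / img.size)  # equalize the input
    G = np.round((L - 1) * np.cumsum(target_pdf))       # equalize the target
    # For each s_k, pick the z_q whose G(z_q) is closest.
    lut = np.array([int(np.argmin(np.abs(G - sk))) for sk in s], dtype=np.uint8)
    return lut[img]
```

Matching an image to a uniform target PDF reduces to ordinary histogram equalization, which is a useful sanity check for an implementation.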

Example: Histogram Matching Suppose that a 3-bit image (L=8) of size 64 × 64 pixels (MN = 4096) has the intensity distribution shown in the following table (on the left). Get the histogram transformation function and make the output image with the specified histogram, listed in the table on the right.

Example: Histogram Matching Obtain the scaled histogram-equalized values, Compute all the values of the transformation function G,

Example: Histogram Matching

Example: Histogram Matching

Example: Histogram Matching

Example: Histogram Matching

Example: Histogram Matching

Example: Histogram Matching

Local Histogram Processing Define a neighborhood and move its center from pixel to pixel At each location, the histogram of the points in the neighborhood is computed. Either histogram equalization or histogram specification transformation function is obtained Map the intensity of the pixel centered in the neighborhood Move to the next location and repeat the procedure

Local Histogram Processing

Using Histogram Statistics for Image Enhancement

Using Histogram Statistics for Image Enhancement Average Intensity Variance

Using Histogram Statistics for Image Enhancement(Example)

Spatial Filtering Introduction Some neighborhood operations work with the values of the image pixels in the neighborhood and the corresponding values of a sub-image that has the same dimensions as the neighborhood. The sub-image is called a filter, mask, kernel, template, or window. The values in a filter sub-image are referred to as coefficients rather than pixels. These neighborhood operations consist of: 1. Defining a center point, (x, y). 2. Performing an operation that involves only the pixels in a predefined neighborhood about that center point. 3. Letting the result of that operation be the “response” of the process at that point. 4. Repeating the process for every point in the image.

Spatial Filtering Introduction The process of moving the center point creates new neighborhoods, one for each pixel in the input image. The two principal terms used to identify this operation are neighborhood processing or spatial filtering, with the second term being more common. If the computations performed on the pixels of the neighborhoods are linear, the operation is called linear spatial filtering; otherwise it is called nonlinear spatial filtering.

Spatial Filtering Introduction The mechanics of linear spatial filtering: the response at (x, y) is R = w(-1,-1) f(x-1, y-1) + w(-1,0) f(x-1, y) + … + w(0,0) f(x, y) + … + w(1,0) f(x+1, y) + w(1,1) f(x+1, y+1)

Spatial Filtering Introduction There are two closely related concepts that must be understood clearly when performing linear spatial filtering. One is correlation; the other is convolution. Correlation is the process of passing the mask w over the image array f. Mechanically, convolution is the same process, except that w is rotated by 180 degrees before being passed over f.
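The two operations can be sketched directly from their definitions, for odd-sized masks with zero padding (an assumption; the slides do not fix a border strategy):

```python
import numpy as np

def correlate2d(f, w):
    """Slide the mask w over f and sum the products at each position."""
    m, n = w.shape
    a, b = m // 2, n // 2
    fp = np.pad(f.astype(float), ((a, a), (b, b)))  # zero padding
    g = np.empty(f.shape)
    for x in range(f.shape[0]):
        for y in range(f.shape[1]):
            g[x, y] = np.sum(w * fp[x:x + m, y:y + n])
    return g

def convolve2d(f, w):
    """Convolution: correlation with the mask rotated by 180 degrees."""
    return correlate2d(f, np.rot90(w, 2))
```

For a symmetric mask the two operations coincide, which is why the distinction only matters for asymmetric kernels.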

Spatial Filtering Introduction

Spatial Filtering Introduction (correlation Vs. convolution)

Spatial Filtering Smoothing Spatial Filters Smoothing filters are used for blurring and for noise reduction. Blurring is used to remove small details and to bridge small gaps in lines or curves. Smoothing spatial filters include linear filters and nonlinear filters. The output (response) of a smoothing, linear spatial filter is simply the average of the pixels contained in the neighborhood of the filter mask.

Spatial Filtering Smoothing Spatial Filters

Spatial Filtering Smoothing Spatial Filters

Spatial Filtering Smoothing Spatial Filters (Example)

Spatial Filtering Smoothing Spatial Filters (Example) Using a spatial averaging filter blends small objects with the background, and larger objects become easier to detect.

Spatial Filtering Order-statistic (Nonlinear) Filters — Nonlinear — Based on ordering (ranking) the pixels contained in the filter mask — Replacing the value of the center pixel with the value determined by the ranking result — E.g., median filter, max filter, min filter — Median filter provides excellent noise reduction capabilities with considerably less blurring than linear smoothing filter of similar size.
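A sketch of the median filter, the most common order-statistic filter (edge-replicated borders are our choice; k is the mask size):

```python
import numpy as np

def median_filter(f, k=3):
    """Replace each pixel with the median of its k x k neighborhood."""
    pad = k // 2
    fp = np.pad(f, pad, mode='edge')  # replicate border pixels
    g = np.empty_like(f)
    for x in range(f.shape[0]):
        for y in range(f.shape[1]):
            g[x, y] = np.median(fp[x:x + k, y:y + k])
    return g
```

Max and min filters follow the same pattern with np.max or np.min in place of np.median. An isolated impulse ("salt" noise) never survives, since it is at most one of the nine ranked values.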

Spatial Filtering Order-statistic (Nonlinear) Filters (Example)

Spatial Filtering Sharpening Spatial Filters The principal objective of sharpening is to highlight transitions in intensity. Sharpening filters are based on first and second derivatives. First derivative: 1) must be zero in flat segments; 2) must be nonzero at the onset of a gray-level step or ramp; 3) must be nonzero along ramps. Second derivative: 1) must be zero in flat areas; 2) must be zero along ramps of constant slope; 3) must be nonzero at the onset and end of a gray-level step or ramp.

Spatial Filtering Sharpening Spatial Filters

Spatial Filtering Sharpening Spatial Filters Comparing the responses of first- and second-order derivatives: 1) First-order derivatives generally produce thicker edges. 2) Second-order derivatives have a stronger response to fine detail, such as thin lines and isolated points. 3) First-order derivatives generally have a stronger response to a gray-level step. 4) Second-order derivatives produce a double response at step changes in gray level. In general, the second derivative is better than the first derivative for image enhancement. The principal use of the first derivative is for edge extraction.

Spatial Filtering Sharpening Spatial Filters (Laplace Operator) The simplest isotropic second-order derivative operator is the Laplacian, which for a function (image) f(x, y) is defined as ∇²f = ∂²f/∂x² + ∂²f/∂y², with the discrete form ∇²f = f(x+1, y) + f(x−1, y) + f(x, y+1) + f(x, y−1) − 4f(x, y).

Spatial Filtering Sharpening Spatial Filters (Laplace Operator)

Spatial Filtering Sharpening Spatial Filters (Laplace Operator) The basic way to sharpen an image using the Laplacian is g(x, y) = f(x, y) + c[∇²f(x, y)], where c = −1 if the center coefficient of the Laplacian mask is negative and c = 1 if it is positive.
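A sketch of Laplacian sharpening with the standard 4-neighbor kernel (an assumption, since the slide figure is not reproduced here); its center coefficient is negative, so c = -1:

```python
import numpy as np

LAP = np.array([[0,  1, 0],
                [1, -4, 1],
                [0,  1, 0]])  # discrete Laplacian, negative center

def laplacian_sharpen(f, c=-1):
    """g(x, y) = f(x, y) + c * Laplacian(f); zero padding at the borders."""
    fp = np.pad(f.astype(float), 1)
    g = np.empty(f.shape)
    for x in range(f.shape[0]):
        for y in range(f.shape[1]):
            g[x, y] = f[x, y] + c * np.sum(LAP * fp[x:x + 3, y:y + 3])
    return np.clip(g, 0, 255)  # clip back into the displayable range
```

In flat regions the Laplacian is zero, so the image is unchanged there; near edges the correction term over- and undershoots, which is what produces the sharpening effect.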

Spatial Filtering Sharpening Spatial Filters (Laplace Operator)

Spatial Filtering Un-sharp Masking and High-boost Filtering A process for sharpening images that consists of subtracting an un-sharp (smoothed) version of an image from the original image. It consists of the following steps: 1. Blur the original image. 2. Subtract the blurred image from the original (the result is the mask). 3. Add the mask to the original.
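The three steps can be sketched as follows; the 3x3 box blur in step 1 is our assumption, since the slides do not fix a particular smoothing filter:

```python
import numpy as np

def unsharp_mask(f, k=1.0):
    """k = 1: unsharp masking; k > 1: high-boost filtering."""
    f = f.astype(float)
    # Step 1: blur with a 3x3 box filter (edge-replicated borders).
    fp = np.pad(f, 1, mode='edge')
    blur = np.empty(f.shape)
    for x in range(f.shape[0]):
        for y in range(f.shape[1]):
            blur[x, y] = fp[x:x + 3, y:y + 3].mean()
    mask = f - blur        # Step 2: subtract the blurred image from the original
    return f + k * mask    # Step 3: add the (weighted) mask back to the original
```

Choosing k > 1 emphasizes the mask relative to the original, which is exactly the high-boost variant mentioned in the slide title.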

Spatial Filtering Un-sharp Masking and High-boost Filtering

Spatial Filtering Un-sharp Masking and High-boost Filtering

Spatial Filtering Un-sharp Masking and High-boost Filtering

Spatial Filtering

Homework 2 : Take a grey-level image, find the histogram of the image, and plot the histogram. Calculate the mean, variance and histogram of your image. Show the histogram as a bar plot next to your image (use the subplot command). Indicate the size of your image. A 4x4 image is given as follows. The image is transformed using the point transform shown. Find the pixel values of the output image.

Homework 3 : 1-a) Equalize the histogram of the 8 × 8 image below. The image has grey levels 0, 1, . . . , 7. 1-b) Show the original image and the histogram-equalized image. 2-a) Perform histogram equalization given the following histogram. (r = gray level, n = number of occurrences)

Homework 3 : 2 – b) Perform histogram specification of the previous histogram using the specified histogram shown in the following table. (r=Gray level, p=probability of occurrences)