Digital Image Processing Lecture 12: Basics of Spatial Filtering


Introduction

As mentioned in earlier lectures, some neighborhood operations work with the values of the image pixels in a neighborhood together with the corresponding values of a subimage that has the same dimensions as the neighborhood. The subimage is called a filter, mask, kernel, template, or window. The values in a filter subimage are referred to as coefficients rather than pixels.

Introduction, Cont.

These neighborhood operations consist of:
1. Defining a center point, (x, y).
2. Performing an operation that involves only the pixels in a predefined neighborhood about that center point.
3. Letting the result of that operation be the "response" of the process at that point.
4. Repeating the process for every point in the image.
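These four steps amount to a generic sliding-neighborhood loop. The sketch below is a Python/NumPy translation (the lecture's own examples use MATLAB); the function name, the zero padding, and the example image are illustrative assumptions, not part of the lecture:

```python
import numpy as np

def neighborhood_process(f, size, op):
    """Apply `op` to the size x size neighborhood about every pixel of f."""
    a = size // 2
    fp = np.pad(f, a, mode="constant")       # zero-pad so borders have full neighborhoods
    g = np.zeros(f.shape, dtype=float)
    for x in range(f.shape[0]):              # step 4: repeat for every point
        for y in range(f.shape[1]):          # step 1: center point (x, y)
            nb = fp[x:x + size, y:y + size]  # step 2: predefined neighborhood
            g[x, y] = op(nb)                 # step 3: response at (x, y)
    return g

img = np.array([[1, 2, 3],
                [4, 5, 6],
                [7, 8, 9]], dtype=float)
print(neighborhood_process(img, 3, np.mean))  # 3 x 3 neighborhood averaging
```

Passing a different `op` (maximum, median, and so on) changes the filter without changing the sliding mechanics, which is the point of the four-step description above.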

Introduction, Cont.

The process of moving the center point creates new neighborhoods, one for each pixel in the input image. The two principal terms used to identify this operation are neighborhood processing and spatial filtering, the second being the more common. If the computations performed on the pixels of the neighborhoods are linear, the operation is called linear spatial filtering; otherwise it is called nonlinear spatial filtering.

The Mechanics of Spatial Filtering

(Figure: the mechanics of linear spatial filtering with a mask sliding over an image.)

Linear Spatial Filtering

For linear spatial filtering, the response is given by a sum of products of the filter coefficients and the corresponding image pixels in the area spanned by the filter mask. For the 3 x 3 mask shown in the previous figure, the result (response), R, of linear filtering with the filter mask at a point (x, y) in the image is:

R = w(-1,-1) f(x-1, y-1) + w(-1,0) f(x-1, y) + ... + w(0,0) f(x, y) + ... + w(1,0) f(x+1, y) + w(1,1) f(x+1, y+1)

which is the sum of products of the mask coefficients with the corresponding pixels directly under the mask.

Linear Spatial Filtering

Note in particular that the coefficient w(0,0) coincides with image value f(x,y), indicating that the mask is centered at (x,y) when the computation of the sum of products takes place. For a mask of size m x n, we assume that m = 2a + 1 and n = 2b + 1, where a and b are nonnegative integers. All this says is that our focus in the following discussion will be on masks of odd sizes, with the smallest meaningful size being 3 x 3 (we exclude from our discussion the trivial case of a 1 x 1 mask). In general, linear filtering of an image f of size M x N with a filter mask of size m x n is given by the expression:

g(x, y) = sum over s = -a..a, t = -b..b of w(s, t) f(x+s, y+t)

Linear Spatial Filtering

There are two closely related concepts that must be understood clearly when performing linear spatial filtering. One is correlation; the other is convolution. Correlation is the process of passing the mask w by the image array f. Mechanically, convolution is the same process, except that w is rotated by 180 degrees prior to passing it by f.

Example: Correlation vs. Convolution

Figure 3.13(a) shows a one-dimensional function, f, and a mask, w; the origin of f is assumed to be its leftmost point. To perform the correlation of the two functions, we move w so that its rightmost point coincides with the origin of f, as shown in Fig. 3.13(b). Note that there are points between the two functions that do not overlap. The most common way to handle this problem is to pad f with as many 0s as are necessary to guarantee that there will always be corresponding points for the full excursion of w past f. This situation is shown in Fig. 3.13(c). We are ready now to perform the correlation. The first value of the correlation is the sum of products of the two functions in the position shown in Fig. 3.13(c); the sum of products is 0 in this case. Next we move w one location to the right and repeat the process, Fig. 3.13(d). The sum of products again is 0. After four shifts, Fig. 3.13(e), we encounter the first nonzero value of the correlation, which is (2)(1) = 2. If we proceed in this manner until w moves completely past f, we get the result in Fig. 3.13(g).
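The sliding mechanics can be checked numerically: NumPy's `np.correlate` slides the mask without rotation, while `np.convolve` rotates it by 180 degrees first. The impulse signal and mask below are made-up stand-ins for Fig. 3.13, not the book's exact values:

```python
import numpy as np

f = np.array([0, 0, 0, 1, 0, 0, 0, 0])   # impulse signal
w = np.array([1, 2, 3, 2, 0])            # asymmetric mask

# mode="full" zero-pads f so every overlap position of w with f
# contributes one output sample, as in the padding step of the text.
corr = np.correlate(f, w, mode="full")
conv = np.convolve(f, w, mode="full")
print(corr)   # correlation with an impulse yields w reversed
print(conv)   # convolution with an impulse yields a copy of w
```

The impulse makes the rotation visible at a glance: correlation deposits the mask backwards around the 1, convolution deposits it as-is.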

Example: Correlation vs. Convolution

Now, to perform convolution, we rotate w by 180 degrees and place its rightmost point at the origin of f, as shown in Fig. 3.13(j). We then repeat the sliding/computing process employed in correlation, as illustrated in Figs. 3.13(k) through (n). The preceding concepts extend easily to images, as illustrated in Fig. 3.14. The origin is at the top left corner of image f(x,y). To perform correlation, we place the bottom rightmost point of w(x,y) so that it coincides with the origin of f(x,y), as illustrated in Fig. 3.14(c). Note the use of 0 padding for the reasons mentioned in the previous example. We then move w(x,y) through all possible locations in which at least one of its pixels overlaps a pixel of the original image f(x,y). For convolution, we simply rotate w(x,y) by 180 degrees and proceed in the same manner as in correlation (Figs. 3.14(f) through (h)).
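The 180-degree relationship is just as easy to verify in two dimensions. In this Python/NumPy sketch, the `correlate2d` helper is a hypothetical name (not a library function) and the impulse image and mask values are made up for illustration:

```python
import numpy as np

def correlate2d(f, w):
    """Zero-padded, same-size 2-D correlation (sliding sum of products)."""
    a, b = w.shape[0] // 2, w.shape[1] // 2
    fp = np.pad(f, ((a, a), (b, b)))     # zero padding, as in the text
    return np.array([[np.sum(w * fp[i:i + w.shape[0], j:j + w.shape[1]])
                      for j in range(f.shape[1])]
                     for i in range(f.shape[0])])

f = np.zeros((5, 5)); f[2, 2] = 1.0      # 2-D impulse image
w = np.arange(1.0, 10.0).reshape(3, 3)   # asymmetric 3 x 3 mask

corr = correlate2d(f, w)                 # mask comes out rotated 180 degrees
conv = correlate2d(f, np.rot90(w, 2))    # rotate first: mask comes out as-is
print(conv[1:4, 1:4])                    # equals w
```

Rotating the mask before correlating reproduces the mask exactly at the impulse, which is the defining behavior of convolution.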

Nonlinear Spatial Filtering

Nonlinear spatial filtering is also based on neighborhood operations, and the mechanics of defining m x n neighborhoods by sliding the center point through an image are the same as discussed for linear spatial filtering. However, whereas linear spatial filtering is based on computing a sum of products (a linear operation), nonlinear spatial filtering is based, as the name implies, on nonlinear operations involving the pixels of a neighborhood. For example, letting the response at each center point be equal to the maximum pixel value in its neighborhood is a nonlinear filtering operation. Another example is a filter that applies a log function to the neighborhood values.
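The maximum-value example mentioned above can be written directly. This is a Python/NumPy sketch with assumed names and zero padding, not the MATLAB toolbox routine the lecture later uses:

```python
import numpy as np

def max_filter(f, size=3):
    """Nonlinear filter: the response at each center point is the maximum
    pixel value in its size x size neighborhood (not a sum of products)."""
    a = size // 2
    fp = np.pad(f, a, mode="constant")   # zero-pad the borders
    g = np.empty_like(f)
    for x in range(f.shape[0]):
        for y in range(f.shape[1]):
            g[x, y] = fp[x:x + size, y:y + size].max()
    return g

f = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 9]])
print(max_filter(f))
```

Note that the sliding mechanics are identical to the linear case; only the per-neighborhood operation (`max` instead of a weighted sum) has changed, which is exactly the distinction the text draws.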

Linear Spatial Filtering Implementation using MATLAB

The toolbox implements linear spatial filtering using the function imfilter, with the following syntax:

g = imfilter(f, w, filtering_mode, boundary_option, size_option);

This function filters the multidimensional array f (the image) with the multidimensional filter w (the filter mask). The result g (the filtered image) has the same size and class as f. Each element of the output g is computed using double-precision floating point.
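For readers outside MATLAB, the defaults described here (correlation, zero padding, same-size output, double precision) can be approximated as follows. This Python/NumPy function is a hedged stand-in with an invented name, not a toolbox API:

```python
import numpy as np

def imfilter_like(f, w, mode="corr", x=0.0):
    """Rough stand-in for imfilter's defaults: correlation, constant
    boundary value x (zero padding when x = 0), 'same' output size,
    double-precision arithmetic."""
    if mode == "conv":
        w = np.rot90(w, 2)               # convolution = rotated correlation
    a, b = w.shape[0] // 2, w.shape[1] // 2
    fp = np.pad(f.astype(float), ((a, a), (b, b)), constant_values=x)
    g = np.empty(f.shape)
    for i in range(f.shape[0]):
        for j in range(f.shape[1]):
            g[i, j] = np.sum(w * fp[i:i + w.shape[0], j:j + w.shape[1]])
    return g

f = np.ones((4, 4))
w = np.ones((3, 3)) / 9.0                # averaging mask
g = imfilter_like(f, w)
print(g)      # interior stays 1; borders darken because of zero padding
```

The darkened border of a constant image is a visible side effect of the default zero padding, and is why imfilter also offers other boundary options.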

Linear Spatial Filtering Implementation using MATLAB

filtering_mode:
  'corr' : imfilter performs multidimensional filtering using correlation (the default)
  'conv' : filtering is done using convolution

boundary_option:
  X : input array values outside the bounds of the array are implicitly assumed to have the value X; when no boundary option is specified, imfilter uses X = 0 (zero padding)

Linear Spatial Filtering Implementation using MATLAB

size_option:
  'same' : the output image is the same size as the input image (the default)
  'full' : the output image is the full filtered result, which is larger than the input image
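NumPy's one-dimensional convolution exposes the same 'same' vs. 'full' distinction, which makes the size difference easy to see (a Python illustration of the concept, not imfilter itself; the signal and mask are made up):

```python
import numpy as np

f = np.array([1.0, 2.0, 3.0, 4.0])
w = np.array([1.0, 1.0, 1.0]) / 3.0     # length-3 averaging mask

same = np.convolve(f, w, mode="same")   # output length == len(f)
full = np.convolve(f, w, mode="full")   # length == len(f) + len(w) - 1
print(len(same), len(full))             # 4 6
```

The 'full' result keeps every position where the mask overlaps the signal at all, which is why it is longer than the input by len(w) - 1 samples.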