Image Visualization
Outline
9.1. Image Data Representation
9.2. Image Processing and Visualization
9.3. Basic Imaging Algorithms: Contrast Enhancement, Histogram Equalization, Gaussian Smoothing, Edge Detection
9.4. Shape Representation and Analysis: Segmentation, Connected Components, Morphological Operations, Distance Transforms, Skeletonization
Image Data Representation
What is an image? An image is a well-behaved uniform dataset: a two-dimensional array, or matrix, of pixels (e.g., bitmaps, pixmaps, RGB images). A pixel is square-shaped and has a constant value over its entire surface; the value is typically encoded as an 8-bit integer.
Image Processing and Visualization
Image processing can follow the visualization pipeline, e.g., contrast enhancement applied after the rendering operation. Image processing operations may also accompany every step of the visualization pipeline.
Image Processing and Visualization
Basic Image Processing
An image enhancement operation applies a transfer function to the pixel luminance values. The transfer function is usually designed from an analysis of the image histogram: a high-slope function enhances image contrast, while a low-slope function attenuates it.
Image Visualization (continued), Chap. 9, November 12, 2009
Basic Image Processing
The most basic image processing operation is contrast enhancement, performed by applying a transfer function to the pixel values.
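As an illustration (not part of the original slides), a minimal linear contrast-stretch transfer function might look like the following Python/NumPy sketch; the image array img and the percentile-based input range are assumptions:

    import numpy as np

    def contrast_stretch(img, lo, hi):
        """Linear transfer function: map luminances in [lo, hi] to [0, 255].
        A steeper slope (smaller hi - lo) gives stronger contrast enhancement."""
        out = (img.astype(np.float64) - lo) / (hi - lo) * 255.0
        return np.clip(out, 0, 255).astype(np.uint8)

    # Hypothetical usage: stretch the 2nd..98th histogram percentiles to full range
    # lo, hi = np.percentile(img, (2, 98))
    # enhanced = contrast_stretch(img, lo, hi)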
Image Enhancement (figure: linear transfer vs. non-linear transfer)
Histogram Equalization
The goal is that all luminance values cover roughly the same number of pixels. Histogram equalization computes a transfer function such that the resulting image has a near-constant histogram.
Histogram Equalization
Histogram equalization is a technique for adjusting image intensities to enhance contrast. In general, a histogram estimates the probability distribution of a particular type of data. An image histogram offers a graphical representation of the tonal distribution of the gray values in a digital image. By viewing the image's histogram, we can analyze the frequency of appearance of the different gray levels contained in the image.
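As a sketch (assuming an 8-bit grayscale NumPy array img; this is the standard CDF-based mapping, not code from the slides), histogram equalization can be written as:

    import numpy as np

    def equalize(img):
        """Histogram equalization of an 8-bit grayscale image."""
        hist, _ = np.histogram(img.ravel(), bins=256, range=(0, 256))
        cdf = hist.cumsum() / hist.sum()            # cumulative distribution in [0, 1]
        lut = np.round(cdf * 255).astype(np.uint8)  # transfer function (lookup table)
        return lut[img]                             # map every pixel through the CDF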
Histogram Equalization (figure: original image vs. after equalization)
Smoothing
Images are noisy. Noise is anything in the image that we are not interested in; it can be described as rapid variation of high amplitude, or as regions where high-order derivatives of the image function f have large values. Smoothing is often used to reduce noise within an image. How do we remove noise?
Smoothing (figure: noisy image vs. after filtering)
Fourier Transform
The Fourier transform is an important image processing tool used to decompose an image into its sine and cosine components. The output of the transformation represents the image in the Fourier or frequency domain, while the input image is its spatial domain equivalent. In the Fourier domain image, each point represents a particular frequency contained in the spatial domain image.
Spatial Domain vs. Frequency Domain
1. Spatial domain (image enhancement): manipulating or changing an image representing an object in space to enhance the image for a given application. Techniques are based on direct manipulation of pixels in an image; used for filtering basics, smoothing filters, sharpening filters, unsharp masking, and the Laplacian.
2. Frequency domain: techniques are based on modifying the spectral transform of an image. Transform the image to its frequency representation, perform the image processing there, then compute the inverse transform back to the spatial domain.
Frequency Filtering
1. Compute the Fourier transform F(ωx, ωy) of f(x, y).
2. Multiply F by the transfer function Φ to obtain a new function G; e.g., high-frequency components are removed or attenuated.
3. Compute the inverse Fourier transform of G to get the filtered version of f.
Frequency Filtering
The frequency filter function Φ can be classified into three types:
1. Low-pass filter: passes signals with a frequency lower than a certain cutoff frequency and attenuates signals with frequencies higher than the cutoff.
2. High-pass filter: passes signals with a frequency higher than a certain cutoff frequency and attenuates signals with frequencies lower than the cutoff.
3. Band-pass filter: passes frequencies within a certain range and rejects (attenuates) frequencies outside that range.
To remove noise, a low-pass filter is used.
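The three-step recipe above could be sketched as follows (a minimal illustration, not from the slides; it assumes a 2-D grayscale NumPy array f and an ideal low-pass transfer function):

    import numpy as np

    def ideal_lowpass(f, cutoff=0.1):
        """Frequency filtering: FFT, multiply by a low-pass Phi, inverse FFT.
        cutoff is a fraction (0..0.5) of the sampling frequency."""
        F = np.fft.fftshift(np.fft.fft2(f))                        # step 1
        wy = np.fft.fftshift(np.fft.fftfreq(f.shape[0]))[:, None]
        wx = np.fft.fftshift(np.fft.fftfreq(f.shape[1]))[None, :]
        phi = np.sqrt(wx**2 + wy**2) <= cutoff                     # step 2: transfer function
        G = F * phi
        return np.real(np.fft.ifft2(np.fft.ifftshift(G)))          # step 3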
Gaussian Filter
In image processing, a Gaussian blur (also known as Gaussian smoothing) is the result of blurring an image by a Gaussian function. It is a widely used effect in graphics software, typically to reduce image noise and detail. Gaussian smoothing is also used as a pre-processing stage in computer vision algorithms in order to enhance image structures at different scales. The Gaussian outputs a weighted average of each pixel's neighborhood, with the average weighted more towards the value of the central pixels. This is in contrast to the mean filter's uniformly weighted average. Because of this, a Gaussian provides gentler smoothing and preserves edges better than a similarly sized mean filter. The Gaussian filter is a low-pass filter used to remove noise.
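A minimal smoothing example, assuming SciPy is available (a tooling choice, not something the slides prescribe); the synthetic test image is made up for illustration:

    import numpy as np
    from scipy.ndimage import gaussian_filter

    rng = np.random.default_rng(0)
    img = np.zeros((128, 128))
    img[32:96, 32:96] = 1.0                              # bright square on dark background
    noisy = img + 0.3 * rng.standard_normal(img.shape)

    # sigma controls the amount of smoothing: larger sigma removes more noise
    # (and more detail), i.e. acts as a stronger low-pass filter.
    smoothed = gaussian_filter(noisy, sigma=2.0)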
Edge Detection
Edge detection is an image processing technique for finding the boundaries of objects within images. It works by detecting discontinuities in brightness and is used for image segmentation and data extraction in areas such as image processing, computer vision, and machine vision. Edges are curves that separate image regions of different luminance: the points at which image brightness changes sharply are typically organized into a set of curved line segments termed edges.
Origin of Edges
Edges are caused by a variety of factors: depth discontinuity, surface color discontinuity, illumination discontinuity, and surface normal discontinuity.
Edge Detection (figure: original image vs. detected edges)
First Order Derivative Edge Detection
Generally, first-order derivative operators are very sensitive to noise and produce thicker edges.
a.1) Roberts filter: diagonal edge gradients, susceptible to fluctuations. Gives no information about edge orientation and works best with binary images.
a.2) Prewitt filter: the Prewitt operator is a discrete differentiation operator which functions similarly to the Sobel operator, by computing the gradient of the image intensity function. It makes use of the maximum directional gradient. Compared to Sobel, the Prewitt masks are simpler to implement but very sensitive to noise.
a.3) Sobel filter: detects edges where the gradient magnitude is high (sketched below). This makes the Sobel edge detector more sensitive to diagonal edges than to horizontal and vertical edges.
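A hedged sketch of Sobel-based edge detection using SciPy (one possible implementation; the threshold value is an assumption):

    import numpy as np
    from scipy import ndimage

    def sobel_edges(img, threshold=50.0):
        """First-order derivative edge detection with the Sobel operator."""
        gx = ndimage.sobel(img.astype(float), axis=1)   # horizontal gradient
        gy = ndimage.sobel(img.astype(float), axis=0)   # vertical gradient
        magnitude = np.hypot(gx, gy)                    # gradient magnitude
        return magnitude > threshold                    # edge pixels: strong gradient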
2nd Order Derivative Edge Detection
If there is a significant spatial change in the second derivative, an edge is detected; this is good at producing thinner edges. Second-order derivative operators are more sophisticated methods for automated edge detection, but are still very noise-sensitive. As differentiation amplifies noise, smoothing is recommended before applying the Laplacian. In that context, typical examples of second-order derivative edge detection are the Difference of Gaussians (DoG) and the Laplacian of Gaussian (LoG).
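One second-order operator, the Laplacian of Gaussian, could be sketched like this (an illustration assuming SciPy; the sigma value is arbitrary):

    import numpy as np
    from scipy.ndimage import gaussian_laplace

    def log_response(img, sigma=2.0):
        """Laplacian of Gaussian: smooth with a Gaussian, then take the
        Laplacian; edges correspond to zero crossings of this response."""
        return gaussian_laplace(img.astype(float), sigma=sigma)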
Image Visualization (continued), Chap. 9, November 19, 2009
Shape Representation and Analysis (figure: the shape analysis pipeline)
Shape Representation and Analysis
Shape analysis filters high-volume, low-level datasets into low-volume datasets containing high amounts of information. A shape is defined as a compact subset of a given image, characterized by a boundary and an interior. Shape properties include geometry (form, aspect ratio, roundness, or squareness), topology (type, kind, number), and texture (luminance, shading).
Segmentation
Segmentation classifies the image pixels into those belonging to the shape of interest, called foreground pixels, and the remainder, called background pixels. Segmentation results in a binary image and is related to the operation of selection, i.e., thresholding.
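A minimal threshold-based segmentation sketch (the intensity ranges below are hypothetical, chosen only for illustration):

    import numpy as np

    def threshold_segment(img, lo, hi):
        """Binary segmentation: foreground where lo <= value <= hi."""
        return (img >= lo) & (img <= hi)      # True = foreground, False = background

    # Hypothetical usage on a grayscale scan:
    # soft_tissue = threshold_segment(img, 40, 120)
    # hard_tissue = threshold_segment(img, 121, 255)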
(Figure: segmentation examples, finding soft tissue vs. finding hard tissue)
Connected Components
Connected components capture non-local properties. Algorithm: start from a given foreground pixel and find all foreground pixels that are directly or indirectly neighboring it.
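A sketch of this algorithm as a breadth-first flood fill over 4-connected neighbors (an illustrative implementation, not taken from the slides); scipy.ndimage.label offers a ready-made alternative that labels all components at once:

    from collections import deque
    import numpy as np

    def connected_component(binary, seed):
        """All foreground pixels reachable from the seed pixel through
        4-connected neighbors (directly or indirectly)."""
        visited = np.zeros_like(binary, dtype=bool)
        visited[seed] = True
        queue = deque([seed])
        while queue:
            y, x = queue.popleft()
            for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                ny, nx = y + dy, x + dx
                if (0 <= ny < binary.shape[0] and 0 <= nx < binary.shape[1]
                        and binary[ny, nx] and not visited[ny, nx]):
                    visited[ny, nx] = True
                    queue.append((ny, nx))
        return visited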
Morphological Operations
Used to close holes and remove islands in segmented images. (Figure: a: original image, b: segmentation, c: holes closed, d: islands removed.)
Introduction
Morphology is a branch of image processing that deals with the form and structure of an object. Morphological image processing is used to extract image components for representation and description of region shape, such as boundaries, skeletons, and the shape of the image.
Structuring Element (Kernel)
Structuring elements can have varying sizes. Usually, element values are 0, 1, and "none"; empty spots in the structuring element are don't-cares. Structuring elements have an origin. For thinning, other values are possible. (Figure: examples of structuring elements, box and disc.)
Dilation & Erosion
These are the basic operations and are dual to each other: erosion shrinks the foreground and enlarges the background, while dilation enlarges the foreground and shrinks the background.
Erosion
Erosion is the set of all points in the image where the structuring element "fits into" the foreground. Consider each foreground pixel in the input image: if the structuring element fits in, write a "1" at the origin of the structuring element. This is a simple application of pattern matching. Input: a binary image (or gray values) and a structuring element containing only 1s.
Erosion Operations (cont.) (figure: structuring element B applied to original image A; intersect pixel and center pixel marked)
Erosion Operations (cont.)
Result of erosion: the boundary of the "center pixels" where B is inside A.
Dilation
Dilation is the set of all points in the image where the structuring element "touches" the foreground. Consider each pixel in the input image: if the structuring element touches the foreground, write a "1" at the origin of the structuring element. Input: a binary image and a structuring element containing only 1s.
Dilation Operations (cont.) (figure: reflected structuring element B applied to original image A; intersect pixel and center pixel marked)
Dilation Operations (cont.)
Result of dilation: the boundary of the "center pixels" where B intersects A.
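A small sketch of both operations using SciPy's binary morphology (the test image and the 3x3 box structuring element are illustrative assumptions):

    import numpy as np
    from scipy.ndimage import binary_erosion, binary_dilation

    binary = np.zeros((9, 9), dtype=bool)
    binary[2:7, 2:7] = True                          # 5x5 foreground square
    se = np.ones((3, 3), dtype=bool)                 # 3x3 box structuring element

    eroded = binary_erosion(binary, structure=se)    # SE must fit: foreground shrinks
    dilated = binary_dilation(binary, structure=se)  # SE must touch: foreground grows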
Opening & Closing
These important operations are derived from the fundamental operations of dilation and erosion. They are usually applied to binary images, but gray-value images are also possible. Opening and closing are dual operations.
Morphological Operations
Morphological closing: a dilation followed by an erosion. Morphological opening: an erosion followed by a dilation.
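A brief sketch of both composites with SciPy (the synthetic mask, with one small hole and one isolated island, is made up for illustration):

    import numpy as np
    from scipy.ndimage import binary_opening, binary_closing

    mask = np.zeros((20, 20), dtype=bool)
    mask[4:16, 4:16] = True
    mask[9, 9] = False                           # small hole inside the shape
    mask[1, 1] = True                            # small isolated island (noise)
    se = np.ones((3, 3), dtype=bool)

    opened = binary_opening(mask, structure=se)  # erosion then dilation: removes the island
    closed = binary_closing(mask, structure=se)  # dilation then erosion: closes the hole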
Opening
Opening is similar to erosion: it removes spots and noise, but is less destructive. Opening is defined as an erosion followed by a dilation, using the same structuring element for both operations. Input: a binary image and a structuring element containing only 1s.
Opening
Opening is an erosion followed by a dilation operation. Take the structuring element (SE) and slide it around inside each foreground region: all pixels which can be covered by the SE, with the SE being entirely within the foreground region, are preserved; all foreground pixels which cannot be reached by the structuring element without lapping over the edge of the foreground object are eroded away.
Opening (figure: opening with a 3x3 square structuring element)
Opening Example (figure: opening with an 11-pixel-diameter disc)
Closing
Closing is similar to dilation: it removes holes and tends to enlarge regions and shrink the background. Closing is defined as a dilation followed by an erosion, using the same structuring element for both operations. Input: a binary image and a structuring element containing only 1s.
Closing
Closing is a dilation followed by an erosion. Take the structuring element (SE) and slide it around outside each foreground region: all background pixels which can be covered by the SE, with the SE being entirely within the background region, are preserved; all background pixels which cannot be reached by the structuring element without lapping over the edge of the foreground object are turned into foreground pixels.
Closing (figure: closing with a 3x3 square structuring element)
Closing Example (figure: closing with a 22-pixel disc closes small holes in the foreground)
Closing Example
1. Threshold; 2. Closing with a disc of size 20. (Figure: thresholded vs. closed.)
Opening
Erosion and dilation are not inverse transforms; an erosion followed by a dilation leads to a new, interesting morphological operation, opening.
Closing
Closing is a dilation followed by an erosion.
Distance Transform
Distance Transform
The distance transform DT of a binary image I is a scalar field that contains, at every pixel of I, the minimal distance to the boundary ∂Ω of the foreground of I.
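A minimal computation sketch, assuming SciPy (a tooling assumption, not the slides' implementation); the synthetic square foreground is only illustrative:

    import numpy as np
    from scipy.ndimage import distance_transform_edt

    binary = np.zeros((64, 64), dtype=bool)
    binary[16:48, 16:48] = True        # square foreground Omega

    # At every foreground pixel: Euclidean distance to the nearest background
    # pixel, i.e. (to within a pixel) to the boundary of the foreground.
    dt = distance_transform_edt(binary)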
Distance Transform
The distance transform can be used for morphological operations. Consider a contour line C(δ) of DT, for δ = 0, δ > 0, and δ < 0.
Distance Transform
The contour lines of DT are also called level sets. (Figure: shape, level sets, elevation plot.)
Feature Transform
The feature transform finds the closest boundary points, the so-called feature points. (Figure: given a, the feature point is b; given p, the feature points are q1 and q2.)
Skeletonization
Skeletonization: the Goals
Geometric analysis: aspect ratio, eccentricity, curvature, and elongation. Topological analysis: genus. Retrieval: find the shape matching a source shape. Classification: partition the shapes into classes. Matching: find the similarity between two shapes.
Skeletonization
Skeletons are the medial axes. The skeleton S(Ω) is the set of points that are centers of maximally inscribed disks in Ω; equivalently, skeletons are the set of points situated at equal distance from at least two boundary feature points of the given shape.
Skeleton Computation
Feature transform method: select those points whose feature transform contains at least two boundary points. This works well on continuous data but fails on discrete data.
Skeleton Computation
Using distance field singularities: skeleton points are the local maxima of the distance transform.
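A rough sketch of this idea (an approximation for illustration only; practical skeletonization needs extra care, e.g. thinning, to obtain thin and connected skeletons):

    import numpy as np
    from scipy.ndimage import distance_transform_edt, maximum_filter

    def skeleton_points(binary):
        """Approximate skeleton: foreground pixels that are local maxima
        (ridge points) of the distance transform."""
        dt = distance_transform_edt(binary)
        return (dt == maximum_filter(dt, size=3)) & binary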
End of Chap. 9 Note: covered all sections except 9.4.7 (skeleton in 3D)