CS 485 / 685 Computer Vision, Lecture 5. Instructor: Mircea Nicolescu
Image Noise

Additive random noise: the observed image is the true image plus noise,

I'(x, y) = I(x, y) + n(x, y)

where I(x, y) is the true pixel value and n(x, y) is the (random) noise at pixel (x, y).
Image Noise

Gaussian noise is used to model additive random noise. The probability of n(x, y) is given by the zero-mean Gaussian density:

p(n) = (1 / (σ √(2π))) exp(−n² / (2σ²))
Image Noise

Salt-and-pepper noise is used to model impulsive noise, which alters random pixels and makes their values very different from the true ones:

I_sp(x, y) = I(x, y)                     if p < l
I_sp(x, y) = s_min + r (s_max − s_min)   otherwise

where p, r are uniformly distributed random variables and l, s_min, s_max are constants.
Smoothing Using Averaging

Idea: replace each pixel by the average of its neighbors. Useful for reducing random additive noise and unimportant details. The size of the mask controls the amount of smoothing.
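The averaging idea can be sketched in a few lines of NumPy (the function name mean_filter and the edge-replicating border handling are my choices, not from the slides):

```python
import numpy as np

def mean_filter(img, k=3):
    """Smooth img by replacing each pixel with the average of its
    k x k neighborhood (border pixels are edge-replicated)."""
    pad = k // 2
    padded = np.pad(img.astype(float), pad, mode='edge')
    out = np.zeros(img.shape, dtype=float)
    # Sum all k*k shifted views, then divide by the mask size.
    for dr in range(k):
        for dc in range(k):
            out += padded[dr:dr + img.shape[0], dc:dc + img.shape[1]]
    return out / (k * k)
```

On a constant image the filter is the identity; on a noisy image it reduces the noise variance, at the price of blurring real detail.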
Smoothing Using Averaging

Trade-off: less noise vs. blurring and loss of detail.

[Figure: original image and results for 3x3, 5x5, 7x7, 15x15, and 25x25 masks]
Gaussian Smoothing

Idea: replace each pixel by a weighted average of its neighbors. Mask weights are computed by sampling a Gaussian function. Note: weight values decrease with distance from the mask center.
Gaussian Smoothing

σ determines the degree of smoothing. As σ increases, the size of the mask must also increase if we are to sample the Gaussian satisfactorily. Usual rule: choose height = width = 5σ (subtends 98.76% of the area under the Gaussian).

[Figure: Gaussian mask with σ = 3]
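The 5σ sampling rule can be sketched as follows (assuming NumPy; the function name gaussian_kernel and the exact rounding of the mask half-width are my choices):

```python
import numpy as np

def gaussian_kernel(sigma):
    """Build a normalized 2D Gaussian mask whose side is about 5*sigma
    (covering ~98.76% of the area), rounded to an odd integer."""
    half = max(1, int(2.5 * sigma))          # half-width of the mask
    x = np.arange(-half, half + 1)
    g1d = np.exp(-x**2 / (2.0 * sigma**2))   # sample the 1D Gaussian
    kernel = np.outer(g1d, g1d)              # 2D mask from the 1D samples
    return kernel / kernel.sum()             # weights sum to 1
```

The weights peak at the mask center and fall off with distance, as the slide describes.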
Gaussian Smoothing - Example

[Figure: smoothing results for σ = 1, σ = 5, σ = 10, σ = 30]
Averaging vs Gaussian Smoothing

[Figure: averaging results vs. Gaussian smoothing results]
Properties of Gaussian

Convolution of two Gaussians is another Gaussian. Special case: convolving twice with a Gaussian kernel of width σ is equivalent to convolving once with a kernel of width σ√2.
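A quick numerical check of this property (a sketch assuming NumPy; the sampled 1D Gaussians approximate the continuous ones closely for σ this large):

```python
import numpy as np

def g(x, sigma):
    # Sampled 1D Gaussian, normalized as a density
    return np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

sigma = 2.0
x = np.arange(-30, 31)                  # wide support, so truncation is negligible
# Convolving the Gaussian with itself...
twice = np.convolve(g(x, sigma), g(x, sigma), mode='same')
# ...matches a single Gaussian of width sqrt(2)*sigma.
once_wider = g(x, sigma * np.sqrt(2))
```

The two arrays agree to within sampling error, confirming the σ√2 rule.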
Properties of Gaussian

Separable kernel: a 2D Gaussian can be expressed as the product of two 1D Gaussians:

G(x, y) = (1 / (2πσ²)) exp(−(x² + y²) / (2σ²)) = g(x) g(y)

where g is a 1D Gaussian with the same σ.
Properties of Gaussian

2D Gaussian convolution can be implemented more efficiently using 1D convolutions: convolve each row of the image I with the 1D Gaussian g to get a new image I_r, then convolve each column of I_r with g.
Example

2D convolution (center location only). The filter factors into a product of 1D filters, so we can perform convolution along the rows, followed by convolution along the remaining column. Per output pixel, direct 2D convolution with an n x n filter costs O(n²) operations; the separable version costs O(2n) = O(n).
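The row-then-column scheme can be sketched and checked against direct 2D convolution (assuming NumPy; the helper names conv2d and conv_separable are my own):

```python
import numpy as np

def conv2d(img, kernel):
    """Direct 2D convolution ('valid' region only), for comparison."""
    kh, kw = kernel.shape
    H, W = img.shape[0] - kh + 1, img.shape[1] - kw + 1
    out = np.zeros((H, W))
    flipped = kernel[::-1, ::-1]    # flip for true convolution
    for i in range(H):
        for j in range(W):
            out[i, j] = np.sum(img[i:i+kh, j:j+kw] * flipped)
    return out

def conv_separable(img, g):
    """Convolve every row with the 1D kernel g, then every column."""
    rows = np.apply_along_axis(lambda r: np.convolve(r, g, mode='valid'), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, g, mode='valid'), 0, rows)
```

For a separable kernel K = outer(g, g), both routes produce identical results, but the separable route does O(n) work per pixel instead of O(n²).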
Median Filter

Effective for removing "salt-and-pepper" noise (random occurrences of black or white pixels).
Median Filter

Replace each pixel value by the median of the gray levels in the neighborhood of that pixel.
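A minimal NumPy sketch of this filter (the function name median_filter and the edge-replicating border handling are my choices):

```python
import numpy as np

def median_filter(img, k=3):
    """Replace each pixel by the median of its k x k neighborhood
    (border pixels are edge-replicated)."""
    pad = k // 2
    padded = np.pad(img, pad, mode='edge')
    H, W = img.shape
    # Stack all k*k shifted views, then take the median across the stack.
    stack = np.stack([padded[dr:dr+H, dc:dc+W]
                      for dr in range(k) for dc in range(k)])
    return np.median(stack, axis=0)
```

Unlike averaging, the median discards isolated outliers entirely, which is why it handles salt-and-pepper noise so well.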
Image Sharpening

Idea: compute intensity differences in local image regions. Useful for emphasizing transitions in intensity (e.g., in edge detection). A common choice of filter is the 2nd derivative of a Gaussian.
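As a simple stand-in for the 2nd-derivative-of-Gaussian filter on the slide, a plain 3x3 Laplacian (also a second-derivative estimate) illustrates the sharpening idea; the function name sharpen and the subtraction form are my choices:

```python
import numpy as np

LAPLACIAN = np.array([[0.0,  1.0, 0.0],
                      [1.0, -4.0, 1.0],
                      [0.0,  1.0, 0.0]])

def sharpen(img, amount=1.0):
    """Subtract a scaled Laplacian (second-derivative estimate) from the
    image, which exaggerates intensity transitions."""
    pad = np.pad(img.astype(float), 1, mode='edge')
    H, W = img.shape
    lap = np.zeros((H, W))
    for dr in range(3):
        for dc in range(3):
            lap += LAPLACIAN[dr, dc] * pad[dr:dr+H, dc:dc+W]
    return img - amount * lap
```

Across a step edge the result overshoots on both sides, which is exactly the "emphasized transition" effect.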
Example
Geometric Transformations

Modify the arrangement of pixels based on some geometric transformation.
Geometric Transformations

Translation:
r' = r + t_r
c' = c + t_c

Scaling:
r' = s_r r
c' = s_c c

Rotation by θ about (r_0, c_0):
r' = r_0 + (r − r_0) cos θ − (c − c_0) sin θ
c' = c_0 + (r − r_0) sin θ + (c − c_0) cos θ

Affine transformation:
r' = a_11 r + a_12 c + b_1
c' = a_21 r + a_22 c + b_2
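The rotation equations translate directly into code (a sketch assuming NumPy; the function name rotate_coords is my own):

```python
import numpy as np

def rotate_coords(r, c, theta, r0=0.0, c0=0.0):
    """Rotate the point (r, c) by angle theta about the center (r0, c0)."""
    rr = r0 + (r - r0) * np.cos(theta) - (c - c0) * np.sin(theta)
    cc = c0 + (r - r0) * np.sin(theta) + (c - c0) * np.cos(theta)
    return rr, cc
```

Note that the rotation center itself is a fixed point of the transform, and a 90-degree rotation sends (0, 1) to (−1, 0).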
Geometric Transformations

Some practical problems:
1) Transformed pixel coordinates might not lie within the bounds of the image.
2) Transformed pixel coordinates can be non-integer.
3) There might be no pixels in the input image that map to certain pixel locations in the transformed image (one-to-one correspondence can be lost).
Geometric Transformations

Problem (3): Forward vs. Inverse Mapping
−To guarantee that a value is generated for every pixel in the output image, we must consider each output pixel in turn and use the inverse mapping to determine the position in the input image.
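The inverse-mapping loop can be sketched as follows (assuming NumPy; the function name warp_inverse, the callable inverse transform, and the nearest-neighbor sampling are my choices):

```python
import numpy as np

def warp_inverse(src, inv_map, out_shape, fill=0.0):
    """For every output pixel (r, c), use the inverse transform to find the
    source position; sample with nearest-neighbor, or use `fill` when the
    source position falls outside the input image (problem 1)."""
    H, W = out_shape
    out = np.full((H, W), fill, dtype=float)
    for r in range(H):
        for c in range(W):
            sr, sc = inv_map(r, c)
            ir, ic = int(round(sr)), int(round(sc))   # zero-order interpolation
            if 0 <= ir < src.shape[0] and 0 <= ic < src.shape[1]:
                out[r, c] = src[ir, ic]
    return out
```

Because every output pixel is visited exactly once, no holes can appear in the result, which is the point of inverse mapping.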
Geometric Transformations

Problem (2): Image Interpolation
−Interpolation is the process of estimating the value of a transformed pixel at non-integer coordinates by examining its surrounding pixels.

Zero-order interpolation (or nearest-neighbor): use the value of the nearest pixel.
Geometric Transformations

First-order (bilinear) interpolation: weight the four surrounding pixel values according to their distance from the sample point. Higher-order interpolation schemes are more sophisticated but also more time consuming.
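First-order interpolation can be sketched directly (assuming NumPy; the function name bilinear is my own):

```python
import numpy as np

def bilinear(img, r, c):
    """First-order (bilinear) interpolation of img at non-integer (r, c):
    a distance-weighted average of the four surrounding pixels."""
    r0, c0 = int(np.floor(r)), int(np.floor(c))
    dr, dc = r - r0, c - c0
    r1 = min(r0 + 1, img.shape[0] - 1)   # clamp at the image border
    c1 = min(c0 + 1, img.shape[1] - 1)
    return ((1 - dr) * (1 - dc) * img[r0, c0] + (1 - dr) * dc * img[r0, c1]
            + dr * (1 - dc) * img[r1, c0] + dr * dc * img[r1, c1])
```

At integer coordinates it reproduces the pixel value exactly; at the center of four pixels it returns their mean.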
Edge Detection

Definition of edges:
−Edges are significant local changes of intensity in an image.
−Edges typically occur on the boundary between two different regions in an image.
What Causes Intensity Changes?

Geometric events:
−surface orientation (boundary) discontinuities
−depth discontinuities
−color and texture discontinuities

Non-geometric events:
−illumination changes
−specularities
−shadows
−inter-reflections

[Figure: examples of depth, color, illumination, and surface normal discontinuities]
Goal of Edge Detection

Produce a line drawing of a scene from an image of that scene. Important features can be extracted from the edges of an image (e.g., corners, lines, curves). These features are used by higher-level computer vision algorithms (e.g., segmentation, recognition).
Effect of Illumination
Edge Descriptors

−Edge normal: unit vector in the direction of maximum intensity change.
−Edge direction: unit vector perpendicular to the edge normal.
−Edge position or center: the image position at which the edge is located.
−Edge strength: related to the local image contrast along the normal.
Modeling Intensity Changes

Edges can be modeled according to their intensity profiles.

Step edge:
−the image intensity abruptly changes from one value on one side of the discontinuity to a different value on the opposite side.
Modeling Intensity Changes

Ramp edge:
−a step edge where the intensity change is not instantaneous but occurs over a finite distance.
Modeling Intensity Changes

Ridge edge:
−the image intensity abruptly changes value but then returns to the starting value within some short distance.
−usually generated by lines.
Modeling Intensity Changes

Roof edge:
−a ridge edge where the intensity change is not instantaneous but occurs over a finite distance.
−usually generated by the intersection of surfaces.
Main Steps in Edge Detection

The four steps of edge detection:
−Smoothing: suppress as much noise as possible, without destroying the true edges.
−Enhancement: apply a filter that responds to edges in the image.
−Detection/thresholding: determine which edge pixels should be discarded as noise and which should be retained (usually, thresholding provides the criterion used for detection).
−Localization: determine the exact location of an edge (sub-pixel resolution might be required for some applications, to estimate the location of an edge to better than the spacing between pixels). Edge thinning and linking are usually required in this step.
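The first three steps above can be sketched as a minimal pipeline (assuming NumPy; the 3x3 averaging smoother, the central-difference gradient, and the function name detect_edges are my simplifications, not the specific filters from the lectures):

```python
import numpy as np

def detect_edges(img, thresh):
    """Minimal edge-detection sketch: smooth, enhance, threshold."""
    H, W = img.shape
    # Smoothing: suppress noise with a 3x3 average.
    pad = np.pad(img.astype(float), 1, mode='edge')
    smooth = sum(pad[dr:dr+H, dc:dc+W]
                 for dr in range(3) for dc in range(3)) / 9.0
    # Enhancement: gradient magnitude via central differences.
    gr, gc = np.gradient(smooth)
    mag = np.hypot(gr, gc)
    # Detection: keep pixels whose edge strength exceeds the threshold.
    return mag > thresh
```

On a synthetic step edge the detector fires only near the intensity transition, while flat regions stay edge-free; localization and thinning/linking would refine this raw edge map further.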