Entropy and some applications in image processing Neucimar J. Leite Institute of Computing


Outline
- Introduction: intuitive understanding
- Entropy as global information
- Entropy as local information: edge detection, texture analysis
- Entropy as minimization/maximization constraints: global thresholding, the deconvolution problem

Information entropy (Shannon's entropy): an information-theory concept closely related to the following question:
- What is the minimum amount of data needed to represent a given information content?
For images (compression problems):
- How little data is sufficient to completely describe an image without (much) loss of information?

Intuitive understanding: entropy relates the amount of uncertainty about an event to a given probability distribution. Event: randomly drawing a ball from an urn. [Figure: three urns ranging from high uncertainty (entropy = max) through low uncertainty to no uncertainty (entropy = min).]

Example 1: Event: a coin flip, with outcomes {heads, tails} and probabilities P(heads) = P(tails) = 1/2; code 0 → heads, 1 → tails. Self-information: the units of information used to represent an event E, I(E) = -log2 P(E); it is inversely related to the probability of E. Here I(heads) = I(tails) = -log2(1/2) = 1 bit.

Example 2: the amount of information conveyed by an event E is I(E) = -log2 P(E). Entropy: the average information of a source with probabilities p_i, H = -Σ_i p_i log2(p_i).

Coding the balls with an equal-length binary code: 3 bits/ball. Entropy: with 8 equiprobable balls, H = -Σ_{i=1}^{8} (1/8) log2(1/8) = 3 bits/ball. Degree of information compression: for independent, equiprobable data, the equal-length binary code already attains the entropy, so nothing is gained by recoding.

Medium uncertainty (probabilities 5/8, 1/8, 1/8, 1/8 for red, black, blue, green): H = -(5/8 log2(5/8) + 1/8 log2(1/8) + 1/8 log2(1/8) + 1/8 log2(1/8)) ≈ 1.55 bits/ball. No uncertainty (all balls the same color): H = -1 · log2(1) = 0. Code: with four colors, the natural equal-length code spends 2 bits/ball (e.g., red 00, black 01, blue 10, green 11).

Medium uncertainty: H ≈ 1.55 bits/ball, but the equal-length code spends 2 bits/ball. 2 bits/ball > 1.55 bits/ball → code redundancy of about 22%! We need an encoding method that eliminates this code redundancy.
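
A quick numerical check of these entropy values (a minimal sketch in Python; the probabilities are the ball frequencies above):

    import numpy as np

    def entropy(p):
        # Shannon entropy, in bits, of a discrete probability distribution.
        p = np.asarray(p, dtype=float)
        p = p[p > 0]                           # 0 * log2(0) terms are taken as 0
        return float(-(p * np.log2(p)).sum())

    print(entropy([1/8] * 8))                  # 3.0 bits/ball: 8 equiprobable colors
    print(entropy([5/8, 1/8, 1/8, 1/8]))       # ~1.55 bits/ball: medium uncertainty
    print(entropy([1.0]))                      # 0.0 bits/ball: no uncertainty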

The Huffman encoding: at each source reduction, merge the two least probable symbols; the reduction columns list the resulting sorted probabilities, and codes are assigned backwards, from the last reduction to the original symbols.

Ball    Probability    Reduction 1    Reduction 2
red     5/8            5/8  (1)       5/8  (1)
black   1/8            2/8  (01)      3/8  (0)
blue    1/8            1/8  (00)
green   1/8

Resulting variable-length code: red (1), black (00), blue (010), green (011) (the assignment of the two 3-bit codes to blue and green is arbitrary). Average length: 5/8 · 1 + 1/8 · 2 + 1/8 · 3 + 1/8 · 3 = 1.625 bits/ball, a reduction of about 18.75% with respect to the 2-bit equal-length code.
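
The same construction as a small Python sketch (the heap holds partial code tables; the symbol names are just labels):

    import heapq
    import itertools

    def huffman(freqs):
        # Build a prefix-free code {symbol: bitstring} by repeatedly
        # merging the two least probable subtrees.
        tie = itertools.count()                # tie-breaker: never compare dicts
        heap = [(p, next(tie), {s: ''}) for s, p in freqs.items()]
        heapq.heapify(heap)
        while len(heap) > 1:
            p1, _, c1 = heapq.heappop(heap)    # two smallest probabilities
            p2, _, c2 = heapq.heappop(heap)
            merged = {s: '0' + c for s, c in c1.items()}
            merged.update({s: '1' + c for s, c in c2.items()})
            heapq.heappush(heap, (p1 + p2, next(tie), merged))
        return heap[0][2]

    probs = {'red': 5/8, 'black': 1/8, 'blue': 1/8, 'green': 1/8}
    code = huffman(probs)
    print(code)                                             # code lengths 1, 2, 3, 3
    print(sum(p * len(code[s]) for s, p in probs.items()))  # 1.625 bits/ball

The exact bit patterns may differ from the table above, but the code lengths, and hence the average, are the same.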

Entropy of an 8-bit image: the first-order entropy estimated from the gray-level histogram bounds, in bits/pixel, what any code for independent pixels can achieve, and Huffman encoding approaches it. Variable-length coding alone, however, does not take advantage of the high pixel-to-pixel correlation of images: a pixel can be predicted from the values of its neighbors → more redundancy exposed → lower entropy (bits/pixel).

Two example 8-bit images. Image 1: entropy 7.45 bits/pixel; image 2: entropy 7.35 bits/pixel. After Huffman encoding, the average code length stays close to these values, i.e., compression ratios of only about 1.07 and 1.08 with respect to the original 8 bits/pixel.

Coding the interpixel differences instead → highlighting redundancies: entropy 4.73 bits/pixel instead of 7.45 (image 1) and 5.97 instead of 7.35 (image 2). After Huffman encoding of the differences, the compression ratios rise to roughly 8/4.73 ≈ 1.69 and 8/5.97 ≈ 1.34, instead of 1.07 and 1.08.
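
The effect is easy to reproduce (a sketch; the ramp image below is a hypothetical stand-in for a natural, spatially correlated image):

    import numpy as np

    def image_entropy(img):
        # First-order entropy (bits/pixel) of a non-negative integer image.
        counts = np.bincount(img.ravel())
        p = counts[counts > 0] / counts.sum()
        return float(-(p * np.log2(p)).sum())

    # Smooth ramp plus mild noise: neighboring pixels are strongly correlated.
    x = np.arange(256)
    img = np.clip(np.add.outer(x, x) // 2 + np.random.randint(0, 8, (256, 256)), 0, 255)

    diff = np.diff(img, axis=1)                  # horizontal interpixel differences
    print(image_entropy(img))                    # high: gray levels widely spread
    print(image_entropy(diff - diff.min()))      # much lower: differences cluster near 0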

Entropy as local information: the edge detection example

Edge detection examples:

Entropy-based edge detection: low entropy values → low frequencies → uniform image regions; high entropy values → high frequencies → image edges.

Binary entropy function: H(p) = -p log2(p) - (1 - p) log2(1 - p). [Plot of H versus p: H = 0 at p = 0 and p = 1, with the maximum H = 1 bit at p = 0.5.]

Binary entropy function → isotropic edge detection: the entropy response does not depend on edge orientation.

H computed in 3x3, 5x5, 7x7, and 9x9 neighborhoods: [resulting local-entropy edge images for each window size].
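
A direct (if slow) local-entropy filter matching these examples (a minimal sketch; scikit-image's filters.rank.entropy computes the same thing much faster):

    import numpy as np

    def local_entropy(image, size=9):
        # Per-pixel Shannon entropy (bits) of the gray-level histogram
        # in a size x size neighborhood; image is a 2-D uint8 array.
        pad = size // 2
        padded = np.pad(image, pad, mode='reflect')
        out = np.empty(image.shape, dtype=float)
        for i in range(image.shape[0]):
            for j in range(image.shape[1]):
                window = padded[i:i + size, j:j + size]
                counts = np.bincount(window.ravel(), minlength=256)
                p = counts[counts > 0] / (size * size)
                out[i, j] = -(p * np.log2(p)).sum()
        return out

High output values mark edges; enlarging the window thickens and smooths the detected edges.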

Texture analysis: similarity grouping based on brightness, color, slope, size, etc. The perceived patterns of lightness, directionality, coarseness, regularity, etc., can be used to describe and segment an image.

Texture description: statistical approach. Characterizes textures as smooth, coarse, periodic, etc., based on the intensity histogram → probability density function. Descriptor examples: let z_i be a random variable denoting gray level and p(z_i) the intensity histogram in a region. Mean, a measure of average intensity: m = Σ_i z_i p(z_i).

Other moments of different orders, e.g., the standard deviation, a measure of average contrast: σ = sqrt(Σ_i (z_i - m)^2 p(z_i)). Entropy, a measure of randomness: e = -Σ_i p(z_i) log2 p(z_i).

[Table: average intensity, average contrast, and entropy measured for smooth, coarse, and periodic texture samples.]
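
All three descriptors follow directly from the normalized histogram of a region (a minimal sketch):

    import numpy as np

    def texture_descriptors(region):
        # Mean, average contrast, and entropy of an 8-bit image region.
        p = np.bincount(region.ravel(), minlength=256) / region.size
        z = np.arange(256)
        mean = (z * p).sum()                          # average intensity
        std = np.sqrt((((z - mean) ** 2) * p).sum())  # average contrast
        nz = p[p > 0]
        ent = -(nz * np.log2(nz)).sum()               # randomness
        return mean, std, ent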

Descriptors and segmentation: ?

Gray-level co-occurrence matrix: Haralick's descriptors. The matrix conveys information about the relative positions of pixels having similar gray-level values: for a displacement d, M_d(a, b) records how often a pixel with gray level a has a pixel with gray level b at displacement d.

Normalized, M_d(i, j) is the probability that a pixel with gray level i has a pixel with level j a distance of d pixels away in a given direction (here, d = 2 in the horizontal direction). For the entropy descriptor H: large empty regions in M → little information content; cluttered regions → large information content.
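
A direct construction of the normalized co-occurrence matrix and its entropy descriptor (a sketch; d and the horizontal direction are fixed as in the example):

    import numpy as np

    def cooccurrence_entropy(img, d=2, levels=256):
        # img: 2-D integer image. Builds M_d for horizontal displacement d,
        # normalizes it to probabilities, and returns its entropy descriptor.
        M = np.zeros((levels, levels))
        left, right = img[:, :-d], img[:, d:]       # pixel pairs d apart
        np.add.at(M, (left.ravel(), right.ravel()), 1)
        M /= M.sum()
        nz = M[M > 0]
        H = -(nz * np.log2(nz)).sum()
        return M, H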

Obviously, more complex texture analysis based on statistical descriptors should combine information related to image scale, moments, contrast, homogeneity, directionality, etc.

Entropy as minimization/maximization constraints

Global thresholding examples: threshold placed at the histogram mean, or between histogram peaks. [example images and histograms]

For images with levels 0-255 and histogram probabilities p_i, the probability that a given pixel has value less than or equal to t is P(t) = Σ_{i=0}^{t} p_i. Now consider the two classes defined by t: Class A, with distribution p_0/P(t), ..., p_t/P(t), and Class B, with distribution p_{t+1}/(1 - P(t)), ..., p_255/(1 - P(t)).

The optimal threshold is the value of t that maximizes H(t) = H_A(t) + H_B(t), where H_A(t) = -Σ_{i=0}^{t} (p_i/P(t)) log2(p_i/P(t)) and H_B(t) = -Σ_{i=t+1}^{255} (p_i/(1 - P(t))) log2(p_i/(1 - P(t))).
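
A direct implementation of this maximum-entropy threshold for 8-bit images (a minimal sketch):

    import numpy as np

    def max_entropy_threshold(img):
        # Return the t maximizing H_A(t) + H_B(t), as defined above.
        p = np.bincount(img.ravel(), minlength=256) / img.size
        P = np.cumsum(p)                        # P[t] = prob(level <= t)
        best_t, best_h = 0, -np.inf
        for t in range(255):
            if P[t] <= 0 or P[t] >= 1:
                continue
            a = p[:t + 1] / P[t]                # class A distribution
            b = p[t + 1:] / (1 - P[t])          # class B distribution
            a, b = a[a > 0], b[b > 0]
            h = -(a * np.log2(a)).sum() - (b * np.log2(b)).sum()
            if h > best_h:
                best_t, best_h = t, h
        return best_t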

Examples:

Entropy as a fuzziness measure. In fuzzy set theory, an element x belongs to a set S with a degree of membership given by a membership function μ_S(x) ∈ [0, 1]. Example of a membership function for a given threshold t: μ_t(x) gives the degree to which x belongs to the background or to the object, whose gray-level averages are μ_0(t) and μ_1(t), respectively; a common choice (Huang and Wang) is μ_t(x) = 1/(1 + |x - μ_0(t)|/C) for x ≤ t and μ_t(x) = 1/(1 + |x - μ_1(t)|/C) for x > t, with C a normalizing constant.

How can the degree of fuzziness be measured? Example: t = 0 for a binary image → fuzziness = 0 (every pixel then belongs fully to one class).

Using Shannon's function for two classes, S(μ) = -μ log2(μ) - (1 - μ) log2(1 - μ), the entropy of an entire fuzzy set (an M x N image f) is E(t) = (1/(MN)) Σ_{x,y} S(μ_t(f(x, y))), and for segmentation purposes the threshold t is chosen so that E(t) is minimum → t minimizes fuzziness.
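
A sketch of this minimum-fuzziness threshold using the membership function above (the choice C = 255, the gray-level range, is an assumption):

    import numpy as np

    def fuzziness_threshold(img, C=255.0):
        # Return the t minimizing the fuzzy entropy E(t) of an 8-bit image.
        hist = np.bincount(img.ravel(), minlength=256).astype(float)
        z = np.arange(256)
        best_t, best_e = 0, np.inf
        for t in range(1, 255):
            w0, w1 = hist[:t + 1].sum(), hist[t + 1:].sum()
            if w0 == 0 or w1 == 0:
                continue
            mu0 = (z[:t + 1] * hist[:t + 1]).sum() / w0    # background average
            mu1 = (z[t + 1:] * hist[t + 1:]).sum() / w1    # object average
            m = np.where(z <= t, 1 / (1 + np.abs(z - mu0) / C),
                                 1 / (1 + np.abs(z - mu1) / C))
            m = np.clip(m, 1e-12, 1 - 1e-12)               # keep S(mu) finite
            S = -m * np.log2(m) - (1 - m) * np.log2(1 - m)
            e = (hist * S).sum() / hist.sum()              # E(t) over all pixels
            if e < best_e:
                best_t, best_e = t, e
        return best_t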

Segmentation examples

Maximum Entropy Restoration: the deconvolution problem

The image degradation model: g(x, y) = f(x, y) * h(x, y) + n(x, y), where f is the original image, h the degradation function, n the additive noise, and g the degraded image.

The restoration problem: given g, h, and the noise statistics, we want an estimate f̂ such that the residual g - h * f̂ is consistent with the noise. Since there may exist many functions f̂ satisfying this constraint, we can take entropy maximization as an additional constraint for an "optimum" restoration: among the feasible estimates, pick the one of maximum entropy. [original, degraded, and restored images]

Other restoration methods on the same degraded image, given the degradation h: Wiener filtering, Lucy-Richardson, and maximum-entropy restoration. [result images]
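
For the two classical baselines named above, scikit-image provides ready-made routines (a sketch; the 5 x 5 box-blur PSF and the random stand-in image are assumptions):

    import numpy as np
    from scipy.signal import convolve2d
    from skimage import restoration

    psf = np.ones((5, 5)) / 25                     # assumed blur kernel h
    img = np.random.rand(128, 128)                 # stand-in for the original f
    degraded = convolve2d(img, psf, mode='same')   # g = f * h (noise omitted)

    wiener = restoration.wiener(degraded, psf, balance=0.1)
    # Older scikit-image versions name this argument `iterations`.
    lucy = restoration.richardson_lucy(degraded, psf, num_iter=30)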

Conclusions: entropy has been used extensively in a wide range of image processing applications. Other examples include distortion prediction, image quality evaluation, registration, multiscale analysis, and high-level feature extraction and classification.