Chapter 3 cont’d. Adjacency, Histograms, & Thresholding.



RAGs (Region Adjacency Graphs)

Steps:
1. Label the image.
2. Scan and enter adjacencies in the graph (includes containment).
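A minimal sketch of the scan step, assuming the labeled image is stored as a 2-D vector of region labels and the RAG is kept as a set of label pairs (containment handling omitted):

```cpp
#include <algorithm>
#include <set>
#include <utility>
#include <vector>

// Build a region adjacency graph from a labeled image: scan each
// pixel and record every pair of distinct labels that touch
// horizontally or vertically.
std::set<std::pair<int,int>> buildRAG(const std::vector<std::vector<int>>& label) {
    std::set<std::pair<int,int>> edges;
    int rows = label.size(), cols = label[0].size();
    for (int r = 0; r < rows; ++r)
        for (int c = 0; c < cols; ++c) {
            int a = label[r][c];
            // right neighbor
            if (c + 1 < cols && label[r][c+1] != a) {
                int b = label[r][c+1];
                edges.insert({std::min(a, b), std::max(a, b)});
            }
            // neighbor below
            if (r + 1 < rows && label[r+1][c] != a) {
                int b = label[r+1][c];
                edges.insert({std::min(a, b), std::max(a, b)});
            }
        }
    return edges;
}
```

Storing each edge with its smaller label first keeps the set free of (a,b)/(b,a) duplicates.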

But how do we obtain binary images?

Histograms & Thresholding

Gray to binary: Thresholding (G → B)

const int t = 200;
if (G[r][c] > t) B[r][c] = 1;
else             B[r][c] = 0;

How do we choose t?
1. Interactively
2. Automatically

Gray to binary
Interactively. How?
Automatically: many, many, many, …, many methods.
1. Experimentally (using a priori information).
2. Supervised/training methods.
3. Unsupervised: Otsu's method (among many, many, many, … other methods).

Histogram
"Probability" of a given gray value in an image.
h(g) = count of pixels with gray value equal to g.
p(g) = h(g) / (w*h), where w*h = number of pixels in the entire image.
Demo histogram.
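A sketch of h(g) and p(g) for an 8-bit image, assuming the pixels are available as a flat vector of gray values:

```cpp
#include <vector>

// h(g): count of pixels with gray value g (256 possible values).
std::vector<int> histogram(const std::vector<int>& pixels) {
    std::vector<int> h(256, 0);
    for (int g : pixels) ++h[g];
    return h;
}

// p(g) = h(g) / (w*h): fraction of pixels with gray value g.
std::vector<double> probabilities(const std::vector<int>& pixels) {
    std::vector<int> h = histogram(pixels);
    std::vector<double> p(256);
    for (int g = 0; g < 256; ++g)
        p[g] = static_cast<double>(h[g]) / pixels.size();
    return p;
}
```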

Histogram
Note: sometimes we need to group gray values together in our histogram into "bins" or "buckets." E.g., we have 10 bins in our histogram and 100 possible different gray values. So we put 0..9 into bin 0, 10..19 into bin 1, …
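With equal-width bins, the bin index is just integer division by the bin width; e.g., for the 100-value, 10-bin case above:

```cpp
// Map a gray value in [0, numValues) to one of numBins equal-width
// bins: 0..9 -> bin 0, 10..19 -> bin 1, ...
int binIndex(int gray, int numValues = 100, int numBins = 10) {
    return gray / (numValues / numBins);
}
```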

Histogram

Something is missing here!

Otsu's method
Automatic thresholding method: automatically picks t given an image histogram.
Assume 2 groups are present in the image:
1. Those that are <= t
2. Those that are > t

Otsu's method: best choices for t

For every possible t:
1. Pick a t.
2. Calculate the within-group variances:
   a. probability of being in group 1
   b. probability of being in group 2
   c. determine the mean of group 1
   d. determine the mean of group 2
   e. calculate the variance for group 1
   f. calculate the variance for group 2
   g. calculate the weighted sum of the group variances, and remember which t gave rise to the minimum.

Otsu’s method: probability of being in each group
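This slide's formula is not reproduced in the transcript; the standard expressions for the group probabilities, with p(g) the normalized histogram and G gray levels, are:

```latex
q_1(t) = \sum_{g=0}^{t} p(g), \qquad
q_2(t) = \sum_{g=t+1}^{G-1} p(g) = 1 - q_1(t)
```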

Otsu’s method: mean of individual groups
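The group means (not reproduced in the transcript) weight each gray value by its probability within the group:

```latex
\mu_1(t) = \frac{1}{q_1(t)} \sum_{g=0}^{t} g\, p(g), \qquad
\mu_2(t) = \frac{1}{q_2(t)} \sum_{g=t+1}^{G-1} g\, p(g)
```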

Otsu’s method: variance of individual groups
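The within-group variance formulas (not reproduced in the transcript) follow the same pattern:

```latex
\sigma_1^2(t) = \frac{1}{q_1(t)} \sum_{g=0}^{t} \bigl(g - \mu_1(t)\bigr)^2 p(g), \qquad
\sigma_2^2(t) = \frac{1}{q_2(t)} \sum_{g=t+1}^{G-1} \bigl(g - \mu_2(t)\bigr)^2 p(g)
```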

Otsu's method: weighted sum of group variances
Calculate for all t's and minimize.
Demo Otsu.
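The quantity minimized is the weighted within-class variance, σ²_w(t) = q1(t)σ1²(t) + q2(t)σ2²(t). A brute-force sketch over all t, assuming p is the normalized 256-bin histogram (note q1·σ1² = Σ (g−μ1)² p(g), so the q's cancel out of the weighted sum):

```cpp
#include <limits>
#include <vector>

// Otsu's method: return the threshold t that minimizes the
// weighted within-class variance over a normalized histogram p.
int otsuThreshold(const std::vector<double>& p) {
    int G = p.size();
    int bestT = 0;
    double bestVar = std::numeric_limits<double>::max();
    for (int t = 0; t < G - 1; ++t) {
        double q1 = 0, q2 = 0, mu1 = 0, mu2 = 0;
        for (int g = 0; g <= t; ++g)    { q1 += p[g]; mu1 += g * p[g]; }
        for (int g = t + 1; g < G; ++g) { q2 += p[g]; mu2 += g * p[g]; }
        if (q1 == 0 || q2 == 0) continue;   // one group empty: skip this t
        mu1 /= q1; mu2 /= q2;
        // v1 = q1*sigma1^2 and v2 = q2*sigma2^2, since the p(g)
        // weights are already inside the sums.
        double v1 = 0, v2 = 0;
        for (int g = 0; g <= t; ++g)    v1 += (g - mu1) * (g - mu1) * p[g];
        for (int g = t + 1; g < G; ++g) v2 += (g - mu2) * (g - mu2) * p[g];
        double withinVar = v1 + v2;         // weighted sum of group variances
        if (withinVar < bestVar) { bestVar = withinVar; bestT = t; }
    }
    return bestT;
}
```

On a cleanly bimodal histogram any t between the two modes is optimal; this sketch returns the first such t.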

Generalized thresholding
Single range of gray values.

const int t1 = 200;
const int t2 = 500;
if (G[r][c] > t1 && G[r][c] < t2) B[r][c] = 1;
else B[r][c] = 0;

Even more general thresholding
Union of ranges of gray values.

const int t1 = 200, t2 = 500;
const int t3 = 1200, t4 = 1500;
if (G[r][c] > t1 && G[r][c] < t2) B[r][c] = 1;
else if (G[r][c] > t3 && G[r][c] < t4) B[r][c] = 1;
else B[r][c] = 0;

Something is missing here!

K-Means Clustering
Clustering = the process of partitioning a set of pattern vectors into subsets called clusters.
K = number of clusters (known in advance).
Not an exhaustive search, so it may not find the globally optimal solution.
(see section )

Iterative K-Means Clustering Algorithm
Form K-means clusters from a set of n-D feature vectors.
1. Set ic = 1 (iteration count).
2. Choose randomly a set of K means m1(1), m2(1), …, mK(1).
3. For each vector xi, compute D(xi, mj(ic)) for each j = 1, …, K.
4. Assign xi to the cluster Cj with the nearest mean.
5. Set ic = ic + 1; update the means to get a new set m1(ic), m2(ic), …, mK(ic).
6. Repeat steps 3-5 until Cj(ic+1) = Cj(ic) for all j.

K-Means for Optimal Thresholding
What are the features?

K-Means for Optimal Thresholding
What are the features? Individual pixel gray values.

K-Means for Optimal Thresholding
What value for K should be used?

K-Means for Optimal Thresholding
What value for K should be used? K = 2, to be like Otsu's method.

Iterative K-Means Clustering Algorithm
Form 2 clusters from a set of pixel gray values.
1. Set ic = 1 (iteration count).
2. Choose 2 random gray values as our initial K means, m1(1) and m2(1).
3. For each pixel gray value xi, compute fabs(xi - mj(ic)) for each j = 1, 2.
4. Assign xi to the cluster Cj with the nearest mean.
5. Set ic = ic + 1; update the means to get a new set m1(ic) and m2(ic).
6. Repeat steps 3-5 until Cj(ic+1) = Cj(ic) for all j.
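A sketch of the K = 2 version above, with the initial means passed in rather than chosen at random so the result is reproducible:

```cpp
#include <cmath>
#include <utility>
#include <vector>

// Iterative 2-means clustering of pixel gray values.
// Returns the final pair of cluster means (m1, m2).
std::pair<double,double> twoMeans(const std::vector<double>& x,
                                  double m1, double m2) {
    std::vector<int> assign(x.size(), 0), prev(x.size(), -1);
    while (assign != prev) {            // stop when assignments are stable
        prev = assign;
        // assign each gray value to the cluster with the nearest mean
        for (size_t i = 0; i < x.size(); ++i)
            assign[i] = (std::fabs(x[i] - m1) <= std::fabs(x[i] - m2)) ? 0 : 1;
        // update the means from the new assignments
        double s1 = 0, s2 = 0;
        int n1 = 0, n2 = 0;
        for (size_t i = 0; i < x.size(); ++i)
            if (assign[i] == 0) { s1 += x[i]; ++n1; }
            else                { s2 += x[i]; ++n2; }
        if (n1 > 0) m1 = s1 / n1;
        if (n2 > 0) m2 = s2 / n2;
    }
    return {m1, m2};
}
```

Either of the resulting means can then serve to place a threshold between the two groups, e.g. at (m1 + m2) / 2.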

Iterative K-Means Clustering Algorithm
Example:
m1(1) = 260.83, m2(1) =
m1(2) = 39.37, m2(2) =
m1(3) = 52.29, m2(3) =
m1(4) = 54.71, m2(4) =
m1(5) = 55.04, m2(5) =
m1(6) = 55.10, m2(6) =
m1(7) = 55.10, m2(7) =
Demo.