
1 Chapter 3 cont’d. Adjacency, Histograms, & Thresholding

2 RAGs (Region Adjacency Graphs)

3 Steps: 1. Label the image. 2. Scan and enter adjacencies in the graph (includes containment).
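A minimal C++ sketch of step 2, assuming the labeled image is a 2-D vector of int labels and the graph is stored as a set of label pairs (the name buildRAG and this representation are illustrative, not from the slides; containment detection would need extra bookkeeping beyond plain adjacency and is omitted):

#include <algorithm>
#include <set>
#include <utility>
#include <vector>

// Scan the labeled image; whenever a pixel's right or lower neighbor has a
// different label, record that edge (4-adjacency, each pair stored once).
std::set<std::pair<int,int>> buildRAG(const std::vector<std::vector<int>>& label) {
    std::set<std::pair<int,int>> edges;
    const int rows = label.size(), cols = label[0].size();
    for (int r = 0; r < rows; ++r)
        for (int c = 0; c < cols; ++c) {
            const int a = label[r][c];
            if (c + 1 < cols && label[r][c+1] != a)
                edges.insert({std::min(a, label[r][c+1]), std::max(a, label[r][c+1])});
            if (r + 1 < rows && label[r+1][c] != a)
                edges.insert({std::min(a, label[r+1][c]), std::max(a, label[r+1][c])});
        }
    return edges;
}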

4

5 But how do we obtain binary images?

6 Histograms & Thresholding

7 Gray to binary: thresholding, G → B.
const int t=200;
if (G[r][c]>t) B[r][c]=1;
else B[r][c]=0;
How do we choose t? 1. Interactively. 2. Automatically.
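The slide shows only the per-pixel test; wrapped in a full loop it might look like this (the ROWS/COLS dimensions and global arrays are illustrative assumptions):

const int ROWS = 480, COLS = 640;   // example image dimensions
int G[ROWS][COLS];                  // input gray image
int B[ROWS][COLS];                  // output binary image

// Set B[r][c]=1 wherever the gray value exceeds t, else 0.
void threshold(int t) {
    for (int r = 0; r < ROWS; ++r)
        for (int c = 0; c < COLS; ++c)
            B[r][c] = (G[r][c] > t) ? 1 : 0;
}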

8 Gray to binary. Interactively: how? Automatically: many, many, many, …, many methods. 1. Experimentally (using a priori information). 2. Supervised/training methods. 3. Unsupervised: Otsu’s method (among many, many, many, … other methods).

9 Histogram: the “probability” of a given gray value in an image. h(g) = count of pixels with gray value equal to g. p(g) = h(g) / (w*h), where w*h = number of pixels in the entire image. Demo histogram.
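A sketch of both quantities, assuming 8-bit gray values (0..255) and the G/ROWS/COLS layout from the sketch above:

const int LEVELS = 256;   // 8-bit gray values; an assumption

// h[g] = count of pixels whose gray value equals g.
void computeHistogram(int h[LEVELS]) {
    for (int g = 0; g < LEVELS; ++g) h[g] = 0;
    for (int r = 0; r < ROWS; ++r)
        for (int c = 0; c < COLS; ++c)
            ++h[G[r][c]];
}

// p[g] = h[g] / (w*h): the fraction of all pixels at gray value g.
void normalizeHistogram(const int h[LEVELS], double p[LEVELS]) {
    const double n = double(ROWS) * COLS;
    for (int g = 0; g < LEVELS; ++g) p[g] = h[g] / n;
}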

10 Histogram. Note: sometimes we need to group gray values in our histogram into “bins” or “buckets.” E.g., we have 10 bins in our histogram and 100 possible different gray values, so we put 0..9 into bin 0, 10..19 into bin 1, …
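For the slide's 10-bins/100-values example, the bin index is just integer division (binIndex is an illustrative name):

// Map a gray value g in 0..99 to one of 10 bins: 0..9 -> 0, 10..19 -> 1, ...
int binIndex(int g) {
    const int binWidth = 100 / 10;    // values per bin
    return g / binWidth;
}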

11 Histogram

12 Something is missing here!

13 Otsu’s method: an automatic thresholding method; it picks t automatically given an image histogram. Assume 2 groups are present in the image: 1. Those that are <= t. 2. Those that are > t.

14 Otsu’s method: best choices for t.

15 For every possible t:
1. Pick a t.
2. Calculate the within-group variances:
   1. probability of being in group 1
   2. probability of being in group 2
   3. mean of group 1
   4. mean of group 2
   5. variance of group 1
   6. variance of group 2
   7. weighted sum of the group variances
3. Remember which t gave rise to the minimum weighted sum.
(A code sketch combining these steps appears after slide 19.)

16 Otsu’s method: probability of being in each group
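The formula itself is missing from the transcript; the standard definitions, assuming gray values 0..G_max and p(g) as defined on slide 9, are:

q_1(t) = \sum_{g=0}^{t} p(g), \qquad q_2(t) = \sum_{g=t+1}^{G_{\max}} p(g) = 1 - q_1(t)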

17 Otsu’s method: mean of individual groups
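Again the formula is missing; the standard group means are:

\mu_1(t) = \frac{1}{q_1(t)} \sum_{g=0}^{t} g\, p(g), \qquad \mu_2(t) = \frac{1}{q_2(t)} \sum_{g=t+1}^{G_{\max}} g\, p(g)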

18 Otsu’s method: variance of individual groups
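The standard group variances, for reference:

\sigma_1^2(t) = \frac{1}{q_1(t)} \sum_{g=0}^{t} \big(g - \mu_1(t)\big)^2 p(g), \qquad \sigma_2^2(t) = \frac{1}{q_2(t)} \sum_{g=t+1}^{G_{\max}} \big(g - \mu_2(t)\big)^2 p(g)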

19 Otsu’s method: weighted sum of group variances. Calculate for all t’s and minimize. Demo Otsu.
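The quantity being minimized is, in its standard form,

\sigma_w^2(t) = q_1(t)\,\sigma_1^2(t) + q_2(t)\,\sigma_2^2(t), \qquad t^* = \arg\min_t \sigma_w^2(t)

and here is a minimal C++ sketch of the whole search, assuming the normalized histogram p[] and LEVELS from the earlier sketches (a brute-force O(LEVELS^2) version; keeping running sums would make it linear):

#include <cfloat>

// Return the t that minimizes the weighted sum of within-group variances.
int otsuThreshold(const double p[LEVELS]) {
    int bestT = 0;
    double bestVar = DBL_MAX;
    for (int t = 0; t < LEVELS - 1; ++t) {
        // Group probabilities.
        double q1 = 0, q2 = 0;
        for (int g = 0; g <= t; ++g)         q1 += p[g];
        for (int g = t + 1; g < LEVELS; ++g) q2 += p[g];
        if (q1 == 0 || q2 == 0) continue;    // one group empty; skip this t
        // Group means.
        double m1 = 0, m2 = 0;
        for (int g = 0; g <= t; ++g)         m1 += g * p[g];
        for (int g = t + 1; g < LEVELS; ++g) m2 += g * p[g];
        m1 /= q1;  m2 /= q2;
        // Group variances.
        double v1 = 0, v2 = 0;
        for (int g = 0; g <= t; ++g)         v1 += (g - m1) * (g - m1) * p[g];
        for (int g = t + 1; g < LEVELS; ++g) v2 += (g - m2) * (g - m2) * p[g];
        v1 /= q1;  v2 /= q2;
        // Weighted sum; remember the minimizing t.
        const double within = q1 * v1 + q2 * v2;
        if (within < bestVar) { bestVar = within; bestT = t; }
    }
    return bestT;
}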

20

21 Generalized thresholding: a single range of gray values.
const int t1=200; const int t2=500;
if (G[r][c]>t1 && G[r][c]<t2) B[r][c]=1;
else B[r][c]=0;

22 Even more general thresholding: a union of ranges of gray values.
const int t1=200, t2=500;
const int t3=1200, t4=1500;
if (G[r][c]>t1 && G[r][c]<t2) B[r][c]=1;
else if (G[r][c]>t3 && G[r][c]<t4) B[r][c]=1;
else B[r][c]=0;
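The same idea extends to any number of ranges; a sketch (the vector-of-pairs representation and the name inAnyRange are assumptions, not from the slides):

#include <utility>
#include <vector>

// Return 1 iff gray value g falls strictly inside any (lo, hi) range.
int inAnyRange(int g, const std::vector<std::pair<int,int>>& ranges) {
    for (const auto& range : ranges)
        if (g > range.first && g < range.second) return 1;
    return 0;
}

// Usage: B[r][c] = inAnyRange(G[r][c], {{200, 500}, {1200, 1500}});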

23 Something is missing here!

24 K-Means Clustering. Clustering = the process of partitioning a set of pattern vectors into subsets called clusters. K = number of clusters (known in advance). Not an exhaustive search, so it may not find the globally optimal solution. (See section 10.1.1.)

25 Iterative K-Means Clustering Algorithm. Form K-means clusters from a set of nD feature vectors.
1. Set ic=1 (iteration count).
2. Choose randomly a set of K means m_1(1), m_2(1), …, m_K(1).
3. For each vector x_i, compute D(x_i, m_j(ic)) for each j=1,…,K.
4. Assign x_i to the cluster C_j with the nearest mean.
5. ic = ic+1; update the means to get a new set m_1(ic), m_2(ic), …, m_K(ic).
6. Repeat 3..5 until C_j(ic+1) = C_j(ic) for all j.

26 K-Means for Optimal Thresholding. What are the features?

27 K-Means for Optimal Thresholding. What are the features? Individual pixel gray values.

28 K-Means for Optimal Thresholding. What value for K should be used?

29 K-Means for Optimal Thresholding. What value for K should be used? K=2, to be like Otsu’s method.

30 Iterative K-Means Clustering Algorithm. Form 2 clusters from a set of pixel gray values.
1. Set ic=1 (iteration count).
2. Choose 2 random gray values as our initial K means, m_1(1) and m_2(1).
3. For each pixel gray value x_i, compute fabs(x_i - m_j(ic)) for each j=1,2.
4. Assign x_i to the cluster C_j with the nearest mean.
5. ic = ic+1; update the means to get a new set m_1(ic) and m_2(ic).
6. Repeat 3..5 until C_j(ic+1) = C_j(ic) for all j.
(A code sketch of this K=2 version follows.)
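A minimal sketch of the K=2 version, run over the histogram h[] rather than over every pixel — equivalent, since pixels with equal gray values always join the same cluster. Fixed initial means instead of random ones, the name kMeans2, and the convergence test on the means are assumptions:

#include <cmath>

// Cluster gray values into 2 groups; returns the final means via m1, m2.
void kMeans2(const int h[LEVELS], double& m1, double& m2) {
    m1 = LEVELS / 4.0;          // two distinct starting means
    m2 = 3.0 * LEVELS / 4.0;    // (the slides pick them randomly)
    for (;;) {
        // Assign each gray value to the nearest mean, accumulating
        // histogram-weighted sums for the mean update.
        double sum1 = 0, sum2 = 0;
        long   n1 = 0, n2 = 0;
        for (int g = 0; g < LEVELS; ++g) {
            if (std::fabs(g - m1) <= std::fabs(g - m2)) { sum1 += double(g) * h[g]; n1 += h[g]; }
            else                                        { sum2 += double(g) * h[g]; n2 += h[g]; }
        }
        const double new1 = n1 ? sum1 / n1 : m1;
        const double new2 = n2 ? sum2 / n2 : m2;
        if (new1 == m1 && new2 == m2) break;   // assignments stable: done
        m1 = new1;  m2 = new2;
    }
}

// A binary threshold can then be placed between the two means,
// e.g. t = int((m1 + m2) / 2).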

31 Iterative K-Means Clustering Algorithm. Example:
m_1(1)=260.83, m_2(1)=539.00
m_1(2)=39.37, m_2(2)=1045.65
m_1(3)=52.29, m_2(3)=1098.63
m_1(4)=54.71, m_2(4)=1106.28
m_1(5)=55.04, m_2(5)=1107.24
m_1(6)=55.10, m_2(6)=1107.44
m_1(7)=55.10, m_2(7)=1107.44
…
Demo.

