
1 Gaussian Mixture Models and Expectation-Maximization Algorithm

2 The RGB Domain: a regular image.

3 The RGB Domain: the image pixels plotted in RGB space.

4 Pixel Clusters: suppose we cluster the points into 2 clusters.

5 Pixel Clusters: the clustering result shown back in image space.

6 Normal Distribution (1D Gaussian)
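For reference, the equation this slide presumably displays is the standard 1D Gaussian density with mean μ and standard deviation σ:

f(x) = \frac{1}{\sqrt{2\pi}\,\sigma} \exp\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)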

7 2D Gaussians
For d = 2: x = random data point (2D vector), μ = mean value (2D vector), Σ = covariance matrix (2×2 matrix). The same equation holds for a 3D Gaussian.
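The equation referred to is presumably the general d-dimensional Gaussian density:

f(x) = \frac{1}{(2\pi)^{d/2}\,|\Sigma|^{1/2}} \exp\left(-\frac{1}{2}(x-\mu)^{\top}\Sigma^{-1}(x-\mu)\right)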

8 2D Gaussians

9 Exploring the Covariance Matrix
Σ is symmetric, so it has an eigendecomposition (computable via SVD).
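Spelled out (standard linear algebra, not shown on the slide): since Σ is symmetric positive semi-definite,

\Sigma = R \, D \, R^{\top}, \qquad D = \mathrm{diag}(\lambda_1, \lambda_2)

where the columns of R are the orthonormal eigenvectors and the eigenvalues λ_i ≥ 0 give the variances along those directions.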

10 Covariance Matrix Geometry
[Figure: an iso-density ellipse with semi-axes a and b aligned with the eigenvectors of Σ.]

11 3D Gaussians

12 GMMs – Gaussian Mixture Models
Suppose we have 1000 data points in 2D space (w, h). [Figure: scatter plot with axes W and H.]

13 GMMs – Gaussian Mixture Models
Assume each data point is normally distributed. Visually, there are clearly 5 underlying Gaussians. [Figure: the same scatter plot with axes W and H.]

14 The GMM Assumption
There are K components (Gaussians). Each component k is specified by three parameters: a weight α_k, a mean μ_k, and a covariance matrix Σ_k. The total density function is:

f(x) = \sum_{k=1}^{K} \alpha_k \, N(x \mid \mu_k, \Sigma_k), \qquad \sum_{k=1}^{K} \alpha_k = 1
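As a sketch, this density can be evaluated with SciPy's multivariate_normal; the two-component parameters below are made-up illustrations, not values from the slides:

import numpy as np
from scipy.stats import multivariate_normal

# Illustrative 2-component mixture in 2D (hypothetical parameters).
weights = [0.6, 0.4]                                   # alpha_k, sum to 1
means   = [np.array([0.0, 0.0]), np.array([3.0, 3.0])]
covs    = [np.eye(2), np.array([[2.0, 0.5], [0.5, 1.0]])]

def gmm_density(x, weights, means, covs):
    """Total density f(x) = sum_k alpha_k * N(x; mu_k, Sigma_k)."""
    return sum(a * multivariate_normal.pdf(x, mean=m, cov=c)
               for a, m, c in zip(weights, means, covs))

print(gmm_density(np.array([1.0, 1.0]), weights, means, covs))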

15 The EM Algorithm (Dempster, Laird and Rubin, 1977)
[Figure: raw data; the fitted GMM (K = 6); the total density function.]

16 EM Basics
Objective: given N data points, find the maximum-likelihood estimate of the parameters Θ = {α_k, μ_k, Σ_k}.
Algorithm:
1. Guess an initial Θ.
2. E-step (expectation): based on the current Θ, compute each data point's probability of association with each Gaussian.
3. M-step (maximization): based on this clustering of the data points, re-estimate Θ to maximize the likelihood.
4. Repeat steps 2-3 until convergence (typically tens of iterations).

17 EM Details
E-step (estimate the probability that point t is associated with Gaussian j):

w_{t,j} = \frac{\alpha_j \, N(x_t \mid \mu_j, \Sigma_j)}{\sum_{k=1}^{K} \alpha_k \, N(x_t \mid \mu_k, \Sigma_k)}

M-step (estimate the new parameters from these responsibilities):

\alpha_j' = \frac{1}{N}\sum_{t=1}^{N} w_{t,j}, \qquad
\mu_j' = \frac{\sum_t w_{t,j}\, x_t}{\sum_t w_{t,j}}, \qquad
\Sigma_j' = \frac{\sum_t w_{t,j}\,(x_t - \mu_j')(x_t - \mu_j')^{\top}}{\sum_t w_{t,j}}
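A compact NumPy/SciPy sketch of these two steps; the function name and initialization scheme are my own choices, and numerical safeguards are omitted:

import numpy as np
from scipy.stats import multivariate_normal

def em_gmm(X, K, n_iter=100, seed=0):
    """Fit a K-component GMM to data X (N x d) with plain EM."""
    rng = np.random.default_rng(seed)
    N, d = X.shape
    alpha = np.full(K, 1.0 / K)                  # mixture weights
    mu = X[rng.choice(N, K, replace=False)]      # init means at random points
    sigma = np.array([np.cov(X.T) for _ in range(K)])
    for _ in range(n_iter):
        # E-step: responsibilities w[t, j] = P(gaussian j | point t)
        w = np.column_stack([
            alpha[j] * multivariate_normal.pdf(X, mean=mu[j], cov=sigma[j])
            for j in range(K)])
        w /= w.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, covariances
        nj = w.sum(axis=0)                       # effective count per component
        alpha = nj / N
        mu = (w.T @ X) / nj[:, None]
        for j in range(K):
            diff = X - mu[j]
            sigma[j] = (w[:, j, None] * diff).T @ diff / nj[j]
    return alpha, mu, sigma, w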

18 EM Example
[Figure: the Gaussians j overlaid on the data points t; blue shading visualizes the responsibilities w_{t,j}.]

19-25 EM Example (continued)
[Figures: the same plot over successive EM iterations.]

26 Back to Clustering
We want to give "close" pixels the same label. Proposed metric: pixels from the same Gaussian get the same label. Label each pixel t according to the maximum probability:

label(t) = \arg\max_{j} \; w_{t,j}

The number of labels equals K.
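In code, continuing the em_gmm sketch above, this hard labeling is a one-liner over the responsibility matrix w it returns:

labels = np.argmax(w, axis=1)   # label(t) = argmax_j w[t, j], values in {0, ..., K-1}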

27 Graph-Cut Optimization

28 Motivation for Graph-Cuts
Let's recall the car example.

29 Motivation for Graph-Cuts
Suppose we have two clusters in color space. Each pixel is colored by its associated Gaussian's index.

30 Motivation for Graph-Cuts: A Problem, Noise
Why? Pixel labeling is done independently for each pixel, ignoring the spatial relationships between pixels!

31 Formalizing a New Labeling Problem
Previous model for labeling: each pixel independently takes its most probable Gaussian. A new model for labeling: minimize an energy E:

E(f) = E_{data}(f) + \lambda \cdot E_{smooth}(f)

- f = labeling function, assigns a label f_p to each pixel p
- E_data = data term
- E_smooth = smoothness term
- λ is a free parameter

32 The Energy Function
Label set: { j = 1, …, K }
E_data: penalizes disagreement between a pixel and the GMM.
E_smooth: penalizes disagreement between two neighboring pixels, unless it is a natural edge in the image.
dist(p,q) = normalized color distance between p and q.
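The exact formulas live in the slide's figures; one standard instantiation consistent with the bullets above (my notation, not necessarily the authors' exact weighting) is:

E_{data}(f) = \sum_{p} -\log \Pr(I_p \mid f_p), \qquad
E_{smooth}(f) = \sum_{(p,q) \in \mathcal{N}} e^{-dist(p,q)} \cdot \delta(f_p \neq f_q)

so the smoothness penalty is large between similar neighboring colors and small across natural image edges.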

33 Minimizing the Energy
Solving min(E) exactly is NP-hard. It is possible to approximate the solution using iterative methods. Graph-cut based methods approximate the global optimum (to within a constant factor) in polynomial time.
Read: "Fast Approximate Energy Minimization via Graph Cuts", Y. Boykov, O. Veksler and R. Zabih, PAMI 2001.

34 α-Expansion Moves
With iterative methods, some of the pixels change their labels in each iteration. Given a label α, a move from partition P (labeling f) to a new partition P' (labeling f') is called an α-expansion move if the only change is that some set of pixels switches its label to α (i.e., P_α ⊆ P'_α, and P'_l ⊆ P_l for every label l ≠ α).
[Figure: current labeling; a one-pixel move; an α-β-swap move; an α-expansion move.]

35 Algorithm for Minimizing E(f)
1. Start with an arbitrary labeling f.
2. Set success = 0.
3. For each label α ∈ {1, …, K}:
   3.1 Find f̂ = argmin E(f') among f' within one α-expansion of f.
   3.2 If E(f̂) < E(f), set f = f̂ and success = 1.
4. If success == 1, go to 2.
5. Return f.
How do we find argmin E(f')?
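A sketch of this outer loop in Python; energy and best_alpha_expansion are hypothetical callables standing in for the energy evaluation and for the min-cut-based expansion move developed on the next slides:

def minimize_energy(f, labels, energy, best_alpha_expansion):
    """Alpha-expansion outer loop in the style of Boykov-Veksler-Zabih."""
    success = True
    while success:                                   # step 4: repeat while any move helped
        success = False
        for alpha in labels:                         # step 3: try one expansion per label
            f_hat = best_alpha_expansion(f, alpha)   # step 3.1, solved via min-cut
            if energy(f_hat) < energy(f):            # step 3.2
                f, success = f_hat, True
    return f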

36 A Reminder: min-cut / max-flow
Given two terminal nodes α and β in G = (V, E), a cut is a set of edges C ⊆ E that separates α from β in G' = (V, E \ C).
- Also, no proper subset of C separates α from β in G'.
The cost of a cut is defined as the sum of all the edge weights in the cut. The minimum cut of G is the cut C with the lowest cost. The minimum-cut problem is solvable in practically linear time.
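As a concrete aside (not from the slides), this machinery is available in Python via the PyMaxflow package, a wrapper around the Boykov-Kolmogorov solver; a minimal sketch, assuming "pip install PyMaxflow":

import maxflow

g = maxflow.Graph[float]()
nodes = g.add_nodes(2)                    # two non-terminal nodes
g.add_tedge(nodes[0], 2.0, 5.0)           # terminal links: source/sink capacities for node 0
g.add_tedge(nodes[1], 9.0, 4.0)           # terminal links for node 1
g.add_edge(nodes[0], nodes[1], 3.0, 3.0)  # n-link with forward/reverse capacities

flow = g.maxflow()                        # max-flow value == min-cut cost
print(flow, g.get_segment(nodes[0]), g.get_segment(nodes[1]))
# get_segment: 0 = node on the source side of the cut, 1 = sink side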

37 Finding the Optimal Expansion Move
Problem: find f̂ = argmin E(f') among f' within one α-expansion of f.
Solution: translate the problem into a min-cut problem on an appropriately defined graph.

38 Graph Structure for the Optimal Expansion Move
[Figure: a graph with terminal α and terminal not(α); each cut C corresponds 1-1 to a labeling, and the minimum cut gives the labeling minimizing E(f).]

39 A Closer Look
Each pixel gets a node. [Figure: pixel nodes grouped by current label into partitions P1, P2, Pα.]

40 A Closer Look
Add auxiliary nodes between neighboring pixels with different labels. [Figure: partitions P1, P2, Pα with auxiliary nodes added.]

41 A Closer Look
Add two terminal nodes, for α and not(α). [Figure: partitions P1, P2, Pα with the two terminals added.]

42-46 A Closer Look (continued)
[Figures: the construction continues step by step over the partitions P1, P2, Pα, so that cutting the graph decides, per pixel, whether it keeps its old label or switches to α.]

47 Implementation Notes
The neighborhood system can be 4-connected pixels, 8-connected, or even larger.
λ determines the ratio between the data term and the smoothness term.
Solving min(E) is simpler, and possible exactly in polynomial time, when only two labels are involved (see "Interactive Graph Cuts for Optimal Boundary & Region Segmentation of Objects in N-D Images", Y. Boykov and M.-P. Jolly, 2001).
There is a ready-to-use package for solving max-flow (see http://www.cs.cornell.edu/People/vnk/software/maxflow-v2.2.src.tar.gz).

48 Final Project Optimized Color Transfer www.cs.tau.ac.il/~gamliela/color_transfer_project/color_transfer_project.htm

