1
Markov Cluster Algorithm
2
Outline
- Introduction
- Important Concepts in MCL Algorithm
- MCL Algorithm
- The Features of MCL Algorithm
- Summary
3
Graph clustering Decompose a network into subnetworks based on some topological properties Usually we look for dense subnetworks
4
Graph clustering algorithms:
- Exact: have proven solution quality and time complexity
- Approximate: use heuristics to make them efficient
Example algorithms:
- Highly Connected Subgraphs (HCS)
- Restricted Neighborhood Search Clustering (RNSC)
- Molecular Complex Detection (MCODE)
- Markov Cluster Algorithm (MCL)
5
Graph Clustering
Intuition:
- Highly connected nodes could be in one cluster.
- Weakly connected nodes could be in different clusters.
Model:
- A random walk may start at any node.
- Starting at node r, if a random walk reaches node t with high probability, then r and t should be clustered together.
6
Definitions and Representation
An undirected graph and its adjacency matrix representation. An undirected graph and its adjacency list representation.
7
Theorem. Let M be the adjacency matrix for graph G. Then each (i, j) entry in M^r is the number of paths of length r from vertex i to vertex j. Note: this is the standard matrix power of M, not a Boolean product.
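The theorem can be checked directly on a small example. This is a minimal sketch; the triangle graph and the helper names `mat_mul` and `mat_pow` are our assumptions, not from the slides.

```python
# Counting paths with powers of the adjacency matrix M.
# This is the ordinary matrix product, not a Boolean product.

def mat_mul(a, b):
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_pow(m, r):
    out = m
    for _ in range(r - 1):
        out = mat_mul(out, m)
    return out

# Triangle graph on vertices 0, 1, 2: every pair is joined by an edge.
M = [[0, 1, 1],
     [1, 0, 1],
     [1, 1, 0]]

M2 = mat_pow(M, 2)
# (M^2)[0][0] == 2: the two length-2 paths 0-1-0 and 0-2-0.
# (M^2)[0][1] == 1: the single length-2 path 0-2-1.
```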
10
K-path Graph power
The kth power of a graph G is a graph with the same set of vertices as G and an edge between two vertices iff there is a path of length at most k between them. The number of paths of length exactly k between any two nodes can be calculated by raising the adjacency matrix of G to the exponent k. G's kth power is then the graph whose adjacency matrix is the sum of the first k powers of the adjacency matrix of G.
11
K-Path Clustering (figure: G, G^2, G^3)
12
All-Pairs Shortest Paths
Given a weighted graph G(V,E,w), the all-pairs shortest paths problem is to find the shortest paths between all pairs of vertices vi, vj ∈ V. A number of algorithms are known for solving this problem.
13
All-Pairs Shortest Paths: Matrix-Multiplication Based Algorithm
Consider multiplying the weighted adjacency matrix with itself, except that we replace the multiplication operation in matrix multiplication by addition, and the addition operation by minimization (the "min-plus" product). The product of the weighted adjacency matrix with itself then returns a matrix containing the shortest paths using at most two edges between any pair of nodes. It follows by induction that the nth such power A^n contains all shortest paths.
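One step of this min-plus product can be sketched as follows; the three-node weighted path graph and the helper name `min_plus` are assumptions for illustration.

```python
INF = float("inf")

def min_plus(a, b):
    # Matrix "product" with * replaced by + and + replaced by min.
    n = len(a)
    return [[min(a[i][k] + b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Weighted path graph: 0 --2-- 1 --3-- 2 (no direct 0-2 edge).
W = [[0,   2,   INF],
     [2,   0,   3],
     [INF, 3,   0]]

W2 = min_plus(W, W)
# W2[0][2] == 5: the shortest 0->2 route uses two edges (2 + 3).
```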
14
Matrix-Multiplication Based Algorithm
15
Markov Clustering (MCL)
Markov process: the probability that a random walk takes a given edge at node u depends only on u and that edge; it does not depend on the walk's previous route. This assumption simplifies the computation.
16
MCL: flow through the network is used to approximate the partition.
There is an initial amount of flow injected into each node. At each step, a percentage of the flow moves from a node to its neighbours via the outgoing edges.
17
MCL Edge Weight: similarity between two nodes
The weight can be considered as the bandwidth or connectivity of the edge. If one edge has a higher weight than another, more flow passes over it: the amount of flow is proportional to the edge weight. If there are no edge weights, we can assign the same weight to all edges.
18
Intuition of MCL: two natural clusters (figure: clusters A and B)
When the flow reaches the border points, it is more likely to return into the cluster than to cross the border.
19
MCL: when the flow reaches A, it has four possible outcomes: three lead back into the cluster and one leaks out. So ¾ of the flow returns and only ¼ leaks. Flow will accumulate in the center of a cluster (island), while the border nodes starve.
20
Introduction—MCL in General
Simulation of random flow in a graph
Two operations: expansion and inflation
Intrinsic relationship between the MCL process result and cluster structure
21
Introduction-Cluster
Observation 1: The number of higher-length paths in G is
- large for pairs of vertices lying in the same dense cluster
- small for pairs of vertices belonging to different clusters
22
Introduction-Cluster
Observation 2: A random walk in G that visits a dense cluster will likely not leave the cluster until many of its vertices have been visited.
Take driving as an example. Scenario (a): there are many different ways of driving from A to B if A and B are in the same district, and only a few if they are in different districts. Scenario (b): driving around randomly, while obeying traffic regulations, will keep you in the same district for a long time.
23
Definitions
n×n adjacency matrix A:
- A(i,j) = weight on the edge from i to j
- If the graph is undirected, A(i,j) = A(j,i), i.e. A is symmetric
n×n transition matrix P (row stochastic):
- P(i,j) = probability of stepping to node j from node i = A(i,j)/∑_k A(i,k)
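A minimal sketch of building the row-stochastic transition matrix from an adjacency matrix under this definition; the star graph and the helper name `transition_matrix` are assumptions for illustration.

```python
def transition_matrix(A):
    # P(i, j) = A(i, j) / sum_k A(i, k): each row sums to 1.
    P = []
    for row in A:
        s = sum(row)
        P.append([x / s for x in row])
    return P

# Star graph: node 0 joined to nodes 1 and 2.
A = [[0, 1, 1],
     [1, 0, 0],
     [1, 0, 0]]
P = transition_matrix(A)
# P[0] == [0.0, 0.5, 0.5]: from node 0 the walk is equally likely to
# step to node 1 or node 2; nodes 1 and 2 must step back to node 0.
```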
24
Flow Formulation
Flow: the transition probability from one node to another.
Flow matrix: the matrix of flows among all nodes; the ith column represents the flows out of the ith node, and each column sums to 1. (figure: three-node example with its flow matrix)
25
Definitions (figure: an example adjacency matrix A and its transition matrix P)
26
What is a random walk (figure: walker's distribution at t=0)
27
What is a random walk (figure: walker's distribution at t=0 and t=1)
28
What is a random walk (figure: walker's distribution at t=0 through t=2)
29
What is a random walk (figure: walker's distribution at t=0 through t=3)
30
Probability Distributions
x_t(i) = probability that the surfer is at node i at time t
x_{t+1}(i) = ∑_j (probability of being at node j) · Pr(j→i) = ∑_j x_t(j) P(j,i)
x_{t+1} = x_t P = x_{t-1} P² = x_{t-2} P³ = … = x_0 P^{t+1}
What happens when the surfer keeps walking for a long time?
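The iteration x_{t+1} = x_t P can be sketched directly. The three-node example graph (node 0 joined to nodes 1 and 2, with self-loops so the walk is aperiodic and actually converges) is an assumption for illustration.

```python
def step(x, P):
    # x_{t+1}(i) = sum_j x_t(j) * P(j, i)
    n = len(P)
    return [sum(x[j] * P[j][i] for j in range(n)) for i in range(n)]

# Row-stochastic P for node 0 joined to nodes 1 and 2, self-loops everywhere.
P = [[1/3, 1/3, 1/3],
     [1/2, 1/2, 0.0],
     [1/2, 0.0, 1/2]]

x = [1.0, 0.0, 0.0]      # the surfer starts at node 0
for _ in range(60):      # keep walking for a long time
    x = step(x, P)
# x approaches the stationary distribution [3/7, 2/7, 2/7],
# proportional to the loop-augmented degrees 3, 2, 2.
```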
31
Motivation behind MCL: measure or sample any of these quantities (higher-length paths, random walks) and deduce the cluster structure from the behavior of the sampled quantities. Cluster structure will show itself as a peaked distribution of the quantities; a lack of cluster structure will result in a flat distribution.
32
Important Concepts about MCL
Markov Chain Random Walk on Graph Some Definitions in MCL
33
Markov Chain: a random process with the Markov property
Markov property: given the present state, future states are independent of the past states. At each step the process may change its state from the current state to another state, or remain in the same state, according to a certain probability distribution. The changes of state are called transitions, and the probabilities associated with the various state changes are called transition probabilities.
34
Markov Chain Example: take node 1 as an example. Node 1 has 5 options for its next step: nodes 2, 6, 7, 10 and itself. The probability distribution says each of these next nodes is equally likely. Markov chains are applied in many fields, including economics, internet applications (PageRank is defined via a Markov chain), gambling, and physics.
35
Random Walk on a Graph: a walker takes off on some arbitrary vertex
He successively visits new vertices by selecting one of the outgoing edges uniformly at random. There is not much difference between a random walk and a finite Markov chain.
36
Some Definitions in MCL
Simple graph: a simple graph is an undirected graph in which every nonzero weight equals 1.
37
Some Definitions in MCL
Associated matrix: the associated matrix of G, denoted M_G, is defined by setting the entry (M_G)_pq equal to w(v_p, v_q).
38
Some Definitions in MCL
Markov matrix: the Markov matrix associated with a graph G is denoted T_G and is formally defined by letting its qth column be the qth column of M_G normalized so that it sums to 1.
39
Example
40
Explanation to Previous Example
The associated matrix and Markov matrix shown are actually those of M + I, where I denotes the identity matrix. This adds a loop to every vertex of the graph, because it is possible for a walker to stay in the same place at his next step.
41
Example
42
Markov Cluster Algorithm
Find higher-length paths. Starting point: in the associated matrix, the quantity (M^k)_pq has a straightforward interpretation as the number of paths of length k between v_p and v_q.
43
Example: Associated Matrix
M_G and (M_G + I)² (figures). We can interpret the matrix as saying that, for vertex 1, the number of length-2 paths to itself, vertex 6, vertex 7 and vertex 10 is 5, 3, 3 and 4 respectively. So we might presume that these four vertices belong to the same cluster. Is this too arbitrary? These are only length-2 paths, and it seems we should have some kind of threshold.
44
Example: Markov Matrix
The same trend appears here. This matrix corresponds to taking one step of the random walk: it gives the probability distribution over where the walker goes next. For vertex 1, vertices 1, 6, 7 and 10 have the highest probability of being chosen. Does that mean we should simply make those four vertices a cluster? Let's continue:
45
Example-Markov Matrix
The final result is that, for each column, all nonzero values are homogeneously distributed. This can be interpreted as each node being equally attracted to all of its neighbours, or: at each node, one moves to each of its neighbours with equal probability.
46
Conclusion: flow moves more easily within dense regions than across sparse boundaries; however, in the long run this effect disappears. Powers of the matrix can be used to find higher-length paths, but the effect diminishes as the flow goes on.
47
Inflation Operation. Idea: how can we change the distribution of transition probabilities so that preferred neighbours are further favoured and less popular neighbours are demoted? The MCL solution: raise all the entries in a given column to a certain power greater than 1 (e.g. squaring) and rescale the column so that it sums to 1 again.
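A minimal sketch of this inflation operator on a column-stochastic matrix; the 2×2 example values and the helper name `inflate` are assumptions for illustration.

```python
def inflate(M, r=2):
    # Raise every entry of the column-stochastic matrix M to the power r,
    # then rescale each column so it sums to 1 again.
    n = len(M)
    out = [[M[i][j] ** r for j in range(n)] for i in range(n)]
    for j in range(n):
        s = sum(out[i][j] for i in range(n))
        for i in range(n):
            out[i][j] /= s
    return out

# Column 0 already prefers its first neighbour; column 1 is homogeneous.
M = [[0.7, 0.5],
     [0.3, 0.5]]
M2 = inflate(M, 2)
# Column 0: 0.49/0.58 vs 0.09/0.58 -- the preferred neighbour is
# further favoured (0.845 vs 0.155).
# Column 1 stays [0.5, 0.5]: homogeneous columns are unchanged.
```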
48
Example for Inflation Operation
From the examples we can tell that this operation does not change entries that are homogeneously distributed: different column positions with nearly identical values will still be close to each other after rescaling. Entries whose values are not close are pulled further apart.
49
Definition for Inflation Operation
50
Apply Inflation Operation to the previous Markov Matrix
51
Inflation Effects
52
MCL Operations
Expansion operation: power of the matrix; expansion of dense regions
Inflation operation: mentioned above; elimination of unfavoured regions
53
The MCL algorithm
Input: adjacency matrix A
Initialize M to M_G, the canonical transition matrix: M := M_G := (A + I) D⁻¹. (Adding self-loops enhances flow to well-connected nodes as well as to new nodes.)
Repeat until converged:
- Expand: M := M · M
- Inflate: M := M.^r (r usually 2), then renormalize the columns. (Increases inequality in each column: "rich get richer, poor get poorer.")
- Prune: remove entries close to zero. (Saves memory.)
Output clusters
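These steps can be sketched as one loop. This is a minimal, unoptimized illustration under our own assumptions (parameter names, pruning threshold, and convergence test are ours, not van Dongen's implementation).

```python
def mcl(A, r=2, max_iter=100, tol=1e-9, prune=1e-6):
    n = len(A)
    # M := (A + I) D^-1 : add self-loops, then normalize each column.
    M = [[A[i][j] + (1.0 if i == j else 0.0) for j in range(n)]
         for i in range(n)]
    for j in range(n):
        s = sum(M[i][j] for i in range(n))
        for i in range(n):
            M[i][j] /= s
    for _ in range(max_iter):
        # Expand: M := M * M
        E = [[sum(M[i][k] * M[k][j] for k in range(n)) for j in range(n)]
             for i in range(n)]
        # Inflate: entry-wise power r, renormalize columns, prune tiny entries.
        for j in range(n):
            col = [E[i][j] ** r for i in range(n)]
            s = sum(col)
            for i in range(n):
                v = col[i] / s
                E[i][j] = v if v > prune else 0.0
        for j in range(n):  # renormalize again after pruning
            s = sum(E[i][j] for i in range(n))
            for i in range(n):
                E[i][j] /= s
        diff = max(abs(E[i][j] - M[i][j]) for i in range(n) for j in range(n))
        M = E
        if diff < tol:
            break
    # Each attractor (nonzero diagonal) collects the nodes whose flow
    # ends up on it; duplicate member sets are merged.
    clusters = []
    for i in range(n):
        if M[i][i] > tol:
            members = frozenset(j for j in range(n) if M[i][j] > tol)
            if members not in clusters:
                clusters.append(members)
    return clusters

# Two triangles {0,1,2} and {3,4,5} joined by the single edge 2-3.
A = [[0, 1, 1, 0, 0, 0],
     [1, 0, 1, 0, 0, 0],
     [1, 1, 0, 1, 0, 0],
     [0, 0, 1, 0, 1, 1],
     [0, 0, 0, 1, 0, 1],
     [0, 0, 0, 1, 1, 0]]
# The two triangles come out as separate clusters.
```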
54
MCL Result for the Graph
55
A Striking Example
56
Striking Animation (animation.html)
57
Mapping nonnegative idempotent matrices onto clusters
Find attractors: node a is an attractor if M_aa is nonzero.
Find attractor systems: if a is an attractor, then the set of its neighbours is called an attractor system.
Any node with an arc to some node of an attractor system belongs to the same cluster as that attractor system.
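This mapping can be sketched on a small, hypothetical limit matrix: a 3-node column-stochastic idempotent M in which nodes 0 and 1 form an attractor system and node 2 is attracted to them (0-based indices; the matrix values are our assumption).

```python
# Reading clusters off a nonnegative idempotent (limit) matrix M.
# M here satisfies M*M == M; nodes 0 and 1 have nonzero diagonal.
M = [[0.5, 0.5, 0.5],
     [0.5, 0.5, 0.5],
     [0.0, 0.0, 0.0]]

n = len(M)
attractors = [a for a in range(n) if M[a][a] > 0]
clusters = set()
for a in attractors:
    # every node j whose flow ends up at attractor a joins a's cluster
    clusters.add(frozenset(j for j in range(n) if M[a][j] > 0))
# attractors == [0, 1]; the attractor system {0, 1} plus node 2
# form the single cluster {0, 1, 2}.
```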
58
Example Attractor Set={1,2,3,4,5,6,7,8,9,10}
The Attractor System is {1,2,3},{4,5,6,7},{8,9},{10} The overlapping clusters are {1,2,3,11,12,15},{4,5,6,7,13},{8,9,12,13,14,15},{10,12,13}
59
MCL features: how many steps are required before the algorithm converges to an idempotent matrix? The number is typically somewhere between 10 and 100. Also of interest: the effect of inflation on cluster granularity.
60
Summary: MCL simulates a random walk on a graph to find clusters.
Expansion promotes dense regions while inflation demotes the less favoured regions. There is an intrinsic relationship between the MCL result and the cluster structure.
61
Markov Clustering (MCL) [van Dongen '00]
The original algorithm for clustering graphs using stochastic flows.
Advantages: simple and elegant; widely used in bioinformatics because of its noise tolerance and effectiveness.
Disadvantages: very slow (takes 1.2 hours to cluster a 76K-node social network); prone to outputting too many clusters (produces 1416 clusters on a 4741-node PPI network).