Minimum Spanning Trees

1 Minimum Spanning Trees
[Figure: a weighted graph whose vertices are U.S. airports (BOS, PVD, ORD, JFK, SFO, BWI, DFW, LAX, MIA) and whose edge weights are flight distances.]

2 Minimum Spanning Trees
Spanning subgraph: a subgraph of a graph G containing all the vertices of G.
Spanning tree: a spanning subgraph that is itself a (free) tree.
Minimum spanning tree (MST): a spanning tree of a weighted graph with minimum total edge weight.
Applications: communications networks, transportation networks.
[Figure: example weighted graph on airports ORD, PIT, DEN, STL, DCA, DFW, ATL.]
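As a small illustration (not from the slides), a weighted graph can be stored in Python as a list of (weight, u, v) edges, and the total weights of two spanning trees compared directly:

# Weighted graph as a list of (weight, u, v) edges (illustrative toy graph).
edges = [(1, "A", "B"), (2, "B", "C"), (3, "A", "C"), (4, "C", "D")]

# One spanning tree: contains all vertices and is a (free) tree, i.e. n - 1 edges, no cycles.
spanning_tree = [(1, "A", "B"), (3, "A", "C"), (4, "C", "D")]   # total weight 8

# A minimum spanning tree has the smallest possible total edge weight.
mst = [(1, "A", "B"), (2, "B", "C"), (4, "C", "D")]             # total weight 7

def total_weight(tree):
    return sum(w for w, _, _ in tree)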

3 Cycle Property
Cycle Property: Let T be a minimum spanning tree of a weighted graph G, and let e be an edge of G that is not in T. Let C be the cycle formed by e with T. Then, for every edge f of C, weight(f) ≤ weight(e).
Proof (by contradiction): If weight(f) > weight(e), we could obtain a spanning tree of smaller total weight by replacing f with e.
[Figure: a spanning tree with a non-tree edge e and the cycle C it forms; replacing an edge f of C with e yields a better spanning tree.]
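A minimal Python sketch of what the cycle property asserts, assuming the tree T is given as a list of (weight, u, v) edges; the helper names (tree_path, check_cycle_property) are illustrative, not from the text:

def tree_path(tree_edges, start, goal):
    """Edges on the unique path from start to goal in the tree (iterative DFS)."""
    adj = {}
    for w, u, v in tree_edges:
        adj.setdefault(u, []).append((v, (w, u, v)))
        adj.setdefault(v, []).append((u, (w, u, v)))
    stack, seen = [(start, [])], {start}
    while stack:
        node, path = stack.pop()
        if node == goal:
            return path
        for nxt, edge in adj[node]:
            if nxt not in seen:
                seen.add(nxt)
                stack.append((nxt, path + [edge]))
    return []

def check_cycle_property(tree_edges, e):
    """For a non-tree edge e = (w, u, v): every edge f on the cycle formed
    by e with the tree satisfies weight(f) <= weight(e)."""
    w, u, v = e
    return all(f[0] <= w for f in tree_path(tree_edges, u, v))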

4 Partition Property
Partition Property: Consider a partition of the vertices of G into subsets U and V, and let e be an edge of minimum weight across the partition. Then there is a minimum spanning tree of G containing edge e.
Proof: Let T be an MST of G. If T does not contain e, consider the cycle C formed by e with T, and let f be an edge of C across the partition. By the cycle property, weight(f) ≤ weight(e). Since e has minimum weight across the partition, weight(f) = weight(e), so replacing f with e yields another MST.
[Figure: a partition of the vertices into U and V; replacing the crossing edge f with the minimum-weight crossing edge e yields another MST.]
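Similarly, a short illustrative sketch (same assumed (weight, u, v) edge representation) of finding the minimum-weight edge across a partition (U, V), which by the property belongs to some MST:

def min_crossing_edge(edges, U):
    """Return a minimum-weight edge with exactly one endpoint in U."""
    U = set(U)
    crossing = [e for e in edges if (e[1] in U) != (e[2] in U)]
    return min(crossing) if crossing else None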

5 Minimum Spanning Trees
Kruskal’s Algorithm
Maintain a partition of the vertices into clusters:
  Initially, single-vertex clusters
  Keep an MST for each cluster
  Merge the “closest” clusters and their MSTs
A priority queue stores the edges outside the clusters:
  Key: weight
  Element: edge
At the end of the algorithm:
  One cluster and one MST

Algorithm KruskalMST(G)
  for each vertex v in G do
    create a cluster consisting of v
  let Q be a priority queue; insert all edges into Q
  T ← ∅   {T is the union of the MSTs of the clusters}
  while T has fewer than n - 1 edges do
    e ← Q.removeMin().getValue()
    [u, v] ← G.endVertices(e)
    A ← getCluster(u)
    B ← getCluster(v)
    if A ≠ B then
      add edge e to T
      mergeClusters(A, B)
  return T
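A minimal Python sketch of this cluster-merging version, assuming edges are given as (weight, u, v) tuples; clusters are kept in a plain dictionary here rather than the partition structure introduced on later slides:

import heapq

def kruskal_mst(vertices, edges):
    """Kruskal's algorithm with explicit cluster sets.
    vertices: iterable of vertex labels; edges: (weight, u, v) tuples.
    Returns the MST edges as a list of (weight, u, v)."""
    vertices = list(vertices)
    # Initially, each vertex forms its own single-vertex cluster.
    cluster = {v: {v} for v in vertices}
    # Min-heap of edges keyed by weight (tuples compare by weight first).
    pq = list(edges)
    heapq.heapify(pq)
    mst = []
    while pq and len(mst) < len(vertices) - 1:
        w, u, v = heapq.heappop(pq)        # e <- Q.removeMin()
        a, b = cluster[u], cluster[v]      # A <- getCluster(u), B <- getCluster(v)
        if a is not b:                     # if A != B then
            mst.append((w, u, v))          #   add edge e to T
            a |= b                         #   mergeClusters(A, B): absorb B into A
            for x in b:                    #   and repoint B's vertices to A
                cluster[x] = a
    return mst

Merging here repoints every vertex of one cluster regardless of size; always moving the smaller cluster is exactly what the partition data structure on the later slides makes efficient.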

6 Example
[Figure: four snapshots of Kruskal’s algorithm on a graph with vertices A–H and edge weights 1–11, accepting edges in increasing order of weight.]

7 Example (contd.)
[Figure: continuation of the Kruskal example; the left pair of snapshots advances four steps and the right pair two steps, completing the MST.]

8 Data Structures for Kruskal’s Algorithm
The graph is implemented using adjacency lists. The algorithm maintains a forest of trees. A priority queue, implemented as a min-heap, extracts the edges in order of increasing weight. An edge is accepted if it connects distinct trees.
We need a data structure that maintains a partition, i.e., a collection of disjoint sets, with the operations:
  makeSet(u): create a set consisting of u
  find(u): return the set storing u
  union(A, B): replace sets A and B with their union

9 Recall of Data Structures for Disjoint Sets
Each set is stored as a linked list and is represented by its first member. Each list element (set member) has a reference to the set representative. Operation find(u) takes O(1) time and returns the set of which u is a member. In operation union(A, B), we move the elements of the smaller set into the sequence of the larger set and update their references. The time for operation union(A, B) is O(min(|A|, |B|)). Whenever an element is moved, it goes into a set of at least double the size; hence each element is moved at most O(log n) times.
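A minimal Python sketch of this list-based structure, under the assumption that each set is a Python list and each element keeps a reference to its set (the class and method names are illustrative):

class Partition:
    """Disjoint sets stored as Python lists; each element references its set."""

    def __init__(self):
        self._set_of = {}            # element -> the list holding it

    def make_set(self, u):
        """Create a singleton set containing u."""
        self._set_of[u] = [u]

    def find(self, u):
        """Return the set storing u in O(1) time (follow the reference)."""
        return self._set_of[u]

    def union(self, a, b):
        """Merge sets a and b: move the smaller set's elements into the
        larger one and update their references, in O(min(|A|, |B|)) time."""
        if a is b:
            return a
        if len(a) < len(b):
            a, b = b, a              # a is now the larger set
        for x in b:
            self._set_of[x] = a      # each moved element joins a set at least
        a.extend(b)                  # twice the size of its old one
        b.clear()
        return a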

10 Partition-Based Implementation
Partition-based version of Kruskal’s Algorithm:
  Cluster merges as unions
  Cluster locations as finds
Running time O((n + m) log n):
  PQ operations: O(m log m) = O(m log n)
  UF operations: O(n log n)

Algorithm KruskalMST(G)
  initialize a partition P
  for each vertex v in G do
    P.makeSet(v)
  let Q be a priority queue; insert all edges into Q
  T ← ∅   {T is the union of the MSTs of the clusters}
  while T has fewer than n - 1 edges do
    e ← Q.removeMin().getValue()
    [u, v] ← G.endVertices(e)
    A ← P.find(u)
    B ← P.find(v)
    if A ≠ B then
      add edge e to T
      P.union(A, B)
  return T
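Using the Partition sketch from the previous slide, this pseudocode translates almost line for line into Python (same assumed (weight, u, v) edge tuples):

import heapq

def kruskal_mst_partition(vertices, edges):
    """Partition-based Kruskal: returns the MST edges as (weight, u, v)."""
    vertices = list(vertices)
    p = Partition()
    for v in vertices:                 # P.makeSet(v) for every vertex
        p.make_set(v)
    pq = list(edges)                   # insert all edges into Q
    heapq.heapify(pq)
    mst = []                           # T <- empty
    while pq and len(mst) < len(vertices) - 1:
        w, u, v = heapq.heappop(pq)    # e <- Q.removeMin()
        a, b = p.find(u), p.find(v)    # A <- P.find(u), B <- P.find(v)
        if a is not b:                 # if A != B then
            mst.append((w, u, v))      #   add edge e to T
            p.union(a, b)              #   P.union(A, B)
    return mst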

11 Prim-Jarnik’s Algorithm
Prim-Jarnik’s algorithm is similar to Dijkstra’s algorithm. We pick an arbitrary vertex s and grow the MST as a cloud of vertices, starting from s. We store with each vertex v a label D(v) representing the smallest weight of an edge connecting v to a vertex in the cloud.
At each step:
  We add to the cloud the vertex u outside the cloud with the smallest label D(u)
  We update the labels of the vertices adjacent to u

12 Prim-Jarnik’s Algorithm (cont.)
We use adjacency lists to represent the input graph. We use a priority queue to store, for each vertex v, the pair (v, e) with key D(v), where e is the minimum-weight edge connecting v to the cloud and D(v) is that weight. The priority queue is implemented as a min-heap.

13 Prim-Jarnik’s Algorithm (cont.)
Algorithm PrimJarnikMST(G)
  pick any vertex v of G
  D[v] ← 0
  for each vertex u ≠ v do
    D[u] ← +∞
  initialize T ← ∅
  initialize a priority queue Q with an entry ((u, null), D[u]) for each vertex u,
    where (u, null) is the element and D[u] is the key
  while Q is not empty do
    (u, e) ← Q.removeMin()
    add vertex u and edge e to T
    for each vertex z adjacent to u such that z is in Q do
      if weight((u, z)) < D[z] then
        D[z] ← weight((u, z))
        change to (z, (u, z)) the element of vertex z in Q
        change to D[z] the key of vertex z in Q
  return the tree T
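A minimal Python sketch of the same algorithm, assuming the graph is a dict mapping each vertex to a list of (neighbor, weight) pairs. Since heapq has no key-decrease operation, this sketch pushes fresh entries and skips stale ones (lazy deletion) instead of the adaptable priority queue assumed by the pseudocode:

import heapq

def prim_jarnik_mst(graph, s=None):
    """graph: dict vertex -> list of (neighbor, weight).
    Returns the MST edges as a list of (weight, u, v)."""
    if s is None:
        s = next(iter(graph))             # pick any vertex of G
    D = {u: float("inf") for u in graph}  # D[u] <- +infinity
    D[s] = 0                              # D[s] <- 0 for the start vertex
    pq = [(0, s, None)]                   # entries: (key D[u], u, edge into the cloud)
    in_cloud = set()
    mst = []
    while pq:
        d, u, e = heapq.heappop(pq)       # (u, e) <- Q.removeMin()
        if u in in_cloud or d > D[u]:     # stale entry: u already in the cloud
            continue
        in_cloud.add(u)
        if e is not None:                 # add vertex u and edge e to T
            mst.append(e)
        for z, w in graph[u]:             # update labels of vertices adjacent to u
            if z not in in_cloud and w < D[z]:
                D[z] = w                  # D[z] <- weight((u, z))
                heapq.heappush(pq, (w, z, (w, u, z)))
    return mst

With lazy deletion the heap may hold up to m entries, so each operation costs O(log m) = O(log n), matching the bound on the analysis slide.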

14 Minimum Spanning Trees
Example
[Figure: four snapshots of Prim-Jarnik’s algorithm on a graph with vertices A–F, showing how the labels D(v) change as the cloud grows.]

15 Minimum Spanning Trees
Example (contd.)
[Figure: the final two snapshots of the Prim-Jarnik example, completing the MST.]

16 Minimum Spanning Trees
Analysis
Let n and m denote the number of vertices and edges of the input graph, respectively. Since the priority queue is implemented as a heap, we can extract the vertex u with the smallest label in O(log n) time. We can also update each D[z] value in O(log n) time (why?). Each such update is done at most once per edge (u, z). Hence, Prim-Jarnik’s algorithm runs in O((n + m) log n) time.

17 Minimum Spanning Trees
Readings
M. T. Goodrich, R. Tamassia, and D. Mount, Data Structures and Algorithms in C++, 2nd edition, John Wiley & Sons, Chapter 13.

