MST Lemma Let G = (V, E) be a connected, undirected graph with real-valued weights on the edges. Let A be a viable subset of E (i.e., a subset of some MST), let (S, V-S) be any cut that respects A, and let (u,v) be a light edge crossing this cut. Then the edge (u,v) is safe for A. Point of the Lemma: the greedy strategy works!

Basics of Kruskal’s Algorithm
It works by attempting to add edges to A in increasing order of weight (lightest edge first).
–If the next edge does not induce a cycle among the current set of edges, then it is added to A.
–If it does, then this edge is passed over, and we consider the next edge in order.
–As the algorithm runs, the edges of A induce a forest on the vertices, and the trees of this forest are merged together until we have a single tree containing all the vertices.

Detecting a Cycle
We could perform a DFS on the subgraph induced by the edges of A, but this takes too much time. Use a “disjoint-set UNION-FIND” data structure instead. This data structure supports 3 operations:
–Create-Set(u): create a set containing u.
–Find-Set(u): find the set that contains u.
–Union(u, v): merge the sets containing u and v.
Each can be performed in O(lg n) time. The elements stored in the sets are the vertices of the graph; each set consists of the vertices of one tree of the forest A (A itself is stored as a simple list of edges).
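
To make these operations concrete, here is a minimal disjoint-set sketch in Python. This is a sketch only: the class and method names are our own, and the union-by-rank plus path-compression variant shown is one standard choice that meets (in fact beats, in the amortized sense) the O(lg n) bound quoted above.

class DisjointSet:
    """Disjoint-set (union-find) with union by rank and path compression."""

    def __init__(self):
        self.parent = {}
        self.rank = {}

    def create_set(self, u):
        # Create-Set(u): a singleton set containing u
        self.parent[u] = u
        self.rank[u] = 0

    def find_set(self, u):
        # Find-Set(u): return the representative of u's set,
        # compressing the path along the way
        if self.parent[u] != u:
            self.parent[u] = self.find_set(self.parent[u])
        return self.parent[u]

    def union(self, u, v):
        # Union(u, v): merge the sets containing u and v, by rank
        ru, rv = self.find_set(u), self.find_set(v)
        if ru == rv:
            return
        if self.rank[ru] < self.rank[rv]:
            ru, rv = rv, ru
        self.parent[rv] = ru
        if self.rank[ru] == self.rank[rv]:
            self.rank[ru] += 1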

MST-Kruskal(G, w)
1. A ← ∅                                  // initially A is empty
2. for each vertex v ∈ V[G]               // lines 2-3 take O(V) time
3.     do Create-Set(v)                   // create a set for each vertex
4. sort the edges of E by nondecreasing weight w
5. for each edge (u,v) ∈ E, in order by nondecreasing weight
6.     do if Find-Set(u) ≠ Find-Set(v)    // u and v are in different trees
7.         then A ← A ∪ {(u,v)}
8.              Union(u,v)
9. return A
Total running time is O(E lg E).
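
The same procedure as runnable Python, a sketch assuming the graph is given as a list of vertices plus a list of (weight, u, v) edge triples, and reusing the DisjointSet sketch above (the function name mst_kruskal is our own):

def mst_kruskal(vertices, edges):
    # edges: list of (weight, u, v) triples; returns the MST edge list A
    ds = DisjointSet()
    for v in vertices:                        # pseudocode lines 2-3
        ds.create_set(v)
    A = []
    for w, u, v in sorted(edges):             # line 4: nondecreasing weight
        if ds.find_set(u) != ds.find_set(v):  # line 6: u and v in different trees
            A.append((u, v))                  # line 7: (u, v) is safe for A
            ds.union(u, v)                    # line 8: merge the two trees
    return A

For example, mst_kruskal(['a', 'b', 'c'], [(1, 'a', 'b'), (2, 'b', 'c'), (3, 'a', 'c')]) returns [('a', 'b'), ('b', 'c')].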

Example: Kruskal’s Algorithm (Figure 4: Kruskal’s Algorithm; step-by-step illustration not reproduced in this transcript)

Analysis of Kruskal’s
–Lines 1-3 (initialization): O(V)
–Line 4 (sorting): O(E lg E)
–Lines 6-8 (set operations): O(E lg E)
Total: O(E lg E)

Correctness Consider the edge (u, v) that the algorithm seeks to add next, and suppose that this edge does not induce a cycle in A. Let A’ denote the tree of the forest A that contains vertex u, and consider the cut (A’, V-A’). No edge of A crosses this cut (each tree of the forest lies entirely on one side), so the cut respects A. Moreover, (u, v) is a light edge crossing the cut: any lighter crossing edge was considered earlier, and whether it was added or skipped, its endpoints ended up in the same tree, so it cannot cross this cut now. Thus, by the MST Lemma, (u,v) is safe.

Intuition behind Prim’s Algorithm Consider the set of vertices S currently part of the tree, and its complement (V-S). This gives a cut of the graph, and the cut respects the current set of tree edges A. Which edge should we add next? A light edge crossing the cut!

Basics of Prim’s Algorithm
It works by building the tree up, adding one leaf at a time to the current tree.
–Start with the root vertex r (it can be any vertex). At any time, the subset of edges A forms a single tree; S = the vertices of A.
–At each step, a light edge connecting a vertex in S to a vertex in V-S is added to the tree.
–The tree grows until it spans all the vertices in V.
Implementation issues:
–How to update the cut efficiently?
–How to determine the light edge quickly?

Implementation: Priority Queue
A priority queue implemented with a heap can support the following operations in O(lg n) time:
–Insert(Q, u, key): insert u with key value key into Q
–u = Extract_Min(Q): extract the item with minimum key value from Q
–Decrease_Key(Q, u, new_key): decrease u’s key value to new_key
All the vertices that are not in S (S = the vertices of the edges in A) reside in a priority queue Q based on a key field. When the algorithm terminates, Q is empty.
A = {(v, π[v]) : v ∈ V - {r}}
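
One way to get this interface in Python, as a sketch only: the standard heapq module has no Decrease_Key, so a common workaround is lazy deletion, i.e. push a fresh entry and skip stale ones on extraction. The MinPQ name and its methods below are our own, not part of the slides.

import heapq
import itertools

class MinPQ:
    """Min-priority queue on heapq; Decrease_Key is emulated by pushing a
    fresh (key, item) entry and skipping stale entries on Extract_Min."""

    def __init__(self):
        self._heap = []
        self._key = {}                      # current key of each live item
        self._live = set()                  # items still in the queue
        self._counter = itertools.count()   # tie-breaker so items are never compared

    def insert(self, u, key):
        self._key[u] = key
        self._live.add(u)
        heapq.heappush(self._heap, (key, next(self._counter), u))

    def decrease_key(self, u, new_key):
        # Push a fresh entry; the old one becomes stale and is skipped later.
        if u in self._live and new_key < self._key[u]:
            self._key[u] = new_key
            heapq.heappush(self._heap, (new_key, next(self._counter), u))

    def extract_min(self):
        while self._heap:
            key, _, u = heapq.heappop(self._heap)
            if u in self._live and key == self._key[u]:  # skip stale entries
                self._live.remove(u)
                return u
        raise KeyError("extract_min from an empty priority queue")

    def __contains__(self, u):
        return u in self._live

    def __len__(self):
        return len(self._live)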

MST-Prim(G, w, r)
1. Q ← V[G]
2. for each vertex u ∈ Q                  // initialization: O(V) time
3.     do key[u] ← ∞
4. key[r] ← 0                             // start at the root
5. π[r] ← NIL                             // set parent of r to be NIL
6. while Q ≠ ∅                            // until all vertices are in the MST
7.     do u ← Extract-Min(Q)              // vertex with lightest edge
8.         for each v ∈ adj[u]
9.             do if v ∈ Q and w(u,v) < key[v]
10.                then π[v] ← u
11.                     key[v] ← w(u,v)   // new lighter edge out of v
12.                     Decrease-Key(Q, v, key[v])
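
A runnable Python counterpart, a sketch assuming the graph is an adjacency dict mapping each vertex to a list of (neighbor, weight) pairs and reusing the MinPQ sketch above (mst_prim and the variable names are our own):

import math

def mst_prim(adj, r):
    # adj: dict mapping each vertex to a list of (neighbor, weight) pairs
    key = {u: math.inf for u in adj}        # lines 2-3: all keys start at infinity
    parent = {u: None for u in adj}         # pi[u]; pi[r] stays None (NIL)
    key[r] = 0                              # line 4: start at the root r
    Q = MinPQ()                             # line 1: Q holds every vertex
    for u in adj:
        Q.insert(u, key[u])
    while len(Q) > 0:                       # line 6: until Q is empty
        u = Q.extract_min()                 # line 7: lightest edge crossing the cut
        for v, w_uv in adj[u]:              # lines 8-12: update u's neighbors
            if v in Q and w_uv < key[v]:
                parent[v] = u               # line 10
                key[v] = w_uv               # line 11
                Q.decrease_key(v, w_uv)     # line 12
    return [(v, parent[v]) for v in adj if parent[v] is not None]

For adj = {'a': [('b', 1), ('c', 3)], 'b': [('a', 1), ('c', 2)], 'c': [('a', 3), ('b', 2)]}, mst_prim(adj, 'a') returns [('b', 'a'), ('c', 'b')], i.e. the set A = {(v, π[v]) : v ∈ V - {r}} from the previous slide.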

Example: Prim’s Algorithm

Analysis of Prim’s
Extracting a vertex from the queue: O(lg n).
For each incident edge, decreasing the key of the neighboring vertex: O(lg n), where n = |V|.
The other steps take constant time.
The overall running time, with e = |E|, is
T(n) = Σ_{u ∈ V} (lg n + deg(u)·lg n) = Σ_{u ∈ V} (1 + deg(u))·lg n = lg n·(n + 2e) = O((n + e) lg n)
Essentially the same as Kruskal’s: O((n + e) lg n) time (and since lg e = Θ(lg n), this matches Kruskal’s O(e lg e) bound).