
MST-Kruskal(G, w)
1. A ← ∅                                  // initially A is empty
2. for each vertex v ∈ V[G]               // lines 2-3 take O(V) time
3.     do Create-Set(v)                   // create a set for each vertex
4. sort the edges of E by nondecreasing weight w
5. for each edge (u,v) ∈ E, in order of nondecreasing weight
6.     do if Find-Set(u) ≠ Find-Set(v)    // u and v are in different trees
7.         then A ← A ∪ {(u,v)}
8.              Union(u,v)
9. return A
Total running time is O(E lg E). UNC Chapel Hill Lin/Foskey/Manocha
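The pseudocode above can be sketched as a runnable program. The graph representation below (an edge list of (weight, u, v) tuples over hashable vertex labels) and the helper names are illustrative assumptions, not part of the original slides; the disjoint sets are implemented with path compression and union by rank.

```python
def mst_kruskal(vertices, edges):
    """Kruskal's algorithm. `edges` is a list of (w, u, v) tuples;
    returns the list A of MST edges as (u, v, w) tuples."""
    parent = {v: v for v in vertices}   # Create-Set(v) for each vertex
    rank = {v: 0 for v in vertices}

    def find_set(x):                    # Find-Set with path compression
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    def union(x, y):                    # Union by rank
        x, y = find_set(x), find_set(y)
        if rank[x] < rank[y]:
            x, y = y, x
        parent[y] = x
        if rank[x] == rank[y]:
            rank[x] += 1

    A = []
    for w, u, v in sorted(edges):       # sort by nondecreasing weight: O(E lg E)
        if find_set(u) != find_set(v):  # u and v lie in different trees
            A.append((u, v, w))
            union(u, v)
    return A
```

For a 4-vertex example with edge weights ab=1, ac=4, bc=3, bd=2, cd=5, the returned tree has the three edges ab, bd, bc with total weight 6.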

Analysis of Kruskal
Lines 1-3 (initialization): O(V)
Line 4 (sorting): O(E lg E)
Lines 6-8 (set operations): O(E lg E)
Total: O(E lg E)

Correctness of Kruskal
Idea: Show that every edge added is a safe edge for A.
Assume (u, v) is the next edge to be added to A; since Find-Set(u) ≠ Find-Set(v), it will not create a cycle.
Let A' denote the tree of the forest A that contains vertex u. Consider the cut (A', V - A').
This cut respects A (why?) and (u, v) is the light edge across the cut (why?).
Thus, by the MST Lemma, (u, v) is safe.

Intuition behind Prim's Algorithm
Consider the set of vertices S currently part of the tree, and its complement (V - S).
We have a cut of the graph, and the current set of tree edges A is respected by this cut.
Which edge should we add next? The light edge!

Basics of Prim's Algorithm
It works by adding leaves one at a time to the current tree.
Start with the root vertex r (it can be any vertex). At any time, the subset of edges A forms a single tree; S = vertices of A.
At each step, a light edge connecting a vertex in S to a vertex in V - S is added to the tree.
The tree grows until it spans all the vertices in V.
Implementation issues: How to update the cut efficiently? How to determine the light edge quickly?

Implementation: Priority Queue
A priority queue implemented using a heap can support the following operations in O(lg n) time:
Insert(Q, u, key): insert u with key value key into Q
u = Extract-Min(Q): extract the item with minimum key value from Q
Decrease-Key(Q, u, new_key): decrease the value of u's key to new_key
All the vertices that are not in S (the vertices of the edges in A) reside in a priority queue Q, keyed on a key field.
When the algorithm terminates, Q is empty and A = {(v, π[v]) : v ∈ V - {r}}.
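The three operations on the slide can be sketched with a small indexed binary min-heap; the class and method names below are illustrative, and the position map `pos` is what makes Decrease-Key O(lg n) by letting us find an item's heap index in O(1).

```python
class MinPQ:
    """Minimal indexed binary min-heap: Insert, Extract-Min and
    Decrease-Key all run in O(lg n)."""
    def __init__(self):
        self.heap = []        # list of (key, item) pairs
        self.pos = {}         # item -> index in self.heap

    def _swap(self, i, j):
        self.heap[i], self.heap[j] = self.heap[j], self.heap[i]
        self.pos[self.heap[i][1]] = i
        self.pos[self.heap[j][1]] = j

    def _sift_up(self, i):                  # restore heap order upward
        while i > 0 and self.heap[(i - 1) // 2][0] > self.heap[i][0]:
            self._swap(i, (i - 1) // 2)
            i = (i - 1) // 2

    def _sift_down(self, i):                # restore heap order downward
        n = len(self.heap)
        while True:
            smallest = i
            for c in (2 * i + 1, 2 * i + 2):
                if c < n and self.heap[c][0] < self.heap[smallest][0]:
                    smallest = c
            if smallest == i:
                return
            self._swap(i, smallest)
            i = smallest

    def insert(self, item, key):
        self.heap.append((key, item))
        self.pos[item] = len(self.heap) - 1
        self._sift_up(len(self.heap) - 1)

    def extract_min(self):
        self._swap(0, len(self.heap) - 1)
        key, item = self.heap.pop()
        del self.pos[item]
        if self.heap:
            self._sift_down(0)
        return item

    def decrease_key(self, item, new_key):
        i = self.pos[item]                  # O(1) lookup via pos map
        self.heap[i] = (new_key, item)
        self._sift_up(i)                    # key only decreases, so sift up

    def __len__(self):
        return len(self.heap)
```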

Example: Prim's Algorithm (worked figure not reproduced in transcript)

MST-Prim(G, w, r)
1. Q ← V[G]
2. for each vertex u ∈ Q          // initialization: O(V) time
3.     do key[u] ← ∞
4. key[r] ← 0                     // start at the root
5. π[r] ← NIL                     // set parent of r to NIL
6. while Q ≠ ∅                    // until all vertices are in the MST
7.     do u ← Extract-Min(Q)      // vertex with lightest edge
8.        for each v ∈ Adj[u]
9.            do if v ∈ Q and w(u,v) < key[v]
10.               then π[v] ← u
11.                    key[v] ← w(u,v)    // new lighter edge connecting v to the tree
12.                    Decrease-Key(Q, v, key[v])
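A runnable sketch of the pseudocode, under an assumed adjacency-dict representation (adj[u] = list of (v, w(u,v)) pairs for an undirected graph). Python's standard heapq has no Decrease-Key, so this version pushes a fresh queue entry instead and skips stale ones on extraction (lazy deletion); the asymptotic bound is unchanged.

```python
import heapq

def mst_prim(adj, r):
    """Prim's algorithm starting from root r. Returns the parent map
    pi; the MST edges are {(v, pi[v]) : v != r}."""
    key = {u: float('inf') for u in adj}   # key[u] <- infinity
    pi = {u: None for u in adj}            # pi[r] = NIL
    key[r] = 0
    in_mst = set()                         # vertices already extracted
    Q = [(0, r)]                           # (key, vertex) heap entries
    while Q:
        k, u = heapq.heappop(Q)            # Extract-Min
        if u in in_mst or k > key[u]:      # stale entry: skip it
            continue
        in_mst.add(u)
        for v, w in adj[u]:
            if v not in in_mst and w < key[v]:
                pi[v] = u                  # new lighter edge connecting v
                key[v] = w
                heapq.heappush(Q, (w, v))  # stands in for Decrease-Key
    return pi
```

On the same 4-vertex example as before (ab=1, ac=4, bc=3, bd=2, cd=5) rooted at a, the parents come out as b→a, c→b, d→b: the same tree Kruskal finds.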

Analysis of Prim
Extracting a vertex from the queue: O(lg n).
For each incident edge, decreasing the key of the neighboring vertex: O(lg n), where n = |V|.
The other steps take constant time.
The overall running time, where e = |E|:
T(n) = Σ_{u ∈ V} (lg n + deg(u) · lg n)
     = Σ_{u ∈ V} (1 + deg(u)) · lg n
     = lg n · (n + 2e)          // since Σ_{u ∈ V} deg(u) = 2e
     = O((n + e) lg n)
Essentially the same as Kruskal's: O((n + e) lg n) time.

Correctness of Prim
Again, show that every edge added is a safe edge for A.
Assume (u, v) is the next edge to be added to A.
Consider the cut (A, V - A). This cut respects A (why?) and (u, v) is the light edge across the cut (why?).
Thus, by the MST Lemma, (u, v) is safe.

Optimization Problems
In which a set of choices must be made in order to arrive at an optimal (min/max) solution, subject to some constraints. (There may be several solutions that achieve the optimal value.)
Two common techniques:
Dynamic Programming (global)
Greedy Algorithms (local)

Dynamic Programming
Similar to divide-and-conquer, it breaks problems down into smaller problems that are solved recursively.
In contrast to D&C, DP is applicable when the sub-problems are not independent, i.e. when sub-problems share sub-subproblems.
It solves every sub-subproblem just once and saves the result in a table to avoid duplicated computation.

Elements of DP Algorithms
Substructure: decompose the problem into smaller sub-problems. Express the solution of the original problem in terms of solutions for the smaller problems.
Table-structure: store the answers to the sub-problems in a table, because sub-problem solutions may be used many times.
Bottom-up computation: combine solutions of smaller sub-problems to solve larger sub-problems, and eventually arrive at a solution to the complete problem.
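The three elements above can be illustrated with rod cutting, a standard DP example not taken from these slides: prices[i] is an assumed price for a rod of length i, the array r is the table, and the loop fills it bottom-up.

```python
def cut_rod(prices, n):
    """Bottom-up rod cutting. prices[i] = price of a rod of length i
    (prices[0] = 0). r[j] holds the optimal revenue for length j, so
    each sub-problem is solved exactly once and then reused."""
    r = [0] * (n + 1)                  # the DP table (table-structure)
    for j in range(1, n + 1):          # smallest sub-problems first (bottom-up)
        # substructure: best first cut i plus optimal rest of length j - i
        r[j] = max(prices[i] + r[j - i] for i in range(1, j + 1))
    return r[n]
```

With the illustrative price list [0, 1, 5, 8, 9], a rod of length 4 is best cut into two pieces of length 2, for revenue 5 + 5 = 10.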

Applicability to Optimization Problems
Optimal sub-structure (principle of optimality): for the global problem to be solved optimally, each sub-problem should be solved optimally. This is often violated due to sub-problem overlaps: by being "less optimal" on one sub-problem, we may make big savings on another.
Small number of sub-problems: many NP-hard problems can be formulated as DP problems, but these formulations are not efficient, because the number of sub-problems is exponentially large. Ideally, the number of sub-problems should be at most polynomial.