Analysis of Algorithms


Analysis of Algorithms The Greedy Approach

Greedy Algorithms Greedy algorithms work in stages, considering one input at a time. At each stage, a decision is made about whether a particular input belongs in an optimal solution. Inputs are considered in an order determined by some selection procedure. If including an input in the partially constructed solution would make that solution infeasible, the input is not added.

Greedy Algorithms A greedy algorithm obtains an optimal solution to a problem by making a sequence of choices. At each decision point, it makes the choice that seems best at the moment. This heuristic strategy does not always produce an optimal solution. How can one tell whether a greedy algorithm will solve a particular optimization problem? There is no way in general, but most problems that lend themselves to a greedy strategy exhibit certain key ingredients.

Greedy Algorithm Example Sales clerks often encounter the problem of giving change for a purchase. Customers usually don't want to receive a lot of coins, so the clerk's goal is not only to give the correct change, but to do so with as few coins as possible. A solution to an instance of the change problem is a set of coins that adds up to the required amount; an optimal solution is such a set of minimum size.

Greedy Algorithm Example A greedy approach to the problem proceeds as follows. Initially there are no coins in the change. The clerk starts by looking for the largest coin (in value) available; that is, the criterion for deciding which coin is best (locally optimal) is the value of the coin. This is called the selection procedure of the greedy algorithm.

Greedy Algorithm Example Next, the clerk checks whether adding this coin to the change would make the total value of the change exceed the amount required. This is called the feasibility check of the greedy algorithm. If adding the coin would not make the change exceed the amount required, the coin is added to the change. The clerk then checks whether the value of the change now equals the amount required. This is the solution check of the greedy algorithm.

Greedy Algorithm Example If they are not equal, the clerk gets another coin using the selection procedure and repeats the process, continuing until the value of the change equals the amount required or the coins run out. In the latter case, the clerk is not able to return the exact amount required.

Greedy Algorithm Example
while there are more coins and the instance is not solved do
    grab the largest remaining coin                                       // selection procedure
    if adding the coin makes the change exceed the amount required then   // feasibility check
        reject the coin
    else
        add the coin to the change
    if the total value of the change equals the amount required then      // solution check
        the instance is solved
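The clerk's procedure can be sketched in Python. This is a minimal illustration (the function name is ours, and it assumes an unlimited supply of each denomination rather than a finite pocketful of coins):

```python
def make_change(amount, coins):
    """Greedy change-making: repeatedly grab the largest coin that
    keeps the change within the amount (feasibility check), and stop
    once the change equals the amount (solution check)."""
    change = []
    for coin in sorted(coins, reverse=True):   # selection: largest coin first
        while sum(change) + coin <= amount:    # feasibility check
            change.append(coin)
        if sum(change) == amount:              # solution check
            return change
    return None  # denominations exhausted: greedy found no exact change
```

For example, `make_change(67, [25, 10, 5, 1])` returns `[25, 25, 10, 5, 1, 1]`. Note that `make_change(30, [25, 10])` returns `None` even though 10 + 10 + 10 = 30, illustrating the earlier point that the greedy strategy does not always find an optimal (or even feasible) solution.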

Greedy Algorithm Example In the feasibility check, when we determine that adding a coin would make the change exceed the amount required, we learn that the set obtained by adding that coin cannot be completed to give a solution to the instance. That set is therefore infeasible and is rejected.

Greedy Algorithms Greedy Choice Property: a globally optimal solution can be arrived at by making a locally optimal (greedy) choice. In dynamic programming, we also make a choice at each step, but the choice may depend on the solutions to subproblems.

Greedy Algorithms In a greedy algorithm, we make whatever choice seems best at the moment and then solve the subproblem arising after that choice is made. The choice made by a greedy algorithm may depend on the choices made so far, but it cannot depend on any future choices or on the solutions to subproblems. A greedy algorithm starts with a locally optimal choice and keeps making locally optimal choices until a solution is found.

Greedy Algorithms Optimal Substructure: an optimal solution to the problem contains within it optimal solutions to subproblems. This is a key ingredient for assessing the applicability of dynamic programming as well as of greedy algorithms.

Minimum Spanning Tree A spanning tree of a connected, undirected graph G = (V, E) is a subgraph of G that is an undirected tree and contains all the vertices of G. In a weighted graph G = (V, E, W), the weight of a subgraph is the sum of the weights of its edges. A minimum spanning tree (MST) of a weighted graph is a spanning tree with minimum weight.

Minimum Spanning Tree Consider the following graph on vertices A, B, C, D with edge weights 1.0, 2.0, 3.0, and 4.0. [Figure: the graph and its possible spanning trees; two of the spanning trees shown have the minimum weight 6, and one has weight 7.]

Minimum Spanning Tree Minimum spanning trees are useful when we want to find the cheapest way to connect a set of cities by roads, or a set of electrical terminals or computers by wires or telephone lines, and so on.

Prim's Algorithm for Minimum Spanning Tree Prim's algorithm begins by selecting an arbitrary starting vertex, and then "branches out" from the part of the tree constructed so far by choosing a new vertex and edge at each iteration. The new edge connects the new vertex to the previous tree. During the course of the algorithm, the vertices may be thought of as divided into three disjoint categories: Tree vertices: in the tree constructed so far. Fringe vertices: not in the tree, but adjacent to some vertex in the tree. Unseen vertices: all others.

Prim's Algorithm for Minimum Spanning Tree The key step in the algorithm is the selection of a vertex from the fringe and an incident edge. Prim's algorithm always chooses an edge of minimum weight from a tree vertex to a fringe vertex. The general structure of the algorithm is:

Prim's Algorithm for Minimum Spanning Tree
PrimMST(G, n)
    initialize all vertices as unseen
    select an arbitrary vertex s to start the tree; reclassify it as tree
    reclassify all vertices adjacent to s as fringe
    while there are fringe vertices
        select an edge of minimum weight between a tree vertex t and a fringe vertex v
        reclassify v as tree; add edge tv to the tree
        reclassify all unseen vertices adjacent to v as fringe

Prim's Algorithm for Minimum Spanning Tree [Figure: the tree so far and the fringe vertices after the starting vertex A is selected.]

Prim's Algorithm for Minimum Spanning Tree [Figure: the tree and fringe after selecting an edge and vertex. BG is not shown because AG is a better choice to reach G.]

Prim's Algorithm for Minimum Spanning Tree [Figure: the tree and fringe after selecting edge AG. GB is not shown because vertex B is already included in the tree.]

Prim's Algorithm for Minimum Spanning Tree [Figure: the final minimum spanning tree produced by Prim's algorithm.]

Prim's Algorithm
Algorithm prim(G)
    F ← ∅                                  // initialization
    for i = 2 to n
        nearest[i] ← 1; distance[i] ← w[1, i]
    repeat n − 1 times
        min ← ∞
        for i = 2 to n
            if 0 ≤ distance[i] < min
                min ← distance[i]; near ← i
        e ← the edge connecting the vertices indexed by near and nearest[near]
        add e to F
        distance[near] ← −1               // mark near as a tree vertex
        for i = 2 to n
            if w[i, near] < distance[i]
                distance[i] ← w[i, near]; nearest[i] ← near
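The array-based pseudocode above can be sketched in Python roughly as follows (an illustrative translation, 0-indexed, assuming a connected graph given as an adjacency matrix with INF marking absent edges and −1 marking tree vertices, as in the slide):

```python
INF = float('inf')

def prim_mst(w):
    """Prim's algorithm over an adjacency matrix w (w[i][j] = edge
    weight, INF if no edge). Returns the MST edges and total weight."""
    n = len(w)
    nearest = [0] * n                        # nearest[i]: tree vertex closest to i
    distance = [w[0][i] for i in range(n)]   # cost of attaching i to the tree
    distance[0] = -1                         # vertex 0 starts the tree
    edges, total = [], 0
    for _ in range(n - 1):
        # selection: the fringe/unseen vertex with the cheapest attaching edge
        near = min((i for i in range(n) if distance[i] >= 0),
                   key=lambda i: distance[i])
        edges.append((nearest[near], near))
        total += w[nearest[near]][near]
        distance[near] = -1                  # move `near` into the tree
        for i in range(n):                   # update attaching costs via `near`
            if 0 <= w[near][i] < distance[i]:
                distance[i] = w[near][i]
                nearest[i] = near
    return edges, total
```

On a four-vertex graph with edges 0–1 (weight 2), 0–2 (3), 1–2 (1), 1–3 (5), 2–3 (4), this returns the edges (0,1), (1,2), (2,3) with total weight 7.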

Analysis With an adjacency matrix, the running time is (n − 1) · 2(n − 1) = Θ(n²); this bound may change if the data structure is changed. Implemented with a min-heap, the complexity is O((V − 1 + E) log V) = O(E log V): the algorithm performs V − 1 deletions of the minimum element and up to E changes of element priority, all on a heap of size at most V, and each such operation takes O(log V) time.
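The heap-based variant can be sketched as follows (a minimal illustration using a lazy-deletion heap of candidate edges rather than priority changes; names and the adjacency-list format, vertex → list of (weight, neighbor) pairs, are our own):

```python
import heapq

def prim_heap(adj, start=0):
    """Heap-based Prim on an adjacency-list graph; O(E log E) with
    lazy deletion, which is O(E log V) up to constant factors.
    Returns the total MST weight (assumes a connected graph)."""
    visited = {start}
    heap = list(adj[start])            # fringe edges out of the start vertex
    heapq.heapify(heap)
    total = 0
    while heap and len(visited) < len(adj):
        w, v = heapq.heappop(heap)     # cheapest candidate edge
        if v in visited:
            continue                   # stale entry: v already in the tree
        visited.add(v)
        total += w
        for edge in adj[v]:            # add v's edges to the fringe
            if edge[1] not in visited:
                heapq.heappush(heap, edge)
    return total
```

On the same four-vertex example graph as above, this returns a total weight of 7.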

Kruskal's Algorithm Edge-based algorithm: add the edges one at a time, in increasing weight order. The algorithm maintains A, a forest of trees; an edge is accepted if it connects vertices of distinct trees. We need a data structure that maintains a partition, i.e., a collection S of disjoint sets:
MakeSet(S, x): S ← S ∪ {{x}}
Union(Si, Sj): S ← S − {Si, Sj} ∪ {Si ∪ Sj}
FindSet(S, x): returns the unique Si ∈ S such that x ∈ Si

Kruskal's Algorithm The algorithm adds the cheapest edge that connects two trees of the forest.
MST-Kruskal(G, w)
    A ← ∅
    for each vertex v ∈ V[G]
        do Make-Set(v)
    sort the edges of E by non-decreasing weight w
    for each edge (u, v) ∈ E, in order of non-decreasing weight
        do if Find-Set(u) ≠ Find-Set(v)
            then A ← A ∪ {(u, v)}
                 Union(u, v)
    return A
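The pseudocode above can be sketched as runnable Python (an illustrative version with a minimal union-find; it uses path compression only, without the union-by-rank refinement, and edges are given as (weight, u, v) triples):

```python
def kruskal_mst(n, edges):
    """Kruskal's algorithm on vertices 0..n-1: scan edges in
    non-decreasing weight order and accept each edge that joins two
    distinct trees of the forest, tracked with union-find."""
    parent = list(range(n))            # each vertex starts in its own set
    def find(x):                       # Find-Set with path compression
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    mst = []
    for w, u, v in sorted(edges):      # non-decreasing weight
        ru, rv = find(u), find(v)
        if ru != rv:                   # distinct trees: accept the edge
            parent[ru] = rv            # Union
            mst.append((w, u, v))
    return mst
```

On the four-vertex example graph used for Prim's algorithm, this selects three edges of total weight 7, matching the MST found there.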

Kruskal Example [Four figure slides stepping through the algorithm; the images are not included in the transcript.]

Kruskal Running Time A detailed analysis shows O(V) + O(E log E) + O(E log V). We need O(V) operations to build the initial forest of |V| trees, each containing one vertex. The edges are stored in a priority queue and the smallest edge is retrieved each time, so processing the edges takes O(E log E) operations. Finally, the disjoint-set operations are implemented on a tree of V nodes, giving O(E log V), since in the worst case a comparison is performed for each edge.

Disregarding the lower-order term O(V), we get O(E(log V + log E)). In the worst case E = O(V²), hence log E = O(log V²) = O(2 log V) = O(log V), and we get complexity O(E log V). On the other hand, V = O(E) for a connected graph, so the whole expression again reduces to O(E log V).

Prim's vs. Kruskal For sparse graphs, Kruskal's algorithm is better, since it is guided by the edges. For dense graphs, Prim's algorithm is better, since its work is bounded by the number of processed vertices.