Greedy Algorithms
Review: Dynamic Programming
Summary of the basic idea:
Optimal substructure: an optimal solution to the problem consists of optimal solutions to subproblems.
Overlapping subproblems: few distinct subproblems in total, but many recurring instances of each.
Solve bottom-up, building a table of solved subproblems that are then used to solve larger ones.
Variations: the "table" could be 3-dimensional, triangular, a tree, etc.
Greedy Algorithms
A greedy algorithm always makes the choice that looks best at the moment.
The hope: a locally optimal choice will lead to a globally optimal solution.
For some problems, it works well. Dynamic programming can be overkill; greedy algorithms tend to be easier to code.
Review: The Knapsack Problem
The famous knapsack problem: a thief breaks into a museum. Fabulous paintings, sculptures, and jewels are everywhere. The thief has a good eye for the value of these objects and knows that each will fetch hundreds or thousands of dollars on the clandestine art collector's market. But the thief has brought only a single knapsack to the scene of the robbery and can take away only what he can carry. What items should the thief take to maximize the haul?
Review: The Knapsack Problem
More formally, the 0-1 knapsack problem:
The thief must choose among n items, where the i-th item is worth v_i dollars and weighs w_i pounds.
Carrying at most W pounds, maximize the total value.
Note: assume v_i, w_i, and W are all integers, and each item must be taken or left in its entirety.
A variation, the fractional knapsack problem: the thief can take fractions of items.
Solving The Knapsack Problem
The optimal solution to the fractional knapsack problem can be found with a greedy algorithm. How?
The optimal solution to the 0-1 problem cannot be found with the same greedy strategy.
Greedy strategy: take items in order of value per pound (dollars/pound).
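To make this concrete, here is a minimal Python sketch of the greedy rule for the fractional problem (the function name and the example items are illustrative, not from the slides): sort items by dollars per pound and take as much of each as still fits.

def fractional_knapsack(items, capacity):
    """Greedy fractional knapsack.

    items: list of (value, weight) pairs; capacity: maximum total weight.
    Returns the maximum total value, taking fractions where needed.
    """
    # Greedy choice: best value-per-pound first.
    items = sorted(items, key=lambda vw: vw[0] / vw[1], reverse=True)
    total = 0.0
    for value, weight in items:
        if capacity <= 0:
            break
        take = min(weight, capacity)        # take all of it, or the fraction that fits
        total += value * (take / weight)
        capacity -= take
    return total

# Example: items worth $60/10 lb, $100/20 lb, $120/30 lb; knapsack holds 50 lb.
print(fractional_knapsack([(60, 10), (100, 20), (120, 30)], 50))   # 240.0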
The Knapsack Problem: Greedy vs. Dynamic
The fractional problem can be solved greedily.
The 0-1 problem cannot be solved with a greedy approach.
As you have seen, however, it can be solved with dynamic programming.
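For contrast, a minimal bottom-up dynamic-programming sketch for the 0-1 problem, assuming integer weights (the function name is illustrative). On the same example items, the 0-1 optimum is 220, not the 240 achievable with fractions.

def knapsack_01(values, weights, W):
    """Classic bottom-up DP for the 0-1 knapsack with integer weights.

    dp[w] = best value achievable with total weight at most w.
    """
    dp = [0] * (W + 1)
    for v, wt in zip(values, weights):
        # Iterate weights downward so each item is used at most once.
        for w in range(W, wt - 1, -1):
            dp[w] = max(dp[w], dp[w - wt] + v)
    return dp[W]

# Same items as above: the 0-1 optimum is 220 (the second and third items).
print(knapsack_01([60, 100, 120], [10, 20, 30], 50))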
Change Making Problem
How can we make 63 cents of change using coins of denominations 25, 10, 5, and 1 so that the total number of coins is the smallest?
The idea: make the locally best choice at each step. Is the resulting solution optimal?
Greedy Algorithms
A greedy algorithm makes a locally optimal choice in the hope that this choice will lead to a globally optimal solution. The choice made at each step must be:
Feasible: it satisfies the problem's constraints.
Locally optimal: it is the best local choice among all feasible choices.
Irrevocable: once made, the choice cannot be changed on subsequent steps.
Do greedy algorithms always yield optimal solutions? Example: the change-making problem with the denomination set {11, 5, 1} and 15 cents of change to make (see the sketch below).
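A minimal Python sketch of the cashier's greedy rule (function and variable names are illustrative). With denominations 25, 10, 5, 1 it makes 63 cents optimally, but with the set {11, 5, 1} it returns 11+1+1+1+1 (5 coins) for 15 cents, while 5+5+5 (3 coins) is optimal, so the greedy choice is not always globally optimal.

def greedy_change(amount, denominations):
    """Repeatedly take the largest coin that still fits (the greedy choice)."""
    coins = []
    for d in sorted(denominations, reverse=True):
        while amount >= d:
            coins.append(d)
            amount -= d
    return coins

print(greedy_change(63, [25, 10, 5, 1]))   # [25, 25, 10, 1, 1, 1] -> 6 coins (optimal)
print(greedy_change(15, [11, 5, 1]))       # [11, 1, 1, 1, 1] -> 5 coins, but [5, 5, 5] uses 3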
Applications of the Greedy Strategy
Optimal solutions:
change making
minimum spanning tree (MST)
single-source shortest paths
Huffman codes
Approximations:
traveling salesman problem (TSP)
knapsack problem
other optimization problems
Minimum Spanning Tree (MST)
Spanning tree of a connected graph G: a connected acyclic subgraph (tree) of G that includes all of G's vertices.
Minimum spanning tree of a weighted, connected graph G: a spanning tree of G of minimum total weight.
Example: [figure: a small weighted graph and its minimum spanning tree]
Prim's MST Algorithm
Start with a tree T_0 consisting of one vertex.
"Grow" the tree one vertex/edge at a time: construct a series of expanding subtrees T_1, T_2, ..., T_{n-1}.
At each stage, construct T_{i+1} from T_i by adding the minimum-weight edge connecting a vertex in the tree T_i to one not yet in the tree, chosen from the "fringe" edges (this is the "greedy" step!).
Another way to understand it: expand each tree T_i greedily by attaching to it the nearest vertex not in that tree (a vertex not in the tree connected to a vertex in the tree by an edge of the smallest weight).
The algorithm stops when all vertices are included.
Examples
[figure: two small weighted example graphs, one on vertices a, b, c, d, e, used to trace Prim's algorithm]
Fringe edges: one endpoint is in T_i and the other is not.
Unseen edges: both endpoints are not in T_i.
The Key Point
Notation: T is the expanding subtree; Q holds the remaining vertices.
At each stage, the key point of expanding the current subtree T is to determine which vertex in Q is the nearest vertex.
Q can be thought of as a priority queue:
The key (priority) of each vertex, key[v], is the minimum weight of an edge from v to a vertex in T; key[v] is ∞ if v is not linked to any vertex in T.
The major operation is to find and delete the nearest vertex (the v for which key[v] is the smallest among all vertices in Q).
Remove the nearest vertex v from Q and add it, with the corresponding edge, to T. When that happens, the keys of v's neighbors must be updated.
ALGORITHM MST-PRIM(G, w, r)
// w: edge weights; r: root, the starting vertex
for each u ∈ V[G]
    do key[u] ← ∞
       P[u] ← NULL              // P[u]: the parent of u
key[r] ← 0
Q ← V[G]                        // now the priority queue Q has been built
while Q ≠ ∅
    do u ← Extract-Min(Q)       // remove the nearest vertex from Q
       for each v ∈ Adj[u]      // update the key of each of u's adjacent vertices
           do if v ∈ Q and w(u, v) < key[v]
              then P[v] ← u
                   key[v] ← w(u, v)
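A runnable Python sketch of the same idea, using heapq as the priority queue with lazy deletion in place of an explicit decrease-key (the graph representation and names are illustrative, not from the slides):

import heapq

def prim_mst(graph, root):
    """Prim's algorithm on an adjacency-list graph {u: [(v, weight), ...]}.

    Returns the list of tree edges as (parent, child, weight).
    Uses a heap with lazy deletion instead of an explicit Decrease-Key.
    """
    in_tree = {root}
    edges = []
    # Fringe heap of (weight, u, v) for edges leaving the tree.
    fringe = [(w, root, v) for v, w in graph[root]]
    heapq.heapify(fringe)
    while fringe and len(in_tree) < len(graph):
        w, u, v = heapq.heappop(fringe)     # greedy step: cheapest fringe edge
        if v in in_tree:
            continue                        # stale entry; v was added earlier
        in_tree.add(v)
        edges.append((u, v, w))
        for x, wx in graph[v]:
            if x not in in_tree:
                heapq.heappush(fringe, (wx, v, x))
    return edges

# Tiny example: square a-b-c-d with one diagonal.
g = {'a': [('b', 1), ('d', 4)],
     'b': [('a', 1), ('c', 2), ('d', 3)],
     'c': [('b', 2), ('d', 5)],
     'd': [('a', 4), ('b', 3), ('c', 5)]}
print(prim_mst(g, 'a'))   # [('a', 'b', 1), ('b', 'c', 2), ('b', 'd', 3)]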
Notes on Prim's Algorithm
A priority queue is needed for locating the nearest vertex.
If an unordered array is used to store the priority queue: efficiency Θ(n²).
If a min-heap is used to store the priority queue, then for a graph with n vertices and m edges the running time is O((n + m) log n) = O(m log n), since there are n - 1 stages (min-heap deletions), at most m edges are considered (min-heap key decreases), and each key decrease/deletion from the min-heap costs O(log n).
Another Greedy Algorithm for MST: Kruskal's
Edges are initially sorted by increasing weight.
Start with an empty forest and "grow" the MST one edge at a time; intermediate stages usually hold a forest of trees (not connected).
At each stage, add the minimum-weight edge among those not yet used that does not create a cycle. The edge may:
expand an existing tree,
combine two existing trees into a single tree, or
create a new tree.
An efficient way of detecting/avoiding cycles is needed.
The algorithm stops when all vertices are included.
Kruskal's Algorithm

ALGORITHM Kruskal(G)
// Input: a weighted connected graph G = ⟨V, E⟩
// Output: E_T, the set of edges composing a minimum spanning tree of G
sort E in nondecreasing order of the edge weights: w(e_{i1}) ≤ … ≤ w(e_{i|E|})
E_T ← ∅; ecounter ← 0           // initialize the set of tree edges and its size
k ← 0
while ecounter < |V| - 1 do
    k ← k + 1
    if E_T ∪ {e_{ik}} is acyclic
        E_T ← E_T ∪ {e_{ik}}; ecounter ← ecounter + 1
return E_T

See pp. 314-317 (the UNION-FIND algorithm) for an efficient acyclicity test.
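A compact Python sketch of Kruskal's algorithm, using a simple union-find (disjoint-set) structure with path compression as the acyclicity test; the function names and example graph are illustrative:

def kruskal_mst(n, edges):
    """Kruskal's algorithm.

    n: number of vertices (labeled 0..n-1); edges: list of (weight, u, v).
    Returns the list of MST edges as (weight, u, v).
    """
    parent = list(range(n))

    def find(x):                      # union-find with path compression
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    tree = []
    for w, u, v in sorted(edges):     # edges in nondecreasing order of weight
        ru, rv = find(u), find(v)
        if ru != rv:                  # adding (u, v) does not create a cycle
            parent[ru] = rv           # union the two trees
            tree.append((w, u, v))
            if len(tree) == n - 1:
                break
    return tree

# Same square graph as before, with vertices relabeled 0..3.
print(kruskal_mst(4, [(1, 0, 1), (2, 1, 2), (3, 1, 3), (4, 0, 3), (5, 2, 3)]))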
Efficiency of Kruskal's Algorithm
For a graph with n vertices and m edges, using the efficient UNION-FIND algorithm:
SORT: O(m log m), which is O(m log n)
FIND: O(m log n)
UNION: O(n)
So the efficiency of Kruskal's algorithm is O(n + m log n).
Minimum Spanning Tree: Summary
Is Prim's algorithm greedy? Why?
Is Kruskal's algorithm greedy? Why?
Shortest Paths – Dijkstra's Algorithm
Shortest path problems:
All-pairs shortest paths (Floyd's algorithm)
Single-source shortest paths problem (Dijkstra's algorithm): given a weighted graph G, find the shortest paths from a source vertex s to each of the other vertices.
[figure: example weighted graph on vertices a, b, c, d, e]
Prim's and Dijkstra's Algorithms
They generate different kinds of spanning trees:
Prim's: a minimum spanning tree.
Dijkstra's: a spanning tree rooted at a given source s, such that the distance from s to every other vertex is the shortest.
They use different greedy strategies:
Prim's: always choose the vertex in the priority queue Q closest to the tree to add to the expanding tree V_T.
Dijkstra's: always choose the vertex in the priority queue Q closest to the source to add to the expanding tree V_T.
They keep different labels for each vertex:
Prim's: the parent vertex and the distance from the tree to the vertex.
Dijkstra's: the parent vertex and the distance from the source to the vertex.
Dijkstra's Algorithm

ALGORITHM Dijkstra(G, s)
// Input: a weighted connected graph G = ⟨V, E⟩ and a source vertex s
// Output: the length d_v of a shortest path from s to v and its penultimate vertex p_v for every vertex v in V
Initialize(Q)                       // initialize the vertex priority queue
for every vertex v in V do
    d_v ← ∞; p_v ← null             // p_v: the parent of v
    Insert(Q, v, d_v)               // initialize vertex priority in the priority queue
d_s ← 0; Decrease(Q, s, d_s)        // update the priority of s with d_s, making it the minimum
V_T ← ∅
for i ← 0 to |V| - 1 do             // produce |V| - 1 edges for the tree
    u* ← DeleteMin(Q)               // delete the minimum-priority element
    V_T ← V_T ∪ {u*}                // expand the tree, choosing the locally best vertex
    for every vertex u in V - V_T that is adjacent to u* do
        if d_{u*} + w(u*, u) < d_u
            d_u ← d_{u*} + w(u*, u); p_u ← u*
            Decrease(Q, u, d_u)
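A runnable Python sketch of Dijkstra's algorithm, using heapq with lazy deletion instead of the Decrease operation (the graph representation and names are illustrative):

import heapq

def dijkstra(graph, source):
    """Single-source shortest paths on {u: [(v, weight), ...]} with nonnegative weights.

    Returns dist (shortest distance from the source) and parent (penultimate vertex).
    """
    dist = {v: float('inf') for v in graph}
    parent = {v: None for v in graph}
    dist[source] = 0
    pq = [(0, source)]                      # priority = current distance from the source
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist[u]:
            continue                        # stale heap entry; skip it
        for v, w in graph[u]:
            if d + w < dist[v]:             # a shorter path to v through u
                dist[v] = d + w
                parent[v] = u
                heapq.heappush(pq, (dist[v], v))
    return dist, parent

g = {'a': [('b', 3), ('d', 7)],
     'b': [('a', 3), ('c', 4), ('d', 2)],
     'c': [('b', 4), ('d', 5), ('e', 6)],
     'd': [('a', 7), ('b', 2), ('c', 5), ('e', 4)],
     'e': [('c', 6), ('d', 4)]}
print(dijkstra(g, 'a')[0])   # {'a': 0, 'b': 3, 'c': 7, 'd': 5, 'e': 9}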
Notes on Dijkstra's Algorithm
It doesn't work with negative weights. Can you give a counterexample?
It is applicable to both undirected and directed graphs.
Efficiency:
Using an unordered array to store the priority queue: Θ(n²).
Using a min-heap to store the priority queue: O(m log n).
Summary
The greedy technique suggests constructing a solution to an optimization problem through a sequence of steps, each expanding the partially constructed solution obtained so far, until a complete solution to the problem is reached. On each step, the choice made must be feasible, locally optimal, and irrevocable.
Prim's algorithm is a greedy algorithm for constructing a minimum spanning tree of a weighted connected graph. It works by attaching to a previously constructed subtree the vertex closest to the vertices already in the tree.
Summary
Kruskal's algorithm is another greedy algorithm for the minimum spanning tree problem. It constructs a minimum spanning tree by selecting edges in nondecreasing order of their weights, provided that each inclusion does not create a cycle.
Dijkstra's algorithm solves the single-source shortest-path problem of finding shortest paths from a given vertex (the source) to all the other vertices of a weighted graph or digraph. It works like Prim's algorithm but compares path lengths rather than edge lengths. Dijkstra's algorithm always yields a correct solution for a graph with nonnegative weights.
Homework 5(a)
Exercise 9.1: 2, 6.a
Exercise 9.2: 5
Exercise 9.3: 8