1
Greedy Algorithms. Z. Guo, UNC Chapel Hill. CLRS Ch. 16, 23, & 24.
2
Overview
Greedy algorithms are applied to optimization problems. Problems must have optimal substructure:
– an optimal solution must be made of subproblems that are solved optimally.
Problems must have the greedy-choice property:
– a globally optimal solution can be arrived at by making a locally optimal (greedy) choice. When we have a choice to make, we make the one that looks best right now, then demonstrate that this locally optimal choice can be part of a globally optimal solution.
3
Activity-selection Problem
Input: Set S of n activities, a_1, a_2, …, a_n.
– s_i = start time of activity i.
– f_i = finish time of activity i.
Output: Subset A ⊆ S with a maximum number of compatible activities.
– Two activities are compatible if their intervals don’t overlap.
Example: Activities in each line are compatible.
4
Optimal Substructure
Let’s assume that activities are sorted by finish times: f_1 ≤ f_2 ≤ … ≤ f_n. Suppose an optimal solution includes activity a_k.
– This generates two subproblems.
– Selecting from a_1, …, a_{k-1}: activities compatible with one another that finish before a_k starts (compatible with a_k).
– Selecting from a_{k+1}, …, a_n: activities compatible with one another that start after a_k finishes.
– The solutions to the two subproblems must be optimal. Prove using the cut-and-paste approach.
Thus, we could use dynamic programming…
5
Recursive Solution
Let S_ij = subset of activities in S that start after a_i finishes and finish before a_j starts.
Subproblems: selecting a maximum number of mutually compatible activities from S_ij.
Let c[i, j] = size of a maximum-size subset of mutually compatible activities in S_ij.
Recursive solution:
c[i, j] = 0 if S_ij = ∅
c[i, j] = max { c[i, k] + c[k, j] + 1 : a_k ∈ S_ij } otherwise
6
Greedy-choice Property
The problem also exhibits the greedy-choice property.
– There is an optimal solution to the subproblem S_ij that includes the activity with the earliest finish time in S_ij.
– Proof: if some optimal solution does not include the task with the earliest finish time, replace its first task with that one; the result is still compatible and no smaller.
Hence, there is an optimal solution to S that includes a_1. So, make this greedy choice without first evaluating the cost of subproblems. Then solve only the subproblem that results from making this greedy choice, and combine the greedy choice with the subproblem solution.
7
Recursive Algorithm

Recursive-Activity-Selector(s, f, i, j)
1. m := i + 1
2. while m < j and s_m < f_i
3.     do m := m + 1
4. if m < j
5.    then return {a_m} ∪ Recursive-Activity-Selector(s, f, m, j)
6.    else return ∅

Initial call: Recursive-Activity-Selector(s, f, 0, n+1). Complexity: Θ(n).
The recursion just makes the algorithm look complicated; the iterative algorithm is even easier. (See CLRS 16.)
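As an illustration (not from the slides), the iterative greedy selection mentioned above can be sketched in Python; the function name and the sample intervals are my own:

```python
def select_activities(starts, finishes):
    """Greedy activity selection.

    Assumes activities are indexed 0..n-1 and already sorted by
    finish time (finishes is nondecreasing).
    Returns indices of a maximum-size set of compatible activities.
    """
    selected = []
    last_finish = float("-inf")
    for i, (s, f) in enumerate(zip(starts, finishes)):
        if s >= last_finish:          # compatible with the last chosen activity
            selected.append(i)
            last_finish = f
    return selected

# Hypothetical example, sorted by finish time.
print(select_activities([1, 3, 0, 5, 3, 5, 6, 8],
                        [4, 5, 6, 7, 9, 9, 10, 11]))  # -> [0, 3, 7]
```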
8
Typical Steps
Cast the optimization problem as one in which we make a choice and are left with one subproblem to solve. Prove that there exists an optimal solution that makes the greedy choice, so the greedy choice is always safe. Show that the greedy choice plus an optimal solution to the subproblem yields an optimal solution to the problem. Make the greedy choice and solve top-down. May have to preprocess the input to put it into greedy order.
– Example: sorting activities by finish time.
9
Minimum Spanning Trees CLRS CH. 23
10
Minimum Spanning Trees
Given: connected, undirected, weighted graph G.
Find: minimum-weight spanning tree T, an acyclic subset of the edges E that connects all vertices of G.
Example: [figure: a weighted graph on vertices a–f and its minimum spanning tree]
11
Generic Algorithm
“Grows” a set A. Invariant: A is a subset of some MST. An edge is “safe” if adding it to A maintains the invariant.

A := ∅;
while A is not a complete tree do
    find a safe edge (u, v);
    A := A ∪ {(u, v)}
od
12
Definitions
A cut (S, V – S) partitions the vertices into disjoint sets S and V – S. An edge crosses the cut if one endpoint is in S and the other is in V – S. A light edge crossing a cut is a crossing edge of minimum weight (there could be more than one). A cut respects an edge set, e.g. {(a, b), (b, c)}, if no edge in the set crosses the cut.
[figure: example graph showing a cut, an edge crossing it, and a light edge]
13
Theorem 23.1
Theorem 23.1: Let (S, V – S) be any cut that respects A, and let (u, v) be a light edge crossing (S, V – S). Then (u, v) is safe for A.

Proof: Let T be an MST that includes A.
Case: (u, v) ∈ T. We’re done.
Case: (u, v) ∉ T. Then the path in T from u to v contains an edge (x, y) that crosses the cut. Let T′ = T – {(x, y)} ∪ {(u, v)}. Because (u, v) is light for the cut, w(u, v) ≤ w(x, y). Thus w(T′) = w(T) – w(x, y) + w(u, v) ≤ w(T). Hence T′ is also an MST, so (u, v) is safe for A.
[figure: tree T with the cut shown; edge (x, y) crosses the cut, edges of A marked]
14
Corollary
In general, A will consist of several connected components.
Corollary: If (u, v) is a light edge connecting one CC in (V, A) to another CC in (V, A), then (u, v) is safe for A.
15
Kruskal’s Algorithm Starts with each vertex in its own component. Repeatedly merges two components into one by choosing a light edge that connects them (i.e., a light edge crossing the cut between them). Scans the set of edges in monotonically increasing order by weight. Uses a disjoint-set data structure to determine whether an edge connects vertices in different components.
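A minimal Kruskal sketch with a simple disjoint-set (union-find) structure, assuming an edge list of (weight, u, v) triples; this is illustrative, not the slides’ code:

```python
def kruskal(n, edges):
    """Kruskal's MST for vertices 0..n-1; edges = [(weight, u, v), ...].
    Returns the chosen tree edges as (u, v, weight)."""
    parent = list(range(n))

    def find(x):                       # find the root, with path compression
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    mst = []
    for w, u, v in sorted(edges):      # scan edges by increasing weight
        ru, rv = find(u), find(v)
        if ru != rv:                   # light edge between two components: safe
            parent[ru] = rv            # merge the components
            mst.append((u, v, w))
    return mst
```

Sorting dominates at O(E lg E); the union-find operations are nearly linear.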
16
Prim’s Algorithm
Builds one tree, so A is always a tree. Starts from an arbitrary “root” r. At each step, adds a light edge crossing the cut (V_A, V – V_A) to A.
– V_A = vertices that A is incident on.
17
Prim’s Algorithm
Builds one tree, so A is always a tree. Starts from an arbitrary “root” r. At each step, adds a light edge crossing the cut (V_A, V – V_A) to A.
– V_A = vertices that A is incident on.
Uses a priority queue Q to find a light edge quickly. Each object in Q is a vertex in V – V_A. The key of v is the minimum weight of any edge (u, v) with u ∈ V_A. Then the vertex returned by Extract-Min is the v such that there exists u ∈ V_A with (u, v) a light edge crossing (V_A, V – V_A). The key of v is ∞ if v is not adjacent to any vertex in V_A.
18
Prim’s Algorithm

Q := V[G];
for each u ∈ Q do key[u] := ∞ od;
key[r] := 0; π[r] := NIL;
while Q ≠ ∅ do
    u := Extract-Min(Q);
    for each v ∈ Adj[u] do
        if v ∈ Q and w(u, v) < key[v]
            then π[v] := u; key[v] := w(u, v)    ▷ decrease-key operation
        fi
    od
od

Note: A = {(v, π[v]) : v ∈ V – {r} – Q}.
Complexity using binary heaps: O(E lg V). Initialization: O(V). Building the initial queue: O(V). V Extract-Min’s: O(V lg V). E Decrease-Key’s: O(E lg V). Using Fibonacci heaps: O(E + V lg V). (See book.)
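The pseudocode above relies on a decrease-key operation. Python’s heapq has no decrease-key, so this sketch (an assumption, not the slides’ version) pushes duplicate entries and skips stale ones on extraction:

```python
import heapq

def prim(adj, r=0):
    """Prim's MST on an adjacency list adj[u] = [(v, w), ...].
    Returns (total_weight, parent), where parent plays the role of pi."""
    n = len(adj)
    in_tree = [False] * n
    parent = [None] * n
    total = 0
    pq = [(0, r, None)]                     # (key, vertex, predecessor)
    while pq:
        key, u, p = heapq.heappop(pq)
        if in_tree[u]:
            continue                        # stale entry; u was already extracted
        in_tree[u] = True
        parent[u] = p
        total += key
        for v, w in adj[u]:
            if not in_tree[v]:
                heapq.heappush(pq, (w, v, u))   # lazy "decrease-key"
    return total, parent
```

Lazy deletion keeps the heap at O(E) entries, giving O(E lg E) = O(E lg V) time, matching the binary-heap bound above.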
19
Example of Prim’s Algorithm
[figure: graph with keys a/0, b/∞, c/∞, d/∞, e/∞, f/∞; Q = a b c d e f with keys 0, ∞, ∞, ∞, ∞, ∞; no vertex in the tree yet]
20
Example of Prim’s Algorithm
[figure: after extracting a, keys b/5, c/∞, d/11, e/∞, f/∞; Q = b d c e f with keys 5, 11, ∞, ∞, ∞]
21
Example of Prim’s Algorithm
[figure: after extracting b, keys c/7, d/11, e/3, f/∞; Q = e c d f with keys 3, 7, 11, ∞]
22
Example of Prim’s Algorithm
[figure: after extracting e, keys c/1, d/0, f/2; Q = d c f with keys 0, 1, 2]
23
Example of Prim’s Algorithm
[figure: after extracting d, keys unchanged; Q = c f with keys 1, 2]
24
Example of Prim’s Algorithm
[figure: after extracting c, key f/-3; Q = f with key -3]
25
Example of Prim’s Algorithm
[figure: after extracting f, Q = ∅; all vertices are in the tree]
26
Example of Prim’s Algorithm
[figure: the resulting minimum spanning tree, with tree edges of weights 0, 5, 3, 1, -3]
27
Chapter 24: Single-Source Shortest Paths Given: A single source vertex in a weighted, directed graph. Want to compute a shortest path for each possible destination. – Similar to BFS. We will assume either – no negative-weight edges, or – no reachable negative-weight cycles. Algorithm will compute a shortest-path tree. – Similar to BFS tree.
28
Outline General Lemmas and Theorems. – CLRS now does this last. We’ll still do it first. Bellman-Ford algorithm. DAG algorithm. Dijkstra’s algorithm. We will skip Section 24.4.
29
General Results (Relaxation)

Lemma 24.1: Let p = ‹v_1, v_2, …, v_k› be a SP from v_1 to v_k. Then p_ij = ‹v_i, v_{i+1}, …, v_j› is a SP from v_i to v_j, where 1 ≤ i ≤ j ≤ k.

So, we have the optimal-substructure property. The Bellman-Ford algorithm uses dynamic programming; Dijkstra’s algorithm uses the greedy approach.

Let δ(u, v) = weight of a SP from u to v.

Corollary: Let p = a SP from s to v, where p = s ⇝ u → v. Then δ(s, v) = δ(s, u) + w(u, v).

Lemma 24.10: Let s ∈ V. For all edges (u, v) ∈ E, we have δ(s, v) ≤ δ(s, u) + w(u, v).
30
Relaxation
Algorithms keep track of d[v] and π[v], initialized as follows:

Initialize(G, s)
for each v ∈ V[G] do
    d[v] := ∞; π[v] := NIL
od;
d[s] := 0

These values are changed when an edge (u, v) is relaxed:

Relax(u, v, w)
if d[v] > d[u] + w(u, v)
    then d[v] := d[u] + w(u, v); π[v] := u
fi
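The two routines translate directly to Python; the dictionary-based representation is my own choice:

```python
import math

def initialize(vertices, s):
    """Initialize: d[v] = infinity, pi[v] = NIL, d[s] = 0."""
    d = {v: math.inf for v in vertices}
    pi = {v: None for v in vertices}
    d[s] = 0
    return d, pi

def relax(u, v, w, d, pi):
    """Relax edge (u, v) of weight w; returns True iff d[v] improved."""
    if d[v] > d[u] + w:
        d[v] = d[u] + w
        pi[v] = u
        return True
    return False
```

All the shortest-path algorithms that follow differ only in the order and number of times they call relax.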
31
Properties of Relaxation
Consider any algorithm in which d[v] and π[v] are first initialized by calling Initialize(G, s) [s is the source], and are changed only by calling Relax. We have:

Lemma 24.11: (∀v :: d[v] ≥ δ(s, v)) is an invariant. This implies d[v] doesn’t change once d[v] = δ(s, v).

Proof: Initialize(G, s) establishes the invariant. If a call to Relax(u, v, w) changes d[v], then it establishes:
d[v] = d[u] + w(u, v)
     ≥ δ(s, u) + w(u, v), since the invariant holds before the call
     ≥ δ(s, v), by Lemma 24.10.

Corollary 24.12: If there is no path from s to v, then d[v] = δ(s, v) = ∞ is an invariant.
32
More Properties

Lemma 24.13: Immediately after relaxing edge (u, v) by calling Relax(u, v, w), we have d[v] ≤ d[u] + w(u, v).

Lemma 24.14: Let p = a SP from s to v, where p = s ⇝ u → v. If d[u] = δ(s, u) holds at any time prior to calling Relax(u, v, w), then d[v] = δ(s, v) holds at all times after the call.

Proof: After the call we have:
d[v] ≤ d[u] + w(u, v), by Lemma 24.13
     = δ(s, u) + w(u, v), since d[u] = δ(s, u) holds
     = δ(s, v), by the corollary to Lemma 24.1.
By Lemma 24.11, d[v] ≥ δ(s, v), so d[v] = δ(s, v).
33
Predecessor Subgraph

Lemma 24.16: Assume the given graph G has no negative-weight cycles reachable from s. Let G_π = the predecessor subgraph. G_π is always a tree with root s (i.e., this property is an invariant).

Proof: Two proof obligations: (1) G_π is acyclic. (2) There exists a unique path from source s to each vertex in V_π.

Proof of (1): Suppose there exists a cycle c = ‹v_0, v_1, …, v_k›, where v_0 = v_k. We have π[v_i] = v_{i-1} for i = 1, 2, …, k. Assume relaxation of (v_{k-1}, v_k) created the cycle. We show the cycle has negative weight. Note: the cycle must be reachable from s. (Why?)
34
Proof of (1) (Continued)
Before the call to Relax(v_{k-1}, v_k, w): π[v_i] = v_{i-1} for i = 1, …, k–1. This implies d[v_i] was last updated by “d[v_i] := d[v_{i-1}] + w(v_{i-1}, v_i)” for i = 1, …, k–1, and d[v_{i-1}] can only have decreased since. This implies d[v_i] ≥ d[v_{i-1}] + w(v_{i-1}, v_i) for i = 1, …, k–1. Because π[v_k] is changed by the call, d[v_k] > d[v_{k-1}] + w(v_{k-1}, v_k). Thus,
Σ_{i=1..k} d[v_i] > Σ_{i=1..k} d[v_{i-1}] + Σ_{i=1..k} w(v_{i-1}, v_i).
Since Σ_{i=1..k} d[v_i] = Σ_{i=1..k} d[v_{i-1}] (both sums range over the same cycle vertices), we get 0 > Σ_{i=1..k} w(v_{i-1}, v_i): the cycle has negative weight, a contradiction.
35
Proof of (2)
Proof of (2): (∀v : v ∈ V_π :: ∃ a path from s to v) is an invariant. So, for any v in V_π, there is at least one path from s to v. Show there is at most one path. Assume there are two.
[figure: two distinct paths from s to v merging at a vertex z, reached via distinct predecessors x and y; then π[z] would equal both x and y] impossible!
36
Lemma 24.17

Lemma 24.17: Same conditions as before. Call Initialize & repeatedly call Relax until d[v] = δ(s, v) for all v in V. Then G_π is a shortest-path tree rooted at s.

Proof: Key proof obligation: for all v in V_π, the unique simple path p from s to v in G_π (the path exists by Lemma 24.16) is a shortest path from s to v in G. Let p = ‹v_0, v_1, …, v_k›, where v_0 = s and v_k = v. We have d[v_i] = δ(s, v_i) and d[v_i] ≥ d[v_{i-1}] + w(v_{i-1}, v_i), which implies w(v_{i-1}, v_i) ≤ δ(s, v_i) – δ(s, v_{i-1}).
37
Proof (Continued)
Summing over i = 1, …, k, the right-hand side telescopes:
w(p) = Σ_{i=1..k} w(v_{i-1}, v_i) ≤ δ(s, v_k) – δ(s, v_0) = δ(s, v).
Since every path from s to v weighs at least δ(s, v), p is a shortest path.
38
Bellman-Ford Algorithm
Can have negative-weight edges. Will “detect” reachable negative-weight cycles.

Initialize(G, s);
for i := 1 to |V[G]| – 1 do
    for each (u, v) in E[G] do
        Relax(u, v, w)
    od
od;
for each (u, v) in E[G] do
    if d[v] > d[u] + w(u, v) then return false fi
od;
return true

Time complexity is O(VE).
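A direct Python transcription (the edge-list representation is an assumption, not the slides’ code):

```python
import math

def bellman_ford(vertices, edges, s):
    """Bellman-Ford; edges = [(u, v, w), ...] directed.
    Returns (ok, d, pi); ok is False iff a negative-weight
    cycle is reachable from s."""
    d = {v: math.inf for v in vertices}
    pi = {v: None for v in vertices}
    d[s] = 0
    for _ in range(len(vertices) - 1):     # |V| - 1 relaxation passes
        for u, v, w in edges:
            if d[u] + w < d[v]:            # Relax(u, v, w)
                d[v] = d[u] + w
                pi[v] = u
    for u, v, w in edges:                  # detection pass
        if d[u] + w < d[v]:
            return False, d, pi
    return True, d, pi
```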
39
Example
[figure: directed graph on vertices z, u, v, x, y with edge weights 6, 5, –3, 9, 7, 7, 8, –2, –4, 2; the source has d = 0, all other d values are ∞]
40
Example
[figure: after one pass of relaxations, the finite d values are 0, 6, 7]
41
Example
[figure: after the next pass, the d values are 0, 2, 7, 4, 6]
42
Example
[figure: after the next pass, the d values are 0, 2, 7, 4, 2]
43
Example
[figure: after the final pass, the d values are 0, –2, 7, 4, 2]
44
Another Look
Note: this is essentially dynamic programming.
Let d(i, j) = cost of the shortest path from s to i that uses at most j hops.

d(i, j) = 0                                                        if i = s and j = 0
        = ∞                                                        if i ≠ s and j = 0
        = min({d(k, j–1) + w(k, i) : i ∈ Adj(k)} ∪ {d(i, j–1)})    if j > 0

j \ i | z  u  v  x  y
  0   | 0  ∞  ∞  ∞  ∞
  1   | 0  6  ∞  7  ∞
  2   | 0  6  4  7  2
  3   | 0  2  4  7  2
  4   | 0  2  4  7  –2
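The hop-bounded recurrence above can be tabulated row by row; this sketch is my own, not from the slides:

```python
import math

def hop_bounded_sp(vertices, edges, s):
    """d[j][i] = weight of a shortest path from s to i using at most j edges.
    edges = [(u, v, w), ...]. Returns the list of table rows d[0..n-1]."""
    n = len(vertices)
    d = [{v: (0 if v == s else math.inf) for v in vertices}]
    for j in range(1, n):                   # n-1 hops suffice without neg. cycles
        row = dict(d[j - 1])                # option d(i, j-1): keep the old value
        for u, v, w in edges:
            if d[j - 1][u] + w < row[v]:    # option d(k, j-1) + w(k, i)
                row[v] = d[j - 1][u] + w
        d.append(row)
    return d
```

The last row equals the d values Bellman-Ford computes, which is why |V| – 1 passes are enough.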
45
Lemma 24.2

Lemma 24.2: Assuming no negative-weight cycles reachable from s, d[v] = δ(s, v) holds upon termination for all vertices v reachable from s.

Proof: Consider a SP p = ‹v_0, v_1, …, v_k›, where v_0 = s and v_k = v. Assume k ≤ |V| – 1; otherwise p has a cycle. Claim: d[v_i] = δ(s, v_i) holds after the i-th pass over the edges. The proof follows by induction on i. By Lemma 24.11, once d[v_i] = δ(s, v_i) holds, it continues to hold.
46
Correctness
Claim: The algorithm returns the correct value. (Part of Theorem 24.4; the other parts of the theorem follow easily from earlier results.)

Case 1: There is no reachable negative-weight cycle. Upon termination, d[v] = δ(s, v) by Lemma 24.2 if v is reachable, and d[v] = δ(s, v) = ∞ otherwise. So for all (u, v):
d[v] = δ(s, v)
     ≤ δ(s, u) + w(u, v), by Lemma 24.10
     = d[u] + w(u, v).
So, the algorithm returns true.
47
Case 2
Case 2: There exists a reachable negative-weight cycle c = ‹v_0, v_1, …, v_k›, where v_0 = v_k. We have Σ_{i=1..k} w(v_{i-1}, v_i) < 0. (*)
Suppose the algorithm returns true. Then d[v_i] ≤ d[v_{i-1}] + w(v_{i-1}, v_i) for i = 1, …, k. Thus,
Σ_{i=1..k} d[v_i] ≤ Σ_{i=1..k} d[v_{i-1}] + Σ_{i=1..k} w(v_{i-1}, v_i).
But Σ_{i=1..k} d[v_i] = Σ_{i=1..k} d[v_{i-1}], and one can show no d[v_i] is infinite. Hence 0 ≤ Σ_{i=1..k} w(v_{i-1}, v_i), contradicting (*). Thus, the algorithm returns false.
48
Dijkstra’s Algorithm
Assumes no negative-weight edges. Maintains a set S of vertices whose SP from s has been determined. Repeatedly selects u in V – S with minimum SP estimate (greedy choice). Stores V – S in a priority queue Q.

Initialize(G, s);
S := ∅; Q := V[G];
while Q ≠ ∅ do
    u := Extract-Min(Q);
    S := S ∪ {u};
    for each v ∈ Adj[u] do
        Relax(u, v, w)
    od
od

Relax(u, v, w)
if d[v] > d[u] + w(u, v)
    then d[v] := d[u] + w(u, v); π[v] := u
fi
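A heap-based sketch (an illustration, not the slides’ code; heapq has no decrease-key, so stale entries are skipped on extraction):

```python
import heapq
import math

def dijkstra(adj, s):
    """Dijkstra's algorithm; adj[u] = [(v, w), ...] with w >= 0.
    Returns (d, pi)."""
    d = {u: math.inf for u in adj}
    pi = {u: None for u in adj}
    d[s] = 0
    done = set()                        # the set S of finished vertices
    pq = [(0, s)]
    while pq:
        du, u = heapq.heappop(pq)
        if u in done:
            continue                    # stale queue entry
        done.add(u)
        for v, w in adj[u]:
            if du + w < d[v]:           # Relax(u, v, w)
                d[v] = du + w
                pi[v] = u
                heapq.heappush(pq, (d[v], v))
    return d, pi
```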
49
Example
[figure: directed graph on vertices s, u, v, x, y with edge weights 10, 1, 9, 2, 4, 6, 5, 2, 3, 7; source s has d = 0, all other d values are ∞]
50
Example
[figure: after extracting s, the finite d values are 0, 5, 10]
51
Example
[figure: after the next extraction, the d values are 0, 7, 5, 14, 8]
52
Example
[figure: after the next extraction, the d values are 0, 7, 5, 13, 8]
53
Example
[figure: after the next extraction, the d values are 0, 7, 5, 9, 8]
54
Example
[figure: final state; the d values are 0, 7, 5, 9, 8 and all vertices are in S]
55
Correctness

Theorem 24.6: Upon termination, d[u] = δ(s, u) for all u in V (assuming non-negative weights).

Proof: By Lemma 24.11, once d[u] = δ(s, u) holds, it continues to hold. We prove: for each u in V, d[u] = δ(s, u) when u is inserted into S. Suppose not. Let u be the first vertex such that d[u] ≠ δ(s, u) when inserted into S. Note that d[s] = δ(s, s) = 0 when s is inserted, so u ≠ s, and S ≠ ∅ just before u is inserted (in fact, s ∈ S).
56
Proof (Continued)
Note that there exists a path from s to u, for otherwise d[u] = δ(s, u) = ∞ by Corollary 24.12. Hence, there exists a SP from s to u. Just before u is inserted, the SP looks like this:
[figure: s ∈ S; subpath p1 inside S reaches x, edge (x, y) crosses out of S, subpath p2 leads from y to u outside S]
57
Proof (Continued)
Claim: d[y] = δ(s, y) when u is inserted into S.
We had d[x] = δ(s, x) when x was inserted into S, and edge (x, y) was relaxed at that time. By Lemma 24.14, this implies the claim.
Now we have:
d[y] = δ(s, y), by the Claim
     ≤ δ(s, u), by nonnegative edge weights
     ≤ d[u], by Lemma 24.11.
Because u was added to S before y, d[u] ≤ d[y]. Thus, d[y] = δ(s, y) = δ(s, u) = d[u]. Contradiction.
58
Complexity
Running time is:
O(V²) using a linear array for the priority queue.
O((V + E) lg V) using a binary heap.
O(V lg V + E) using a Fibonacci heap. (See book.)