
1 Greedy Technique
Constructs a solution to an optimization problem piece by piece through a sequence of choices that are:
feasible
locally optimal
irrevocable
For some problems, this yields an optimal solution for every instance. For most, it does not, but it can be useful for fast approximations.
A. Levitin, "Introduction to the Design & Analysis of Algorithms," 3rd ed., Ch. 9. ©2012 Pearson Education, Inc., Upper Saddle River, NJ. All Rights Reserved.

2 Greedy Algorithms
A greedy algorithm always makes the choice that looks best at the moment.
The hope: a locally optimal choice will lead to a globally optimal solution.
For some problems, it works.
Dynamic programming can be overkill (slow); greedy algorithms tend to be easier to code.

3 Applications of the Greedy Strategy
Optimal solutions:
change making for "normal" coin denominations
minimum spanning tree (MST)
single-source shortest paths
simple scheduling problems
Huffman codes
Approximations:
traveling salesman problem (TSP)
knapsack problem
other combinatorial optimization problems

4 Review: The Knapsack Problem
The thief must choose among n items, where the i-th item is worth v_i dollars and weighs w_i pounds.
Carrying at most W pounds, maximize value.
Note: assume v_i, w_i, and W are all integers.
"0-1" because each item must be taken or left in its entirety.
A variation, the fractional knapsack problem: the thief can take fractions of items.
Think of items in the 0-1 problem as gold ingots, and in the fractional problem as buckets of gold dust.

5 Solving The Knapsack Problem
The optimal solution to the 0-1 problem cannot be found with the same greedy strategy.
Greedy strategy: take items in order of dollars/pound.
Example: 3 items weighing 10, 20, and 30 pounds; the knapsack can hold 50 pounds.
Suppose the 3 items are worth $60, $100, and $120. Will the greedy strategy work?

6 0-1 Knapsack - Greedy Strategy Does Not Work
Greedy choice: compute the benefit per pound ($6/pound, $5/pound, $4/pound) and sort the items by these values.
Greedy takes item 1 (10 lb, $60) and item 2 (20 lb, $100); item 3 (30 lb, $120) no longer fits, for a total of $160. Not optimal: taking items 2 and 3 fills the 50-lb knapsack exactly, for $220.

7 Solving The Knapsack Problem
The optimal solution to the fractional knapsack problem can be found with a greedy algorithm.
Greedy choice: compute the benefit per pound ($6/pound, $5/pound, $4/pound), sort the items by these values, and take as much as you can from the top items in the list.
Take all of item 1 (10 lb, $60), all of item 2 (20 lb, $100), and 2/3 of item 3 (20 lb, $80), for a total of $240. Optimal.

8 The Knapsack Problem: Greedy vs. Dynamic
The fractional problem can be solved greedily.
The 0-1 problem cannot be solved with a greedy approach.
As you have seen, however, it can be solved with dynamic programming.
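The fractional greedy strategy can be sketched in a few lines. This is an illustrative implementation, not from the slides; items are assumed to be (value, weight) pairs:

```python
def fractional_knapsack(items, capacity):
    """Greedy fractional knapsack: take items in order of value per pound.

    items: list of (value, weight) pairs; capacity: maximum total weight.
    Returns the maximum total value attainable.
    """
    # Sort by value density, best first (the greedy choice).
    items = sorted(items, key=lambda it: it[0] / it[1], reverse=True)
    total = 0.0
    for value, weight in items:
        if capacity <= 0:
            break
        take = min(weight, capacity)       # whole item, or the fraction that fits
        total += value * (take / weight)
        capacity -= take
    return total
```

On the slides' example, `fractional_knapsack([(60, 10), (100, 20), (120, 30)], 50)` returns 240: all of items 1 and 2, plus 2/3 of item 3.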

9 Change-Making Problem
Given unlimited amounts of coins of denominations d1 > … > dm, give change for amount n with the least number of coins.
Example: d1 = 25c, d2 = 10c, d3 = 5c, d4 = 1c and n = 48c.
Greedy solution: 25c + 10c + 10c + 1c + 1c + 1c (6 coins).
The greedy solution is optimal for any amount and a "normal" set of denominations, but may not be optimal for arbitrary coin denominations.
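The greedy rule — always take the largest coin that still fits — can be sketched as follows (an illustrative implementation, not from the slides):

```python
def greedy_change(amount, denominations):
    """Greedy change-making: repeatedly take the largest coin that fits.

    Optimal for "normal" denominations such as 25, 10, 5, 1;
    it may fail for arbitrary denomination sets.
    """
    coins = []
    for d in sorted(denominations, reverse=True):
        while amount >= d:       # take as many of this coin as possible
            coins.append(d)
            amount -= d
    return coins
```

For 48c with {25, 10, 5, 1} this gives [25, 10, 10, 1, 1, 1] — 6 coins, which is optimal. A standard counterexample for arbitrary denominations: with {1, 3, 4} and n = 6, greedy gives 4 + 1 + 1 (3 coins) while 3 + 3 (2 coins) is optimal.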

10 Minimum Spanning Tree (MST)
Spanning tree of a connected graph G: a connected acyclic subgraph of G that includes all of G's vertices.
Minimum spanning tree of a weighted, connected graph G: a spanning tree of G of minimum total weight.
Example: a four-vertex graph on a, b, c, d with edge weights 1, 2, 3, 4, 6, shown next to its minimum spanning tree (figure).

11 Prim's MST algorithm
Start with tree T1 consisting of one (any) vertex, and "grow" the tree one vertex at a time to produce the MST through a series of expanding subtrees T1, T2, …, Tn.
On each iteration, construct T(i+1) from T(i) by adding the vertex not in T(i) that is closest to those already in T(i) (this is the "greedy" step!).
Stop when all vertices are included.

12 Prim's Algorithm
Run on the example graph (edge weights 2, 3, 4, 5, 6, 8, 9, 10, 14, 15):
MST-Prim(G, w, r)
  Q = V[G]
  for each u ∈ Q
    key[u] = ∞
  key[r] = 0
  p[r] = NULL
  while (Q not empty)
    u = ExtractMin(Q)
    for each v ∈ Adj[u]
      if (v ∈ Q and w(u,v) < key[v])
        p[v] = u
        key[v] = w(u,v)

13–31 Prim's Algorithm (trace)
Slides 13–31 step through MST-Prim on the example graph, repeating the same pseudocode on each slide; only the picture changes:
Initially every key is ∞. Pick a start vertex r; key[r] = 0.
Red vertices have been removed from Q (i.e., added to the tree).
Red arrows indicate parent pointers p[v].
On each iteration the vertex u with the minimum key is extracted, and the keys of its neighbors still in Q are lowered to the connecting edge weights (in the example the keys take values such as 14, 3, 8, 10, 2, 15, 9, 4, 5 as the tree grows), until Q is empty.

32 Prim's Algorithm: Hidden Cost
What is the hidden cost in this code? A: the DECREASE-KEY operation on the min-heap (lowering key[v] must also restore the heap property).
What will be the running time? A: it depends on the queue; with a binary heap, O(E lg V).

33 Notes about Prim's algorithm
Needs a priority queue for locating the closest fringe vertex.
Efficiency:
O(n^2) for the weight-matrix representation of the graph and an array implementation of the priority queue.
O(m log n) for the adjacency-list representation of a graph with n vertices and m edges and a min-heap implementation of the priority queue.
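A runnable sketch of Prim's algorithm (not from the slides), assuming an adjacency-list dict {vertex: [(weight, neighbor), ...]}. Since Python's heapq has no DECREASE-KEY, it uses the common "lazy" variant: push duplicate heap entries and skip stale ones on extraction, which still gives O(E log V):

```python
import heapq

def prim_mst(adj, r):
    """Prim's MST with a lazy min-heap instead of DECREASE-KEY.

    adj: {vertex: [(weight, neighbor), ...]} for an undirected graph.
    Returns (total_weight, parent) where parent[v] is v's tree parent.
    """
    parent = {}
    in_tree = set()
    total = 0
    heap = [(0, r, None)]              # (key, vertex, candidate parent)
    while heap:
        key, u, p = heapq.heappop(heap)
        if u in in_tree:               # stale entry: u already extracted
            continue
        in_tree.add(u)
        parent[u] = p
        total += key
        for w, v in adj[u]:
            if v not in in_tree:       # neighbor still "in Q"
                heapq.heappush(heap, (w, v, u))
    return total, parent
```

For illustration, on a made-up four-vertex graph with edges a-b:1, b-c:2, b-d:3, a-d:4, a-c:6, the MST has total weight 6 (edges a-b, b-c, b-d).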

34 Another greedy algorithm for MST: Kruskal's
Sort the edges in nondecreasing order of lengths.
"Grow" the tree one edge at a time to produce the MST through a series of expanding forests F1, F2, …, F(n-1).
On each iteration, add the next edge on the sorted list unless this would create a cycle. (If it would, skip the edge.)

35 Kruskal's Algorithm
Run on the example graph (edge weights 1, 2, 5, 8, 9, 13, 14, 17, 19, 21, 25):
Kruskal()
{
  T = ∅
  for each v ∈ V
    MakeSet(v)
  sort E by increasing edge weight w
  for each (u,v) ∈ E (in sorted order)
    if FindSet(u) ≠ FindSet(v)
      T = T ∪ {{u,v}}
      Union(FindSet(u), FindSet(v))
}

36–60 Kruskal's Algorithm (trace)
Slides 36–60 step through Kruskal() on the example graph, repeating the same pseudocode on each slide; only the picture changes. Edges are examined in sorted order (1, 2, 5, 8, 9, 13, 14, 17, 19, 21, 25); each candidate edge is highlighted with a "?", then added to the forest if its endpoints lie in different sets, or skipped if it would create a cycle.

61 Kruskal's Algorithm
What will affect the running time?
The sort of E.
O(V) MakeSet() calls.
O(E) FindSet() calls.
O(V) Union() calls.

62 Kruskal's Algorithm: Running Time
To summarize:
Sort edges: O(E log E).
O(V) MakeSet()'s, O(E) FindSet()'s, O(V) Union()'s.
Overall: O(E log E).
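A runnable sketch of Kruskal's algorithm (not from the slides), realizing MakeSet/FindSet/Union with a union-find structure using union by size and path compression — a standard choice that makes the post-sort work nearly linear:

```python
def kruskal_mst(vertices, edges):
    """Kruskal's MST with union-find (union by size, path compression).

    edges: list of (weight, u, v). Returns (total_weight, chosen_edges).
    """
    parent = {v: v for v in vertices}   # MakeSet for each vertex
    size = {v: 1 for v in vertices}

    def find(v):                        # FindSet with path compression
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v

    total, tree = 0, []
    for w, u, v in sorted(edges):       # consider edges by increasing weight
        ru, rv = find(u), find(v)
        if ru != rv:                    # different components: no cycle
            if size[ru] < size[rv]:
                ru, rv = rv, ru
            parent[rv] = ru             # Union the two sets
            size[ru] += size[rv]
            total += w
            tree.append((u, v))
    return total, tree
```

On the same illustrative four-vertex graph used for Prim (a-b:1, b-c:2, b-d:3, a-d:4, a-c:6), it picks the three lightest acyclic edges for a total weight of 6.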

63 Shortest paths – Dijkstra's algorithm
Single-Source Shortest Paths Problem: given a weighted connected graph G, find shortest paths from a source vertex s to each of the other vertices.
Dijkstra's algorithm is similar to Prim's MST algorithm, with a different way of computing numerical labels: among vertices not already in the tree, it finds the vertex u with the smallest sum d_v + w(v,u), where
v is a vertex for which the shortest path has already been found on preceding iterations (such vertices form a tree),
d_v is the length of the shortest path from the source to v, and
w(v,u) is the length (weight) of the edge from v to u.

64 Dijkstra's Algorithm
Run on an example graph with vertices A, B, C, D and edge weights 10, 4, 3, 2, 1, 5:
Dijkstra(G)
  for each v ∈ V
    d[v] = ∞
  d[s] = 0
  S = ∅
  Q = V
  while (Q ≠ ∅)
    u = ExtractMin(Q)
    S = S ∪ {u}
    for each v ∈ u->Adj[]
      if (d[v] > d[u] + w(u,v))
        d[v] = d[u] + w(u,v)

65–70 Dijkstra's Algorithm (trace)
Slides 65–70 step through Dijkstra(G) from source s = A, repeating the same pseudocode on each slide; only the labels change:
Extract A: d[B] = 10, d[C] = 5; S = {A}.
Extract C: d[B] drops to 9, d[D] = 6; S = {A, C}.
Extract D: d[B] drops to 8; S = {A, C, D}.
Extract B: no further improvement; S = {A, C, D, B}.

71 Dijkstra's Algorithm
How many times is ExtractMin() called? How many times is the comparison executed? What will be the total running time? A: O(E lg V) using a min-heap for Q.

72 Notes on Dijkstra's algorithm
Doesn't work for graphs with negative weights.
Applicable to both undirected and directed graphs.
Efficiency:
O(|V|^2) for graphs represented by a weight matrix and an array implementation of the priority queue.
O(|E| log |V|) for graphs represented by adjacency lists and a min-heap implementation of the priority queue.
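A runnable sketch of Dijkstra's algorithm (not from the slides), again using the lazy-heap trick in place of DECREASE-KEY. The usage example below is a small directed graph chosen to be consistent with the distances in the trace above; the exact edges are an assumption, since the slide figure did not survive:

```python
import heapq

def dijkstra(adj, s):
    """Dijkstra's shortest paths via a min-heap; stale entries skipped on pop.

    adj: {vertex: [(weight, neighbor), ...]}, nonnegative weights.
    Returns a dict of shortest distances from s.
    """
    d = {v: float('inf') for v in adj}
    d[s] = 0
    done = set()                         # the set S of finished vertices
    heap = [(0, s)]
    while heap:
        du, u = heapq.heappop(heap)
        if u in done:                    # stale entry: u already finished
            continue
        done.add(u)
        for w, v in adj[u]:
            if d[u] + w < d[v]:          # relax edge (u, v)
                d[v] = d[u] + w
                heapq.heappush(heap, (d[v], v))
    return d
```

With hypothetical edges A→B:10, A→C:5, C→B:4, C→D:1, D→B:2, B→D:3, the distances from A come out as B = 8, C = 5, D = 6, matching the trace.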

73 Coding Problem
Coding: assignment of bit strings to alphabet characters.
Codewords: the bit strings assigned to the characters of the alphabet.
Two types of codes: fixed-length encoding (e.g., ASCII) and variable-length encoding (e.g., Morse code).
Prefix-free codes: no codeword is a prefix of another codeword.
A. Levitin, "Introduction to the Design & Analysis of Algorithms," 3rd ed., Ch. 9.

74 Coding Problem Fixed length encoding:
Encode every symbol by a unique binary string of a fixed length. Examples: ASCII (7 bit code), EBCDIC (8 bit code), …

75 American Standard Code for Information Interchange
Example: the string AABCAA is encoded character by character, each of A, A, B, C, A, A mapped to its fixed-length ASCII codeword (figure).

76 Total space usage in bits:
Assume an ℓ-bit fixed-length code. For a file of n characters, we need nℓ bits.

77 Variable Length codes
Idea: to save space, use fewer bits for frequent characters and more bits for rare characters.
Example: suppose an alphabet of 3 symbols, { A, B, C }, and a file of 1,000,000 characters. A fixed-length code needs 2 bits per symbol, for a total of 2,000,000 bits.

78 Variable Length codes - example
Suppose the frequency distribution of the characters is:
Symbol:    A        B    C
Frequency: 999,000  500  500
Encode:
Symbol: A  B   C
Code:   0  10  11
Note that the code of A is of length 1, and the codes for B and C are of length 2.

79 Total space usage in bits:
Fixed code: 1,000,000 x 2 = 2,000,000.
Variable code: 999,000 x 1 + 500 x 2 + 500 x 2 = 1,001,000.
A savings of almost 50%.

80 How do we decode?
With the fixed-length code, we know where every character starts, since they all have the same number of bits.
Example: A = 00, B = 01, C = 10.
A A A B B C C C B C B A A C C

81 How do we decode?
In the variable-length code, we use an idea called a prefix code, where no codeword is a prefix of another.
Example: A = 0, B = 10, C = 11. None of these codes is a prefix of another.

82 How do we decode?
Example: A = 0, B = 10, C = 11. So, for the string
A A A B B C C C B C B A A C C
the encoding is:
0 0 0 10 10 11 11 11 10 11 10 0 0 11 11

83 Prefix Code
Example: A = 0, B = 10, C = 11. Decode the encoded string back to
A A A B B C C C B C B A A C C
by reading bits left to right: as soon as the bits seen so far match a codeword, output that character and start over.
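The left-to-right decoding rule can be sketched directly (an illustrative implementation, not from the slides):

```python
def prefix_decode(bits, code):
    """Decode a bit string with a prefix-free code, scanning left to right.

    Because no codeword is a prefix of another, the first codeword that
    matches the buffered bits is the right one; emit it and continue.
    """
    inverse = {v: k for k, v in code.items()}
    out, buf = [], ''
    for b in bits:
        buf += b
        if buf in inverse:      # a complete codeword has been read
            out.append(inverse[buf])
            buf = ''
    return ''.join(out)
```

With the code A = 0, B = 10, C = 11, decoding the 25-bit string 0001010111111101110001111 recovers AAABBCCCBCBAACC.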

84 Desiderata:
Construct a variable-length code for a given file with the following properties:
It is a prefix code.
It uses the shortest possible codes.
It is efficient to construct.

85 Idea
Consider a binary tree, with 0 meaning a left branch and 1 meaning a right branch. In the example tree, leaf A hangs off the root's left branch, leaf B off the left branch one level down, and leaves C and D off the deepest internal node (figure).

86 Idea
Consider the paths from the root to each of the leaves A, B, C, D:
A : 0
B : 10
C : 110
D : 111

87 Observe:
This is a prefix code, since each path ends at a leaf, with no continuation.
If the tree is full (every internal node has two children) then we are not "wasting" bits.
If we make sure that the more frequent symbols are closer to the root, then they will have smaller codes.

88 Greedy Algorithm:
1. Consider all pairs: <frequency, symbol>.
2. Choose the two lowest frequencies, and make them siblings, with the new root having the combined frequency.
3. Iterate.

89 Greedy Algorithm Example:
Alphabet: A, B, C, D, E, F.
Frequency table:
Symbol:    A   B   C   D   E   F
Frequency: 10  20  30  40  50  60
Total file length: 210 characters.

90–98 Algorithm Run (trace)
Starting from the six leaves A(10), B(20), C(30), D(40), E(50), F(60), the algorithm repeatedly merges the two lowest-frequency subtrees:
X = A + B (30)
Y = X + C (60)
Z = D + E (90)
W = Y + F (120)
V = Z + W (210), the root.

99 The Huffman encoding:
A: 1000
B: 1001
C: 101
D: 00
E: 01
F: 11
File size: 10x4 + 20x4 + 30x3 + 40x2 + 50x2 + 60x2 = 510 bits.

100 Note the savings:
The Huffman code required 510 bits for the file.
A fixed-length code needs 3 bits for 6 characters; the file has 210 characters, for a total of 630 bits.

101 Greedy Algorithm
Initialize a tree of a single node for each symbol.
Keep the roots of all subtrees in a priority queue.
Iterate until only one tree is left: merge the two smallest-frequency subtrees into a single subtree with two children, and insert it into the priority queue.

102 Huffman Algorithm
Huffman(C)
  n = |C|
  Q = C                  // Q is a binary min-heap; Θ(n) Build-Heap
  for i = 1 to n-1
    z = Allocate-Node()
    x = Extract-Min(Q)   // Θ(log n), Θ(n) times
    y = Extract-Min(Q)   // Θ(log n), Θ(n) times
    left(z) = x
    right(z) = y
    f(z) = f(x) + f(y)
    Insert(Q, z)         // Θ(log n), Θ(n) times
  return Extract-Min(Q)  // return the root of the tree
Total run time: Θ(n log n).
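The priority-queue construction can be sketched with heapq (an illustrative implementation, not the slides' pseudocode). A tie-break counter stands in for pointer-based nodes, so the heap never has to compare two subtrees directly:

```python
import heapq
from itertools import count

def huffman_codes(freq):
    """Build Huffman codes from a {symbol: frequency} table via a min-heap."""
    tie = count()
    # A tree is either a bare symbol (leaf) or a (left, right) pair.
    heap = [(f, next(tie), sym) for sym, f in freq.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        f1, _, t1 = heapq.heappop(heap)   # two lowest-frequency subtrees
        f2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, next(tie), (t1, t2)))
    codes = {}
    def walk(tree, prefix):
        if isinstance(tree, tuple):
            walk(tree[0], prefix + '0')   # left branch = 0
            walk(tree[1], prefix + '1')   # right branch = 1
        else:
            codes[tree] = prefix or '0'   # single-symbol edge case
    walk(heap[0][2], '')
    return codes
```

The exact codewords depend on how ties between equal frequencies are broken, but every Huffman tree for the slides' frequencies (10, 20, 30, 40, 50, 60) encodes the 210-character file in 510 bits.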

