CS6045: Advanced Algorithms


1 CS6045: Advanced Algorithms
Greedy Algorithms

2 Greedy Algorithms: Main Concept
Divide the problem into multiple steps (sub-problems). At each step, take the choice that is best at the current moment: the locally optimal, or greedy, choice. A greedy algorithm always makes the choice that looks best at the moment. The hope: a locally optimal choice will lead to a globally optimal solution. For some problems this works; for others it does not.

3 Greedy Algorithms
A greedy algorithm always makes the choice that looks best at the moment. The hope: a locally optimal choice will lead to a globally optimal solution. For some problems it works, e.g., the activity-selection problem and Huffman codes. Dynamic programming can be overkill (slow); greedy algorithms tend to be easier to code.

4 Activity-Selection Problem
Problem: get your money's worth out of a carnival. Buy a wristband that lets you onto any ride. There are lots of rides, each starting and ending at different times. Your goal: ride as many rides as possible. (Another, alternative goal that we do not solve here: maximize the time spent on rides.) Welcome to the activity-selection problem.

5 Activity-Selection
Given a set S of n activities S = {a_1, ..., a_n}, where s_i = start time of activity i and f_i = finish time of activity i, find a maximum-size subset A of compatible (non-overlapping) activities.
Assume that f_1 ≤ f_2 ≤ ... ≤ f_n.

6 Example
Maximum-size mutually compatible set: {a1, a3, a6, a8}. Not unique: also {a2, a5, a7, a9}.

7 Activity Selection: Optimal Substructure
S_ij = {a_k ∈ S : f_i ≤ s_k < f_k ≤ s_j} = the activities that start after a_i finishes and finish before a_j starts.
In words, activities in S_ij are compatible with all activities that finish by f_i, and with all activities that start no earlier than s_j.

8 Activity Selection: Optimal Substructure
Let A_ij be a maximum-size set of compatible activities in S_ij, and let a_k ∈ A_ij be some activity in A_ij. Then we have two sub-problems:
Find the compatible activities in S_ik (activities that start after a_i finishes and finish before a_k starts).
Find the compatible activities in S_kj (activities that start after a_k finishes and finish before a_j starts).
Then A_ij = A_ik ∪ {a_k} ∪ A_kj, so |A_ij| = |A_ik| + |A_kj| + 1.

9 Activity Selection: Dynamic Programming
Let c[i, j] be the size of an optimal solution for S_ij. Then c[i, j] = max over a_k ∈ S_ij of (c[i, k] + c[k, j] + 1), with c[i, j] = 0 when S_ij is empty.
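As an illustration (mine, not from the slides), the recurrence can be evaluated bottom-up in Python. This is a minimal sketch, assuming the activities are already sorted by finish time and bracketed by sentinel activities a_0 (finish time 0) and a_{n+1} (start time infinity):

# Bottom-up evaluation of c[i][j] for activity selection (illustrative sketch).
# Assumes s and f are sorted by finish time; sentinels bracket the input.
def dp_activity_selection(s, f):
    n = len(s)
    start = [0] + list(s) + [float('inf')]
    finish = [0] + list(f) + [float('inf')]
    c = [[0] * (n + 2) for _ in range(n + 2)]
    for length in range(2, n + 2):              # solve shorter gaps first
        for i in range(0, n + 2 - length):
            j = i + length
            for k in range(i + 1, j):           # try every a_k inside (a_i, a_j)
                if start[k] >= finish[i] and finish[k] <= start[j]:
                    c[i][j] = max(c[i][j], c[i][k] + c[k][j] + 1)
    return c[0][n + 1]

# Example: three pairwise-overlapping activities, so at most 1 can be chosen.
print(dp_activity_selection([1, 3, 0], [4, 5, 6]))   # prints 1

This cubic-time evaluation is exactly what the greedy choice lets us avoid.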

10 Greedy Choice Property
Dynamic programming solves all the sub-problems. The activity-selection problem also exhibits the greedy-choice property: we should choose an activity that leaves the resource available for as many other activities as possible. The first greedy choice is a1, since f1 ≤ f2 ≤ ... ≤ fn.

11 Activity Selection: A Greedy Algorithm
So the actual algorithm is simple:
Sort the activities by finish time.
Schedule the first activity.
Then schedule the next activity in the sorted list that starts after the previous activity finishes.
Repeat until no more activities remain.
The intuition is even simpler: always pick the available ride that ends earliest.

12 Activity Selection: A Greedy Algorithm
Greedy choice: select the next best activity (the local optimum), i.e., the activity that ends first (smallest end time). Intuition: it leaves the largest possible empty space for more activities.
Once an activity is selected, delete all incompatible activities; they can no longer be selected.
Repeat the algorithm for the remaining activities, either iteratively or recursively.
Sub-problem: we created one sub-problem to solve (find the optimal schedule after the selected activity).
Hopefully, when we merge the local optimum with the sub-problem's optimal solution, we get a global optimum.

13 Greedy Algorithm Correctness
Theorem: if S_k (the set of activities that start after a_k finishes) is nonempty and a_m has the earliest finish time in S_k, then a_m is included in some optimal solution.
How do we prove it? We can convert any other optimal solution S' to the greedy algorithm's solution S.
Idea: compare the activities in S' and S from left to right. If they match in the selected activity, skip. If they do not match, we can replace the activity in S' by the one in S, because the one in S finishes no later.

14 Example
S: {a1, a3, a6, a8}. S': {a2, a5, a7, a9}.
Each of a2, a5, a7, a9 in S' can be replaced by a1, a3, a6, a8 from S (each finishes no later). We mapped S' to S and showed that S is at least as good.

15 Recursive Solution
Inputs: s and f are two arrays containing the start and finish times (assumption: sorted by finish time); k is the activity chosen in the last call; n is the problem size.

Recursive-Activity-Selection(s, f, k, n)
  m = k + 1
  while m <= n and s[m] < f[k]    // find the next activity starting after a_k ends
    m = m + 1
  if m <= n
    return {a_m} ∪ Recursive-Activity-Selection(s, f, m, n)
  else return ∅

Time complexity: O(n), assuming the arrays are already sorted; otherwise add O(n log n) for sorting.

16 Iterative Solution
Inputs: s and f are two arrays containing the start and finish times (assumption: sorted by finish time).

Iterative-Activity-Selection(s, f)
  n = s.length
  A = {a_1}
  k = 1
  for m = 2 to n
    if s[m] >= f[k]
      A = A ∪ {a_m}
      k = m
  return A
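For concreteness, here is a runnable Python version of the iterative algorithm (a sketch of mine; it accepts unsorted (start, finish) pairs and sorts them first):

# Greedy activity selection: sort by finish time, then keep each activity
# that starts no earlier than the finish of the last one kept.
def greedy_activity_selection(activities):
    activities = sorted(activities, key=lambda a: a[1])   # O(n log n)
    selected = [activities[0]]                            # assumes n >= 1
    last_finish = activities[0][1]
    for start, finish in activities[1:]:                  # O(n) scan
        if start >= last_finish:                          # compatible
            selected.append((start, finish))
            last_finish = finish
    return selected

# Hypothetical data: prints [(1, 4), (5, 7), (8, 11)].
print(greedy_activity_selection(
    [(1, 4), (3, 5), (0, 6), (5, 7), (3, 9), (5, 9), (6, 10), (8, 11)]))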

17 Elements Of Greedy Algorithms
Greedy-choice property: at each step, we make a greedy (locally optimal) choice.
Top-down solution: the greedy choice is usually made independently of the sub-problems, usually "before" solving the sub-problem.
Optimal substructure: the globally optimal solution can be composed from the locally optimal solutions of the sub-problems.

18 Elements Of Greedy Algorithms
Proving a greedy solution is optimal. Remember: not all problems have an optimal greedy solution. If a problem does, you still need to prove it. Usually the proof involves mapping or converting any other optimal solution to the greedy solution.

19 Review: The Knapsack Problem
The thief must choose among n items, where the i-th item is worth v_i dollars and weighs w_i pounds. Carrying at most W pounds, maximize the value taken. Note: assume v_i, w_i, and W are all integers. The problem is called "0-1" because each item must be taken or left in its entirety. A variation, the fractional knapsack problem: the thief can take fractions of items. Think of the items in the 0-1 problem as gold ingots, and in the fractional problem as buckets of gold dust.

20 Review: The Knapsack Problem And Optimal Substructure
Both variations exhibit optimal substructure. To show this for the 0-1 problem, consider the most valuable load weighing at most W pounds. If we remove item j from the load, what do we know about the remaining load? Answer: the remainder must be the most valuable load weighing at most W - w_j that the thief could take from the museum, excluding item j.

21 Solving The Knapsack Problem
The optimal solution to the 0-1 problem cannot be found with the same greedy strategy.
Greedy strategy: take items in order of dollars per pound.
Example: 3 items weighing 10, 20, and 30 pounds; the knapsack can hold 50 pounds. Suppose the 3 items are worth $60, $100, and $120. Will the greedy strategy work?

22 0-1 Knapsack - Greedy Strategy Does Not Work
Greedy choice: compute the benefit per pound ($6, $5, and $4 per pound for items 1, 2, and 3) and sort the items accordingly.
Greedy then takes item 1 ($60) and item 2 ($100), for $160, and item 3 no longer fits. The optimal load is item 2 plus item 3: $100 + $120 = $220. So greedy's $160 is not optimal.
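A small Python sketch (my own illustration of the slide's instance, not code from the lecture) makes the failure concrete:

# Greedy-by-ratio for the 0-1 knapsack: take whole items in decreasing
# value/weight order. On this instance it returns 160, not the optimal 220.
def greedy_01_knapsack(items, capacity):
    items = sorted(items, key=lambda vw: vw[0] / vw[1], reverse=True)
    total = 0
    for value, weight in items:
        if weight <= capacity:        # all-or-nothing: no fractions allowed
            capacity -= weight
            total += value
    return total

items = [(60, 10), (100, 20), (120, 30)]   # (value in $, weight in pounds)
print(greedy_01_knapsack(items, 50))       # prints 160; the optimum is 220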

23 Solving The Knapsack Problem
The optimal solution to the fractional knapsack problem can be found with a greedy algorithm.
Greedy choice: compute the benefit per pound, sort the items accordingly, and take as much as you can from the top of the list.
Here: all of item 1 ($60), all of item 2 ($100), and 2/3 of item 3 ($80), for a total of $240. This is optimal.
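The same ordering with fractions allowed does reach the optimum; a minimal Python sketch of mine:

# Fractional knapsack: identical value/weight ordering, but partial items
# are allowed, so the sack is filled completely and greedy is optimal.
def fractional_knapsack(items, capacity):
    items = sorted(items, key=lambda vw: vw[0] / vw[1], reverse=True)
    total = 0.0
    for value, weight in items:
        take = min(weight, capacity)          # take as much as still fits
        total += value * take / weight
        capacity -= take
        if capacity == 0:
            break
    return total

print(fractional_knapsack([(60, 10), (100, 20), (120, 30)], 50))   # 240.0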

24 The Knapsack Problem: Greedy Vs. Dynamic
The fractional problem can be solved greedily. The 0-1 problem cannot be solved with a greedy approach; as you have seen, however, it can be solved with dynamic programming.

25 Huffman Code
Computer data encoding: how do we represent data in binary?
Historical solution: fixed-length codes. Encode every symbol by a unique binary string of a fixed length. Examples: ASCII (7-bit code), EBCDIC (8-bit code), ...

26 American Standard Code for Information Interchange

27 ASCII Example
To encode AABCAA, write the fixed-length ASCII code of each of the characters A, A, B, C, A, A in turn.

28 Total space usage in bits:
Assume an ℓ-bit fixed-length code. For a file of n characters, we need nℓ bits.

29 Variable-Length Codes
Idea: in order to save space, use fewer bits for frequent characters and more bits for rare characters.
Example: suppose an alphabet of 3 symbols {A, B, C} and a file of 1,000,000 characters. A fixed-length code needs 2 bits per symbol, for a total of 2,000,000 bits.

30 Variable Length codes - example
Suppose the frequency distribution of the characters is: A: 999,000; B: 500; C: 500.
Encode: A: 0; B: 10; C: 11.
Note that the code of A has length 1, and the codes for B and C have length 2.

31 Total space usage in bits:
Fixed code: 1,000,000 × 2 = 2,000,000 bits.
Variable code: 999,000 × 1 + 500 × 2 + 500 × 2 = 1,001,000 bits.
A savings of almost 50%.

32 How do we decode?
With a fixed-length code, we know where every character starts, since they all have the same number of bits.
Example: A = 00, B = 01, C = 10. The string A A A B B C C C B C B A A C C is encoded as 00 00 00 01 01 10 10 10 01 10 01 00 00 10 10.

33 How do we decode?
In a variable-length code, we use an idea called a prefix code, where no codeword is a prefix of another.
Example: A = 0, B = 10, C = 11. None of these codewords is a prefix of another.

34 How do we decode?
Example: A = 0, B = 10, C = 11. For the string A A A B B C C C B C B A A C C, the encoding is 0 0 0 10 10 11 11 11 10 11 10 0 0 11 11.

35 Prefix Code
Example: A = 0, B = 10, C = 11. Decode the bit string 0001010111111101110001111: because no codeword is a prefix of another, a left-to-right scan recovers A A A B B C C C B C B A A C C unambiguously.
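A small decoder sketch (my own, assuming the code table above) shows why the scan never backtracks:

# Prefix-code decoder: grow a buffer bit by bit; since no codeword is a
# prefix of another, the first complete match is always the right symbol.
def decode(bits, code):
    inverse = {codeword: symbol for symbol, codeword in code.items()}
    out, buf = [], ""
    for bit in bits:
        buf += bit
        if buf in inverse:            # a complete codeword has been read
            out.append(inverse[buf])
            buf = ""
    return " ".join(out)

code = {"A": "0", "B": "10", "C": "11"}
print(decode("0001010111111101110001111", code))
# prints: A A A B B C C C B C B A A C C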

36 Desiderata
Construct a variable-length code for a given file with the following properties: it is a prefix code, it uses the shortest possible codes, and it is efficient to construct.

37 Idea
Consider a binary tree, with 0 meaning a left branch and 1 meaning a right branch.
[Figure: a right-leaning binary tree with leaves A, B, C, D.]

38 Idea
Consider the paths from the root to each of the leaves A, B, C, D: A: 0, B: 10, C: 110, D: 111.

39 Observe
This is a prefix code, since each codeword ends at a leaf, with no continuation. If the tree is full, then we are not "wasting" bits. If we make sure that the more frequent symbols are closer to the root, then they will have shorter codes.

40 Greedy Algorithm
1. Consider all pairs <frequency, symbol>.
2. Choose the two lowest frequencies and make them siblings, with their new parent carrying the combined frequency.
3. Iterate.

41 Greedy Algorithm Example:
Alphabet: A, B, C, D, E, F.
Frequency table: A: 10, B: 20, C: 30, D: 40, E: 50, F: 60.
Total file length: 210.

42 Algorithm Run: the forest starts as six single-node trees: A(10), B(20), C(30), D(40), E(50), F(60).

43 Algorithm Run: merge the two smallest, A(10) and B(20), into X(30), with A and B as its children. Forest: X(30), C(30), D(40), E(50), F(60).

44 Algorithm Run: merge X(30) and C(30) into Y(60). Forest: Y(60), D(40), E(50), F(60).

45 Algorithm Run: reordered by frequency, the forest is D(40), E(50), Y(60), F(60).

46 Algorithm Run: merge D(40) and E(50) into Z(90). Forest: Z(90), Y(60), F(60).

47 Algorithm Run: reordered by frequency, the forest is Y(60), F(60), Z(90).

48 Algorithm Run: merge Y(60) and F(60) into W(120). Forest: W(120), Z(90).

49 Algorithm Run: reordered by frequency, the forest is Z(90), W(120).

50 Algorithm Run: merge Z(90) and W(120) into the root V(210). The final tree: V(210) has children Z(90) and W(120); Z has children D(40) and E(50); W has children Y(60) and F(60); Y has children X(30) and C(30); X has children A(10) and B(20).

51 The Huffman Encoding
A: 1000, B: 1001, C: 101, D: 00, E: 01, F: 11.
File size: 10×4 + 20×4 + 30×3 + 40×2 + 50×2 + 60×2 = 510 bits.

52 Note the savings
The Huffman code requires 510 bits for the file. A fixed-length code needs 3 bits for 6 characters; the file has 210 characters, for a total of 630 bits.

53 Greedy Algorithm
Initialize a tree of a single node for each symbol. Keep the roots of all subtrees in a priority queue. Iterate until only one tree is left: merge the two smallest-frequency subtrees into a single subtree with two children, and insert it into the priority queue.

54 Huffman Algorithm
Huffman(C)
  n = |C|
  Q = C                     // Q is a binary min-heap; Θ(n) Build-Heap
  for i = 1 to n-1
    z = Allocate-Node()
    x = Extract-Min(Q)      // Θ(lg n), executed Θ(n) times
    y = Extract-Min(Q)      // Θ(lg n), executed Θ(n) times
    left(z) = x
    right(z) = y
    f(z) = f(x) + f(y)
    Insert(Q, z)            // Θ(lg n), executed Θ(n) times
  return Extract-Min(Q)     // return the root of the tree

Total run time: Θ(n lg n).
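A runnable Python sketch of the same algorithm, using heapq as the min-priority queue (the tie-breaking counter is my own addition so heap entries compare without ever comparing trees):

import heapq, itertools

def huffman(freqs):
    # heap entries are (frequency, tiebreaker, tree); a leaf is its symbol
    tiebreak = itertools.count()
    heap = [(f, next(tiebreak), sym) for sym, f in freqs.items()]
    heapq.heapify(heap)                        # Θ(n)
    for _ in range(len(freqs) - 1):            # n-1 merges
        fx, _, x = heapq.heappop(heap)         # Θ(lg n) each
        fy, _, y = heapq.heappop(heap)
        heapq.heappush(heap, (fx + fy, next(tiebreak), (x, y)))
    return heap[0][2]                          # root of the code tree

def codes(tree, prefix=""):
    if isinstance(tree, tuple):                # internal node: 0 left, 1 right
        left, right = tree
        return {**codes(left, prefix + "0"), **codes(right, prefix + "1")}
    return {tree: prefix}                      # leaf: its accumulated path

freqs = {"A": 10, "B": 20, "C": 30, "D": 40, "E": 50, "F": 60}
table = codes(huffman(freqs))
print(sum(freqs[s] * len(c) for s, c in table.items()))   # prints 510

The exact codewords can differ from the slides' by tie-breaking, but the total cost matches the 510 bits computed above.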

55 Algorithm correctness:
We need to prove two things for greedy algorithms:
Greedy-choice property: the chosen local optimum is indeed part of some global optimum.
Optimal-substructure property: when we recurse on the remaining sub-problem and combine its optimal solution with the greedy choice, we get a global optimum.

56 Huffman Algorithm correctness:
Greedy-choice property: there exists a minimum-cost prefix tree in which the two smallest-frequency characters are siblings at the end of a longest path from the root. This means that the greedy choice does not hurt finding the optimum.

57 Algorithm correctness:
Optimal-substructure property: take the two least frequent elements, combine them to produce a smaller problem, and solve that problem optimally; re-expanding the combined element back into the two original elements then yields an optimal solution to the original problem.

58 Algorithm correctness:
Greedy-choice property: there exists a minimum-cost tree where the minimum-frequency elements are siblings on a longest path.
Proof by contradiction: assume this is not the situation. Let a and b be the elements with the smallest frequencies, and let x and y be the two siblings on a longest path.

59 Algorithm correctness:
Comparing depths and frequencies, we know: d_a ≤ d_y and f_a ≤ f_y.
[Figure: tree with a higher up and x, y as the deepest siblings.]

60 Algorithm correctness:
We also know that for the code tree CT, the cost Σ_σ f_σ d_σ is the smallest possible.
Now exchange a and y.

61 Algorithm correctness:
cost(CT) = Σ_σ f_σ d_σ
         = Σ_{σ ≠ a,y} f_σ d_σ + f_a d_a + f_y d_y
         ≥ Σ_{σ ≠ a,y} f_σ d_σ + f_y d_a + f_a d_y
         = cost(CT')
(Since d_a ≤ d_y and f_a ≤ f_y, we have (f_y − f_a)(d_y − d_a) ≥ 0, that is, f_a d_a + f_y d_y ≥ f_y d_a + f_a d_y.)
So CT', the tree with a and y exchanged, costs no more than CT.

62 Algorithm correctness:
Now do the same exchange for b and x.

63 Algorithm correctness:
And we get an optimal code tree where a and b are siblings at the end of a longest path.

64 Algorithm correctness:
Optimal-substructure property: let a and b be the symbols with the smallest frequencies. Let x be a new symbol whose frequency is f_x = f_a + f_b. Delete characters a and b, and find the optimal code tree CT for the reduced alphabet. Then CT', obtained from CT by giving the leaf x the two children a and b, is an optimal tree for the original alphabet.

65 Algorithm correctness:
[Figure: the leaf x (with f_x = f_a + f_b) in CT becomes an internal node with children a and b in CT'.]

66 Algorithm correctness:
cost(CT') = Σ_σ f_σ d'_σ
          = Σ_{σ ≠ a,b} f_σ d'_σ + f_a d'_a + f_b d'_b
          = Σ_{σ ≠ a,b} f_σ d'_σ + f_a (d_x + 1) + f_b (d_x + 1)
          = Σ_{σ ≠ a,b} f_σ d'_σ + (f_a + f_b)(d_x + 1)
          = Σ_{σ ≠ a,b,x} f_σ d_σ + f_x d_x + f_x
          = cost(CT) + f_x

67 Algorithm correctness:
[Figure: as before, the leaf x with f_x = f_a + f_b expands into children a and b.] So cost(CT') = cost(CT) + f_x.

68 Algorithm correctness:
Assume CT' is not optimal. By the previous lemma, there is a tree CT'' that is optimal and in which a and b are siblings. So cost(CT'') < cost(CT').

69 Algorithm correctness:
Let CT''' be obtained from CT'' by merging the siblings a and b into a single leaf x with f_x = f_a + f_b. By a similar argument: cost(CT''') + f_x = cost(CT'').

70 Algorithm correctness:
We get: cost(CT''') = cost(CT'') − f_x < cost(CT') − f_x = cost(CT), and this contradicts the minimality of cost(CT).

71 Greedy vs. Dynamic
Greedy algorithms: assemble a globally optimal solution by making locally optimal choices; make each choice before solving the sub-problems; top-down (simpler and more efficient); can solve some problems optimally.
Dynamic programming: the choice depends on knowing the optimal solutions to the sub-problems; solves all sub-problems; bottom-up (slower); can solve more problems optimally.

