

1 Chapter 16: Greedy Algorithm

2 About this lecture: Introduce greedy algorithms, and look at some problems solvable by greedy algorithms

3 Coin Changing Suppose that in a certain country, the coin denominations consist of: $1, $2, $5, $10 You want to design an algorithm that can make change for any x dollars using the fewest coins

4 Coin Changing An idea is as follows: 1. Create an empty bag 2. while (x > 0) { Find the largest coin c at most x; Put c in the bag; Set x = x – c ; } 3. Return coins in the bag
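The steps above can be sketched in Python (a sketch, not part of the original slides; the denomination set is the one from the example):

```python
def greedy_change(x, denominations=(1, 2, 5, 10)):
    """Make change for x dollars by repeatedly taking the largest coin <= x."""
    bag = []                                   # step 1: empty bag
    for c in sorted(denominations, reverse=True):
        while x >= c:                          # step 2: largest coin at most x
            bag.append(c)
            x -= c
    return bag                                 # step 3: coins in the bag

# e.g. greedy_change(18) returns [10, 5, 2, 1]
```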

5 Coin Changing It is easy to check that the algorithm always returns coins whose sum is x At each step, the algorithm makes a greedy choice (by including the largest coin), the choice that looks best at the moment, hoping to come up with an optimal solution (a change with the fewest coins) This is an example of a Greedy Algorithm

6 Coin Changing Does the greedy algorithm always work? No! Consider a new set of coin denominations: $1, $4, $5, $10 Suppose we want change for $8 Greedy algorithm: 4 coins (5, 1, 1, 1) Optimal solution: 2 coins (4, 4)
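The failure can be checked mechanically; `fewest_coins` below is a hypothetical brute-force helper (not from the slides, usable only on tiny inputs) that exhibits the true optimum:

```python
from itertools import combinations_with_replacement

def greedy_change(x, denoms):
    """Largest-coin-first greedy, as in the algorithm above."""
    bag = []
    for c in sorted(denoms, reverse=True):
        while x >= c:
            bag.append(c)
            x -= c
    return bag

def fewest_coins(x, denoms):
    """Exhaustive search for a minimum-size multiset of coins summing to x."""
    for k in range(1, x + 1):
        for combo in combinations_with_replacement(sorted(denoms), k):
            if sum(combo) == x:
                return list(combo)

# With denominations {1, 4, 5, 10} and x = 8:
# greedy_change(8, (1, 4, 5, 10)) returns [5, 1, 1, 1]  (4 coins)
# fewest_coins(8, (1, 4, 5, 10)) returns [4, 4]         (2 coins)
```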

7 Greedy Algorithm We will look at some non-trivial examples where the greedy algorithm works correctly Usually, to show a greedy algorithm works: We show that some optimal solution includes the greedy choice ⇒ selecting the greedy choice is correct We show the optimal substructure property ⇒ solve the subproblem recursively

8 Activity Selection Suppose you are a freshman at a school, and there are many welcoming activities There are n activities A1, A2, …, An Each activity Ak has a start time sk and a finish time fk Target: Join as many as possible!

9 Activity Selection To join activity Ak, you must arrive at sk; you must also stay until fk Since we want as many activities as possible, should we choose the one with (1) Shortest duration? (2) Earliest start time? (3) Earliest finish time?

10 Activity Selection Shortest duration may not be good: A1: [4:50, 5:10), A2: [3:00, 5:00), A3: [5:05, 7:00) Though not optimal, the number of activities in this solution R (shortest duration first) is at least half that of an optimal solution O: one activity in R clashes with at most 2 in O If |O| > 2|R|, R could have included one more activity

11 Activity Selection Earliest start time may be even worse: A1: [3:00, 10:00), A2: [3:10, 3:20), A3: [3:20, 3:30), A4: [3:30, 3:40), A5: [3:40, 3:50), … In the worst case, this solution contains 1 activity, while the optimal one has n − 1 activities

12 Greedy Choice Property To our surprise, earliest finish time works! We have the following lemma: Lemma: For the activity selection problem, some optimal solution includes an activity with the earliest finish time How do we prove it?

13 Proof: (By a “cut-and-paste” argument) Let OPT = an optimal solution Let Aj = the activity with the earliest finish time If OPT contains Aj, done! Else, let A′ = the earliest-finishing activity in OPT Since Aj finishes no later than A′, we can replace A′ by Aj in OPT without conflicting with the other activities in OPT ⇒ an optimal solution containing Aj (since it has the same number of activities as OPT)

14 Optimal Substructure Let Aj = the activity with the earliest finish time Let S = the subset of the original activities that do not conflict with Aj Let OPT = an optimal solution containing Aj Lemma: OPT − {Aj} must be an optimal solution for the subproblem with input activities S

15 Proof: (By contradiction) First, OPT − {Aj} can contain only activities in S If it is not an optimal solution for input activities S, let C be some optimal solution for input S ⇒ C has more activities than OPT − {Aj} ⇒ C ∪ {Aj} has more activities than OPT ⇒ Contradiction occurs

16 Greedy Algorithm The previous two lemmas imply the following correct greedy algorithm: S = input set of activities ; while (S is not empty) { A = activity in S with earliest finish time; Select A and update S by removing activities that conflict with A; } If finish times are sorted in the input, running time = O(n)
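The loop above can be sketched in Python (a sketch, not part of the slides; activities are hypothetical (start, finish) pairs, and the sort is the O(n log n) step that disappears when finish times come pre-sorted):

```python
def select_activities(activities):
    """Greedy activity selection: repeatedly take the activity with the
    earliest finish time that does not conflict with those already chosen."""
    chosen = []
    last_finish = float("-inf")
    for start, finish in sorted(activities, key=lambda a: a[1]):
        if start >= last_finish:              # compatible with the chosen set
            chosen.append((start, finish))
            last_finish = finish
    return chosen

# e.g. select_activities([(1, 4), (3, 5), (0, 6), (5, 7), (8, 11), (2, 14)])
# returns [(1, 4), (5, 7), (8, 11)]
```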

17 Designing a greedy algorithm 1. Cast the optimization problem as one in which we make a choice and are left with one subproblem to solve. 2. Prove that there is always an optimal solution to the original problem that makes the greedy choice 3. Demonstrate optimal substructure by showing that, having made the greedy choice, if we combine an optimal solution to the subproblem with the greedy choice, we arrive at an optimal solution to the original problem.

18 Designing a greedy algorithm Greedy-choice property: A globally optimal solution can be reached by making a locally optimal (greedy) choice. Optimal substructure: An optimal solution to the problem contains within it optimal solutions to subproblems.

19 0-1 Knapsack Problem Suppose you are a thief, and you are now in a jewelry shop (nobody is around!) You have a big knapsack that you have “borrowed” from some shop before Weight limit of the knapsack: W There are n items, I1, I2, …, In; item Ik has value vk and weight wk Target: Get items with total value as large as possible without exceeding the weight limit

20 0-1 Knapsack Problem We may think of some strategies like: (1) Take the most valuable item first (2) Take the densest item (with vk/wk maximized) first Unfortunately, this problem turns out to be very hard (NP-complete), so it is unlikely to have a good greedy strategy Let’s change the problem a bit…

21 Fractional Knapsack Problem In the previous problem, for each item we either take it all or leave it there; we cannot take a fraction of an item Suppose now we allow taking fractions of the items; precisely, a fraction c of item Ik has value c·vk and weight c·wk Target: Get as valuable a load as possible, without exceeding the weight limit

22 Fractional Knapsack Problem Suddenly, the following strategy works: Take as much of the densest item (with vk/wk maximized) as possible The correctness of the above greedy-choice property can be shown by a cut-and-paste argument Also, it is easy to see that this problem has the optimal substructure property ⇒ implies a correct greedy algorithm
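The densest-first strategy can be sketched as follows (a sketch, not from the slides; items are hypothetical (value, weight) pairs):

```python
def fractional_knapsack(items, W):
    """Take items in decreasing density v/w, splitting the last one if needed.
    items: list of (value, weight) pairs; returns the maximum total value."""
    total, remaining = 0.0, W
    for value, weight in sorted(items, key=lambda it: it[0] / it[1], reverse=True):
        if remaining <= 0:
            break
        take = min(weight, remaining)     # whole item, or the fraction that fits
        total += value * take / weight
        remaining -= take
    return total
```

With W = 50 and items valued $60/$100/$120 at weights 10/20/30, this returns $240: all of the two densest items plus two-thirds of the third.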

23 Fractional Knapsack Problem However, the previous greedy algorithm (pick densest) does not work for 0-1 knapsack To see why, consider W = 50 and: I1: v1 = $60, w1 = 10 (density: 6) I2: v2 = $100, w2 = 20 (density: 5) I3: v3 = $120, w3 = 30 (density: 4) Greedy algorithm: $160 (I1, I2) Optimal solution: $220 (I2, I3)
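The gap can be checked mechanically; `best_01_value` is a hypothetical brute-force helper (not from the slides, fine only for tiny inputs):

```python
from itertools import combinations

def best_01_value(items, W):
    """Brute-force optimum of the 0-1 knapsack over all subsets of items."""
    best = 0
    for r in range(len(items) + 1):
        for subset in combinations(items, r):
            if sum(w for _, w in subset) <= W:
                best = max(best, sum(v for v, _ in subset))
    return best

items = [(60, 10), (100, 20), (120, 30)]    # (value, weight) from the slide
# best_01_value(items, 50) returns 220, while densest-first packs only $160
```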

24 Encoding Characters In ASCII, each character is encoded using the same number of bits (8 bits), called fixed-length encoding However, in real-life English texts, not every character occurs with the same frequency One way to encode the texts is: Encode frequent chars with few bits Encode infrequent chars with more bits ⇒ called variable-length encoding

25 Encoding Characters Variable-length encoding may gain a lot in storage requirements Example: Suppose we have a 100K-char file consisting of only the chars a, b, c, d, e, f Suppose we know a occurs 45K times, and the other chars 11K times each ⇒ Fixed-length encoding: 300K bits (3 bits per char)

26 Encoding Characters Example (cont): Suppose we encode the chars as follows: a → 0, b → 100, c → 101, d → 110, e → 1110, f → 1111 Storage with the above encoding: (45 × 1 + 11 × 3 × 3 + 11 × 4 × 2) × 1K = 232K bits (reduced by about 25%!!)
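The arithmetic can be verified directly (a sketch, not part of the slides; frequencies are in thousands):

```python
freq = {"a": 45, "b": 11, "c": 11, "d": 11, "e": 11, "f": 11}   # K occurrences
code = {"a": "0", "b": "100", "c": "101", "d": "110", "e": "1110", "f": "1111"}

fixed_bits = 3 * sum(freq.values())        # 3 bits/char suffice for 6 chars
variable_bits = sum(freq[ch] * len(code[ch]) for ch in freq)
# fixed_bits = 300 (K bits); variable_bits = 232 (K bits)
```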

27 Encoding Characters Thinking a step ahead, you may consider an even “better” encoding scheme: a → 0, b → 1, c → 00, d → 01, e → 10, f → 11 This encoding requires less storage since each char is encoded in fewer bits… What’s wrong with this encoding?

28 Prefix Code Suppose the encoded text is: 0101 We cannot tell if the original text is abab, dd, abd, aeb, or … The problem comes from: one codeword is a prefix of another one

29 Prefix Code To avoid the problem, we generally want each codeword not to be a prefix of another; this is called a prefix code, or prefix-free code Let T = text encoded by a prefix code We can easily decode T back to the original: Scan T from the beginning Once we see a codeword, output the corresponding char Then, recursively decode the rest
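The left-to-right decoding procedure can be sketched in Python (a sketch, not from the slides; the first-match scan is correct precisely because no codeword is a prefix of another):

```python
def decode(bits, code):
    """Decode a bit string produced by a prefix code, scanning left to right."""
    inverse = {w: ch for ch, w in code.items()}
    out, buf = [], ""
    for b in bits:
        buf += b
        if buf in inverse:                # a complete codeword has been read
            out.append(inverse[buf])
            buf = ""
    if buf:
        raise ValueError("leftover bits do not form a codeword")
    return "".join(out)

code = {"a": "0", "b": "100", "c": "101", "d": "110", "e": "1110", "f": "1111"}
# decode("0100101", code) returns "abc"
```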

30 Prefix Code Tree Naturally, a prefix code scheme corresponds to a prefix code tree Each char → a leaf Root-to-leaf path → codeword E.g., a → 0, b → 100, c → 101, d → 110, e → 1110, f → 1111 [Figure: the corresponding tree, edges labeled 0 (left) and 1 (right), with the chars a–f at the leaves]

31 Optimal Prefix Code Question: Given the frequency of each char, how do we find the optimal prefix code scheme (or optimal prefix code tree)? Precisely: Input: S = a set of n chars c1, c2, …, cn, where ck occurs f(ck) times Target: Find a codeword wk for each ck such that Σk |wk| · f(ck) is minimized

32 Huffman Code In 1952, David Huffman (then an MIT PhD student) devised a greedy algorithm to obtain the optimal prefix code tree Let c and c′ be the chars with the least frequencies. He observed that: Lemma: There is some optimal prefix code tree with c and c′ sharing the same parent, with the two leaves farthest from the root

33 Proof: (By a “cut-and-paste” argument) Let OPT = some optimal solution If c and c′ are as required, done! Else, let a and b be two bottom-most leaves sharing the same parent (such leaves must exist… why??) Swap a with c, and swap b with c′ ⇒ an optimal solution as required (since it has at most the same Σk |wk| · f(ck) as OPT… why??)

34 Graphically: [Figure: if a tree with bottom-most sibling leaves a and b is optimal, then the tree obtained by swapping a with c and b with c′ is also optimal]

35 Optimal Substructure Let T be an optimal prefix code tree with c and c′ as required Let T′ be the tree formed by merging c, c′, and their parent into one node X Consider S′ = the set formed by removing c and c′ from S, but adding X with f(X) = f(c) + f(c′) Lemma: If T′ is an optimal prefix code tree for S′, then T is an optimal prefix code tree for S.

36 Graphically, the lemma says: [Figure: tree T for S has sibling leaves c and c′; merging c, c′, and their parent into a single node X, with f(X) = f(c) + f(c′), gives tree T′ for S′ — if T′ is optimal for S′, then T is optimal for S]

37 Huffman Code Questions: Based on the previous lemmas, can you obtain Huffman’s coding scheme? (Try to think about it yourself before looking at the next page…) What is the running time? O(n log n) time, using a heap (how??)

38 Huffman(S) { // build Huffman code tree 1. Find the least frequent chars c and c′ 2. S′ = remove c and c′ from S, but add a char X with f(X) = f(c) + f(c′) 3. T′ = Huffman(S′) 4. Make leaf X of T′ an internal node by connecting two leaves c and c′ to it 5. Return the resulting tree }

39 Constructing a Huffman code
HUFFMAN(C)
1  n ← |C|
2  Q ← C    /* initialize the min-priority queue with the characters in C */
3  for i ← 1 to n − 1
4      do allocate a new node z
5         left[z] ← x ← EXTRACT-MIN(Q)
6         right[z] ← y ← EXTRACT-MIN(Q)
7         f[z] ← f[x] + f[y]
8         INSERT(Q, z)
9  return EXTRACT-MIN(Q)
Complexity: O(n log n)
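A compact Python version of the same procedure, using the standard-library heap as the min-priority queue (a sketch, not part of the slides; the counter is only a tie-breaker so the heap never compares tree nodes, and the frequencies in the comment are hypothetical):

```python
import heapq
from itertools import count

def huffman(freq):
    """Build an optimal prefix code from {char: frequency}; returns {char: codeword}."""
    tie = count()
    heap = [(f, next(tie), ch) for ch, f in freq.items()]
    heapq.heapify(heap)
    while len(heap) > 1:                        # n - 1 merges in total
        f1, _, left = heapq.heappop(heap)       # the two least frequent nodes
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, next(tie), (left, right)))
    code = {}
    def walk(node, prefix):
        if isinstance(node, tuple):             # internal node: recurse on children
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:
            code[node] = prefix or "0"          # a lone char still needs one bit
    walk(heap[0][2], "")
    return code

# For example frequencies {a:45, b:13, c:12, d:16, e:9, f:5} (per K chars),
# the total cost sum(freq[ch] * len(code[ch])) comes to 224K bits.
```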

40 The steps of Huffman’s algorithm

41 The steps of Huffman’s algorithm

42 Brainstorm Suppose there are n items. Let –S = {item 1, item 2, …, item n} –wi = weight of item i –pi = profit of item i –W = maximum weight the knapsack can hold, where wi, pi, W are positive integers. Determine a subset A of S such that Σi∈A pi is maximized subject to Σi∈A wi ≤ W. Solve this problem in Θ(nW) time. Hint: Use the Dynamic Programming strategy
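One way to hit the Θ(nW) bound can be sketched as follows (a sketch under the hint's dynamic programming strategy, not a provided solution):

```python
def knapsack_01(profits, weights, W):
    """dp[w] = best profit achievable with capacity w; O(nW) time, O(W) space."""
    dp = [0] * (W + 1)
    for p, wt in zip(profits, weights):
        for w in range(W, wt - 1, -1):   # descending, so each item is used at most once
            dp[w] = max(dp[w], dp[w - wt] + p)
    return dp[W]

# e.g. knapsack_01([60, 100, 120], [10, 20, 30], 50) returns 220
```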

43 Homework Exercises: , , (Due Nov. 23) Practice at home: , , , , ,