Greedy Algorithms
- Basic idea
- Connection to dynamic programming
- Proof techniques

Example: Making Change
- Input: a positive integer n
- Task: compute a minimum-size multiset of coins from C = {d1, d2, d3, …, dk} whose values sum to n
- Example: n = 73, C = {1, 3, 6, 12, 24}
- Solution: 3 coins of size 24 and 1 coin of size 1

Dynamic programming solution 1
- Subsolutions: T(k) = the minimum number of coins summing to k, for 0 ≤ k ≤ n
- Recurrence relation: T(n) = min over 1 ≤ i ≤ n−1 of (T(i) + T(n−i)), with base cases T(di) = 1 for each denomination di
- Linear array of values to compute
- Time to compute each entry?
- Compute T(k) starting at T(1), skipping the base-case values
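A minimal Python sketch of this first recurrence (the function name and loop structure are my own, not from the slides); it fills T for every amount from 1 to n by trying all ways to split an amount into two smaller amounts:

    def min_coins_split(n, denominations):
        """T[k] = fewest coins from `denominations` summing to k (DP solution 1)."""
        INF = float("inf")
        T = [INF] * (n + 1)
        T[0] = 0
        for d in denominations:
            if d <= n:
                T[d] = 1                    # base cases: one coin makes amount d exactly
        for k in range(1, n + 1):
            if T[k] == 1:                   # skip base-case values, as on the slide
                continue
            for j in range(1, k):           # split k into j and k - j
                T[k] = min(T[k], T[j] + T[k - j])
        return T[n]

    print(min_coins_split(73, [1, 3, 6, 12, 24]))   # 4, e.g. 24 + 24 + 24 + 1

Each entry takes O(n) time here, so filling the whole table costs O(n^2).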

Dynamic programming solution 2
- Subsolutions: T(k) = the minimum number of coins summing to k, for 0 ≤ k ≤ n
- Recurrence relation: T(n) = min over denominations di ≤ n of (T(n−di) + 1); some coin must be the "first/last" coin of an optimal solution
- Base cases: T(di) = 1
- Linear array of values to compute
- Time to compute each entry?
- Compute T(k) starting at T(1), skipping the base-case values
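A matching Python sketch of the second recurrence (again my own code): for each amount it tries every denomination as the "last" coin, so each entry costs time proportional to the number of denominations rather than to n:

    def min_coins_last_coin(n, denominations):
        """T[k] = fewest coins summing to k, choosing the last coin explicitly (DP solution 2)."""
        INF = float("inf")
        T = [INF] * (n + 1)
        T[0] = 0                            # zero coins make amount 0
        for k in range(1, n + 1):
            for d in denominations:
                if d <= k and T[k - d] + 1 < T[k]:
                    T[k] = T[k - d] + 1     # one coin d plus an optimal solution for k - d
        return T[n]

    print(min_coins_last_coin(73, [1, 3, 6, 12, 24]))   # 4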

Greedy Solution
- From dynamic programming solution 2: T(n) = min over di ≤ n of (T(n−di) + 1)
- Key observation: for many (but not all) sets of coins, the optimal choice for the first/last coin is always the largest possible di
- That is, T(n) = T(n−dmax) + 1, where dmax is the largest di ≤ n
- Algorithm: choose the largest di that is at most n and recurse
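A short Python sketch of this greedy rule (my own code): repeatedly take the largest denomination that still fits. As the later slides show, this is optimal for some coin sets but not for all of them:

    def greedy_coins(n, denominations):
        """Return the multiset of coins the greedy rule picks for amount n."""
        coins = []
        for d in sorted(denominations, reverse=True):
            while d <= n:                   # take the largest coin that still fits
                coins.append(d)
                n -= d
        return coins                        # assumes a coin of value 1, so n reaches 0

    print(greedy_coins(73, [1, 3, 6, 12, 24]))   # [24, 24, 24, 1]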

Comparison of how each approach breaks down T(n):
- DP 1 tries every split of n into k and n−k, combining T(k) and T(n−k)
- DP 2 tries every denomination di as the last coin, combining T(n−di) with one coin
- Greedy commits to the single choice dmax, reducing the problem to T(n−dmax)

Greedy Technique
- When trying to solve a problem, make a local greedy choice that optimizes progress towards the global solution, and recurse
- Implementation and running-time analysis are typically straightforward; the implementation often uses a sorting algorithm or a data structure to identify the next greedy choice
- Proof of optimality is typically the hard part

Proofs of Optimality
- We will often prove structural properties about an optimal solution
  - Example: every optimal solution to the making-change problem has at most x coins of denomination y
- We will often prove that an optimal solution is the one generated by the greedy algorithm: if we have an optimal solution that does not obey the greedy constraint, we can "swap" some elements to make it obey the greedy constraint
- Always consider the possibility that greedy is not optimal, and look for counterexamples

Example 1: Making Change, Proof 1
- Claim: greedy is optimal for the coin set C = {1, 3, 9, 27, 81}
- Structural property: in any optimal solution, each of the denominations 1, 3, 9, and 27 appears at most 2 times. Why?
- This structural property immediately implies that the greedy solution must be optimal

Example 1: Making Change, Proof 2
- Claim: greedy is optimal for the coin set C = {1, 3, 9, 27, 81}
- Let S be an optimal solution and G be the greedy solution; let Ak denote the number of coins of denomination k in solution A
- Let kdiff be the largest denomination k such that Gk ≠ Sk
- Claim 1: Gkdiff ≥ Skdiff. Why?
- Claim 2: Sd ≥ 3 for some denomination d < kdiff. Why?
- Claim 3: we can create a better solution than S by performing a "swap". What swap?
- These three claims imply that kdiff does not exist, so G is optimal

Proof that Greedy is NOT Optimal
- Consider the coin set C = {1, 3, 6, 12, 24, 30}
- Prove that greedy does not produce an optimal solution for every n; that is, exhibit an n for which greedy is suboptimal
- What about the coin set C = {1, 5, 10, 25, 50}?
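One way to attack this exercise is to compare greedy against a DP optimum for every amount up to a bound; the sketch below (my own code, with a self-contained greedy counter and DP) reports the first amount where the two disagree:

    def greedy_count(n, coins):
        """Number of coins the largest-first greedy uses for amount n."""
        used = 0
        for d in sorted(coins, reverse=True):
            used += n // d
            n %= d
        return used

    def optimal_count(n, coins):
        """Minimum number of coins for amount n (DP solution 2)."""
        INF = float("inf")
        T = [0] + [INF] * n
        for k in range(1, n + 1):
            T[k] = min((T[k - d] + 1 for d in coins if d <= k), default=INF)
        return T[n]

    def first_mismatch(coins, bound=200):
        for n in range(1, bound + 1):
            if greedy_count(n, coins) != optimal_count(n, coins):
                return n
        return None

    print(first_mismatch([1, 3, 6, 12, 24, 30]))   # 48: greedy takes 30 + 12 + 6, but 24 + 24 uses fewer coins
    print(first_mismatch([1, 5, 10, 25, 50]))      # None within this bound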

Activity Selection Problem
- Input: a set of n intervals (si, fi) with fi > si ≥ 0 for each interval
- Task: identify a maximum-size set of pairwise non-overlapping intervals
- Example: {(0,2), (1,3), (5,7), (6,11), (8,10)}
- Solution: {(1,3), (5,7), (8,10)}

Possible Greedy Strategies
- Choose the shortest interval and recurse
- Choose the interval that starts earliest and recurse
- Choose the interval that ends earliest and recurse
- Choose the interval that starts latest and recurse
- Choose the interval that ends latest and recurse
Do any of these strategies work?

Earliest End Time
Algorithm:
- Sort the intervals by end time
- A: choose the interval with the earliest end time, breaking ties arbitrarily
- Prune the intervals that overlap with this interval and go to A
Running time?
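A compact Python sketch of this earliest-end-time rule (my own code; it treats intervals that merely touch at an endpoint as non-overlapping):

    def activity_selection(intervals):
        """Greedy interval scheduling: sort by finish time, keep compatible intervals."""
        chosen = []
        last_finish = float("-inf")
        for start, finish in sorted(intervals, key=lambda iv: iv[1]):
            if start >= last_finish:        # compatible with everything kept so far
                chosen.append((start, finish))
                last_finish = finish
        return chosen

    print(activity_selection([(0, 2), (1, 3), (5, 7), (6, 11), (8, 10)]))
    # [(0, 2), (5, 7), (8, 10)] -- another optimal answer of size 3

The sort dominates, so the running time is O(n log n).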

Proof of Optimality
- Claim: for any instance of the problem, there exists an optimal solution that includes an interval with the earliest end time
- Let S be an optimal solution, and suppose S does not include an interval with the earliest end time
- We can swap the interval in S with the earliest end time for an interval with the earliest overall end time and obtain a feasible solution S'
- S' has at least as many intervals as S, and the claim follows
- Applying this observation recursively to the subproblem induced by selecting an interval with the earliest end time shows that greedy produces an optimal schedule

Shortest Interval
Algorithm:
- Sort the intervals by length
- A: choose the shortest interval, breaking ties arbitrarily
- Prune the intervals that overlap with this interval and go to A
Running time?
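Changing the sort key from end time to length gives this variant, and a tiny hand-picked instance (the intervals below are my own illustrative example, not from the slides) suggests why it can fail: one short interval in the middle can block two longer, mutually compatible intervals:

    def compatible(interval, chosen):
        """True if `interval` overlaps none of the already chosen intervals."""
        s, f = interval
        return all(f <= cs or s >= cf for cs, cf in chosen)

    def schedule(intervals, key):
        """Generic greedy: scan in `key` order, keep every compatible interval."""
        chosen = []
        for interval in sorted(intervals, key=key):
            if compatible(interval, chosen):
                chosen.append(interval)
        return chosen

    intervals = [(0, 4), (3, 6), (5, 9)]
    print(schedule(intervals, key=lambda iv: iv[1] - iv[0]))   # shortest first: [(3, 6)]
    print(schedule(intervals, key=lambda iv: iv[1]))           # earliest end:   [(0, 4), (5, 9)]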

Proof of Optimality?
- Claim to test: for any instance of the problem, there exists an optimal solution that includes a shortest interval
- Let S be an optimal solution, and suppose S does not include a shortest interval
- Can we produce an equally good solution S' from S that includes a shortest interval?
- If not, how does this lead to a counterexample?

Example: Minimizing the Sum of Completion Times
- Input: a set of n jobs with lengths xi
- Task: schedule the jobs on a single processor so that the sum of all job completion times is minimized
- Example: Job A (length 2), Job B (length 1), Job C (length 3)
- Optimal solution: run B, then A, then C; completion times B:1, A:3, C:6, for a sum of 10
- Develop a greedy strategy and prove it is optimal
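The example schedule B, A, C runs the jobs in nondecreasing length, so a natural greedy to try is "shortest job first". A small Python sketch (my own code; treat it as one candidate strategy for this exercise, not the slides' official answer):

    def sum_completion_times(jobs):
        """jobs: list of (name, length).  Schedule shortest-first and return (order, total)."""
        order = sorted(jobs, key=lambda job: job[1])   # shortest job first
        time, total = 0, 0
        for name, length in order:
            time += length                             # this job's completion time
            total += time
        return [name for name, _ in order], total

    print(sum_completion_times([("A", 2), ("B", 1), ("C", 3)]))
    # (['B', 'A', 'C'], 10) -- matches the completion times 1, 3, 6 on the slide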

Questions
- What is the running time of your algorithm?
- Does it ever make sense to preempt a job? That is, start a job, interrupt it to run a second job (and possibly others), and then finish the first job later?
- Can you develop a swapping proof of optimality for your algorithm?