
Greedy Algorithms

Surprisingly, many important and practical computational problems can be solved this way. Every two-year-old knows the greedy algorithm: in order to get what you want, just start grabbing what looks best.

Optimization Problems
Ingredients:
Instances: The possible inputs to the problem.
Solutions for Instance: Each instance has an exponentially large set of solutions.
Cost of Solution: Each solution has an easy-to-compute cost or value.
Specification:
Preconditions: The input is one instance.
Postconditions: A valid solution with optimal (minimum or maximum) cost.

Optimization Problems with a Greedy Algorithm
Instances: A set of objects and a relationship between them. Some subsets are not allowed because some objects conflict.
Solutions for Instance: A subset of the objects, or some other choice about each object.

Optimization Problems with a Greedy Algorithm
Instances: A set of objects and a relationship between them.
Solutions for Instance: A subset of the objects, or some other choice about each object.
Cost of Solution: The number of objects in the solution, or the sum of the costs of the objects.

Optimization Problems with a Greedy Algorithm Instances: A set of objects and a relationship between them. Goal: Find an optimal non-conflicting solution.

The Brute Force Algorithm
Try every solution! Exponential time, because there are exponentially many solutions.
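As a concrete sketch, brute force on a tiny hypothetical game-show instance (the prize names, values, and conflict pairs below are invented for illustration) enumerates all 2^n subsets and keeps the best conflict-free one:

```python
from itertools import combinations

def brute_force(objects, value, conflicts):
    """Try every one of the 2^n subsets; keep the best conflict-free one."""
    best = ()
    for r in range(len(objects) + 1):
        for subset in combinations(objects, r):
            valid = all((a, b) not in conflicts and (b, a) not in conflicts
                        for a, b in combinations(subset, 2))
            if valid and value(subset) > value(best):
                best = subset
    return best

prizes = {"lion": 9, "elephant": 8, "piano": 5}
conflicts = {("lion", "elephant")}  # can't take both
total = lambda subset: sum(prizes[p] for p in subset)
print(brute_force(list(prizes), total, conflicts))  # ('lion', 'piano')
```

Even on this three-object instance the loop examines 8 subsets; at n objects it examines 2^n.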

The Greedy Choice
Commit to the object that looks the "best." We must prove that this locally greedy choice does not have negative global consequences.

The Game Show Example Problem: Choose the best m prizes.

The Game Show Example
Problem: Choose the best m prizes.
Greedy: Start by grabbing the best.
Consequences: If you take the lion, you can't take the elephant. But greedy algorithms do not try to predict the future and do not backtrack.

The Game Show Example
Problem: Choose the best m prizes.
Iterative Greedy Algorithm:
Loop, grabbing the best, then the second best, ...
If the next best object conflicts with the committed objects or fulfills no new requirements, reject it; else commit to it.
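The loop above can be sketched in a few lines (a minimal sketch; the object/value/conflict representation, reusing a hypothetical prize instance, is my own assumption):

```python
def iterative_greedy(objects, value, conflicts):
    """Consider objects from best to worst; commit to each one unless
    it conflicts with an already-committed object."""
    committed = []
    for obj in sorted(objects, key=value, reverse=True):
        if any((obj, c) in conflicts or (c, obj) in conflicts
               for c in committed):
            continue  # reject: conflicts with a committed object
        committed.append(obj)
    return committed

prizes = {"lion": 9, "elephant": 8, "piano": 5}
conflicts = {("lion", "elephant")}
print(iterative_greedy(list(prizes), prizes.get, conflicts))  # ['lion', 'piano']
```

Note that once the lion is committed, the elephant is rejected with no backtracking.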

The Game Show Example
Problem: Choose the best m prizes.
Recursive Greedy Algorithm: Makes a greedy first choice and then recurses (see Recursive Backtracking Algorithms).

Making Change Example Problem: Find the minimum # of quarters, dimes, nickels, and pennies that total to a given amount.

Making Change Example
Instances: A set of objects and a relationship between them. Coins: 25¢, 10¢, 5¢, 1¢; Amount = 92¢. Some subsets are not allowed because some objects conflict.
Solutions for Instance: A subset of the coins that total the amount.

Making Change Example
Instances: A set of objects and a relationship between them. Coins: 25¢, 10¢, 5¢, 1¢; Amount = 92¢.
Solutions for Instance: A subset of the coins that total the amount.
Cost of Solution: The number of objects in the solution, or the sum of the costs of the objects. Here: 14.
Goal: Find an optimal non-conflicting solution.

Making Change Example
Coins: 25¢, 10¢, 5¢, 1¢; Amount = 92¢.
Greedy Choice: Start by grabbing quarters until the next one would exceed the amount, then dimes, then nickels, then pennies.
Cost of Solution: 7.
Does this lead to an optimal # of coins?
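The quarters-first rule is easy to code (a minimal sketch):

```python
def greedy_change(amount, denominations=(25, 10, 5, 1)):
    """Repeatedly grab the largest coin that still fits."""
    coins = []
    for d in denominations:          # best (largest) coin first
        while amount >= d:
            coins.append(d)
            amount -= d
    return coins

print(greedy_change(92))  # [25, 25, 25, 10, 5, 1, 1] -- 7 coins
```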

Hard Making Change Example
Problem: Find the minimum # of 4¢, 3¢, and 1¢ coins to make up 6¢.
Greedy Choice: Start by grabbing a 4¢ coin.
Consequences: 4+1+1 = 6 uses three coins (a mistake); 3+3 = 6 uses two (better). The greedy algorithm does not work!
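A short check (greedy versus an exact dynamic-programming baseline, both sketches of my own) confirms the failure on the 4¢/3¢/1¢ system:

```python
def greedy_coins(amount, denominations):
    """Greedy: repeatedly grab the largest coin that fits.
    Assumes denominations are sorted largest first."""
    count = 0
    for d in denominations:
        count += amount // d
        amount %= d
    return count

def min_coins(amount, denominations):
    """Exact minimum number of coins, by dynamic programming
    over all sub-amounts (a 1-cent coin keeps every amount reachable)."""
    best = [0] + [None] * amount
    for a in range(1, amount + 1):
        best[a] = min(best[a - d] + 1
                      for d in denominations
                      if d <= a and best[a - d] is not None)
    return best[amount]

print(greedy_coins(6, (4, 3, 1)))  # 3  (4 + 1 + 1 -- the greedy mistake)
print(min_coins(6, (4, 3, 1)))     # 2  (3 + 3)
```

For the US denominations the two agree (both give 7 coins for 92¢); for {4, 3, 1} they do not.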

When Does It Work?
Greedy algorithms are easy to understand and to code, but do they work? For most optimization problems, all greedy algorithms tried do not work. A few problems do have a correct greedy algorithm; the proof that it works, however, is subtle. As with all iterative algorithms, we use loop invariants.

Designing an Algorithm
Define Problem, Define Loop Invariants, Define Measure of Progress, Define Step, Define Exit Condition, Maintain Loop Invariant, Make Progress, Initial Conditions, Ending.
[Diagram: a road trip, 79 km to school, passing 79 km, 75 km, ..., down to 0 km, then Exit.]

Define Step: The algorithm chooses the "best" object from among those not considered so far and either commits to it or rejects it.
Make Progress: Another object is considered.
Exit Condition: Exit when all objects have been considered.

First Choice
"The" optimal solution contains the best object? There may be more than one optimal solution, and all might not contain the chosen object. Instead: at least one optimal solution contains the best object. It is OK to burn a few of your bridges as long as you do not burn all of them. We do not go "wrong" by committing to the best object.

Loop Invariant: We have not gone wrong. There is at least one optimal solution consistent with the choices made so far.
Establishing the Loop Invariant (code A): Initially, no choices have been made, and hence all optimal solutions are consistent with these choices.

Maintaining the Loop Invariant: ⟨LI & ¬exit⟩ code B ⟨LI⟩
∃ an optimal solution consistent with the choices made; let optS_LI denote one. Code B commits to or rejects the next best object. The proof massages optS_LI into optS_ours and proves that optS_ours: is a valid solution; is consistent with both the previous and new choices; and is optimal. Hence ∃ an optimal solution consistent with all the choices.

Three Players
Algorithm: commits to or rejects the next best object.
Prover: proves the LI is maintained; his actions are not part of the algorithm.
Fairy Godmother: holds the hypothetical optimal solution optS_LI; she gives no feedback. This emphasizes that the algorithm and the prover do not know optS_LI.

Three Players
Fairy Godmother: does she exist? The prover would find his conversations equally supportive even if she did not.

Massaging optS_LI into optS_ours (coins: 25¢, 10¢, 5¢, 1¢; amount = 92¢)
Fairy Godmother: I hold optS_LI, witnessing that there is an optimal solution consistent with the previous choices. (Different 25¢ coins are considered to be different.)
Algorithm: I have committed to these coins. I commit to keeping another 25¢.
Prover: I instruct how to massage optS_LI into optS_ours so that it is consistent with the previous and new choices. I then hold optS_ours, witnessing that there is an optimal solution consistent with the previous and new choices.

As Time Goes On
Algorithm: I keep making more choices.
Fairy Godmother: I always hold an optimal solution optS_LI, but which one keeps changing.
Prover: I know that her optS_LI is consistent with these choices. Hence, I know more and more of optS_LI. In the end, I know it all.

Massaging optS_LI into optS_ours (coins: 25¢, 10¢, 5¢, 1¢; amount = 92¢)
Prover: I will now instruct how to massage optS_LI into optS_ours so that it is consistent with the previous and new choices.

Massaging optS_LI into optS_ours
If it happens that what you (Fairy Godmother) hold is already consistent with the new choice, then we are done.

Massaging optS_LI into optS_ours
The Algorithm has 92¢ − 50¢ = 42¢ unchosen. Since 42¢ ≥ 25¢, the Fairy Godmother must also have ≥ 25¢ that I don't know about. There are different cases.

Massaging optS_LI into optS_ours (coins: 25¢, 10¢, 5¢, 1¢; amount = 92¢)
In optS_LI, replace:
a different 25¢ with the Alg's 25¢;
3 × 10¢ with the Alg's 25¢ + 5¢ (oops, such an optS_LI is not actually optimal, but we must consider all cases);
2 × 10¢ + 1 × 5¢ with the Alg's 25¢;
1 × 10¢ + 3 × 5¢ with the Alg's 25¢;
?? + 5 × 1¢ with the Alg's 25¢.

Massaging optS_LI into optS_ours: Done.
She now has optS_ours; we must prove that it is what we want.

optS_ours is valid: optS_LI was valid, each replacement leaves the total unchanged, and we introduced no new conflicts.

optS_ours is consistent: optS_LI was consistent with the previous choices, and we made it consistent with the new one.

optS_ours is optimal: optS_LI was optimal, and optS_ours's cost (# of coins) is not bigger. (Note that we do not even know the cost of an optimal solution.)

Maintaining the Loop Invariant, Case 1: optS_ours is valid, consistent, and optimal. ⟨LI & ¬exit⟩ code B ⟨LI⟩.

Massaging optS_LI into optS_ours, Case 2 (coins: 25¢, 10¢, 5¢, 1¢; amount = 92¢)
Algorithm: I reject the next 25¢.
Prover: I hold optS_LI, witnessing that there is an optimal solution consistent with the previous choices. I must make sure that what the Fairy Godmother holds is consistent with this new choice.

Massaging optS_LI into optS_ours
The Algorithm has 92¢ − 75¢ = 17¢ < 25¢ unchosen, so the Fairy Godmother must have < 25¢ that I don't know about. Hence optS_LI does not contain the 25¢ either.

Maintaining the Loop Invariant, Case 2: optS_ours = optS_LI. ⟨LI & ¬exit⟩ code B ⟨LI⟩.

As Time Goes On
Algorithm: I keep making more choices.
Fairy Godmother: I always hold an optimal solution optS_LI, but which one keeps changing.
Prover: I know that her optS_LI is consistent with these choices. Hence, I know more and more of optS_LI. In the end, I know it all.

Clean Up Loose Ends: ⟨LI & exit⟩ code C ⟨post⟩
The Alg has committed to or rejected each object, giving a solution S. Since ∃ an optimal solution consistent with these choices, S must be optimal. The Alg returns S.

Designing an Algorithm
Define Problem, Define Loop Invariants, Define Measure of Progress, Define Step, Define Exit Condition, Maintain Loop Invariant, Make Progress, Initial Conditions, Ending.
[Diagram: a road trip, 79 km to school, passing 79 km, 75 km, ..., down to 0 km, then Exit.]

Making Change Example
Problem: Find the minimum # of quarters, dimes, nickels, and pennies that total to a given amount.
Greedy Choice: Start by grabbing quarters until the next one would exceed the amount, then dimes, then nickels, then pennies.
Does this lead to an optimal # of coins? Yes.

Hard Making Change Example
Problem: Find the minimum # of 4¢, 3¢, and 1¢ coins to make up 6¢.
Greedy Choice: Start by grabbing a 4¢ coin.

Massaging optS_LI into optS_ours (coins: 4¢, 3¢, 1¢; amount = 6¢)
Algorithm: I commit to keeping a 4¢.
Prover: I will now instruct how to massage optS_LI into optS_ours so that it is consistent with the previous and new choice.
Fairy Godmother: I hold optS_LI. Oops! (optS_LI = 3¢ + 3¢ contains no 4¢, and it cannot be massaged to include one without using more coins.)

Hard Making Change Example
Problem: Find the minimum # of 4¢, 3¢, and 1¢ coins to make up 6¢.
Greedy Choice: Start by grabbing a 4¢ coin.
Consequences: 4+1+1 = 6 uses three coins (a mistake); 3+3 = 6 uses two (better). The greedy algorithm does not work!

Running Time Greedy algorithms are very fast because they take only a small amount of time per object in the instance.

The Job/Event Scheduling Problem
Ingredients:
Instances: Events with starting and finishing times ⟨⟨s_1, f_1⟩, ⟨s_2, f_2⟩, …, ⟨s_n, f_n⟩⟩.
Solutions: A set of events that do not overlap.
Cost of Solution: The number of events scheduled.
Goal: Given a set of events, schedule as many as possible.

Possible Criteria for Defining "Best"
Greedy Criteria: The Shortest Event.
Motivation: Does not book the room for a long period of time.
Counterexample: [diagram comparing the greedy schedule with an optimal schedule].

Possible Criteria for Defining "Best"
Greedy Criteria: The Earliest Starting Time.
Motivation: A common scheduling algorithm.
Counterexample: [diagram comparing the greedy schedule with an optimal schedule].

Possible Criteria for Defining "Best"
Greedy Criteria: Conflicting with the Fewest Other Events.
Motivation: So many others can still be scheduled.
Counterexample: [diagram comparing the greedy schedule with an optimal schedule].

Possible Criteria for Defining "Best"
Greedy Criteria: The Earliest Finishing Time.
Motivation: Schedule the event that will free up your room for someone else as soon as possible.
This one works!
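The earliest-finishing-time rule can be sketched as follows (a minimal sketch; representing events as (start, finish) pairs is my own assumption). Note that the conflict check needs to compare only against the last committed event:

```python
def schedule(events):
    """Greedy by earliest finishing time: commit to an event iff it
    starts no earlier than the finish of the last committed event."""
    committed = []
    for start, finish in sorted(events, key=lambda e: e[1]):
        if not committed or start >= committed[-1][1]:
            committed.append((start, finish))
    return committed

events = [(1, 4), (3, 5), (0, 6), (5, 7), (3, 9), (5, 9),
          (6, 10), (8, 11), (8, 12), (2, 14), (12, 16)]
print(schedule(events))  # [(1, 4), (5, 7), (8, 11), (12, 16)]
```

On this sample instance of eleven events, the greedy schedule books four.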

The Greedy Algorithm

Designing an Algorithm
Define Problem, Define Loop Invariants, Define Measure of Progress, Define Step, Define Exit Condition, Maintain Loop Invariant, Make Progress, Initial Conditions, Ending.
[Diagram: a road trip, 79 km to school, passing 79 km, 75 km, ..., down to 0 km, then Exit.]

Loop Invariant: We have not gone wrong. There is at least one optimal solution consistent with the choices made so far.
Establishing the Loop Invariant (code A): Initially, no choices have been made, and hence all optimal solutions are consistent with these choices.

Maintaining the Loop Invariant: ⟨LI & ¬exit⟩ code B ⟨LI⟩
∃ an optimal solution consistent with the choices made; let optS_LI denote one. Code B commits to or rejects the next best object. The proof massages optS_LI into optS_ours and proves that optS_ours: is a valid solution; is consistent with both the previous and new choices; and is optimal. Hence ∃ an optimal solution consistent with these choices.

Massaging optS_LI into optS_ours, Case 1
Algorithm: I commit to keeping the next event i.
Fairy Godmother: I hold optS_LI, witnessing that there is an optimal solution consistent with the previous choices.
Prover: I instruct how to massage optS_LI into optS_ours so that it is consistent with the previous and new choice.

Massaging optS_LI into optS_ours
Start by adding the new event i. Then delete the events that conflict with event i.

optS_ours is valid: optS_LI was valid, and we removed any new conflicts.

optS_ours is consistent: optS_LI was consistent, and we added event i. Events in Commit don't conflict with event i and hence were not deleted.

optS_ours is optimal: optS_LI was optimal; if we delete at most one event in exchange for adding event i, then optS_ours is optimal too.

Deleted at most one event: Suppose event j in optS_LI conflicts with event i, i.e. s_j ≤ f_i. Because i was chosen first, f_i ≤ f_j. Hence j is running at time f_i. Any two such events j and j′ conflict with each other, so optS_LI contains only one of them.

Maintaining the Loop Invariant, Case 1: optS_ours is valid, consistent, and optimal. ⟨LI & ¬exit⟩ code B ⟨LI⟩.

Massaging optS_LI into optS_ours, Case 2
Algorithm: I reject the next event i.
Fairy Godmother: I hold optS_LI, witnessing that there is an optimal solution consistent with the previous choices.
Prover: Event i conflicts with the committed-to events, so it can't be in optS_LI either.

Maintaining the Loop Invariant, Case 2: optS_ours = optS_LI. ⟨LI & ¬exit⟩ code B ⟨LI⟩.

Clean Up Loose Ends: ⟨LI & exit⟩ code C ⟨post⟩
The Alg has committed to or rejected each event, giving a solution S. Since ∃ an optimal solution consistent with these choices, S must be optimal. The Alg returns S.

Designing an Algorithm
Define Problem, Define Loop Invariants, Define Measure of Progress, Define Step, Define Exit Condition, Maintain Loop Invariant, Make Progress, Initial Conditions, Ending.
[Diagram: a road trip, 79 km to school, passing 79 km, 75 km, ..., down to 0 km, then Exit.]

Running Time
Greedy algorithms are very fast because they take only a small amount of time per object in the instance. Checking whether the next event i conflicts with the previously committed events requires comparing it only with the last such event.

Fixed vs Adaptive Priority
Fixed Priority: Sort the objects from best to worst and loop through them.
Adaptive Priority: The greedy criteria depends on which objects have been committed to so far. At each step, the next "best" object is chosen according to the current greedy criteria. Searching or re-sorting takes too much time; use a priority queue.

Fixed vs Adaptive Priority

Example Dijkstra's shortest weighted path algorithm can be considered to be a greedy algorithm with an adaptive priority criteria.
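A minimal sketch of Dijkstra's algorithm using Python's heapq (the adjacency-list representation below is my own assumption) shows the adaptive priority at work: a node's tentative distance, i.e. its priority, keeps changing as the neighbours of committed nodes are relaxed:

```python
import heapq

def dijkstra(graph, source):
    """graph: {node: [(neighbour, edge_weight), ...]}.
    Greedily commits to the un-committed node with the smallest
    tentative distance; each commitment updates (adapts) the
    priorities of that node's neighbours via the heap."""
    dist = {source: 0}
    committed = set()
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if u in committed:
            continue  # stale entry left over from an earlier priority
        committed.add(u)
        for v, w in graph.get(u, []):
            if v not in committed and d + w < dist.get(v, float("inf")):
                dist[v] = d + w                  # v's priority adapts
                heapq.heappush(heap, (dist[v], v))
    return dist

graph = {"a": [("b", 1), ("c", 4)], "b": [("c", 2)], "c": []}
print(dijkstra(graph, "a"))  # {'a': 0, 'b': 1, 'c': 3}
```

Here c's priority starts at 4 (via the direct edge) and adapts to 3 once b is committed, exactly the behaviour a fixed, pre-sorted priority could not capture.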