Analysis of Algorithms CS 477/677 Instructor: Monica Nicolescu Lecture 17.



Greedy Algorithms
- Similar to dynamic programming, but a simpler approach; also used for optimization problems.
- Idea: when we have a choice to make, make the one that looks best right now, i.e., make a locally optimal choice in the hope of obtaining a globally optimal solution.
- Greedy algorithms do not always yield an optimal solution; when the problem has certain general characteristics, however, they do give optimal solutions.

Activity Selection
- Problem: schedule the largest possible set of non-overlapping activities for a given room.

  #   Start     End       Activity
  1   8:00am    9:15am    Numerical methods class
  2   8:30am    10:30am   Movie presentation (refreshments served)
  3   9:20am    11:00am   Data structures class
  4   10:00am   noon      Programming club mtg. (pizza provided)
  5   11:30am   1:00pm    Computer graphics class
  6   1:05pm    2:15pm    Analysis of algorithms class
  7   2:30pm    3:00pm    Computer security class
  8   noon      4:00pm    Computer games contest (refreshments served)
  9   4:00pm    5:30pm    Operating systems class

Activity Selection
- Schedule n activities that require exclusive use of a common resource.
- S = {a_1, ..., a_n} is the set of activities; activity a_i needs the resource during the period [s_i, f_i), where s_i is its start time and f_i its finish time, with 0 ≤ s_i < f_i < ∞.
- Activities a_i and a_j are compatible if the intervals [s_i, f_i) and [s_j, f_j) do not overlap, i.e., f_j ≤ s_i or f_i ≤ s_j.
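As a quick sanity check, the compatibility test on half-open intervals can be written in a few lines of Python (an illustrative sketch, not part of the original slides):

```python
def compatible(a, b):
    """Activities as half-open intervals [s, f): compatible iff they do not overlap."""
    (s_i, f_i), (s_j, f_j) = a, b
    return f_i <= s_j or f_j <= s_i

# Touching endpoints do not overlap, because the intervals are half-open:
assert compatible((8.0, 9.25), (9.25, 11.0))
assert not compatible((8.0, 10.5), (9.33, 11.0))
```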

Activity Selection Problem
- Select the largest possible set of non-overlapping (compatible) activities.
- Example (activities sorted in increasing order of finish times; the slide lists each index i with its start time s_i and finish time f_i):
  - a subset of mutually compatible activities: {a_3, a_9, a_11}
  - maximum-size sets of mutually compatible activities: {a_1, a_4, a_8, a_11} and {a_2, a_4, a_9, a_11}

Optimal Substructure
- Define the space of subproblems: S_ij = { a_k ∈ S : f_i ≤ s_k < f_k ≤ s_j }, the activities that start after a_i finishes and finish before a_j starts.
- Add fictitious activities a_0 = [-∞, 0) and a_{n+1} = [∞, "∞ + 1"), so that the range for S_ij is 0 ≤ i, j ≤ n + 1.
- S = S_{0,n+1} is then the entire space of activities.

Representing the Problem
- We assume that activities are sorted in increasing order of finish times: f_0 ≤ f_1 ≤ f_2 ≤ ... ≤ f_n < f_{n+1}.
- What happens to the set S_ij for i ≥ j? For any activity a_k ∈ S_ij we would have f_i ≤ s_k < f_k ≤ s_j < f_j, i.e., f_i < f_j, a contradiction with f_i ≥ f_j. Therefore S_ij = ∅ (the set must be empty!).
- We only need to consider sets S_ij with 0 ≤ i < j ≤ n + 1.

Optimal Substructure
- Subproblem: select a maximum-size subset of mutually compatible activities from the set S_ij.
- Assume that a solution to this subproblem includes activity a_k (so S_ij is non-empty). Then:
  Solution to S_ij = (Solution to S_ik) ∪ {a_k} ∪ (Solution to S_kj)
  |Solution to S_ij| = |Solution to S_ik| + 1 + |Solution to S_kj|

Optimal Substructure
- Let A_ij be an optimal solution to S_ij, with A_ij = A_ik ∪ {a_k} ∪ A_kj.
- Claim: the sets A_ik and A_kj must be optimal solutions to S_ik and S_kj.
- Assume there exists a set A_ik' that includes more activities than A_ik. Then
  Size[A_ij'] = Size[A_ik'] + 1 + Size[A_kj] > Size[A_ij],
  a contradiction: we assumed that A_ij has the maximum number of activities taken from S_ij.

Recursive Solution
- Any optimal solution (associated with a set S_ij) contains within it optimal solutions to the subproblems S_ik and S_kj.
- Let c[i, j] = size of a maximum-size subset of mutually compatible activities in S_ij.
- If S_ij = ∅, then c[i, j] = 0.

Recursive Solution
- If S_ij ≠ ∅ and we consider that a_k is used in an optimal solution (a maximum-size subset of mutually compatible activities of S_ij), then:
  c[i, j] = c[i, k] + c[k, j] + 1

Recursive Solution

  c[i, j] = 0                                        if S_ij = ∅
  c[i, j] = max { c[i, k] + c[k, j] + 1 }            if S_ij ≠ ∅
            i < k < j, a_k ∈ S_ij

- There are j - i - 1 possible values for k: k = i+1, ..., j-1, and a_k can be neither a_i nor a_j (from the definition S_ij = { a_k ∈ S : f_i ≤ s_k < f_k ≤ s_j }).
- We check all these values and take the best one.
- We could now write a dynamic programming algorithm.
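For illustration, the recurrence can be implemented directly as a bottom-up dynamic program. This is a sketch with assumed list-based inputs; the slides themselves stop at the recurrence:

```python
import math

def dp_activity_selection(s, f):
    """Bottom-up DP for the recurrence c[i][j] = max_k (c[i][k] + c[k][j] + 1).

    s, f: start and finish times of activities 1..n, already sorted by
    finish time. Returns the size of a maximum-size subset of mutually
    compatible activities.
    """
    n = len(s)
    # Add the fictitious activities a_0 = [-inf, 0) and a_{n+1} = [inf, inf):
    s = [-math.inf] + list(s) + [math.inf]
    f = [0] + list(f) + [math.inf]
    c = [[0] * (n + 2) for _ in range(n + 2)]
    for span in range(2, n + 2):                 # j - i = 2 .. n + 1
        for i in range(0, n + 2 - span):
            j = i + span
            for k in range(i + 1, j):
                # a_k belongs to S_ij iff f_i <= s_k and f_k <= s_j
                if f[i] <= s[k] and f[k] <= s[j]:
                    c[i][j] = max(c[i][j], c[i][k] + c[k][j] + 1)
    return c[0][n + 1]
```

Note the cost: three nested loops give O(n^3) time, which is exactly what the theorem on the next slides lets the greedy algorithm avoid.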

Theorem
Let S_ij ≠ ∅ and let a_m be the activity in S_ij with the earliest finish time: f_m = min { f_k : a_k ∈ S_ij }. Then:
1. a_m is used in some maximum-size subset of mutually compatible activities of S_ij, i.e., there exists some optimal solution that contains a_m.
2. S_im = ∅, so choosing a_m leaves S_mj as the only nonempty subproblem.

Proof (part 2)
- Assume there exists a_k ∈ S_im. Then f_i ≤ s_k < f_k ≤ s_m < f_m, so f_k < f_m: a contradiction, since a_m has the earliest finish time in S_ij.
- Therefore there is no a_k ∈ S_im, i.e., S_im = ∅.

Proof (part 1): the Greedy Choice Property
- Claim: a_m is used in some maximum-size subset of mutually compatible activities of S_ij.
- Let A_ij be an optimal solution for activity selection from S_ij; order the activities in A_ij in increasing order of finish time, and let a_k be the first activity in A_ij = {a_k, ...}.
- If a_k = a_m, we are done!
- Otherwise, replace a_k with a_m (resulting in a set A_ij'):
  - since f_m ≤ f_k, the activities in A_ij' remain compatible;
  - A_ij' has the same size as A_ij.
- Therefore a_m is used in some maximum-size subset of S_ij.

Why Is the Theorem Useful?
Making the greedy choice (the activity with the earliest finish time in S_ij):
- reduces the number of subproblems and choices;
- allows solving each subproblem in a top-down fashion (only one subproblem left to solve!).

                                      Dynamic programming         Using the theorem
  Number of subproblems in            2 subproblems: S_ik, S_kj   1 subproblem: S_mj (S_im = ∅)
  the optimal solution
  Number of choices to consider       j - i - 1 choices           1 choice: the activity a_m with the
                                                                  earliest finish time in S_ij

Greedy Approach
To select a maximum-size subset of mutually compatible activities from set S_ij:
- Choose a_m ∈ S_ij with the earliest finish time (the greedy choice).
- Add a_m to the set of activities used in the optimal solution.
- Solve the same problem for the set S_mj.
From the theorem:
- By choosing a_m we are guaranteed to have used an activity included in an optimal solution, so we do not need to solve the subproblem S_mj before making the choice!
- The problem has the GREEDY CHOICE property.

Characterizing the Subproblems
- The original problem: find the maximum subset of mutually compatible activities for S = S_{0,n+1}.
- Activities are sorted by increasing finish time: a_0, a_1, a_2, a_3, ..., a_{n+1}.
- We always choose the activity with the earliest finish time:
  - the greedy choice maximizes the unscheduled time remaining;
  - the finish times of the selected activities are strictly increasing.

A Recursive Greedy Algorithm

Alg.: REC-ACT-SEL(s, f, i, n)
1. m ← i + 1
2. while m ≤ n and s_m < f_i        ► find the first activity in S_{i,n+1}
3.     do m ← m + 1
4. if m ≤ n
5.     then return {a_m} ∪ REC-ACT-SEL(s, f, m, n)
6.     else return ∅

- Activities are ordered in increasing order of finish time.
- Initial call: REC-ACT-SEL(s, f, 0, n).
- Running time: Θ(n), since each activity is examined only once.
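A runnable Python version of REC-ACT-SEL might look as follows (a sketch: the 1-indexed lists include the fictitious activity a_0 with f_0 = 0 so that the initial call works):

```python
def rec_act_sel(s, f, i, n):
    """Recursive greedy activity selection; returns a set of selected indices.

    s, f: start/finish times, 1-indexed and sorted by increasing finish time;
    index 0 holds the fictitious activity a_0 with f[0] = 0 (s[0] is unused).
    """
    m = i + 1
    while m <= n and s[m] < f[i]:          # find the first activity in S_{i,n+1}
        m += 1
    if m <= n:
        return {m} | rec_act_sel(s, f, m, n)
    return set()

# Example data (index 0 is the fictitious a_0):
s = [None, 1, 3, 0, 5, 3, 5, 6, 8, 8, 2, 12]
f = [0,    4, 5, 6, 7, 9, 9, 10, 11, 12, 14, 16]
```

On this data, rec_act_sel(s, f, 0, 11) selects the indices {1, 4, 8, 11}.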

Example
(The slide shows a figure tracing REC-ACT-SEL on the activities a_0, a_1, ..., a_11 sorted by finish time, with the fictitious activity a_0. The successive greedy choices are m = 1, m = 4, m = 8, and m = 11, yielding the selection {a_1, a_4, a_8, a_11}.)

An Incremental Algorithm

Alg.: GREEDY-ACTIVITY-SELECTOR(s, f, n)
1. A ← {a_1}
2. i ← 1
3. for m ← 2 to n
4.     do if s_m ≥ f_i              ► activity a_m is compatible with a_i
5.         then A ← A ∪ {a_m}
6.              i ← m               ► a_i is the most recent addition to A
7. return A

- Assumes that activities are ordered in increasing order of finish time.
- Running time: Θ(n), since each activity is examined only once.
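In Python, the iterative version can be sketched over (start, finish) pairs (an assumed representation; the slide works with separate arrays s and f):

```python
def greedy_activity_selector(activities):
    """Iterative greedy activity selection.

    activities: list of (start, finish) pairs sorted by increasing finish
    time. Returns the selected pairs.
    """
    if not activities:
        return []
    selected = [activities[0]]           # a_1: the first activity to finish
    last_finish = activities[0][1]
    for start, finish in activities[1:]:
        if start >= last_finish:         # compatible with the latest selection
            selected.append((start, finish))
            last_finish = finish
    return selected
```

A single pass over the sorted activities keeps the Θ(n) running time, and tracking only the finish time of the most recent selection replaces the index variable i of the pseudocode.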

Steps Toward Our Greedy Solution
1. Determined the optimal substructure of the problem.
2. Developed a recursive solution.
3. Proved that one of the optimal choices is the greedy choice.
4. Showed that all but one of the subproblems that result from making the greedy choice are empty.
5. Developed a recursive algorithm that implements the greedy strategy.
6. Converted the recursive algorithm to an iterative one.

Designing Greedy Algorithms
1. Cast the optimization problem as one in which we make a (greedy) choice and are left with only one subproblem to solve.
2. Prove the GREEDY CHOICE property: there is always an optimal solution to the original problem that makes the greedy choice.
3. Prove the OPTIMAL SUBSTRUCTURE property: the greedy choice plus an optimal solution to the resulting subproblem leads to an optimal solution to the original problem.

Correctness of Greedy Algorithms
1. Greedy Choice Property: a globally optimal solution can be arrived at by making a locally optimal (greedy) choice.
2. Optimal Substructure Property: we know that we have arrived at a subproblem by making a greedy choice, and an optimal solution to the subproblem combined with the greedy choice yields an optimal solution for the original problem.

Dynamic Programming vs. Greedy Algorithms
- Dynamic programming:
  - we make a choice at each step;
  - the choice depends on solutions to subproblems;
  - bottom-up solution, from smaller to larger subproblems.
- Greedy algorithms:
  - make the greedy choice and THEN solve the subproblem arising after the choice is made;
  - the choice we make may depend on previous choices, but not on solutions to subproblems;
  - top-down solution, with problems decreasing in size.

The Knapsack Problem
- The 0-1 knapsack problem:
  - A thief robbing a store finds n items: the i-th item is worth v_i dollars and weighs w_i pounds (v_i, w_i integers).
  - The thief can only carry W pounds in his knapsack.
  - Items must be taken entirely or left behind.
  - Which items should the thief take to maximize the value of his load?
- The fractional knapsack problem:
  - Similar to the above, but the thief can take fractions of items.
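For contrast with the greedy treatment of the fractional variant below, the 0-1 variant is usually solved by dynamic programming rather than by a greedy choice; a standard sketch (not part of this lecture's slides) is:

```python
def knapsack_01(values, weights, W):
    """Classic 0-1 knapsack DP: best[c] = max value achievable with capacity c."""
    best = [0] * (W + 1)
    for v, w in zip(values, weights):
        # Iterate capacities downward so each item is used at most once.
        for c in range(W, w - 1, -1):
            best[c] = max(best[c], best[c - w] + v)
    return best[W]
```

For example, knapsack_01([60, 100, 120], [10, 20, 30], 50) returns 220: with indivisible items the best load is items 2 and 3, strictly less than the $240 achievable when fractions are allowed.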

Fractional Knapsack Problem
- Knapsack capacity: W.
- There are n items: the i-th item has value v_i and weight w_i.
- Goal: find fractions x_i, with 0 ≤ x_i ≤ 1 for i = 1, 2, ..., n, such that Σ w_i x_i ≤ W and Σ x_i v_i is maximum.

Fractional Knapsack Problem
- Greedy strategy 1: pick the item with the maximum value.
- E.g.: W = 1, w_1 = 100, v_1 = 2, w_2 = 1, v_2 = 1.
  - Taking from the item with the maximum value: total value (choose item 1) = v_1 · W/w_1 = 2/100.
  - This is smaller than what the thief can take by choosing the other item: total value (choose item 2) = v_2 · W/w_2 = 1.

Fractional Knapsack Problem
- Greedy strategy 2: pick the item with the maximum value per pound v_i/w_i.
- If the supply of that item is exhausted and the thief can carry more, take as much as possible from the item with the next greatest value per pound.
- It is therefore good to order the items by their value per pound.

Fractional Knapsack - Example
- W = 50

  Item 1: $60,  10 pounds  ($6/pound)
  Item 2: $100, 20 pounds  ($5/pound)
  Item 3: $120, 30 pounds  ($4/pound)

- Greedy by value per pound: take all of item 1 ($60) and all of item 2 ($100), then 20 of the 30 pounds of item 3 ($80), for a total of $60 + $100 + $80 = $240.
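The value-per-pound strategy can be sketched in Python (illustrative code, using the item data from the example above):

```python
def fractional_knapsack(items, W):
    """Greedy fractional knapsack.

    items: list of (value, weight) pairs; W: knapsack capacity.
    Returns (total_value, fractions), where fractions[i] is the fraction
    of item i that is taken.
    """
    # Consider items in order of decreasing value per pound.
    order = sorted(range(len(items)),
                   key=lambda i: items[i][0] / items[i][1], reverse=True)
    fractions = [0.0] * len(items)
    total_value, remaining = 0.0, W
    for i in order:
        if remaining <= 0:
            break
        v, w = items[i]
        take = min(w, remaining)         # take as much of the best item as fits
        fractions[i] = take / w
        total_value += v * take / w
        remaining -= take
    return total_value, fractions

# Example data from the slide: W = 50, items given as (value, weight):
items = [(60, 10), (100, 20), (120, 30)]
```

Here fractional_knapsack(items, 50) takes all of items 1 and 2 and 2/3 of item 3, for a total value of 240.0, matching the example.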

Readings
- Chapter 15