Week 2: Greedy Algorithms


Week 2: Greedy Algorithms — CS4335 Design and Analysis of Algorithms, WANG Lusheng (2019/5/14)

Greedy Algorithm: a technique for solving problems that always makes the best choice at the moment (a locally optimal choice). The hope is that a series of locally optimal choices leads to a globally optimal solution. Greedy algorithms yield optimal solutions for many (but not all) problems.


Knapsack problem. Input: n items (w1, v1), …, (wn, vn) and a number W; the i-th item is worth vi dollars and weighs wi pounds; at most W pounds can be carried. Output: a most valuable subset of the items with total weight at most W. Two versions: 0-1 knapsack — items cannot be divided into pieces; fractional knapsack — items can be divided into pieces.

Fractional Knapsack problem

Fractional Knapsack problem. How many units to take? 15 − 4 − 2 − 1 − 1 = 7.

Another Example (W = 20 kg):

Item:        a       b      c      d       e      f
Weight:      11 kg   3 kg   4 kg   58 kg   8 kg   88 kg
Value:       $3      $6     $35    $8      $28    $66
Unit price:  $0.27   $2.00  $8.75  $0.14   $3.50  $0.75

First, compute the unit prices. Sort the items by unit price, then take items until the bucket is full: (c, 4 kg), (e, 8 kg), (b, 3 kg), (f, 5 kg).

Fractional Knapsack Algorithm. Greedy on the unit price vi/wi: each time, take the remaining item with maximum vi/wi. If the item would exceed W, take a fraction of it.
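The algorithm above can be sketched in Python (a minimal illustration; the `(weight, value)` item format is an assumption, not from the slides):

```python
def fractional_knapsack(items, W):
    """items: list of (weight, value) pairs; W: capacity.
    Greedy on unit price v/w; takes a fraction of the last item if needed.
    Returns the maximum total value carried."""
    total = 0.0
    # Consider items in decreasing order of unit price v/w.
    for w, v in sorted(items, key=lambda wv: wv[1] / wv[0], reverse=True):
        if W <= 0:
            break
        take = min(w, W)           # whole item, or the fraction that fits
        total += v * (take / w)
        W -= take
    return total

# The slide's example (items a-f, W = 20 kg): take c, e, b whole,
# then 5 kg of f, for 35 + 28 + 6 + 66*(5/88) = 72.75 dollars.
items = [(11, 3), (3, 6), (4, 35), (58, 8), (8, 28), (88, 66)]
print(fractional_knapsack(items, 20))  # 72.75
```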

Proof (correctness, the hard part). Let X = {i1, i2, …, ik} be an optimal set of items taken. We need only prove that our solution is as good as X. Consider the item with the highest unit price, say item j = (vj, wj). If j is not fully used in X, discard items (or fractions) of total weight wj from X and add item j; the total value does not decrease, since j has the highest unit price. So we may assume item j is in X. Repeating this exchange transforms X into the items selected by greedy without decreasing the total value taken.

The 0-1 knapsack problem cannot be solved optimally by this greedy approach. Counterexample (the moderate part): three items a: ($10, 10 kg), b: ($10, 10 kg), c: ($12, 11 kg), with W = 20 kg. Greedy takes c (highest unit price), for total value $12. The optimal solution takes a and b, for total value $20.
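The counterexample is easy to check directly; a sketch running the same unit-price greedy without fractions (the `(weight, value)` format is assumed):

```python
def greedy_01(items, W):
    """Unit-price greedy for 0-1 knapsack (no fractions) -- NOT optimal."""
    total = 0
    for w, v in sorted(items, key=lambda wv: wv[1] / wv[0], reverse=True):
        if w <= W:        # take the whole item only if it fits
            total += v
            W -= w
    return total

# Slide's counterexample: a ($10, 10 kg), b ($10, 10 kg), c ($12, 11 kg).
items = [(10, 10), (10, 10), (11, 12)]
print(greedy_01(items, 20))   # 12: greedy takes only c; optimal {a, b} = 20
```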

Interval Scheduling

Interval Scheduling. Job j starts at sj and finishes at fj. Two jobs are compatible if they don't overlap. Goal: find a maximum subset of mutually compatible jobs. (Activity selection = interval scheduling.) In the example timeline (jobs a–h over times 1–11), OPT = {B, E, H}. Note: the smallest job (C) is not in any optimal solution, and the job that starts first (A) is not in any optimal solution.

Ideas for Interval Scheduling. What are possible rules for picking the next interval? (1) Choose the interval that starts earliest — rationale: start using the resource as soon as possible. (2) Choose the smallest interval — rationale: try to fit in lots of small jobs. (3) Choose the interval that overlaps with the fewest remaining intervals — rationale: keep our options open and eliminate as few intervals as possible.

Rules That Don't Work: each of the three rules above fails on a counterexample (figures omitted).


Optimal Greedy Algorithm. Choose the interval with the earliest finishing time. Rationale: ensure we have as much of the resource left as possible.

Example. Jobs (s, f): (0, 10), (3, 4), (2, 8), (1, 5), (4, 5), (4, 8), (5, 6), (7, 9). Sorting by fi: (3, 4), (1, 5), (4, 5), (5, 6), (4, 8), (2, 8), (7, 9), (0, 10). Selecting jobs: take (3, 4), removing (1, 5), (2, 8), (0, 10); take (4, 5), removing (4, 8); take (5, 6), removing nothing; take (7, 9), removing nothing.

Interval Scheduling: Greedy Algorithm. Consider jobs in increasing order of finish time; take each job provided it is compatible with the ones already taken. Implementation: O(n log n) — remember the job j* added last to A; job j is compatible with A if sj ≥ fj*.

Sort jobs by finish times so that f1 ≤ f2 ≤ … ≤ fn.
A ← ∅
for j = 1 to n {
    if (job j compatible with A)
        A ← A ∪ {j}
}
return A    (the jobs selected)
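The pseudocode above translates directly to Python (a sketch; jobs are assumed to be `(start, finish)` pairs):

```python
def interval_schedule(jobs):
    """jobs: list of (start, finish). Returns a maximum set of
    mutually compatible jobs, greedily by earliest finish time."""
    A = []
    last_finish = float("-inf")          # finish time f_j* of last job taken
    for s, f in sorted(jobs, key=lambda sf: sf[1]):   # sort by finish time
        if s >= last_finish:             # compatible: s_j >= f_j*
            A.append((s, f))
            last_finish = f
    return A

# The earlier example's jobs:
jobs = [(0, 10), (3, 4), (2, 8), (1, 5), (4, 5), (4, 8), (5, 6), (7, 9)]
print(interval_schedule(jobs))   # [(3, 4), (4, 5), (5, 6), (7, 9)]
```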


How to Prove Optimality. Let A be the schedule returned by this algorithm, and let OPT be some optimal solution. It might be hard to show that A = OPT; instead we need only show that |A| = |OPT|. Note the distinction: rather than proving that the chosen set of intervals A is the same as an optimal choice, we prove that it has the same number of intervals as an optimal one — therefore it is optimal.

Interval Scheduling: Analysis. Theorem: the greedy algorithm is optimal. Proof (by contradiction): assume greedy is not optimal, and let's see what happens. Let i1, i2, …, ik denote the jobs selected by greedy, and j1, j2, …, jm the jobs in an optimal solution, with i1 = j1, i2 = j2, …, ir = jr for the largest possible value of r. Job ir+1 finishes before jr+1 (greedy chose it for its earlier finish time) — so why not replace job jr+1 with job ir+1?

After replacing jr+1 with ir+1, the solution is still feasible and still optimal, but it now agrees with greedy on r + 1 jobs — contradicting the maximality of r.

Interval Scheduling: Example (step-by-step run on jobs A–H over times 1–11). The jobs are considered in order of finish time: B is taken first; C and A are rejected (incompatible with B); E is taken; D, F, and G are rejected; H is taken. Final solution: {B, E, H}.

Interval Partitioning

Interval Partitioning. Lecture j starts at sj and finishes at fj. Goal: find the minimum number of classrooms to schedule all lectures so that no two occur at the same time in the same room. Example: a schedule of 10 lectures (a–j, between 9:00 and 4:30) using 4 classrooms.

Example: the same 10 lectures scheduled using only 3 classrooms.

Interval Partitioning: Lower Bound on Optimal Solution. Def.: the depth of a set of open intervals is the maximum number that contain any given time. Key observation: the number of classrooms needed ≥ depth. Example: the schedule above has depth 3 (a, b, c all contain 9:30), so the 3-classroom schedule is optimal. Q: does there always exist a schedule using exactly the depth of the intervals?
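The depth itself can be computed with a simple sweep line; a sketch (for open intervals, a finish at time t is processed before a start at time t):

```python
def depth(intervals):
    """Maximum number of open intervals containing any single time."""
    events = []
    for s, f in intervals:
        events.append((s, 1))    # interval opens
        events.append((f, -1))   # interval closes
    events.sort(key=lambda e: (e[0], e[1]))   # at ties, closes before opens
    cur = best = 0
    for _, delta in events:
        cur += delta
        best = max(best, cur)
    return best

print(depth([(0, 3), (1, 4), (2, 5), (3, 6)]))  # 3
```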

Depth: the maximum number of jobs in progress at any time. In the example above, depth = 3.

Interval Partitioning: Greedy Algorithm. Consider lectures in increasing order of start time; assign each lecture to any compatible classroom. Implementation: O(n log n) — for each classroom k, maintain the finish time of the last job added, and keep the classrooms in a priority queue.

Sort intervals by starting time so that s1 ≤ s2 ≤ … ≤ sn.
d ← 0    (number of allocated classrooms)
for j = 1 to n {
    if (lecture j is compatible with some classroom k)
        schedule lecture j in classroom k
    else
        allocate a new classroom d + 1
        schedule lecture j in classroom d + 1
        d ← d + 1
}
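A Python sketch of the priority-queue implementation (lectures assumed to be `(start, finish)` pairs; the heap stores one finish time per allocated room):

```python
import heapq

def min_classrooms(lectures):
    """lectures: list of (start, finish). Greedy by start time; the heap
    holds the finish time of the last lecture in each allocated room."""
    finish_heap = []
    for s, f in sorted(lectures):                 # increasing start time
        if finish_heap and finish_heap[0] <= s:
            heapq.heapreplace(finish_heap, f)     # reuse the freed room
        else:
            heapq.heappush(finish_heap, f)        # allocate a new room
    return len(finish_heap)

print(min_classrooms([(0, 3), (1, 4), (2, 5), (3, 6)]))  # 3 (= depth)
```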

Interval Partitioning: Greedy Analysis. Observation: the greedy algorithm never schedules two incompatible lectures in the same classroom. Theorem: the greedy algorithm is optimal. Proof: let d be the number of classrooms the greedy algorithm allocates. Classroom d is opened because we needed to schedule a job, say j, that is incompatible with all d − 1 other classrooms. Since we sorted by start time, all these incompatibilities are caused by lectures that start no later than sj. Thus we have d lectures overlapping at time sj + ε, so depth ≥ d. By the key observation, every schedule uses ≥ depth ≥ d classrooms, so greedy is optimal.

Scheduling to Minimize Lateness

Solving End-of-Semester Blues. Example: scheduling a term paper, exam study, a project, the CS4335 HW, and a party across Monday–Friday; the pictured schedule has max lateness = 2.

Scheduling to Minimize Lateness. Minimizing lateness problem: a single resource processes one job at a time. Job j requires tj units of processing time and is due at time dj; if j starts at time sj, it finishes at time fj = sj + tj. Lateness: ℓj = max{0, fj − dj}. Goal: schedule all jobs to minimize the maximum lateness L = maxj ℓj. (Note: finding a maximum-size subset of jobs that all meet their deadlines is NP-hard.)

Example (6 jobs):
Job j:  1   2   3   4   5   6
tj:     3   2   1   4   3   2
dj:     6   8   9   9   14  15

In the pictured schedule (order 3, 2, 6, 1, 5, 4), job 1 finishes at time 8 with lateness 2, and job 4 finishes at time 15 with lateness 6, so max lateness = 6.

Minimizing Lateness: Greedy Algorithms. Greedy template: consider jobs in some order. [Shortest processing time first] consider jobs in ascending order of processing time tj. [Earliest deadline first] consider jobs in ascending order of deadline dj. [Smallest slack] consider jobs in ascending order of slack dj − tj.

Two of these templates fail. [Shortest processing time first] counterexample: two jobs with t = (1, 10) and d = (100, 10). SPT runs job 1 first, so job 2 finishes at time 11 and is late (ℓ2 = 1), while running job 2 first gives max lateness 0. [Smallest slack] counterexample: two jobs with t = (1, 10) and d = (2, 10). The slacks are 1 and 0, so smallest-slack runs job 2 first and job 1 finishes at time 11 with ℓ1 = 9, while running job 1 first gives max lateness 1.


Minimizing Lateness: Greedy Algorithm — earliest deadline first.

Sort the n jobs by deadline so that d1 ≤ d2 ≤ … ≤ dn.
t ← 0
for j = 1 to n {
    assign job j to the interval [t, t + tj]
    sj ← t; fj ← t + tj
    t ← t + tj
}
output the intervals [sj, fj]

On the 6-job example, the EDF order is 1, 2, 3, 4, 5, 6 and the max lateness = 1 (job 4 finishes at time 10 with d4 = 9).
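The EDF schedule and its max lateness as a Python sketch (jobs assumed to be `(t_j, d_j)` pairs; the six-job instance follows the lecture's example):

```python
def edf_max_lateness(jobs):
    """jobs: list of (processing_time, deadline).
    Schedules in ascending deadline order; returns the max lateness."""
    t = 0
    max_late = 0
    for tj, dj in sorted(jobs, key=lambda job: job[1]):  # earliest deadline
        t += tj                          # job occupies [t - tj, t]
        max_late = max(max_late, t - dj)
    return max_late

jobs = [(3, 6), (2, 8), (1, 9), (4, 9), (3, 14), (2, 15)]
print(edf_max_lateness(jobs))  # 1
```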

Minimizing Lateness: No Idle Time. Observation: there exists an optimal schedule with no idle time ("machine not working for some reason, yet work still to be done") — any gap can be closed by shifting later jobs earlier, which never increases lateness. Observation: the greedy schedule has no idle time.

Minimizing Lateness: Inversions. Def.: an inversion in schedule S is a pair of jobs i and j such that di < dj but j is scheduled before i. Observation: the greedy schedule has no inversions (recall jobs are sorted in ascending order of deadline). Observation: if a schedule (with no idle time) has an inversion, it has one with a pair of inverted jobs scheduled consecutively.

Claim: swapping two adjacent, inverted jobs i and j (with di < dj and j scheduled immediately before i) reduces the number of inversions by one and does not increase the max lateness. Proof: let ℓk be the lateness before the swap and ℓ'k afterwards. Then ℓ'k = ℓk for all k ≠ i, j, and ℓ'i ≤ ℓi (job i now finishes earlier). If job j is late after the swap: ℓ'j = f'j − dj = fi − dj ≤ fi − di ≤ ℓi, so the max lateness does not increase.

Checking for inversions: given the scheduled order n1, n2, …, nk, it suffices to check every pair of consecutive jobs — if no consecutive pair is inverted, the whole sequence has no inversion. Example: 1, 2, 3, 4, 5, 6, 7, 8, 9 has no inversion; 1, 2, 6, 7, 3, 4, 5, 8, 9 has one at the consecutive pair (7, 3).
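The swap claim is easy to sanity-check numerically; a sketch with hypothetical `(t_j, d_j)` jobs (not from the slides):

```python
def max_lateness(order):
    """Max lateness of jobs (processing_time, deadline) run in list order."""
    t = 0
    late = 0
    for tj, dj in order:
        t += tj
        late = max(late, t - dj)   # lateness is clamped at 0
    return late

# An inverted adjacent pair: deadline 10 scheduled before deadline 4.
inverted = [(5, 10), (3, 4)]
swapped  = [(3, 4), (5, 10)]
print(max_lateness(inverted), max_lateness(swapped))  # 4 0
```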


Greedy Analysis Strategies. (1) Greedy algorithm stays ahead: show that after each step of the greedy algorithm, its solution is at least as good as any other algorithm's. (2) Exchange argument: gradually transform any solution into the one found by the greedy algorithm without hurting its quality. (3) Structural: discover a simple "structural" bound asserting that every possible solution must have a certain value, then show that your algorithm always achieves this bound. Myopic greed: "at every iteration, the algorithm chooses the best morsel it can swallow, without worrying about the future." Note: the objective function does not explicitly appear in a greedy algorithm, and it is hard, if not impossible, to precisely define "greedy algorithm."