Approaches to Problem Solving: greedy algorithms, dynamic programming, backtracking, divide-and-conquer.


Approaches to Problem Solving: greedy algorithms, dynamic programming, backtracking, divide-and-conquer

Interval Scheduling
Input: a set of time intervals
Output: a subset of non-overlapping intervals
Objective: maximize the number of selected intervals

Interval Scheduling
Input: a set of time intervals
Output: a subset of non-overlapping intervals
Objective: maximize the number of selected intervals

Idea #1: Select the interval that starts earliest, remove overlapping intervals, and recurse. (This fails: one long interval that starts first can block many short, mutually non-overlapping intervals.)

Interval Scheduling
Input: a set of time intervals
Output: a subset of non-overlapping intervals
Objective: maximize the number of selected intervals

Idea #2: Select the shortest interval, remove overlapping intervals, and recurse. (This also fails: a single short interval straddling the gap between two long non-overlapping intervals makes the greedy pick one interval instead of two.)

Interval Scheduling
Input: a set of time intervals
Output: a subset of non-overlapping intervals
Objective: maximize the number of selected intervals

Idea #3: Select the interval with the fewest conflicts, remove overlapping intervals, and recurse. (This also fails, although the counterexamples are more elaborate.)

Interval Scheduling
Input: a set of time intervals
Output: a subset of non-overlapping intervals
Objective: maximize the number of selected intervals

Idea #4: Select the earliest finishing interval, remove overlapping intervals, and recurse.

Interval Scheduling

Idea #4: Select the earliest finishing interval, remove overlapping intervals, and recurse.

INTERVAL-SCHEDULING( (s_0, f_0), …, (s_{n-1}, f_{n-1}) )
 1. Remain = {0, …, n-1}
 2. Selected = {}
 3. while ( |Remain| > 0 ) {
 4.   choose k ∈ Remain such that f_k = min_{i ∈ Remain} f_i
 5.   Selected = Selected ∪ {k}
 6.   Remain = Remain – {k}
 7.   for every i in Remain {
 8.     if (s_i < f_k) then Remain = Remain – {i}
 9.   }
10. }
11. return Selected

Interval Scheduling

Running time: O(n²) as written — each pass through the while loop scans all of Remain to find the minimum finish time and to discard overlapping intervals. Sorting the intervals by finish time up front brings this down to O(n log n). (Pseudocode as above.)
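To make the greedy concrete, here is a minimal Python sketch (the function name and the (start, finish) input format are illustrative, not from the slides). Sorting by finish time replaces the repeated minimum search, so the whole algorithm runs in O(n log n).

def interval_scheduling(intervals):
    """intervals: list of (start, finish) pairs. Returns a maximum-size
    subset of pairwise non-overlapping intervals."""
    selected = []
    last_finish = float("-inf")
    # Process intervals in order of increasing finish time.
    for start, finish in sorted(intervals, key=lambda iv: iv[1]):
        if start >= last_finish:   # does not overlap the last selected interval
            selected.append((start, finish))
            last_finish = finish
    return selected

# Example: the greedy selects (1, 3), (3, 5), (6, 8).
print(interval_scheduling([(1, 3), (2, 5), (3, 5), (4, 7), (6, 8)]))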

Interval Scheduling

Thm: The earliest-finish-time greedy returns a maximum-size set of non-overlapping intervals (i.e., the algorithm works). (Pseudocode as above.)

Interval Scheduling

Thm: The algorithm works.
Which type of algorithm did we use? A greedy algorithm: at each step it makes the locally best choice (the earliest finish time) and never reconsiders it.

Schedule All Intervals
Input: a set of time intervals
Output: a partition of the intervals; each part of the partition consists of non-overlapping intervals
Objective: minimize the number of parts in the partition

Schedule All Intervals
Input: a set of time intervals
Output: a partition of the intervals; each part consists of non-overlapping intervals
Objective: minimize the number of parts in the partition

Def: depth = the maximum, over all times t, of the number of intervals that are "active" (i.e., contain t) at time t
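As a side note, the depth itself can be computed with a simple sweep over the interval endpoints. Below is a small Python sketch of that computation (the function name is illustrative); at equal times a finish is processed before a start, so touching intervals do not count as overlapping, matching the convention used on these slides.

def depth(intervals):
    """intervals: list of (start, finish) pairs. Returns the maximum number
    of intervals that are simultaneously active."""
    events = []
    for s, f in intervals:
        events.append((s, 1))    # an interval becomes active
        events.append((f, -1))   # an interval stops being active
    events.sort(key=lambda e: (e[0], e[1]))  # at equal times, -1 (finish) comes first
    active = best = 0
    for _, delta in events:
        active += delta
        best = max(best, active)
    return best

print(depth([(1, 4), (2, 5), (3, 6), (6, 8)]))   # prints 3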

Schedule All Intervals

Def: depth = the maximum, over all times t, of the number of intervals active at time t
Observation 1: Any valid partition needs at least depth parts (labels).

SCHEDULE-ALL-INTERVALS( (s_0, f_0), …, (s_{n-1}, f_{n-1}) )
 1. Sort intervals by their starting time
 2. for j = 0 to n-1 do
 3.   Consider = {1, …, depth}
 4.   for every i < j that overlaps with j do
 5.     Consider = Consider – { Label[i] }
 6.   if |Consider| > 0 then
 7.     Label[j] = anything from Consider
 8.   else
 9.     Label[j] = nothing
10. return Label[]

Schedule All Intervals

Thm: Every interval gets a real label (a label in {1, …, depth}), never "nothing".
Corollary: The algorithm returns an optimal solution (i.e., it works!).
Running time: O(n log n) for the sort; the inner loop over earlier overlapping intervals makes the algorithm O(n²) in the worst case as written. (Pseudocode as above.)
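Here is a hedged Python sketch of the labeling greedy (names and input format are illustrative). Instead of the slide's explicit Consider set, it keeps a min-heap of the finish time of the last interval assigned to each label, a standard O(n log n) way to implement the same idea: sweep intervals by start time and reuse a label whenever its last interval has already finished.

import heapq

def schedule_all_intervals(intervals):
    """intervals: list of (start, finish) pairs. Returns a list of parts,
    each part a list of non-overlapping intervals; the number of parts
    equals the depth."""
    parts = []          # parts[r] = intervals assigned label r
    finish_heap = []    # entries (finish time of last interval in part r, r)
    for s, f in sorted(intervals):                   # sort by start time
        if finish_heap and finish_heap[0][0] <= s:   # some label is free: no overlap
            _, r = heapq.heappop(finish_heap)
            parts[r].append((s, f))
        else:                                        # every label is busy: open a new one
            r = len(parts)
            parts.append([(s, f)])
        heapq.heappush(finish_heap, (f, r))
    return parts

for r, part in enumerate(schedule_all_intervals([(1, 4), (2, 5), (3, 6), (6, 8)])):
    print("label", r + 1, ":", part)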

Weighted Interval Scheduling
Input: a set of time intervals, where each interval has a cost
Output: a subset of non-overlapping intervals
Objective: maximize the sum of the costs in the subset

Weighted Interval Scheduling
Input: a set of time intervals, where each interval has a cost
Output: a subset of non-overlapping intervals
Objective: maximize the sum of the costs in the subset

WEIGHTED-SCHED-ATTEMPT( (s_0, f_0, c_0), …, (s_{n-1}, f_{n-1}, c_{n-1}) )
1. Sort intervals by their finishing time
2. return WEIGHTED-SCHEDULING-RECURSIVE(n-1)

WEIGHTED-SCHEDULING-RECURSIVE(j)
1. if (j < 0) then return 0
2. k = j
3. while (k ≥ 0 and intervals k and j overlap) do k--
4. return max( c_j + WEIGHTED-SCHEDULING-RECURSIVE(k), WEIGHTED-SCHEDULING-RECURSIVE(j-1) )

Weighted Interval Scheduling

Does the algorithm above work? It computes the correct maximum cost, but without memoization the recursion re-solves the same subproblems many times, so its running time can be exponential in n.

Weighted Interval Scheduling

Dynamic programming! I.e., memoize the solution for each j so that every subproblem is solved only once.
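A memoized Python sketch of the same recursion follows (names are illustrative). It also replaces the linear "walk left while overlapping" loop with a binary search for the last non-overlapping interval, bringing the total running time down to O(n log n); memoization alone already makes the algorithm polynomial.

import bisect
from functools import lru_cache

def weighted_scheduling(intervals):
    """intervals: list of (start, finish, cost) triples. Returns the maximum
    total cost of a set of non-overlapping intervals."""
    ivs = sorted(intervals, key=lambda iv: iv[1])   # sort by finish time
    finishes = [f for _, f, _ in ivs]

    @lru_cache(maxsize=None)
    def best(j):
        if j < 0:
            return 0
        s_j, _, c_j = ivs[j]
        # k = index of the last interval before j that finishes by s_j (no overlap)
        k = bisect.bisect_right(finishes, s_j, 0, j) - 1
        return max(best(j - 1), c_j + best(k))      # skip interval j, or take it

    return best(len(ivs) - 1)

print(weighted_scheduling([(1, 4, 3), (2, 5, 4), (4, 7, 5), (6, 8, 2)]))   # prints 8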

Weighted Interval Scheduling

Heart of the solution: S[j] = max cost of a set of non-overlapping intervals selected from intervals 0, …, j.
Another part of the heart: how do we compute S[j]? (Either skip interval j, giving S[j-1], or take it, giving c_j + S[k], where k is the last interval that does not overlap interval j.)
Finally, what do we return? (S[n-1].)

Weighted Interval Scheduling

Heart of the solution: S[j] = max cost of a set of non-overlapping intervals selected from intervals 0, …, j.

WEIGHTED-SCHED( (s_0, f_0, c_0), …, (s_{n-1}, f_{n-1}, c_{n-1}) )
1. Sort intervals by their finishing time
2. Define S[-1] = 0
3. for j = 0 to n-1 do
4.   k = j
5.   while (k ≥ 0 and intervals k and j overlap) do k--
6.   S[j] = max( S[j-1], c_j + S[k] )
7. return S[n-1]

Weighted Interval Scheduling

Reconstructing the solution: trace back from j = n-1. If S[j] = c_j + S[k] (where k is the last interval that does not overlap j), then interval j belongs to an optimal set; record it and continue from k. Otherwise continue from j-1. (Pseudocode as above.)
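Finally, a bottom-up Python sketch that fills the table S and then traces back to recover the chosen intervals (names are illustrative; the table is shifted by one index so that S[0] plays the role of S[-1] = 0).

import bisect

def weighted_scheduling_with_solution(intervals):
    """intervals: list of (start, finish, cost) triples. Returns the maximum
    total cost together with one optimal set of intervals."""
    ivs = sorted(intervals, key=lambda iv: iv[1])        # sort by finish time
    finishes = [f for _, f, _ in ivs]
    n = len(ivs)

    S = [0] * (n + 1)          # S[j+1] = best cost using intervals 0..j
    prev = [0] * n             # prev[j] = (last non-overlapping index) + 1
    for j in range(n):
        s_j, _, c_j = ivs[j]
        prev[j] = bisect.bisect_right(finishes, s_j, 0, j)
        S[j + 1] = max(S[j], c_j + S[prev[j]])

    # Trace back: interval j is selected iff taking it achieves S[j+1].
    chosen, j = [], n - 1
    while j >= 0:
        if S[j + 1] == ivs[j][2] + S[prev[j]]:
            chosen.append(ivs[j])
            j = prev[j] - 1
        else:
            j -= 1
    return S[n], list(reversed(chosen))

print(weighted_scheduling_with_solution([(1, 4, 3), (2, 5, 4), (4, 7, 5), (6, 8, 2)]))
# -> (8, [(1, 4, 3), (4, 7, 5)])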