Instructor Neelima Gupta

Table of Contents Greedy Algorithms

What is the greedy approach? Choosing the current best option without worrying about the future; in other words, the choice does not depend upon future sub-problems. Such choices are locally optimal. For some problems, as we will see shortly, this locally optimal choice is also globally optimal, and we are happy.

General ‘Greedy’ Approach Step 1: Choose the current best option and commit to it. Step 2: Obtain a greedy solution on the rest of the input.
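As an illustration only (not from the slides), a minimal Python sketch of this two-step pattern, assuming a hypothetical choose_best function that picks the current best candidate and a compatible predicate that says which remaining candidates can still be used:

```python
def greedy(candidates, choose_best, compatible):
    """Generic greedy skeleton (illustrative): commit to the current best choice,
    then continue greedily on the candidates still compatible with it."""
    solution = []
    remaining = list(candidates)
    while remaining:
        best = choose_best(remaining)                       # Step 1: current best choice
        solution.append(best)
        remaining = [c for c in remaining                   # Step 2: greedy on the rest,
                     if c != best and compatible(best, c)]  # keeping only compatible candidates
    return solution
```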

When to use? There must be a greedy choice to make (the greedy-choice property: some locally best choice is part of an optimal solution), and the problem must have optimal substructure (an optimal solution contains optimal solutions to its sub-problems).

Activity Selection Problem Given a set of activities S = {a_1, a_2, …, a_n} that need to use some resource. Each activity a_i has a start time s_i and a finish time f_i, such that 0 ≤ s_i < f_i < ∞. We need to allocate the resource in a compatible manner, such that the number of activities getting the resource is maximized. The resource can be used by one and only one activity at any given time.

Activity Selection Problem Two activities a_i and a_j are said to be compatible if the intervals they span do not overlap, i.e. f_i ≤ s_j or f_j ≤ s_i. Example: consider activities a_1, a_2, a_3, a_4 with intervals (s_1, f_1), …, (s_4, f_4) as in the figure. Here a_1 is compatible with a_3 and a_4, and a_2 is compatible with a_3 and a_4, but a_3 and a_4 themselves are not compatible.
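This definition translates directly into a small check; a sketch (with my own illustrative intervals, since the figure's numbers are not reproduced here), where activities are (start, finish) pairs:

```python
def compatible(a, b):
    """Two activities (s, f) are compatible if their intervals do not overlap,
    i.e. one finishes no later than the other starts."""
    (s_a, f_a), (s_b, f_b) = a, b
    return f_a <= s_b or f_b <= s_a

print(compatible((1, 3), (3, 6)))   # True: the first finishes exactly when the second starts
print(compatible((2, 5), (4, 7)))   # False: the intervals overlap
```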

Activity Selection Problem Solution: applying the general greedy algorithm, select the current best choice a_1 and add it to the solution set; construct the subset S' of all activities compatible with a_1 and find the optimal solution of this subset; then join the two.

Let's think of some possible greedy choices: Shortest Job First; in the order of increasing start times; in the order of increasing finish times.

Shortest Job First [Figure: a three-job timeline contrasting the schedule chosen by this approach with the optimal schedule, showing that Shortest Job First can select fewer activities than the optimum. Thanks to: Navneet Kaur (22), MCA 2012.]

Increasing Start Times [Figure: a three-job timeline contrasting the schedule chosen by this approach with the optimal schedule, showing that choosing in order of increasing start times can also select fewer activities than the optimum. Thanks to: Navneet Kaur (22), MCA 2012.]

Increasing Finishing Times [Figure/table: activities with start times s_i, finish times f_i and values P(i), illustrating the schedule obtained by choosing activities in order of increasing finish times. Thanks to Neha (16).]

ACTIVITY SELECTION PROBLEM We include a₁ in the solution, and then recurse on S′ = {a ∈ S − {a₁} : a is compatible with a₁}, where S is the input set of activities. Thanks to: Navneet Kaur (22), MCA 2012
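An iterative way to realize this greedy with the "increasing finish times" rule (a sketch, not the slides' own code); activities are (start, finish) pairs:

```python
def select_activities(activities):
    """Greedy activity selection: sort by finish time, then repeatedly take the
    first activity that is compatible with the last one selected."""
    chosen = []
    last_finish = float("-inf")
    for s, f in sorted(activities, key=lambda a: a[1]):   # increasing finish times
        if s >= last_finish:                              # compatible with everything chosen so far
            chosen.append((s, f))
            last_finish = f
    return chosen

# Illustrative input: the greedy picks 3 mutually compatible activities
print(select_activities([(1, 4), (3, 5), (0, 6), (5, 7), (3, 9), (6, 10), (8, 11)]))
# -> [(1, 4), (5, 7), (8, 11)]
```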

Proving the Optimality

Scheduling Jobs with Processing Times and Deadlines Jobs are given with processing times p_i and deadlines d_i. There is no specified start time: a job can be scheduled at any time, and the algorithm decides the start time (and hence the finish time) of each job. Let f_i be the time at which job i finishes as per some schedule. Then its lateness is defined as l_i = f_i − d_i if f_i > d_i, and 0 otherwise; i.e. l_i = max(0, f_i − d_i). Aim: minimize the maximum lateness, i.e. minimize max_i l_i.

Figures from Anjali, Hemant

Possible Greedy Approaches Shortest Job First: completely ignores half of the input data, viz. the deadlines. Shortest Job First doesn't work: p_1 = 1, d_1 = 100, p_2 = 10, d_2 = 10 (scheduling the shorter job first gives maximum lateness 1, while scheduling job 2 first gives 0). Minimum Slackness First doesn't work: p_1 = 1, d_1 = 2, p_2 = 10, d_2 = 10 (job 2 has the smaller slack, and scheduling it first gives maximum lateness 9, while Earliest Deadline First gives only 1). Earliest Deadline First: completely ignores the other half of the input data, viz. the processing times, but it works: it gives the optimal schedule.
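Both counterexamples can be checked with a few lines; a sketch of a helper that schedules jobs back to back from time 0 and returns the maximum lateness of the given order (jobs are (processing_time, deadline) pairs):

```python
def max_lateness(jobs_in_order):
    """Run the jobs back to back starting at time 0; return the maximum lateness."""
    t, worst = 0, 0
    for p, d in jobs_in_order:
        t += p                         # finish time of this job
        worst = max(worst, t - d)      # lateness is max(0, finish - deadline)
    return worst

# Shortest Job First counterexample: p1=1, d1=100, p2=10, d2=10
print(max_lateness([(1, 100), (10, 10)]))   # SJF order -> max lateness 1
print(max_lateness([(10, 10), (1, 100)]))   # EDF order -> max lateness 0

# Minimum Slackness First counterexample: p1=1, d1=2, p2=10, d2=10
print(max_lateness([(10, 10), (1, 2)]))     # MSF order -> max lateness 9
print(max_lateness([(1, 2), (10, 10)]))     # EDF order -> max lateness 1
```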

SJF fails: figure from Anjali, Hemant

Minimum Slackness First Let s_i be the latest time by which job i must be started to meet its deadline, i.e. s_i = d_i − p_i. Let the last job scheduled finish at time t. Then the slack time for job i is defined as st_i = s_i − t. Thus the slack time represents how long we can wait/defer before scheduling the i-th job. We schedule next the job for which this time is minimum, i.e. the job with minimum st_i. Since t is the same for all the jobs, the job with minimum s_i is scheduled next.

MSF fails: figure from Anjali, Hemant

Earliest Deadline First Sort the jobs so that d_1 ≤ d_2 ≤ … ≤ d_n and schedule them in this order with no idle time: job 1 starts at time 0 and finishes at f_1 = p_1; job 2 starts at s_2 = f_1 and finishes at f_2 = f_1 + p_2; job 3 starts at s_3 = f_2; and so on.
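A sketch of this construction in code (jobs as (processing_time, deadline) pairs, machine free from time 0; the function name and output format are my own):

```python
def edf_schedule(jobs):
    """Earliest Deadline First: sort by deadline and run the jobs back to back.
    Returns (schedule, max_lateness), where schedule is a list of (start, finish, deadline)."""
    schedule, t, worst = [], 0, 0
    for p, d in sorted(jobs, key=lambda job: job[1]):   # d1 <= d2 <= ... <= dn
        start, finish = t, t + p                        # s_i = f_{i-1}, f_i = f_{i-1} + p_i
        schedule.append((start, finish, d))
        worst = max(worst, finish - d)                  # lateness of this job, if positive
        t = finish                                      # no idle time
    return schedule, worst

schedule, L = edf_schedule([(3, 6), (2, 8), (1, 9), (4, 9), (3, 14), (2, 15)])
print(schedule)   # jobs in deadline order, scheduled with no gaps
print(L)          # maximum lateness of this schedule (1 for this input)
```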

Exchange Argument Let O be an optimal solution and S be the solution obtained by our greedy. We gradually transform O into S without hurting its (O's) quality, thereby implying that S is as good as O (e.g. |O| = |S| when the objective is the number of activities selected). Hence the greedy solution is optimal.

Inversion: We say that a schedule A has an inversion if a job i with deadline d_i is scheduled before another job j with an earlier deadline (d_j < d_i). Idle Time: the time that passes during a gap, i.e. there is work to be done, yet for some reason the machine is sitting idle.
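Both conditions are easy to state as checks on a concrete schedule; a sketch, using the same (start, finish, deadline) triples as the EDF sketch above (function names are my own):

```python
def has_inversion(schedule):
    """True if some job runs before another job with a strictly earlier deadline."""
    deadlines = [d for _, _, d in schedule]
    return any(deadlines[i] > deadlines[j]
               for i in range(len(deadlines))
               for j in range(i + 1, len(deadlines)))

def has_idle_time(schedule, start_time=0):
    """True if the machine sits idle before the first job or between consecutive jobs."""
    t = start_time
    for start, finish, _ in schedule:
        if start > t:        # a gap: work remains but the machine is idle
            return True
        t = finish
    return False
```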

Earliest Deadline First Claim: our schedule has no inversions and no idle time (trivial, by construction). Clearly, an optimal schedule has no idle time. 1. All schedules with no inversions and no idle time have the same maximum lateness. Proof: 2. There is an optimal schedule with no inversions and no idle time. A. If O (an optimal schedule) has an inversion, then there is a pair of jobs i and j such that j is scheduled immediately after i and d_j < d_i, i.e. a pair of inverted jobs that are consecutive. Proof: B. The new swapped schedule has maximum lateness no larger than that of O. Proof:

Proof of 1: Anjali, Hemant

Network Design Problems Given a set of routers placed at n locations V = {v_1, …, v_n}, we want to connect them in the cheapest way. Claim: the minimum-cost solution to the above problem is a tree.

Minimum Spanning Tree Assume that all edges have distinct edge costs; we'll remove this restriction later. Greedy approaches: Kruskal's, Prim's, Reverse-Delete. Cut Property: Let S be a non-trivial subset of V (S ≠ ∅ and S ≠ V), and let e = (v, w) be the minimum-weight edge with one end in S and the other end in V − S. Then every MST contains e.
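As one concrete instance of these approaches, a sketch of Kruskal's rule with a simple union-find: each edge it keeps is the minimum-weight edge crossing the cut between the component it joins and the rest of the graph. Vertices are 0..n-1 and edges are (weight, u, v) triples (my own conventions, not the slides'):

```python
def kruskal_mst(n, edges):
    """Kruskal's algorithm: scan edges in increasing weight order and keep an edge
    iff its endpoints currently lie in different components."""
    parent = list(range(n))

    def find(x):                          # representative of x's component, with path compression
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    mst, total = [], 0
    for w, u, v in sorted(edges):         # distinct edge costs assumed, as on the slide
        ru, rv = find(u), find(v)
        if ru != rv:                      # u and v are on opposite sides of a cut
            parent[ru] = rv               # merge the two components
            mst.append((u, v, w))
            total += w
    return mst, total

# 4 routers with 5 candidate links, given as (weight, u, v)
print(kruskal_mst(4, [(1, 0, 1), (2, 1, 2), (3, 0, 2), (4, 2, 3), (5, 0, 3)]))
# -> ([(0, 1, 1), (1, 2, 2), (2, 3, 4)], 7)
```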