Instructor Neelima Gupta
Table of Contents Greedy Algorithms
What is the greedy approach? Choose the current best option without worrying about the future; in other words, the choice does not depend on future sub-problems. Such choices are locally optimal. For some problems, as we will see shortly, this local optimum is also the global optimum, and we are happy.
General ‘Greedy’ Approach Step 1: Make the current best choice. Step 2: Obtain a greedy solution on the rest.
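As an illustration (not part of the original slides), the two steps can be written as a tiny recursive sketch in Python; choose_best and remaining_after are placeholder names for the problem-specific pieces, so this is a template rather than a concrete algorithm.

```python
# A minimal sketch of the two-step greedy template.  The caller supplies
# the problem-specific pieces: how to pick the current best candidate and
# how to restrict the input to what is still feasible after that choice.
def greedy(candidates, choose_best, remaining_after):
    if not candidates:
        return []
    best = choose_best(candidates)            # Step 1: current best choice
    rest = remaining_after(candidates, best)  # what remains feasible
    return [best] + greedy(rest, choose_best, remaining_after)  # Step 2: recurse
```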
When to use? There must be a greedy choice to make. The problem must have an optimal substructure.
Activity Selection Problem Given a set of activities S = {a_1, a_2, …, a_n} that need to use some resource. Each activity a_i has a start time s_i and a finish time f_i, with 0 ≤ s_i < f_i < ∞. We need to allocate the resource in a compatible manner such that the number of activities getting the resource is maximized. The resource can be used by one and only one activity at any given time.
Activity Selection Problem Two activities a_i and a_j are said to be compatible if the intervals they span do not overlap, i.e. f_i ≤ s_j or f_j ≤ s_i. Example: consider activities a_1, a_2, a_3, a_4 with intervals [s_1, f_1], …, [s_4, f_4] as shown in the figure. Here a_1 is compatible with a_3 & a_4, and a_2 is compatible with a_3 & a_4, but a_3 and a_4 themselves are not compatible.
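For concreteness, here is a small sketch of the compatibility test in Python (the tuple representation and the sample intervals are my own, not from the slides):

```python
# Each activity is a (start, finish) pair; two activities are compatible
# exactly when one finishes no later than the other starts.
def compatible(a, b):
    (s_i, f_i), (s_j, f_j) = a, b
    return f_i <= s_j or f_j <= s_i

# Illustrative values only (the slide's actual intervals are in its figure):
# compatible((1, 4), (4, 7))  -> True   (back to back, so no overlap)
# compatible((1, 5), (4, 7))  -> False  (the intervals overlap)
```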
Activity Selection Problem Solution: applying the general greedy approach. Select the current best choice, a_1, and add it to the solution set. Construct the subset S' of all activities compatible with a_1 and find an optimal solution of this subset. Join the two.
Let's think of some possible greedy choices: Shortest Job First; in the order of increasing start times; in the order of increasing finish times.
Shortest Job First: figures (thanks to Navneet Kaur (22), MCA 2012) comparing, on a timeline of jobs, the schedule chosen by this approach with the optimal schedule.
Increasing Start Times: figures (thanks to Navneet Kaur (22), MCA 2012) comparing, on a timeline of jobs, the schedule chosen by this approach with the optimal schedule.
Increasing Finishing Times: figures (thanks to Neha (16)) showing a table of activities (i, S_i, F_i, P_i) and the schedule built by picking activities in increasing order of finishing time.
ACTIVITY SELECTION PROBLEM We include a_1, the activity with the earliest finish time, in the solution, and then recurse on S' = {a ∈ S − {a_1} : a is compatible with a_1}, where S is the input set of activities. Thanks to: Navneet Kaur (22), MCA 2012.
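A short Python sketch of this greedy, assuming activities are given as (start, finish) pairs; the iterative form below (sort by finish time, then scan) is equivalent to repeatedly picking the compatible activity that finishes first and recursing on the rest:

```python
# Earliest-finish-time greedy for activity selection.
def select_activities(activities):
    """activities: list of (start, finish) pairs."""
    chosen = []
    last_finish = float("-inf")          # nothing scheduled yet
    for s, f in sorted(activities, key=lambda a: a[1]):
        if s >= last_finish:             # compatible with everything chosen so far
            chosen.append((s, f))
            last_finish = f
    return chosen
```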
Proving the Optimality
Scheduling Jobs with Processing Times and Deadlines Jobs are given with processing times p_i and deadlines d_i. There is no specified start time; a job can be scheduled at any time, and the algorithm decides the start time (and hence the finish time) of each job. Let f_i be the time at which job i finishes as per some schedule. Then its lateness is defined as l_i = f_i − d_i if f_i > d_i, and l_i = 0 otherwise. Aim: minimize the maximum lateness, i.e. minimize max{l_i}.
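As a small aid (the helper name is mine, not from the slides), the lateness definition translates into a few lines of Python that compute the maximum lateness of jobs run back to back, in the given order, starting at time 0:

```python
# Maximum lateness of jobs scheduled consecutively in the given order.
# Each job is a (processing_time, deadline) pair; its lateness is
# max(finish - deadline, 0), matching the definition above.
def max_lateness(jobs_in_order):
    t, worst = 0, 0
    for p, d in jobs_in_order:
        t += p                      # this job finishes at time t
        worst = max(worst, t - d, 0)
    return worst
```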
Figures from Anjali, Hemant
Possible Greedy Approaches Shortest Job First: completely ignores half of the input data, viz. the deadlines. It doesn't work: t_1 = 1, d_1 = 100, t_2 = 10, d_2 = 10. Minimum Slackness First: doesn't work either: t_1 = 1, d_1 = 2, t_2 = 10, d_2 = 10. Earliest Deadline First: completely ignores the other half of the input data, viz. the processing times… but it works and gives the optimal.
SJF fails: figure from Anjali, Hemant
Minimum Slackness First Let s_i be the time by which job i must be started to meet its deadline, i.e. s_i = d_i − p_i. Let the last job scheduled finish at time t. Then the slack time for job i is defined as st_i = s_i − t. Thus the slack time represents how long we can wait/defer before scheduling the i-th job. We should next schedule the job for which this time is minimum, i.e. for which st_i is minimum. Since t is the same for all the jobs, the job with minimum s_i is scheduled next.
MSF fails: figure from Anjali, Hemant
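Using the same (processing time, deadline) representation, the two counterexamples above can be checked numerically; the max_lateness helper is repeated here so the snippet stands alone:

```python
def max_lateness(jobs_in_order):
    t, worst = 0, 0
    for p, d in jobs_in_order:
        t += p
        worst = max(worst, t - d, 0)
    return worst

# SJF counterexample: t_1 = 1, d_1 = 100, t_2 = 10, d_2 = 10.
print(max_lateness([(1, 100), (10, 10)]))  # 1: SJF runs job 1 first, job 2 is late
print(max_lateness([(10, 10), (1, 100)]))  # 0: running job 2 first is better

# MSF counterexample: t_1 = 1, d_1 = 2 (slack 1), t_2 = 10, d_2 = 10 (slack 0).
print(max_lateness([(10, 10), (1, 2)]))    # 9: MSF runs job 2 first, job 1 is very late
print(max_lateness([(1, 2), (10, 10)]))    # 1: running job 1 first is better
```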
Earliest Deadline First Sort the jobs so that d_1 ≤ d_2 ≤ … ≤ d_n and schedule them back to back in this order (jobs j_1, j_2, j_3, … in the figure): f_1 = p_1, s_2 = f_1, f_2 = f_1 + p_2, s_3 = f_2, and so on.
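Under the same assumptions, a brief sketch of the Earliest Deadline First schedule itself: sort by deadline, run the jobs back to back with no idle time, and record the resulting start/finish times and maximum lateness:

```python
# Earliest Deadline First: order jobs by deadline and schedule them
# consecutively, so s_{k+1} = f_k and f_k = s_k + p_k.
def edf_schedule(jobs):
    """jobs: list of (processing_time, deadline) pairs."""
    schedule, t = [], 0
    for p, d in sorted(jobs, key=lambda job: job[1]):
        start, finish = t, t + p
        schedule.append((start, finish, d))
        t = finish
    max_late = max((max(f - d, 0) for _, f, d in schedule), default=0)
    return schedule, max_late
```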
Exchange Argument Let O be an optimal solution and S be the solution obtained by our greedy. Gradually we transform O into S without hurting its (O's) quality, thereby implying that S is as good as O (for activity selection this means |O| = |S|; here it means the maximum lateness of S is no larger than that of O). Hence greedy is optimal.
Inversion We say that a schedule A has an inversion if a job i with deadline d_i is scheduled before another job j with an earlier deadline (d_j < d_i). Idle Time: the time that passes during a gap, i.e. there is work to be done, yet for some reason the machine is sitting idle.
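As a small, hypothetical helper (not from the slides), a schedule written as the sequence of deadlines in execution order has an inversion exactly when some deadline is followed by a strictly smaller one:

```python
# True iff some job is scheduled before another job with an earlier deadline.
def has_inversion(deadlines_in_order):
    return any(d_later < d_earlier
               for i, d_earlier in enumerate(deadlines_in_order)
               for d_later in deadlines_in_order[i + 1:])
```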
Earliest Deadline First Claim: our schedule has no inversions and no idle time… trivial. Clearly an optimal schedule has no idle time. 1. All schedules with no inversions and no idle time have the same maximum lateness. Proof: 2. There is an optimal schedule with no inversions and no idle time. A. If O (an optimal schedule) has an inversion, then there is a pair of jobs i and j such that j is scheduled immediately after i and d_j < d_i, i.e. a pair of inverted jobs that are consecutive. Proof: B. The schedule obtained by swapping this pair has maximum lateness no larger than that of O. Proof:
Proof of 1: Anjali, Hemant
Network Design Problems Given a set of routers placed at n locations V = {v_1, …, v_n}, we want to connect them in the cheapest way. Claim: the minimum-cost solution to the above problem is a tree.
Minimum Spanning Tree Assume that all edges have distinct edge costs; we'll remove this restriction later. Greedy approaches: Kruskal's, Prim's, Reverse-Delete. Cut Property: let S be a non-trivial subset of V, and let e = (v, w) be the minimum-weight edge with one end in S and the other end in V − S. Then every MST contains e.
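Of the three greedy approaches listed, here is a brief sketch of Kruskal's in Python, assuming edges are given as (weight, u, v) tuples over vertices 0..n-1 and using a simple union-find; Prim's and Reverse-Delete follow the same greedy spirit but are not shown:

```python
# Kruskal's algorithm: scan edges in increasing weight order and keep an
# edge iff it connects two different components (checked with union-find).
def kruskal(n, edges):
    """n: number of vertices (0..n-1); edges: list of (weight, u, v)."""
    parent = list(range(n))

    def find(x):                      # root of x's component, with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    mst = []
    for w, u, v in sorted(edges):     # distinct costs => unique sort order
        ru, rv = find(u), find(v)
        if ru != rv:                  # e is the cheapest edge crossing this cut
            parent[ru] = rv           # merge the two components
            mst.append((u, v, w))
    return mst
```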