
1 Instructor Neelima Gupta ngupta@cs.du.ac.in

2 Table of Contents Greedy Algorithms

3 What is the greedy approach? Choose the current best solution without worrying about the future; in other words, the choice does not depend upon future sub-problems. Such choices are locally optimal. For some problems, as we will see shortly, this local optimum is also a global optimum, and we are happy.

4 General ‘Greedy’ Approach Step 1: Choose the current best solution. Step 2: Obtain a greedy solution on the rest.
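As a sketch (not from the slides), the two steps above can be written as a generic recursive template; choose_best and remaining_after are hypothetical problem-specific hooks, not names used in the lecture.

```python
# A minimal sketch of the two-step greedy template (assumption: the concrete
# problem supplies the two hooks choose_best and remaining_after).

def greedy(items, choose_best, remaining_after):
    """Step 1: pick the current best item; Step 2: recurse on what is left."""
    if not items:
        return []
    best = choose_best(items)              # Step 1: current best choice
    rest = remaining_after(best, items)    # sub-problem left after that choice
    return [best] + greedy(rest, choose_best, remaining_after)
```

For activity selection, choose_best would return the activity with the earliest finish time and remaining_after would keep only the activities compatible with it, as the later slides describe.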

5 When to use? There must be a greedy choice to make. The problem must have an optimal substructure.

6 Activity Selection Problem Given a set of activities S = {a_1, a_2, …, a_n} that need to use some resource. Each activity a_i has a start time s_i and a finish time f_i, such that 0 ≤ s_i < f_i < ∞. We need to allocate the resource in a compatible manner, such that the number of activities getting the resource is maximized. The resource can be used by one and only one activity at any given time.

7 Activity Selection Problem Two activities a_i and a_j are said to be compatible if the intervals they span do not overlap, i.e. f_i ≤ s_j or f_j ≤ s_i. Example: consider activities a_1, a_2, a_3, a_4 laid out on a timeline (figure: the four intervals [s_1, f_1], [s_2, f_2], [s_3, f_3], [s_4, f_4]). Here a_1 is compatible with a_3 & a_4, and a_2 is compatible with a_3 & a_4, but a_3 and a_4 themselves are not compatible.
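The compatibility test above translates directly into code. A minimal sketch, assuming each activity is represented as a (start, finish) pair:

```python
def compatible(a, b):
    """Activities a = (s_a, f_a) and b = (s_b, f_b) are compatible exactly
    when their intervals do not overlap: f_a <= s_b or f_b <= s_a."""
    (s_a, f_a), (s_b, f_b) = a, b
    return f_a <= s_b or f_b <= s_a
```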

8 Activity Selection Problem Solution: applying the general greedy algorithm. Select the current best choice a_1 and add it to the solution set. Construct the subset S' of all activities compatible with a_1 and find the optimal solution of this subset. Join the two.

9 Let's think of some possible greedy solutions: Shortest Job First; in the order of increasing start times; in the order of increasing finish times.

10 Shortest Job First (thanks to: Navneet Kaur (22), MCA 2012). Figure: three jobs on a timeline from 0 to 15.

11 Shortest Job First (thanks to: Navneet Kaur (22), MCA 2012). Figure: the same three jobs on the timeline.

12 Shortest Job First (thanks to: Navneet Kaur (22), MCA 2012). Figure: the schedule chosen by this approach compared with the optimal schedule.

13 Increasing Start Times (thanks to: Navneet Kaur (22), MCA 2012). Figure: three jobs on a timeline from 0 to 20.

14 Increasing Start Times (thanks to: Navneet Kaur (22), MCA 2012). Figure: the same three jobs on the timeline.

15 Increasing Start Times (thanks to: Navneet Kaur (22), MCA 2012). Figure: the schedule chosen by this approach compared with the optimal schedule.

16 Increasing Finishing Times (thanks to Neha (16))

   i   S_i   F_i   P_i
   2    2     4     3
   1    1     5    10
   3    4     6     4
   4    5     8    20
   5    6     9     2

17 Increasing Finishing Times (thanks to Neha (16)). Figure: the five activities on a timeline from 0 to 9, labelled P(1)=10, P(2)=3, P(3)=4, P(4)=20, P(5)=2.

18 Increasing Finishing Times (thanks to Neha (16)). Figure: the timeline from 0 to 9 with activities P(1)=10, P(2)=3, P(4)=20, P(5)=2.

19 ACTIVITY SELECTION PROBLEM We include a₁ in the solution and then recurse on S′ = { a ∈ S − {a₁} : a is compatible with a₁ }, where S is the input set of activities. Thanks to: Navneet Kaur (22), MCA 2012
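A minimal iterative sketch of this greedy (equivalently, the "increasing finishing times" rule): sort by finish time and keep every activity that is compatible with the last one chosen. The (start, finish) pair representation and the example data are illustrative assumptions, not values from the slides.

```python
def select_activities(activities):
    """Greedy activity selection: repeatedly pick the compatible activity
    with the earliest finish time.  Equivalent to including a1 and then
    recursing on the activities compatible with it."""
    chosen = []
    last_finish = 0
    for start, finish in sorted(activities, key=lambda a: a[1]):
        if start >= last_finish:          # compatible with everything chosen so far
            chosen.append((start, finish))
            last_finish = finish
    return chosen

# Illustrative data (not from the slides): the greedy keeps 3 of the 4 activities.
print(select_activities([(1, 4), (3, 5), (4, 7), (7, 9)]))
# [(1, 4), (4, 7), (7, 9)]
```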

20 Proving the Optimality

21 Scheduling Jobs with Processing Times and Deadlines Jobs are given with processing times p_i and deadlines d_i. There is no specified start time; a job can be scheduled at any time, and the algorithm decides the start time (and hence the finish time) of each job. Let f_i be the time at which a job finishes as per some schedule. Then its lateness is defined as l_i = f_i − d_i if f_i > d_i, and 0 otherwise. Aim: minimize the maximum lateness, i.e. minimize max_i {l_i}.
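As a tiny sketch, the lateness definition above in code (the names are illustrative):

```python
def lateness(finish, deadline):
    """l_i = f_i - d_i if f_i > d_i, and 0 otherwise."""
    return max(0, finish - deadline)
```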

22 Figures from Anjali, Hemant

23 Possible Greedy Approaches Shortest Job First: completely ignores half of the input data, viz. the deadlines. It doesn't work: p_1 = 1, d_1 = 100, p_2 = 10, d_2 = 10. Minimum Slackness First doesn't work either: p_1 = 1, d_1 = 2, p_2 = 10, d_2 = 10. Earliest Deadline First: completely ignores the other half of the input data, viz. the processing times, but it works and gives the optimal.
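The two counterexamples above can be checked mechanically. A small sketch, assuming each job is a (processing_time, deadline) pair and the jobs are run back to back in the given order:

```python
def max_lateness(order):
    """Run the jobs back to back in the given order and return the maximum
    lateness; each job is a (processing_time, deadline) pair."""
    t, worst = 0, 0
    for p, d in order:
        t += p                       # this job finishes at time t
        worst = max(worst, t - d)    # its lateness is max(0, t - d)
    return worst

# Shortest Job First counterexample: (p_1, d_1) = (1, 100), (p_2, d_2) = (10, 10).
print(max_lateness([(1, 100), (10, 10)]))   # SJF order -> max lateness 1
print(max_lateness([(10, 10), (1, 100)]))   # EDF order -> max lateness 0

# Minimum Slackness First counterexample: (p_1, d_1) = (1, 2), (p_2, d_2) = (10, 10).
print(max_lateness([(10, 10), (1, 2)]))     # MSF order -> max lateness 9
print(max_lateness([(1, 2), (10, 10)]))     # EDF order -> max lateness 1
```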

24 SJF fails: figure from Anjali, Hemant.

25 Minimum Slackness First Let s_i be the time by which the job must be started to meet its deadline, i.e. s_i = d_i − p_i. Let the last job scheduled finish at time t. Then the slack time for job i is defined as st_i = s_i − t. Thus the slack time represents how long we can wait/defer before scheduling the i-th job. We should schedule next the job for which this time is minimum, i.e. for which st_i is minimum. Since t is the same for all the jobs, the job with minimum s_i is scheduled next.
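A minimal sketch of this rule under the same (processing_time, deadline) representation; as noted above, ordering by slack st_i reduces to ordering by s_i = d_i − p_i:

```python
def min_slack_order(jobs):
    """Minimum Slackness First: since the current finish time t is the same
    for every remaining job, ordering by slack (d_i - p_i) - t is the same
    as ordering by s_i = d_i - p_i.  Jobs are (p_i, d_i) pairs."""
    return sorted(jobs, key=lambda job: job[1] - job[0])
```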

26 MSF fails: figure from Anjali, Hemant.

27 Earliest Deadline First Sort the jobs so that d_1 ≤ d_2 ≤ … ≤ d_n and run them back to back: f_1 = p_1, s_2 = f_1, f_2 = f_1 + p_2, s_3 = f_2, and so on for j_1, j_2, j_3, …
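A minimal sketch of this construction, again assuming (processing_time, deadline) pairs: sort by deadline, run the jobs back to back, and record each start, finish, and lateness.

```python
def edf_schedule(jobs):
    """Earliest Deadline First: sort so that d_1 <= d_2 <= ... <= d_n, then
    set s_1 = 0, f_1 = p_1, s_2 = f_1, f_2 = f_1 + p_2, and so on.  Returns
    (start, finish, lateness) per job and the maximum lateness overall."""
    t = 0
    schedule = []
    for p, d in sorted(jobs, key=lambda job: job[1]):
        s, f = t, t + p
        schedule.append((s, f, max(0, f - d)))
        t = f
    return schedule, max((late for _, _, late in schedule), default=0)
```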

28 Exchange Argument Let O be an optimal solution and S be the solution obtained by our greedy. Gradually we transform O into S without hurting its (O's) quality, thereby implying that |O| = |S| and hence proving that the greedy is optimal.

29 Inversion: we say that a schedule A has an inversion if a job i with deadline d_i is scheduled before another job j with an earlier deadline (d_j < d_i). Idle Time: the time that passes during a gap; there is work to be done, yet for some reason the machine is sitting idle.

30 Earliest Deadline First Claim: our schedule has no inversions and no idle time (trivial). Clearly the optimal has no idle time. 1. All schedules with no inversions and no idle time have the same maximum lateness. Proof: 2. There is an optimal schedule with no inversions and no idle time. A. If O (an optimal) has an inversion, then there is a pair of jobs i and j such that j is scheduled immediately after i and d_j < d_i, i.e. a pair of inverted jobs that are consecutive. Proof: B. The new swapped schedule has maximum lateness no larger than that of O. Proof:
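The exchange step in 2.A/2.B amounts to repeatedly swapping an adjacent inverted pair until no inversion remains. A sketch (not from the slides), with jobs as (processing_time, deadline) pairs; each swap removes one inversion and, by claim B, never increases the maximum lateness:

```python
def remove_inversions(order):
    """While some job with an earlier deadline is scheduled immediately after
    a job with a later deadline, swap the two.  Repeating until no adjacent
    inversion remains sorts the jobs by deadline, i.e. yields the EDF order."""
    order = list(order)
    swapped = True
    while swapped:
        swapped = False
        for k in range(len(order) - 1):
            if order[k][1] > order[k + 1][1]:   # adjacent inverted pair (d_j < d_i)
                order[k], order[k + 1] = order[k + 1], order[k]
                swapped = True
    return order
```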

31 Proof of 1: Anjali, Hemant

32 Network Design Problems Given a set of routers placed at n locations V = {v_1, …, v_n}, we want to connect them in the cheapest way. Claim: the minimum-cost solution to the above problem is a tree.

33 Minimum Spanning Tree Assume that all edges have distinct costs; we will remove this restriction later. Greedy approaches: Kruskal's, Prim's, Reverse-Delete. Cut Property: let S be a non-trivial subset of V, and let e = (v, w) be the minimum-weight edge with one end in S and the other end in V − S. Then every MST contains e.
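A minimal sketch of one of the listed approaches, Kruskal's algorithm, using a simple union-find; the vertex labels, the (weight, u, v) edge format, and the example graph are assumptions for illustration, not data from the slides.

```python
def kruskal(n, edges):
    """Kruskal's algorithm: scan edges in increasing order of weight and keep
    an edge whenever it joins two different components.  n is the number of
    vertices (labelled 0..n-1); edges are (weight, u, v) triples."""
    parent = list(range(n))

    def find(x):                        # union-find with path compression
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    mst = []
    for w, u, v in sorted(edges):
        ru, rv = find(u), find(v)
        if ru != rv:                    # the edge crosses a cut between two components
            parent[ru] = rv
            mst.append((w, u, v))
    return mst

# Illustrative graph (not from the slides): four vertices, four weighted edges.
print(kruskal(4, [(1, 0, 1), (2, 1, 2), (3, 0, 2), (4, 2, 3)]))
# [(1, 0, 1), (2, 1, 2), (4, 2, 3)]
```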

