MCA 301: Design and Analysis of Algorithms

MCA 301: Design and Analysis of Algorithms. Instructor: Neelima Gupta, ngupta@cs.du.ac.in

Table of Contents: The Greedy Approach, a tool to design algorithms for optimization problems.

What is the greedy approach? Choose the current best option without worrying about the future. In other words, the choice does not depend upon future sub-problems.

What is the greedy approach? Such algorithms make locally optimal choices. For some problems, as we will see shortly, the locally optimal choice is also globally optimal, and we are happy.

General ‘Greedy’ Approach. Step 1: Make the current best choice. Step 2: Obtain a greedy solution on the rest of the problem.
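A minimal Python sketch of this two-step pattern (the function and parameter names are illustrative, not from the slides): repeatedly make the best-looking choice, then restrict the remaining input to whatever is still consistent with it.

    def greedy(candidates, best_choice, feasible_with):
        """Generic greedy loop: make the current best choice (Step 1),
        then keep only the candidates still consistent with it (Step 2)."""
        solution = []
        remaining = list(candidates)
        while remaining:
            choice = best_choice(remaining)                            # Step 1: current best choice
            solution.append(choice)
            remaining = [c for c in remaining
                         if c != choice and feasible_with(choice, c)]  # Step 2: the rest
        return solution

The activity selection problem below instantiates this skeleton with "earliest finish time" as the choice rule and "compatibility" as the feasibility test.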

When to use? There must be a greedy choice to make (the greedy-choice property), and the problem must have optimal substructure.

Activity Selection Problem. Given a set of activities S = {a1, a2, …, an} that need to use some resource. Each activity ai has a start time si and a finish time fi, such that 0 ≤ si < fi < ∞. We need to allocate the resource in a compatible manner so that the number of activities getting the resource is maximized. The resource can be used by one and only one activity at any given time.

Activity Selection Problem. Two activities ai and aj are said to be compatible if the intervals they span do not overlap, i.e., fi ≤ sj or fj ≤ si. Example: consider activities a1, a2, a3, a4 with intervals [s1, f1], [s2, f2], [s3, f3], [s4, f4] (see the interval diagram on the slide). Here a1 is compatible with a3 and a4, a2 is compatible with a3 and a4, but a3 and a4 are not compatible with each other.
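As a small illustration in Python (a sketch; the (start, finish) pair representation and function name are my own, not from the slides):

    def compatible(a, b):
        """Two activities, given as (start, finish) pairs, are compatible
        iff their intervals do not overlap: f_a <= s_b or f_b <= s_a."""
        (s_a, f_a), (s_b, f_b) = a, b
        return f_a <= s_b or f_b <= s_a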

Activity Selection Problem. Solution: applying the general greedy algorithm. Select the current best choice a1 and add it to the solution set. Construct the subset S′ of all activities compatible with a1, find an optimal solution of this subset, and join the two.

Let's think of some possible greedy choices: Shortest Job First; in the order of increasing start times; in the order of increasing finish times.

Shortest Job First. [Timeline figure: job1, job2 and job3 on a time axis from 1 to 15, contrasting the schedule chosen by this approach with the optimal schedule.] Thanks to: Navneet Kaur (22), MCA 2012

Increasing Start Times. [Timeline figure: job1, job2 and job3 on a time axis from 2 to 20, contrasting the schedule chosen by this approach with the optimal schedule.] Thanks to: Navneet Kaur (22), MCA 2012

Increasing Finishing Times (activities listed in increasing order of finish time):

    i    si    fi    pi
    2    2     4     3
    1    1     5     10
    3    4     6     4
    4    5     8     20
    5    6     9     2

Thanks to Neha (16)

Increasing Finishing Times. [Timeline figure from 1 to 9 showing the activities of the table above, with profits P(1)=10, P(2)=3, P(3)=4, P(4)=20, P(5)=2.] Thanks to Neha (16)

ACTIVITY SELECTION PROBLEM. We include a₁ in the solution and then recurse on S′ = {aₓ ∈ S − {a₁} : aₓ is compatible with a₁}, where S is the input set of activities. Thanks to: Navneet Kaur (22), MCA 2012
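An iterative rendering of this recursion in Python (a minimal sketch; function and variable names are mine, not from the slides), assuming a₁ is taken to be the activity with the earliest finish time, as the proof that follows justifies:

    def select_activities(activities):
        """Greedy activity selection: scan activities in increasing order of finish
        time and take each one compatible with everything chosen so far."""
        chosen = []
        last_finish = float("-inf")
        for start, finish in sorted(activities, key=lambda a: a[1]):  # increasing finish times
            if start >= last_finish:              # compatible with all chosen activities
                chosen.append((start, finish))
                last_finish = finish
        return chosen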

ACTIVITY SELECTION PROBLEM. CLAIM: If B′ is an optimal solution of S′, then B = B′ ∪ {a₁} is an optimal solution of S. PROOF: Suppose ∃ an optimal solution B″ of S that includes a₁, and let its size be |B″| = k″. Thanks to: Navneet Kaur (22), MCA 2012

ACTIVITY SELECTION PROBLEM. Now we have to prove two things: I. B is feasible. II. |B| = k″, or equivalently |B′| = k″ − 1. Thanks to: Navneet Kaur (22), MCA 2012

ACTIVITY SELECTION PROBLEM. Proof of I: B′ is a subset of S′, and every activity in S′ is compatible with a₁. Hence B = B′ ∪ {a₁} is feasible. Thanks to: Navneet Kaur (22), MCA 2012

ACTIVITY SELECTION PROBLEM. Proof of II: Consider the set B″ − {a₁}. (i) Can |B′| ≥ k″? If yes, then |B′ ∪ {a₁}| ≥ k″ + 1. Thanks to: Navneet Kaur (22), MCA 2012

ACTIVITY SELECTION PROBLEM. But this contradicts the optimality of B″, because |B″| = k″: if an optimal solution has size k″, there can be no feasible solution of size k″ + 1. Hence statement (i) is impossible, so |B′| ≤ k″ − 1. Thanks to: Navneet Kaur (22), MCA 2012

ACTIVITY SELECTION PROBLEM. (ii) Can |B′| < k″ − 1? Consider B″ − {a₁}: it contains k″ − 1 activities and is a feasible solution of S′, which implies OPT(S′) ≥ k″ − 1. Since B′ is optimal for S′, |B′| = OPT(S′) ≥ k″ − 1. Hence statement (ii) is impossible. Thanks to: Navneet Kaur (22), MCA 2012

ACTIVITY SELECTION PROBLEM. From (i) and (ii) we get |B′| = k″ − 1, which implies |B| = k″. Hence B is optimal. Thanks to: Navneet Kaur (22), MCA 2012
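Cases (i) and (ii) can be condensed into one chain of inequalities (my restatement of the slides' argument, in LaTeX; the left inequality is case (ii), the right one case (i)):

    k'' - 1 \;=\; \lvert B'' \setminus \{a_1\}\rvert \;\le\; \mathrm{OPT}(S') \;=\; \lvert B'\rvert \;\le\; k'' - 1
    \quad\Longrightarrow\quad \lvert B\rvert = \lvert B'\rvert + 1 = k''.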

Activity Selection Problem. Statement: ∃ an optimal solution to the problem that contains a₁. Proof: Let A = {ak, …} be an optimal solution, where ak is the first activity in A, i.e., the activity of A with the least finishing time. Construct another solution: B = A − {ak} ∪ {a₁} = {a₁, …}.

Activity Selection Problem. Proof continued… Since a₁ has the smallest finish time among all activities, clearly f₁ ≤ fk, so a₁ cannot clash with the later activities of A; thus B is a set of compatible activities of the same size as A, hence an optimal solution too.

Activity Selection Problem. Statement: The greedy solution is globally optimal. Proof: Suppose the problem has an optimal solution B = {a₁, …} containing k + 1 elements, a₁ being the first element. Then B − {a₁} is an optimal solution of the remaining subproblem, with k elements.

Activity Selection Problem. Proof continued… Now suppose that, for the subproblem solved by B′ = B − {a₁}, ∃ another solution containing more than k elements. Then adjoining a₁ to it gives a solution B* with more than k + 1 elements. This contradicts our assumption that an optimal solution has k + 1 elements, so B′ is optimal for the subproblem.

FRACTIONAL KNAPSACK PROBLEM. Given a set S of n items, each item i with value vi and weight wi, and a knapsack with capacity W. Aim: pick items with maximum total value, with total weight at most W. You may choose fractions of items.

GREEDY APPROACH. Pick the items in decreasing order of value per unit weight vi/wi, i.e., highest ratio first.
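A minimal Python sketch of this rule (the function name and the (value, weight) tuple format are my own, not from the slides):

    def fractional_knapsack(items, capacity):
        """Greedy fractional knapsack: consider items in decreasing order of value per
        unit weight; take each whole item while it fits, then a fraction of the next."""
        total_value = 0.0
        for value, weight in sorted(items, key=lambda it: it[0] / it[1], reverse=True):
            if capacity <= 0:
                break
            take = min(weight, capacity)            # whole item, or whatever fraction fits
            total_value += value * (take / weight)  # fractional credit for a partial item
            capacity -= take
        return total_value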

Example. Knapsack capacity 50. Item 1: w1 = 10, v1 = 60, v1/w1 = 6. Item 2: w2 = 20, v2 = 100, v2/w2 = 5. Item 3: w3 = 30, v3 = 120, v3/w3 = 4. The greedy approach takes all of item 1 (value 60), then all of item 2 (value 100); the remaining capacity of 20 is filled with 20/30 of item 3, worth $80. Total value = 60 + 100 + 80 = 240. Thanks to: Neha Katyal
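Running the fractional_knapsack sketch above (a hypothetical helper, not part of the slides) on these numbers reproduces the same total:

    items = [(60, 10), (100, 20), (120, 30)]   # (value, weight) for items 1, 2, 3
    print(fractional_knapsack(items, 50))       # 240.0: items 1 and 2 whole, plus 20/30 of item 3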

Up Next Dynamic Programming

ACTIVITY SELECTION PROBLEM. Options that could be followed while scheduling the jobs: Shortest Job First. E.g., three jobs to be scheduled:
    Job 1: start = 5, end = 10
    Job 2: start = 1, end = 7
    Job 3: start = 8, end = 15
Shortest Job First would schedule only Job 1, but the optimal algorithm would have scheduled two jobs, Job 2 and Job 3. So this approach does not work. Thanks to: Navneet Kaur (22), MCA 2012

Next option that could be followed while scheduling the jobs: smallest start time first. E.g., three jobs to be scheduled:
    Job 1: start = 1, end = 20
    Job 2: start = 2, end = 7
    Job 3: start = 8, end = 15
Smallest start time first would schedule only Job 1, but the optimal algorithm would have scheduled two jobs, Job 2 and Job 3. So this approach also does not work. Thanks to: Navneet Kaur (22), MCA 2012
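For comparison, the earliest-finish-time sketch given earlier (the hypothetical select_activities function) schedules two jobs on both counterexamples, matching the optimal answer quoted above:

    jobs_shortest_first_fails = [(5, 10), (1, 7), (8, 15)]   # shortest-job-first picks only (5, 10)
    jobs_start_time_fails     = [(1, 20), (2, 7), (8, 15)]   # smallest-start-time picks only (1, 20)
    print(select_activities(jobs_shortest_first_fails))      # [(1, 7), (8, 15)]
    print(select_activities(jobs_start_time_fails))          # [(2, 7), (8, 15)]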