Greedy Algorithms Many of the slides are from Prof. Plaisted’s resources at University of North Carolina at Chapel Hill.


Overview
- Like dynamic programming, greedy algorithms are used to solve optimization problems.
- Dynamic programming can be overkill; greedy algorithms tend to be easier to code.
- Problems exhibit optimal substructure (as in DP).
- Problems also exhibit the greedy-choice property:
  » When we have a choice to make, make the one that looks best right now.
  » Make a locally optimal choice in the hope of obtaining a globally optimal solution.

Activity-Selection Problem
- Input: a set S of n activities, a_1, a_2, …, a_n.
  » s_i = start time of activity i.
  » f_i = finish time of activity i.
- Output: a subset A containing a maximum number of mutually compatible activities.
  » Two activities are compatible if their intervals don't overlap.
- Example: activities in each line are compatible.

Optimal Substructure
- Assume activities are sorted by finish time: f_1 ≤ f_2 ≤ … ≤ f_n.
- Suppose an optimal solution includes activity a_k. This generates two subproblems:
  » Selecting from a_1, …, a_{k-1}: activities compatible with one another that finish before a_k starts (i.e., compatible with a_k).
  » Selecting from a_{k+1}, …, a_n: activities compatible with one another that start after a_k finishes.
- The solutions to the two subproblems must themselves be optimal. Prove this using the cut-and-paste argument.

Recursive Solution
- Let S_ij = subset of activities in S that start after a_i finishes and finish before a_j starts.
- Subproblem: select a maximum number of mutually compatible activities from S_ij.
- Let c[i, j] = size of a maximum-size subset of mutually compatible activities in S_ij.
- Recursive solution:

    c[i, j] = 0                                          if S_ij = ∅
    c[i, j] = max_{a_k ∈ S_ij} { c[i, k] + c[k, j] + 1 } otherwise
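This recurrence for c[i, j] can be sketched in Python with memoization (a sketch, not slide code; it assumes activities are given as parallel start/finish lists and adds sentinel activities a_0 and a_{n+1} so that c[0, n+1] covers all of S):

```python
from functools import lru_cache

def max_compatible(starts, ends):
    """Size of a maximum set of mutually compatible activities, via the
    DP recurrence c[i, j] = max over a_k in S_ij of c[i, k] + c[k, j] + 1."""
    n = len(starts)
    # Sentinels: a_0 finishes before everything, a_{n+1} starts after everything.
    s = [float("-inf")] + list(starts) + [float("inf")]
    f = [float("-inf")] + list(ends) + [float("inf")]

    @lru_cache(maxsize=None)
    def c(i, j):
        best = 0
        # a_k is in S_ij iff it starts after a_i finishes and ends before a_j starts.
        for k in range(i + 1, j):
            if s[k] >= f[i] and f[k] <= s[j]:
                best = max(best, c(i, k) + c(k, j) + 1)
        return best

    return c(0, n + 1)
```

On the 11-activity example from CLRS section 16.1, this returns 4, matching the greedy solution; the point of the next slides is that the full DP is unnecessary.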

Greedy-Choice Property
- The problem also exhibits the greedy-choice property:
  » There is an optimal solution to the subproblem S_ij that includes the activity with the smallest finish time in S_ij.
- Hence, there is an optimal solution to S that includes a_1.
- Therefore, make this greedy choice without solving subproblems first and evaluating them:
  » Solve the one subproblem that ensues from making the greedy choice.
  » Combine the greedy choice with the solution to that subproblem.

Recursive Algorithm

Recursive-Activity-Selector(s, f, i, j)
1. m ← i + 1
2. while m < j and s_m < f_i
3.     do m ← m + 1
4. if m < j
5.     then return {a_m} ∪ Recursive-Activity-Selector(s, f, m, j)
6.     else return ∅

Initial call: Recursive-Activity-Selector(s, f, 0, n+1).
Complexity: Θ(n).
It is straightforward to convert the algorithm to an iterative one.
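A direct Python translation of the pseudocode above (a sketch; the wrapper name and the sentinel convention f_0 = 0, s_{n+1} = ∞ are assumptions matching the initial call with indices 0 and n+1):

```python
def recursive_activity_selector(s, f, i, j):
    """Return indices of a maximum set of mutually compatible activities
    in S_ij. Assumes activities are sorted by finish time."""
    m = i + 1
    # Skip activities that start before a_i finishes.
    while m < j and s[m] < f[i]:
        m += 1
    if m < j:
        return [m] + recursive_activity_selector(s, f, m, j)
    return []

def activity_selector(starts, ends):
    """Wrapper that adds sentinels a_0 and a_{n+1} and makes the initial call."""
    n = len(starts)
    s = [0] + list(starts) + [float("inf")]
    f = [0] + list(ends) + [float("inf")]
    return recursive_activity_selector(s, f, 0, n + 1)
```

On the CLRS example the wrapper returns the activity indices [1, 4, 8, 11], and each call advances m, so the whole recursion scans the input once: Θ(n) after sorting.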

Typical Steps
- Cast the optimization problem as one in which we make a choice and are left with one subproblem to solve.
- Prove that there is always an optimal solution that makes the greedy choice, so that the greedy choice is always safe.
- Show that the greedy choice plus an optimal solution to the subproblem yields an optimal solution to the whole problem.
- Make the greedy choice and solve top-down.
- May have to preprocess the input to put it into greedy order.
  » Example: sorting activities by finish time.

Activity Selection: A Greedy Algorithm
- So the actual algorithm is simple:
  » Sort the activities by finish time.
  » Schedule the first activity.
  » Then schedule the next activity in the sorted list that starts after the previous activity finishes.
  » Repeat until no more activities remain.
- The intuition is even simpler:
  » Always pick the available ride that will end soonest.


Analysis of the Algorithm
- The activity selected for consideration is always the one with the earliest finish time.
- Why does this work? Intuitively, it always leaves the maximum time possible in which to schedule more activities; the greedy choice maximizes the amount of unscheduled time remaining.
- What is the space and time complexity? Sorting by finish time costs O(n log n), and the single greedy pass is Θ(n), so the total time is O(n log n), using O(n) space.

Elements of the Greedy Strategy
- Sometimes a greedy strategy yields an optimal solution and sometimes it does not.
- There is no general way to tell whether the greedy strategy will yield an optimal solution.
- Two ingredients are usually necessary:
  » the greedy-choice property
  » optimal substructure

Greedy-Choice Property
- A globally optimal solution can be arrived at by making a locally optimal (greedy) choice.
- Unlike dynamic programming, we solve the problem in a top-down manner.
- We must prove that the greedy choices result in a globally optimal solution.

Optimal Substructure
- As in dynamic programming, the optimal solution must contain within it optimal solutions to subproblems.
- Given a choice between a greedy algorithm and a dynamic programming algorithm for the same problem, which would you generally choose? (The greedy one, when it is known to be correct: it is typically simpler and faster.)

Greedy versus Dynamic Programming
- Both greedy algorithms and dynamic programming exploit the optimal-substructure property.
- Optimal substructure: a problem exhibits optimal substructure if an optimal solution to the problem contains within it optimal solutions to subproblems.

Task Scheduling
- Given: a set T of n tasks, each having:
  » a start time, s_i
  » a finish time, f_i (where s_i < f_i)
- Goal: perform all the tasks using a minimum number of "machines."

Task Scheduling Algorithm
- Greedy choice: consider the tasks in order of start time, assigning each task to a machine that is already free at its start time and allocating a new machine only when none is free.

Task Scheduling Algorithm: Running Time
- Given a set of n tasks specified by their start and finish times, algorithm TaskSchedule produces a schedule of the tasks with the minimum number of machines in O(n log n) time.
  » Use a heap-based priority queue to store the tasks, with start times as the priorities.
  » Finding the task with the earliest start time then takes O(log n) time.
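A compact sketch of this greedy scheduler (the function name is an assumption; it processes tasks in start-time order and keeps one heap entry per machine, keyed by the time that machine becomes free):

```python
import heapq

def task_schedule(tasks):
    """tasks: list of (start, finish) pairs.
    Returns the minimum number of machines needed to run them all."""
    machines = []  # min-heap of finish times, one entry per machine in use
    for start, finish in sorted(tasks):          # consider tasks by start time
        if machines and machines[0] <= start:
            heapq.heapreplace(machines, finish)  # reuse the machine that frees up first
        else:
            heapq.heappush(machines, finish)     # no machine is free: allocate a new one
    return len(machines)
```

Sorting is O(n log n) and each task does one O(log n) heap operation, matching the stated bound.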

Example
- Given: a set T of n tasks, each with a start time s_i and a finish time f_i (where s_i < f_i):
  [1,4], [1,3], [2,5], [3,7], [4,7], [6,9], [7,8]  (ordered by start time)
- Goal: perform all tasks on the minimum number of machines.
- Processing the tasks in this order and always reusing a machine that is free yields three machines:
  » Machine 1: [1,4], [4,7], [7,8]
  » Machine 2: [1,3], [3,7]
  » Machine 3: [2,5], [6,9]