Greedy Algorithms (Chap. 16)

Greedy Algorithms (Chap. 16) Optimization problems can be solved by dynamic programming, but that is sometimes overkill. A greedy algorithm is greedy for the local optimum at each step, in the hope that this leads to a globally optimal solution; it does not always work, but in many situations it does.

An Activity-Selection Problem Suppose a set of activities S = {a1, a2, ..., an} that compete for a resource, such as a lecture hall that can hold one lecture at a time. Each activity ai has a start time si and a finish time fi, with 0 ≤ si < fi < ∞. Activities ai and aj are compatible if the intervals [si, fi) and [sj, fj) do not overlap. Goal: select a maximum-size subset of mutually compatible activities. We start from dynamic programming, then derive the greedy algorithm, and see the relation between the two.

DP solution – step 1 Optimal substructure of the activity-selection problem. Assume the activities are sorted by finish time, so that f1 ≤ f2 ≤ ... ≤ fn. Define Sij = {ak : fi ≤ sk < fk ≤ sj}, i.e., all activities that start after ai finishes and finish before aj starts. Define two fictitious activities a0 with f0 = 0 and an+1 with sn+1 = ∞, so that f0 ≤ f1 ≤ ... ≤ fn+1. Then an optimal solution to Sij that includes ak contains within it optimal solutions to Sik and Skj.

DP solution – step 2 A recursive solution. Let c[i, j] be the number of activities in a maximum-size subset of mutually compatible activities of Sij, giving a table c[0..n+1, 0..n+1]; the answer to the whole problem is c[0, n+1]. The recurrence is:
c[i, j] = 0, if Sij = ∅
c[i, j] = max{ c[i, k] + c[k, j] + 1 : i < k < j and ak ∈ Sij }, if Sij ≠ ∅
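
Below is a minimal Python sketch of this recurrence, written top-down with memoization; the function and variable names are my own illustration, not code from the chapter. It assumes activities are given as (start, finish) pairs and adds the fictitious a0 and an+1 explicitly.

from functools import lru_cache

def max_compatible(activities):
    # Sort by finish time and add fictitious a0 (finish 0) and a_{n+1} (start = infinity).
    a = [(0, 0)] + sorted(activities, key=lambda x: x[1]) + [(float('inf'), float('inf'))]
    n = len(activities)

    @lru_cache(maxsize=None)
    def c(i, j):
        # c(i, j) = size of a maximum subset of mutually compatible activities in S_ij.
        best = 0
        for k in range(i + 1, j):
            # a_k is in S_ij if it starts after a_i finishes and finishes before a_j starts.
            if a[k][0] >= a[i][1] and a[k][1] <= a[j][0]:
                best = max(best, c(i, k) + c(k, j) + 1)
        return best

    return c(0, n + 1)

# Example instance: four mutually compatible activities can be selected.
print(max_compatible([(1, 4), (3, 5), (0, 6), (5, 7), (3, 9), (5, 9),
                      (6, 10), (8, 11), (8, 12), (2, 14), (12, 16)]))   # 4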

Converting the DP Solution to a Greedy Solution Theorem 16.1: Consider any nonempty subproblem Sij, and let am be the activity in Sij with the earliest finish time, fm = min{fk : ak ∈ Sij}. Then: (1) activity am is used in some maximum-size subset of mutually compatible activities of Sij; (2) the subproblem Sim is empty, so choosing am leaves Smj as the only subproblem that may be nonempty. Proof of the theorem (sketch): for (1), take a maximum-size subset of Sij and let ak be its earliest-finishing activity; if ak = am we are done, otherwise swapping ak for am keeps the subset compatible (since fm ≤ fk) without changing its size. For (2), any activity in Sim would finish no later than sm < fm while also belonging to Sij, contradicting the choice of am as the earliest-finishing activity of Sij.

Top-Down Rather Than Bottom-Up To solve Sij, choose am, the activity in Sij with the earliest finish time, and then solve Smj (Sim is empty). An optimal solution to Smj is certainly part of an optimal solution to Sij, so there is no need to solve Smj ahead of Sij as in bottom-up DP. The subproblems follow the pattern Si,n+1.
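
A short Python sketch of this top-down greedy scheme (my own rendering, not code from the slides); it assumes the activities are given as (start, finish) pairs sorted by finish time, so that the first compatible activity found is the one with the earliest finish.

def recursive_selector(activities, last_finish=0):
    # Greedy choice: scan for the earliest-finishing activity compatible with
    # what has been chosen so far, then recurse only on the activities after it
    # (the subproblem S_{m, n+1}).
    for idx, (s, f) in enumerate(activities):
        if s >= last_finish:
            return [(s, f)] + recursive_selector(activities[idx + 1:], f)
    return []

acts = sorted([(1, 4), (3, 5), (0, 6), (5, 7), (8, 11), (12, 16)], key=lambda a: a[1])
print(recursive_selector(acts))   # [(1, 4), (5, 7), (8, 11), (12, 16)]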

Optimal Solution Properties In DP, an optimal solution depends on how many subproblems the problem is divided into (here, 2 subproblems) and how many choices there are in determining which subproblems to use (here, j − i − 1 choices). However, Theorem 16.1 reduces both significantly: one subproblem (the other is guaranteed to be empty) and one choice, namely the activity with the earliest finish time in Sij. Moreover, we can solve top-down rather than bottom-up as in DP. There is a pattern to the subproblems we solve (Sm,n+1 after Sij) and a pattern to the activities we choose (the one with the earliest finish time). With this locally optimal choice, we in fact obtain a globally optimal solution.

Elements of the greedy strategy: (1) determine the optimal substructure; (2) develop a recursive solution; (3) prove that the greedy choice is safe, i.e., it is one of the choices that leads to an optimal solution; (4) show that all but one of the subproblems are empty after the greedy choice; (5) develop a recursive algorithm that implements the greedy strategy; (6) convert the recursive algorithm to an iterative one.
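
For step (6), the recursion above unrolls into a single pass over the activities once they are sorted by finish time. A minimal Python sketch (identifiers are my own illustration):

def greedy_activity_selector(activities):
    activities = sorted(activities, key=lambda a: a[1])   # sort by finish time: O(n log n)
    selected = []
    last_finish = 0
    for s, f in activities:
        if s >= last_finish:          # compatible with everything chosen so far
            selected.append((s, f))
            last_finish = f
    return selected

print(greedy_activity_selector([(3, 5), (1, 4), (5, 7), (0, 6), (8, 11)]))
# [(1, 4), (5, 7), (8, 11)]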

Greedy vs. DP: the knapsack problem There are two variants: fractional knapsack and 0/1 knapsack. Given items I1(v1, w1), I2(v2, w2), ..., In(vn, wn) and a maximum weight W that can be carried, find the selection of items that maximizes total value. Fractional knapsack: greedy algorithm, O(n log n). 0/1 knapsack: DP, O(nW). Question: 0/1 knapsack is an NP-complete problem, so why is there an O(nW) algorithm?
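
A sketch contrasting the two approaches (my own illustrative code, not from the slides): greedy by value density solves the fractional version, and a table indexed by capacity solves 0/1 in O(nW). Note that O(nW) is only pseudo-polynomial, since the value W is exponential in the number of bits used to encode it.

def fractional_knapsack(items, W):
    # items: list of (value, weight); take items greedily by value/weight ratio.
    total, remaining = 0.0, W
    for v, w in sorted(items, key=lambda it: it[0] / it[1], reverse=True):
        take = min(w, remaining)
        total += v * take / w
        remaining -= take
        if remaining == 0:
            break
    return total

def zero_one_knapsack(items, W):
    # dp[c] = best value achievable with capacity c using the items seen so far.
    dp = [0] * (W + 1)
    for v, w in items:
        for c in range(W, w - 1, -1):   # descend so each item is used at most once
            dp[c] = max(dp[c], dp[c - w] + v)
    return dp[W]

items = [(60, 10), (100, 20), (120, 30)]
print(fractional_knapsack(items, 50))   # 240.0 (all of items 1-2, two thirds of item 3)
print(zero_one_knapsack(items, 50))     # 220   (items 2 and 3)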

Typical traditional problems with greedy solutions:
Coin changing: denominations 25, 10, 5, 1 (greedy is optimal); how about 7, 5, 1?
Minimum spanning tree: Prim's algorithm – begin from any node, and each time add the new node closest to the existing subtree; Kruskal's algorithm – sort the edges by weight, and each time add the next edge that does not create a cycle.
Single-source shortest paths: Dijkstra's algorithm.
Huffman coding.
Optimal merge.
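
A small Python sketch of the coin-changing contrast (my own illustration): the greedy cashier's algorithm is optimal for the canonical system 25/10/5/1, but not for 7/5/1 – for 10 cents it returns 7+1+1+1 (4 coins) while 5+5 (2 coins) is optimal.

def greedy_change(amount, coins):
    # Repeatedly take the largest coin that still fits.
    used = []
    for c in sorted(coins, reverse=True):
        while amount >= c:
            used.append(c)
            amount -= c
    return used

print(greedy_change(30, [25, 10, 5, 1]))   # [25, 5]      -- optimal (2 coins)
print(greedy_change(10, [7, 5, 1]))        # [7, 1, 1, 1] -- 4 coins; 5+5 needs only 2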