Algorithm Design Methods 황승원 Fall 2011 CSE, POSTECH

2 Optimization Problem A problem in which some function (called the optimization or objective function) is to be optimized (usually minimized or maximized) subject to some constraints.

3 Machine Scheduling
Find a schedule that minimizes the finish time.
- Optimization function: finish time
- Constraints: each job is scheduled continuously on a single machine for an amount of time equal to its processing requirement; no machine processes more than one job at a time

4 Bin Packing
Pack items into bins using the fewest number of bins.
- Optimization function: number of bins
- Constraints: each item is packed into a single bin; the capacity of no bin is exceeded

5 Min Cost Spanning Tree
Find a spanning tree that has minimum cost.
- Optimization function: sum of edge costs
- Constraints: must select n-1 edges of the given n-vertex graph; the selected edges must form a tree

6 Feasible And Optimal Solutions A feasible solution is a solution that satisfies the constraints. An optimal solution is a feasible solution that optimizes the objective/optimization function.

7 Greedy Method Solve the problem by making a sequence of decisions. Decisions are made one by one in some order. Each decision is made using a greedy criterion. A decision, once made, is (usually) not changed later.
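The loop below is a minimal sketch of that general shape (my own illustration, not code from the slides; the names `candidates`, `key`, and `feasible` are assumptions): repeatedly consider the choice that looks best under some greedy criterion, keep it only if it preserves feasibility, and never revisit the decision.

```python
# Schematic greedy loop (illustrative sketch, not from the slides).
# `candidates` is the set of choices, `key` defines the greedy order,
# and `feasible` checks that a partial solution still satisfies the constraints.
def greedy(candidates, key, feasible):
    solution = []
    # Consider candidates in the order given by the greedy criterion.
    for c in sorted(candidates, key=key):
        if feasible(solution + [c]):
            solution.append(c)  # a decision, once made, is not changed later
    return solution
```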

8 Machine Scheduling LPT Scheduling (Longest Processing Time first) Schedule jobs one by one, in decreasing order of processing time. Each job is scheduled on the machine on which it finishes earliest. Scheduling decisions are made serially using a greedy criterion (minimize the finish time of this job). LPT scheduling is an application of the greedy method.
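A minimal LPT sketch (my own illustration, not code from the slides): jobs are taken in decreasing order of processing time and each one goes to the machine that currently finishes earliest, tracked with a min-heap.

```python
import heapq

# Illustrative LPT sketch: assign each job, in decreasing order of processing
# time, to the machine that currently finishes earliest.
def lpt_schedule(times, m):
    # Heap of (current finish time, machine index), one entry per machine.
    machines = [(0, i) for i in range(m)]
    heapq.heapify(machines)
    assignment = {}
    for job, t in sorted(enumerate(times), key=lambda x: -x[1]):
        finish, i = heapq.heappop(machines)  # machine that finishes earliest
        assignment[job] = i
        heapq.heappush(machines, (finish + t, i))
    makespan = max(f for f, _ in machines)
    return makespan, assignment

# Example: 7 jobs on 3 machines.
print(lpt_schedule([2, 14, 4, 16, 6, 5, 3], 3))
```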

9 LPT Schedule
The LPT rule does not guarantee minimum finish time schedules:
(LPT Finish Time) / (Minimum Finish Time) <= 4/3 - 1/(3m), where m is the number of machines.
Minimum finish time scheduling is NP-hard, so here the greedy method does not yield an optimal solution. It does, however, give us a good heuristic for machine scheduling.

10 Container Loading A ship has capacity c. m containers are available for loading. The weight of container i is w_i, and each weight is a positive number. The sum of the container weights exceeds c. Load as many containers as possible without sinking the ship.

11 Greedy Solution Load containers in increasing order of weight until we get to a container that doesn't fit. Does this greedy algorithm always load the maximum number of containers? Yes (for the proof, see the text).
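A minimal sketch of this greedy rule (my own illustration; the names are assumptions, not from the slides):

```python
# Illustrative sketch of the greedy container-loading rule:
# take containers in increasing order of weight while they still fit.
def load_containers(weights, c):
    loaded = []
    remaining = c
    for i, w in sorted(enumerate(weights), key=lambda x: x[1]):
        if w > remaining:
            break  # the next-lightest container no longer fits
        loaded.append(i)
        remaining -= w
    return loaded

# Example: capacity 400; loads the six lightest containers.
print(load_containers([100, 200, 50, 90, 150, 50, 20, 80], 400))
```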

12 A deserted island?

13 0/1 Knapsack Problem

14 0/1 Knapsack Problem A hiker wishes to take n items on a trip. The weight of item i is w_i and its profit is p_i. The items are to be carried in a knapsack whose weight capacity is c. When the sum of the item weights is <= c, all n items can be carried in the knapsack; when it is > c, some items must be left behind. Which items should be taken (and which left behind) to maximize profit?

15 0/1 Knapsack Problem
All weights and profits are positive numbers. The hiker wants to select a subset of the n items to take.
- The weight of the subset should not exceed the capacity of the knapsack. (constraint)
- A fraction of an item cannot be selected. (constraint)
- The profit/value of the subset is the sum of the profits of the selected items. (optimization function)
- The profit/value of the selected subset should be maximum. (optimization criterion)

16 0/1 Knapsack Problem
Let x_i = 1 when item i is selected and x_i = 0 when item i is not selected.
maximize Σ_{i=1}^{n} p_i x_i
subject to Σ_{i=1}^{n} w_i x_i <= c and x_i = 0 or 1 for all i
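As a sanity check on this formulation, the brute-force sketch below (my own illustration, exponential in n and only for tiny instances) evaluates every 0/1 vector x and returns the best feasible profit; this is the "best value" that the later slides compare the greedy heuristics against.

```python
from itertools import product

# Exhaustive evaluation of the 0/1 knapsack formulation (illustrative only;
# this is O(2^n) and meant for tiny instances, not a practical algorithm).
def knapsack_best_value(p, w, c):
    best = 0
    for x in product((0, 1), repeat=len(p)):          # every 0/1 vector x
        if sum(wi * xi for wi, xi in zip(w, x)) <= c:  # weight constraint
            best = max(best, sum(pi * xi for pi, xi in zip(p, x)))
    return best

# Example instance used on the next slide.
print(knapsack_best_value([2, 10], [3, 6], 7))  # optimal profit is 10
```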

17 Greedy Attempt 1
Be greedy on capacity utilization: select items in increasing order of weight.
Counterexample: n = 2, c = 7, w = [3, 6], p = [2, 10]. Only item 1 is selected, so the profit (value) of the selection is 2. Not the best selection (item 2 alone gives 10)!

18 Greedy Attempt 2
Be greedy on profit earned: select items in decreasing order of profit.
Counterexample: n = 3, c = 7, w = [7, 3, 2], p = [10, 8, 6]. Only item 1 is selected, so the profit (value) of the selection is 10. Not the best selection (items 2 and 3 together give 14)!

19 Greedy Attempt 3
Be greedy on profit density (p/w): select items in decreasing order of profit density.
Counterexample: n = 2, c = 7, w = [1, 7], p = [10, 20]. Only item 1 is selected, so the profit (value) of the selection is 10. Not the best selection (item 2 alone gives 20)!
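The small sketch below (my own illustration; the function and variable names are assumptions) runs one parametrized greedy pass under each of the three orderings on its counterexample instance, reproducing the failures above.

```python
# One greedy pass over the items in a given order, keeping every item that
# still fits (illustrative sketch of Greedy Attempts 1-3, not slide code).
def greedy_knapsack(p, w, c, order):
    remaining, value = c, 0
    for i in order:
        if w[i] <= remaining:
            remaining -= w[i]
            value += p[i]
    return value

# Attempt 1: increasing weight -> value 2 (optimum is 10)
print(greedy_knapsack([2, 10], [3, 6], 7,
                      sorted(range(2), key=lambda i: [3, 6][i])))
# Attempt 2: decreasing profit -> value 10 (optimum is 14)
print(greedy_knapsack([10, 8, 6], [7, 3, 2], 7,
                      sorted(range(3), key=lambda i: -[10, 8, 6][i])))
# Attempt 3: decreasing profit density -> value 10 (optimum is 20)
print(greedy_knapsack([10, 20], [1, 7], 7,
                      sorted(range(2), key=lambda i: -[10, 20][i] / [1, 7][i])))
```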

20 Greedy Attempt 3
Be greedy on profit density (p/w).
- This works when selecting a fraction of an item is permitted (the fractional knapsack problem).
- Select items in decreasing order of profit density; if the next item doesn't fit, take a fraction of it so as to fill the knapsack.
Example: n = 2, c = 7, w = [1, 7], p = [10, 20]. Item 1 and 6/7 of item 2 are selected.
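A minimal fractional-knapsack sketch under those rules (my own illustration, not slide code):

```python
# Illustrative sketch of the fractional greedy rule: take whole items in
# decreasing order of profit density, then a fraction of the first item
# that no longer fits.
def fractional_knapsack(p, w, c):
    value, remaining = 0.0, c
    for pi, wi in sorted(zip(p, w), key=lambda x: -x[0] / x[1]):
        if wi <= remaining:
            value += pi
            remaining -= wi
        else:
            value += pi * remaining / wi  # take only the fraction that fits
            break
    return value

# Example from the slide: item 1 plus 6/7 of item 2, value 10 + 20 * 6/7.
print(fractional_knapsack([10, 20], [1, 7], 7))
```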

21 0/1 Knapsack Greedy Heuristics Select a subset with <= k items. If the weight of this subset is > c, discard the subset. If the subset weight is <= c, fill as much of the remaining capacity as possible by being greedy on profit density. Try all subsets with <= k items and select the one that yields maximum profit.
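A compact sketch of this heuristic (my own illustration; the names and structure are assumptions, not taken from the textbook): seed with every subset of at most k items, extend each feasible seed greedily by profit density, and keep the best result.

```python
from itertools import combinations

# Illustrative sketch of the k-subset greedy heuristic: try every seed subset
# of at most k items, then fill the rest greedily by profit density.
def knapsack_k_greedy(p, w, c, k):
    n = len(p)
    by_density = sorted(range(n), key=lambda i: -p[i] / w[i])
    best = 0
    for size in range(k + 1):
        for seed in combinations(range(n), size):
            weight = sum(w[i] for i in seed)
            if weight > c:
                continue                  # infeasible seed: discard it
            value = sum(p[i] for i in seed)
            chosen = set(seed)
            for i in by_density:          # greedy completion of the seed
                if i not in chosen and weight + w[i] <= c:
                    chosen.add(i)
                    weight += w[i]
                    value += p[i]
            best = max(best, value)
    return best

# With k = 1 this already fixes the earlier counterexample (optimum 20).
print(knapsack_k_greedy([10, 20], [1, 7], 7, 1))
```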

22 0/1 Knapsack Greedy Heuristics
(best value - greedy value) / (best value) <= 1/(k+1)
A larger k gives a more accurate but more expensive heuristic, i.e., O(n^k).
[Table in the original slide: number of solutions (out of 600) within x% of best.]

23 Implementing 0/1 Knapsack Greedy Heuristics
There are O(n^k) subsets with at most k items. Trying a subset takes O(n) time, so the total time is O(n^(k+1)) when k > 0.
(best value - greedy value) / (best value) <= 1/(k+1)

24 Coming Up Next
READING: Ch 18.1~
NEXT: Shortest Path Problem (Ch )