Lecture on Design and Analysis of Computer Algorithm



Greedy Method

Greedy Method: Definition An algorithm that always takes the best immediate, or local, choice while finding an answer. Greedy algorithms find the overall, or globally, optimal solution for some optimization problems, but may find less-than-optimal solutions for some instances of other problems.

Example of Greedy Method (1/4) Prim's algorithm and Kruskal's algorithm are greedy algorithms that find a globally optimal solution, a minimum spanning tree. In contrast, no known greedy algorithm for finding a Hamiltonian cycle is guaranteed to find the shortest one, that is, to solve the traveling salesman problem. Dijkstra's algorithm for single-source shortest paths is another example of a greedy algorithm that finds an optimal solution.

Example of Greedy Method (2/4) If there is no greedy algorithm that always finds the optimal solution for a problem, one may have to search (exponentially) many possible solutions to find the optimum. Greedy algorithms are usually quicker, since they do not explore alternative choices.

Example of Greedy Method (3/4) Consider the problem of making change: coins of values 25c, 10c, 5c and 1c; return 63c in change. Which coins? Use the greedy strategy: select the largest coin whose value is no greater than 63c; subtract its value (25c) from 63, getting 38; find the largest coin that fits the remainder, and repeat until done.

Example of Greedy Method (4/4) At each individual stage, select the option that is "locally optimal" in some particular sense. The greedy strategy for making change works because of a special property of the coin system. What if the coins were 1c, 5c and 11c and we needed to make change of 15c? The greedy strategy would select one 11c coin followed by four 1c coins (five coins in all); better: three 5c coins.

Problem: Make change for a given amount using the smallest possible number of coins.
MAKE-CHANGE(n)
    C ← {100, 25, 10, 5, 1}        // denominations (constant)
    S ← {}                         // set that will hold the solution
    sum ← 0                        // sum of the items in the solution set
    WHILE sum ≠ n
        x ← largest item in C such that sum + x ≤ n
        IF no such item THEN
            RETURN "No Solution"
        S ← S ∪ {x}
        sum ← sum + x
    RETURN S
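A minimal executable sketch of the same greedy idea, assuming US-style denominations by default; the function name greedy_make_change and the return convention (a list of coins, or None when the amount cannot be reached) are illustrative, not from the slides.

```python
def greedy_make_change(amount, denominations=(25, 10, 5, 1)):
    """Repeatedly take the largest coin that still fits until the amount is reached."""
    coins = []
    remaining = amount
    for coin in sorted(denominations, reverse=True):
        while coin <= remaining:
            coins.append(coin)
            remaining -= coin
    return coins if remaining == 0 else None   # None: no solution with these coins

# 63c with US coins: greedy is optimal -> [25, 25, 10, 1, 1, 1]
print(greedy_make_change(63))
# 15c with 1c, 5c, 11c coins: greedy uses 5 coins -> [11, 1, 1, 1, 1]; 3 x 5c would be better
print(greedy_make_change(15, denominations=(11, 5, 1)))
```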

Greedy Algorithm Start with a solution to a small subproblem and build up to a solution to the whole problem, making choices that look good in the short term. Disadvantage: greedy algorithms don't always work (short-term choices can be disastrous in the long term) and they are hard to prove correct. Advantage: greedy algorithms are fast when they do work, and they are simple and easy to implement.

Greedy Algorithm (general form)
Procedure GREEDY(A, n)
    // A(1:n) contains the n inputs
    solution ← ∅            // initialize the solution to empty
    for i ← 1 to n do
        x ← SELECT(A)
        if FEASIBLE(solution, x)
            then solution ← UNION(solution, x)
        endif
    repeat
    return (solution)
end GREEDY
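One way to read this template as running code; greedy, select and feasible are illustrative names, and the caller supplies the greedy choice and the feasibility test for a specific problem.

```python
def greedy(items, select, feasible):
    """Generic greedy template: repeatedly take the best-looking remaining item
    and keep it only if it leaves the partial solution feasible."""
    solution = []
    remaining = list(items)
    while remaining:
        x = select(remaining)        # the greedy choice
        remaining.remove(x)
        if feasible(solution, x):    # can x join the partial solution?
            solution.append(x)
    return solution

# Example use: coin changing for 63c, with a bounded supply of each coin
coins = [25] * 3 + [10] * 6 + [5] * 12 + [1] * 63
change = greedy(coins, select=max,
                feasible=lambda sol, x: sum(sol) + x <= 63)
print(change, sum(change))   # [25, 25, 10, 1, 1, 1] 63
```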

Activity-Selection Problem The problem is to select a maximum-size set of mutually compatible activities. Example: we have a set S = {1, 2, …, n} of n proposed activities that wish to use a resource, such as a lecture hall, which can be used by only one activity at a time.

Example (timeline figure omitted)
i    si   fi
1     0    6
2     3    5
3     1    4
4     2   13
5     3    8
6    12   14
7     8   11
8     8   12
9     6   10
10    5    7
11    5    9

Brute Force Try all possible solutions and choose the largest subset that is feasible. Inefficient: Θ(2^n) choices.
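A brute-force sketch over the example instance above, assuming the (start, finish) pairs from the Example slide; the function names are illustrative. It inspects subsets from largest to smallest, so the work is Θ(2^n) in the worst case.

```python
from itertools import combinations

# Activities (start, finish) from the Example slide, indexed 1..11
acts = {1: (0, 6), 2: (3, 5), 3: (1, 4), 4: (2, 13), 5: (3, 8), 6: (12, 14),
        7: (8, 11), 8: (8, 12), 9: (6, 10), 10: (5, 7), 11: (5, 9)}

def compatible(subset):
    """True if no two activities in the subset overlap in time."""
    iv = sorted(acts[i] for i in subset)
    return all(iv[k][1] <= iv[k + 1][0] for k in range(len(iv) - 1))

def brute_force():
    """Check subsets from largest to smallest and return the first compatible one."""
    for r in range(len(acts), 0, -1):
        for subset in combinations(acts, r):
            if compatible(subset):
                return subset
    return ()

print(brute_force())   # a maximum-size compatible set (4 activities for this instance)
```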

Greedy Approach Sort the activities by finish time. (Timeline figure omitted.)

Activity-Selection Problem Pseudocode
Greedy_Activity_Selector(s, f)
    n ← length[s]
    A ← {1}
    j ← 1
    for i ← 2 to n
        do if si ≥ fj
            then A ← A ∪ {i}
                 j ← i
    return A
It can schedule a set S of n activities in Θ(n) time, assuming that the activities are already sorted by finish time.
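The same selector as a runnable sketch, assuming the activities are supplied already sorted by finish time; the function name and the 0-based indexing are illustrative.

```python
def greedy_activity_selector(starts, finishes):
    """Greedy activity selection on activities pre-sorted by finish time.
    Returns the indices of a maximum-size set of mutually compatible activities."""
    selected = [0]                 # the first (earliest-finishing) activity always gets in
    last_finish = finishes[0]
    for i in range(1, len(starts)):
        if starts[i] >= last_finish:   # compatible with the last chosen activity
            selected.append(i)
            last_finish = finishes[i]
    return selected

# The example activities, sorted by finish time
starts   = [1, 3, 0, 5, 3, 5, 6, 8, 8, 2, 12]
finishes = [4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14]
print(greedy_activity_selector(starts, finishes))   # [0, 3, 7, 10] -> 4 activities
```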

Proving the greedy algorithm correct We assume that the input activities are ordered by increasing finish time, f1 ≤ f2 ≤ … ≤ fn. Since activity 1 has the earliest finish time, some optimal solution contains it: take any optimal solution and replace its first activity, say activity k, with activity 1; the result is still feasible and has the same size. (Timeline figure omitted.)

Proving (cont.) Having chosen activity 1, eliminate the activities whose start time is earlier than the finish time of activity 1; none of them can appear together with activity 1 in a solution. (Timeline figure omitted.)

Proving (cont.) Repeating the same argument on the remaining activities shows that the greedy algorithm produces an optimal solution. (Timeline figures omitted.)

Elements of the Greedy Strategy Question: how can one tell whether a greedy algorithm will solve a particular optimization problem? There is no general way to tell, but two ingredients are exhibited by most problems that lend themselves to a greedy strategy: the greedy-choice property and optimal substructure.

The Greedy Choice Property A globally optimal solution can be arrived at by making a locally optimal (greedy) choice: make whatever choice seems best at the moment. The choice may depend on the choices made so far, but it cannot depend on any future choices or on the solutions to subproblems.

Optimal Substructure An optimal solution to the problem contains within it optimal solutions to subproblems

Knapsack Problem We are given n objects and a knapsack. Object i has a weight wi and the knapsack has a capacity M. If a fraction xi, 0 ≤ xi ≤ 1, of object i is placed into the knapsack, then a profit of pi·xi is earned. The objective is to fill the knapsack so as to maximize the total profit while keeping the total weight of the chosen objects at most M: maximize Σ pi·xi subject to Σ wi·xi ≤ M and 0 ≤ xi ≤ 1, 1 ≤ i ≤ n.

Example Item 1 weighs 10 and is worth $60, Item 2 weighs 20 and is worth $100, Item 3 weighs 30 and is worth $120; the knapsack has capacity 50. (Figure omitted.)

Knapsack 0/1 Taking items 2 and 3 (weight 20 + 30 = 50) gives $100 + $120 = $220 in total; taking items 1 and 2 (weight 10 + 20) gives $60 + $100 = $160; taking items 1 and 3 (weight 10 + 30) gives $60 + $120 = $180.
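A small brute-force check of the 0/1 totals above (the function name is illustrative). Note that for the 0/1 version a greedy choice by value per pound would take item 1 first and reach only $160, while the optimum is $220; this is why the greedy method is applied to the fractional version instead.

```python
from itertools import combinations

# Items from the example: (weight, value); knapsack capacity 50
items = [(10, 60), (20, 100), (30, 120)]

def best_01_knapsack(items, capacity):
    """Brute-force 0/1 knapsack: try every subset and keep the most valuable
    one whose total weight fits in the knapsack."""
    best_value, best_subset = 0, ()
    for r in range(len(items) + 1):
        for subset in combinations(items, r):
            weight = sum(w for w, v in subset)
            value = sum(v for w, v in subset)
            if weight <= capacity and value > best_value:
                best_value, best_subset = value, subset
    return best_value, best_subset

print(best_01_knapsack(items, 50))   # (220, ((20, 100), (30, 120)))
```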

Fractional Knapsack Taking the items in order of greatest value per pound yields an optimal solution: all of item 1 (10 lb, $60), all of item 2 (20 lb, $100), and 20 of the 30 lb of item 3, worth $80, for a total of $240.

Optimal Substructure Both fractional knapsack and 0/1 knapsack have an optimal substructure.

Example Fractional Knapsack (cont.) There are 5 objects with the prices and weights listed below; the knapsack can hold at most 100 lbs.
Object:    1     2     3     4     5
Weight:   10    20    30    40    50
Price:   $20   $30   $66   $40   $60
Method 1: choose the lightest objects first. Total Weight = 10 + 20 + 30 + 40 = 100; Total Price = 20 + 30 + 66 + 40 = 156

Example Fractional Knapsack (cont.) Method 2: choose the most expensive objects first. Total Weight = 30 + 50 + 20 = 100 (only half of the 40-lb object fits); Total Price = 66 + 60 + 20 = 146

Example Fractional Knapsack (cont.) Method 3: choose the objects with the highest price/weight ratio first. Total Weight = 30 + 10 + 20 + 40 = 100 (80% of the 50-lb object fits); Total Price = 66 + 20 + 30 + 48 = 164, the best of the three methods. A sketch comparing the three strategies follows.
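A minimal check of the three strategies above, assuming the recovered weight/price table; the function name fractional_value and the (weight, price) tuple representation are illustrative, not from the slides.

```python
def fractional_value(order, capacity=100):
    """Total price obtained by filling the knapsack in the given object order,
    taking a fraction of the last object if it does not fully fit."""
    total = 0.0
    for weight, price in order:
        if capacity <= 0:
            break
        take = min(weight, capacity)       # possibly only a fraction of the object
        total += price * take / weight
        capacity -= take
    return total

objects = [(10, 20), (20, 30), (30, 66), (40, 40), (50, 60)]  # (weight, price)

print(fractional_value(sorted(objects)))                              # 156.0, lightest first
print(fractional_value(sorted(objects, key=lambda o: -o[1])))         # 146.0, most expensive first
print(fractional_value(sorted(objects, key=lambda o: -o[1] / o[0])))  # 164.0, best price/weight first
```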

More Examples on Fractional Knapsack Consider the following instance of the knapsack problem: n = 3, M = 20, (p1, p2, p3) = (25, 24, 15) and (w1, w2, w3) = (18, 15, 10). Four feasible solutions:
    (x1, x2, x3)         Σ wi·xi    Σ pi·xi
1)  (1/2, 1/3, 1/4)      16.5       24.25
2)  (1, 2/15, 0)         20         28.2
3)  (0, 2/3, 1)          20         31
4)  (0, 1, 1/2)          20         31.5
Solution 4, which the greedy-by-density strategy produces, has the greatest profit.

The Greedy Solution Define the density of object i to be pi/wi, its profit per unit of weight. Use as much of the high-density objects as possible; that is, process the objects in decreasing order of density. If the whole object fits, use all of it; if not, fill the remaining space with a fraction of the current object and discard the rest. First, sort the objects by nonincreasing density, so that pi/wi ≥ p(i+1)/w(i+1) for 1 ≤ i < n. Then do the following.

PseudoCode
Procedure GREEDY_KNAPSACK(P, W, M, X, n)
    // P(1:n) and W(1:n) contain the profits and weights of the n objects,
    // ordered so that P(i)/W(i) ≥ P(i+1)/W(i+1). M is the knapsack capacity
    // and X(1:n) is the solution vector.
    real P(1:n), W(1:n), X(1:n), M, cu; integer i, n;
    X ← 0                  // initialize solution vector to zero
    cu ← M                 // cu = remaining knapsack capacity
    for i ← 1 to n do
        if W(i) > cu then exit endif
        X(i) ← 1
        cu ← cu - W(i)
    repeat
    if i ≤ n then X(i) ← cu/W(i) endif
end GREEDY_KNAPSACK
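A runnable sketch of the same procedure in Python, assuming unsorted input (the density sort is done inside); the function name greedy_knapsack and the return convention are illustrative.

```python
def greedy_knapsack(profits, weights, capacity):
    """Fractional knapsack: fill the knapsack in order of decreasing
    profit/weight density, taking a fraction of the first object that
    does not fit. Returns the solution vector x and the total profit."""
    order = sorted(range(len(profits)),
                   key=lambda i: profits[i] / weights[i], reverse=True)
    x = [0.0] * len(profits)
    remaining = capacity
    for i in order:
        if weights[i] <= remaining:        # the whole object fits
            x[i] = 1.0
            remaining -= weights[i]
        else:                              # take only the fraction that fits
            x[i] = remaining / weights[i]
            remaining = 0
            break
    total_profit = sum(p * xi for p, xi in zip(profits, x))
    return x, total_profit

# Instance from the slides: n = 3, M = 20
print(greedy_knapsack([25, 24, 15], [18, 15, 10], 20))
# -> ([0.0, 1.0, 0.5], 31.5), matching solution 4 above
```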