
1 Lecture on Design and Analysis of Computer Algorithm

2 Greedy Method

3 Greedy Method: Definition
An algorithm which always takes the best immediate, or local, solution while finding an answer. Greedy algorithms will always find the overall, or globally, optimal solution for some optimization problems, but may find less-than-optimal solutions for some instances of other problems.

4 Example of Greedy Method (1/4)
Prim's algorithm and Kruskal's algorithm are greedy algorithms that find a globally optimal solution, a minimum spanning tree, and Dijkstra's algorithm for finding shortest paths is another example of a greedy algorithm that finds an optimal solution. In contrast, greedy heuristics for the traveling salesman problem, such as always visiting the nearest unvisited city, are not guaranteed to find the shortest tour.
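As a concrete illustration, here is a minimal sketch of Prim's algorithm in Python (the adjacency-list format, the function name prim_mst_weight, and the sample graph are assumptions for this example, not part of the slides). At every step it greedily adds the cheapest edge leaving the partially built tree, and for minimum spanning trees this local choice is also globally optimal.

import heapq

def prim_mst_weight(graph, start):
    """Total weight of a minimum spanning tree of a connected, undirected
    graph given as adjacency lists: {u: [(v, weight), ...], ...}."""
    visited = set()
    total = 0
    heap = [(0, start)]                 # (edge weight, vertex); the start vertex enters for free
    while heap and len(visited) < len(graph):
        w, u = heapq.heappop(heap)
        if u in visited:
            continue                    # stale entry; u was already reached more cheaply
        visited.add(u)
        total += w                      # greedy choice: cheapest edge crossing the cut
        for v, wt in graph[u]:
            if v not in visited:
                heapq.heappush(heap, (wt, v))
    return total

graph = {'a': [('b', 1), ('c', 4)],
         'b': [('a', 1), ('c', 2)],
         'c': [('a', 4), ('b', 2)]}
print(prim_mst_weight(graph, 'a'))      # 3: the MST uses edges a-b (1) and b-c (2)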

5 Example of Greedy Method (2/4)
If there is no greedy algorithm which always finds the optimal solution for a problem, one may have to search (exponentially) many possible solutions to find the optimum. Greedy algorithms are usually quicker, since they don't consider possible alternatives.

6 Example of Greedy Method (3/4)
Consider the problem of making change:
Coins of values 25c, 10c, 5c, and 1c
Return 63c in change. Which coins?
Use the greedy strategy:
Select the largest coin whose value is no greater than 63c
Subtract its value (25c) from 63c, leaving 38c
Find the largest coin that fits the remainder … until done

7 Example of Greedy Method (4/4)
At each stage, select the option that is "locally optimal" in some particular sense.
The greedy strategy for making change works because of a special property of these coin denominations.
What if the coins were 1c, 5c, and 11c and we need to make change of 15c?
The greedy strategy would select the 11c coin followed by four 1c coins.
Better: three 5c coins.

8 Problem: Make change for a given amount using the smallest possible number of coins.
MAKE-CHANGE(n)
    C ← {100, 25, 10, 5, 1}        // the available coin denominations (constant)
    S ← {}                         // multiset that will hold the solution
    sum ← 0                        // sum of the items in the solution set
    WHILE sum ≠ n DO
        x ← largest item in C such that sum + x ≤ n
        IF no such item THEN
            RETURN "No solution"
        S ← S ∪ {a coin of value x}
        sum ← sum + x
    RETURN S
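A runnable Python version of this pseudocode, as a sketch (the function name make_change and the default denomination tuple are assumptions for this example): it returns the greedy list of coins, or None when no combination of the remaining coins reaches the target.

def make_change(n, denominations=(100, 25, 10, 5, 1)):
    """Greedy change-making: repeatedly take the largest coin that still fits."""
    coins = []
    total = 0
    while total != n:
        # largest denomination that does not overshoot the remaining amount
        candidates = [c for c in denominations if total + c <= n]
        if not candidates:
            return None               # greedy gets stuck: no coin fits
        x = max(candidates)
        coins.append(x)
        total += x
    return coins

print(make_change(63))                              # [25, 25, 10, 1, 1, 1]
print(make_change(15, denominations=(11, 5, 1)))    # [11, 1, 1, 1, 1]: 5 coins, though 5c+5c+5c uses only 3

The second call reproduces the counterexample from slide 7: with denominations 1c, 5c, and 11c, the greedy choice is not optimal.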

9 Greedy Algorithm Start with a solution to a small subproblem
Build up to a solution to the whole problem by making choices that look good in the short term.
Disadvantage: greedy algorithms don't always work (short-term choices can be disastrous in the long term), and they are hard to prove correct.
Advantage: greedy algorithms are fast when they do work, and the algorithms are simple and easy to implement.

10 Greedy Algorithm Procedure GREEDY(A,n)
// A(1:n) contains the n inputs
solution ← ∅                       // initialize the solution to empty
for i ← 1 to n do
    x ← SELECT(A)
    if FEASIBLE(solution, x)
        then solution ← UNION(solution, x)
    endif
repeat
return (solution)
end GREEDY
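The same control structure as a Python sketch; select, feasible, and union are placeholders supplied by the caller (they are not defined on the slide), mirroring SELECT, FEASIBLE, and UNION above.

def greedy(candidates, select, feasible, union, empty=()):
    """Generic greedy skeleton: repeatedly pick the most promising remaining
    candidate and keep it only if it still leads to a feasible solution."""
    solution = empty
    remaining = list(candidates)
    while remaining:
        x = select(remaining)          # the greedy choice
        remaining.remove(x)
        if feasible(solution, x):      # can x be added without breaking feasibility?
            solution = union(solution, x)
    return solution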

11 Activity-Selection Problem
The problem is to select a maximum-size set of mutually compatible activities. Example: we have a set S = {1, 2, …, n} of n proposed activities that wish to use a resource, such as a lecture hall, which can be used by only one activity at a time.

12 Example
[Timeline figure: the activities below drawn as intervals on a time axis from 0 to 15]
i    si   fi
1    0    6
2    3    5
3    1    4
5    3    8

13 Brute Force Try every possible solution
Choose the largest subset that is feasible. Inefficient: there are Θ(2^n) choices.
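For contrast, a brute-force sketch in Python (the start/finish arrays below are illustrative, not the slide's data): it examines subsets from largest to smallest and returns the first compatible one, so it inspects up to 2^n subsets in the worst case.

from itertools import combinations

def compatible(subset, s, f):
    """True if no two chosen activities overlap in time."""
    chosen = sorted(subset, key=lambda i: f[i])
    return all(f[a] <= s[b] for a, b in zip(chosen, chosen[1:]))

def brute_force_selector(s, f):
    """Try subsets of activities from largest to smallest; exponential in n."""
    n = len(s)
    for k in range(n, 0, -1):
        for subset in combinations(range(n), k):
            if compatible(subset, s, f):
                return list(subset)     # a maximum-size compatible subset
    return []

s = [1, 3, 0, 5]
f = [4, 5, 6, 7]
print(brute_force_selector(s, f))       # [0, 3]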

14 Greedy Approach: Sort by finish time
[Timeline figure: the activities on a 0–15 time axis, ordered by finish time]

15 Activity-Selection Problem Pseudo code
Greedy_Activity_Selector(s, f)
    n ← length[s]
    A ← {1}
    j ← 1
    for i ← 2 to n
        do if si ≥ fj
            then A ← A ∪ {i}
                 j ← i
    return A
It schedules a set S of n activities in Θ(n) time, assuming that the activities are already sorted by finish time.
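A Python sketch of the same selector, assuming the activities are already sorted by finish time (the sample data is illustrative, not taken from the slides):

def greedy_activity_selector(s, f):
    """Select a maximum-size set of mutually compatible activities.
    s[i] and f[i] are start and finish times, with f already sorted in
    nondecreasing order."""
    A = [0]                  # the activity that finishes first is always chosen
    j = 0                    # index of the most recently chosen activity
    for i in range(1, len(s)):
        if s[i] >= f[j]:     # activity i starts after the last chosen one finishes
            A.append(i)
            j = i
    return A

s = [1, 3, 0, 5, 3, 5]
f = [4, 5, 6, 7, 8, 9]
print(greedy_activity_selector(s, f))   # [0, 3]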

16 Proving the greedy algorithm correct
We assume that the input activities are sorted by increasing finish time: f1 ≤ f2 ≤ … ≤ fn. Activity 1 has the earliest finish time, so it must be in some optimal solution: if an optimal solution begins with some other activity k, activity 1 finishes no later than k, so swapping k for activity 1 still gives a compatible, optimal schedule.
[Timeline figure: one possible solution on a 0–15 axis, comparing activity 1 with an alternative first activity k]

17 Proving (cont.) Eliminate the activities whose start time is earlier than the finish time of activity 1, since none of them can be scheduled together with activity 1.
[Timeline figure: 0–15 axis showing activities 1 and k, with the conflicting activities removed]

18 Proving (cont.) Repeating the same argument on the remaining activities shows that the greedy algorithm produces an optimal solution.
[Timeline figure: the greedy selection on a 0–15 axis]

19 Elements of the Greedy Strategy
Question: how can one tell whether a greedy algorithm will solve a particular optimization problem? There is no general way to tell, but most problems that lend themselves to a greedy strategy exhibit two ingredients:
The greedy-choice property
Optimal substructure

20 The Greedy Choice Property
A globally optimal solution can be arrived at by making a locally optimal (greedy) choice: make whatever choice seems best at the moment. The choice may depend on the choices made so far, but it must not depend on any future choices or on the solutions to subproblems.

21 Optimal Substructure An optimal solution to the problem contains within it optimal solutions to subproblems

22 Knapsack Problem We are given n objects and a knapsack. Object i has a weight wi and the knapsack has a capacity M. If a fraction xi, 0 ≤ xi ≤ 1, of object i is placed into the knapsack, then a profit of pi·xi is earned. The objective is to fill the knapsack so as to maximize the total profit while keeping the total weight of the chosen fractions at most M:
maximize   Σ pi·xi
subject to Σ wi·xi ≤ M   and   0 ≤ xi ≤ 1,  1 ≤ i ≤ n

23 Example
[Figure: three items and a knapsack of capacity 50 lbs]
Item 1: 10 lbs, $60
Item 2: 20 lbs, $100
Item 3: 30 lbs, $120

24 Knapsack 0/1
Item 2 (20 lbs) + Item 3 (30 lbs): $100 + $120 = $220 total
Item 1 (10 lbs) + Item 2 (20 lbs): $60 + $100 = $160
Item 1 (10 lbs) + Item 3 (30 lbs): $60 + $120 = $180
Greedy selection by value per pound takes item 1 first and ends at $160, so the greedy strategy misses the optimal $220 for the 0/1 knapsack.

25 Fractional Knapsack
Taking the items in order of greatest value per pound yields an optimal solution:
Item 1 (10 lbs): $60
Item 2 (20 lbs): $100
20 lbs (2/3) of Item 3: $80
Total = $240

26 Optimal Substructure Both fractional knapsack and 0/1 knapsack have an optimal substructure.

27 Example Fractional Knapsack (cont.)
There are 5 objects with the prices and weights listed below; the knapsack can hold at most 100 lbs.
Method 1: choose the objects with the least weight first.
Total weight = 100, Total price = 156

28 Example Fractional Knapsack (cont.)
Method 2: choose the most expensive objects first (only half of the last object taken fits).
Total weight = 100, Total price = 146

29 Example Fractional Knapsack (cont.)
Method 3: choose the objects with the highest price/weight ratio first (only 80% of the last object taken fits).
Total weight = 100, Total price = 164

30 More Examples on the Fractional Knapsack
Consider the following instance of the knapsack problem: n = 3, M = 20, (p1, p2, p3) = (25, 24, 15) and (w1, w2, w3) = (18, 15, 10). Candidate solutions (x1, x2, x3):
1) (1/2, 1/3, 1/4)
2) (1, 2/15, 0)
3) (0, 2/3, 1)
4) (0, 1, 1/2)
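A quick check of these candidates in Python; the weights and profits come from the instance above, and the script simply evaluates each vector.

from fractions import Fraction as F

p = (25, 24, 15)
w = (18, 15, 10)
candidates = [
    (F(1, 2), F(1, 3), F(1, 4)),
    (F(1), F(2, 15), F(0)),
    (F(0), F(2, 3), F(1)),
    (F(0), F(1), F(1, 2)),
]
for x in candidates:
    weight = sum(wi * xi for wi, xi in zip(w, x))
    profit = sum(pi * xi for pi, xi in zip(p, x))
    print(x, "weight =", float(weight), "profit =", float(profit))
# Candidate 4, (0, 1, 1/2), fills the capacity M = 20 and earns the largest
# profit, 31.5; it is exactly the solution the greedy-by-density rule produces.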

31 The Greedy Solution Define the density of object i to be pi/wi, its profit per unit of weight. Use as much as possible of the highest-density objects; that is, process the objects in nonincreasing order of density. If the whole object fits, use all of it. If not, fill the remaining space with a fraction of the current object, and discard the rest. First, sort the objects by nonincreasing density, so that pi/wi ≥ p(i+1)/w(i+1) for 1 ≤ i < n. Then do the following.

32 PseudoCode Procedure GREEDY_KNAPSACK(P,W,M,X,n)
// P(1:n) and W(1:n) contain the profits and weights of the n objects,
// ordered so that P(i)/W(i) ≥ P(i+1)/W(i+1). M is the knapsack size
// and X(1:n) is the solution vector.
real P(1:n), W(1:n), X(1:n), M, cu; integer i, n
X ← 0                          // initialize the solution vector to zero
cu ← M                         // cu = remaining knapsack capacity
for i ← 1 to n do
    if W(i) > cu then exit endif
    X(i) ← 1
    cu ← cu - W(i)
repeat
if i ≤ n then X(i) ← cu / W(i) endif
end GREEDY_KNAPSACK
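The procedure as a runnable Python sketch (the name greedy_knapsack is an assumption; like the pseudocode, it requires the profit and weight lists to be pre-sorted by nonincreasing profit density), applied to the instance from slide 30:

def greedy_knapsack(p, w, M):
    """Fractional knapsack. p and w must already satisfy
    p[i]/w[i] >= p[i+1]/w[i+1]. Returns the solution vector x."""
    n = len(p)
    x = [0.0] * n
    cu = M                       # remaining knapsack capacity
    for i in range(n):
        if w[i] > cu:
            x[i] = cu / w[i]     # first object that no longer fits: take a fraction
            return x
        x[i] = 1.0               # the whole object fits
        cu -= w[i]
    return x

# Instance from slide 30, re-ordered by density: 24/15 > 15/10 > 25/18.
p = [24, 15, 25]
w = [15, 10, 18]
print(greedy_knapsack(p, w, 20))    # [1.0, 0.5, 0.0], i.e. (x1, x2, x3) = (0, 1, 1/2), profit 31.5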

