Greedy Algorithms
15-211 Fundamental Data Structures and Algorithms
Ananda Guna, February 6, 2003
Based on lectures given by Peter Lee, Avrim Blum, Danny Sleator, William Scherlis, Ananda Guna & Klaus Sutner
Announcements
HW3 is available, due Monday, Feb. 10, 11:59pm. Start today!
Quiz #1 is available today: an online quiz that requires up to one hour of uninterrupted time with a web browser. It must be completed by Friday, 11:59pm.
Lecture Outline
Introduction to greedy algorithms
Examples of greedy algorithms: the minimum coin count problem, the fractional knapsack problem
When do greedy algorithms fail?
Proving that the fractional knapsack algorithm works
Shortest-path problems: unweighted paths, weighted paths (Dijkstra's algorithm)
Other greedy algorithms: Prim's
Greedy Algorithms
A greedy algorithm always takes the best immediate, or local, choice while constructing an answer.
A change-counting algorithm
An easy algorithm for giving out N cents in change:
Choose the largest bill or coin that is ≤ N.
Subtract the value of the chosen bill/coin from N, to get a new value of N.
Repeat until a total of N cents has been counted.
Does this work? That is, does this really give out the minimal number of coins and bills?
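The steps above can be sketched in Python. This is a minimal illustration, not code from the lecture; the function name and the default US denominations are my assumptions:

```python
def greedy_change(n, denominations=(25, 10, 5, 1)):
    """Return a list of coins summing to n cents, always choosing
    the largest denomination that is <= the amount remaining."""
    coins = []
    for d in sorted(denominations, reverse=True):
        while n >= d:          # keep taking coin d while it still fits
            coins.append(d)
            n -= d
    return coins

print(greedy_change(63))  # [25, 25, 10, 1, 1, 1]
```

For US currency this yields the minimal coin count, as the following slides discuss.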
Our simple algorithm
For US currency, this simple algorithm actually works.
Why do we call this a greedy algorithm?
Greedy algorithms
At every step, a greedy algorithm makes a locally optimal decision, hoping that it all adds up to a globally optimal solution.
Being optimistic like this usually leads to very simple algorithms that are easy to code.
But… What happens if we have a 21-cent coin?
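A 21-cent coin breaks the greedy rule, and 63 cents is the classic counterexample. A sketch comparing the greedy count against an exact dynamic-programming count (both functions are illustrative, not from the lecture):

```python
def greedy_change(n, denoms):
    """Greedy: largest coin first."""
    coins = []
    for d in sorted(denoms, reverse=True):
        while n >= d:
            coins.append(d)
            n -= d
    return coins

def min_coins(n, denoms):
    """Dynamic programming: exact minimum number of coins for n cents."""
    INF = float("inf")
    best = [0] + [INF] * n
    for amount in range(1, n + 1):
        for d in denoms:
            if d <= amount and best[amount - d] + 1 < best[amount]:
                best[amount] = best[amount - d] + 1
    return best[n]

denoms = (25, 21, 10, 5, 1)
print(len(greedy_change(63, denoms)))  # 6 coins: 25 + 25 + 10 + 1 + 1 + 1
print(min_coins(63, denoms))           # 3 coins: 21 + 21 + 21
```

With the 21-cent coin available, the greedy choice of a quarter is locally best but globally wrong.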
Hill-climbing
Greedy algorithms are often visualized as "hill-climbing."
Suppose you want to reach the summit, but can only see 10 yards ahead and behind (due to thick fog). Which way?
Hill-climbing, cont'd
Making the locally-best guess is efficient and easy, but doesn't always work.
Fractional Knapsack Problem
Definition (NIST): Given materials of different values per unit volume and maximum amounts, find the most valuable mix of materials that fits in a knapsack of fixed volume.
Since we may take pieces (fractions) of materials, a greedy algorithm finds the optimum:
Take as much as possible of the material that is most valuable per unit volume.
If there is still room, take as much as possible of the next most valuable material.
Continue until the knapsack is full.
Example 2: Fractional knapsack problem (FKP)
You rob a store and find n kinds of items: gold dust, wheat, beer.
The total inventory of the i-th kind of item: weight w_i pounds, value v_i dollars.
The knapsack can hold a maximum of W pounds.
Q: How much of each kind of item should you take? (You can take fractional weights.)
FKP: solution
Greedy solution: fill the knapsack with the "most valuable" item until all of it is taken.
Most valuable = v_i / w_i (dollars per pound).
Then move to the next "most valuable" item, and so on, until the knapsack is full.
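A minimal Python sketch of this greedy rule (the function name and item encoding are my own); each item is a (value, weight) pair:

```python
def fractional_knapsack(items, capacity):
    """items: list of (value, weight) pairs. Returns the maximum total
    value when fractions of items may be taken."""
    total = 0.0
    # consider items in order of value per pound, best first
    for value, weight in sorted(items, key=lambda it: it[0] / it[1],
                                reverse=True):
        if capacity <= 0:
            break
        take = min(weight, capacity)        # take as much as fits
        total += value * take / weight      # proportional value
        capacity -= take
    return total

# the three items from the binary-knapsack slide, fractions allowed:
print(fractional_knapsack([(60, 10), (100, 20), (120, 30)], 50))  # 240.0
```

Note that with fractions allowed, the same instance that defeats greedy in the binary version yields $240: all of items 1 and 2, plus two-thirds of item 3.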
Why does it work?
Four ingredients:
An optimization problem: a minimum or maximum problem.
Can only proceed in stages: no direct solution available.
Greedy-choice property: a locally optimal (greedy) choice will lead to a globally optimal solution.
Optimal substructure: an optimal solution to the problem contains within it optimal solutions to subproblems.
Example 3: Binary knapsack problem (BKP)
A variation on FKP: the "Supermarket Shopping Spree"!
Suppose, instead, that you can only take an item whole, or not at all (no fractions allowed): diamond rings, laptops, watches.
Q: How many of each item should you take? Will the greedy approach still work? Surprisingly, no.
BKP: Greedy approach fails
Knapsack maximum weight = 50 lbs.
Item 1: $60, 10 lbs ($6/lb)
Item 2: $100, 20 lbs ($5/lb)
Item 3: $120, 30 lbs ($4/lb)
Greedy (highest dollars per pound first) packs items 1 and 2 for $160; items 1 and 3 give $180; items 2 and 3 give $220, the optimum.
BKP has optimal substructure, but not the greedy-choice property: the optimal solution does not contain the greedy choice.
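The failure on this instance can be checked directly. A small sketch (variable names are mine) comparing the density-greedy packing with a brute-force optimum over all subsets:

```python
from itertools import combinations

items = [(60, 10), (100, 20), (120, 30)]   # (value, weight) from the slide
W = 50                                     # knapsack capacity in lbs

# greedy by value density: takes item 1 ($6/lb), then item 2 ($5/lb)
greedy_value = 0
cap = W
for v, w in sorted(items, key=lambda it: it[0] / it[1], reverse=True):
    if w <= cap:
        greedy_value += v
        cap -= w

# brute force over all subsets finds the true optimum
best_value = max(
    sum(v for v, w in combo)
    for r in range(len(items) + 1)
    for combo in combinations(items, r)
    if sum(w for v, w in combo) <= W
)

print(greedy_value, best_value)  # 160 220
```

Greedy stops at $160 because taking the densest item first leaves no room for the $220 combination of items 2 and 3.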
Succeeding with greed
Four ingredients needed:
An optimization problem.
Proceed in stages.
Greedy-choice property: a greedy choice will lead to a globally optimal solution.
Optimal substructure: an optimal solution contains within it optimal solutions to subproblems.
FKP proof, informally
Let's prove that the optimal solution always contains the greedy choice.
Assume that we have W_h of item h (the highest-value item, in red) available, and the optimal solution contains W_i of item h.
If W_i = W_h, then we are done: the optimal solution has used all of the greedy choice.
If W_i < W_h, then there exists another item L (in blue) in the optimal solution (say we have W_L of L). Replace as much of item L (in blue) as possible with item h (in red); since h is at least as valuable per pound, the total value does not decrease.
Greedy choice can always be made first
Suppose you have W_h of item h and the knapsack can hold W.
If W < W_h, then fill the knapsack with W of item h and we are done.
If W ≥ W_h, then put all of item h into the knapsack.
In either case we can always choose first the maximum amount of h.
Shortest Paths
Airline routes
[Figure: airline route map over BOS, PVD, JFK, BWI, MIA, ORD, DFW, LAX, and SFO, with a mileage labeling each edge.]
Single-source shortest path
Suppose we live in Baltimore (BWI) and want the shortest path to San Francisco (SFO).
One way to solve this is to solve the single-source shortest-path problem: find the shortest path from BWI to every city.
Single-source shortest path
While we may be interested only in BWI-to-SFO, no known algorithm for that single pair is asymptotically faster than solving the single-source problem for BWI-to-every-city.
Shortest paths
What do we mean by "shortest path"?
Minimize the number of layovers (i.e., fewest hops): the unweighted shortest-path problem.
Minimize the total mileage (i.e., fewest frequent-flyer miles ;-): the weighted shortest-path problem.
Many applications
Shortest paths model many useful real-world problems:
How do we find the shortest (minimum-transitions / minimum-cost) path in a route map?
How do we direct packets to a destination across a network using the shortest route?
Voice recognition: finding the best interpretation of a spoken input. We can construct a graph whose vertices correspond to possible words, with an edge between possible neighboring words.
Unweighted Single-Source Shortest Path Algorithm
Unweighted shortest path
In order to find the unweighted shortest path, we will augment vertices and edges so that:
vertices can be marked with an integer, giving the number of hops from the source node, and
edges can be marked as either explored or unexplored. Initially, all edges are unexplored.
Unweighted shortest path
Algorithm:
Set i to 0 and mark source node v with 0.
Put source node v into a queue L0.
While Li is not empty:
  Create a new empty queue Li+1.
  For each w in Li do:
    For each unexplored edge (w,x) do:
      mark (w,x) as explored
      if x is not marked, mark it with i+1 and enqueue x into Li+1
  Increment i.
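The level-by-level scheme above can be sketched with a single FIFO queue in place of the explicit L0, L1, … queues. The graph below is a tiny illustrative fragment, not the full airline map:

```python
from collections import deque

def hops_from(source, adj):
    """adj: dict mapping each vertex to a list of its neighbors.
    Returns the number of hops from source to every reachable vertex."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        w = queue.popleft()
        for x in adj[w]:
            if x not in dist:            # first time we reach x
                dist[x] = dist[w] + 1    # one more hop than w
                queue.append(x)
    return dist

adj = {"BWI": ["JFK", "ORD"], "JFK": ["BWI", "SFO"],
       "ORD": ["BWI", "SFO"], "SFO": ["JFK", "ORD"]}
print(hops_from("BWI", adj))  # {'BWI': 0, 'JFK': 1, 'ORD': 1, 'SFO': 2}
```

Checking membership in `dist` plays the role of "x is not marked," and the single queue preserves the same level order as the L0, L1, … queues.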
Breadth-first search
This algorithm is a form of breadth-first search.
Performance: O(|V| + |E|).
Q: Use this algorithm to find the shortest route (in terms of number of hops) from BWI to SFO.
Q: What kind of structure is formed by the edges marked as explored?
Airline routes
[Figure: the same airline route map, repeated for the BWI-to-SFO exercise.]
Use of a queue
It is very common to use a queue to keep track of: nodes to be visited next, or nodes that we have already visited.
Typically, use of a queue leads to a breadth-first visit order.
Breadth-first visit order is "cautious" in the sense that it examines every path of length i before going on to paths of length i+1.
Weighted Single-Source Shortest Path Algorithm (Dijkstra’s Algorithm)
Weighted shortest path
Now suppose we want to minimize the total mileage. Breadth-first search does not work!
The minimum number of hops does not mean minimum distance. Consider, for example, BWI-to-DFW:
Three 2-hop routes to DFW
[Figure: the route map, highlighting three different 2-hop BWI-to-DFW routes with their mileages.]
A greedy algorithm
Assume that every city is infinitely far away: every city is ∞ miles away from BWI (except BWI itself, which is 0 miles away).
Now perform something similar to breadth-first search, and optimistically guess that we have found the best path to each city as we encounter it.
If we later discover we are wrong and find a better path to a particular city, then update the distance to that city.
Intuition behind Dijkstra's alg.
For our airline-mileage problem, we can start by guessing that every city is ∞ miles away, and mark each city with this guess.
Find all cities one hop away from BWI, and check whether the mileage is less than what is currently marked for that city. If so, revise the guess.
Continue for 2 hops, 3 hops, etc.
Dijkstra's algorithm
Assume we want to find the shortest paths from vertex 1 (the source) to all other vertices.
Let S = {} and V = {1, 2, …, N}.
Let D[i] be the length of the shortest path from the source (1) to vertex i. Clearly D[1] = 0; set D[i] = infinity for i = 2, …, N.
Repeat N−1 times:
  choose w in V−S such that D[w] is minimum
  add w to S
  for each vertex v in V−S adjacent to w:
    D[v] = min(D[v], D[w] + C[w,v])    (C[w,v] is the cost of edge (w,v))
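A heap-based Python sketch of this pseudocode. The small graph below is an assumption for illustration (its weights loosely echo Example 1; the slide's actual figure did not carry over), and the names are mine:

```python
import heapq

def dijkstra(source, adj):
    """adj: dict mapping vertex -> list of (neighbor, edge_cost).
    Returns D, the shortest distance from source to every vertex."""
    D = {v: float("inf") for v in adj}
    D[source] = 0
    heap = [(0, source)]
    done = set()                       # the set S in the pseudocode
    while heap:
        d, w = heapq.heappop(heap)     # w in V-S with minimum D[w]
        if w in done:
            continue                   # stale heap entry, skip it
        done.add(w)
        for v, c in adj[w]:
            if d + c < D[v]:           # relax edge (w, v)
                D[v] = d + c
                heapq.heappush(heap, (D[v], v))
    return D

adj = {1: [(2, 100), (3, 10)], 2: [(4, 10)],
       3: [(2, 50), (4, 60)], 4: []}
print(dijkstra(1, adj))  # {1: 0, 2: 60, 3: 10, 4: 70}
```

Instead of scanning V−S for the minimum each round, the heap hands it to us in O(log |V|); duplicate heap entries stand in for decrease-key, which is why popped vertices already in `done` are skipped.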
Example 1: Find the single-source shortest path
[Figure: a small directed graph on vertices 1–5 with edge weights 100, 10, 50, 10, 30, 60, 20.]
S = { }, V = {1, 2, 3, 4, 5}
Example 2: Vanguard Airline Map
[Figure: route map with edge mileages 610, 440, 510, 540, 820, 410, 460, 540, 480, 650, 290.]
Find the shortest paths from PGH to all other cities.
Example 3: Find the shortest mileage from BWI
[Figure: the airline route map again, with BWI marked 0 as the source.]
Questions
What is the running time of Dijkstra's algorithm, in terms of |V| and |E|?
Every time around the loop we remove one node: O(log |V|) per removeMin, over |V| iterations.
Across all iterations, O(|E|) edges are explored in total, each relaxation costing O(log |V|) with a heap.
So: O((|V| + |E|) log |V|) time.
Next Week
HW3 is due Monday; start early.
We will discuss compression algorithms: Huffman, LZW, etc.
Read Chapter 12.