Greedy Algorithms. 15-211 Fundamental Data Structures and Algorithms. Ananda Guna, February 6, 2003. Based on lectures given by Peter Lee, Avrim Blum, and Danny Sleator.


Greedy Algorithms Fundamental Data Structures and Algorithms Ananda Guna February 6, 2003 Based on lectures given by Peter Lee, Avrim Blum, Danny Sleator, William Scherlis, Ananda Guna & Klaus Sutner

Announcements
HW3 is available: due Monday, Feb. 10, 11:59pm. Start today!
Quiz #1 is available today: an online quiz; it requires up to one hour of uninterrupted time with a web browser, and must be completed by Friday, 11:59pm.

Lecture Outline
Introduction to greedy algorithms
Examples of greedy algorithms: the minimum coin-count problem; the fractional knapsack problem
When do greedy algorithms fail?
Proving that the fractional knapsack algorithm works
Shortest-path problems: unweighted paths; weighted paths (Dijkstra's algorithm)
Other greedy algorithms: Prim's

Greedy Algorithms An algorithm that always takes the best immediate, or local, solution while finding an answer

A change-counting algorithm
An easy algorithm for giving out N cents in change:
1. Choose the largest bill or coin that is at most N.
2. Subtract the value of the chosen bill/coin from N, to get a new value of N.
3. Repeat until a total of N cents has been counted.
Does this work? I.e., does this really give out the minimal number of coins and bills?
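The steps above can be written as a minimal Python sketch, assuming standard US coin denominations in cents (bills omitted for brevity):

```python
def greedy_change(n, denominations=(25, 10, 5, 1)):
    """Greedy change-counting: repeatedly take the largest coin <= n.

    Assumes denominations are sorted in descending order.
    Returns the list of coins handed out.
    """
    coins = []
    for d in denominations:
        # Take as many of this coin as still fit into the remaining amount.
        while n >= d:
            n -= d
            coins.append(d)
    return coins
```

For example, greedy_change(63) returns [25, 25, 10, 1, 1, 1]: six coins, which is optimal for US currency.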

Our simple algorithm For US currency, this simple algorithm actually works. Why do we call this a greedy algorithm?

Greedy algorithms
At every step, a greedy algorithm makes a locally optimal decision, hoping that it all adds up to a globally optimal solution. Being optimistic like this usually leads to very simple algorithms that are easy to code.

But… What happens if we have a 21-cent coin?
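A small Python sketch (with the hypothetical 21-cent coin passed in as part of the denomination set) shows the trouble: for 63 cents, greedy spends six coins where three 21-cent coins would do.

```python
def greedy_change(n, denominations):
    """Greedy change-counting: repeatedly take the largest coin that fits."""
    coins = []
    for d in sorted(denominations, reverse=True):
        while n >= d:
            n -= d
            coins.append(d)
    return coins

# Greedy grabs the 25s first and never uses the 21-cent coin:
greedy = greedy_change(63, [1, 5, 10, 21, 25])
# greedy == [25, 25, 10, 1, 1, 1] -- six coins, but 21 + 21 + 21 uses only three.
```

So the greedy-choice property fails for this coin system, even though it holds for standard US currency.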

Hill-climbing Greedy algorithms are often visualized as “hill-climbing”.  Suppose you want to reach the summit, but can only see 10 yards ahead and behind (due to thick fog). Which way?

Hill-climbing, cont’d Making the locally-best guess is efficient and easy, but doesn’t always work.

Fractional Knapsack Problem
Definition (NIST): Given materials of different values per unit volume and maximum amounts, find the most valuable mix of materials that fits in a knapsack of fixed volume. Since we may take pieces (fractions) of materials, a greedy algorithm finds the optimum: take as much as possible of the material that is most valuable per unit volume. If there is still room, take as much as possible of the next most valuable material. Continue until the knapsack is full.

Example 2: Fractional knapsack problem (FKP)
You rob a store and find n kinds of items: gold dust, wheat, beer. The total inventory of the i-th kind of item has weight w_i pounds and value v_i dollars. The knapsack can hold a maximum of W pounds. Q: how much of each kind of item should you take? (You can take fractional weights.)

FKP: solution
Greedy solution 2: fill the knapsack with the "most valuable" item until all of it is taken, where most valuable means the highest v_i/w_i (dollars per pound). Then take the next most valuable item, and so on, until the knapsack is full.
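This greedy strategy can be sketched in a few lines of Python, taking the inventory as a list of (value, weight) pairs:

```python
def fractional_knapsack(items, capacity):
    """Greedy fractional knapsack.

    items: list of (value, weight) pairs; fractions of an item may be taken.
    Returns the maximum total value that fits in the given capacity.
    """
    total = 0.0
    # Consider items in order of value density (dollars per pound), best first.
    for value, weight in sorted(items, key=lambda it: it[0] / it[1], reverse=True):
        if capacity <= 0:
            break
        take = min(weight, capacity)       # take as much of this item as fits
        total += value * (take / weight)   # pro-rated value of the amount taken
        capacity -= take
    return total
```

For instance, with items worth ($60, 10 lbs), ($100, 20 lbs), ($120, 30 lbs) and a 50-lb knapsack, greedy takes all of the first two and 20/30 of the third, for a total of $240.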

Why does it work?
Four ingredients:
1. Optimization problem: a minimum or maximum problem.
2. Can only proceed in stages: no direct solution available.
3. Greedy-choice property: a locally optimal (greedy) choice will lead to a globally optimal solution.
4. Optimal substructure: an optimal solution to the problem contains within it optimal solutions to subproblems.

Example 3: Binary knapsack problem (BKP) Variation on FKP.  The “Supermarket Shopping Spree”! Suppose, instead, that you can only take an item wholly, or not at all (no fractions allowed).  Diamond rings. Laptops. Watches. Q: How many of each item to take? Will the greedy approach still work? Surprisingly, no.

BKP: Greedy approach fails
Maximum weight: 50 lbs.
Item 1: $60, 10 lbs ($6/pound)
Item 2: $100, 20 lbs ($5/pound)
Item 3: $120, 30 lbs ($4/pound)
Greedy (by dollars per pound) takes items 1 and 2, for $160; items 1 and 3 give $180; but items 2 and 3 give $220, the optimum. BKP has optimal substructure, but not the greedy-choice property: the optimal solution does not contain the greedy choice.
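Using the slide's numbers, a short Python sketch contrasts the greedy-by-density choice with a brute-force optimum over all subsets:

```python
from itertools import combinations

# The slide's items, as (value in dollars, weight in lbs); 50-lb knapsack.
items = [(60, 10), (100, 20), (120, 30)]
capacity = 50

# Greedy: take whole items in order of value density while they still fit.
value, remaining = 0, capacity
for v, w in sorted(items, key=lambda it: it[0] / it[1], reverse=True):
    if w <= remaining:
        value += v
        remaining -= w

# Brute force over all subsets finds the true optimum.
best = max(sum(v for v, w in subset)
           for r in range(len(items) + 1)
           for subset in combinations(items, r)
           if sum(w for v, w in subset) <= capacity)

# value == 160 (greedy: items 1 and 2), best == 220 (items 2 and 3).
```

Brute force is exponential, of course; the point is only that greedy's $160 is provably not optimal here.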

Succeeding with greed
Four ingredients needed:
1. Optimization problem.
2. Proceed in stages.
3. Greedy-choice property: a greedy choice will lead to a globally optimal solution.
4. Optimal substructure: an optimal solution contains within it optimal solutions to subproblems.

FKP proof, informally
Let's prove that the optimal solution always contains the greedy choice.
Assume that we have W_h of item h (the highest-value item per pound) available, and that the optimal solution contains W_i of item h.
If W_i = W_h, then we are done: the optimal solution uses all of the greedy choice.
If W_i < W_h, then there exists another item L in the optimal solution (say we have W_L of L). Replace as much of item L as possible with item h; since h is at least as valuable per pound as L, this swap cannot decrease the total value, so the result is still optimal and contains more of the greedy choice.

Greedy choice can always be made first
Suppose you have W_h of item h and the knapsack can hold W.
If W ≤ W_h, then fill the knapsack with W of item h and we are done.
If W > W_h, then put ALL of item h into the knapsack.
In either case we can always choose FIRST the maximum amount of h.

Shortest Paths

Airline routes
[Figure: route map over the airports BOS, PVD, JFK, BWI, ORD, MIA, DFW, LAX, SFO]

Single-source shortest path Suppose we live in Baltimore (BWI) and want the shortest path to San Francisco (SFO). One way to solve this is to solve the single-source shortest path problem:  Find the shortest path from BWI to every city.

Single-source shortest path While we may be interested only in BWI-to-SFO, there are no known algorithms that are asymptotically faster than solving the single-source problem for BWI-to-every-city.

Shortest paths What do we mean by “shortest path”? Minimize the number of layovers (i.e., fewest hops).  Unweighted shortest-path problem. Minimize the total mileage (i.e., fewest frequent-flyer miles ;-).  Weighted shortest-path problem.

Many applications
Shortest paths model many useful real-world problems:
How do we find the shortest (minimum-transition / minimum-cost) path in a route map?
How do we direct packets to a destination across a network using the shortest route?
Speech recognition: finding the best interpretation of an utterance. We can construct a graph whose vertices correspond to possible words, with an edge between possible neighboring words.

Unweighted Single-Source Shortest Path Algorithm

Unweighted shortest path
In order to find the unweighted shortest path, we will augment vertices and edges so that: vertices can be marked with an integer, giving the number of hops from the source node; and edges can be marked as either explored or unexplored. Initially, all edges are unexplored.

Unweighted shortest path
Algorithm:
1. Set i to 0 and mark source node v with 0.
2. Put source node v into a queue L_0.
3. While L_i is not empty:
   a. Create a new empty queue L_{i+1}.
   b. For each w in L_i, for each unexplored edge (w,x): mark (w,x) as explored; if x is not marked, mark it with i+1 and enqueue x into L_{i+1}.
   c. Increment i.
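The level queues L_0, L_1, … above can be collapsed into a single FIFO queue if each vertex records its hop count when first marked; a minimal Python sketch of that equivalent formulation (with the graph given as an adjacency-list dict, and a hypothetical route map, not the slide's actual one, used as the example):

```python
from collections import deque

def bfs_hops(graph, source):
    """Unweighted single-source shortest paths via breadth-first search.

    graph: dict mapping each vertex to a list of its neighbors.
    Returns a dict of hop counts from source to every reachable vertex.
    """
    dist = {source: 0}
    queue = deque([source])
    while queue:
        w = queue.popleft()
        for x in graph[w]:
            if x not in dist:          # first time we reach x: shortest hop count
                dist[x] = dist[w] + 1
                queue.append(x)
    return dist
```

Each vertex is enqueued at most once and each edge examined at most twice, giving the O(|V|+|E|) bound discussed next.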

Breadth-first search
This algorithm is a form of breadth-first search. Performance: O(|V|+|E|).
Q: Use this algorithm to find the shortest route (in terms of number of hops) from BWI to SFO.
Q: What kind of structure is formed by the edges marked as explored?

Airline routes
[Figure: the same route map, for tracing the breadth-first search from BWI]

Use of a queue
It is very common to use a queue to keep track of nodes to be visited next, or nodes that we have already visited. Typically, use of a queue leads to a breadth-first visit order. Breadth-first visit order is "cautious" in the sense that it examines every path of length i before going on to paths of length i+1.

Weighted Single-Source Shortest Path Algorithm (Dijkstra’s Algorithm)

Weighted shortest path Now suppose we want to minimize the total mileage. Breadth-first search does not work!  Minimum number of hops does not mean minimum distance.  Consider, for example, BWI-to-DFW:

Three 2-hop routes to DFW
[Figure: route map highlighting three two-hop BWI-to-DFW routes]

A greedy algorithm
Assume that every city is infinitely far away; i.e., every city is ∞ miles away from BWI (except BWI itself, which is 0 miles away). Now perform something similar to breadth-first search, and optimistically guess that we have found the best path to each city as we encounter it. If we later discover we are wrong and find a better path to a particular city, then update the distance to that city.

Intuition behind Dijkstra's alg.
For our airline-mileage problem, we can start by guessing that every city is ∞ miles away, and mark each city with this guess. Find all cities one hop away from BWI, and check whether the mileage is less than what is currently marked for that city. If so, then revise the guess. Continue for 2 hops, 3 hops, etc.

Dijkstra's algorithm
Assume we want to find the shortest path from vertex 1 (the source) to all other vertices.
Let S = {} and V = {1, 2, …, N}.
Let D[I] be the length of the shortest path from the source (1) to vertex I. Clearly D[1] = 0; set D[I] = ∞ for I = 2, …, N.
For I from 1 to N-1 do:
  choose w in V-S such that D[w] is minimum;
  add w to S;
  for each vertex v in V-S adjacent to w:
    D[v] = min(D[v], D[w] + C[w,v]);
where C[w,v] is the cost of edge (w,v).
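The pseudocode above can be sketched in Python; this version replaces the linear scan for the minimum D[w] with a binary heap (heapq), skipping stale heap entries when they are popped. The example graph in the usage note is hypothetical, not one of the lecture's figures.

```python
import heapq

def dijkstra(graph, source):
    """Single-source shortest paths on a graph with nonnegative edge costs.

    graph: dict mapping each vertex to a list of (neighbor, edge_cost) pairs.
    Returns a dict of shortest distances from source to each reachable vertex.
    """
    dist = {source: 0}
    done = set()                        # the set S of finalized vertices
    heap = [(0, source)]
    while heap:
        d, w = heapq.heappop(heap)
        if w in done:
            continue                    # stale entry: w was already finalized
        done.add(w)
        for v, cost in graph[w]:
            if v not in dist or d + cost < dist[v]:
                dist[v] = d + cost      # found a better path to v; relax edge
                heapq.heappush(heap, (dist[v], v))
    return dist
```

For example, on the graph {1: [(2, 4), (3, 1)], 2: [(4, 1)], 3: [(2, 2), (4, 5)], 4: []}, dijkstra(graph, 1) returns {1: 0, 2: 3, 3: 1, 4: 4}: the route 1→3→2→4 beats the direct edges.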

Example 1: Find the Single Source Shortest Path S={ }, V={1,2,3,4,5}

Example 2: Vanguard Airline Map Find the shortest paths from PGH to all other cities

Example 3: Find the shortest mileage from BWI
Initial distances: PVD ∞, BOS ∞, JFK ∞, ORD ∞, LAX ∞, SFO ∞, DFW ∞, BWI 0, MIA ∞.

Questions
What is the running time of Dijkstra's algorithm, in terms of |V| and |E|? Every time around the loop we remove one node: each removeMin costs O(log |V|), there are |V| iterations, and across all iterations we explore O(|E|) edges, each of which may trigger a heap update. So, O((|V|+|E|) log |V|) time.

Next Week
HW3 is due Monday; start early. We will discuss compression algorithms: Huffman, LZW, etc. Read Chapter 12.