© The McGraw-Hill Companies, Inc., 2005. Chapter 3: The Greedy Method


The greedy method. Suppose that a problem can be solved by a sequence of decisions. The greedy method requires that each decision be locally optimal. These locally optimal choices finally add up to a globally optimal solution. Only a few optimization problems can be solved by the greedy method.

A Simple Example. Problem: pick k numbers out of n numbers such that the sum of these k numbers is the largest. Algorithm: for i = 1 to k, pick out the largest remaining number and delete it from the input.
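This greedy selection can be sketched in a few lines of Python (the helper name `largest_k_sum` is illustrative, not from the slides):

```python
import heapq

def largest_k_sum(numbers, k):
    """Greedy: repeatedly take the largest remaining number.
    Equivalent to summing the k largest elements."""
    return sum(heapq.nlargest(k, numbers))

# Example: picking 3 numbers out of [4, 8, 1, 7, 3] takes 8, 7 and 4.
```

Here the greedy choice is provably safe: swapping any chosen number for a smaller unchosen one can only decrease the sum.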

Shortest Paths on a Special Graph. Problem: find a shortest path from v0 to v3. The greedy method can solve this problem; the shortest path has total length 7.

Shortest Paths on a Multi-stage Graph. Problem: find a shortest path from v0 to v3 in a multi-stage graph. Greedy method: v0 → v1,2 → v2,1 → v3, with length 23. Optimal: v0 → v1,1 → v2,2 → v3, with length 7. Here the greedy method does not work.

This problem can be solved by the dynamic programming method, which will be introduced later.

Minimum Spanning Trees (MST). The problem may be defined on points in Euclidean space or on a graph. G = (V, E): a weighted, connected, undirected graph. Spanning tree: a tree T = (V, S) with S ⊆ E. Minimum spanning tree (MST): a spanning tree with the smallest total weight.

An Example of MST. A graph and one of its minimum spanning trees.

Kruskal's Algorithm for Finding an MST.
Input: a weighted, connected, undirected graph G = (V, E).
Output: a minimum spanning tree for G.

T := ∅
While T contains fewer than n − 1 edges do
Begin
  Choose an edge (v, w) from E of the smallest weight
  Delete (v, w) from E
  If adding (v, w) to T does not create a cycle in T
  then Add (v, w) to T
  Else Discard (v, w)
End
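A minimal Python sketch of this algorithm, using a union-find structure for the cycle test (the slides discuss this SET/UNION mechanism on a later slide; the function names here are illustrative):

```python
def kruskal(n, edges):
    """Kruskal's algorithm. n: number of vertices (0..n-1),
    edges: list of (weight, u, v). Returns the list of MST edges."""
    parent = list(range(n))

    def find(x):                      # set representative, with path compression
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    mst = []
    for w, u, v in sorted(edges):     # edges in nondecreasing weight order
        ru, rv = find(u), find(v)
        if ru != rv:                  # different sets: no cycle is formed
            parent[ru] = rv           # UNION of the two sets
            mst.append((u, v, w))
            if len(mst) == n - 1:     # a spanning tree has n - 1 edges
                break
    return mst
```

Sorting the edges once up front replaces the slide's "choose the smallest, then delete" loop, but the edges are still examined in exactly the same order.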

An example of Kruskal's algorithm.

The Details for Constructing an MST. How do we check whether a cycle is formed when a new edge is added? By the SET and UNION (union-find) algorithm introduced later. Each tree in the spanning forest represents a SET. If (u, v) ∈ E and u, v are in the same set, then adding (u, v) would form a cycle. If (u, v) ∈ E and u ∈ S1, v ∈ S2, then perform the UNION of S1 and S2.

A spanning forest. If (3, 4) is added: since 3 and 4 belong to the same set, a cycle would be formed. If (4, 5) is added: since 4 and 5 belong to different sets, the two sets are merged into the set {1, 2, 3, 4, 5, 6, 7, 8, 9}. Thus we are always performing the union of two sets.

Time Complexity of Kruskal's Algorithm. Time complexity: O(|E| log |E|) = O(n² log n), where n = |V|.

Prim's Algorithm for Finding an MST.
Step 1: Pick any x ∈ V. Let A = {x}, B = V − {x}.
Step 2: Select (u, v) ∈ E with u ∈ A, v ∈ B such that (u, v) has the smallest weight among edges between A and B.
Step 3: Put (u, v) into the tree. A = A ∪ {v}, B = B − {v}.
Step 4: If B = ∅, stop; otherwise, go to Step 2.
Time complexity: O(n²), where n = |V|. (See the example on the next page.)
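The four steps above can be sketched as follows, using an adjacency matrix so that each round is O(n) and the whole run is O(n²) as claimed (the function name and matrix representation are illustrative assumptions):

```python
INF = float("inf")

def prim(adj):
    """Prim's algorithm on an adjacency matrix adj (adj[u][v] = weight,
    INF if no edge). Returns the total MST weight in O(n^2) time."""
    n = len(adj)
    in_tree = [False] * n
    best = adj[0][:]          # cheapest known edge from the tree A to each vertex
    in_tree[0] = True         # Step 1: A = {vertex 0}, B = the rest
    total = 0
    for _ in range(n - 1):
        # Step 2: pick the vertex in B with the smallest connecting edge
        v = min((u for u in range(n) if not in_tree[u]), key=lambda u: best[u])
        total += best[v]
        in_tree[v] = True     # Step 3: move v from B into A
        for u in range(n):    # update cheapest crossing edges
            if not in_tree[u] and adj[v][u] < best[u]:
                best[u] = adj[v][u]
    return total              # Step 4: B is empty after n - 1 rounds
```

Each of the n − 1 rounds scans all vertices twice (selection plus update), giving the O(n²) bound without any priority queue.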

An Example of Prim's Algorithm.

The Single-Source Shortest Path Problem. Shortest paths from v0 to all destinations.

Dijkstra's Algorithm to Generate Single-Source Shortest Paths.
Input: a directed graph G = (V, E) and a source vertex v0. For each edge (u, v) ∈ E there is a nonnegative cost c(u, v). |V| = n + 1.
Output: for each v ∈ V, the length of a shortest path from v0 to v.

S := {v0}
For i := 1 to n do
Begin
  If (v0, vi) ∈ E then L(vi) := c(v0, vi) else L(vi) := ∞
End
(continued on the next page)

Dijkstra's Algorithm to Generate Single-Source Shortest Paths (cont'd).

For i := 1 to n do
Begin
  Choose u from V − S such that L(u) is the smallest
  S := S ∪ {u}  (* put u into S *)
  For all w in V − S do
    L(w) := min(L(w), L(u) + c(u, w))
End
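Both phases of the pseudocode (initializing the labels L, then repeatedly growing S) can be combined into one Python sketch; the cost-matrix representation and function name are illustrative assumptions:

```python
INF = float("inf")

def dijkstra(cost, source=0):
    """Dijkstra's algorithm in the O(n^2) array formulation.
    cost[u][v] is the nonnegative edge cost c(u, v), INF if absent.
    Returns the list of shortest-path lengths L from the source."""
    n = len(cost)
    L = cost[source][:]           # initial labels: L(v) = c(v0, v) or INF
    L[source] = 0
    S = {source}
    while len(S) < n:
        # choose u outside S with the smallest label
        u = min((v for v in range(n) if v not in S), key=lambda v: L[v])
        S.add(u)
        for w in range(n):        # relax: L(w) = min(L(w), L(u) + c(u, w))
            if w not in S and L[u] + cost[u][w] < L[w]:
                L[w] = L[u] + cost[u][w]
    return L
```

The greedy invariant is that when u is moved into S, L(u) is already the true shortest distance; this relies on all edge costs being nonnegative.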

Vertex  Shortest path from v0         Length
v1      v0 → v1                       1
v2      v0 → v1 → v2                  1 + 3 = 4
v3      v0 → v1 → v3                  1 + 4 = 5
v4      v0 → v1 → v2 → v4             6
v5      v0 → v1 → v3 → v5             8

The time complexity of Dijkstra's algorithm is O(n²). It is optimal, since in the worst case any algorithm must examine all Θ(n²) edge costs.

Given two sorted lists L1 = (a1, a2, ..., an1) and L2 = (b1, b2, ..., bn2), we can merge L1 and L2 into one sorted list by the following algorithm.

Linear Merge Algorithm.
Input: two sorted lists, L1 = (a1, a2, ..., an1) and L2 = (b1, b2, ..., bn2).
Output: a sorted list consisting of the elements of L1 and L2.

Begin
  i := 1; j := 1
  Do
    Compare ai and bj
    If ai > bj then output bj and j := j + 1
    else output ai and i := i + 1
  While (i ≤ n1 and j ≤ n2)
  If i > n1 then output bj, bj+1, ..., bn2
  else output ai, ai+1, ..., an1
End
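A direct Python rendering of this pseudocode (0-based indices instead of the 1-based indices above):

```python
def linear_merge(L1, L2):
    """Merge two sorted lists into one sorted list in O(n1 + n2) time."""
    i, j, out = 0, 0, []
    while i < len(L1) and j < len(L2):
        if L1[i] > L2[j]:             # output the smaller head element
            out.append(L2[j]); j += 1
        else:
            out.append(L1[i]); i += 1
    out.extend(L1[i:])                # one list is exhausted;
    out.extend(L2[j:])                # append the remainder of the other
    return out
```

Each comparison outputs one element, so merging lists of lengths n1 and n2 costs at most n1 + n2 − 1 comparisons.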

If more than two sorted lists are to be merged, we can still apply the linear merge algorithm, which merges two sorted lists, repeatedly. These merging processes are called 2-way merge because each merging step merges only two sorted lists.

A Greedy Algorithm to Generate an Optimal 2-Way Merge Tree.
Input: m sorted lists Li, i = 1, 2, ..., m, each Li consisting of ni elements.
Output: an optimal 2-way merge tree.
Step 1: Generate m trees, each with exactly one node (an external node) of weight ni.
Step 2: Choose two trees T1 and T2 with minimal weights.
Step 3: Create a new tree T whose root has T1 and T2 as subtrees and whose weight equals the sum of the weights of T1 and T2.
Step 4: Replace T1 and T2 by T.
Step 5: If only one tree is left, stop and return; otherwise, go to Step 2.
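Steps 2-4 are exactly what a min-heap makes cheap. The sketch below computes the total merge cost of the optimal tree (the function name is illustrative):

```python
import heapq

def optimal_merge_cost(lengths):
    """Total element-movement cost of an optimal 2-way merge tree:
    repeatedly merge the two smallest weights (Steps 2-4 above)."""
    heap = list(lengths)
    heapq.heapify(heap)               # Step 1: one external node per list
    cost = 0
    while len(heap) > 1:              # Step 5: stop when one tree is left
        a = heapq.heappop(heap)       # Step 2: two trees with minimal weights
        b = heapq.heappop(heap)
        cost += a + b                 # Step 3: new root's weight = a + b
        heapq.heappush(heap, a + b)   # Step 4: replace T1, T2 by T
    return cost
```

For the example on the next slide, 6 lists of lengths 2, 3, 5, 7, 11 and 13, the minimal total cost is 97.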

An example of 2-way merging. Example: 6 sorted lists with lengths 2, 3, 5, 7, 11 and 13.

The time complexity of the 2-way merge algorithm is O(n log n).

Huffman Codes. In telecommunication, how do we represent a set of messages, each with an access frequency, by sequences of 0's and 1's? To minimize the transmission and decoding costs, we may use short strings to represent the more frequently used messages. This problem can be solved by the 2-way merge algorithm.

An example of the Huffman algorithm.
Symbols: A, B, C, D, E, F, G
Frequencies: 2, 3, 5, 8, 13, 15, 18
Huffman codes: A: 10100, B: 10101, C: 1011, D: 100, E: 00, F: 01, G: 11
A Huffman code tree.
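The construction is the 2-way merge algorithm applied to frequencies. A sketch (the exact bit patterns depend on how ties and left/right children are resolved, so only the code lengths are guaranteed to match the slide; names are illustrative):

```python
import heapq
from itertools import count

def huffman_codes(freqs):
    """Huffman coding via the 2-way merge idea: repeatedly combine the
    two least frequent subtrees. Returns {symbol: bitstring}."""
    tick = count()                          # tie-breaker for equal frequencies
    heap = [(f, next(tick), sym) for sym, f in freqs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        f1, _, a = heapq.heappop(heap)      # two least frequent subtrees
        f2, _, b = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, next(tick), (a, b)))
    codes = {}

    def walk(node, prefix):                 # assign 0 to left, 1 to right
        if isinstance(node, tuple):
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:
            codes[node] = prefix

    walk(heap[0][2], "")
    return codes
```

Because subtrees are merged smallest-first, rarer symbols end up deeper in the tree and thus get longer codes, which is exactly the stated goal.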

The Minimal Cycle Basis Problem.
3 cycles: A1 = {ab, bc, ca}, A2 = {ac, cd, da}, A3 = {ab, bc, cd, da},
where A3 = A1 ⊕ A2 (with A ⊕ B = (A ∪ B) − (A ∩ B)), A2 = A1 ⊕ A3, and A1 = A2 ⊕ A3.
Cycle basis: {A1, A2} or {A1, A3} or {A2, A3}.

Def: a cycle basis of a graph is a set of cycles such that every cycle in the graph can be generated by applying ⊕ to some cycles of the basis. The minimal cycle basis problem: given a weighted graph, find a cycle basis of minimal total weight.

The minimal cycle basis here is {A1, A2}, where A1 = {ab, bc, ca} and A2 = {ac, cd, da}.

A greedy algorithm for finding a minimal cycle basis:
Step 1: Determine the size of the minimal cycle basis, denoted k.
Step 2: Find all of the cycles. Sort the cycles by weight.
Step 3: Add cycles to the cycle basis one by one. Check whether each added cycle is a combination of cycles already in the basis; if so, delete it.
Step 4: Stop when the cycle basis has k cycles.

Detailed Steps for the Minimal Cycle Basis Problem.
Step 1: a cycle basis corresponds to the fundamental set of cycles with respect to a spanning tree. The number of cycles in a cycle basis is k = |E| − (|V| − 1) = |E| − |V| + 1. (Illustrated by a graph, a spanning tree of it, and the corresponding fundamental set of cycles.)

Step 2: How do we find all cycles in a graph? See [Reingold, Nievergelt and Deo 1977]. How many cycles are there in a graph in the worst case? In a complete digraph of n vertices and n(n − 1) edges, the number of cycles is exponential in n. Step 3: How do we check whether a cycle is a linear combination of some cycles? By Gaussian elimination.

E.g., two cycles C1 and C2 are represented as rows of a 0/1 matrix, one column per edge. When C3 is added as a new row, Gaussian elimination (⊕ on rows 1 and 3, then ⊕ on rows 2 and 3) reduces the new row to the empty row, since C3 = C1 ⊕ C2.
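Since the matrix is over GF(2), each cycle can be encoded as an integer bitmask (one bit per edge) and row operations become XOR. A sketch of the independence check used in Step 3 (encoding and names are illustrative):

```python
def add_cycle(basis, cycle):
    """Gaussian elimination over GF(2). `cycle` is an edge set encoded as an
    int bitmask. Returns True if the cycle is independent of `basis`
    (and adds its reduced form), False if it is a XOR-combination."""
    v = cycle
    for row in basis:              # rows kept in decreasing pivot order
        v = min(v, v ^ row)        # XOR the row in iff it clears its pivot bit
    if v == 0:
        return False               # cycle = XOR of existing basis cycles
    basis.append(v)
    basis.sort(reverse=True)       # restore decreasing pivot order
    return True
```

With edges ab, bc, ca, cd, da mapped to bits 0-4, the slide's example becomes A1 = 0b00111, A2 = 0b11100, A3 = 0b11011; the third is rejected as dependent.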

The 2-Terminal One-to-Any Special Channel Routing Problem. Def: given two sets of terminals on the upper and lower rows, respectively, we have to connect each upper terminal to a lower-row terminal in a one-to-one fashion, so that the number of tracks used is minimized.

Two Feasible Solutions.

Redrawing Solutions. (a) An optimal solution. (b) Another solution.

At each point, the local density of a solution is the number of lines a vertical line through that point intersects. The problem: minimize the density. The density is a lower bound on the number of tracks. Upper-row terminals: P1, P2, ..., Pn from left to right. Lower-row terminals: Q1, Q2, ..., Qm from left to right, where m > n. An optimal solution never contains a crossing connection.

Suppose the minimum density of a problem instance is d. We can use the following greedy algorithm:
Step 1: Connect P1 to Q1.
Step 2: After Pi is connected to Qj, check whether Pi+1 can be connected to Qj+1. If that would increase the density to d + 1, try to connect Pi+1 to Qj+2 instead.
Step 3: Repeat Step 2 until all Pi's are connected.

A Solution Produced by the Greedy Algorithm (d = 1).

The Knapsack Problem. Given n objects, each with a weight wi > 0 and a profit pi > 0, and a knapsack of capacity M:
Maximize Σ pi·xi (summed over i = 1, ..., n)
Subject to Σ wi·xi ≤ M and 0 ≤ xi ≤ 1, 1 ≤ i ≤ n.

The knapsack problem is different from the 0/1 knapsack problem. In the 0/1 knapsack problem, xi is either 0 or 1, while in the knapsack problem, 0 ≤ xi ≤ 1.

The Knapsack Algorithm. The greedy algorithm:
Step 1: Sort the objects by pi/wi into nonincreasing order.
Step 2: Put the objects into the knapsack according to the sorted sequence, as far as possible.
E.g. n = 3, M = 20, (p1, p2, p3) = (25, 24, 15), (w1, w2, w3) = (18, 15, 10).
Sol: p1/w1 = 25/18 ≈ 1.39, p2/w2 = 24/15 = 1.6, p3/w3 = 15/10 = 1.5.
Optimal solution: x1 = 0, x2 = 1, x3 = 1/2, with total profit 24 + 7.5 = 31.5.
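The two steps above can be sketched directly (the function name is illustrative; weights are assumed strictly positive, as in the problem statement):

```python
def fractional_knapsack(profits, weights, M):
    """Greedy fractional knapsack: fill by nonincreasing profit/weight
    ratio. Returns (total profit, list of fractions x_i)."""
    n = len(profits)
    # Step 1: indices sorted by p_i / w_i, largest ratio first
    order = sorted(range(n), key=lambda i: profits[i] / weights[i], reverse=True)
    x = [0.0] * n
    room, total = M, 0.0
    for i in order:                          # Step 2: fill as far as possible
        take = min(1.0, room / weights[i])   # whole object, or the fraction that fits
        x[i] = take
        total += take * profits[i]
        room -= take * weights[i]
        if room <= 0:
            break
    return total, x
```

At most one object is taken fractionally (the one that overflows the capacity), which is why this greedy choice is optimal for the fractional version but not for the 0/1 version.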