ICS 353: Design and Analysis of Algorithms
King Fahd University of Petroleum & Minerals
Information & Computer Science Department
Greedy Algorithms
Reading Assignment
M. Alsuwaiyel, Introduction to Algorithms: Design Techniques and Analysis, World Scientific Publishing Co., Inc.: Chapter 8, Section 8.1, and Sections 8.2 – 8.4 (except 8.4.1).
T. Cormen, C. Leiserson, R. Rivest & C. Stein, Introduction to Algorithms, 2nd Edition, The MIT Press, 2001.
Greedy Algorithms
Like dynamic programming algorithms, greedy algorithms are usually designed to solve optimization problems.
Unlike dynamic programming algorithms, greedy algorithms are iterative in nature: an optimal solution is reached through a sequence of locally optimal choices.
This approach does not always work. A proof that the algorithm does what it claims is needed, and such a proof is usually not easy to obtain.
Fractional Knapsack Problem
Given n items of sizes s1, s2, …, sn and values v1, v2, …, vn, and a knapsack of size C, the problem is to find x1, x2, …, xn that maximize ∑_{i=1}^{n} vi xi subject to ∑_{i=1}^{n} si xi ≤ C, where 0 ≤ xi ≤ 1 for each i.
Solution to Fractional Knapsack Problem
Consider yi = vi / si for each item i. What does yi represent? What is the greedy solution?
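To make the greedy rule hinted at above concrete, here is a minimal Python sketch (the function name and the example data are illustrative, not from the slides): sort items by yi = vi / si and take items, or fractions of items, in that order.

```python
# Greedy fractional knapsack sketch: order items by value per unit size
# y_i = v_i / s_i and take as much of each item as still fits.
def fractional_knapsack(sizes, values, capacity):
    n = len(sizes)
    order = sorted(range(n), key=lambda i: values[i] / sizes[i], reverse=True)
    x = [0.0] * n          # fraction of each item taken
    total = 0.0
    remaining = capacity
    for i in order:
        if remaining <= 0:
            break
        take = min(sizes[i], remaining)   # whole item, or the fraction that fits
        x[i] = take / sizes[i]
        total += x[i] * values[i]
        remaining -= take
    return total, x

# Example: items of sizes (5, 4, 3), values (50, 40, 20), capacity 6.
print(fractional_knapsack([5, 4, 3], [50, 40, 20], 6))   # (60.0, [1.0, 0.25, 0.0])
```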
Activity Selection Problem
Problem Formulation
Given a set of n activities, S = {a1, a2, ..., an}, that require exclusive use of a common resource, find a largest possible set of nonoverlapping (also called mutually compatible) activities. For example, scheduling the use of a classroom.
Assume that ai needs the resource during the period [si, fi), a half-open interval, where si is the start time and fi is the finish time of the activity.
Note: we could have many other objectives, e.g., schedule the room for the longest total time, or maximize income from rental fees.
Activity Selection Problem: Example
Assume the following set S of activities, sorted by finish time; find a maximum-size mutually compatible set.
[Table: start times si and finish times fi for activities i = 1, …, 9, sorted by finish time]
Solving the Activity Selection Problem
Define Si,j = { ak ∈ S : fi ≤ sk < fk ≤ sj }, the activities that start after ai finishes and finish before aj starts.
Activities in Si,j are compatible with all activities that finish by fi and all activities that start no earlier than sj.
Add the following [fictitious] activities: a0 = [–∞, 0) and an+1 = [∞, ∞ + 1).
Hence, S = S0,n+1 and the range of Si,j is 0 ≤ i, j ≤ n + 1.
Solving the Activity Selection Problem
Assume that activities are sorted by monotonically increasing finish time, i.e., f0 ≤ f1 ≤ f2 ≤ ... ≤ fn < fn+1.
Then Si,j = ∅ for i ≥ j.
Proof: if i ≥ j and some ak were in Si,j, we would have fi ≤ sk < fk ≤ sj < fj ≤ fi, a contradiction.
Therefore, we only need to worry about Si,j where 0 ≤ i < j ≤ n + 1.
Solving the Activity Selection Problem
Suppose that a solution to Si,j includes ak. We then have two subproblems:
Si,k (activities that start after ai finishes and finish before ak starts), and
Sk,j (activities that start after ak finishes and finish before aj starts).
The solution to Si,j is (solution to Si,k) ∪ {ak} ∪ (solution to Sk,j).
Since ak is in neither subproblem, and the subproblems are disjoint,
|solution to Si,j| = |solution to Si,k| + 1 + |solution to Sk,j|.
Recursive Solution to Activity Selection Problem
Let Ai,j = an optimal solution to Si,j. Then Ai,j = Ai,k ∪ {ak} ∪ Ak,j, assuming Si,j is nonempty and we know ak.
Hence, letting c[i, j] = |Ai,j| denote the size of an optimal solution to Si,j, we get the recurrence
c[i, j] = 0 if Si,j = ∅, and c[i, j] = max{ c[i, k] + c[k, j] + 1 : ak ∈ Si,j } otherwise.
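To make the recurrence concrete, here is a minimal top-down Python sketch with memoization (the function name and sample data are hypothetical); the greedy algorithm developed next avoids this exhaustive choice of ak.

```python
from functools import lru_cache

# Top-down evaluation of the recurrence c[i, j]; activities are (start, finish)
# pairs, and fictitious activities a0 and a_{n+1} are added as on the slides.
def max_compatible(activities):
    acts = sorted(activities, key=lambda a: a[1])        # sort by finish time
    INF = float("inf")
    acts = [(-INF, 0)] + acts + [(INF, INF)]             # a0 and a_{n+1}
    n = len(acts) - 2

    @lru_cache(maxsize=None)
    def c(i, j):
        best = 0
        for k in range(i + 1, j):
            s_k, f_k = acts[k]
            if acts[i][1] <= s_k and f_k <= acts[j][0]:  # a_k lies in S_{i,j}
                best = max(best, c(i, k) + 1 + c(k, j))
        return best

    return c(0, n + 1)

# Example with hypothetical start/finish times:
print(max_compatible([(1, 4), (3, 5), (0, 6), (5, 7), (3, 9), (5, 9), (6, 10), (8, 11)]))  # 3
```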
Finding the Greedy Algorithm
Theorem: Let Si,j ≠ ∅, and let am be the activity in Si,j with the earliest finish time: fm = min{ fk : ak ∈ Si,j }. Then:
am is used in some maximum-size subset of mutually compatible activities of Si,j.
Si,m = ∅, so that choosing am leaves Sm,j as the only nonempty subproblem.
Recursive Greedy Algorithm
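A hedged Python sketch of the recursive greedy selection described by the theorem above, in the spirit of the Recursive-Activity-Selector of Cormen et al.; activities are assumed to be given by start/finish arrays sorted by finish time, with index 0 playing the role of the fictitious a0.

```python
# Recursive greedy activity selector sketch.
# s[k], f[k] are start/finish times; activities 1..n are sorted by finish time,
# and f[0] = 0 stands for the fictitious activity a0.
def recursive_activity_selector(s, f, i, n):
    m = i + 1
    # Skip activities that start before a_i finishes.
    while m <= n and s[m] < f[i]:
        m += 1
    if m <= n:
        # a_m is the compatible activity with the earliest finish time: take it.
        return [m] + recursive_activity_selector(s, f, m, n)
    return []

# Example with hypothetical data (index 0 is the fictitious a0):
s = [0, 1, 3, 0, 5, 3, 5, 6, 8]
f = [0, 4, 5, 6, 7, 9, 9, 10, 11]
print(recursive_activity_selector(s, f, 0, len(s) - 1))   # [1, 4, 8]
```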
Iterative Greedy Algorithm
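And a sketch of the iterative version, under the same assumptions as the recursive sketch above.

```python
# Iterative greedy activity selector: scan activities in order of finish time and
# keep each one that starts no earlier than the finish of the last one selected.
def greedy_activity_selector(s, f):
    n = len(s) - 1            # index 0 is the fictitious a0 with f[0] = 0
    if n < 1:
        return []
    selected = [1]            # a1 has the earliest finish time, so it is always taken
    last = 1
    for m in range(2, n + 1):
        if s[m] >= f[last]:
            selected.append(m)
            last = m
    return selected

s = [0, 1, 3, 0, 5, 3, 5, 6, 8]
f = [0, 4, 5, 6, 7, 9, 9, 10, 11]
print(greedy_activity_selector(s, f))   # [1, 4, 8]
```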
Greedy Strategy
1. Determine the optimal substructure.
2. Develop a recursive solution.
3. Prove that at any stage of the recursion, one of the optimal choices is the greedy choice; therefore, it is always safe to make the greedy choice.
4. Show that all but one of the subproblems resulting from the greedy choice are empty.
5. Develop a recursive greedy algorithm.
6. Convert it to an iterative algorithm.
Shortest Paths Problems
Input: a graph with non-negative weights (costs) associated with each edge.
Output: the list of edges forming the shortest path.
Sample problems:
Find the shortest path between two named vertices.
Find the shortest path from a source vertex S to all other vertices.
Find the shortest path between all pairs of vertices.
We will actually calculate only distances, not the paths themselves.
Shortest Paths Definitions
δ(A, B) is the shortest distance from vertex A to vertex B.
length(A, B) is the weight of the edge connecting A to B. If there is no such edge, then length(A, B) = ∞.
[Figure: example weighted graph on vertices A, B, C, D]
Single-Source Shortest Paths
Problem: given G = (V, E) and a start vertex s, find the shortest path from s to all other vertices. Assume V = {1, 2, …, n} and s = 1.
Solution: a greedy algorithm called Dijkstra's Algorithm.
Dijkstra’s Algorithm Outline
Partition V into two sets: X = {1} and Y = {2, 3, …, n}.
Initialize λ[i] for 1 ≤ i ≤ n as follows: …
Select y ∈ Y such that λ[y] is minimum; λ[y] is the length of the shortest path from 1 to y that uses only vertices in the set X as intermediates.
Remove y from Y, add it to X, and update λ[w] for each w ∈ Y with (y, w) ∈ E if the path through y is shorter.
Example
[Figure: weighted directed graph on vertices A, B, C, D, E]
Dijkstra’s Algorithm
[Table: trace of the distance labels for vertices A–E, initially and after each vertex is processed]
Dijkstra’s Algorithm
Input: A weighted directed graph G = (V, E), where V = {1, 2, …, n}.
Output: The distance from vertex 1 to every other vertex in G.
X = {1}; Y = {2, 3, …, n}; λ[1] = 0
for y = 2 to n do
    if y is adjacent to 1 then λ[y] = length[1, y]
    else λ[y] = ∞
    end if
end for
for j = 1 to n - 1 do
    Let y ∈ Y be such that λ[y] is minimum    // the selection referred to as Step 7 below
    X = X ∪ {y}       // add vertex y to X
    Y = Y - {y}       // delete vertex y from Y
    for each edge (y, w) do
        if w ∈ Y and λ[y] + length[y, w] < λ[w] then
            λ[w] = λ[y] + length[y, w]
        end if
    end for
end for
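For reference, a minimal Python sketch of the same procedure (the adjacency-list representation and the function name are illustrative); it finds the minimum label by a plain linear scan, i.e., Approach 1 under Time Complexity below.

```python
import math

# Sketch of the algorithm above. Vertices are 1..n; graph[y] is a list of
# (w, weight) pairs, an illustrative adjacency-list choice.
def dijkstra(n, graph, source=1):
    lam = {v: math.inf for v in range(1, n + 1)}    # the labels λ[v]
    lam[source] = 0
    for w, weight in graph.get(source, []):         # initialize neighbours of the source
        lam[w] = weight
    X, Y = {source}, set(range(1, n + 1)) - {source}
    for _ in range(n - 1):
        y = min(Y, key=lambda v: lam[v])            # linear scan for the minimum label
        Y.remove(y)
        X.add(y)
        for w, weight in graph.get(y, []):
            if w in Y and lam[y] + weight < lam[w]:
                lam[w] = lam[y] + weight
    return lam

# Example with a hypothetical 4-vertex graph:
g = {1: [(2, 1), (3, 12)], 2: [(3, 9), (4, 3)], 4: [(3, 4)]}
print(dijkstra(4, g))   # {1: 0, 2: 1, 3: 8, 4: 4}
```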
Correctness of Dijkstra’s Algorithm
Lemma: In Dijkstra’s algorithm, when a vertex y is chosen in Step 7, if its label λ[y] is finite then λ[y] = δ[y].
Proof:
Time Complexity
Mainly depends on how we implement Step 7, i.e., finding y such that λ[y] is minimum.
Approach 1: scan through the vector representing the current distances of vertices in Y: Θ(n) per selection, Θ(n²) overall.
Approach 2: use a min-heap to maintain the vertices in the set Y: O(log n) per heap operation, O((n + m) log n) overall.
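A hedged sketch of Approach 2 using Python's heapq; it relies on lazy deletion of stale heap entries rather than an explicit decrease-key, which is a common way to realize the min-heap approach.

```python
import heapq, math

# Min-heap variant: pop the vertex with the smallest tentative label, skipping
# stale heap entries (lazy deletion). graph[y] is a list of (w, weight) pairs.
def dijkstra_heap(n, graph, source=1):
    lam = {v: math.inf for v in range(1, n + 1)}
    lam[source] = 0
    heap = [(0, source)]
    done = set()
    while heap:
        d, y = heapq.heappop(heap)
        if y in done:
            continue                      # stale entry, already processed
        done.add(y)
        for w, weight in graph.get(y, []):
            if w not in done and d + weight < lam[w]:
                lam[w] = d + weight
                heapq.heappush(heap, (lam[w], w))
    return lam

g = {1: [(2, 1), (3, 12)], 2: [(3, 9), (4, 3)], 4: [(3, 4)]}
print(dijkstra_heap(4, g))   # {1: 0, 2: 1, 3: 8, 4: 4}
```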
Minimum Cost Spanning Trees
Minimum Cost Spanning Tree (MST) Problem:
Input: an undirected, weighted, connected graph G.
Output: a subgraph of G that 1) has minimum total cost, measured by summing the weights of all the edges in the subgraph, and 2) keeps the vertices connected.
What does such a subgraph look like?
MST Example
[Figure: weighted undirected graph on vertices A, B, C, D, E]
Kruskal’s Algorithm
Initially, each vertex is in its own MST.
Merge two MSTs that have the shortest edge between them.
Use a priority queue to order the unprocessed edges; grab the next one at each step.
How do we tell whether an edge connects two vertices that are already in the same MST? Use the UNION/FIND algorithm with the parent-pointer representation.
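A minimal sketch of a parent-pointer UNION/FIND structure (union by size and path compression are standard refinements; the class and method names are illustrative).

```python
# Parent-pointer UNION/FIND: each vertex points to its parent; a root represents
# the tree (component) that currently contains the vertex.
class UnionFind:
    def __init__(self, vertices):
        self.parent = {v: v for v in vertices}   # each vertex starts as its own root
        self.size = {v: 1 for v in vertices}

    def find(self, v):
        # Follow parent pointers to the root, compressing the path on the way back.
        root = v
        while self.parent[root] != root:
            root = self.parent[root]
        while self.parent[v] != root:
            self.parent[v], v = root, self.parent[v]
        return root

    def union(self, u, v):
        ru, rv = self.find(u), self.find(v)
        if ru == rv:
            return False                          # already in the same tree
        if self.size[ru] < self.size[rv]:
            ru, rv = rv, ru
        self.parent[rv] = ru                      # attach the smaller tree under the larger
        self.size[ru] += self.size[rv]
        return True
```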
Example
[Figure: weighted undirected graph on vertices A, B, C, D, E]
Kruskal’s MST Algorithm
Sort the edges of E(G) by weight in non-decreasing order;
for each vertex v ∈ V(G) do
    New_Tree(v);    // create a tree with the single root node v
end for
T = ∅;              // MST initialized to empty
while |T| < n - 1 do
    Let (u, v) be the next edge in E(G);
    if FIND(u) ≠ FIND(v) then
        T = T ∪ {(u, v)};
        UNION(u, v);
    end if
end while
Return T;
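A compact Python sketch of the algorithm above (the example graph is hypothetical, and a simple parent dictionary with path halving stands in for the full UNION/FIND structure sketched earlier).

```python
# Kruskal's algorithm sketch; edges are (weight, u, v) triples.
def kruskal(vertices, edges):
    parent = {v: v for v in vertices}

    def find(v):                              # follow parent pointers to the root
        while parent[v] != v:
            parent[v] = parent[parent[v]]     # path halving
            v = parent[v]
        return v

    T = []
    for w, u, v in sorted(edges):             # edges in non-decreasing order of weight
        ru, rv = find(u), find(v)
        if ru != rv:                          # u and v are in different trees
            T.append((u, v, w))
            parent[rv] = ru                   # UNION
        if len(T) == len(vertices) - 1:
            break
    return T

# Example with a hypothetical graph:
V = ["A", "B", "C", "D", "E"]
E = [(5, "A", "B"), (2, "B", "D"), (1, "A", "C"), (3, "C", "D"), (2, "C", "E"), (11, "B", "E")]
print(kruskal(V, E))   # an MST of total weight 8
```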
Asymptotic Analysis of Kruskal’s Algorithm
Correctness of Kruskal’s Algorithm
Lemma: Algorithm Kruskal correctly finds a minimum cost spanning tree in a weighted undirected graph.
Proof:
Theorem: Algorithm Kruskal finds a minimum cost spanning tree in a weighted undirected graph in O(m log m) time, where m is the number of edges in G.
Money Change Problem
Given a currency system that has n coins with values v1, v2, ..., vn, where v1 = 1, the objective is to pay change of value y in such a way that the total number of coins is minimized.
More formally, we want to minimize the quantity ∑_{i=1}^{n} xi subject to the constraint ∑_{i=1}^{n} xi vi = y.
Here, x1, x2, ..., xn are nonnegative integers (so xi may be zero).
Money Change Problem
What is a greedy algorithm to solve this problem?
Is the greedy algorithm optimal?
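A sketch of the natural greedy algorithm (repeatedly take as many of the largest remaining coin value as possible); whether it is optimal depends on the coin system, which is exactly what the second question asks: the second example below shows a system where it fails. The coin values and amounts are illustrative.

```python
# Greedy change-making: take as many of the largest coin as possible, then the
# next largest, and so on.
def greedy_change(values, y):
    counts = {}
    for v in sorted(values, reverse=True):
        counts[v], y = divmod(y, v)        # number of coins of value v, and the remainder
    return counts if y == 0 else None      # y always reaches 0 when v1 = 1 is present

print(greedy_change([1, 5, 10, 25], 63))   # {25: 2, 10: 1, 5: 0, 1: 3}: 6 coins, optimal here
print(greedy_change([1, 5, 11], 15))       # {11: 1, 5: 0, 1: 4}: 5 coins, while 5+5+5 uses only 3
```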