ICS 353: Design and Analysis of Algorithms


ICS 353: Design and Analysis of Algorithms: Greedy Algorithms King Fahd University of Petroleum & Minerals, Information & Computer Science Department

Reading Assignment M. Alsuwaiyel, Introduction to Algorithms: Design Techniques and Analysis, World Scientific Publishing Co., Inc., 1999: Chapter 8, Section 8.1, and Sections 8.2–8.4 (except 8.2.1 and 8.4.1). T. Cormen, C. Leiserson, R. Rivest & C. Stein, Introduction to Algorithms, 2nd Edition, The MIT Press, 2001.

Greedy Algorithms Like dynamic programming algorithms, greedy algorithms are usually designed to solve optimization problems. Unlike dynamic programming algorithms, greedy algorithms are iterative in nature: an optimal solution is reached from locally optimal choices. This approach does not work all the time; a proof that the algorithm does what it claims is needed, and is usually not easy to obtain.

Fractional Knapsack Problem Given n items of sizes s1, s2, …, sn, values v1, v2, …, vn, and a knapsack of capacity C, the problem is to find x1, x2, …, xn that maximize ∑i xi vi subject to ∑i xi si ≤ C and 0 ≤ xi ≤ 1.

Solution to Fractional Knapsack Problem Consider yi = vi / si. What is yi? It is the value density of item i: the value gained per unit of size. What is the solution? Sort the items by yi in non-increasing order, then fill the knapsack in that order, taking each item whole while it fits and a fraction of the first item that does not fit.
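As an illustration of this greedy rule, a short Python sketch (the function name and list-based representation are my own choices, not the course's reference code):

```python
def fractional_knapsack(sizes, values, capacity):
    # Sort items by value density y_i = v_i / s_i, highest first.
    items = sorted(zip(sizes, values), key=lambda sv: sv[1] / sv[0], reverse=True)
    total = 0.0
    remaining = capacity
    fractions = []  # x_i for each item, in density-sorted order
    for s, v in items:
        if remaining <= 0:
            fractions.append(0.0)
            continue
        x = min(1.0, remaining / s)  # take the whole item, or the fraction that fits
        fractions.append(x)
        total += x * v
        remaining -= x * s
    return total, fractions
```

For sizes (10, 20, 30), values (60, 100, 120), and C = 50, the sketch takes the first two items whole and two thirds of the third, for a total value of 240.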

Activity Selection Problem Problem Formulation: Given a set of n activities S = {a1, a2, ..., an} that require exclusive use of a common resource, find the largest possible set of non-overlapping (also called mutually compatible) activities. For example, scheduling the use of a classroom. Assume that ai needs the resource during the period [si, fi), a half-open interval, where si is the start time and fi is the finish time of the activity. Note: we could have many other objectives, e.g., schedule the room for the longest total time, or maximize the income from rental fees.

Activity Selection Problem: Example Assume the following set S of activities, sorted by finish time; find a maximum-size mutually compatible set. [Table: activities i = 1, …, 9 with their start times si and finish times fi]


Solving the Activity Selection Problem Define Si,j = {ak ∈ S : fi ≤ sk < fk ≤ sj}, the set of activities that start after ai finishes and finish before aj starts. Activities in Si,j are compatible with ai and aj, as well as with every activity finishing by fi and every activity starting no earlier than sj. Add the following fictitious activities: a0 = [−∞, 0) and an+1 = [∞, ∞ + 1). Hence, S = S0,n+1 and the range of Si,j is 0 ≤ i, j ≤ n + 1.

Solving the Activity Selection Problem Assume that activities are sorted by monotonically increasing finish time: i.e., f0 ≤ f1 ≤ f2 ≤ ... ≤ fn < fn+1. Then Si,j = ∅ for i ≥ j. Proof: If some ak were in Si,j, then fi ≤ sk < fk ≤ sj < fj, so fi < fj, contradicting i ≥ j. Therefore, we only need to worry about Si,j where 0 ≤ i < j ≤ n + 1.

Solving the Activity Selection Problem Suppose that a solution to Si,j includes ak. We have two subproblems: Si,k (start after ai finishes, finish before ak starts) and Sk,j (start after ak finishes, finish before aj starts). The solution to Si,j is (solution to Si,k) ∪ {ak} ∪ (solution to Sk,j). Since ak is in neither subproblem, and the subproblems are disjoint, |solution to Si,j| = |solution to Si,k| + 1 + |solution to Sk,j|.

Recursive Solution to Activity Selection Problem Let Ai,j = the optimal solution to Si,j, and let c[i, j] = |Ai,j|. So Ai,j = Ai,k ∪ {ak} ∪ Ak,j, assuming Si,j is nonempty and we know ak. Hence, c[i, j] = 0 if Si,j = ∅, and c[i, j] = max{ c[i, k] + c[k, j] + 1 : ak ∈ Si,j } otherwise.

Finding the Greedy Algorithm Theorem: Let Si,j ≠ ∅, and let am be the activity in Si,j with the earliest finish time: fm = min{ fk : ak ∈ Si,j }. Then: am is used in some maximum-size subset of mutually compatible activities of Si,j, and Si,m = ∅, so that choosing am leaves Sm,j as the only nonempty subproblem.

Recursive Greedy Algorithm
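A sketch of the recursive greedy algorithm, following the scheme in the CLRS reading (the 1-indexed lists padded with a fictitious a0 = [−∞, 0) at index 0 are an assumption of this sketch):

```python
def recursive_activity_selector(s, f, k, n):
    # s, f: 1-indexed start/finish times sorted by finish time,
    # with a dummy activity at index 0 whose finish time is f[0].
    # Returns the indices of a maximum-size compatible set in S_{k, n+1}.
    m = k + 1
    while m <= n and s[m] < f[k]:  # skip activities that overlap a_k
        m += 1
    if m <= n:
        # a_m is the earliest-finishing compatible activity: take it greedily.
        return [m] + recursive_activity_selector(s, f, m, n)
    return []
```

On the classic 11-activity example from CLRS (s = 1,3,0,5,3,5,6,8,8,2,12 and f = 4,5,6,7,9,9,10,11,12,14,16), the call with k = 0 returns the activities 1, 4, 8, 11.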

Iterative Greedy Algorithm
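The recursion is tail-like, so it converts directly to a loop. An iterative sketch under the same 1-indexed, sorted-by-finish-time convention (again illustrative, not the slide's exact code):

```python
def greedy_activity_selector(s, f):
    # s, f: parallel 1-indexed lists (dummy entry at index 0), sorted by f.
    n = len(s) - 1
    A = [1]               # the first activity to finish is always chosen
    k = 1                 # index of the most recently added activity
    for m in range(2, n + 1):
        if s[m] >= f[k]:  # a_m starts after a_k finishes: compatible
            A.append(m)
            k = m
    return A
```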

Greedy Strategy Determine the optimal substructure. Develop a recursive solution. Prove that at any stage of recursion, one of the optimal choices is the greedy choice. Therefore, it's always safe to make the greedy choice. Show that all but one of the subproblems resulting from the greedy choice are empty. Develop a recursive greedy algorithm. Convert it to an iterative algorithm.

Shortest Paths Problems Input: A graph with non-negative weights or costs associated with each edge. Output: The list of edges forming the shortest path. Sample problems: find the shortest path between two named vertices; find the shortest paths from S to all other vertices; find the shortest paths between all pairs of vertices. We will actually calculate only distances, not paths.

Shortest Paths Definitions δ(A, B) is the shortest distance from vertex A to B. length(A, B) is the weight of the edge connecting A to B. If there is no such edge, then length(A, B) = ∞. [Figure: example weighted graph on vertices A, B, C, D with edge weights 8, 1, 10, 5, 7]

Single-Source Shortest Paths Problem: Given G = (V, E) and a start vertex s, find the shortest path from s to all other vertices. Assume V = {1, 2, …, n} and s = 1. Solution: a greedy algorithm called Dijkstra's Algorithm.

Dijkstra’s Algorithm Outline Partition V into two sets: X = {1} and Y = {2, 3, …, n}. Initialize λ[i] for 1 ≤ i ≤ n as follows: λ[1] = 0; λ[y] = length[1, y] if y is adjacent to 1, and λ[y] = ∞ otherwise. Select y ∈ Y such that λ[y] is minimum; λ[y] is the length of the shortest path from 1 to y that uses only vertices in set X as intermediates. Remove y from Y, add it to X, and update λ[w] for each w ∈ Y with (y, w) ∈ E if the path through y is shorter.

Example [Figure: weighted directed graph on vertices A, B, C, D, E with edge weights 5, 6, 1, 2, 11, 2, 3, 15]

Dijkstra’s Algorithm [Trace table: distance labels for vertices A–E, shown initially and updated after processing each selected vertex, starting with A]

Dijkstra’s Algorithm Input: A weighted directed graph G = (V, E), where V = {1, 2, …, n}. Output: The distance from vertex 1 to every other vertex in G.
1. X = {1}; Y = {2, 3, …, n}
2. λ[1] = 0
3. for y = 2 to n do
4.     if y is adjacent to 1 then λ[y] = length[1, y] else λ[y] = ∞
5. end for
6. for j = 1 to n − 1 do
7.     Let y ∈ Y be such that λ[y] is minimum
8.     X = X ∪ {y}    // add vertex y to X
9.     Y = Y − {y}    // delete vertex y from Y
10.    for each edge (y, w) do
11.        if w ∈ Y and λ[y] + length[y, w] < λ[w] then λ[w] = λ[y] + length[y, w]
12.    end for
13. end for
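For illustration, a Python sketch of the same algorithm using a min-heap for the selection in step 7 (the adjacency-dict representation, the lazy-deletion of stale heap entries, and the names are my own choices):

```python
import heapq

def dijkstra(graph, source):
    # graph: dict mapping each vertex u to a list of (v, weight) pairs.
    # Returns the final distance label for every vertex.
    dist = {u: float('inf') for u in graph}
    dist[source] = 0
    heap = [(0, source)]
    while heap:
        d, y = heapq.heappop(heap)      # vertex in Y with minimum label
        if d > dist[y]:
            continue                    # stale entry: y was already finalized
        for w, length in graph[y]:
            if d + length < dist[w]:    # relax edge (y, w)
                dist[w] = d + length
                heapq.heappush(heap, (dist[w], w))
    return dist
```

For example, on the graph {1: [(2, 5), (3, 2)], 2: [(4, 1)], 3: [(2, 2), (4, 7)], 4: []} with source 1, the labels come out as 0, 4, 2, 5 for vertices 1 through 4.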

Correctness of Dijkstra’s Algorithm Lemma: In Dijkstra’s algorithm, when a vertex y is chosen in Step 7, if its label λ[y] is finite then λ[y] = δ[y], the true shortest distance from vertex 1 to y. Proof:

Time Complexity The running time mainly depends on how we implement step 7, i.e., finding y ∈ Y such that λ[y] is minimum. Approach 1: Scan through the vector representing the current distances of vertices in Y: Θ(n) per iteration, Θ(n²) overall. Approach 2: Use a min-heap to maintain the vertices in the set Y keyed by λ: O(log n) per selection and per update, O(m log n) overall for a connected graph.

Minimum Cost Spanning Trees Minimum Cost Spanning Tree (MST) Problem: Input: An undirected weighted connected graph G. Output: The subgraph of G that 1) has minimum total cost as measured by summing the weights of all the edges in the subgraph, and 2) keeps the vertices connected. What does such a subgraph look like? (It is a spanning tree of G.)

MST Example [Figure: weighted graph on vertices A, B, C, D, E with edge weights 5, 2, 1, 4, 11, 2, 3, 2]

Kruskal’s Algorithm Initially, each vertex is in its own MST. Merge two MSTs that have the shortest edge between them. Use a priority queue to order the unprocessed edges; grab the next one at each step. How do we tell whether an edge connects two vertices that are already in the same MST? Use the UNION/FIND algorithm with the parent-pointer representation.

Example [Figure: weighted graph on vertices A, B, C, D, E with edge weights 5, 2, 1, 3, 11, 2, 3, 2]

Kruskal’s MST Algorithm
Sort the edges of E(G) by weight in non-decreasing order;
for each vertex v ∈ V(G) do New_Tree(v); // create a tree with the single root node v
T = ∅; // MST initialized to empty
while |T| < n − 1 do
    Let (u, v) be the next edge in E(G);
    if FIND(u) ≠ FIND(v) then
        T = T ∪ {(u, v)}; UNION(u, v);
    end if
end while
return T;
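For illustration, a Python sketch of Kruskal's algorithm with a parent-pointer UNION/FIND; the union-by-rank and path-halving refinements are standard choices of mine, not necessarily the exact variant the course presents:

```python
def kruskal(n, edges):
    # n: number of vertices (labeled 0..n-1); edges: list of (weight, u, v).
    parent = list(range(n))   # parent-pointer representation of the forest
    rank = [0] * n

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    T = []
    for w, u, v in sorted(edges):          # non-decreasing weight order
        ru, rv = find(u), find(v)
        if ru != rv:                       # (u, v) joins two different trees
            if rank[ru] < rank[rv]:
                ru, rv = rv, ru
            parent[rv] = ru                # UNION by rank
            if rank[ru] == rank[rv]:
                rank[ru] += 1
            T.append((u, v, w))
            if len(T) == n - 1:            # spanning tree is complete
                break
    return T
```

On a 4-vertex graph with edges (0,1) of weight 1, (1,2) of weight 2, (0,2) of weight 3, and (2,3) of weight 4, the sketch returns a tree of three edges with total cost 7, skipping the weight-3 edge that would create a cycle.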

Asymptotic Analysis of Kruskal’s Algorithm Sorting the m edges takes O(m log m) time and dominates the cost: creating the n single-node trees takes Θ(n) time, and the at most 2m FIND and n − 1 UNION operations run in almost linear time with the parent-pointer representation. Total: O(m log m).

Correctness of Kruskal’s Algorithm Lemma: Algorithm Kruskal correctly finds a minimum cost spanning tree in a weighted undirected graph. Proof: Theorem: Algorithm Kruskal finds a minimum cost spanning tree in a weighted undirected graph in O(m log m) time, where m is the number of edges in G.

Money Change Problem Given a currency system that has n coins with values v1, v2, ..., vn, where v1 = 1, the objective is to pay change of value y in such a way that the total number of coins is minimized. More formally, we want to minimize the quantity ∑i xi subject to the constraint ∑i xi vi = y. Here, x1, x2, ..., xn are nonnegative integers (so xi may be zero).

Money Change Problem What is a greedy algorithm to solve this problem? Repeatedly take as many coins of the largest remaining value as will fit. Is the greedy algorithm optimal? For some coin systems (such as 1, 5, 10, 25) it is; in general it is not.
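As an illustrative sketch (names are my own), the greedy rule and a coin system on which it fails to be optimal:

```python
def greedy_change(coins, y):
    # Take as many coins of each value as possible, largest value first.
    # Returns a dict mapping coin value v_i to count x_i.
    x = {}
    for v in sorted(coins, reverse=True):
        x[v], y = divmod(y, v)
    return x
```

With the coins 1, 5, 10, 25 and y = 63, greedy uses 6 coins (two 25s, one 10, three 1s), which is optimal. With the coins 1, 3, 4 and y = 6, greedy uses 3 coins (one 4 and two 1s), but 2 coins (two 3s) suffice, so greedy is not optimal for every system.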