
Chapter 9 Greedy Technique

Constructs a solution to an optimization problem piece by piece through a sequence of choices that are:
- feasible: has to satisfy the problem's constraints
- locally optimal: has to be the best choice among all feasible choices available at that step
- irrevocable: once made, it cannot be changed on subsequent steps of the algorithm
For some problems, the greedy approach yields an optimal solution for every instance. For most, it does not, but it can still be useful for fast approximations.

Applications of the Greedy Strategy
Optimal solutions:
- change making for normal coin denominations
- minimum spanning tree (MST)
- single-source shortest paths
- simple scheduling problems
- Huffman codes
Approximations:
- traveling salesman problem (TSP)
- knapsack problem
- other combinatorial optimization problems

Change-Making Problem
Given unlimited amounts of coins of denominations d1 > … > dm, give change for amount n with the least number of coins.
Example: d1 = 25c, d2 = 10c, d3 = 5c, d4 = 1c and n = 48c
Greedy solution: 25c + 10c + 10c + 1c + 1c + 1c (6 coins)
The greedy solution is
- optimal for any amount and a normal set of denominations
- possibly not optimal for arbitrary coin denominations (for example, with denominations 25c, 10c, 1c and n = 30c, greedy uses 6 coins while 10c + 10c + 10c uses only 3)
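A minimal sketch of the greedy change-making rule in Python; the function name and return convention are illustrative, not from the slides:

def greedy_change(denominations, amount):
    # denominations must be sorted in decreasing order, e.g. [25, 10, 5, 1]
    counts = []
    for d in denominations:
        counts.append(amount // d)    # take as many of the largest remaining coin as fit
        amount %= d                   # continue greedily on the remainder
    if amount != 0:
        raise ValueError("amount cannot be represented with these denominations")
    return counts                     # number of coins of each denomination

# Example from the slide: 48c with denominations 25, 10, 5, 1 -> [1, 2, 0, 3], i.e. 6 coins
print(greedy_change([25, 10, 5, 1], 48))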

Minimum Spanning Tree (MST)
- Spanning tree of a connected graph G: a connected acyclic subgraph of G that includes all of G's vertices
- Minimum spanning tree of a weighted, connected graph G: a spanning tree of G of minimum total weight
Example: (weighted graph on vertices a, b, c, d; figure not reproduced in the transcript)

Prim's MST algorithm
- Start with tree T1 consisting of one (any) vertex and grow the tree one vertex at a time to produce the MST through a series of expanding subtrees T1, T2, …, Tn
- On each iteration, construct Ti+1 from Ti by adding the vertex not in Ti that is closest to those already in Ti (this is the greedy step!)
- Stop when all vertices are included

Example: (trace of Prim's algorithm on the example graph; figure not reproduced in the transcript)

Notes about Prim's algorithm
- Proof by induction that this construction actually yields an MST
- Needs a priority queue for locating the closest fringe vertex
- Efficiency:
  - O(n^2) for the weight-matrix representation of the graph and an array implementation of the priority queue
  - O(m log n) for the adjacency-list representation of a graph with n vertices and m edges and a min-heap implementation of the priority queue
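A compact sketch of Prim's algorithm using the min-heap priority queue mentioned above, assuming the graph is given as an adjacency list of (neighbor, weight) pairs; the representation and names are illustrative:

import heapq

def prim_mst(adj, start):
    # adj: dict mapping vertex -> list of (neighbor, weight); graph must be connected
    in_tree = {start}
    mst_edges = []
    # heap of candidate edges (weight, tree_vertex, fringe_vertex)
    heap = [(w, start, v) for v, w in adj[start]]
    heapq.heapify(heap)
    while heap and len(in_tree) < len(adj):
        w, u, v = heapq.heappop(heap)         # cheapest edge leaving the current tree
        if v in in_tree:
            continue                          # stale entry; v was added earlier
        in_tree.add(v)
        mst_edges.append((u, v, w))           # greedy, irrevocable choice
        for x, wx in adj[v]:
            if x not in in_tree:
                heapq.heappush(heap, (wx, v, x))
    return mst_edges

With a binary heap this runs in O(m log n), matching the bound stated above.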

Induction
Basis step: T1 consists of a single vertex and hence must be part of any minimum spanning tree.
Inductive step: Assume that Ti-1 is part of some minimum spanning tree T. We need to prove that Ti, generated from Ti-1 by Prim's algorithm, is also part of a minimum spanning tree.
Proof by contradiction: assume that no minimum spanning tree of the graph contains Ti. Let ei = (v, u) be the minimum-weight edge from a vertex in Ti-1 to a vertex not in Ti-1, used by Prim's algorithm to expand Ti-1 to Ti. By our assumption, ei cannot belong to the minimum spanning tree T. Therefore, if we add ei to T, a cycle must be formed. This cycle contains some other edge e' connecting a vertex in Ti-1 to a vertex not in Ti-1. Replacing e' with ei in T yields a spanning tree of total weight no greater than that of T (since w(ei) <= w(e')), hence another minimum spanning tree, and this one contains Ti, which contradicts the assumption.

Another greedy algorithm for MST: Kruskal's
- Sort the edges in nondecreasing order of lengths
- Grow the tree one edge at a time to produce the MST through a series of expanding forests F1, F2, …, Fn-1
- On each iteration, add the next edge on the sorted list unless this would create a cycle. (If it would, skip the edge.)

Example: (trace of Kruskal's algorithm on the example graph; figure not reproduced in the transcript)

Notes about Kruskal's algorithm
- The algorithm looks easier than Prim's but is harder to implement (checking for cycles!)
- Cycle checking: a cycle is created iff the added edge connects vertices in the same connected component
- Union-find algorithms (see Section 9.2)

Kruskal's Algorithm re-thought
- View the algorithm's operations as a progression through a series of forests
  - containing all of the vertices of the graph and
  - some of its edges
- The initial forest consists of |V| trivial trees, each comprising a single vertex of the graph
- On each iteration the algorithm
  - takes the next edge (u,v) from the sorted list of the graph's edges,
  - finds the trees containing the vertices u and v, and
  - if these trees are not the same, unites them in a larger tree by adding the edge (u,v)
- With an efficient sorting algorithm, Kruskal's algorithm is O(|E| log |E|)

Kruskal and Union-Find
- Union-find algorithms check whether two vertices belong to the same tree
- Partition the n-element set S into a collection of n one-element subsets, each containing a different element of S
- The collection is subjected to a sequence of intermixed union and find operations
- ADT (a sketch combining it with Kruskal's algorithm is given below):
  - makeset(x) creates a one-element set {x}
  - find(x) returns the subset containing x
  - union(x, y) constructs the union of the disjoint subsets Sx and Sy, containing x and y, adds it to the collection, and deletes Sx and Sy from it
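A minimal union-find (path compression plus union by size) with Kruskal's algorithm built on top of it; this is a sketch in Python, and the structure and names are illustrative rather than taken from the slides:

def kruskal_mst(n, edges):
    # n: number of vertices labeled 0..n-1; edges: list of (weight, u, v)
    parent = list(range(n))            # makeset: every vertex starts as its own root
    size = [1] * n

    def find(x):                       # find with path compression
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    def union(x, y):                   # union by size; returns False if already in the same tree
        rx, ry = find(x), find(y)
        if rx == ry:
            return False
        if size[rx] < size[ry]:
            rx, ry = ry, rx
        parent[ry] = rx
        size[rx] += size[ry]
        return True

    mst = []
    for w, u, v in sorted(edges):      # edges in nondecreasing order of weight
        if union(u, v):                # adding (u,v) does not create a cycle
            mst.append((u, v, w))
    return mst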

Minimum spanning tree vs. Steiner tree
(comparison figure not reproduced in the transcript; see Problem 11 on page 323)

Shortest paths: Dijkstra's algorithm
Single-Source Shortest Paths Problem: Given a weighted connected graph G, find the shortest paths from a source vertex s to each of the other vertices.
Dijkstra's algorithm: similar to Prim's MST algorithm, with a different way of computing the numerical labels. Among vertices not already in the tree, it finds the vertex u with the smallest sum
  dv + w(v,u)
where
- v is a vertex for which the shortest path has already been found on preceding iterations (such vertices form a tree),
- dv is the length of the shortest path from the source to v, and
- w(v,u) is the length (weight) of the edge from v to u.

Dijkstra's Algorithm (G, l)
  G is the graph, s is the start node, l is the length of an edge
  Let S be the set of explored nodes
    For each u in S, we store the distance d(u)
  Initially S = {s} and d(s) = 0
  While S is not equal to V
    Select a node v not in S with at least one edge from S for which
      d'(v) = min over edges e = (u,v) with u in S of d(u) + l_e
    is as small as possible
    Add v to S and define d(v) = d'(v)
  EndWhile

Example (graph figure not reproduced in the transcript)
Tree vertices    Remaining vertices
a(-,0)           b(a,3)   c(-,inf)   d(a,7)   e(-,inf)
b(a,3)           c(b,3+4) d(b,3+2)   e(-,inf)
d(b,5)           c(b,7)   e(d,5+4)
c(b,7)           e(d,9)
e(d,9)

Notes on Dijkstra's algorithm
- Doesn't work for graphs with negative weights
- Applicable to both undirected and directed graphs
- Efficiency:
  - O(|V|^2) for graphs represented by a weight matrix and an array implementation of the priority queue
  - O(|E| log |V|) for graphs represented by adjacency lists and a min-heap implementation of the priority queue
- Don't mix up Dijkstra's algorithm with Prim's algorithm!
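A sketch of the min-heap version of Dijkstra's algorithm in Python, assuming non-negative edge weights and the same adjacency-list representation as the Prim sketch above; the function name is illustrative:

import heapq

def dijkstra(adj, s):
    # adj: dict mapping vertex -> list of (neighbor, weight); all weights non-negative
    dist = {s: 0}
    heap = [(0, s)]                       # (tentative distance, vertex)
    explored = set()
    while heap:
        d, v = heapq.heappop(heap)
        if v in explored:
            continue                      # stale entry for an already-finalized vertex
        explored.add(v)                   # d is now the exact shortest distance to v
        for u, w in adj[v]:
            if u not in explored and d + w < dist.get(u, float("inf")):
                dist[u] = d + w
                heapq.heappush(heap, (d + w, u))
    return dist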

Coding Problem
Coding: assignment of bit strings to alphabet characters
Codewords: bit strings assigned to the characters of the alphabet
Two types of codes:
- fixed-length encoding (e.g., ASCII)
- variable-length encoding (e.g., Morse code)
Prefix-free codes: no codeword is a prefix of another codeword
Problem: If the frequencies of the character occurrences are known, what is the best binary prefix-free code?
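A tiny illustration of why the prefix-free property matters: decoding can proceed greedily left to right without ambiguity. The code table here is a made-up example, not from the slides:

# Hypothetical prefix-free code for a three-letter alphabet
code = {"a": "0", "b": "10", "c": "11"}

def decode(bits, code):
    # Invert the code table and scan the bit string left to right;
    # because no codeword is a prefix of another, the first match is always correct.
    inverse = {cw: ch for ch, cw in code.items()}
    result, current = [], ""
    for bit in bits:
        current += bit
        if current in inverse:
            result.append(inverse[current])
            current = ""
    return "".join(result)

print(decode("010110", code))   # -> "abca"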

Huffman Codes
- Labeling the tree T*:
  - Take all leaves of depth 1 (if there are any) and label them with the highest-frequency letters in any order.
  - Then take all leaves of depth 2 (if there are any) and label them with the next-highest-frequency letters in any order.
  - Continue through the leaves in order of increasing depth.
- There is an optimal prefix code, with corresponding tree T*, in which the two lowest-frequency letters are assigned to leaves that are siblings in T*.
  - Delete them and label their parent with a new letter having the combined frequency.
  - This yields an instance with a smaller alphabet.

Huffman's Algorithm
To construct a prefix code for an alphabet S with given frequencies:
  If S has two letters then
    Encode one letter using 0 and the other letter using 1
  Else
    Let y* and z* be the two lowest-frequency letters
    Form a new alphabet S' by deleting y* and z* and replacing them with a new letter w of frequency f_y* + f_z*
    Recursively construct a prefix code for S', with tree T'
    Define a prefix code for S as follows:
      Start with T'
      Take the leaf labeled w and add two children below it labeled y* and z*
  Endif

Huffman codes
- Any binary tree with edges labeled with 0s and 1s yields a prefix-free code for the characters assigned to its leaves
- An optimal binary tree minimizing the expected (weighted average) length of a codeword can be constructed as follows
Huffman's algorithm
- Initialize n one-node trees with the alphabet characters and set the tree weights to their frequencies.
- Repeat the following step n-1 times: join the two binary trees with the smallest weights into one (as left and right subtrees) and make its weight equal to the sum of the weights of the two trees.
- Mark edges leading to the left and right subtrees with 0s and 1s, respectively.
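A short bottom-up Huffman construction in Python using a min-heap of trees weighted by frequency, as described above; the tie-breaking counter and names are implementation choices, not from the slides:

import heapq

def huffman_code(freq):
    # freq: dict mapping character -> frequency (or probability)
    # A heap entry is (weight, tie_breaker, subtree); a subtree is a leaf character or a (left, right) pair.
    heap = [(w, i, ch) for i, (ch, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        w1, _, t1 = heapq.heappop(heap)          # the two lowest-weight trees
        w2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (w1 + w2, counter, (t1, t2)))
        counter += 1
    _, _, root = heap[0]

    code = {}
    def assign(tree, prefix):                    # label left edges 0 and right edges 1
        if isinstance(tree, tuple):
            assign(tree[0], prefix + "0")
            assign(tree[1], prefix + "1")
        else:
            code[tree] = prefix or "0"           # single-character alphabet edge case
    assign(root, "")
    return code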

Example
character:  A   B   C   D   _
frequency:  (values not preserved in the transcript)
codeword:   (values not preserved in the transcript)
average bits per character: 2.25
for fixed-length encoding: 3
compression ratio: (3-2.25)/3*100% = 25%
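A quick check of the 2.25 figure, assuming the frequencies and code from the textbook version of this example (0.35, 0.1, 0.2, 0.2, 0.15 for A, B, C, D, _); these values are an assumption here, since they were not preserved in the transcript:

# Assumed frequencies and Huffman codewords (textbook example values, not from the transcript)
freq = {"A": 0.35, "B": 0.1, "C": 0.2, "D": 0.2, "_": 0.15}
code = {"A": "11", "B": "100", "C": "00", "D": "01", "_": "101"}

avg_bits = sum(freq[ch] * len(code[ch]) for ch in freq)
print(round(avg_bits, 2))                # 2.25 bits per character on average
print(round((3 - avg_bits) / 3 * 100))   # 25 (% saved vs. a 3-bit fixed-length encoding)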