Greedy Algorithms

TOPICS
- Greedy Strategy
- Activity Selection
- Minimum Spanning Tree
- Shortest Paths
- Huffman Codes
- Fractional Knapsack

CSE 5311, Fall 2007, M Kumar

The Greedy Principle

The problem: we are required to find a feasible solution that either maximizes or minimizes a given objective function. It is easy to find a feasible solution, but not necessarily an optimal one. The greedy method solves such a problem in stages: at each stage, a decision is made considering inputs in an order determined by a selection procedure, which may be based on an optimization measure. The greedy algorithm always makes the choice that looks best at the moment. At each decision point it makes a locally optimal choice, in the hope that this leads to a globally optimal solution.

Activity Selection Problem

Scheduling a resource among several competing activities. S = {1, 2, 3, …, n} is the set of n proposed activities. The activities share a resource that can be used by only one activity at a time (a tennis court, a lecture hall, etc.). Each activity i has a start time si and a finish time fi, where si ≤ fi. When selected, activity i takes place during the half-open interval [si, fi). Activities i and j are compatible if si ≥ fj or sj ≥ fi, i.e., their intervals do not overlap. The activity-selection problem asks for a maximum-size set of mutually compatible activities. The input activities are ordered by increasing finish time, f1 ≤ f2 ≤ … ≤ fn; if they are not, they can be sorted in O(n log n) time.

Procedure for activity selection (from CLRS)

Procedure GREEDY_ACTIVITY_SELECTOR(s, f)
    n ← length[s]        ▹ activities are in order of increasing finish time
    A ← {1}              ▹ activity 1 is the first to finish
    j ← 1
    for i ← 2 to n do
        if si ≥ fj then
            A ← A ∪ {i}
            j ← i
    return A

Initially we choose activity 1, as it has the earliest finish time. Activities 2 and 3 are not compatible, since s2 < f1 and s3 < f1. Activity 4 is compatible (s4 ≥ f1), so we add it: A = {1, 4}. Activities 5, 6, and 7 are incompatible, and activity 8 is chosen: A = {1, 4, 8}. Finally, activities 9 and 10 are incompatible, and activity 11 is chosen: A = {1, 4, 8, 11}. Given the sorted input, the algorithm schedules a set of n activities in Θ(n) time.

    i:    1   2   3   4   5   6   7   8   9  10  11
    si:   1   3   0   5   3   5   6   8   8   2  12
    fi:   4   5   6   7   8   9  10  11  12  13  14
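The walkthrough above can be reproduced with a small Python sketch. The function name and the 0-based list layout are mine, not the slide's; the logic is the greedy scan from the pseudocode.

```python
def select_activities(start, finish):
    """Greedy activity selection. Inputs are parallel lists sorted by
    finish time; repeatedly pick the next activity whose start time is
    at least the finish time of the last activity chosen."""
    chosen = [1]                 # 1-based index of the first activity to finish
    last_finish = finish[0]
    for i in range(1, len(start)):
        if start[i] >= last_finish:
            chosen.append(i + 1)
            last_finish = finish[i]
    return chosen

# the 11 activities from the example above (already sorted by finish time)
s = [1, 3, 0, 5, 3, 5, 6, 8, 8, 2, 12]
f = [4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14]
```

Running `select_activities(s, f)` on this data yields the same set {1, 4, 8, 11} as the hand trace.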

Greedy Algorithms
- Minimum Cost Spanning Tree: Kruskal's algorithm, Prim's algorithm
- Single Source Shortest Path
- Huffman Codes

Prim's Algorithm

[Figure: an example weighted graph and the equivalent minimum-cost spanning tree, grown one edge at a time by Prim's algorithm.]
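Since the slide's figure did not survive extraction, here is a sketch of Prim's greedy rule on a small hypothetical graph (the graph, its weights, and the function name are my own, not the slide's): always add the cheapest edge that connects a vertex outside the tree to the tree.

```python
import heapq

def prim_mst(graph, start):
    """Grow a minimum-cost spanning tree from `start`, always taking the
    cheapest edge that reaches a vertex not yet in the tree."""
    visited = {start}
    edges = [(w, start, v) for v, w in graph[start]]
    heapq.heapify(edges)
    mst = []
    while edges and len(visited) < len(graph):
        w, u, v = heapq.heappop(edges)
        if v in visited:
            continue                     # edge would form a cycle; skip it
        visited.add(v)
        mst.append((u, v, w))
        for x, wx in graph[v]:
            if x not in visited:
                heapq.heappush(edges, (wx, v, x))
    return mst

# hypothetical weighted graph as adjacency lists (not the slide's figure)
graph = {
    'A': [('B', 3), ('C', 1)],
    'B': [('A', 3), ('C', 4), ('D', 2)],
    'C': [('A', 1), ('B', 4), ('D', 5)],
    'D': [('B', 2), ('C', 5)],
}
tree = prim_mst(graph, 'A')
```

On this graph the tree is {A-C, A-B, B-D}, with total weight 6.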

Huffman Codes

Data compression: Huffman codes are used to compress data, and we will study Huffman's greedy algorithm for constructing such codes. A given file can be considered a string of characters. The work involved in compressing and uncompressing should be justified by the savings in storage and/or communication costs. In ASCII, every character is represented by a 7-bit string, so a file of 100,000 characters needs 700,000 bits of storage.

Example: the file consists of only 6 characters, with the frequencies shown in the table below. Using the fixed-length binary code, the whole file can be encoded in 300,000 bits. However, using the variable-length code, the file can be encoded in 224,000 bits.

                              a     b     c     d     e     f
    Frequency (in thousands)  45    13    12    16    9     5
    Fixed-length codeword     000   001   010   011   100   101
    Variable-length codeword  0     101   100   111   1101  1100

A variable-length coding scheme assigns frequent characters short codewords and infrequent characters long codewords. In the variable-length code above, a 1-bit string represents the most frequent character, a, and a 4-bit string represents the least frequent character, f.
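Both totals in the table can be verified directly from the frequencies and codeword lengths:

```python
# frequencies (number of characters) and codes from the table above
freq = {'a': 45000, 'b': 13000, 'c': 12000, 'd': 16000, 'e': 9000, 'f': 5000}
variable_code = {'a': '0', 'b': '101', 'c': '100',
                 'd': '111', 'e': '1101', 'f': '1100'}

fixed_bits = sum(freq.values()) * 3                        # 3 bits per character
variable_bits = sum(freq[c] * len(variable_code[c]) for c in freq)
```

This gives fixed_bits = 300,000 and variable_bits = 224,000, matching the slide.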

Let us denote the characters by C1, C2, …, Cn and their frequencies by f1, f2, …, fn. Suppose there is an encoding E in which a bit string Si of length si represents Ci. Then the length of the file compressed using encoding E is

    L(E, F) = Σi si · fi

Prefix Codes

No prefix of the encoding of one character may equal the complete encoding of another character. For example, 1100 and 11001 cannot both be codewords, because 1100 is a prefix of 11001. This requirement is called the prefix constraint, and codes in which no codeword is a prefix of another codeword are called prefix codes. Shortening the encoding of one character may lengthen the encodings of others. The goal is to find an encoding E that satisfies the prefix constraint and minimizes L(E, F).
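The prefix constraint can be checked mechanically. A small sketch (the helper name is my own): after sorting the codewords lexicographically, any codeword that is a prefix of some other codeword is necessarily a prefix of its immediate successor, so checking adjacent pairs suffices.

```python
def is_prefix_free(codes):
    """Check the prefix constraint: no codeword is a prefix of another.
    In sorted order, a codeword that prefixes any other codeword also
    prefixes its immediate successor, so adjacent pairs suffice."""
    words = sorted(codes.values())
    return all(not b.startswith(a) for a, b in zip(words, words[1:]))

# the variable-length code from the table above
variable_code = {'a': '0', 'b': '101', 'c': '100',
                 'd': '111', 'e': '1101', 'f': '1100'}
```

The table's variable-length code passes this check, while a code containing both 1100 and 11001 fails it.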

The prefix code for a file can be represented by a binary tree in which every non-leaf node has two children. The length of the code for a character equals the depth of that character's leaf in the tree.

[Figure: the tree for the variable-length code of the table above. The root (weight 100) has children a:45 and an internal node of weight 55; the 55-node splits into a 25-node (children c:12 and b:13) and a 30-node (children a 14-node with leaves f:5 and e:9, and d:16). Left edges are labeled 0 and right edges 1.]

Greedy Algorithm for Constructing a Huffman Code

The algorithm builds the tree corresponding to the optimal code in a bottom-up manner. It begins with a set of |C| leaves, where C is the set of characters in the alphabet, and performs a sequence of 'merging' operations to create the tree.

Procedure Huffman_Encoding(S, f)
Input:  S (a string of characters) and f (an array of frequencies)
Output: T (the Huffman tree for S)
 1. insert all characters into a min-heap H according to their frequencies;
 2. while H is not empty do
 3.     if H contains only one node x then
 4.         root(T) ← x;
 5.     else
 6.         z ← ALLOCATE_NODE();
 7.         x ← left[T, z] ← EXTRACT_MIN(H);
 8.         y ← right[T, z] ← EXTRACT_MIN(H);
 9.         fz ← fx + fy;
10.         INSERT(H, z);
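The merge loop can be sketched in Python. The names and the length-tracking representation are my own, not the slide's: instead of allocating tree nodes, each heap entry carries the leaves beneath it, and every merge adds one bit to those leaves' code lengths.

```python
import heapq

def huffman_code_lengths(freq):
    """Greedy Huffman construction that returns each symbol's code length.
    Each heap entry is (frequency, tie-breaker, symbols in that subtree)."""
    heap = [(f, i, [s]) for i, (s, f) in enumerate(sorted(freq.items()))]
    heapq.heapify(heap)
    depth = {s: 0 for s in freq}
    while len(heap) > 1:
        f1, i1, syms1 = heapq.heappop(heap)        # two least frequent subtrees
        f2, i2, syms2 = heapq.heappop(heap)
        for s in syms1 + syms2:
            depth[s] += 1                          # every leaf below gains one bit
        heapq.heappush(heap, (f1 + f2, min(i1, i2), syms1 + syms2))
    return depth

# frequencies (in thousands) from the table above
freq = {'a': 45, 'b': 13, 'c': 12, 'd': 16, 'e': 9, 'f': 5}
lengths = huffman_code_lengths(freq)
total_bits = sum(freq[s] * lengths[s] for s in freq) * 1000
```

For the example frequencies this reproduces the code lengths of the variable-length code (1 bit for a; 3 bits for b, c, d; 4 bits for e and f) and a compressed size of 224,000 bits.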

The algorithm is based on reducing a problem with n characters to a problem with n-1 characters: a new character replaces two existing ones.

[Figure: the first merge step. The two least frequent characters, f:5 and e:9, are removed and replaced by a new node of frequency 14, leaving c:12, b:13, d:16, a:45 and the 14-node.]

Suppose Ci and Cj are two characters with minimal frequency. Then there exists a tree that minimizes L(E, F) in which these characters correspond to leaves with the maximal distance from the root.

[Figure: successive merge steps: c:12 and b:13 combine into a node of weight 25; the 14-node and d:16 combine into a node of weight 30; the 25- and 30-nodes combine into a node of weight 55.]

[Figure: the completed Huffman tree, with root of weight 100 and children a:45 and the 55-node, giving the codewords a = 0, b = 101, c = 100, d = 111, e = 1101, f = 1100.]

Complexity of the algorithm
- Building the heap in step 1 takes O(n) time.
- Each extraction (steps 7 and 8) and insertion (step 10) on H takes O(log n) time.
- The loop runs O(n) times, so steps 2 through 10 take O(n log n) time.
- Thus the overall complexity of the algorithm is O(n log n).

The Fractional Knapsack Problem

There is a limited supply of each item, and each item has a size and a value per unit (e.g., per pound).

Greedy strategy:
- Compute the value per pound for each item.
- Arrange the items in non-increasing order of value per pound.
- Fill the sack with the item of greatest value per pound until either that item is exhausted or the sack is full.
- If the sack is not full, continue with the next item in the list; repeat until the sack is full.

What about the 0-1 knapsack? Can we use the greedy strategy?
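The strategy above can be sketched as follows, using hypothetical (value, weight) items; the function name and data are my own.

```python
def fractional_knapsack(items, capacity):
    """Greedy fractional knapsack: fill with the highest value-per-pound
    item first, taking a fraction of the last item if it does not fit."""
    total = 0.0
    # consider items in non-increasing order of value per unit weight
    for value, weight in sorted(items, key=lambda it: it[0] / it[1], reverse=True):
        if capacity <= 0:
            break
        take = min(weight, capacity)       # whole item, or whatever still fits
        total += value * take / weight
        capacity -= take
    return total

# hypothetical items as (value, weight) pairs, with a sack of capacity 50
items = [(60, 10), (100, 20), (120, 30)]
best = fractional_knapsack(items, 50)
```

Here the ratios are 6, 5, and 4, so the sack takes all of the first two items and 20 of the 30 pounds of the third, for a total value of 240. Note that this greedy strategy is optimal only because fractions are allowed; with 0-1 items the argument breaks down.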

Problems

1. Suppose that we have a set of k activities to schedule among n lecture halls; activity i starts at time si and terminates at time fi, 1 ≤ i ≤ k. We wish to schedule all activities using as few lecture halls as possible. Give an efficient greedy algorithm to determine which activity should use which lecture hall.

2. You are required to purchase n different types of items. Currently each item costs $D. However, the items will become more expensive according to exponential growth curves: in particular, the cost of item j increases by a factor rj > 1 each month, where rj is a given parameter. This means that if item j is purchased t months from now, it will cost D·rj^t. Assume that the growth rates are distinct, that is, ri ≠ rj for items i ≠ j. Given that you can buy only one item each month, design an algorithm that takes the n rates of growth r1, r2, …, rn and computes an order in which to buy the items so that the total amount spent is minimized.