
Greedy Algorithms Many optimization problems can be solved more quickly using a greedy approach. The basic principle is that locally optimal decisions may be used to build a globally optimal solution. But the greedy approach does not lead to an optimal solution for all problems; the key is knowing which problems work with this approach and which do not. We will study: the activity-selection problem, the elements of a greedy strategy, and the problem of generating Huffman codes.

The Activity Selection Problem Here is a set of start and finish times. What is the maximum number of activities that can be completed? {a3, a9, a11} can be completed, but so can {a1, a4, a8, a11}, which is a larger set. The optimal solution is not unique: consider {a2, a4, a9, a11}. We will solve this problem in the following manner: show that the optimal substructure property holds; solve the problem using dynamic programming; show that the greedy-choice property holds and give a recursive greedy solution; provide an iterative greedy solution.

Developing a Dynamic Solution Define the subset S_ij of activities that can start after a_i finishes and finish before a_j starts. Sort the activities according to finish time. Let c[i, j] be the maximal number of activities in S_ij. The recurrence relation for finding c[i, j] becomes: c[i, j] = 0 if S_ij is empty; otherwise c[i, j] = max over a_k in S_ij of (c[i, k] + c[k, j] + 1). We can solve this using dynamic programming, but a simpler approach exists.
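The c[i, j] recurrence can be sketched with memoized recursion. This is a sketch, not the slide's own code: the function and variable names are mine, and the sample data is the classic 11-activity table commonly used for this problem (it is consistent with the sets {a1, a4, a8, a11} and {a2, a4, a9, a11} quoted above).

```python
from functools import lru_cache

# Sample data: (start, finish) pairs, already sorted by finish time.
acts = [(1, 4), (3, 5), (0, 6), (5, 7), (3, 9), (5, 9),
        (6, 10), (8, 11), (8, 12), (2, 14), (12, 16)]

# Sentinels a_0 (finishes at time 0) and a_{n+1} (starts at infinity)
# turn the whole problem into the single subproblem S_{0, n+1}.
A = [(0, 0)] + acts + [(float("inf"), float("inf"))]

@lru_cache(maxsize=None)
def c(i, j):
    """Maximum number of activities that fit between a_i and a_j."""
    best = 0
    for k in range(i + 1, j):
        s_k, f_k = A[k]
        if s_k >= A[i][1] and f_k <= A[j][0]:   # a_k lies in S_ij
            best = max(best, c(i, k) + c(k, j) + 1)
    return best

print(c(0, len(A) - 1))  # 4
```

This inspects every subproblem, which is exactly the work the greedy choice will let us avoid.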

We Show this is a Greedy Problem What are the consequences? Normally we have to inspect all subproblems; here we only have to choose one subproblem. What this theorem says is that we only have to find the first activity with the smallest finishing time. This means we can solve the problem top down by selecting the optimal solution to the local subproblem.

A Top Down Recursive Solution The step-by-step solution is on the next slide. Assuming the activities have been sorted by finish times, the complexity of this algorithm is Θ(n). Developing an iterative algorithm would be even faster.
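The recursive greedy selector can be sketched as follows. The 1-indexed arrays with a sentinel activity a_0 (with finish time 0) follow the usual textbook convention; the sample data is my assumption, matching the activity table used above.

```python
def recursive_activity_selector(s, f, k, n):
    """Return indices of a maximum-size compatible subset of {a_{k+1}, ..., a_n}.
    s, f are 1-indexed start/finish arrays, sorted by finish time,
    with sentinel f[0] = 0."""
    m = k + 1
    while m <= n and s[m] < f[k]:   # skip activities that start before a_k finishes
        m += 1
    if m <= n:                      # greedy choice: earliest-finishing compatible activity
        return [m] + recursive_activity_selector(s, f, m, n)
    return []

s = [0, 1, 3, 0, 5, 3, 5, 6, 8, 8, 2, 12]    # index 0 is the sentinel a_0
f = [0, 4, 5, 6, 7, 9, 9, 10, 11, 12, 14, 16]
print(recursive_activity_selector(s, f, 0, 11))  # [1, 4, 8, 11]
```

Each activity is examined exactly once across all recursive calls, which is where the Θ(n) bound comes from.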

Here is a step-by-step solution. Notice that the solution is not unique, but it is still optimal: no larger set of activities can be found.

An Iterative Approach The recursive algorithm is almost tail recursive (what is that?), but there is a final union operation. We let f_k be the maximum finishing time of any activity in A, i.e., the finish time of the most recently selected activity. The loop in lines 4-7 advances until the first compatible activity (the one with the earliest finishing time) is found. The overall complexity of this algorithm is Θ(n).
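The iterative version described above can be sketched like this; again the 1-indexed arrays, sentinel, and sample data are conventions and assumptions of mine, not the slide's own listing.

```python
def greedy_activity_selector(s, f):
    """Iterative greedy selection. s, f are 1-indexed start/finish arrays
    (index 0 is a sentinel), sorted by finish time."""
    n = len(s) - 1
    A = [1]          # the earliest-finishing activity is always a safe first choice
    k = 1            # index of the most recently selected activity
    for m in range(2, n + 1):
        if s[m] >= f[k]:    # a_m starts after a_k finishes: select it
            A.append(m)
            k = m
    return A

s = [0, 1, 3, 0, 5, 3, 5, 6, 8, 8, 2, 12]
f = [0, 4, 5, 6, 7, 9, 9, 10, 11, 12, 14, 16]
print(greedy_activity_selector(s, f))  # [1, 4, 8, 11]
```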

Elements of a Greedy Strategy We went through the following steps for the activity-selection problem. This was designed to show the similarities and differences between dynamic programming and the greedy approach; these steps can be simplified if we apply the greedy approach directly.

Applying Greedy Directly Steps in designing a greedy algorithm: show that the greedy-choice property holds, i.e., a globally optimal solution can be reached by making locally optimal choices; and show that the optimal substructure property holds (the same as in dynamic programming).

Greedy vs. Dynamic Programming Any problem solvable by a greedy algorithm can be solved by dynamic programming, but not vice versa. Two related example problems follow.

Knapsack Problems - 1 Both problems exhibit the optimal substructure property, so both can be solved by dynamic programming. Only the fractional knapsack problem can be solved by a greedy approach. Some key ideas in the greedy solution: calculate the value per pound (v_i/w_i) for each item; store the items in a priority queue so that the one with the maximum value per pound can always be selected next; remove items and add them to the knapsack until the capacity is exhausted. The complexity is O(n log n) (why?).
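The priority-queue idea can be sketched as follows; the item values and weights are a common textbook example, not data from the slides.

```python
import heapq

def fractional_knapsack(items, capacity):
    """items: list of (value, weight) pairs. Greedy by value per unit weight."""
    # Python's heapq is a min-heap, so negate the density to pop the best item first.
    heap = [(-v / w, v, w) for v, w in items]
    heapq.heapify(heap)
    total = 0.0
    while heap and capacity > 0:
        neg_density, v, w = heapq.heappop(heap)
        take = min(w, capacity)       # whole item, or the fraction that still fits
        total += take * (v / w)
        capacity -= take
    return total

# Items worth 60, 100, 120 weighing 10, 20, 30 pounds; capacity 50 pounds.
print(fractional_knapsack([(60, 10), (100, 20), (120, 30)], 50))  # 240.0
```

Building the heap is O(n) and each of the at most n pops costs O(log n), giving the O(n log n) bound (sorting by density up front would work equally well).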

Knapsack Problems - 2 The same approach will not work for the 0/1 knapsack problem, as seen in the diagram below. The 0/1 knapsack problem can be solved using dynamic programming, where all subcases are considered.
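A minimal sketch of the dynamic-programming solution, using the same hypothetical items as above (note the greedy answer of 160 for taking the densest item first would be suboptimal here):

```python
def knapsack_01(items, capacity):
    """items: list of (value, weight) pairs; classic O(n * W) table over all subcases."""
    dp = [0] * (capacity + 1)          # dp[c] = best value achievable with capacity c
    for v, w in items:
        for c in range(capacity, w - 1, -1):   # iterate downward so each item is used at most once
            dp[c] = max(dp[c], dp[c - w] + v)
    return dp[capacity]

print(knapsack_01([(60, 10), (100, 20), (120, 30)], 50))  # 220
```

The optimum takes the 100- and 120-value items and leaves the densest (60-value) item behind, which is exactly why the greedy choice fails for the 0/1 variant.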

Huffman Codes Most character code systems (ASCII, Unicode) use fixed-length encodings. If frequency data is available and the frequencies vary widely, a variable-length encoding can save 20% to 90% of the space. Which characters should we assign shorter codes, and which characters should have longer codes? At first it is not obvious how decoding will happen, but it is possible if we use prefix codes.

Prefix Codes No encoding of a character can be a prefix of the longer encoding of another character; for example, we could not encode t as 01 and x as 01101, since 01 is a prefix of 01101. By using a binary-tree representation we will generate prefix codes, provided all letters are leaves.

Some Properties Prefix codes allow easy decoding. Given a: 0, b: 101, c: 100, d: 111, e: 1101, f: 1100, decode 001011101 going left to right: 0|01011101, a|0|1011101, a|a|101|1101, a|a|b|1101, a|a|b|e. An optimal code is always represented by a full binary tree (a tree where every internal node has two children). For |C| leaves there are |C| - 1 internal nodes. The number of bits to encode a file is B(T) = sum over c in C of f(c) * d_T(c), where f(c) is the frequency of c and d_T(c) is the depth of c in the tree, which corresponds to the code length of c.
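The left-to-right decoding walk above can be sketched directly. The code table is the one given on the slide; the function name is mine.

```python
# Prefix code from the slide: no codeword is a prefix of another.
code = {'a': '0', 'b': '101', 'c': '100', 'd': '111', 'e': '1101', 'f': '1100'}
decode_map = {bits: ch for ch, bits in code.items()}

def decode(bits):
    out, buf = [], ''
    for b in bits:
        buf += b
        if buf in decode_map:   # prefix property: the first match is unambiguous
            out.append(decode_map[buf])
            buf = ''
    return ''.join(out)

print(decode('001011101'))  # aabe
```

Because no codeword is a prefix of another, the decoder never has to backtrack: as soon as the buffer matches a codeword, that character is the only possible reading.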

Building the Encoding Tree

The Algorithm An appropriate data structure is a binary min-heap. Rebuilding the heap costs O(lg n) per operation and n - 1 extractions are made, so the complexity is O(n lg n). The encoding is NOT unique; other encodings may work just as well, but none will work better.
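A minimal sketch of the min-heap construction, using Python's heapq; the sample frequencies are the standard six-character example (a:45, b:13, c:12, d:16, e:9, f:5, in thousands), which is an assumption on my part. Since the encoding is not unique, the exact bit strings depend on tie-breaking, but the code lengths and total cost are optimal either way.

```python
import heapq
from itertools import count

def huffman(freqs):
    """freqs: dict mapping character -> frequency. Returns a dict of prefix codes."""
    tiebreak = count()   # keeps heap comparisons away from the tree tuples
    heap = [(f, next(tiebreak), ch) for ch, f in freqs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)    # the two least-frequent subtrees...
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, next(tiebreak), (left, right)))  # ...are merged
    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):          # internal node: recurse into both children
            walk(node[0], prefix + '0')
            walk(node[1], prefix + '1')
        else:                                # leaf: record the codeword
            codes[node] = prefix or '0'
    walk(heap[0][2], '')
    return codes

freqs = {'a': 45, 'b': 13, 'c': 12, 'd': 16, 'e': 9, 'f': 5}
codes = huffman(freqs)
print(sum(freqs[ch] * len(codes[ch]) for ch in freqs))  # total cost B(T) = 224
```

The n - 1 merges each perform a constant number of O(lg n) heap operations, matching the O(n lg n) bound stated above.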

Correctness of Huffman’s Algorithm The following results are presented without proof