Introduction to Algorithms


Introduction to Algorithms LECTURE 15 Greedy Algorithms II • Activity-Selection Problem • Knapsack Problem • Huffman Codes • Task Scheduling

Greedy Algorithm: Vertex Cover A vertex cover of an undirected graph G = (V, E) is a subset V' ⊆ V such that if (u, v) ∈ E, then u ∈ V' or v ∈ V' (or both); that is, the chosen vertices cover all the edges. The size of a vertex cover is the number of vertices in it. The vertex-cover problem is to find a vertex cover of minimum size in a graph. Greedy heuristic: at each stage, cover as many uncovered edges as possible by picking a vertex of maximum degree, then delete the edges it covers. The greedy heuristic cannot always find an optimal solution! The vertex-cover problem is NP-complete.
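
To make the heuristic concrete, here is a minimal Python sketch of the maximum-degree greedy; the example graph is made up for illustration:

def greedy_vertex_cover(edges):
    # Greedy heuristic: repeatedly pick a vertex covering the most
    # uncovered edges, then delete the edges it covers.
    # Not guaranteed to be optimal (vertex cover is NP-complete).
    remaining = set(edges)
    cover = set()
    while remaining:
        degree = {}
        for u, v in remaining:
            degree[u] = degree.get(u, 0) + 1
            degree[v] = degree.get(v, 0) + 1
        best = max(degree, key=degree.get)       # vertex of maximum degree
        cover.add(best)
        remaining = {(u, v) for (u, v) in remaining if best not in (u, v)}
    return cover

edges = [(1, 2), (1, 3), (2, 3), (3, 4), (4, 5)]   # hypothetical graph
print(greedy_vertex_cover(edges))                  # e.g. {1, 3, 4}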

A Greedy Algorithm A greedy algorithm always makes the choice that looks best at the moment. An Activity-Selection Problem: Given a set S = {1, 2, …, n} of n proposed activities, with a start time si and a finish time fi for each activity i, select a maximum-size set of mutually compatible activities. If selected, activity i takes place during the half-open time interval [si, fi). Activities i and j are compatible if the intervals [si, fi) and [sj, fj) do not overlap (i.e., si ≥ fj or sj ≥ fi).

Activity Selection

The Activity-Selection Algorithm Greedy-Activity-Selector(s, f) /* Assume f1 ≤ f2 ≤ … ≤ fn. */ 1. n ← length[s]; 2. A ← {1}; 3. j ← 1; 4. for i ← 2 to n 5. if si ≥ fj 6. A ← A ∪ {i}; 7. j ← i; 8. return A. Time complexity excluding sorting: O(n). Theorem: Algorithm Greedy-Activity-Selector produces solutions of maximum size for the activity-selection problem. (Greedy-choice property) Suppose A ⊆ S is an optimal solution. Show that if the first activity in A is some activity k ≠ 1, then B = A − {k} ∪ {1} is also an optimal solution. (Optimal substructure) Show that if A is an optimal solution to S, then A' = A − {1} is an optimal solution to S' = {i ∈ S: si ≥ f1}. Prove by induction on the number of choices made.
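
A minimal runnable Python version of the pseudocode above; the start/finish times at the bottom are an illustrative instance, already sorted by finish time:

def greedy_activity_selector(start, finish):
    # Assumes activities are sorted so that finish[0] <= finish[1] <= ...
    # Returns indices of a maximum-size set of mutually compatible activities.
    selected = [0]              # always take the first activity to finish
    j = 0                       # last activity added so far
    for i in range(1, len(start)):
        if start[i] >= finish[j]:       # compatible with activity j
            selected.append(i)
            j = i
    return selected

start  = [1, 3, 0, 5, 3, 5, 6, 8, 8, 2, 12]     # illustrative instance
finish = [4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14]
print(greedy_activity_selector(start, finish))  # [0, 3, 7, 10]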

Elements of the Greedy Strategy When can we apply greedy algorithms? Greedy-choice property: a globally optimal solution can be arrived at by making a locally optimal (greedy) choice, whereas dynamic programming must examine the solutions to subproblems before choosing. Optimal substructure: an optimal solution to the problem contains within it optimal solutions to subproblems. E.g., if A is an optimal solution to S, then A' = A − {1} is an optimal solution to S' = {i ∈ S: si ≥ f1}. Greedy algorithms (heuristics) do not always produce optimal solutions. Greedy algorithms vs. dynamic programming (DP): both rely on optimal substructure; the difference is the greedy-choice property. DP can be used when greedy solutions are not optimal.

Knapsack Problem Knapsack Problem: Given n items, with the ith item worth vi dollars and weighing wi pounds, a thief wants to take as valuable a load as possible but can carry at most W pounds in his knapsack. The 0-1 knapsack problem: each item is either taken or not taken (a 0-1 decision). The fractional knapsack problem: fractions of items may be taken. Example: v = (60, 100, 120), w = (10, 20, 30), W = 50. The greedy solution that takes items in order of greatest value per pound is optimal for the fractional version, but not for the 0-1 version. The 0-1 knapsack problem is NP-complete, but can be solved in O(nW) time by DP. (This is pseudo-polynomial rather than polynomial, since W can be exponential in the size of the input.)
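
A short Python sketch of the fractional-knapsack greedy (take items by value per pound), run on the instance from this slide:

def fractional_knapsack(values, weights, W):
    # Greedy for the fractional knapsack: take items in order of value per
    # pound; optimal for the fractional version, not for the 0-1 version.
    order = sorted(range(len(values)),
                   key=lambda i: values[i] / weights[i], reverse=True)
    total, remaining = 0.0, W
    for i in order:
        take = min(weights[i], remaining)        # whole item or a fraction
        total += values[i] * take / weights[i]
        remaining -= take
        if remaining == 0:
            break
    return total

# Instance from this slide: v = (60, 100, 120), w = (10, 20, 30), W = 50
print(fractional_knapsack([60, 100, 120], [10, 20, 30], 50))   # 240.0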

Coding Used for data compression, instruction-set encoding, etc. Binary character code: each character is represented by a unique binary string. Fixed-length code (block code): a: 000, b: 001, …, f: 101, so ace → 000 010 100. Variable-length code: frequent characters get short codewords; infrequent characters get long codewords.

Binary Tree vs. Prefix Code Prefix code: no codeword is a prefix of any other codeword, so decoding is unambiguous. Every prefix code corresponds to a binary tree whose leaves are the characters (left edge = 0, right edge = 1).
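
Because no codeword is a prefix of another, a greedy left-to-right scan decodes unambiguously; a small sketch with a hypothetical code table:

def decode(bits, code):
    # Prefix-free decoding: accumulate bits until they match a complete
    # codeword, emit its character, then continue from the next bit.
    inverse = {w: c for c, w in code.items()}
    out, word = [], ''
    for b in bits:
        word += b
        if word in inverse:          # a complete codeword has been read
            out.append(inverse[word])
            word = ''
    return ''.join(out)

code = {'a': '0', 'b': '101', 'c': '100'}     # hypothetical prefix code
print(decode('0100101', code))                # 'acb'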

Optimal Prefix Code Design Coding cost of a tree T: B(T) = Σc∈C f(c) dT(c), where c ranges over the characters of the alphabet C, f(c) is the frequency of c, and dT(c) is the depth of c's leaf in T (i.e., the length of c's codeword). Code design: given f(c1), f(c2), …, f(cn), construct a binary tree with n leaves such that B(T) is minimized. Idea: give more frequent characters smaller depth (shorter codewords).
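
For concreteness, a small sketch that evaluates B(T) for an illustrative six-character alphabet and a prefix code for it (frequencies and codewords invented for illustration):

# Illustrative frequencies (in thousands of occurrences) and a prefix code
freq = {'a': 45, 'b': 13, 'c': 12, 'd': 16, 'e': 9, 'f': 5}
code = {'a': '0', 'b': '101', 'c': '100', 'd': '111', 'e': '1101', 'f': '1100'}

# B(T) = sum over c in C of f(c) * d_T(c), where d_T(c) = len(code[c])
cost = sum(freq[c] * len(code[c]) for c in freq)
print(cost)   # 224, versus 100 * 3 = 300 for a 3-bit fixed-length code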

Huffman's Procedure At each step, merge the two nodes with the smallest frequencies (costs) into a new node whose frequency is their sum.

Huffman's Algorithm Huffman(C) 1. n ← |C|; 2. Q ← C; 3. for i ← 1 to n−1 4. z ← Allocate-Node(); 5. x ← left[z] ← Extract-Min(Q); 6. y ← right[z] ← Extract-Min(Q); 7. f[z] ← f[x] + f[y]; 8. Insert(Q, z); 9. return Extract-Min(Q). Time complexity: O(n lg n) when Q is implemented as a binary min-heap: building the initial heap takes O(n) time, and each of the O(n) Extract-Min and Insert operations takes O(lg n).
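
A runnable Python sketch of the same procedure, using the standard-library heapq module as the min-priority queue Q (the frequency table at the bottom is just an example):

import heapq

def huffman(freq):
    # Build a Huffman code for freq: a dict mapping character -> frequency.
    # Each heap entry is (frequency, tie_breaker, tree); a tree is either a
    # character (leaf) or a (left, right) pair (internal node).
    heap = [(f, i, c) for i, (c, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        fx, _, x = heapq.heappop(heap)           # two least-frequent subtrees
        fy, _, y = heapq.heappop(heap)
        heapq.heappush(heap, (fx + fy, counter, (x, y)))
        counter += 1
    _, _, tree = heap[0]

    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):              # internal node: recurse
            walk(node[0], prefix + '0')
            walk(node[1], prefix + '1')
        else:                                    # leaf: record the codeword
            codes[node] = prefix or '0'
    walk(tree, '')
    return codes

freq = {'a': 45, 'b': 13, 'c': 12, 'd': 16, 'e': 9, 'f': 5}   # example only
print(huffman(freq))     # e.g. {'a': '0', 'c': '100', 'b': '101', ...}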

Huffman Algorithm: Greedy Choice Greedy choice: there exists an optimal prefix code in which the two characters x and y with the lowest frequencies have codewords of the same (maximum) length that differ only in the last bit.

Huffman's Algorithm: Optimal Substructure Optimal substructure: let T be a full binary tree for an optimal prefix code over C, and let z be the parent of two leaf characters x and y. If f[z] = f[x] + f[y], then the tree T' = T − {x, y} (with z now a leaf) represents an optimal prefix code for C' = C − {x, y} ∪ {z}.

Task Scheduling The task-scheduling problem: schedule unit-time tasks with deadlines and penalties so that the total penalty for missed deadlines is minimized. S = {1, 2, …, n} is a set of n unit-time tasks. Deadlines d1, d2, …, dn for the tasks, with 1 ≤ di ≤ n. Penalties w1, w2, …, wn: wi is incurred if task i misses its deadline. A set A of tasks is independent if there exists a schedule for A with no late tasks. Let Nt(A) be the number of tasks in A with deadlines t or earlier, for t = 1, 2, …, n. Three equivalent statements for any set of tasks A: (1) A is independent; (2) Nt(A) ≤ t for t = 1, 2, …, n; (3) if the tasks in A are scheduled in order of nondecreasing deadlines, then no task is late.

Greedy Algorithm: Task Scheduling The optimal greedy scheduling algorithm: Sort the tasks by penalty in non-increasing order. Greedily build a maximum-penalty independent set: consider tasks in that order and add a task only if the set remains independent (no task in it must be late). Schedule the tasks in this independent set in order of nondecreasing deadlines. Schedule the remaining tasks (which miss their deadlines) in the leftover slots, in any order.
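
A Python sketch of this greedy schedule; it implements the independence check by placing each accepted task in the latest free slot at or before its deadline, which is one standard way to realize the algorithm (the instance at the bottom is illustrative):

def schedule_tasks(deadlines, penalties):
    # Consider tasks in order of non-increasing penalty; accept a task if it
    # can still be placed in a free unit-time slot no later than its deadline
    # (this keeps the accepted set independent).
    n = len(deadlines)
    order = sorted(range(n), key=lambda i: penalties[i], reverse=True)
    slot = [None] * (n + 1)          # slot[t] = task run during hour t
    accepted = set()
    for i in order:
        t = min(deadlines[i], n)     # latest slot allowed for task i
        while t >= 1 and slot[t] is not None:
            t -= 1                   # simple O(n) scan for a free slot
        if t >= 1:
            slot[t] = i
            accepted.add(i)
    schedule = [slot[t] for t in range(1, n + 1) if slot[t] is not None]
    late_penalty = sum(penalties[i] for i in range(n) if i not in accepted)
    return schedule, late_penalty

d = [4, 2, 4, 3, 1, 4, 6]            # illustrative deadlines d_i
w = [70, 60, 50, 40, 30, 20, 10]     # illustrative penalties w_i
print(schedule_tasks(d, w))          # ([3, 1, 2, 0, 6], 50)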