
Greedy algorithms David Kauchak cs302 Spring 2012

Administrative Should be all caught up on grading. Assignment out today (back to the normal routine).

Interval scheduling Given n activities A = [a_1, a_2, …, a_n], where each activity a_i has a start time s_i and a finish time f_i. Schedule as many of these activities as possible such that they don’t conflict.

Interval scheduling Given n activities A = [a_1, a_2, …, a_n], where each activity a_i has a start time s_i and a finish time f_i. Schedule as many of these activities as possible such that they don’t conflict. Which activities conflict?

Simple recursive solution Enumerate all possible solutions and find which schedules the most activities

Simple recursive solution Is it correct? max{all possible solutions} Running time? O(n!)

Can we do better? Dynamic programming (next class): O(n^2). Greedy solution: Is there a way to repeatedly make local decisions? Key: we’d still like to end up with the optimal solution.

Overview of a greedy approach Greedily pick an activity to schedule. Add that activity to the answer. Remove that activity and all conflicting activities; call what remains A’. Repeat on A’ until A’ is empty.
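A minimal Python sketch of this loop, assuming activities are (start, finish) pairs and a pluggable choose function supplies the greedy rule; the representation and names here are illustrative, not from the slides:
    def conflicts(a, b):
        # two activities conflict if their intervals [start, finish) overlap
        return a[0] < b[1] and b[0] < a[1]
    def greedy_schedule(activities, choose):
        remaining = list(activities)
        answer = []
        while remaining:
            pick = choose(remaining)   # the greedy choice rule
            answer.append(pick)
            # drop the chosen activity and everything that conflicts with it
            remaining = [a for a in remaining if not conflicts(a, pick)]
        return answer
Each of the candidate rules on the next slides (earliest start, shortest, fewest conflicts, earliest finish) is just a different choose function.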

Greedy options Select the activity that starts the earliest, i.e. argmin{s_1, s_2, s_3, …, s_n}?

Greedy options Select the activity that starts the earliest? non-optimal

Greedy options Select the shortest activity, i.e. argmin{f_1 - s_1, f_2 - s_2, f_3 - s_3, …, f_n - s_n}

Greedy options Select the shortest activity, i.e. argmin{f_1 - s_1, f_2 - s_2, f_3 - s_3, …, f_n - s_n} non-optimal

Greedy options Select the activity with the smallest number of conflicts (also non-optimal)

Greedy options Select the activity that ends the earliest, i.e. argmin{f_1, f_2, f_3, …, f_n}?

Greedy options Select the activity that ends the earliest, i.e. argmin{f_1, f_2, f_3, …, f_n}? Multiple optimal solutions

Efficient greedy algorithm Once you’ve identified a reasonable greedy heuristic: prove that it always gives the correct answer, then develop an efficient solution.

Is our greedy approach correct? “Stays ahead” argument: show that no matter what other solution someone provides you, the solution provided by your algorithm always “stays ahead”, in that no other choice could do better

Is our greedy approach correct? “Stays ahead” argument: let r_1, r_2, r_3, …, r_k be the solution found by our approach and let o_1, o_2, o_3, …, o_k be another optimal solution. Show our approach “stays ahead” of any other solution.

Stays ahead Compare the first activities of each solution, r_1 and o_1.

Stays ahead finish(r_1) ≤ finish(o_1), since our approach picks the activity with the earliest finish time.

Stays ahead We have at least as much time as any other solution to schedule the remaining tasks 2…k.
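The induction these slides are building up can be stated compactly; this is a sketch of the standard argument, not transcribed verbatim from the slides:
    Claim: for every $i \le k$, $\mathrm{finish}(r_i) \le \mathrm{finish}(o_i)$.
    Proof sketch (induction on $i$): $r_1$ has the earliest finish time of all activities, so
    $\mathrm{finish}(r_1) \le \mathrm{finish}(o_1)$. If $\mathrm{finish}(r_{i-1}) \le \mathrm{finish}(o_{i-1})$,
    then $o_i$, which starts after $o_{i-1}$ finishes, is still available when the greedy algorithm
    makes its $i$-th choice, so the earliest-finish choice $r_i$ has $\mathrm{finish}(r_i) \le \mathrm{finish}(o_i)$.
    Consequence: if some solution scheduled a $(k+1)$-st activity, it would start after
    $\mathrm{finish}(o_k) \ge \mathrm{finish}(r_k)$ and would still be available to the greedy algorithm,
    contradicting the fact that the greedy algorithm stopped after $k$ activities.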

An efficient solution: sort the activities by finish time, then make a single greedy pass (sketch below).
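A minimal Python sketch of that earliest-finish-time algorithm, assuming activities are (start, finish) pairs; the function name is illustrative:
    def interval_schedule(activities):
        # sort by finish time: Theta(n log n)
        by_finish = sorted(activities, key=lambda a: a[1])
        answer = []
        last_finish = float('-inf')
        for start, finish in by_finish:    # single scan: Theta(n)
            if start >= last_finish:       # compatible with everything chosen so far
                answer.append((start, finish))
                last_finish = finish
        return answer
For example, interval_schedule([(1, 4), (3, 5), (0, 6), (5, 7), (3, 9), (6, 10), (8, 11)]) returns [(1, 4), (5, 7), (8, 11)].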

Running time? Sorting: Θ(n log n). Scan: Θ(n). Overall: Θ(n log n). Better than O(n!) (brute force) and O(n^2) (dynamic programming).

Scheduling all intervals Given n activities, we need to schedule all activities. Goal: minimize the number of resources required.

Greedy approach? The best we could ever do is the maximum number of conflicts at any point in time.

Calculating max conflicts efficiently

Calculating max conflicts: sort all 2n start/end points and sweep through them, keeping a running count of active activities (sketch below).
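A Python sketch of that sweep, assuming (start, finish) pairs and assuming an activity that finishes at time t does not conflict with one that starts at time t (the tie-breaking is an assumption, not fixed by the slides):
    def max_conflicts(activities):
        events = []
        for start, finish in activities:
            events.append((start, 1))     # an activity begins: one more resource in use
            events.append((finish, -1))   # an activity ends: one fewer resource in use
        # sort the 2n endpoints; at equal times, ends (-1) come before starts (+1)
        events.sort(key=lambda e: (e[0], e[1]))
        best = current = 0
        for _, delta in events:           # single pass over the 2n events
            current += delta
            best = max(best, current)
        return best
This matches the O(2n log 2n + n) = O(n log n) bound noted below.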

Correctness? We can do no better than the max number of conflicts. This exactly counts the max number of conflicts.

Runtime? O(2n log 2n + n) = O(n log n)

Horn formulas Horn formulas are a particular form of boolean logic formula. They are one approach to allowing a program to do logical reasoning. Boolean variables represent events: x = the murder took place in the kitchen, y = the butler is innocent, z = the colonel was asleep at 8 pm.

Implications The left-hand side is an AND of any number of positive literals; the right-hand side is a single literal. With x = the murder took place in the kitchen, y = the butler is innocent, z = the colonel was asleep at 8 pm, the implication (z ∧ y) ⇒ x reads: if the colonel was asleep at 8 pm and the butler is innocent, then the murder took place in the kitchen.

Implications An implication may also have an empty left-hand side, written “⇒ x”, which simply asserts its right-hand side: the murder took place in the kitchen.

Negative clauses An OR of any number of negative literals. With u = the constable is innocent, t = the colonel is innocent, y = the butler is innocent, the clause (¬u ∨ ¬t ∨ ¬y) reads: not everyone is innocent.

Goal Given a Horn formula (i.e. a set of implications and negative clauses), determine whether the formula is satisfiable (i.e. whether there is a true/false assignment that is consistent with all of the formula).

Goal The example formula over u, x, y, z shown on the slide is not satisfiable.

Goal Implications tell us to set some variables to true; negative clauses encourage us to make them false.

A brute force solution Try each setting of the boolean variables and see if any of them satisfies the formula. For n variables, how many settings are there? 2^n
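A brute-force sketch under an assumed representation (implications as (lhs_variables, rhs_variable) pairs, negative clauses as lists of variables); none of these names come from the slides:
    from itertools import product
    def brute_force_horn(variables, implications, negative_clauses):
        # try all 2^n true/false settings
        for values in product([False, True], repeat=len(variables)):
            assign = dict(zip(variables, values))
            imps_ok = all(not all(assign[v] for v in lhs) or assign[rhs]
                          for lhs, rhs in implications)
            negs_ok = all(any(not assign[v] for v in clause)
                          for clause in negative_clauses)
            if imps_ok and negs_ok:
                return assign
        return None   # not satisfiable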

A greedy solution? w=0, x=0, y=0, z=0

A greedy solution? w=0, x=1, y=0, z=0

A greedy solution? w=0, x=1, y=1, z=0

A greedy solution? w=1, x=1, y=1, z=0

A greedy solution? w=1, x=1, y=1, z=0: not satisfiable

A greedy solution Start with all variables set to false, then:

set the right-hand-side variables of implications of the form “⇒ x” (empty left-hand side) to true

A greedy solution If all variables on the left-hand side of an implication are true, then set the right-hand-side variable to true (repeat until no such implication remains).

A greedy solution Finally, check whether all of the negative clauses are satisfied.
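Putting those steps together as a Python sketch, using the same assumed representation as the brute-force version above:
    def greedy_horn(variables, implications, negative_clauses):
        assign = {v: False for v in variables}    # stingy: start with everything false
        changed = True
        while changed:                            # repeat until no implication forces a change
            changed = False
            for lhs, rhs in implications:
                # an empty lhs is vacuously all-true, which covers the "=> x" case
                if all(assign[v] for v in lhs) and not assign[rhs]:
                    assign[rhs] = True
                    changed = True
        # finally, check the negative clauses
        if all(any(not assign[v] for v in clause) for clause in negative_clauses):
            return assign
        return None   # not satisfiable
Each pass over the m implications either sets a new variable to true or terminates the loop, and only n variables can be set, which gives the O(nm) bound noted below.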

Correctness of greedy solution Two parts: If our algorithm returns an assignment, is it a valid assignment? If our algorithm does not return an assignment, does an assignment exist?

Correctness of greedy solution If our algorithm returns an assignment, is it a valid assignment?

Correctness of greedy solution If our algorithm returns an assignment, is it a valid assignment? explicitly check all negative clauses

Correctness of greedy solution If our algorithm returns an assignment, is it a valid assignment? don’t stop until all implications with all lhs elements true have rhs true

Correctness of greedy solution If our algorithm does not return an assignment, does an assignment exist? Our algorithm is “stingy”: it only sets those variables that have to be true in any satisfying assignment; all others remain false. So if a negative clause is still violated under our assignment, it is violated under every assignment, and no satisfying assignment exists.

Running time? ?

O(nm), where n = the number of variables and m = the number of formulas (implications and negative clauses).

Knapsack problems: Greedy or not? 0-1 knapsack: A thief robbing a store finds n items worth v_1, v_2, …, v_n dollars and weighing w_1, w_2, …, w_n pounds, where v_i and w_i are integers. The thief can carry at most W pounds in the knapsack. Which items should the thief take to maximize value? Fractional knapsack problem: Same as above, but the thief happens to be at the bulk section of the store and can carry fractional portions of the items. For example, the thief could take 20% of item i for a weight of 0.2w_i and a value of 0.2v_i.
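For the fractional version a greedy rule works: take items in decreasing value-per-pound order, splitting only the last item taken. A Python sketch with illustrative names, assuming items are (value, weight) pairs with positive weights:
    def fractional_knapsack(items, W):
        # items: list of (value, weight) pairs; W: capacity in pounds
        total = 0.0
        remaining = W
        # consider items in decreasing value-per-pound order
        for value, weight in sorted(items, key=lambda it: it[0] / it[1], reverse=True):
            if remaining <= 0:
                break
            fraction = min(1.0, remaining / weight)   # take all of it, or the fraction that fits
            total += fraction * value
            remaining -= fraction * weight
        return total
Sorting dominates, so this runs in O(n log n). The same ratio rule does not solve the 0-1 version, which is typically handled with dynamic programming instead.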