Lecture 6 Greedy Algorithms


Basic Algorithm Design Techniques
- Divide and conquer
- Dynamic programming
- Greedy
Common theme: to solve a large, complicated problem, break it into many smaller sub-problems.

Greedy Algorithm
If a problem requires making a sequence of decisions, make the “best” choice for the first decision given the current situation. (This automatically reduces the problem to a smaller sub-problem that requires one fewer decision.)

Warm-up: Walking in Manhattan
“Walk in a direction that reduces the distance to the destination.”

Greedy does not always work
Driving in New York: one-way streets, traffic…

Design and Analysis
Designing a greedy algorithm:
1. Break the problem into a sequence of decisions.
2. Identify a rule for choosing the “best” option at each step.
Analyzing a greedy algorithm: this step is important! Greedy often fails, so be suspicious if you cannot find a proof of correctness.
Technique: proof by contradiction. Assume there is a better solution, then show that it is actually no better than what the algorithm produced.

Fractional Knapsack Problem
There is a knapsack that can hold items of total weight at most W. There are n items with weights w1, w2, …, wn and values v1, v2, …, vn. The items are infinitely divisible: we can put ½ (or any fraction) of an item into the knapsack.
Goal: select fractions p1, p2, …, pn such that
- Capacity constraint: p1*w1 + p2*w2 + … + pn*wn <= W
- Maximum value: p1*v1 + p2*v2 + … + pn*vn is maximized.

Example
Capacity W = 10, 3 items with (weight, value) = (6, 20), (5, 15), (4, 10).
Solution: all of item 1 + 0.8 of item 2. Weight = 6 + 0.8*5 = 10, value = 20 + 0.8*15 = 32.
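
The standard greedy rule for this problem (which matches the solution above) is to take items in decreasing order of value per unit weight, taking whatever fraction of the last item still fits. Below is a minimal Python sketch of that rule; the function and variable names are my own, not from the lecture.

def fractional_knapsack(capacity, items):
    """Greedy fractional knapsack.
    items is a list of (weight, value) pairs; returns (total_value, fractions),
    where fractions[i] is the fraction of item i placed in the knapsack."""
    # Consider items in decreasing order of value per unit weight.
    order = sorted(range(len(items)), key=lambda i: items[i][1] / items[i][0], reverse=True)
    fractions = [0.0] * len(items)
    total_value = 0.0
    remaining = capacity
    for i in order:
        if remaining <= 0:
            break
        weight, value = items[i]
        take = min(1.0, remaining / weight)   # largest fraction of item i that still fits
        fractions[i] = take
        total_value += take * value
        remaining -= take * weight
    return total_value, fractions

# Example from the slide: W = 10, items (6, 20), (5, 15), (4, 10)
value, fractions = fractional_knapsack(10, [(6, 20), (5, 15), (4, 10)])
print(value, fractions)   # 32.0 [1.0, 0.8, 0.0]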

Interval Scheduling
There are n meeting requests; meeting i occupies the time interval (si, ti). Two meetings cannot both be scheduled if their intervals overlap.
Goal: schedule as many meetings as possible.
Example: meetings (1, 3), (2, 4), (4, 5), (4, 6), (6, 8).
Solution: 3 meetings: (1, 3), (4, 5), (6, 8).
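
The classic greedy rule for this problem is to repeatedly pick the meeting with the earliest finish time among those that do not overlap anything already chosen. A minimal Python sketch of that rule, with a function name of my own choosing, run on the example above:

def schedule_meetings(meetings):
    """Greedy interval scheduling.
    meetings is a list of (start, finish) pairs; returns a maximum-size set of
    pairwise non-overlapping meetings, chosen by earliest finish time."""
    chosen = []
    last_finish = float("-inf")
    for start, finish in sorted(meetings, key=lambda m: m[1]):
        if start >= last_finish:      # compatible with every meeting chosen so far
            chosen.append((start, finish))
            last_finish = finish
    return chosen

# Example from the slide:
print(schedule_meetings([(1, 3), (2, 4), (4, 5), (4, 6), (6, 8)]))
# [(1, 3), (4, 5), (6, 8)]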