Algorithm Design Methods

Presentation transcript:

Algorithm Design Methods Greedy Algorithm

Greedy Algorithm A greedy algorithm makes the choice that looks best at the moment and adds it to the current partial solution: a locally optimal choice made in the hope that it will lead to a globally optimal solution. Examples: Prim's and Kruskal's minimum spanning tree algorithms, Dijkstra's shortest path algorithm.

The Knapsack Problem A thief robbing a store finds n items. The i-th item is worth (gives a profit of) p_i dollars and weighs w_i pounds. The thief's knapsack can carry at most M pounds. Which items should be selected to maximize the profit, i.e. to take as valuable a load as possible?

The Knapsack Problem Let x_i be the fraction of item i that is put into the knapsack. There are two versions of the problem: The fractional knapsack problem: the thief can take fractions of items (0 <= x_i <= 1); think of the items as gold dust. The 0-1 (binary) knapsack problem: each item is either taken entirely or left behind (x_i is 0 or 1); think of the items as gold bars. In both versions p_i, w_i, and M are integers.

Fractional Knapsack Problem The problem: given a knapsack with capacity M and n items, where item i has weight w_i and profit p_i, find fractions x = (x_1, ..., x_n) with 0 <= x_i <= 1 such that the total profit p_1*x_1 + p_2*x_2 + ... + p_n*x_n is maximized subject to the weight constraint w_1*x_1 + w_2*x_2 + ... + w_n*x_n <= M. For item i, the selection contributes profit p_i*x_i and puts weight w_i*x_i into the knapsack.
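Restating the slide above in standard notation (using only the quantities already defined), the fractional knapsack problem is:

```latex
% Fractional knapsack as an optimization problem
\[
  \max_{x_1,\dots,x_n} \; \sum_{i=1}^{n} p_i x_i
  \quad \text{subject to} \quad
  \sum_{i=1}^{n} w_i x_i \le M,
  \qquad 0 \le x_i \le 1 \ \text{for } i = 1,\dots,n .
\]
```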

How to solve the fractional knapsack problem? Use a greedy method: the thief puts items into the knapsack one by one, taking them in the order that looks most profitable at each step. Which ordering works?

Example: n = 3, M = 20, p = (25, 24, 15), w = (18, 15, 10).
Greedy Strategy #1: items are taken in nonincreasing order of profit (order 1, 2, 3): x = (1, 2/15, 0), total profit = 25 + 24 * (2/15) = 28.2.
Greedy Strategy #2: items are taken in nondecreasing order of weight (order 3, 2, 1): x = (0, 2/3, 1), total profit = 24 * (2/3) + 15 = 31.
Greedy Strategy #3: items are taken in nonincreasing order of the ratio p_i/w_i = 25/18, 24/15, 15/10 (about 1.4, 1.6, 1.5), i.e. order 2, 3, 1: x = (0, 1, 1/2), total profit = 24 + 15 * (1/2) = 31.5. Is this the optimal solution?
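The three strategies can be checked with a short script; this is an illustrative sketch (not part of the slides), using the example data above:

```python
# Compare the three greedy orderings on the example above.
p = [25, 24, 15]          # profits
w = [18, 15, 10]          # weights
M = 20                    # knapsack capacity

def fractional_fill(order):
    """Fill the knapsack greedily in the given item order; return total profit."""
    remaining, profit = M, 0.0
    for i in order:
        take = min(1.0, remaining / w[i])   # fraction of item i that fits
        profit += take * p[i]
        remaining -= take * w[i]
        if remaining == 0:
            break
    return profit

by_profit = sorted(range(3), key=lambda i: -p[i])          # strategy 1
by_weight = sorted(range(3), key=lambda i: w[i])           # strategy 2
by_ratio  = sorted(range(3), key=lambda i: -p[i] / w[i])   # strategy 3

print(fractional_fill(by_profit))  # 28.2
print(fractional_fill(by_weight))  # 31.0
print(fractional_fill(by_ratio))   # 31.5
```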

Proof of correctness (by contradiction) Let X be the solution produced by greedy strategy #3 and assume X is not optimal: there is an optimal solution Y whose profit is greater than the profit of X. Consider an item j that is taken in X but taken to a smaller extent (or not at all) in Y. Remove from Y items of total weight w_j (possibly fractional items, all with profit/weight ratio no larger than item j's) and add item j instead. The capacity used stays the same and the total profit does not decrease, while Y now agrees with X on one more item. Repeating this process turns Y into X without ever decreasing the total profit, so X is optimal too. Contradiction!

Greedy Algorithm
1. Calculate v_i = p_i / w_i for i = 1, 2, ..., n.
2. Sort the items by nonincreasing v_i (all w_i and p_i are reordered correspondingly).
3. Let M' be the current weight limit (initially M' = M and all x_i = 0). In each iteration, take item i from the head of the unselected list: if M' >= w_i, set x_i = 1 and M' = M' - w_i; if M' < w_i, set x_i = M'/w_i and the algorithm is finished.
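A direct translation of the three steps into Python, as a sketch (the function name and variables are illustrative, not from the slides):

```python
def fractional_knapsack(p, w, M):
    """Greedy fractional knapsack: returns (fractions x, total profit)."""
    n = len(p)
    # Steps 1-2: sort item indices by nonincreasing profit/weight ratio.
    order = sorted(range(n), key=lambda i: p[i] / w[i], reverse=True)
    x = [0.0] * n
    remaining = M            # M' in the slides
    profit = 0.0
    # Step 3: take items from the head of the sorted list.
    for i in order:
        if remaining >= w[i]:
            x[i] = 1.0
            remaining -= w[i]
            profit += p[i]
        else:
            x[i] = remaining / w[i]
            profit += p[i] * x[i]
            break
    return x, profit

# Example from the slides: x = [0.0, 1.0, 0.5], profit = 31.5
print(fractional_knapsack([25, 24, 15], [18, 15, 10], 20))
```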

Time Complexity
1. Calculate v_i = p_i / w_i for i = 1, 2, ..., n: O(n).
2. Sort the items by nonincreasing v_i: O(n log n).
3. The selection loop: O(n) iterations, O(1) work per iteration.
Total: O(n log n).

Greedy Algorithm Greedy choice property: a globally optimal solution can be reached by making the choice that looks best at the moment and adding it to the current partial solution. Optimal substructure: an optimal solution to the problem contains optimal solutions to its subproblems, which is why a sequence of locally optimal choices can lead to a globally optimal solution.

0-1 Knapsack Problem (each x_i is 0 or 1) Knapsack capacity: M = 50
i    1    2    3
p_i  60   100  120
w_i  10   20   30
Possible selections: items 1 and 2 give 100 + 60 = 160; items 1 and 3 give 120 + 60 = 180; items 2 and 3 give 120 + 100 = 220. Can the 0-1 knapsack problem be solved by a greedy algorithm?
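A quick illustrative check (not from the slides) that the ratio-greedy rule, restricted to whole items, is beaten by exhaustive search on this instance:

```python
from itertools import combinations

p = [60, 100, 120]
w = [10, 20, 30]
M = 50

# Greedy by profit/weight ratio, whole items only.
greedy_value = 0
remaining = M
for i in sorted(range(3), key=lambda i: p[i] / w[i], reverse=True):
    if w[i] <= remaining:
        greedy_value += p[i]
        remaining -= w[i]

# Exhaustive search over all feasible subsets.
best = max(sum(p[i] for i in s)
           for r in range(4) for s in combinations(range(3), r)
           if sum(w[i] for i in s) <= M)

print(greedy_value, best)   # 160 220 -- greedy is not optimal for 0-1 knapsack
```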

Recursive Solution The choices can be drawn as a binary decision tree in which every node records the remaining capacity and the profit collected so far. The root has capacity M and profit 0; at level i the left branch selects item i (capacity shrinks by w_i and profit grows by p_i) and the right branch leaves item i out. After the first two items, for example, the four nodes carry capacities M, M - w_1, M - w_2, M - w_1 - w_2 and profits 0, p_1, p_2, p_1 + p_2.

Recursive Solution Let us define P(i, k) as the maximum profit possible using items i, i+1, ..., n and capacity k. We can write expressions for P(i, k) for i = n and i < n as follows: P(n, k) = p_n if w_n <= k, and 0 otherwise. For i < n: P(i, k) = max{ P(i+1, k), p_i + P(i+1, k - w_i) } when w_i <= k, where the first term corresponds to item i not being chosen and the second to item i being chosen; when w_i > k, P(i, k) = P(i+1, k). The answer to the whole problem is P(1, M).
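A deliberately naive sketch that translates this recurrence directly (names are illustrative); it recomputes subproblems, which is exactly the inefficiency discussed next:

```python
def P(i, k, p, w):
    """Maximum profit using items i..n (1-indexed) with capacity k.
    Direct translation of the recurrence -- exponential time, no memoization."""
    n = len(p)
    if i == n:                       # base case: only item n remains
        return p[n - 1] if w[n - 1] <= k else 0
    if w[i - 1] > k:                 # item i does not fit
        return P(i + 1, k, p, w)
    # item i not chosen vs. item i chosen
    return max(P(i + 1, k, p, w),
               p[i - 1] + P(i + 1, k - w[i - 1], p, w))

# Example used later in the slides: n = 5, M = 10
print(P(1, 10, [2, 3, 5, 4, 6], [2, 2, 6, 5, 4]))   # 11
```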

Recursive Solution The recursive algorithm takes O(2^n) time (the 0-1 knapsack problem is NP-complete). It is inefficient because P(i, k) for the same i and k is computed many times.

Example: n=5, M=10, w=[2, 2, 6, 5, 4], p=[6, 3, 5, 4, 6]. The recursion tree contains the same subproblem P(i, k) more than once.

Dynamic Programming Solution The inefficiency can be overcome by computing each P(i, k) only once and storing the results in a table for future use.
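One way to do this is top-down memoization: keep the recursion but cache each P(i, k) the first time it is computed. A sketch (function names are illustrative, not from the slides):

```python
from functools import lru_cache

def knapsack_memo(p, w, M):
    """Top-down DP: same recurrence as before, but each P(i, k) is computed once."""
    n = len(p)

    @lru_cache(maxsize=None)
    def P(i, k):
        if i == n:
            return p[n - 1] if w[n - 1] <= k else 0
        if w[i - 1] > k:
            return P(i + 1, k)
        return max(P(i + 1, k), p[i - 1] + P(i + 1, k - w[i - 1]))

    return P(1, M)

print(knapsack_memo([2, 3, 5, 4, 6], [2, 2, 6, 5, 4], 10))   # 11
```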

Example n=5, M=10, w = [2, 2, 6, 5, 4], p = [2, 3, 5, 4, 6]. The table of values P(i, k), with rows i = 1, ..., 5 and columns k = 0, ..., 10, is filled row by row starting from the base case i = 5 and working up to i = 1 using the recurrence above. The maximum profit is P(1, 10) = 11, and tracing back through the table gives two optimal solutions: x = [0,0,1,0,1] and x = [1,1,0,0,1].
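A bottom-up sketch that builds this table and recovers one optimal x vector by tracing back through it (illustrative code, not from the slides):

```python
def knapsack_dp(p, w, M):
    """Bottom-up 0-1 knapsack: build table P[i][k] for i = n..1, then trace back x."""
    n = len(p)
    # P[i][k] = best profit using items i..n with capacity k (row n+1 is all zeros).
    P = [[0] * (M + 1) for _ in range(n + 2)]
    for i in range(n, 0, -1):
        for k in range(M + 1):
            P[i][k] = P[i + 1][k]
            if w[i - 1] <= k:
                P[i][k] = max(P[i][k], p[i - 1] + P[i + 1][k - w[i - 1]])
    # Trace back: item i is taken exactly when skipping it would lose profit.
    x, k = [0] * n, M
    for i in range(1, n + 1):
        if P[i][k] != P[i + 1][k]:
            x[i - 1] = 1
            k -= w[i - 1]
    return P[1][M], x

# Example from the slides: profit 11, one optimal solution is x = [0, 0, 1, 0, 1]
print(knapsack_dp([2, 3, 5, 4, 6], [2, 2, 6, 5, 4], 10))
```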

Dynamic programming Identify a recursive definition of how a larger solution is built from optimal results for smaller subproblems. Create a table that can be filled bottom-up, computing the results for the subproblems and eventually solving the entire problem. Running time and space: O(nM). For large M, e.g. M = O(2^n), this is still exponential in the input size, so the 0-1 knapsack problem remains NP-complete.

Dynamic Programming The dynamic programming strategy handles divide-and-conquer-style problems whose subproblems overlap: each subproblem is solved once and its result is reused.

Finding all subsets: powerSet() example
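The powerSet() example itself did not survive in this transcript; a minimal recursive sketch of the idea (the function body here is assumed, only the name comes from the slide title):

```python
def power_set(items):
    """Return all subsets of items: either exclude or include the first element."""
    if not items:
        return [[]]
    first, rest = items[0], items[1:]
    without_first = power_set(rest)
    with_first = [[first] + subset for subset in without_first]
    return without_first + with_first

print(power_set([1, 2, 3]))   # 8 subsets, from [] to [1, 2, 3]
```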

Recursive calls for fib(5)

Effect of computing fib(5) using dynamic programming

Summary Slide 4 Dynamic Programming There are two types of dynamic programming: 1) Top-down dynamic programming: uses a vector to store Fibonacci numbers as a recursive function computes them; this avoids costly redundant recursive calls and leads to an O(n) algorithm that computes the nth Fibonacci number, whereas the recursive function that does not apply dynamic programming has exponential running time. The same idea improves the recursive computation of C(n,k), the number of combinations of n things taken k at a time.
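A sketch of the top-down approach described here, using a list as the cache of already computed Fibonacci numbers (illustrative, not the slides' code):

```python
def fib_top_down(n, memo=None):
    """nth Fibonacci number with memoization: each fib(i) is computed once -> O(n)."""
    if memo is None:
        memo = [None] * (n + 1)
    if n <= 1:
        return n
    if memo[n] is None:
        memo[n] = fib_top_down(n - 1, memo) + fib_top_down(n - 2, memo)
    return memo[n]

print(fib_top_down(5))    # 5
print(fib_top_down(40))   # 102334155 -- fast, unlike the plain recursion
```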

Summary Slide 5 2) Bottom-up dynamic programming: evaluates a function by computing all the function values in order, starting at the lowest level and using previously computed values at each step to compute the current value. Example: the 0/1 knapsack problem.
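For contrast with the top-down version above, a bottom-up sketch of the same Fibonacci computation (again illustrative):

```python
def fib_bottom_up(n):
    """Compute fib(0)..fib(n) in order, keeping only the last two values."""
    if n <= 1:
        return n
    prev, curr = 0, 1
    for _ in range(2, n + 1):
        prev, curr = curr, prev + curr
    return curr

print(fib_bottom_up(5))   # 5
```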