91.404 Algorithms – Ch4 - Divide & Conquer (5/15/2015)


Slide 1: Recurrences. As we saw in another lecture, the Divide and Conquer approach leads to recurrence formulae that represent its time costs: for small values of n (n = 1 is typical, but the base case could be larger), T(n) = Θ(1); for larger values of n, T(n) = a T(n/b) + f(n), where f(n) represents the cost of dividing the problem and of recombining the solutions of the subproblems. As we go, we will examine in some detail a few methods of solution.
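As a concrete instance of this general form, consider MERGESORT (analyzed earlier in the course): splitting the array gives a = b = 2 and the divide/recombine cost is f(n) = Θ(n). This is my illustration, not text from the slide:

```latex
T(n) =
\begin{cases}
\Theta(1) & \text{if } n = 1,\\[2pt]
2\,T(n/2) + \Theta(n) & \text{if } n > 1,
\end{cases}
\qquad\text{i.e. } a = 2,\; b = 2,\; f(n) = \Theta(n).
```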

Slide 2: Methods for solving recurrences.
1. The substitution method: essentially, guess the solution and prove the guess correct by mathematical induction (a worked example follows this list). This will work well in all cases. Slight difficulty: you have to guess right, and that is not easy in most cases.
2. The recursion-tree method: convert the recurrence into a tree (as in MERGESORT), add up the costs within each level, and then add up the costs of all the levels. Slight difficulty: unless the costs at each level are easy to compute, you may have some difficult bounds to cook up.
3. The Master Method: there is a theorem... Basically, find out which set of hypotheses holds in your special case, and read off the conclusion. Slight difficulty: sometimes you will have trouble proving that the hypotheses you need (or any relevant hypotheses) are actually satisfied.
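For illustration (my working, not the slide's), here is the substitution method applied to the MERGESORT-style recurrence T(n) = 2T(n/2) + n, with the guess T(n) ≤ c n lg n:

```latex
\text{Guess } T(n) \le c\, n \lg n.\quad \text{Inductive step, assuming the bound for } n/2:\\
T(n) = 2\,T(n/2) + n
\;\le\; 2\,c\,(n/2)\lg(n/2) + n
\;=\; c\, n \lg n - c\, n + n
\;\le\; c\, n \lg n \quad\text{for } c \ge 1 .
```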

Slide 3: Examples of Divide and Conquer: the Maximum Subarray problem. Problem: given an array of n numbers, find a contiguous subarray whose sum has the largest value. Application: an unrealistic stock market game, in which you decide when to buy and sell a stock, with full knowledge of the past and future. The restriction is that you can perform just one buy followed by one sell. The buy and the sell both occur right after the close of the market. The interpretation of the numbers: each number represents the stock value at closing on a particular day.

Slide 4: Example (figure not included in this transcript).

Slide 5: Another example: buying low and selling high, even with perfect knowledge, is not trivial (figure not included in this transcript).

Slide 6: First solution: compute the value change over the subarray corresponding to each pair of dates, and find the maximum (a brute-force sketch follows this list).
1. How many pairs of dates? n(n-1)/2 of them.
2. This count alone puts the approach in the class Θ(n²).
3. The rest of the costs, although possibly constant, don't improve the situation: Θ(n²).
Not a pleasant prospect if we are rummaging through long time-series (who told you it was easy to get rich???), even if you are allowed to post-date your stock options…
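A minimal sketch of this brute-force approach, assuming the closing prices are given as a Python list indexed by day (the function and variable names are illustrative, not from the slides):

```python
def max_profit_brute_force(prices):
    """Try every pair (buy_day, sell_day) and keep the best value change.

    There are Theta(n^2) pairs, so the running time is Theta(n^2).
    """
    n = len(prices)
    best = 0                                # doing nothing is always allowed
    best_pair = None
    for buy in range(n):                    # day we buy (at the close)
        for sell in range(buy + 1, n):      # a later day we sell
            change = prices[sell] - prices[buy]
            if change > best:
                best, best_pair = change, (buy, sell)
    return best_pair, best
```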

Slide 7: We are going to find an algorithm with an o(n²) running time (i.e. strictly asymptotically faster than n²), which should allow us to look at longer time-series. Transformation: instead of the daily price, let us consider the daily change: A[i] is the difference between the closing value on day i and that on day i-1. The problem becomes that of finding a contiguous subarray the sum of whose values is maximum. At first glance this seems even worse: roughly the same number of intervals (one fewer, to be precise), plus the requirement to add up the values in each subarray rather than just computing a difference: Θ(n³)?

Slide 8: It is actually possible to perform the computation in Θ(n²) time by
1. computing all the daily changes;
2. computing the changes over 2 days (one addition each);
3. computing the changes over 3 days (one further addition to extend each length-2 sum), etc. Check it out.
You'll need a two-dimensional array to store the intermediate computations (a sketch of this incremental scheme appears below). Still bad, though. Can we do better??
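A sketch of that Θ(n²) incremental scheme, assuming A is the list of daily changes (A[i] = price[i] - price[i-1]); the two-dimensional table of intermediate sums is kept only to mirror the slide's description:

```python
def max_subarray_quadratic(A):
    """Extend each starting index one day at a time, reusing the previous sum.

    sums[i][j] holds the change over days i..j; each new entry costs one
    addition, so filling the table is Theta(n^2).
    """
    n = len(A)
    sums = [[0] * n for _ in range(n)]      # two-dimensional intermediate storage
    best, best_range = A[0], (0, 0)
    for i in range(n):
        sums[i][i] = A[i]                   # length-1 interval
        if A[i] > best:
            best, best_range = A[i], (i, i)
        for j in range(i + 1, n):
            sums[i][j] = sums[i][j - 1] + A[j]   # one further addition
            if sums[i][j] > best:
                best, best_range = sums[i][j], (i, j)
    return best_range, best
```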

Slide 9: How do we divide? We observe that a maximum contiguous subarray A[i…j] must be located as follows:
1. it lies entirely in the left half of the original array: [low…mid];
2. it lies entirely in the right half of the original array: [mid+1…high];
3. it straddles the midpoint of the original array: i ≤ mid < j.

Slide 10: The "left" and "right" subproblems are smaller versions of the original problem, so they are part of the standard Divide & Conquer recursion. The "middle" subproblem is not, so we will need to count its cost as part of the "combine" (or "divide") cost. The crucial observation (and it may not be entirely obvious) is that we can find the maximum crossing subarray in time linear in the length of the subarray A[low…high]. How? A crossing subarray A[i…j] must be made up of A[i…mid] and A[mid+1…j], so we find the largest-sum A[i…mid], the largest-sum A[mid+1…j], and combine them.

Slide 11: The algorithm: FIND-MAX-CROSSING-SUBARRAY (pseudocode not included in this transcript; a sketch follows).
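Since the slide's pseudocode did not survive the transcript, here is a sketch of what FIND-MAX-CROSSING-SUBARRAY might look like, written 0-indexed in Python (a standard CLRS-style formulation; the details are mine, not the slide's):

```python
def find_max_crossing_subarray(A, low, mid, high):
    """Best subarray A[i..j] with i <= mid < j, in time linear in high-low+1.

    Scan left from mid and right from mid+1, keeping the best partial sums.
    """
    left_sum, total = float("-inf"), 0
    max_left = mid
    for i in range(mid, low - 1, -1):          # grow leftward from mid
        total += A[i]
        if total > left_sum:
            left_sum, max_left = total, i

    right_sum, total = float("-inf"), 0
    max_right = mid + 1
    for j in range(mid + 1, high + 1):         # grow rightward from mid+1
        total += A[j]
        if total > right_sum:
            right_sum, max_right = total, j

    return max_left, max_right, left_sum + right_sum
```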

Slide 12: The total number of iterations for the two loops together is exactly high-low+1. The left recursion returns the indices and value of the largest contiguous subarray in the left half of A[low…high], the right recursion returns the indices and value of the largest contiguous subarray in the right half of A[low…high], and FIND-MAX-CROSSING-SUBARRAY returns the indices and value of the largest contiguous subarray that straddles the midpoint of A[low…high]. It is now easy to choose the contiguous subarray with the largest value and return its endpoints and value to the caller.

Slide 13: The recursive algorithm (pseudocode not included in this transcript; a sketch follows).
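A sketch of the recursive driver under the same assumptions (0-indexed; the name find_maximum_subarray mirrors the usual CLRS procedure name and is my choice, not the slide's):

```python
def find_maximum_subarray(A, low, high):
    """Return (i, j, sum) for a maximum contiguous subarray of A[low..high]."""
    if low == high:                         # base case: a single element
        return low, high, A[low]
    mid = (low + high) // 2
    l_lo, l_hi, l_sum = find_maximum_subarray(A, low, mid)        # left half
    r_lo, r_hi, r_sum = find_maximum_subarray(A, mid + 1, high)   # right half
    c_lo, c_hi, c_sum = find_max_crossing_subarray(A, low, mid, high)
    # Return whichever of the three candidates has the largest sum.
    if l_sum >= r_sum and l_sum >= c_sum:
        return l_lo, l_hi, l_sum
    if r_sum >= l_sum and r_sum >= c_sum:
        return r_lo, r_hi, r_sum
    return c_lo, c_hi, c_sum
```

Calling find_maximum_subarray(A, 0, len(A) - 1) on the array of daily changes returns the best buy/sell window together with its total change.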

Slide 14: The analysis.
1. The base case, n = 1. This is easy: T(1) = Θ(1), as one might expect.
2. The recursive case, n > 1. We make the simplifying assumption that n is a power of 2, so we can always "split in half". T(n) = cost of splitting + 2T(n/2) + cost of finding the largest midpoint-straddling subarray + cost of comparing the three subarray values and returning the correct triple. The cost of splitting is constant, Θ(1); the cost of finding the largest straddling subarray is Θ(n); the cost of the final comparison and return is constant, Θ(1). So T(n) = Θ(1) + 2T(n/2) + Θ(n) + Θ(1).

Slide 15: We finally have T(n) = 2T(n/2) + Θ(n) for n > 1, with T(1) = Θ(1). The recurrence has the same form as that for MERGESORT, and thus we should expect it to have the same solution, T(n) = Θ(n lg n). This algorithm is clearly substantially faster than any of the brute-force methods. It required some cleverness, and the programming is a little more complicated, but the payoff is large.

Slide 16: Strassen's Algorithm for Matrix Multiplication. We start from the standard algorithm for matrix multiplication (pseudocode not included in this transcript; a sketch follows). The triple nested loop clearly implies a Θ(n³) running time, which is not quite as bad as it looks, since we are dealing with n² elements: n³ = (n²)^1.5.
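A minimal sketch of that standard triple-loop algorithm, assuming square matrices represented as lists of lists (names are illustrative):

```python
def square_matrix_multiply(A, B):
    """Standard Theta(n^3) multiplication: C[i][j] = sum_k A[i][k] * B[k][j]."""
    n = len(A)
    C = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            for k in range(n):           # the triple nested loop
                C[i][j] += A[i][k] * B[k][j]
    return C
```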

Slide 17: If we are going to try some clever Divide & Conquer scheme, we could start by coming up with a non-clever one… Here it is: if n is a power of 2, we can always subdivide an n-by-n matrix into four n/2-by-n/2 ones:
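The block partition the slide refers to can be written as follows (a standard identity, reconstructed here because the slide's figure is missing): partition A, B, and C = AB into n/2-by-n/2 blocks, and the product decomposes into 8 block products and 4 block additions:

```latex
\begin{pmatrix} C_{11} & C_{12}\\ C_{21} & C_{22} \end{pmatrix}
=
\begin{pmatrix} A_{11} & A_{12}\\ A_{21} & A_{22} \end{pmatrix}
\begin{pmatrix} B_{11} & B_{12}\\ B_{21} & B_{22} \end{pmatrix},
\qquad
\begin{aligned}
C_{11} &= A_{11}B_{11} + A_{12}B_{21}, & C_{12} &= A_{11}B_{12} + A_{12}B_{22},\\
C_{21} &= A_{21}B_{11} + A_{22}B_{21}, & C_{22} &= A_{21}B_{12} + A_{22}B_{22}.
\end{aligned}
```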

Slide 18: A pretty immediate conclusion is that multiplying two n-by-n matrices involves 8 multiplications of n/2-by-n/2 matrices and 4 additions of n/2-by-n/2 matrices. Additions of matrices are easy: two n-by-n matrices cost Θ(n²) to add, and the total cost of the 4 additions remains Θ(n²). Partitioning the matrices has a cost that depends on the method: copying would have a Θ(n²) cost (because of the n² elements), while using indices into the original array (avoiding copying) could cost as little as Θ(1).

Slide 19: The algorithm (pseudocode not included in this transcript; a sketch follows).
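A sketch of the straightforward recursive scheme, assuming n is a power of 2 and, for simplicity, partitioning by copying (which, as the previous slide notes, could be avoided with index arithmetic); all names are mine, not the slide's:

```python
def add(A, B):
    """Entrywise sum of two equal-size square matrices: Theta(n^2)."""
    n = len(A)
    return [[A[i][j] + B[i][j] for j in range(n)] for i in range(n)]

def split(M):
    """Partition M into four n/2-by-n/2 blocks (by copying)."""
    n = len(M) // 2
    return ([row[:n] for row in M[:n]], [row[n:] for row in M[:n]],
            [row[:n] for row in M[n:]], [row[n:] for row in M[n:]])

def join(C11, C12, C21, C22):
    """Reassemble four blocks into one matrix."""
    top = [r1 + r2 for r1, r2 in zip(C11, C12)]
    bottom = [r1 + r2 for r1, r2 in zip(C21, C22)]
    return top + bottom

def multiply_recursive(A, B):
    """Divide-and-conquer product: 8 half-size multiplications + 4 additions."""
    n = len(A)
    if n == 1:
        return [[A[0][0] * B[0][0]]]
    A11, A12, A21, A22 = split(A)
    B11, B12, B21, B22 = split(B)
    C11 = add(multiply_recursive(A11, B11), multiply_recursive(A12, B21))
    C12 = add(multiply_recursive(A11, B12), multiply_recursive(A12, B22))
    C21 = add(multiply_recursive(A21, B11), multiply_recursive(A22, B21))
    C22 = add(multiply_recursive(A21, B12), multiply_recursive(A22, B22))
    return join(C11, C12, C21, C22)
```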

Slide 20: The recursion formula is T(n) = 8T(n/2) + Θ(n²). Using the Master method (apply the theorem) we should end up with the same result we had before: Θ(n³). We could also try T(n) = cn³ and see if we can show that an appropriate inequality remains satisfied through a mathematical induction. More importantly, one can ask the question: do we need all 8 multiplications, or can we find a clever way of coming up with fewer? We would expect a cost, probably in the form of a larger number of additions, but additions are much cheaper and, as long as the number of additions doesn't grow with n, we can fold them into the Θ(n²) term.
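For reference (my working, not text from the slide), the Master theorem applied to this recurrence: a = 8, b = 2, so n^{log_b a} = n³, and f(n) = Θ(n²) is polynomially smaller, which is case 1:

```latex
T(n) = 8\,T(n/2) + \Theta(n^{2}),\qquad
n^{\log_2 8} = n^{3},\qquad
f(n) = \Theta(n^{2}) = O\!\left(n^{3-\varepsilon}\right)\ (\varepsilon = 1)
\;\Longrightarrow\;
T(n) = \Theta(n^{3}).
```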

Slide 21: Strassen's method (the slide's formulas are not included in this transcript; a reconstruction follows).
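The seven products and the recombination are reconstructed below in the classical formulation; the slide's exact labeling may differ (CLRS, for instance, uses intermediate sums S₁…S₁₀ and products P₁…P₇), but the algebra is equivalent: 7 half-size multiplications and a constant number of Θ(n²) additions/subtractions.

```latex
\begin{aligned}
M_1 &= (A_{11}+A_{22})(B_{11}+B_{22}), &
M_2 &= (A_{21}+A_{22})\,B_{11}, &
M_3 &= A_{11}\,(B_{12}-B_{22}),\\
M_4 &= A_{22}\,(B_{21}-B_{11}), &
M_5 &= (A_{11}+A_{12})\,B_{22}, &
M_6 &= (A_{21}-A_{11})(B_{11}+B_{12}),\\
M_7 &= (A_{12}-A_{22})(B_{21}+B_{22}); &&&&\\[4pt]
C_{11} &= M_1 + M_4 - M_5 + M_7, &
C_{12} &= M_3 + M_5, &
C_{21} &= M_2 + M_4,\\
C_{22} &= M_1 - M_2 + M_3 + M_6. &&&&
\end{aligned}
```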

Slide 22: The recursion formula becomes T(n) = 7T(n/2) + Θ(n²), with solution T(n) = Θ(n^{lg 7}) ≈ Θ(n^{2.81}).
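Again for reference (my working): with a = 7 and b = 2 we get n^{log_2 7} ≈ n^{2.807}, and f(n) = Θ(n²) is polynomially smaller, so case 1 of the Master theorem applies:

```latex
T(n) = 7\,T(n/2) + \Theta(n^{2}),\qquad
n^{\log_2 7} \approx n^{2.807},\qquad
\Theta(n^{2}) = O\!\left(n^{\log_2 7 - \varepsilon}\right)
\;\Longrightarrow\;
T(n) = \Theta\!\left(n^{\lg 7}\right).
```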