Divide and Conquer Andreas Klappenecker [based on slides by Prof. Welch]

Divide and Conquer Paradigm
An important general technique for designing algorithms:
- divide the problem into subproblems
- recursively solve the subproblems
- combine the solutions to the subproblems to get a solution to the original problem
Use recurrences to analyze the running time of such algorithms.

Mergesort

Example: Mergesort
- DIVIDE the input sequence in half
- RECURSIVELY sort the two halves (the basis of the recursion is a sequence with 1 key)
- COMBINE the two sorted subsequences by merging them
(A runnable sketch follows below.)
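A minimal Python sketch of these three steps (the names merge_sort and merge are illustrative, not from the slides):

```python
def merge_sort(a):
    """Sort a list by divide and conquer (mergesort)."""
    if len(a) <= 1:                   # basis: a sequence with at most 1 key is already sorted
        return a
    mid = len(a) // 2                 # DIVIDE the input sequence in half
    left = merge_sort(a[:mid])        # RECURSIVELY sort the two halves
    right = merge_sort(a[mid:])
    return merge(left, right)         # COMBINE by merging

def merge(left, right):
    """Merge two sorted lists into one sorted list in linear time."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])
    out.extend(right[j:])
    return out

# merge_sort([7, 2, 9, 4]) returns [2, 4, 7, 9]
```

The linear-time merge is what gives the Θ(n) combine cost used in the recurrence below.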

Mergesort Example

Mergesort Animation
http://ccl.northwestern.edu/netlogo/models/run.cgi?MergeSort

Recurrence Relation for Mergesort
Let T(n) be the worst-case time on a sequence of n keys.
- If n = 1, then T(n) = Θ(1) (constant).
- If n > 1, then T(n) = 2T(n/2) + Θ(n): two subproblems of size n/2 each that are solved recursively, plus Θ(n) time to do the merge.

Recurrence Relations

How To Solve Recurrences
Ad hoc method:
- expand several times
- guess the pattern
- verify with a proof by induction (a worked expansion follows below)
Master theorem:
- a general formula that works if the recurrence has the form T(n) = aT(n/b) + f(n)
- a is the number of subproblems, n/b is the size of each subproblem, f(n) is the cost of the non-recursive part
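For example, a sketch of the ad hoc method on the mergesort recurrence, assuming n is a power of 2 and writing the Θ(n) term as cn:

```latex
\begin{aligned}
T(n) &= 2\,T(n/2) + cn \\
     &= 4\,T(n/4) + 2cn \\
     &= 8\,T(n/8) + 3cn \\
     &\;\;\vdots \\
     &= 2^k\,T(n/2^k) + k\,cn
      \;=\; n\,T(1) + cn\log_2 n \quad\text{at } k = \log_2 n,
\end{aligned}
```

which suggests the guess T(n) = Θ(n log n), verifiable by induction.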

Master Theorem
Consider a recurrence of the form T(n) = a T(n/b) + f(n) with a >= 1, b > 1, and f(n) eventually positive.
a) If f(n) = O(n^(log_b(a) - ε)) for some ε > 0, then T(n) = Θ(n^(log_b(a))).
b) If f(n) = Θ(n^(log_b(a))), then T(n) = Θ(n^(log_b(a)) log(n)).
c) If f(n) = Ω(n^(log_b(a) + ε)) for some ε > 0 and f(n) is regular, then T(n) = Θ(f(n)).
[f(n) is regular iff eventually a·f(n/b) <= c·f(n) for some constant c < 1]

Excuse me, what did it say???
Essentially, the Master theorem compares the function f(n) with the function g(n) = n^(log_b(a)). Roughly, the theorem says:
a) If f(n) << g(n), then T(n) = Θ(g(n)).
b) If f(n) ≈ g(n), then T(n) = Θ(g(n) log(n)).
c) If f(n) >> g(n), then T(n) = Θ(f(n)).
Now go back and memorize the theorem!

Déjà vu: Master Theorem
Consider a recurrence of the form T(n) = a T(n/b) + f(n) with a >= 1, b > 1, and f(n) eventually positive.
a) If f(n) = O(n^(log_b(a) - ε)) for some ε > 0, then T(n) = Θ(n^(log_b(a))).
b) If f(n) = Θ(n^(log_b(a))), then T(n) = Θ(n^(log_b(a)) log(n)).
c) If f(n) = Ω(n^(log_b(a) + ε)) for some ε > 0 and f(n) is regular, then T(n) = Θ(f(n)).
[f(n) is regular iff eventually a·f(n/b) <= c·f(n) for some constant c < 1]
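To see the three cases side by side, here is a small Python sketch (an illustration, not from the slides) for recurrences of the special form T(n) = a·T(n/b) + Θ(n^d); for such polynomial f(n) the regularity condition in case c) holds automatically.

```python
import math

def master_theorem(a, b, d):
    """Classify T(n) = a*T(n/b) + Theta(n^d) using the Master theorem.

    For f(n) = n^d the regularity condition of case c) always holds,
    since a*f(n/b) = (a / b**d) * f(n) and a / b**d < 1 when d > log_b(a).
    """
    crit = math.log(a, b)               # the critical exponent log_b(a)
    if abs(d - crit) < 1e-12:           # case b): f(n) matches n^log_b(a)
        return f"Theta(n^{crit:.3f} log n)"
    if d < crit:                        # case a): f(n) grows slower than n^log_b(a)
        return f"Theta(n^{crit:.3f})"
    return f"Theta(n^{d})"              # case c): f(n) dominates

print(master_theorem(2, 2, 1))   # mergesort: T(n) = 2T(n/2) + Theta(n)      -> Theta(n^1.000 log n)
print(master_theorem(8, 2, 2))   # block matrix multiply (later slide)       -> Theta(n^3.000)
print(master_theorem(7, 2, 2))   # Strassen's recurrence (later slide)       -> Theta(n^2.807)
```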

Nothing is perfect… The Master theorem does not cover all possible cases. For example, if f(n) = Θ(n^(log_b(a)) log n), then we lie between cases b) and c), but the theorem does not apply. There exist better versions of the Master theorem that cover more cases, but these are even harder to memorize.

Idea of the Proof Let us iteratively substitute the recurrence:
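A sketch of the substitution, assuming n is a power of b:

```latex
\begin{aligned}
T(n) &= a\,T(n/b) + f(n) \\
     &= a^2\,T(n/b^2) + a\,f(n/b) + f(n) \\
     &= a^3\,T(n/b^3) + a^2 f(n/b^2) + a\,f(n/b) + f(n) \\
     &\;\;\vdots \\
     &= a^{\log_b n}\,T(1) + \sum_{i=0}^{\log_b n - 1} a^i f(n/b^i),
     \qquad\text{with } a^{\log_b n} = n^{\log_b a}.
\end{aligned}
```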

Idea of the Proof
Thus, we obtained T(n) = n^(log_b(a)) T(1) + Σ_{i=0}^{log_b(n)-1} a^i f(n/b^i). The proof proceeds by distinguishing three cases:
1) The first term is dominant: f(n) = O(n^(log_b(a) - ε)).
2) Each part of the summation is equally dominant: f(n) = Θ(n^(log_b(a))).
3) The summation can be bounded by a geometric series: f(n) = Ω(n^(log_b(a) + ε)), and the regularity of f is key to make the argument work.

Further Divide and Conquer Examples

Additional D&C Algorithms
Binary search:
- divide the sequence into two halves by comparing the search key to the midpoint
- recursively search in one of the two halves
- the combine step is empty (a sketch follows below)
Quicksort:
- divide the sequence into two parts by comparing the pivot to each key
- recursively sort the two parts
- the combine step is empty
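A minimal recursive binary search sketch in Python, following the divide/recurse/empty-combine structure above (the name binary_search is illustrative):

```python
def binary_search(a, key, lo=0, hi=None):
    """Return an index of key in the sorted list a, or -1 if key is absent."""
    if hi is None:
        hi = len(a)
    if lo >= hi:                                   # empty range: key is not present
        return -1
    mid = (lo + hi) // 2                           # divide: compare key to the midpoint
    if a[mid] == key:
        return mid
    if key < a[mid]:
        return binary_search(a, key, lo, mid)      # recurse on the left half
    return binary_search(a, key, mid + 1, hi)      # recurse on the right half

# binary_search([2, 4, 7, 9], 7) returns 2
```

The recurrence is T(n) = T(n/2) + Θ(1), so the Master theorem (case b with a = 1, b = 2) gives T(n) = Θ(log n).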

Additional D&C Applications
Computational geometry:
- finding the closest pair of points
- finding the convex hull
Mathematical calculations:
- converting binary to decimal
- integer multiplication
- matrix multiplication
- matrix inversion
- Fast Fourier Transform

Strassen’s Matrix Multiplication

Matrix Multiplication
Consider two n by n matrices A and B. By definition, A × B is the n by n matrix C whose (i,j)-th entry is computed like this:
- consider row i of A and column j of B
- multiply together the first entries of the row and column, the second entries, etc.
- then add up all the products
The number of scalar operations (multiplies and adds) in the straightforward algorithm is O(n^3). Can we do it faster? (A sketch of the straightforward algorithm follows below.)
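A minimal Python sketch of the straightforward algorithm (matrices as lists of rows; the name mat_mult is illustrative):

```python
def mat_mult(A, B):
    """Multiply two n x n matrices by the definition: Theta(n^3) scalar operations."""
    n = len(A)
    C = [[0] * n for _ in range(n)]
    for i in range(n):                 # row i of A
        for j in range(n):             # column j of B
            s = 0
            for k in range(n):         # sum of the pairwise products
                s += A[i][k] * B[k][j]
            C[i][j] = s
    return C

# mat_mult([[1, 2], [3, 4]], [[5, 6], [7, 8]]) == [[19, 22], [43, 50]]
```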

Divide-and-Conquer
Divide the matrices A and B into four submatrices each:
A = [ A_0  A_1 ; A_2  A_3 ],   B = [ B_0  B_1 ; B_2  B_3 ]
Then
C = A × B = [ A_0·B_0 + A_1·B_2   A_0·B_1 + A_1·B_3 ; A_2·B_0 + A_3·B_2   A_2·B_1 + A_3·B_3 ]
We have 8 smaller matrix multiplications and 4 additions. Is it faster?

Divide-and-Conquer
Let us investigate this recursive version of matrix multiplication. Since we divide A, B, and C into 4 submatrices each, we can compute the resulting matrix C by 8 matrix multiplications on the submatrices of A and B, plus Θ(n^2) scalar operations.

Divide-and-Conquer
The running time of the recursive version of the straightforward algorithm is
T(n) = 8T(n/2) + Θ(n^2), with T(2) = Θ(1),
where T(n) is the running time on an n x n matrix. The Master theorem gives us T(n) = Θ(n^3). Can we do fewer recursive calls (fewer multiplications of the n/2 x n/2 submatrices)?
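As a quick check of the case analysis (a sketch in the notation of the Master Theorem slide):

```latex
a = 8,\quad b = 2,\quad n^{\log_b a} = n^{\log_2 8} = n^3,\qquad
f(n) = \Theta(n^2) = O\!\left(n^{3-\varepsilon}\right)\ \text{for } \varepsilon = 1
\;\Longrightarrow\; \text{case a): } T(n) = \Theta(n^3).
```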

Strassen’s Matrix Multiplication
Write A = [ A_11  A_12 ; A_21  A_22 ], B = [ B_11  B_12 ; B_21  B_22 ], and C = A × B = [ C_11  C_12 ; C_21  C_22 ]. Compute
P_1 = (A_11 + A_22)(B_11 + B_22)
P_2 = (A_21 + A_22) · B_11
P_3 = A_11 · (B_12 - B_22)
P_4 = A_22 · (B_21 - B_11)
P_5 = (A_11 + A_12) · B_22
P_6 = (A_21 - A_11) · (B_11 + B_12)
P_7 = (A_12 - A_22) · (B_21 + B_22)
and then
C_11 = P_1 + P_4 - P_5 + P_7
C_12 = P_3 + P_5
C_21 = P_2 + P_4
C_22 = P_1 + P_3 - P_2 + P_6
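A hedged Python sketch of this scheme, using NumPy for the submatrix arithmetic and assuming n is a power of 2 (the name strassen and the cutoff parameter are illustrative, not from the slides):

```python
import numpy as np

def strassen(A, B, cutoff=64):
    """Multiply two n x n NumPy arrays with Strassen's 7-multiplication scheme.

    Assumes n is a power of 2; below the cutoff it falls back to ordinary
    multiplication, as practical implementations typically do.
    """
    n = A.shape[0]
    if n <= cutoff:
        return A @ B
    h = n // 2
    A11, A12, A21, A22 = A[:h, :h], A[:h, h:], A[h:, :h], A[h:, h:]
    B11, B12, B21, B22 = B[:h, :h], B[:h, h:], B[h:, :h], B[h:, h:]

    P1 = strassen(A11 + A22, B11 + B22, cutoff)
    P2 = strassen(A21 + A22, B11, cutoff)
    P3 = strassen(A11, B12 - B22, cutoff)
    P4 = strassen(A22, B21 - B11, cutoff)
    P5 = strassen(A11 + A12, B22, cutoff)
    P6 = strassen(A21 - A11, B11 + B12, cutoff)
    P7 = strassen(A12 - A22, B21 + B22, cutoff)

    C = np.empty_like(A)
    C[:h, :h] = P1 + P4 - P5 + P7     # C11
    C[:h, h:] = P3 + P5               # C12
    C[h:, :h] = P2 + P4               # C21
    C[h:, h:] = P1 + P3 - P2 + P6     # C22
    return C

# Sanity check against ordinary multiplication:
# A = np.random.rand(128, 128); B = np.random.rand(128, 128)
# assert np.allclose(strassen(A, B), A @ B)
```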

Strassen's Matrix Multiplication
Strassen found a way to get all the required information with only 7 matrix multiplications, instead of 8. The recurrence for the new algorithm is T(n) = 7T(n/2) + Θ(n^2).

Solving the Recurrence Relation
Apply the Master Theorem to T(n) = a T(n/b) + f(n) with a = 7, b = 2, and f(n) = Θ(n^2). Since f(n) = O(n^(log_b(a) - ε)) = O(n^(log_2(7) - ε)), case a) applies and we get T(n) = Θ(n^(log_b(a))) = Θ(n^(log_2(7))) = O(n^2.81).

Discussion of Strassen's Algorithm
Not always practical:
- the constant factor is larger than for the naïve method
- specially designed methods are better on sparse matrices
- issues of numerical (in)stability
- recursion uses lots of space
Not the fastest known method: asymptotically faster algorithms are known, while the best known lower bound is Ω(n^2).