1
Divide and Conquer
2
Recall
1. Divide the problem into a number of sub-problems that are smaller instances of the same problem.
2. Conquer the sub-problems by solving them recursively. If the sub-problem sizes are small enough, just solve them in a straightforward manner.
3. Combine the solutions to the sub-problems into the solution for the original problem.
3
Recurrences
A recurrence is an equation or inequality that describes a function in terms of its value on smaller inputs. For example, we define the running time of MERGE-SORT by a recurrence.
We will study three methods for solving recurrences, that is, for obtaining asymptotic Θ or O bounds on the solution:
1. Substitution method
2. Recursion tree method
3. Master method
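For reference, the standard merge-sort recurrence and its solution (a sketch of the kind of bound the three methods below establish):

```latex
T(n) =
\begin{cases}
\Theta(1) & \text{if } n = 1,\\
2\,T(n/2) + \Theta(n) & \text{if } n > 1,
\end{cases}
\qquad\text{which solves to } T(n) = \Theta(n \lg n).
```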
4
Strassen’s algorithm for matrix multiplication
If A = (a_ij) and B = (b_ij) are square n x n matrices, then in the product C = A x B we define the entries c_ij, for i, j = 1, 2, …, n, by
c_ij = Σ_{k=1}^{n} a_ik · b_kj
We must compute n^2 matrix entries, and each is the sum of n values.
5
Matrix multiplication procedure
For two square n x n matrices, the following straightforward procedure computes their product directly from the definition, using three nested loops.
Time complexity: Θ(n^3), since each of the n^2 entries of C is a sum of n terms.
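A sketch of such a procedure in Python (the function name and the list-of-lists representation are illustrative, not taken from the slides):

```python
def square_matrix_multiply(A, B):
    """Compute C = A * B for two n x n matrices given as lists of lists, in Theta(n^3) time."""
    n = len(A)
    C = [[0] * n for _ in range(n)]
    for i in range(n):            # for each row of C
        for j in range(n):        # for each column of C
            for k in range(n):    # c_ij is the sum of n products a_ik * b_kj
                C[i][j] += A[i][k] * B[k][j]
    return C

# Example: multiply two 2 x 2 matrices.
print(square_matrix_multiply([[1, 2], [3, 4]], [[5, 6], [7, 8]]))   # [[19, 22], [43, 50]]
```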
6
A simple divide-and-conquer algorithm
Partition A, B, and C each into four n/2 x n/2 submatrices (assume n is an exact power of 2), so that
C_11 = A_11·B_11 + A_12·B_21    C_12 = A_11·B_12 + A_12·B_22
C_21 = A_21·B_11 + A_22·B_21    C_22 = A_21·B_12 + A_22·B_22
Each of these four equations specifies two multiplications of n/2 x n/2 matrices plus the addition of their products.
7
Recursive algorithm for matrix product
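A Python sketch of the recursive procedure (the function name mirrors SQUARE-MATRIX-MULTIPLY-RECURSIVE referred to in the analysis below; NumPy slicing is used for the partition, and n is assumed to be an exact power of 2):

```python
import numpy as np

def square_matrix_multiply_recursive(A, B):
    """Recursive divide-and-conquer product of two n x n NumPy arrays, n an exact power of 2."""
    n = A.shape[0]
    if n == 1:                                   # base case: 1 x 1 matrices
        return A * B
    m = n // 2
    A11, A12, A21, A22 = A[:m, :m], A[:m, m:], A[m:, :m], A[m:, m:]
    B11, B12, B21, B22 = B[:m, :m], B[:m, m:], B[m:, :m], B[m:, m:]
    C = np.empty((n, n), dtype=A.dtype)
    # Eight recursive multiplications and four matrix additions.
    C[:m, :m] = square_matrix_multiply_recursive(A11, B11) + square_matrix_multiply_recursive(A12, B21)
    C[:m, m:] = square_matrix_multiply_recursive(A11, B12) + square_matrix_multiply_recursive(A12, B22)
    C[m:, :m] = square_matrix_multiply_recursive(A21, B11) + square_matrix_multiply_recursive(A22, B21)
    C[m:, m:] = square_matrix_multiply_recursive(A21, B12) + square_matrix_multiply_recursive(A22, B22)
    return C

A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])
print(square_matrix_multiply_recursive(A, B))    # [[19 22] [43 50]]
```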
8
Analysis
T(1) = Θ(1).
The procedure calls SQUARE-MATRIX-MULTIPLY-RECURSIVE eight times on n/2 x n/2 matrices, so the time taken by all eight recursive calls is 8T(n/2).
We must also account for the four matrix additions. Each of the matrices being added contains n^2/4 entries, so the four additions take Θ(n^2) time in total.
This gives the recurrence T(n) = 8T(n/2) + Θ(n^2) for n > 1, whose solution is T(n) = Θ(n^3). The simple divide-and-conquer approach is therefore no faster than the direct method.
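One way to see the Θ(n^3) bound, as a sketch that unrolls the recurrence level by level (writing the Θ(n^2) term as cn^2); the recursion-tree and master methods later in these slides make this rigorous:

```latex
T(n) = 8\,T(n/2) + c n^2
     = \sum_{i=0}^{\lg n - 1} 8^i \, c\!\left(\frac{n}{2^i}\right)^{\!2} + 8^{\lg n}\,\Theta(1)
     = c n^2 \sum_{i=0}^{\lg n - 1} 2^i + \Theta(n^3)
     = c n^2 (n - 1) + \Theta(n^3) = \Theta(n^3).
```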
9
Strassen’s method
The key idea is to make the recursion tree slightly less bushy: instead of performing eight recursive multiplications of n/2 x n/2 matrices, perform only seven, at the cost of a constant number of extra n/2 x n/2 additions and subtractions. The recurrence becomes T(n) = 7T(n/2) + Θ(n^2), whose solution is T(n) = Θ(n^lg 7) = O(n^2.81).
10
Algorithm
11
Algorithm (1): Step 3 and Step 4
12
Algorithm (2)
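A Python sketch of the four steps (assuming NumPy arrays and n an exact power of 2; the names S1, …, S10 and P1, …, P7 follow the usual textbook presentation of Strassen's method):

```python
import numpy as np

def strassen(A, B):
    """Strassen's matrix product for n x n NumPy arrays, n an exact power of 2."""
    n = A.shape[0]
    if n == 1:
        return A * B
    m = n // 2
    # Step 1: partition A and B into n/2 x n/2 submatrices.
    A11, A12, A21, A22 = A[:m, :m], A[:m, m:], A[m:, :m], A[m:, m:]
    B11, B12, B21, B22 = B[:m, :m], B[:m, m:], B[m:, :m], B[m:, m:]
    # Step 2: ten sums/differences, Theta(n^2) time.
    S1, S2, S3, S4, S5 = B12 - B22, A11 + A12, A21 + A22, B21 - B11, A11 + A22
    S6, S7, S8, S9, S10 = B11 + B22, A12 - A22, B21 + B22, A11 - A21, B11 + B12
    # Step 3: seven recursive multiplications instead of eight.
    P1, P2, P3, P4 = strassen(A11, S1), strassen(S2, B22), strassen(S3, B11), strassen(A22, S4)
    P5, P6, P7 = strassen(S5, S6), strassen(S7, S8), strassen(S9, S10)
    # Step 4: combine the products into the quadrants of C, Theta(n^2) time.
    C = np.empty((n, n), dtype=A.dtype)
    C[:m, :m] = P5 + P4 - P2 + P6
    C[:m, m:] = P1 + P2
    C[m:, :m] = P3 + P4
    C[m:, m:] = P5 + P1 - P3 - P7
    return C

A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])
print(strassen(A, B))            # [[19 22] [43 50]]
```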
13
Solving Recurrences
We have three methods to solve recurrence equations:
1. Substitution method
2. Recursion tree method
3. Master method
14
Substitution Method
The substitution method for solving recurrences comprises two steps:
1. Guess the form of the solution.
2. Use mathematical induction to find the constants and show that the solution works.
15
Example
Consider the recurrence T(n) = 2T(⌊n/2⌋) + n.
We guess that the solution is T(n) = O(n lg n).
The substitution method requires us to prove that T(n) ≤ cn lg n for an appropriate choice of constant c > 0.
We start by assuming that this bound holds for all positive m < n, in particular for m = ⌊n/2⌋, which gives T(⌊n/2⌋) ≤ c⌊n/2⌋ lg(⌊n/2⌋).
Substituting this into the recurrence yields
T(n) ≤ 2(c⌊n/2⌋ lg(⌊n/2⌋)) + n ≤ cn lg(n/2) + n = cn lg n − cn lg 2 + n = cn lg n − cn + n ≤ cn lg n,
where the last step holds as long as c ≥ 1.
16
Example - continued
Now we need to show that this solution holds for the boundary conditions.
Let us assume, for the sake of argument, that T(1) = 1.
For n = 1 the bound T(n) ≤ cn lg n gives T(1) ≤ c·1·lg 1 = 0, which contradicts T(1) = 1, so the induction cannot start at n = 1.
However, asymptotic notation only requires us to prove the bound for all n ≥ n_0, where n_0 is a constant we get to choose. For n > 3 the recurrence does not depend directly on T(1), so we can draw a distinction between the base case of the recurrence (n = 1) and the base cases of the induction (n = 2 and n = 3).
We derive from the recurrence that T(2) = 4 and T(3) = 5.
We can then complete the inductive proof that T(n) ≤ cn lg n for some constant c ≥ 1 by choosing c large enough so that T(2) ≤ c·2·lg 2 and T(3) ≤ c·3·lg 3; any c ≥ 2 works.
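A quick numerical check of these base cases, assuming the recurrence T(n) = 2T(⌊n/2⌋) + n with T(1) = 1 as above:

```python
import math

def T(n):
    """T(n) = 2*T(floor(n/2)) + n with T(1) = 1, the recurrence from the example."""
    return 1 if n == 1 else 2 * T(n // 2) + n

c = 2
print(T(2), T(3))                                # 4 5
for n in (2, 3):
    print(n, T(n) <= c * n * math.log2(n))       # True True: c = 2 covers both induction base cases
```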
17
Another Example
Make sure you show the exact same form of the inductive hypothesis when doing a substitution proof.
Consider the recurrence T(n) = 8T(n/2) + Θ(n^2).
For an upper bound: T(n) ≤ 8T(n/2) + cn^2.
Guess: T(n) ≤ dn^3.
T(n) ≤ 8d(n/2)^3 + cn^2 = 8d(n^3/8) + cn^2 = dn^3 + cn^2, which is not ≤ dn^3. The guess doesn't work in this exact form!
18
Another Example - continued
Remedy: subtract off a lower-order term.
Guess: T(n) ≤ dn^3 − d'n^2.
T(n) ≤ 8(d(n/2)^3 − d'(n/2)^2) + cn^2
= 8d(n^3/8) − 8d'(n^2/4) + cn^2
= dn^3 − 2d'n^2 + cn^2
= dn^3 − d'n^2 − d'n^2 + cn^2
≤ dn^3 − d'n^2 if −d'n^2 + cn^2 ≤ 0, i.e., d' ≥ c.
19
Yet another example
T(n) = cn + 3T(2n/3).
How about F(n) = n lg n? Substituting kF(n) on the right-hand side:
cn + 3kF(2n/3) = cn + 3k(2n/3) lg(2n/3) = cn + 2kn lg n + 2kn lg(2/3) = cn + 2kn lg n − 2kn lg(3/2).
For the guess to work this would have to be at most kn lg n, but the 2kn lg n term alone already exceeds kn lg n for large n; there is no way to choose k to make the left-hand side (kn lg n) larger.
Therefore n lg n is not the correct guess; it grows too slowly.
20
Yet another example - continued
Try a higher order of growth, like n^2 or n^3, but which one? Maybe n^x.
We can solve for the correct exponent x by plugging kn^x in for T on the right-hand side:
cn + 3k(2n/3)^x = cn + 3(2/3)^x kn^x.
This will be asymptotically at most kn^x as long as 3(2/3)^x < 1, which requires x > log_{3/2} 3.
Let a = log_{3/2} 3; then our algorithm is O(n^{a+ε}) for any positive ε.
Let's try O(n^a) itself. The RHS after substituting kn^a is
cn + 3(2/3)^a kn^a = cn + kn^a ≥ kn^a,
since 3(2/3)^a = 1. This tells us that kn^a is an asymptotic lower bound on T(n): T(n) is Ω(n^a).
So the complexity is somewhere between Ω(n^a) and O(n^{a+ε}). It is in fact Θ(n^a).
To show the upper bound, we will try F(n) = n^a + bn, where b is a constant to be filled in later.
21
Yet another example - continued
The idea is to pick b so that bn will compensate for the cn term that shows up in the recurrence. Because bn is O(n^a), showing that T(n) is O(n^a + bn) is the same as showing that it is O(n^a).
Substituting kF(n) for T(n) in the RHS of the recurrence, we obtain:
cn + 3kF(2n/3) = cn + 3k((2n/3)^a + b(2n/3)) = cn + 3k(2n/3)^a + 3kb(2n/3) = cn + kn^a + 2kbn = kn^a + (2kb + c)n.
The substituted LHS of the recurrence is kn^a + kbn, which is at least as large as kn^a + (2kb + c)n as long as kb ≥ 2kb + c, i.e., b ≤ −c/k.
There is no requirement that b be positive, so choosing k = 1 and b = −c satisfies the recurrence.
Therefore T(n) = O(n^a + bn) = O(n^a), and since T(n) is both O(n^a) and Ω(n^a), it is Θ(n^a).
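A small numerical sanity check of this conclusion; the constant c = 1 and the base case T(n) = 1 for n ≤ 1 are illustrative choices, not from the slides. If T(n) = Θ(n^a) with a = log_{3/2} 3, the ratio T(n)/n^a should stay within constant bounds as n grows:

```python
import math

A = math.log(3, 1.5)       # a = log_{3/2} 3, roughly 2.7095
C = 1.0                    # illustrative constant in T(n) = c*n + 3*T(2n/3)

def T(n):
    """Evaluate the recurrence T(n) = c*n + 3*T(2n/3), with T(n) = 1 for n <= 1."""
    if n <= 1:
        return 1.0
    return C * n + 3 * T(2 * n / 3)

for n in (10, 100, 1_000, 10_000, 100_000):
    print(f"n = {n:>6}   T(n)/n^a = {T(n) / n ** A:.3f}")   # stays within constant bounds, consistent with Theta(n^a)
```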
22
Substitution method - warning
Be careful when using asymptotic notation inside an induction. Here is a false proof that the recurrence T(n) = 4T(n/4) + n has solution T(n) = O(n):
T(n) ≤ 4(c(n/4)) + n = cn + n = O(n). Wrong!
Because we haven't proven the exact form of our inductive hypothesis (which is that T(n) ≤ cn), this proof is invalid: cn + n is not ≤ cn. (In fact, T(n) = Θ(n lg n).)
23
Recursion tree method
Used to generate a guess, which is then verified by the substitution method.
Consider T(n) = T(n/3) + T(2n/3) + Θ(n).
For an upper bound, rewrite it as T(n) ≤ T(n/3) + T(2n/3) + cn; for a lower bound, as T(n) ≥ T(n/3) + T(2n/3) + cn.
By summing across each level, the recursion tree shows the cost at each level of the recursion (excluding the costs of recursive calls, which appear in the subtrees below):
24
Recursion tree method
25
There are log_3 n full levels, and after log_{3/2} n levels the problem size is down to 1.
Each level contributes ≤ cn.
Lower-bound guess: T(n) ≥ dn log_3 n = Ω(n lg n) for some positive constant d.
Upper-bound guess: T(n) ≤ dn log_{3/2} n = O(n lg n) for some positive constant d.
Then prove by substitution.
26
Recursion tree method
Upper bound. Guess: T(n) ≤ dn lg n.
Substitution:
T(n) ≤ T(n/3) + T(2n/3) + cn
≤ d(n/3) lg(n/3) + d(2n/3) lg(2n/3) + cn
= (d(n/3) lg n − d(n/3) lg 3) + (d(2n/3) lg n − d(2n/3) lg(3/2)) + cn
= dn lg n − d((n/3) lg 3 + (2n/3) lg(3/2)) + cn
= dn lg n − d((n/3) lg 3 + (2n/3) lg 3 − (2n/3) lg 2) + cn
= dn lg n − dn(lg 3 − 2/3) + cn
≤ dn lg n if −dn(lg 3 − 2/3) + cn ≤ 0, i.e., d ≥ c/(lg 3 − 2/3).
Lower bound?
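A sketch of the lower bound asked for above, reusing the same algebra with the inequalities reversed (guess T(n) ≥ dn lg n for a suitable d > 0):

```latex
T(n) \ge T(n/3) + T(2n/3) + cn
     \ge d(n/3)\lg(n/3) + d(2n/3)\lg(2n/3) + cn
     = dn\lg n - dn(\lg 3 - 2/3) + cn
     \ge dn\lg n \quad\text{if } cn - dn(\lg 3 - 2/3) \ge 0,\ \text{i.e., } d \le \frac{c}{\lg 3 - 2/3}.
```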
27
Another example
T(n) = 3T(n/4) + cn^2.
28
Another example
29
The sub-problem size for a node at depth i is n/4^i.
Thus, the sub-problem size hits n = 1 when n/4^i = 1 or, equivalently, when i = log_4 n.
Thus, the tree has log_4 n + 1 levels (at depths 0, 1, 2, …, log_4 n).
The number of nodes at depth i is 3^i.
Each node at depth i, for i = 0, 1, 2, …, log_4 n − 1, has a cost of c(n/4^i)^2.
So the total cost at depth i is 3^i c(n/4^i)^2 = (3/16)^i cn^2.
The bottom level, at depth log_4 n, has 3^{log_4 n} = n^{log_4 3} nodes, each with cost T(1), for a total cost of Θ(n^{log_4 3}).
30
Another example
Summing the costs over all levels and taking advantage of a decreasing geometric series:
T(n) = Σ_{i=0}^{log_4 n − 1} (3/16)^i cn^2 + Θ(n^{log_4 3})
< Σ_{i=0}^{∞} (3/16)^i cn^2 + Θ(n^{log_4 3})
= (16/13) cn^2 + Θ(n^{log_4 3})
= O(n^2).
This gives the guess T(n) = O(n^2).
31
Another example
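A sketch of the substitution step that verifies the O(n^2) guess, assuming the upper-bound form T(n) ≤ 3T(n/4) + cn^2 and the guess T(n) ≤ dn^2:

```latex
T(n) \le 3T(n/4) + cn^2
     \le 3d(n/4)^2 + cn^2
     = \tfrac{3}{16}dn^2 + cn^2
     \le dn^2 \quad\text{as long as } d \ge \tfrac{16}{13}c.
```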
32
Master Method
Used for many divide-and-conquer recurrences of the form
T(n) = aT(n/b) + f(n), where a ≥ 1, b > 1, and f(n) is asymptotically positive.
Based on the master theorem.
33
Master Method
Master theorem: let a ≥ 1 and b > 1 be constants, let f(n) be asymptotically positive, and let T(n) be defined by T(n) = aT(n/b) + f(n). Then:
1. If f(n) = O(n^{log_b a − ε}) for some constant ε > 0, then T(n) = Θ(n^{log_b a}).
2. If f(n) = Θ(n^{log_b a}), then T(n) = Θ(n^{log_b a} lg n).
3. If f(n) = Ω(n^{log_b a + ε}) for some constant ε > 0, and if a·f(n/b) ≤ c·f(n) for some constant c < 1 and all sufficiently large n, then T(n) = Θ(f(n)).
34
Examples
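A small Python sketch that applies the theorem when the driving function is a plain polynomial, f(n) = Θ(n^k) (for such f the case-3 regularity condition holds automatically); it is applied here to the recurrences from earlier slides:

```python
import math

def master_bound(a, b, k):
    """Simplified master theorem for T(n) = a*T(n/b) + Theta(n^k)."""
    p = math.log(a, b)                        # critical exponent log_b a
    if k < p:
        return f"Theta(n^{p:.3f})"            # case 1: leaves dominate
    if k == p:
        return f"Theta(n^{k} * lg n)"         # case 2: every level contributes equally
    return f"Theta(n^{k})"                    # case 3: root dominates

print(master_bound(8, 2, 2))   # 8T(n/2) + Theta(n^2): Theta(n^3)
print(master_bound(7, 2, 2))   # Strassen, 7T(n/2) + Theta(n^2): Theta(n^2.807)
print(master_bound(2, 2, 1))   # merge sort, 2T(n/2) + Theta(n): Theta(n lg n)
print(master_bound(3, 4, 2))   # 3T(n/4) + Theta(n^2): Theta(n^2)
```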