By Lecturer: Aisha Dawood
A recurrence is a function defined in terms of one or more base cases, and itself with smaller arguments.
Technical issues: floors and ceilings, exact vs. asymptotic solutions, and boundary conditions. Example: T(n) = 2T(n/2) + Θ(n), with solution T(n) = Θ(n lg n). The boundary conditions are usually expressed as T(n) = O(1) for sufficiently small n. When we desire an exact rather than an asymptotic solution, we need to deal with boundary conditions. In practice, we just use asymptotics most of the time and ignore boundary conditions.
There are three methods for solving recurrences, i.e., for obtaining asymptotic bounds on the running time: the substitution method, the recursion tree method, and the master method.
Substitution method: 1. Guess the solution. 2. Use induction to find the constants and show that the solution works. Example: 1. Guess: T(n) = n lg n + n. 2. Induction: Base case: n = 1 ⇒ n lg n + n = 1 = T(n) (exact solution). Inductive step: the inductive hypothesis is that T(k) = k lg k + k for all k < n. We'll use this inductive hypothesis for T(n/2).
Carrying out the inductive step (sketched below) yields exactly T(n) = n lg n + n, so this is an exact solution.
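A sketch of the inductive step, assuming the recurrence being solved is T(n) = 2T(n/2) + n with T(1) = 1 (consistent with the base case and the guess above):

\begin{align*}
T(n) &= 2\,T(n/2) + n && \text{recurrence (assumed form)}\\
     &= 2\left(\tfrac{n}{2}\lg\tfrac{n}{2} + \tfrac{n}{2}\right) + n && \text{inductive hypothesis applied to } T(n/2)\\
     &= n(\lg n - 1) + n + n\\
     &= n\lg n + n.
\end{align*}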
Generally, we use asymptotic notation: we would write T(n) = 2T(n/2) + Θ(n), assume T(n) = O(1) for sufficiently small n, and express the solution in asymptotic notation: T(n) = Θ(n lg n). We don't worry about boundary cases, nor do we show base cases in the substitution proof. For the substitution method: name the constant in the additive term, and show the upper (O) and lower (Ω) bounds separately; we might need to use different constants for each.
Example: T(n) = 2T(n/2) + Θ(n). If we want to show an upper bound of T(n) = 2T(n/2) + O(n), we write T(n) ≤ 2T(n/2) + cn for some positive constant c. 1. Upper bound: Guess: T(n) ≤ dn lg n for some positive constant d. Substitution (sketched below):
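A sketch of the upper-bound substitution, applying the inductive hypothesis to T(n/2):

\begin{align*}
T(n) &\le 2\,T(n/2) + cn\\
     &\le 2\,d\,\tfrac{n}{2}\lg\tfrac{n}{2} + cn\\
     &= dn\lg n - dn + cn\\
     &\le dn\lg n \quad\text{as long as } d \ge c.
\end{align*}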
2. Lower bound: Write T(n) ≥ 2T(n/2) + cn for some positive constant c. Guess: T(n) ≥ dn lg n for some positive constant d. Substitution (sketched below):
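A sketch of the lower-bound substitution; here the constant d must be small enough rather than large enough:

\begin{align*}
T(n) &\ge 2\,T(n/2) + cn\\
     &\ge 2\,d\,\tfrac{n}{2}\lg\tfrac{n}{2} + cn\\
     &= dn\lg n - dn + cn\\
     &\ge dn\lg n \quad\text{as long as } d \le c.
\end{align*}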
There is no general way to guess the correct solution to a recurrence. Guessing a solution takes experience and, occasionally, creativity. There are some heuristics that can help you become a good guesser (e.g., the recursion tree). If a recurrence is similar to one you have seen before, then guessing a similar solution is reasonable. As an example, consider the recurrence T(n) = 2T(⌊n/2⌋ + 17) + n; because the additive 17 cannot substantially change the subproblem size for large n, we make the guess that T(n) = O(n lg n), just as for T(n) = 2T(⌊n/2⌋) + n (a quick numerical check is sketched below).
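A small numerical sanity check (not part of the slides) that tabulates this recurrence and compares it against n lg n; the base-case cutoff of 40 is an arbitrary assumption, chosen so the recursion bottoms out past the +17 shift:

# Numerical check for T(n) = 2*T(n//2 + 17) + n with T(n) = 1 for small n.
# If the ratio T(n) / (n * lg n) stays bounded, that supports the O(n lg n) guess.
from functools import lru_cache
from math import log2

@lru_cache(maxsize=None)
def T(n):
    if n <= 40:                      # assumed base case: constant cost for small n
        return 1
    return 2 * T(n // 2 + 17) + n

for n in (2**8, 2**12, 2**16, 2**20):
    print(n, T(n) / (n * log2(n)))   # ratios stay bounded, consistent with O(n lg n)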
Another way to make a good guess is to prove loose upper and lower bounds on the recurrence and then reduce the range of uncertainty. For example, we might start with a lower bound of T(n) = Ω(n) for the recurrence, since we have the term n in the recurrence, and we can prove an initial upper bound of T(n) = O(n^2). Then, we can gradually lower the upper bound and raise the lower bound until we converge on the correct, asymptotically tight solution of T(n) = Θ(n lg n).
Sometimes you guess at an asymptotic bound on the solution of a recurrence, but somehow the induction doesn't work. Often, revising the guess by subtracting a lower-order term permits the math to go through. Example (sketched below).
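One standard illustration of this trick (the example used in CLRS, assumed here) is the recurrence T(n) = T(⌊n/2⌋) + T(⌈n/2⌉) + 1 with the guess T(n) = O(n). Trying T(n) ≤ cn fails:

\[
T(n) \le c\lfloor n/2\rfloor + c\lceil n/2\rceil + 1 = cn + 1,
\]

which does not imply T(n) ≤ cn for any c. Subtracting a lower-order term and guessing T(n) ≤ cn - d instead makes the induction go through:

\[
T(n) \le \left(c\lfloor n/2\rfloor - d\right) + \left(c\lceil n/2\rceil - d\right) + 1 = cn - 2d + 1 \le cn - d \quad\text{for any } d \ge 1.
\]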
As an example, consider the recurrence T(n) = 2T(⌊√n⌋) + lg n. We can simplify this recurrence with a change of variables: renaming m = lg n (so that n = 2^m) gives T(2^m) = 2T(2^{m/2}) + m. We can now rename S(m) = T(2^m) to produce the new recurrence S(m) = 2S(m/2) + m. This new recurrence has the same solution as before: S(m) = O(m lg m). Changing back from S(m) to T(n), we obtain T(n) = T(2^m) = S(m) = O(m lg m) = O(lg n lg lg n).
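The change of variables step by step, ignoring floors for simplicity:

\begin{align*}
T(n) &= 2\,T(\sqrt{n}) + \lg n && \text{let } m = \lg n,\ \text{so } n = 2^m \text{ and } \sqrt{n} = 2^{m/2}\\
T(2^m) &= 2\,T(2^{m/2}) + m && \text{define } S(m) = T(2^m)\\
S(m) &= 2\,S(m/2) + m && \text{same shape as } T(n) = 2T(n/2) + n,\ \text{so } S(m) = O(m\lg m).
\end{align*}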
In a recursion tree, each node represents the cost of a single subproblem somewhere in the set of recursive function invocations. We sum the costs within each level of the tree to obtain a set of per-level costs, and then we sum all the per-level costs to determine the total cost of all levels of the recursion. A recursion tree is best used to generate a good guess, which is then verified by the substitution method.
For example, let us see how a recursion tree would provide a good guess for the recurrence T(n) = 3T(⌊n/4⌋) + Θ(n^2). We create a recursion tree from the recurrence T(n) = 3T(n/4) + cn^2, where c > 0 is a constant coefficient. We get the recursion tree shown on the next slide.
[Recursion tree for T(n) = 3T(n/4) + cn^2]
The subproblem size for a node at depth i is n/4^i. Thus, the subproblem size hits n = 1 when n/4^i = 1 or, equivalently, when i = log_4 n. Thus, the tree has log_4 n + 1 levels (depths 0, 1, 2, ..., log_4 n). Next we determine the cost at each level of the tree. Each level has three times more nodes than the level above, and so the number of nodes at depth i is 3^i. Because subproblem sizes reduce by a factor of 4 for each level we go down from the root, each node at depth i, for i = 0, 1, 2, ..., log_4 n - 1, has a cost of c(n/4^i)^2. Multiplying, we see that the total cost over all nodes at depth i, for i = 0, 1, 2, ..., log_4 n - 1, is 3^i c(n/4^i)^2 = (3/16)^i cn^2. The last level, at depth log_4 n, has 3^{log_4 n} = n^{log_4 3} nodes, each contributing cost T(1), for a total cost of n^{log_4 3} T(1), which is Θ(n^{log_4 3}).
Now we add up the costs over all levels to determine the cost for the entire tree, as an upper bound (sketched below).
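The standard calculation, bounding the finite geometric series by the corresponding infinite series:

\begin{align*}
T(n) &= \sum_{i=0}^{\log_4 n - 1}\left(\frac{3}{16}\right)^{i} cn^2 + \Theta\!\left(n^{\log_4 3}\right)
      < \sum_{i=0}^{\infty}\left(\frac{3}{16}\right)^{i} cn^2 + \Theta\!\left(n^{\log_4 3}\right)\\
     &= \frac{1}{1 - 3/16}\,cn^2 + \Theta\!\left(n^{\log_4 3}\right)
      = \frac{16}{13}\,cn^2 + \Theta\!\left(n^{\log_4 3}\right)
      = O(n^2).
\end{align*}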
Now we can use the substitution method to verify that our guess was correct, that is, T(n) = O(n^2) is an upper bound for the recurrence T(n) = 3T(⌊n/4⌋) + Θ(n^2). We want to show that T(n) ≤ dn^2 for some constant d > 0, using the same constant c > 0 as before:

T(n) ≤ 3T(⌊n/4⌋) + cn^2
     ≤ 3d⌊n/4⌋^2 + cn^2
     ≤ 3d(n/4)^2 + cn^2
     = (3/16)dn^2 + cn^2
     ≤ dn^2,

where the last step holds as long as d ≥ (16/13)c.
Another example: the recursion tree for T(n) = T(n/3) + T(2n/3) + O(n).
The longest path from the root to a leaf is n → (2/3)n → (2/3)^2 n → ··· → 1. Since (2/3)^k n = 1 when k = log_{3/2} n, the height of the tree is log_{3/2} n. Each level contributes cost ≤ cn, so intuitively we expect the solution of the recurrence to be at most O(cn log_{3/2} n) = O(n lg n). Lower bound guess: T(n) ≥ dn log_3 n = Ω(n lg n) for some positive constant d. Upper bound guess: T(n) ≤ dn log_{3/2} n = O(n lg n) for some positive constant d. Then prove both by substitution (the upper bound is sketched below).
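A sketch of the upper-bound substitution, writing the recurrence as T(n) ≤ T(n/3) + T(2n/3) + cn and guessing T(n) ≤ dn lg n (the lower bound is analogous):

\begin{align*}
T(n) &\le T(n/3) + T(2n/3) + cn\\
     &\le d\,\tfrac{n}{3}\lg\tfrac{n}{3} + d\,\tfrac{2n}{3}\lg\tfrac{2n}{3} + cn\\
     &= dn\lg n - dn\!\left(\lg 3 - \tfrac{2}{3}\right) + cn\\
     &\le dn\lg n \quad\text{as long as } d \ge \frac{c}{\lg 3 - 2/3}.
\end{align*}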
The master method provides a "cookbook" method for solving recurrences of the form T(n) = aT(n/b) + f(n), where a ≥ 1 and b > 1 are constants and f(n) is an asymptotically positive function. The master method requires memorization of three cases, but then the solution of many recurrences can be determined quite easily, often without paper and pencil.
Master theorem: compare f(n) with n^{log_b a} (the three cases are sketched below).
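The standard statement of the theorem (as given in CLRS), with ε and the case-3 regularity condition spelled out:

\[
T(n) =
\begin{cases}
\Theta\!\left(n^{\log_b a}\right) & \text{if } f(n) = O\!\left(n^{\log_b a - \epsilon}\right) \text{ for some constant } \epsilon > 0,\\
\Theta\!\left(n^{\log_b a}\lg n\right) & \text{if } f(n) = \Theta\!\left(n^{\log_b a}\right),\\
\Theta\!\left(f(n)\right) & \text{if } f(n) = \Omega\!\left(n^{\log_b a + \epsilon}\right) \text{ for some constant } \epsilon > 0,\\
 & \text{and } a f(n/b) \le c f(n) \text{ for some constant } c < 1 \text{ and all sufficiently large } n.
\end{cases}
\]

For example, T(n) = 3T(n/4) + Θ(n^2) from the recursion-tree slides has a = 3, b = 4, and n^{log_4 3} = O(n^{0.8}), so f(n) = Θ(n^2) falls under case 3 (the regularity condition holds with c = 3/16), giving T(n) = Θ(n^2) and matching the earlier guess.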