A little bit of recurrences, from CLRS. Dr. M. Sakalli, Marmara University. Picture: May 5th 2006, RPI, NY
2-1 M, Sakalli, CS246 Design & Analysis of Algorithms, Lecture Notes
Plan for Analysis of Recursive Algorithms
o Decide on a parameter indicating an input's size.
o Identify the algorithm's basic operation.
o Check whether the number of times the basic operation is executed may vary on different inputs of the same size. (If it may, the worst, average, and best cases must be investigated separately.)
o Set up a recurrence relation, with an appropriate initial condition, expressing the number of times the basic operation is executed.
o Solve the recurrence (or, at the very least, establish its solution's order of growth) by backward substitution or another method.
2-2 Three methods for solving recurrences, i.e. obtaining Θ or big-O bounds on T(n):
1. The substitution method: make a guess, then prove the guess correct by induction.
2. The recursion-tree method: convert the recurrence into a tree of costs and sum them.
3. The master method: gives bounds for recurrences of the form T(n) = aT(n/b) + f(n), where a ≥ 1, b > 1, and f(n) is a given positive function.
2-3 Substitution method, two steps:
- Make a guess for the solution.
- Use induction to prove that the guess holds; check the valid initial conditions.
To determine an upper bound:
1. Choose a constant c > 0.
2. Prove that f(n) ≤ c g(n) for every n ≥ n0; here, T(n) ≤ c n lg(n).   (2)
Then f(n) = O(g(n)): we say "f(n) is big-O of g(n)." As n increases, f(n) grows no faster than g(n); in other words, g(n) is an asymptotic upper bound on f(n) for some c > 0 and all n ≥ n0.
2-4 Big-Ω notation. To determine a lower bound:
1. Choose a constant c > 0.
2. Prove that c g(n) ≤ f(n) for all n ≥ n0; here, c n lg(n) ≤ T(n).   (2)
Then f(n) = Ω(g(n)): we say "f(n) is omega of g(n)." As n increases, f(n) grows no slower than g(n); in other words, g(n) is an asymptotic lower bound on f(n).
2-5 Example: prove n^2 + n = O(n^3); the question could also have been phrased as: prove that f(n) = n^2 + n is big-O of g(n) = n^3.
We need f(n) ≤ c g(n) for all n > n0. For n0 = 1 and c ≥ 2, n^2 + n ≤ 2 n^3 holds for all n > n0, which completes the proof. Alternatively we could have chosen n0 = 2 and c ≥ 1.
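As a quick numeric sanity check of the chosen constants (not a substitute for the proof), the inequality can be tested in Python:

```python
# Check n^2 + n <= c * n^3 for the constants from the slide (c = 2, n0 = 1)
def holds(n, c=2):
    return n**2 + n <= c * n**3

assert all(holds(n) for n in range(1, 10_000))
```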
2-6 The same example: determine an upper bound for T(n) = 2T(⌊n/2⌋) + n.   (1)
The guess made is g(n) = n lg(n): show T(n) ≤ c n lg(n) for a suitable c > 0 and all n ≥ n0.   (2)
Start by assuming that the bound holds for ⌊n/2⌋, i.e. T(⌊n/2⌋) ≤ c ⌊n/2⌋ lg(⌊n/2⌋). Substitute this into (1):
T(n) ≤ 2[c ⌊n/2⌋ lg(⌊n/2⌋)] + n
     ≤ c n lg(n/2) + n
     = c n lg(n) - c n lg(2) + n
     = c n lg(n) - c n + n
     ≤ c n lg(n), which holds whenever c ≥ 1.   (2)
- To complete the inductive proof, determine c and n0 and show the inequality holds for the boundary conditions.
- T(1) > 0 but lg(1) = 0, so the base case n = 1 does not satisfy (2).
- Instead push the base cases up: from (1), T(2) = 2T(1) + 2 ≤ c·2 lg(2) and T(3) = 2T(1) + 3 ≤ c·3 lg(3) must hold.
- Determining c requires establishing the base cases in terms of T(1): T(1) + 1 ≤ c and (2T(1) + 3)/(3 lg 3) ≤ c.
- Choose c as the maximum of {T(1) + 1, (2T(1) + 3)/(3 lg 3)}, which is T(1) + 1; for T(1) = 1 this gives c = 2.
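The guess and the constants above can be checked numerically (the induction is still what proves them):

```python
import math
from functools import lru_cache

# T(n) = 2*T(floor(n/2)) + n with T(1) = 1, checked against the bound
# c * n * lg(n) for c = 2 and all n >= n0 = 2
@lru_cache(maxsize=None)
def T(n):
    return 1 if n == 1 else 2 * T(n // 2) + n

assert all(T(n) <= 2 * n * math.log2(n) for n in range(2, 5000))
```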
2-7 Sometimes a change of variables can make an unknown recurrence look like a familiar equation. For example, consider the recurrence T(n) = 2T(⌊√n⌋) + lg n.
Let m = lg(n), so n = 2^m: T(2^m) = 2T(2^(m/2)) + m.
Now let S(m) = T(2^m): S(m) = 2S(m/2) + m.
Solution: S(m) = O(m lg m). Changing back to T(n): T(n) = T(2^m) = S(m) = O(lg n · lg lg n).
2-8 Example 1, substitution method: recursive evaluation of n!
Definition: n! = 1 · 2 · … · (n-1) · n for n ≥ 1, and 0! = 1.
Recursive definition of n!: F(n) = F(n-1) · n for n ≥ 1, F(0) = 1.
Input size: n. Basic operation: multiplication.
Recurrence relation: M(n) = M(n-1) + 1 for n > 0, M(0) = 0.
Backward substitution: M(n) = M(n-2) + 1 + 1 = … = M(n-i) + i; at i = n, M(n) = n.
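A small sketch of the same idea: compute F(n) recursively while counting the basic operation, confirming M(n) = n.

```python
# Recursive n! that also counts multiplications, confirming M(n) = n
def fact(n):
    if n == 0:
        return 1, 0          # F(0) = 1, M(0) = 0
    f, m = fact(n - 1)
    return f * n, m + 1      # one extra multiplication per level

value, mults = fact(10)
assert value == 3628800 and mults == 10
```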
2-9 Example 3: counting the number of bits.
A(2^k) = A(2^(k-1)) + 1, A(2^0) = 0.
Backward substitution: A(2^k) = A(2^(k-1)) + 1 = [A(2^(k-2)) + 1] + 1 = … = A(2^(k-k)) + k = k.
With n = 2^k, k = log2(n), so A(n) = Θ(log(n)).
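On general n the same recurrence reads A(n) = A(⌊n/2⌋) + 1 with A(1) = 0, and its value is exactly ⌊log2 n⌋, as a quick check confirms:

```python
import math

# A(n) = A(n // 2) + 1 with A(1) = 0: the number of halvings down to 1,
# which equals floor(log2(n))
def A(n):
    return 0 if n == 1 else A(n // 2) + 1

assert all(A(n) == math.floor(math.log2(n)) for n in range(1, 4096))
```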
2-10 The recursion-tree method
In a recursion tree, each node represents the cost of one recursive call of the algorithm somewhere in its recursive invocation, ignoring the cost of further recursive calls made from within it; those are accounted for by the "child" costs of that node. The node costs are summed to get the total cost at each level, and finally the per-level costs are summed.
Recursion trees are especially useful for the T(n) of a divide-and-conquer algorithm. They are used to generate good guesses, which can then be verified by the substitution method; hence a bit of "sloppiness" can be tolerated.
Find an upper bound for T(n) = 3T(⌊n/4⌋) + Θ(n^2). Ignore the floor function and non-integer values of n/4, so that T(n) = 3T(n/4) + cn^2 for some c > 0.

          cn^2
        /  |  \
  T(n/4) T(n/4) T(n/4)

Continue this process until reaching the boundary, a subproblem of size 1 at depth i: n/4^i = 1, so i = log_4(n) = (lg n)/2. Thus the tree has log_4(n) + 1 levels: 0, 1, 2, …, log_4(n). The non-recursive cost per node decreases by a factor of 16 from level to level while the number of nodes triples: at level i there are 3^i nodes of cost c(n/4^i)^2 each, for a per-level cost of 3^i · c(n/4^i)^2 = (3/16)^i cn^2.
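The per-level costs form a geometric series, which can be summed numerically as a sanity check (here with illustrative values c = 1 and n = 4^10):

```python
# Per-level costs of the recursion tree for T(n) = 3T(n/4) + c*n^2:
# level i contributes (3/16)^i * c * n^2; the series sums to
# less than 1/(1 - 3/16) * c * n^2 = 16/13 * c * n^2
c, n = 1.0, 4 ** 10
levels = 10                      # log_4(n) for n = 4^10
level_costs = [(3 / 16) ** i * c * n * n for i in range(levels)]
assert sum(level_costs) < (16 / 13) * c * n * n
```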
2-11 At the last level, of depth log_4(n), there are 3^(lg(n)/2) = 3^(log_4 n) = n^(log_4 3) nodes, each of cost T(1), for a total cost of Θ(n^(log_4 3)). Thus the total cost over all levels is:

T(n) = Σ_{i=0}^{log_4(n)-1} (3/16)^i cn^2 + Θ(n^(log_4 3))
     < Σ_{i=0}^{∞} (3/16)^i cn^2 + Θ(n^(log_4 3))
     = (1/(1 - 3/16)) cn^2 + Θ(n^(log_4 3))
     = (16/13) cn^2 + Θ(n^(log_4 3))
     = O(n^2),

which gives the guess that would have been chosen at first hand. Now use the substitution method to verify the guess T(n) = O(n^2) as an upper bound for T(n) = 3T(⌊n/4⌋) + Θ(n^2). We need to show T(n) ≤ dn^2 for some d > 0. Letting c be as before:

T(n) ≤ 3T(⌊n/4⌋) + cn^2
     ≤ 3d(⌊n/4⌋)^2 + cn^2   (by the inductive hypothesis)
     ≤ 3d(n/4)^2 + cn^2
     = (3/16)dn^2 + cn^2
     ≤ dn^2,   choosing d ≥ (16/13)c.
2-12 For exact powers of b, iterating the recurrence gives:
T(n) = Θ(n^(log_b a)) + Σ_{j=0}^{log_b(n)-1} a^j f(n/b^j)
2-13 The master theorem
Let a ≥ 1 and b > 1, let f(n) be asymptotically positive, and let T(n) be defined by T(n) = aT(n/b) + f(n), where n/b can be interpreted as rounded either up or down. Then T(n) can be bounded asymptotically:
1. If f(n) = O(n^(log_b(a) - ε)) for some ε > 0, then T(n) = Θ(n^(log_b a)).
2. If f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) lg n).
3. If f(n) = Ω(n^(log_b(a) + ε)) for some ε > 0, and if the regularity condition af(n/b) ≤ cf(n) holds for some c < 1 and all n ≥ n0 for some n0 > 0, then T(n) = Θ(f(n)).
The cases leave gaps between 1-2 and 2-3; when f(n), the driving cost, is not polynomially separated from n^(log_b a), a limited fourth condition covers polylog factors.
Corollary: if f(n) = Θ(n^(log_b a) lg^k n) for some k ≥ 0, then T(n) = Θ(n^(log_b a) lg^(k+1) n).
For example, T(n) = 2T(n/2) + n lg n: compare n^(lg 2) = n with Θ(n lg n). Since f(n) = n lg n is asymptotically less than n^(1+ε) for any ε > 0, f(n) is not polynomially larger and case 3 does not apply. But with a = 2, b = 2 and f(n) = Θ(n lg n), the corollary with k = 1 gives T(n) = Θ(n lg^2 n).
2-14 If f(n) = n^d there is no need to check regularity; just compare both sides.
Master Theorem (polynomial driving function): if T(n) = aT(n/b) + O(n^d) for some constants a ≥ 1, b > 1, d ≥ 0, then
T(n) = Θ(n^(log_b a))   if d < log_b a   (a > b^d)
T(n) = Θ(n^d lg n)      if d = log_b a   (a = b^d)
T(n) = Θ(n^d)           if d > log_b a   (a < b^d)
Why? The proof uses a recursion-tree argument:
case 1: f(n) is "polynomially smaller" than n^(log_b a)
case 2: f(n) is "asymptotically equal" to n^(log_b a)
case 3: f(n) is "polynomially larger" than n^(log_b a)
Corollary (case 4): if f(n) = Θ(n^(log_b a) lg^k n), then T(n) = Θ(n^(log_b a) lg^(k+1) n). (As exercise.)
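A minimal sketch of the three-way comparison for the polynomial case (the function name `master` and its string output format are illustrative, not from the slides):

```python
import math

# Classify T(n) = a*T(n/b) + Theta(n^d) by the polynomial master theorem:
# compare d against the critical exponent log_b(a)
def master(a, b, d):
    crit = math.log(a, b)                # log_b(a)
    if abs(d - crit) < 1e-12:
        return f"Theta(n^{d} * lg n)"    # case 2: sides polynomially equal
    if d < crit:
        return f"Theta(n^log_{b}({a}))"  # case 1: f polynomially smaller
    return f"Theta(n^{d})"               # case 3: f polynomially larger

assert master(9, 3, 1) == "Theta(n^log_3(9))"   # T(n)=9T(n/3)+n   -> Theta(n^2)
assert master(1, 2, 0) == "Theta(n^0 * lg n)"   # T(n)=T(n/2)+1    -> Theta(lg n)
assert master(4, 2, 2) == "Theta(n^2 * lg n)"   # T(n)=4T(n/2)+n^2 -> Theta(n^2 lg n)
```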
2-15
Exp1: T(n) = 9T(n/3) + n. f(n) = n, n^(log_b a) = n^2. Compare f(n) = n with the cost of the recursion: n = O(n^(2-ε)) with ε = 1, so f(n) is polynomially smaller than n^(log_b a). Case 1 applies: T(n) = Θ(n^2).
Exp2: T(n) = T(n/2) + 1. f(n) = 1, n^(log_b a) = n^0 = 1. 1 = Θ(n^0): the driving cost f(n) is polynomially equal to the cost of the recursion, n^(log_b a). Case 2 applies: T(n) = Θ(n^0 lg n) = Θ(lg n).
Exp3: T(n) = T(n/2) + n^2. f(n) = n^2, n^(log_b a) = n^(log_2 1) = n^0 = 1. n^2 = Ω(n^(0+ε)) with 0 < ε ≤ 2, so f(n) is polynomially larger. Since f(n) is a polynomial in n, case 3 applies: T(n) = Θ(n^2).
Exp4: T(n) = 4T(n/2) + n^2. f(n) = n^2, n^(log_b a) = n^(lg 4) = n^2. Comparing both sides: polynomially equal, so case 2 holds: T(n) = Θ(n^2 lg n).
2-16
Exp5: T(n) = 7T(n/3) + n^2. f(n) = n^2, n^(log_b a) = n^(log_3 7) = n^(1+ε). Comparing f(n) = n^2 with n^(1+ε): n^2 = Ω(n^(1+ε)), so f(n) is polynomially larger. Since f(n) is a polynomial in n, case 3 holds: T(n) = Θ(n^2).
Exp6: T(n) = 7T(n/2) + n^2. n^(log_b a) = n^(log_2 7) = n^(2+ε). Comparing f(n) = n^2 with n^(2+ε): f(n) is polynomially smaller. Case 1 holds: T(n) = Θ(n^(log_2 7)).
Exp7: T(n) = 3T(n/4) + n lg n. f(n) = n lg n; the recursive side is n^(log_4 3) = O(n^(1-ε)). Comparing both sides, case 3 applies; since f(n) contains a log factor, we need to check the regularity condition for sufficiently large n: af(n/b) = 3(n/4) lg(n/4) ≤ (3/4) n lg n = cf(n) for c = 3/4. Hence T(n) = Θ(n lg n).
Exercise: use a recursion tree for a good guess and give a proof of your result using the substitution method, for T(n) = 3T(n/4) + cn^2 and T(n) = T(n/3) + T(2n/3) + cn.
2-17 Proof of the Master Theorem (CLRS; better to read from Chapter 4 of CLRS).
o The proof is first given for exact powers, n = b^k for k ≥ 1.
o Lemma 4.2: let T(n) = Θ(1) if n = 1, and T(n) = aT(n/b) + f(n) if n = b^k for k ≥ 1, where a ≥ 1, b > 1, and f(n) is a nonnegative function. Then
  T(n) = Θ(n^(log_b a)) + Σ_{j=0}^{log_b(n)-1} a^j f(n/b^j)
o Proof: by iterating the recurrence, or by a recursion tree.
2-18 Recursion tree for T(n) = aT(n/b) + f(n). (Figure.)
2-19 Proof of the Master Theorem (cont.)
Lemma 4.3: let a ≥ 1, b > 1, and let f(n) be a nonnegative function defined on exact powers of b. Then g(n) = Σ_{j=0}^{log_b(n)-1} a^j f(n/b^j) can be bounded for exact powers of b as:
1. If f(n) = O(n^(log_b(a) - ε)) for some ε > 0, then g(n) = O(n^(log_b a)).
2. If f(n) = Θ(n^(log_b a)), then g(n) = Θ(n^(log_b a) lg n).
3. If f(n) = Ω(n^(log_b(a) + ε)) for some ε > 0, and if af(n/b) ≤ cf(n) for some c < 1 and all sufficiently large n ≥ b, then g(n) = Θ(f(n)).
2-20 Proof of Lemma 4.3
For case 1: f(n) = O(n^(log_b(a) - ε)) implies f(n/b^j) = O((n/b^j)^(log_b(a) - ε)), so
g(n) = Σ_{j=0}^{log_b(n)-1} a^j f(n/b^j)
     = O( Σ_{j=0}^{log_b(n)-1} a^j (n/b^j)^(log_b(a) - ε) )
     = O( n^(log_b(a) - ε) Σ_{j=0}^{log_b(n)-1} a^j / (b^(log_b(a) - ε))^j )
     = O( n^(log_b(a) - ε) Σ_{j=0}^{log_b(n)-1} a^j / (a^j (b^(-ε))^j) )
     = O( n^(log_b(a) - ε) Σ_{j=0}^{log_b(n)-1} (b^ε)^j )
     = O( n^(log_b(a) - ε) · ((b^ε)^(log_b n) - 1) / (b^ε - 1) )
     = O( n^(log_b(a) - ε) · (n^ε - 1) / (b^ε - 1) )
     = O( n^(log_b a) )
2-21 Proof of Lemma 4.3 (cont.)
For case 2: f(n) = Θ(n^(log_b a)) implies f(n/b^j) = Θ((n/b^j)^(log_b a)), so
g(n) = Σ_{j=0}^{log_b(n)-1} a^j f(n/b^j)
     = Θ( Σ_{j=0}^{log_b(n)-1} a^j (n/b^j)^(log_b a) )
     = Θ( n^(log_b a) Σ_{j=0}^{log_b(n)-1} a^j / (b^(log_b a))^j )
     = Θ( n^(log_b a) Σ_{j=0}^{log_b(n)-1} 1 )
     = Θ( n^(log_b a) log_b n ) = Θ( n^(log_b a) lg n )
2-22 Proof of Lemma 4.3 (cont.)
For case 3:
o Since g(n) contains the term f(n) (the j = 0 term), g(n) = Ω(f(n)).
o Since af(n/b) ≤ cf(n), iterating the condition j times gives a^j f(n/b^j) ≤ c^j f(n). Hence
  g(n) = Σ_{j=0}^{log_b(n)-1} a^j f(n/b^j) ≤ Σ_{j=0}^{log_b(n)-1} c^j f(n) ≤ f(n) Σ_{j=0}^{∞} c^j = f(n) · (1/(1-c)) = O(f(n)).
o Thus g(n) = Θ(f(n)).
2-23 Proof of the Master Theorem (cont.)
o Lemma 4.4: let T(n) = Θ(1) if n = 1, and T(n) = aT(n/b) + f(n) if n = b^k for k ≥ 1, where a ≥ 1, b > 1, and f(n) is a nonnegative function. Then:
1. If f(n) = O(n^(log_b(a) - ε)) for some ε > 0, then T(n) = Θ(n^(log_b a)).
2. If f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) lg n).
3. If f(n) = Ω(n^(log_b(a) + ε)) for some ε > 0, and if af(n/b) ≤ cf(n) for some c < 1 and all sufficiently large n, then T(n) = Θ(f(n)).
2-24 Proof of Lemma 4.4: combine Lemmas 4.2 and 4.3.
o For case 1: T(n) = Θ(n^(log_b a)) + O(n^(log_b a)) = Θ(n^(log_b a)).
o For case 2: T(n) = Θ(n^(log_b a)) + Θ(n^(log_b a) lg n) = Θ(n^(log_b a) lg n).
o For case 3: T(n) = Θ(n^(log_b a)) + Θ(f(n)) = Θ(f(n)), because f(n) = Ω(n^(log_b(a) + ε)).
2-25 Floors and Ceilings
o T(n) = aT(⌈n/b⌉) + f(n) and T(n) = aT(⌊n/b⌋) + f(n).
o We want to prove both are equivalent to T(n) = aT(n/b) + f(n).
o Two results: the master theorem applies to all integers n, and floors and ceilings do not change the result. (Note: this was also proved by domain transformation.)
o Since ⌊n/b⌋ ≤ n/b and n/b ≤ ⌈n/b⌉, the nontrivial directions are an upper bound for the ceiling recurrence and a lower bound for the floor recurrence.
o So we prove the upper bound for ceilings (the lower bound for floors is similar).
2-26 Upper bound of the proof for T(n) = aT(⌈n/b⌉) + f(n)
o Consider the sequence n, ⌈n/b⌉, ⌈⌈n/b⌉/b⌉, ⌈⌈⌈n/b⌉/b⌉/b⌉, …
o Define n_j as follows: n_j = n if j = 0, and n_j = ⌈n_{j-1}/b⌉ if j > 0.
o The sequence will be n_0, n_1, …, n_{⌊log_b n⌋}.
o Draw the recursion tree:
2-27 (Recursion tree for T(n) = aT(⌈n/b⌉) + f(n) over the sequence n_0, n_1, …; figure.)
2-28 The proof of the upper bound for ceilings:
T(n) = Θ(n^(log_b a)) + Σ_{j=0}^{⌊log_b n⌋ - 1} a^j f(n_j)
Then, similarly to Lemmas 4.3 and 4.4, the upper bound is proven.
2-29 Where Are the Gaps
o Case 2: f(n) lies within constant factors of n^(log_b a).
o Case 1: f(n) is at least polynomially smaller than n^(log_b a); in between lies the gap between cases 1 and 2.
o Case 3: f(n) is at least polynomially larger than n^(log_b a); in between lies the gap between cases 2 and 3.
Notes:
1. For case 3, the regularity condition must also hold.
2. If f(n) is only a lg n factor smaller, it falls in the gap between cases 1 and 2.
3. If f(n) is only a lg n factor larger, it falls in the gap between cases 2 and 3.
4. If f(n) = Θ(n^(log_b a) lg^k n), then T(n) = Θ(n^(log_b a) lg^(k+1) n). (As exercise.)
2-30 A recursion tree for T(n) = T(n/3) + T(2n/3) + cn
o (number of levels) × (cost of each level) = O(cn log_{3/2} n) = O(n lg n).
o Complication: if the tree were a complete binary tree, the number of leaves would be 2^(log_{3/2} n); but the tree is not complete, and going down from the root, more and more internal nodes are absent.
o Verify that O(n lg n) is an upper bound by the substitution method.
o Exmp1: T(n) = T(2n/3) + 1 (case 2): a = 1, b = 3/2, f(n) = 1; n^(log_b a) = n^(log_{3/2} 1) = 1 and f(n) = 1 = Θ(1), so T(n) = Θ(lg n).
2-31 Running time of the binary search algorithm given below.
Algorithm recursive Binary-Search-Rec(A[1…n], k, l, r)
Input: a sorted array A of n comparable items, a search key k, and the leftmost and rightmost index positions in A.
Output: the index of the array element equal to k, or -1 if k is not found.

function BSR(X : name; start, end : integer)   -- l = start; r = end
begin
  if start > end then
    return -1;                  -- base case: X not found
  end if;
  middle := (start + end) / 2;
  if names(middle) = X then
    return numbers(middle);
  elsif X < names(middle) then
    return BSR(X, start, middle - 1);
  else                          -- X > names(middle)
    return BSR(X, middle + 1, end);
  end if;
end BSR;

You can also write binary search non-recursively; it is simple, searching the up- and down-stream halves of the array. T(n) = T(n/2) + 1. Worst case for unordered and ordered input: O(n) and O(lg n), respectively. Best case: O(1) for both. Recursive or iterative should not make any difference.
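A runnable sketch of the same recursive scheme in Python (0-indexed, returning -1 when the key is absent), illustrating T(n) = T(n/2) + 1 = O(lg n):

```python
# Recursive binary search on a sorted list; one comparison level per halving
def bsr(a, key, lo, hi):
    if lo > hi:
        return -1                       # base case: key not found
    mid = (lo + hi) // 2
    if a[mid] == key:
        return mid
    if key < a[mid]:
        return bsr(a, key, lo, mid - 1) # search lower half
    return bsr(a, key, mid + 1, hi)     # search upper half

data = [2, 3, 5, 7, 11, 13, 17]
assert bsr(data, 11, 0, len(data) - 1) == 4
assert bsr(data, 4, 0, len(data) - 1) == -1
```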
2-32
algorithm stoogesort(array L, i = 0, j = length(L)-1)
  if L[j] < L[i] then
    L[i] ↔ L[j]
  if j - i > 1 then
    t = (j - i + 1)/3
    stoogesort(L, i, j-t)
    stoogesort(L, i+t, j)
    stoogesort(L, i, j-t)
  return L
Provided extra file..
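A direct Python transcription of the pseudocode above, runnable as-is; its recurrence is T(n) = 3T(2n/3) + O(1):

```python
# Stooge sort: sort the first two thirds, the last two thirds,
# then the first two thirds again
def stoogesort(L, i=0, j=None):
    if j is None:
        j = len(L) - 1
    if L[j] < L[i]:
        L[i], L[j] = L[j], L[i]      # swap out-of-order endpoints
    if j - i > 1:
        t = (j - i + 1) // 3
        stoogesort(L, i, j - t)      # first two thirds
        stoogesort(L, i + t, j)      # last two thirds
        stoogesort(L, i, j - t)      # first two thirds again
    return L

assert stoogesort([5, 1, 4, 2, 8, 0]) == [0, 1, 2, 4, 5, 8]
```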
2-34 Recurrence Relations
In Liber Abaci, Fibonacci, the Italian scholar, studied simplified population-growth functions. At some stage, start with a population of 1 pair of rabbits. After a month, fertile rabbits produce a newborn pair; happy as they are, they never die and reproduce each month.
f(n) = f(n-1) + f(n-2), starting from f(0) = f(1) = 1. Since f(n) ≥ 2·f(n-2), the sequence grows exponentially, at least as fast as 2^(n/2).
The Fibonacci recurrence is an example of a linear homogeneous recurrence relation:
a_n = C_1 a_{n-1} + C_2 a_{n-2} + … + C_r a_{n-r}
A general solution of this involves a linear combination of solutions of the form a_n = α^n (there is no f(n) term). Substitute it:
α^n = C_1 α^(n-1) + C_2 α^(n-2) + … + C_r α^(n-r)
The characteristic equation is
α^r = C_1 α^(r-1) + C_2 α^(r-2) + … + C_r
Assume no complex roots, with solutions α_1, α_2, …, α_r; then any linear combination is also a general solution:
a_n = A_1 α_1^n + A_2 α_2^n + … + A_r α_r^n
2-35 Fibonacci: f_n = f_{n-1} + f_{n-2}, f_0 = f_1 = 1. Substitute α^n:
α^n = α^(n-1) + α^(n-2), so α^2 - α - 1 = 0.
The roots of this characteristic equation are α_{1,2} = (1 ± sqrt(5))/2.
General solution: A_n = b((1+sqrt(5))/2)^n + d((1-sqrt(5))/2)^n. Initial values 1 and 1:
n = 0: b + d = 1
n = 1: b((1+sqrt(5))/2) + d((1-sqrt(5))/2) = 1
Solving: b = (1/sqrt(5))((1+sqrt(5))/2), d = (-1/sqrt(5))((1-sqrt(5))/2).
Substituting b and d back gives the specific solution:
f_n = (1/sqrt(5)) [ ((1+sqrt(5))/2)^(n+1) - ((1-sqrt(5))/2)^(n+1) ]
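The closed form can be checked against the recurrence directly (note the n+1 exponents, which come from the f_0 = f_1 = 1 indexing used here):

```python
import math

# Binet-style closed form for f_0 = f_1 = 1, checked against the recurrence
phi = (1 + math.sqrt(5)) / 2
psi = (1 - math.sqrt(5)) / 2

def fib_closed(n):
    return round((phi ** (n + 1) - psi ** (n + 1)) / math.sqrt(5))

f = [1, 1]
for n in range(2, 30):
    f.append(f[-1] + f[-2])
assert all(fib_closed(n) == f[n] for n in range(30))
```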
2-36 Homogeneous recurrences
First-order linear homogeneous recurrence relations: a_n = C_1 a_{n-1}. Rewriting with the root: α = C_1, so a_n = a_0 (C_1)^n.
Example, compound interest: A_n = A_{n-1} + 0.3 A_{n-1}, so A_n = A_0 (1.3)^n.
Example: A_n = 7A_{n-1}, A_0 = 5; hence α = 7 and a_n = 5 · 7^n.
Second-order linear homogeneous recurrence relations: a_n = C_1 a_{n-1} + C_2 a_{n-2}. The characteristic equation is quadratic; solve for the two roots r_1, r_2 as usual. Then A_n = b r_1^n + d r_2^n, where b and d are fixed by the initial values.
2-37 Inhomogeneous recurrences: a_n = c a_{n-1} + f(n)
The solution of the homogeneous side in this case is a_0 c^n. Suppose a particular solution a_n* of the inhomogeneous form is found (so that a_n* = c a_{n-1}* + f(n)); then a_n = b c^n + a_n* is a general solution, with b fixed by the initial value.
2-38 Quiz: evaluate T(n) for the recurrences given below.
1) T(n) = T(n/3) + T(2n/3) + cn (deterministic select algorithm using the median). Solve the recurrence for the running time of the algorithm by drawing a recursion tree. Answer: the height of the tree is at least log_3 n and at most log_{3/2} n, and the sum of the costs in each level is n. Hence T(n) = Θ(n lg n).
2) T(n) = 2T(n/2) + O(1). Answer: case 1, with ε = 1: T(n) = Θ(n).
3) T(n) = 4T(n/2) + n^2. Answer: case 2: T(n) = Θ(n^2 log n).
4) What is the upper complexity of T(n) = k^n + log_8(n^(2n)) + (10n)^k? Answer: if k ≤ 1, O(n log_8(n)); otherwise O(k^n).
5) T(n) = 3T(2⌈n/3⌉) + 1000 (stooge sort). lg_{3/2}(3) = lg 3 / lg(3/2) = lg 3 / (lg 3 - 1) ≈ 2.72; lg_{3/2}(2 · 1.5) = lg_{3/2}(2) + lg_{3/2}(1.5) = lg_{3/2}(2) + 1 ≈ 1.7 + 1. Case 1. (What is the meaning of this?)
6) M(n) = M(n-1) + 1, n > 0, M(0) = 0. By the substitution method, this sums to Θ(n).
7) A(2^k) = A(2^(k-1)) + 1. Case 2, yields Θ(lg n).
8) Recursion on a d-dimensional mesh (d > 0): T(n) = T(n/2^d) + 47d^2 n^(1/d). The homogeneous side is n^(log_{2^d} 1) = 1, so case 3 applies: T(n) = Θ(47 d^2 n^(1/d)) = Θ(n^(1/d)). No need to check regularity.
9) T(n) = T(n/2) + log n (PRAM mergesort). The homogeneous side is n^(log_2 1) = 1; for sufficiently large n, log n > 1, which suggests T(n) = Θ(log n) by case 3. But the regularity check af(n/b) ≤ cf(n) requires log(n/2) ≤ c log(n), i.e. (1-c) lg n ≤ 1, which cannot hold for all large n with a fixed 0 < c < 1 (the regularity condition requires 0 < c < 1). Therefore one needs another method: simply evaluate the summation below (the final equation obtained at the bottom of the iterative tree method). The first part is Θ(1) and the second part is Θ(log^2 n), so the result is Θ(log^2 n):
T(n) = Θ(n^(log_b a)) + Σ_{j=0}^{log_b(n)-1} a^j f(n/b^j) = Θ(1) + Σ_{j=0}^{lg(n)-1} log(n/2^j)
The terms log(n/2^j) = lg n - j are lg n, lg n - 1, lg n - 2, …, 1, so the sum is
lg n · lg n - Σ_{j=0}^{lg(n)-1} j = lg^2 n - (lg(n) - 1)(lg n)/2 = Θ(log^2 n)
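The last quiz answer can be checked numerically: on powers of two, T(n) = T(n/2) + lg n with T(1) = 0 sums the level costs lg n, lg n - 1, …, 1, which is exactly lg(n)(lg(n)+1)/2, i.e. Θ(log^2 n).

```python
import math

# T(n) = T(n/2) + lg n, T(1) = 0: level costs are lg n, lg n - 1, ..., 1,
# so on n = 2^k the total is k*(k+1)/2 = Theta(log^2 n)
def T(n):
    return 0 if n == 1 else T(n // 2) + math.log2(n)

for k in range(1, 20):
    n = 2 ** k
    assert abs(T(n) - k * (k + 1) / 2) < 1e-9
```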
2-40 Inadmissible equations (from Wikipedia [2]). The following kinds of recurrences cannot be solved using the master theorem:
o a is not a constant (e.g. a = 2^n, so n^(log_2 2^n) = n^n): the number of children grows with n.
o There is a non-polynomial difference between f(n) and n^(log_b a) = n.
o a < 1: there cannot be less than one subproblem.
o f(n) is not positive.
o Case 3 applies but the regularity condition is violated.