Introduction to Algorithms

Introduction to Algorithms. LECTURE 2: Asymptotic Notation. O-, Ω-, and Θ-notation. Recurrences: substitution method, iterating the recurrence, recursion tree, master method. Copyright © 2001-5 Erik D. Demaine and Charles E. Leiserson

“Big Oh” Notation Note that it is tempting to think that O(f(n)) means “approximately f(n).” Although used in this manner frequently, this interpretation or use of “big Oh” is absolutely incorrect. Most frequently, O(f(n)) is pronounced “on the order of f(n)” or “order f(n),” but keep in mind that this also does not mean “approximately f(n).”

Let f(n) and g(n) be functions:

f(n) is said to be less than g(n) if f(n) <= g(n) for all n. For example, n^2 is less than n^4 + 1. [figure: g(n) lies above f(n) for all n]

To within a constant factor, f(n) is said to be less than g(n) if there exists a positive constant c such that f(n) <= cg(n) for all n. [figure: cg(n) lies above f(n), even where g(n) does not]

Example: g(n) = 3n^2 + 2, f(n) = 6n^2 + 3. Note: 6n^2 + 3 is not less than 3n^2 + 2. However, to within a constant factor, 6n^2 + 3 is less than 3n^2 + 2. Proof: Let c = 9; then we see that 6n^2 + 3 <= 9(3n^2 + 2) = 27n^2 + 18.

By the way, for these two functions it is also the case that, to within a constant factor, 3n^2 + 2 is less than 6n^2 + 3. Proof: Let c = 1; then we see that 3n^2 + 2 <= 1(6n^2 + 3) = 6n^2 + 3. In fact, 3n^2 + 2 is actually less than 6n^2 + 3. In other words, asymptotically, there ain't much difference in the growth of the two functions as n goes to infinity. (always enjoy using that word in class)

Question: g(n) = n^2, f(n) = 2^n. Is f(n), to within a constant factor, less than g(n)? In other words, is there a constant c such that f(n) <= cg(n)? No! Not even if we let c = 1,000,000, or larger!
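
To see just how hopeless a constant factor is here, a small Python experiment (an illustration added for concreteness, not a proof) finds where 2^n overtakes c·n^2:

```python
# Sketch: for any fixed constant c, 2^n eventually exceeds c * n^2 for good.
# (A numeric illustration added for concreteness, not a proof.)
def crossover(c, limit=200):
    """Smallest n beyond which 2^n > c * n^2 at every tested point up to limit."""
    last_violation = 0
    for n in range(1, limit):
        if 2**n <= c * n * n:
            last_violation = n
    return last_violation + 1

for c in (1, 1000, 1_000_000):
    print(f"c = {c:>9}: 2^n > c*n^2 for all n >= {crossover(c)}")
# Even c = 1,000,000 only delays the crossover to n = 30.
```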

Definition: f(n) is said to be O(g(n)) if there exist two positive constants c and n0 such that f(n) <= cg(n) for all n >= n0. Intuitively, we say that, to within a constant factor, f(n) is less than g(n) as n goes to infinity. Stated another way, to within a constant factor, f(n) is bounded from above by g(n) as n goes to infinity. [figure: cg(n) lies above f(n) from n0 on]

From the definition of big-O, it should now be clear that saying T(n) is O(g(n)) means that g(n) is an upper bound on T(n), and does not mean approximately. An analogy: Consider two integers x and y, where x <= y. Is y approximately equal to x? Of course not! Similarly for the definition of big-O.

Example: g(n) = n^2, f(n) = n^2 + 1. Claim: n^2 + 1 is O(n^2). Proof: Let c = 2 and n0 = 1; then we see that n^2 + 1 <= 2n^2 for all n >= n0 = 1. Note that in this case f(n) is not less than g(n), even to within a constant factor; it is the threshold n0 that makes the bound work. [figure: cg(n) lies above f(n) from n0 on]
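
The definition suggests a mechanical check. Here is a minimal Python sketch (check_big_o is an illustrative helper, not a standard function) that tests witness constants over a finite range; it cannot prove the claim for all n, but it exposes bad witnesses quickly:

```python
# Sketch: testing the witnesses c = 2, n0 = 1 for the claim n^2 + 1 is O(n^2).
def check_big_o(f, g, c, n0, trials=10_000):
    """Return True iff f(n) <= c * g(n) held for every tested n in [n0, n0 + trials)."""
    return all(f(n) <= c * g(n) for n in range(n0, n0 + trials))

f = lambda n: n**2 + 1
g = lambda n: n**2
print(check_big_o(f, g, c=2, n0=1))   # True: the witnesses work
print(check_big_o(f, g, c=1, n0=1))   # False: no n0 can rescue c = 1 here
```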

This suggests a simple procedure: keep the highest-order term and drop the constants. This procedure is consistent with the formal definition of big-O.
f(n) = n^2 + 1: f(n) is O(n^2)
g(n) = n + 6: g(n) is O(n)
h(n) = n^35 + n^16 + n^4 + 3n^2 + 1025: h(n) is O(n^35)
r(n) = (1/2)n + 300: r(n) is O(n)
s(n) = 4n^2 + 2n + 25: s(n) is O(n^2)
And yes, the depth of loop nesting, at least in simple programs, does indicate the running time.

More generally: If f(n) = a_{m-1}n^{m-1} + a_{m-2}n^{m-2} + … + a_1·n + a_0, then f(n) is O(n^{m-1}). Explanation: Let c = |a_{m-1}| + |a_{m-2}| + … + |a_1| + |a_0| and n0 = 1. Then it will be the case that f(n) <= c·n^{m-1} for all n >= n0.

Why is f(n) <= c·n^{m-1} for all n >= n0 = 1? Compare term by term:
c·n^{m-1} = |a_{m-1}|n^{m-1} + |a_{m-2}|n^{m-1} + … + |a_1|n^{m-1} + |a_0|n^{m-1}
f(n) = a_{m-1}n^{m-1} + a_{m-2}n^{m-2} + … + a_1·n + a_0
For n >= 1 we have n^k <= n^{m-1} whenever k <= m-1, so each term of f(n) is at most the corresponding term of c·n^{m-1}.
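
A quick numeric sanity check of this argument, using a hypothetical example polynomial (the coefficient list below is illustrative):

```python
# Sketch of the argument above: with c = |a_{m-1}| + ... + |a_0| and n0 = 1,
# f(n) <= c * n^(m-1) for every n >= 1. Coefficients are ordered a_0, ..., a_{m-1}.
def poly(coeffs, n):
    """Evaluate a_0 + a_1*n + ... + a_{m-1}*n^(m-1)."""
    return sum(a * n**k for k, a in enumerate(coeffs))

coeffs = [1025, 3, 0, 5]            # f(n) = 5n^3 + 3n + 1025 (hypothetical)
c = sum(abs(a) for a in coeffs)     # c = 1033
deg = len(coeffs) - 1               # m - 1 = 3
assert all(poly(coeffs, n) <= c * n**deg for n in range(1, 5000))
print(f"f(n) <= {c} * n^{deg} held for all tested n >= 1")
```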

Another thing worth noting: if, for example, f(n) = 5n^3 + 35n^2 + 3n + 1025, then f(n) is O(n^3), but also O(n^4), O(n^5), etc. This may seem counter-intuitive, but it is exactly what we want big-O to be, i.e., an upper bound.

Asymptotic notation. O-notation (upper bounds): We write f(n) = O(g(n)) if there exist constants c > 0 and n0 > 0 such that 0 ≤ f(n) ≤ cg(n) for all n ≥ n0. EXAMPLE: 2n^2 = O(n^3) (c = 1, n0 = 2). The "=" here is odd because it is "one way"; I prefer to say "is" rather than "=".

Set definition of O-notation: O(g(n)) = { f(n) : there exist constants c > 0 and n0 > 0 such that 0 ≤ f(n) ≤ cg(n) for all n ≥ n0 }. EXAMPLE: 2n^2 ∈ O(n^3)

Macro substitution. Convention: A set in a formula represents an anonymous function in the set. EXAMPLE: f(n) = n^3 + O(n^2) means f(n) = n^3 + h(n) for some h(n) ∈ O(n^2).

Macro substitution. Convention: A set in a formula represents an anonymous function in the set. EXAMPLE: n^2 + O(n) = O(n^2) means: for any f(n) ∈ O(n), n^2 + f(n) = h(n) for some h(n) ∈ O(n^2).

O-notation (upper bounds). O-notation is an upper-bound notation. Saying that f(n) = O(g(n)) means that, to within a constant factor, f(n) is bounded from above by g(n) as n goes to infinity.

Common Misinterpretation. It is common to think that O(g(n)) means "approximately" g(n). Although used in this manner frequently, this interpretation or use of "big Oh" is absolutely incorrect. O-notation is an upper-bound notation.

Ω-notation (lower bounds). Ω(g(n)) = { f(n) : there exist constants c > 0 and n0 > 0 such that 0 ≤ cg(n) ≤ f(n) for all n ≥ n0 }. EXAMPLE: √n = Ω(lg n) (c = 1, n0 = 16)
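
As with big-O, the witnesses can be sanity-checked numerically; a one-line sketch for this example:

```python
import math

# Sketch: checking the witnesses c = 1, n0 = 16 for sqrt(n) = Omega(lg n),
# i.e. 1 * lg(n) <= sqrt(n) for all n >= 16 (lg = log base 2).
assert all(math.log2(n) <= math.sqrt(n) for n in range(16, 100_000))
print("lg n <= sqrt(n) held for all tested n >= 16; equality at n = 16 (both are 4)")
```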

Common Misunderstanding. It is common to think that Ω(f(n)) is used to specify the best case for an algorithm. This is also incorrect. Ω-notation specifies a lower bound on the worst (or average) case.

Θ-notation (tight bounds). Θ(g(n)) = O(g(n)) ∩ Ω(g(n)). EXAMPLE: (1/2)n^2 − 2n = Θ(n^2)

-notation and -notation O-notation and -notation are like  and . o-notation and -notation are like < and >. gg((nn))))=={{ff((nn))::ffoorraannyyccoonnssttaannttcc>>00,, tthheerreeiissaaccoonnssttaannttnn00>>00 ssuucchhtthhaatt00 ff((nn)) ccgg((nn)) ffoorraallllnn nn00}} EXAMPLE: September 12, 2005 2n2 = o(n3) (n0 = 2/c) Copyright © 2001-5 Erik D. Demaine and Charles E. Leiserson L2.17

-notation and -notation O-notation and -notation are like  and . o-notation and -notation are like < and >. gg((nn))))=={{ff((nn))::ffoorraannyyccoonnssttaannttcc>>00,, tthheerreeiissaaccoonnssttaannttnn00>>00 ssuucchhtthhaatt00 ccgg((nn)) ff((nn)) ffoorraallllnn nn00}} EXAMPLE: September 12, 2005 n  (lg n) (n0 = 1+1/c) Copyright © 2001-5 Erik D. Demaine and Charles E. Leiserson L2.18

Solving recurrences. The analysis of merge sort from Lecture 1 required us to solve a recurrence. Solving recurrences is like solving integrals, differential equations, etc.: learn a few tricks. Lecture 3: applications of recurrences to divide-and-conquer algorithms.

Substitution method. The most general method: 1. Guess the form of the solution. 2. Verify by induction. 3. Solve for constants. EXAMPLE: T(n) = 4T(n/2) + n [Assume that T(1) = Θ(1).] Guess O(n^3). (Prove O and Ω separately.) Assume that T(k) ≤ ck^3 for k < n. Prove T(n) ≤ cn^3 by induction.

Example of substitution:
T(n) = 4T(n/2) + n
≤ 4c(n/2)^3 + n
= (c/2)n^3 + n
= cn^3 − ((c/2)n^3 − n)   [desired − residual]
≤ cn^3
whenever (c/2)n^3 − n ≥ 0, for example, if c ≥ 2 and n ≥ 1.

Example (continued). We must also handle the initial conditions, that is, ground the induction with base cases. Base: T(n) = Θ(1) for all n < n0, where n0 is a suitable constant. For 1 ≤ n < n0, we have "Θ(1)" ≤ cn^3, if we pick c big enough. But this bound is not tight!
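
It can help to evaluate the recurrence numerically before guessing further. A short sketch, assuming the concrete base case T(1) = 1, shows the cubic bound holding but loose:

```python
from functools import lru_cache

# Sketch: evaluating T(n) = 4T(n/2) + n at powers of two, with the assumed
# base case T(1) = 1, against the bound c*n^3 (c = 2) from the proof above.
@lru_cache(maxsize=None)
def T(n):
    return 1 if n == 1 else 4 * T(n // 2) + n

for k in range(1, 11):
    n = 2**k
    print(f"n={n:5d}  T(n)={T(n):9d}  2n^3={2*n**3:12d}  T(n)/n^2={T(n)/n**2:.3f}")
# T(n) stays below 2n^3, but T(n)/n^2 settles near 2: the cubic bound is loose.
```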

A tighter upper bound? We shall prove that T(n) = O(n^2). Assume that T(k) ≤ ck^2 for k < n:
T(n) = 4T(n/2) + n
≤ 4c(n/2)^2 + n
= cn^2 + n
= O(n^2)   Wrong! We must prove the I.H.
= cn^2 − (−n)   [desired − residual]
≤ cn^2 for no choice of c > 0. Lose!

A tighter upper bound! IDEA: Strengthen the inductive hypothesis. Subtract a low-order term. Inductive hypothesis: T(k) ≤ c1·k^2 − c2·k for k < n.
T(n) = 4T(n/2) + n
= 4(c1(n/2)^2 − c2(n/2)) + n
= c1n^2 − 2c2n + n
= c1n^2 − c2n − (c2n − n)
≤ c1n^2 − c2n if c2 ≥ 1.
Pick c1 big enough to handle the initial conditions.
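
The same numeric sketch confirms the strengthened hypothesis; with the assumed base case T(1) = 1, the witnesses c1 = 2, c2 = 1 work:

```python
from functools import lru_cache

# Sketch: checking the strengthened hypothesis T(n) <= c1*n^2 - c2*n with
# c2 = 1 and c1 = 2 (c1 = 2 also covers the assumed base case T(1) = 1 <= c1 - c2).
@lru_cache(maxsize=None)
def T(n):
    return 1 if n == 1 else 4 * T(n // 2) + n

c1, c2 = 2, 1
assert all(T(2**k) <= c1 * (2**k)**2 - c2 * 2**k for k in range(20))
print("T(n) <= 2n^2 - n held at all tested powers of two")
# In fact T(n) = 2n^2 - n exactly at powers of two, so this bound is tight.
```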

Recursion-tree method. A recursion tree models the costs (time) of a recursive execution of an algorithm. The recursion-tree method can be unreliable, just like any method that uses ellipses (…). The recursion-tree method promotes intuition, however, and is good for generating guesses for the substitution method.

Example of recursion tree. Solve T(n) = T(n/4) + T(n/2) + n^2:
Expanding the recurrence into a tree, the root costs n^2 and has children T(n/4) and T(n/2); these expand into nodes costing (n/4)^2 and (n/2)^2, then (n/16)^2, (n/8)^2, (n/8)^2, (n/4)^2, and so on down to Θ(1) leaves.
Summing level by level: the root level is n^2, the next level is (n/4)^2 + (n/2)^2 = (5/16)n^2, the next is (25/256)n^2 = (5/16)^2·n^2, …
Total = n^2 (1 + 5/16 + (5/16)^2 + (5/16)^3 + …) = Θ(n^2). (geometric series)
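
A numeric check of the tree's conclusion, assuming a base case of T(n) = 1 for n < 4: the ratio T(n)/n^2 should approach the geometric-series limit 1/(1 − 5/16) = 16/11:

```python
from functools import lru_cache

# Sketch: evaluating T(n) = T(n/4) + T(n/2) + n^2 with an assumed base case
# T(n) = 1 for n < 4; T(n)/n^2 should approach 16/11 = 1.4545...
@lru_cache(maxsize=None)
def T(n):
    return 1 if n < 4 else T(n // 4) + T(n // 2) + n * n

for k in (4, 8, 12, 16, 20):
    n = 2**k
    print(f"n = 2^{k:2d}: T(n)/n^2 = {T(n)/n**2:.4f}")
print("limit 16/11 =", 16 / 11)
```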

The master method. The master method applies to recurrences of the form T(n) = aT(n/b) + f(n), where a ≥ 1, b > 1, and f is asymptotically positive.

Three common cases. Compare f(n) with n^(log_b a):
1. f(n) = O(n^(log_b a − ε)) for some constant ε > 0: f(n) grows polynomially slower than n^(log_b a) (by an n^ε factor). Solution: T(n) = Θ(n^(log_b a)).
2. f(n) = Θ(n^(log_b a) lg^k n) for some constant k ≥ 0: f(n) and n^(log_b a) grow at similar rates. Solution: T(n) = Θ(n^(log_b a) lg^(k+1) n).

Three common cases (cont.). Compare f(n) with n^(log_b a):
3. f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0: f(n) grows polynomially faster than n^(log_b a) (by an n^ε factor), and f(n) satisfies the regularity condition that a·f(n/b) ≤ c·f(n) for some constant c < 1. Solution: T(n) = Θ(f(n)).

Examples. EX. T(n) = 4T(n/2) + n: a = 4, b = 2 ⇒ n^(log_b a) = n^2; f(n) = n. CASE 1: f(n) = O(n^(2−ε)) for ε = 1. ∴ T(n) = Θ(n^2).
EX. T(n) = 4T(n/2) + n^2: a = 4, b = 2 ⇒ n^(log_b a) = n^2; f(n) = n^2. CASE 2: f(n) = Θ(n^2 lg^0 n), that is, k = 0. ∴ T(n) = Θ(n^2 lg n).

Examples. EX. T(n) = 4T(n/2) + n^3: a = 4, b = 2 ⇒ n^(log_b a) = n^2; f(n) = n^3. CASE 3: f(n) = Ω(n^(2+ε)) for ε = 1, and 4(n/2)^3 ≤ cn^3 (reg. cond.) for c = 1/2. ∴ T(n) = Θ(n^3).
EX. T(n) = 4T(n/2) + n^2/lg n: a = 4, b = 2 ⇒ n^(log_b a) = n^2; f(n) = n^2/lg n. Master method does not apply. In particular, for every constant ε > 0, we have n^ε = ω(lg n).
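
These examples follow a mechanical pattern, and a toy dispatcher makes it explicit. A sketch for driving functions of the form f(n) = n^p · lg^k n only (the function master and its case tests are illustrative, not a library API):

```python
import math

# Sketch: a toy master-theorem dispatcher for T(n) = a*T(n/b) + f(n) where
# f(n) = n^p * lg^k(n). Handles only these simple driving functions.
def master(a, b, p, k=0):
    """Describe T(n) for T(n) = a T(n/b) + n^p lg^k n, when a case applies."""
    crit = math.log(a, b)                 # critical exponent log_b(a)
    if p < crit and k == 0:               # Case 1: f polynomially smaller
        return f"Theta(n^{crit:g})"
    if math.isclose(p, crit) and k >= 0:  # Case 2: same polynomial growth
        return f"Theta(n^{crit:g} lg^{k + 1} n)"
    if p > crit and k == 0:               # Case 3: regularity holds for plain
        return f"Theta(n^{p:g})"          # polynomials, since a/b^p < 1
    return "master method does not apply in this simple form"

print(master(4, 2, 1))         # 4T(n/2) + n        -> Theta(n^2)
print(master(4, 2, 2))         # 4T(n/2) + n^2      -> Theta(n^2 lg^1 n)
print(master(4, 2, 3))         # 4T(n/2) + n^3      -> Theta(n^3)
print(master(4, 2, 2, k=-1))   # 4T(n/2) + n^2/lg n -> does not apply
```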

Idea of master theorem. Recursion tree: the root costs f(n); its a children each cost f(n/b), for a level total of a·f(n/b); the a^2 grandchildren cost f(n/b^2), for a level total of a^2·f(n/b^2); and so on, down to Θ(1) leaves at height h = log_b n. The number of leaves is a^h = a^(log_b n) = n^(log_b a), so the leaf level contributes n^(log_b a) · Θ(1) = Θ(n^(log_b a)).

Idea of master theorem. CASE 1: The weight increases geometrically from the root to the leaves. The leaves hold a constant fraction of the total weight. ⇒ Θ(n^(log_b a))

Idea of master theorem. CASE 2 (k = 0): The weight is approximately the same on each of the log_b n levels. ⇒ Θ(n^(log_b a) lg n)

Idea of master theorem. CASE 3: The weight decreases geometrically from the root to the leaves. The root holds a constant fraction of the total weight. ⇒ Θ(f(n))

Appendix: geometric series.
1 + x + x^2 + … + x^n = (1 − x^(n+1)) / (1 − x) for x ≠ 1
1 + x + x^2 + … = 1 / (1 − x) for |x| < 1