Introduction to Algorithms


1 Introduction to Algorithms
LECTURE 2: Asymptotic notation (O-, Ω-, and Θ-notation); recurrences (substitution method, iterating the recurrence, recursion tree, master method). Copyright © Erik D. Demaine and Charles E. Leiserson

2 “Big Oh” Notation Note that it is tempting to think that O(f(n)) means “approximately f(n).” Although used in this manner frequently, this interpretation or use of “big Oh” is absolutely incorrect. Most frequently, O(f(n)) is pronounced “on the order of f(n)” or “order f(n),” but keep in mind that this also does not mean “approximately f(n).”

3 Let f(n) and g(n) be functions:

4 f(n) is said to be less than g(n) if f(n) <= g(n) for all n.
For example, n^2 is less than n^4 + 1. [Figure: f(n) and g(n) plotted against n.]

5 To within a constant factor, f(n) is said to be less than g(n) if there exists a positive constant c such that f(n) <= cg(n) for all n. [Figure: f(n), g(n), and cg(n) plotted against n.]

6 Example: g(n) = 3n^2 + 2, f(n) = 6n^2 + 3. Note: 6n^2 + 3 is not less than 3n^2 + 2. However, to within a constant factor, 6n^2 + 3 is less than 3n^2 + 2. Proof: Let c = 9; then we see that 6n^2 + 3 <= 9(3n^2 + 2) = 27n^2 + 18.

7 By the way, for these two functions it is also the case that:
To within a constant factor, 3n^2 + 2 is less than 6n^2 + 3. Proof: Let c = 1; then we see that 3n^2 + 2 <= 1(6n^2 + 3) = 6n^2 + 3. In fact, 3n^2 + 2 is actually less than 6n^2 + 3. In other words, asymptotically there ain't much difference in the growth of the two functions as n goes to infinity. (Always enjoy using that word in class.)

8 Question: g(n) = n^2, f(n) = 2^n. Is f(n), to within a constant factor, less than g(n)? In other words, is there a constant c such that f(n) <= cg(n) for all n? No! Not even if we let c = 1,000,000, or larger!
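(A quick check, not from the original slides: the short Python sketch below searches for the crossover point numerically, using the slide's constant c = 1,000,000; 2^n overtakes c·n^2 already around n = 30.)

```python
# Illustrative sketch: even with a huge constant c, 2^n eventually exceeds c*n^2.
c = 1_000_000
n = 1
while 2**n <= c * n * n:
    n += 1
print(f"2^n first exceeds {c} * n^2 at n = {n}")   # n = 30
```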

9 Definition: f(n) is said to be O(g(n)) if there exist two positive constants c and n0 such that f(n) <= cg(n) for all n >= n0. Intuitively, we say that, to within a constant factor, f(n) is less than g(n) as n goes to infinity. Stated another way: to within a constant factor, f(n) is bounded from above by g(n) as n goes to infinity. [Figure: f(n) and cg(n) plotted against n, with cg(n) above f(n) for n >= n0.]
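(Aside, not from the original slides: a definition like this can be spot-checked numerically. The helper below, whose name and sampling range are our own choices, tests f(n) <= c·g(n) for n0 <= n <= n_max; a finite check is of course evidence only, not a proof.)

```python
def witnesses_big_oh(f, g, c, n0, n_max=10_000):
    """Spot-check f(n) <= c*g(n) for all sampled n with n0 <= n <= n_max.

    Passing is only evidence for f(n) = O(g(n)); failing gives a concrete
    counterexample for this particular choice of c and n0.
    """
    return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))
```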

10 From the definition of big-O, it should now be clear that saying T(n) is O(g(n)) means that g(n) is an upper bound on T(n); it does not mean that T(n) is approximately g(n). An analogy: consider two integers x and y, where x <= y. Is y approximately equal to x? Of course not! The same goes for the definition of big-O.

11 Example: g(n) = n^2, f(n) = n^2 + 1. Claim: n^2 + 1 is O(n^2).
Proof: Let c = 2 and n0 = 1; then we see that n^2 + 1 <= 2n^2 for all n >= n0 = 1. Note that in this case f(n) is not less than g(n), even to within a constant factor (at n = 0, f(0) = 1 > c·g(0) = 0 for every c); this is exactly why the definition needs n0. [Figure: f(n) and cg(n) plotted against n, crossing at n0.]
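(Using the spot-checking helper sketched after slide 9 on this example; again, a finite check rather than a proof.)

```python
# Claim from the slide: n^2 + 1 <= 2*n^2 for all n >= 1 (c = 2, n0 = 1).
print(witnesses_big_oh(lambda n: n**2 + 1, lambda n: n**2, c=2, n0=1))   # True
# With c = 1 the inequality fails for every n >= 1, so the constant matters.
print(witnesses_big_oh(lambda n: n**2 + 1, lambda n: n**2, c=1, n0=1))   # False
```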

12 This suggests a simple procedure: choose the highest-order term and drop the constant coefficient.
This procedure is consistent with the formal definition of big-O.
f(n) = n: f(n) is O(n^2)
g(n) = n: g(n) is O(n)
h(n) = n^35 + n^16 + n^4 + 3n: h(n) is O(n^35)
r(n) = (1/2)n: r(n) is O(n)
s(n) = 4n^2 + 2n: s(n) is O(n^2)
And yes, the depth of loop nesting, at least in simple programs, does indicate the running time.

13 More generally: If f(n) = a_(m-1) n^(m-1) + a_(m-2) n^(m-2) + … + a_1 n + a_0, then f(n) is O(n^(m-1)). Explanation: Let c = |a_(m-1)| + |a_(m-2)| + … + |a_1| + |a_0| and n0 = 1. Then it will be the case that f(n) <= c n^(m-1) for all n >= n0.

14 Why is f(n) <= c n^(m-1) for all n >= n0 = 1?
Compare the two sums term by term:
c n^(m-1) = |a_(m-1)| n^(m-1) + |a_(m-2)| n^(m-1) + … + |a_1| n^(m-1) + |a_0| n^(m-1)
f(n) = a_(m-1) n^(m-1) + a_(m-2) n^(m-2) + … + a_1 n + a_0
For n >= 1, each term of f(n) is at most the corresponding term of c n^(m-1), since a_i n^i <= |a_i| n^(m-1).
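(A numerical illustration of the term-by-term argument, not from the original slides. The polynomial is an arbitrary example of ours; c is the sum of the absolute values of its coefficients, as in slide 13.)

```python
# Example polynomial f(n) = 5n^3 - 7n^2 + 3n + 11, so m - 1 = 3.
coeffs = [5, -7, 3, 11]                 # a_3, a_2, a_1, a_0
c = sum(abs(a) for a in coeffs)         # c = |a_3| + |a_2| + |a_1| + |a_0| = 26

def f(n):
    return 5*n**3 - 7*n**2 + 3*n + 11

# The bound f(n) <= c * n^(m-1) should hold for every n >= n0 = 1.
assert all(f(n) <= c * n**3 for n in range(1, 10_000))
```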

15 Another thing worth noting:
If, for example, f(n) = 5n^3 + 35n^2 + 3n, then f(n) is O(n^3), but also O(n^4), O(n^5), etc. This may seem counter-intuitive, but it is exactly what we want big-O to be, i.e., an upper bound.

16 Asymptotic notation O-notation (upper bounds):
We write f(n) = O(g(n)) if there exist constants c > 0 and n0 > 0 such that 0 ≤ f(n) ≤ cg(n) for all n ≥ n0.

17 Asymptotic notation O-notation (upper bounds):
We write f(n) = O(g(n)) if there exist constants c > 0 and n0 > 0 such that 0 ≤ f(n) ≤ cg(n) for all n ≥ n0. EXAMPLE: 2n^2 = O(n^3) (c = 1, n0 = 2). (The "=" here is odd/funny because it is "one way"; I prefer to say "is" rather than "=".)

18 Set definition of O-notation
O(g(n)) = { f(n) : there exist constants c > 0 and n0 > 0 such that 0 ≤ f(n) ≤ cg(n) for all n ≥ n0 }

19 Set definition of O-notation
O(g(n)) = { f(n) : there exist constants c > 0 and n0 > 0 such that 0 ≤ f(n) ≤ cg(n) for all n ≥ n0 } EXAMPLE: 2n^2 ∈ O(n^3)

20 Macro substitution Convention: A set in a formula represents an anonymous function in the set.

21 Macro substitution Convention: A set in a formula represents an anonymous function in the set. EXAMPLE: f(n) = n^3 + O(n^2) means f(n) = n^3 + h(n) for some h(n) ∈ O(n^2).

22 Macro substitution Convention: A set in a formula represents an anonymous function in the set. EXAMPLE: n^2 + O(n) = O(n^2) means: for any f(n) ∈ O(n), n^2 + f(n) = h(n) for some h(n) ∈ O(n^2).

23 O-notation (upper bounds)
O-notation is an upper-bound notation. Saying that f(n) = O(g(n)) means that, to within a constant factor, f(n) is bounded from above by g(n) as n goes to infinity.

24 Common Misinterpretation
It is common to think that O(g(n)) means "approximately" g(n). Although used in this manner frequently, this interpretation or use of "big Oh" is absolutely incorrect. O-notation is an upper-bound notation.

25 Ω-notation (lower bounds)
Ω(g(n)) = { f(n) : there exist constants c > 0 and n0 > 0 such that 0 ≤ cg(n) ≤ f(n) for all n ≥ n0 } EXAMPLE: √n ∈ Ω(lg n) (c = 1, n0 = 16)

26 Common Misunderstanding
It is common to think that (f(n)) is used to specify the best case for an algorithm. This is also incorrect. -notation specifies a lower-bound on the worst (or average) case. September 12, 2005 Copyright © Erik D. Demaine and Charles E. Leiserson L2.12

27 Θ-notation (tight bounds)
Θ(g(n)) = O(g(n)) ∩ Ω(g(n))

28 Θ-notation (tight bounds)
Θ(g(n)) = O(g(n)) ∩ Ω(g(n)) EXAMPLE: (1/2)n^2 - 2n ∈ Θ(n^2)

29 o-notation and ω-notation
O-notation and Ω-notation are like ≤ and ≥. o-notation and ω-notation are like < and >.
o(g(n)) = { f(n) : for any constant c > 0, there is a constant n0 > 0 such that 0 ≤ f(n) ≤ cg(n) for all n ≥ n0 }
EXAMPLE: 2n^2 = o(n^3) (n0 = 2/c)

30 o-notation and ω-notation
O-notation and Ω-notation are like ≤ and ≥. o-notation and ω-notation are like < and >.
ω(g(n)) = { f(n) : for any constant c > 0, there is a constant n0 > 0 such that 0 ≤ cg(n) ≤ f(n) for all n ≥ n0 }
EXAMPLE: √n = ω(lg n) (n0 = 1 + 1/c)

31 Solving recurrences
The analysis of merge sort from Lecture 1 required us to solve a recurrence. Solving recurrences is like solving integrals, differential equations, etc.: learn a few tricks. Lecture 3: applications of recurrences to divide-and-conquer algorithms.

32 Substitution method The most general method:
Guess the form of the solution. Verify by induction. Solve for constants.

33 Substitution method The most general method:
Guess the form of the solution. Verify by induction. Solve for constants. EXAMPLE: T(n) = 4T(n/2) + n [Assume that T(1) = Θ(1).] Guess O(n^3). (Prove O and Ω separately.) Assume that T(k) ≤ ck^3 for k < n. Prove T(n) ≤ cn^3 by induction.

34 Example of substitution
T (n)  4T (n / 2)  n  4c(n / 2)3  n  (c / 2)n3  n  cn3  ((c / 2)n3  n) desired – residual  cn3 desired whenever (c/2)n3 – n  0, for example, if c  2 and n  1. residual September 12, 2005 Copyright © Erik D. Demaine and Charles E. Leiserson L2.22

35 Example (continued) We must also handle the initial conditions, that is, ground the induction with base cases. Base: T(n) = Θ(1) for all n < n0, where n0 is a suitable constant. For 1 ≤ n < n0, we have "Θ(1)" ≤ cn^3, if we pick c big enough.

36 Example (continued) We must also handle the initial conditions, that is, ground the induction with base cases. Base: T(n) = Θ(1) for all n < n0, where n0 is a suitable constant. For 1 ≤ n < n0, we have "Θ(1)" ≤ cn^3, if we pick c big enough. This bound is not tight!

37 A tighter upper bound? We shall prove that T(n) = O(n^2).

38 A tighter upper bound? We shall prove that T(n) = O(n^2).
Assume that T(k) ≤ ck^2 for k < n:
T(n) = 4T(n/2) + n
≤ 4c(n/2)^2 + n
= cn^2 + n
= O(n^2)

39 A tighter upper bound? We shall prove that T(n) = O(n^2).
Assume that T(k) ≤ ck^2 for k < n:
T(n) = 4T(n/2) + n
≤ 4c(n/2)^2 + n
= cn^2 + n
= O(n^2)   Wrong! We must prove the I.H.

40 A tighter upper bound? We shall prove that T(n) = O(n^2).
Assume that T(k) ≤ ck^2 for k < n:
T(n) = 4T(n/2) + n
≤ 4c(n/2)^2 + n
= cn^2 + n
= O(n^2)   Wrong! We must prove the I.H.
= cn^2 - (-n)   [desired - residual]
≤ cn^2 for no choice of c > 0. Lose!

41 A tighter upper bound! IDEA: Strengthen the inductive hypothesis.
Subtract a low-order term. Inductive hypothesis: T(k) ≤ c1·k^2 - c2·k for k < n.

42 A tighter upper bound! IDEA: Strengthen the inductive hypothesis.
Subtract a low-order term. Inductive hypothesis: T(k) ≤ c1·k^2 - c2·k for k < n.
T(n) = 4T(n/2) + n
≤ 4(c1(n/2)^2 - c2(n/2)) + n
= c1·n^2 - 2c2·n + n
= c1·n^2 - c2·n - (c2·n - n)
≤ c1·n^2 - c2·n   if c2 ≥ 1.

43 A tighter upper bound! IDEA: Strengthen the inductive hypothesis.
Subtract a low-order term. Inductive hypothesis: T(k) ≤ c1·k^2 - c2·k for k < n.
T(n) = 4T(n/2) + n
≤ 4(c1(n/2)^2 - c2(n/2)) + n
= c1·n^2 - 2c2·n + n
= c1·n^2 - c2·n - (c2·n - n)
≤ c1·n^2 - c2·n   if c2 ≥ 1.
Pick c1 big enough to handle the initial conditions.
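(Reusing the evaluator T from the sketch after slide 34, with the same assumed base case T(1) = 1, one can check the strengthened hypothesis with c1 = 2 and c2 = 1; with that base case the recurrence in fact satisfies T(n) = 2n^2 - n exactly on powers of two, so the Θ(n^2) guess is the right one.)

```python
# Check T(n) <= c1*n^2 - c2*n with c1 = 2, c2 = 1 for n = 1, 2, 4, ..., 2^20.
assert all(T(2**k) <= 2 * (2**k)**2 - 2**k for k in range(21))
# With the assumed base case T(1) = 1 the bound is met with equality:
assert all(T(2**k) == 2 * (2**k)**2 - 2**k for k in range(21))
```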

44 Recursion-tree method
A recursion tree models the costs (time) of a recursive execution of an algorithm. The recursion-tree method can be unreliable, just like any method that uses ellipses (…). The recursion-tree method promotes intuition, however. The recursion-tree method is good for generating guesses for the substitution method.

45 Example of recursion tree
Solve T(n) = T(n/4) + T(n/2) + n^2:

46 Example of recursion tree
Solve T(n) = T(n/4) + T(n/2) + n^2: [Tree: the root is T(n).]

47 Example of recursion tree
Solve T(n) = T(n/4) + T(n/2) + n^2: [Tree: the root costs n^2 and has children T(n/4) and T(n/2).]

48 Example of recursion tree
Solve T(n) = T(n/4) + T(n/2) + n^2: [Tree: root n^2; second level (n/4)^2 and (n/2)^2; their children T(n/16), T(n/8), T(n/8), T(n/4).]

49 Example of recursion tree
Solve T(n) = T(n/4) + T(n/2) + n^2: [Tree expanded fully: root n^2; then (n/4)^2, (n/2)^2; then (n/16)^2, (n/8)^2, (n/8)^2, (n/4)^2; and so on down to Θ(1) leaves.]

50 Example of recursion tree
Solve T(n) = T(n/4) + T(n/2) + n^2: [Tree as before; the top level sums to n^2.]

51 Example of recursion tree
Solve T(n) = T(n/4) + T(n/2) + n^2: [Tree as before; the top level sums to n^2, the second level to (5/16)n^2.]

52 Example of recursion tree
Solve T(n) = T(n/4) + T(n/2) + n^2: [Tree as before; the level sums are n^2, (5/16)n^2, (25/256)n^2, …]

53 Example of recursion tree
Solve T(n) = T(n/4) + T(n/2) + n^2: [Tree as before; level sums n^2, (5/16)n^2, (25/256)n^2, … down to Θ(1) leaves.]
Total = n^2 (1 + 5/16 + (5/16)^2 + (5/16)^3 + …) = Θ(n^2)   (geometric series)
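(A numerical look at the same recurrence, not from the original slides: evaluating it with floors and the arbitrary base case T(n) = 1 for n <= 1, the ratio T(n)/n^2 stays below the geometric-series limit 1/(1 - 5/16) = 16/11 ≈ 1.45, consistent with Θ(n^2).)

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T_tree(n):
    # Base case T(n) = 1 for n <= 1 is an arbitrary stand-in for Theta(1).
    if n <= 1:
        return 1
    return T_tree(n // 4) + T_tree(n // 2) + n * n

for n in (10, 100, 1_000, 10_000, 100_000):
    print(n, T_tree(n) / n**2)    # ratios stay below 16/11 ~= 1.4545
```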

54 The master method The master method applies to recurrences of the form
T(n) = a·T(n/b) + f(n), where a ≥ 1, b > 1, and f is asymptotically positive.

55 Three common cases Compare f(n) with n^(log_b a):
CASE 1: f(n) = O(n^(log_b a - ε)) for some constant ε > 0: f(n) grows polynomially slower than n^(log_b a) (by an n^ε factor). Solution: T(n) = Θ(n^(log_b a)).

56 Three common cases Compare f(n) with n^(log_b a):
CASE 1: f(n) = O(n^(log_b a - ε)) for some constant ε > 0: f(n) grows polynomially slower than n^(log_b a) (by an n^ε factor). Solution: T(n) = Θ(n^(log_b a)).
CASE 2: f(n) = Θ(n^(log_b a) · lg^k n) for some constant k ≥ 0: f(n) and n^(log_b a) grow at similar rates. Solution: T(n) = Θ(n^(log_b a) · lg^(k+1) n).

57 Three common cases (cont.)
Compare f(n) with n^(log_b a):
CASE 3: f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0: f(n) grows polynomially faster than n^(log_b a) (by an n^ε factor), and f(n) satisfies the regularity condition that a·f(n/b) ≤ c·f(n) for some constant c < 1. Solution: T(n) = Θ(f(n)).
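(A small helper of our own, not part of the lecture: for the special case of a purely polynomial driving function f(n) = n^d, the three cases reduce to comparing d with log_b a, and the regularity condition in case 3 holds automatically since a·(n/b)^d = (a/b^d)·n^d with a/b^d < 1. The sketch below handles only that special case, not general lg factors.)

```python
import math

def master_poly(a, b, d):
    """Solve T(n) = a*T(n/b) + n^d (a >= 1, b > 1) by the master method,
    for a purely polynomial driving function only. Returns the Theta bound
    as a string."""
    crit = math.log(a, b)                    # critical exponent log_b a
    if math.isclose(d, crit):
        return f"Theta(n^{d:g} * lg n)"      # case 2 with k = 0
    if d < crit:
        return f"Theta(n^{crit:g})"          # case 1: the leaves dominate
    return f"Theta(n^{d:g})"                 # case 3: the root dominates
```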

58 Examples
EX. T(n) = 4T(n/2) + n. Here a = 4, b = 2, so n^(log_b a) = n^2; f(n) = n.
CASE 1: f(n) = O(n^(2 - ε)) for ε = 1. Therefore T(n) = Θ(n^2).

59 Examples
EX. T(n) = 4T(n/2) + n. Here a = 4, b = 2, so n^(log_b a) = n^2; f(n) = n.
CASE 1: f(n) = O(n^(2 - ε)) for ε = 1. Therefore T(n) = Θ(n^2).
EX. T(n) = 4T(n/2) + n^2. Here a = 4, b = 2, so n^(log_b a) = n^2; f(n) = n^2.
CASE 2: f(n) = Θ(n^2 · lg^0 n), that is, k = 0. Therefore T(n) = Θ(n^2 · lg n).

60 Examples
EX. T(n) = 4T(n/2) + n^3. Here a = 4, b = 2, so n^(log_b a) = n^2; f(n) = n^3.
CASE 3: f(n) = Ω(n^(2 + ε)) for ε = 1, and 4(n/2)^3 ≤ c·n^3 (reg. cond.) for c = 1/2. Therefore T(n) = Θ(n^3).

61 Examples
EX. T(n) = 4T(n/2) + n^3. Here a = 4, b = 2, so n^(log_b a) = n^2; f(n) = n^3.
CASE 3: f(n) = Ω(n^(2 + ε)) for ε = 1, and 4(n/2)^3 ≤ c·n^3 (reg. cond.) for c = 1/2. Therefore T(n) = Θ(n^3).
EX. T(n) = 4T(n/2) + n^2/lg n. Here a = 4, b = 2, so n^(log_b a) = n^2; f(n) = n^2/lg n.
The master method does not apply. In particular, for every constant ε > 0, we have n^ε = ω(lg n).

62 Idea of master theorem
Recursion tree: [the root costs f(n) and has a children, each costing f(n/b); each of those has a children costing f(n/b^2); and so on, down to Θ(1) leaves.]

63 Idea of master theorem
Recursion tree: [as above; the per-level sums are f(n) at the root, a·f(n/b) at the next level, a^2·f(n/b^2) at the next, and so on.]

64 Idea of master theorem
Recursion tree: [as above; the height of the tree is h = log_b n.]

65 Idea of master theorem
Recursion tree: [as above; height h = log_b n.] The number of leaves is a^h = a^(log_b n) = n^(log_b a), so the leaves contribute n^(log_b a) · Θ(1) = Θ(n^(log_b a)).

66 Idea of master theorem
Recursion tree: [as above.] CASE 1: The weight increases geometrically from the root to the leaves. The leaves hold a constant fraction of the total weight. Total: Θ(n^(log_b a)).

67 Idea of master theorem
Recursion tree: [as above.] CASE 2 (k = 0): The weight is approximately the same on each of the log_b n levels. Total: Θ(n^(log_b a) · lg n).

68 Idea of master theorem
Recursion tree: [as above.] CASE 3: The weight decreases geometrically from the root to the leaves. The root holds a constant fraction of the total weight. Total: Θ(f(n)).

69 Appendix: geometric series
n  1  xn1 2  1  x  x L x for x  1 1  x 1  x  x2 L  1 for |x| < 1 1  x Return to last slide viewed. September 12, 2005 Copyright © Erik D. Demaine and Charles E. Leiserson L2.57

