1 Big-O: Analyzing Algorithms Asymptotically. CS 226, 5 February 2002. Noah Smith (nasmith@cs)

2 Comparing Algorithms Should we use Program 1 or Program 2? Is Program 1 “fast”? “Fast enough”?

3 You and Igor: the empirical approach Implement each candidate. Run it. Time it. That could be lots of work, and it’s error-prone. Which inputs? What machine/OS?

4 Toward an analytic approach … Today is just math: how to solve “which algorithm” problems without machines, test data, or Igor!

5 The Big Picture [diagram: inputs of size n = 3, n = 4, and n = 8 fed to an algorithm, each producing an output] How long does it take for the algorithm to finish?

6 Primitives Primitive operations: x = 4 (assignment); ... x + 5 ... (arithmetic); if (x < y) (comparison); x[4] (index an array); *x (dereference, in C); x.foo() (calling a method). Others: new/malloc (memory usage).

7 How many foos? for (j = 1; j <= N; ++j) { foo(); } — Σ_{j=1}^{N} 1 = N

8 How many foos? for (j = 1; j <= N; ++j) { for (k = 1; k <= M; ++k) { foo(); } } — Σ_{j=1}^{N} Σ_{k=1}^{M} 1 = NM

9 How many foos? for (j = 1; j <= N; ++j) { for (k = 1; k <= j; ++k) { foo(); } } — Σ_{j=1}^{N} Σ_{k=1}^{j} 1 = Σ_{j=1}^{N} j = N(N + 1)/2

10 How many foos? for (j = 0; j < N; ++j) { for (k = 0; k < j; ++k) { foo(); } } — N(N − 1)/2. for (j = 0; j < N; ++j) { for (k = 0; k < M; ++k) { foo(); } } — NM

11 How many foos? void foo(int N) { if (N <= 2) return; foo(N / 2); } — T(0) = T(1) = T(2) = 1; T(n) = 1 + T(n/2) if n > 2. Unrolling: T(n) = 1 + (1 + T(n/4)) = 2 + T(n/4) = 2 + (1 + T(n/8)) = 3 + T(n/8) = 3 + (1 + T(n/16)) = 4 + T(n/16) … ≈ log₂ n

12 The trick a·n^k + b·n^(k−1) + … + y·n + z → a·n^k → n^k: keep the dominant term, then drop its constant coefficient.

13 Big O Definition: Let f and g be functions mapping ℕ to ℝ. We say that f(n) is O(g(n)) if there exist c ∈ ℝ, c > 0, and n₀ ∈ ℕ, n₀ ≥ 1, such that f(n) ≤ c·g(n) for all n ∈ ℕ, n ≥ n₀.

14 Example 1 [plot: f(n) vs. g(n), with threshold n₀]

15 Example 2 [plot: h(n) vs. g(n), with threshold n₀]

16 Example 3 [plot: h(n) vs. k(n)]

17 [plot: k(n) vs. h(n)]

18 [plot: h(n) vs. k(n)]

19 Example 4 [plot: 3n² vs. n²]

20 Some complexity classes … Constant O(1); Logarithmic O(log n); Linear O(n); Quadratic O(n²); Cubic O(n³); Polynomial O(n^p); Exponential O(aⁿ).

21 Don’t be confused … We typically say “f(n) is O(g(n))” or “f(n) = O(g(n))”. But O(g(n)) is really a set of functions, so it might be more clear to say f(n) ∈ O(g(n)). But I don’t make the rules. Crystal clear: “f(n) is order g(n).”

22 Intuitively … To say f(n) is O(g(n)) is to say that f(n) is “less than or equal to” g(n). We also have (G&T pp. 118–120): Θ(g(n)), Big Theta, “equal to”; Ω(g(n)), Big Omega, “greater than or equal to”; o(g(n)), little o, “strictly less than”; ω(g(n)), little omega, “strictly greater than”.

23 Big-Omega and Big-Theta Ω is just like O except that f(n) ≥ c·g(n); f(n) is O(g(n)) ⇔ g(n) is Ω(f(n)). Θ is both O and Ω (and the constants need not match); f(n) is O(g(n)) and f(n) is Ω(g(n)) ⇔ f(n) is Θ(g(n)).

24 little o Definition: Let f and g be functions mapping ℕ to ℝ. We say that f(n) is o(g(n)) if for any c ∈ ℝ, c > 0, there exists n₀ ∈ ℕ, n₀ > 0, such that f(n) ≤ c·g(n) for all n ∈ ℕ, n ≥ n₀. (Little omega, ω, is the same but with ≥.)

25 Multiple variables for (j = 1; j <= N; ++j) for (k = 1; k <= N; ++k) for (l = 1; l <= M; ++l) foo(); — N·N·M calls. for (j = 1; j <= N; ++j) for (k = 1; k <= M; ++k) for (l = 1; l <= M; ++l) foo(); — N·M·M calls. Together: O(N²M + NM²)

26 Multiple primitives for (j = 1; j <= N; ++j) { sum += A[j]; for (k = 1; k <= M; ++k) { sum2 += B[j][k]; C[j][k] = B[j][k] * A[j] + 1; for (l = 1; l <= k; ++l) B[j][k] -= B[j][l]; } }

27 Tradeoffs: an example [diagram: storing all pairs (0, 0), (0, 1), (0, 2), … (N, N) in a full table at one extreme, versus a minimal representation at the other; lots of options in between]

28 Another example I have a set of integers between 0 and 1,000,000. I need to store them, and I want O(1) lookup, insertion, and deletion. Constant time and constant space, right? [diagram: array indexed 0, 1, 2, 3, … 1,000,000]

29 Big-O and Deceit Beware huge coefficients. Beware key lower order terms. Beware when n is “small”.

30 Does it matter? Assume 1 ms per operation.

f(n)       time for n = 1000     max n in one day
n          1 second              86,400,000
n log₂ n   10 seconds            3,943,234
n²         17 minutes            9,295
n³         12 days               442
n⁴         32 years              96
n¹⁰        3.17 × 10¹⁹ years     6
2ⁿ         1.07 × 10³⁰¹ years    26

31 Worst, best, and average Gideon is a fast runner … up hills. … down hills. … on flat ground. Gideon is the fastest swimmer … on the JHU team. … in molasses. … in our research lab. … in a 5-yard race. … on Tuesdays. … in an average race.

32 What’s average? Strictly speaking, average (mean) is relative to some probability distribution: mean(X) = Σₓ Pr(x)·x. Unless you have some notion of a probability distribution over test cases, it’s hard to talk about average requirements.

33 Now you know … How to analyze the run-time (or space requirements) of a piece of pseudo-code. Some new uses for Greek letters. Why the order of an algorithm matters. How to avoid some pitfalls.

