Intro to Analysis of Algorithms
Algorithm
“A sequence of unambiguous instructions for solving a problem, i.e., for obtaining a required output for any legitimate input in a finite amount of time.”
Named for al-Khwarizmi, who laid out basic procedures for arithmetic functions. (Read about him!)
Analysis of Algorithms
Correctness, Generality, Optimality, Simplicity, Time Efficiency, Space Efficiency
Measuring Efficiency
What is the basic unit to measure input size? (n)
What is the basic unit of resource?
– Time: basic operations
– Space: memory units
Best, worst, or average case?
Find its efficiency class
Why do we care? Let’s look at Fibonacci numbers: 1, 1, 2, 3, 5, 8, 13, …
Fibonacci Sequence
We want to compute the nth number in the sequence. (F(3) = 2, for example.)
The sequence is defined by F(n) = F(n-1) + F(n-2), with F(0) = 0 and F(1) = 1. This definition can be translated directly into code – a recursive method. How many additions does it take to compute F(n)?
Which is better?

function fib2(n)
    create an array f[0..n]
    f[0] = 0, f[1] = 1
    for i = 2...n: f[i] = f[i-1] + f[i-2]
    return f[n]

function fib1(n)
    if n = 0: return 0
    if n = 1: return 1
    return fib1(n-1) + fib1(n-2)
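In Python, the two versions might look like this (a direct translation of the pseudocode above, not code from the original slides):

def fib1(n):
    # Naive recursion straight from the definition: exponential time.
    if n == 0: return 0
    if n == 1: return 1
    return fib1(n - 1) + fib1(n - 2)

def fib2(n):
    # Bottom-up: fill a table of the first n+1 values, O(n) additions.
    f = [0] * (n + 1)
    if n >= 1:
        f[1] = 1
    for i in range(2, n + 1):
        f[i] = f[i - 1] + f[i - 2]
    return f[n]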
Consider calculating F(200). The fib1 method takes over 2^138 steps. Computers can do several billion instructions per second. Suppose we have a supercomputer that does 40 trillion instructions per second. Even on this machine, fib1(200) takes at least 2^92 seconds, or 10^18 centuries, long after the expected end of our sun! But fib2(200) would take less than a billionth of a second to compute!
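A back-of-the-envelope check of those figures (my own arithmetic, using the slide's numbers):

# Rough check: 2^138 steps at 40 trillion instructions per second.
steps = 2 ** 138                   # lower bound on fib1(200)'s step count
per_second = 40 * 10 ** 12         # 40 trillion instructions/second
seconds = steps / per_second       # roughly 8.7e27 seconds, i.e. more than 2^92
centuries = seconds / (100 * 365.25 * 24 * 3600)
print(f"about {seconds:.2e} seconds, {centuries:.2e} centuries")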
Runtimes at 10^6 instructions/sec (times in seconds unless noted)

N              O(log N)   O(N)       O(N log N)   O(N²)
10             0.000003   0.00001    0.000033     0.0001
100            0.000007   0.00010    0.000664     0.1000
1,000          0.000010   0.00100    0.010000     1.0
10,000         0.000013   0.01000    0.132900     1.7 min
100,000        0.000017   0.10000    1.661000     2.78 hr
1,000,000      0.000020   1.0        19.9         11.6 day
1,000,000,000  0.000030   16.7 min   18.3 hr      318 centuries
Some helpful mathematics
1 + 2 + 3 + 4 + … + N
– N(N+1)/2 = N²/2 + N/2, which is O(N²)
N + N + N + … + N (a total of N times)
– N·N = N², which is O(N²)
1 + 2 + 4 + … + 2^N
– 2^(N+1) – 1 = 2 × 2^N – 1, which is O(2^N)
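These identities are easy to sanity-check numerically (my own illustration, not from the slides):

# Spot-check the three sums for N = 10.
N = 10
assert sum(range(1, N + 1)) == N * (N + 1) // 2               # 1 + 2 + ... + N
assert sum(N for _ in range(N)) == N * N                      # N added N times
assert sum(2 ** i for i in range(N + 1)) == 2 ** (N + 1) - 1  # geometric sum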
Basics of Efficiency
Big-oh – an upper bound on the efficiency class
Efficiency classes don't worry about constants
A cubic is worse than a quadratic, a quartic worse than a cubic…
Getting a big-oh analysis of non-recursive code is pretty easy
Maximum(A[1..n])
    max <- A[1]
    for i <- 1 to n
        if A[i] > max
            max <- A[i]
    return max

1. What is the input size?
2. What is the unit of time?
3. What is the big-oh analysis?
Maximum(A[1..n])
    max <- A[1]          1 assignment
    for i <- 1 to n      n times:
        if A[i] > max        1 compare
            max <- A[i]      maybe 1 assignment
                             1 addition to i
    return max           1 return
    ______________________
    1 + n(3) + 1 = 3n + 2
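One way to see the 3n + 2 count is to instrument the code; a Python sketch (my own, counting the "maybe" assignment pessimistically on every iteration):

def maximum_with_count(a):
    ops = 1                # max <- a[0]: 1 assignment
    m = a[0]
    for x in a:
        ops += 3           # 1 compare + (maybe) 1 assignment + 1 increment of i
        if x > m:
            m = x
    ops += 1               # 1 return
    return m, ops          # ops == 3 * len(a) + 2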
The algorithm is O(3n + 2), which is O(n). We only care about the efficiency class. Why? At some point, every parabola (n²) overtakes any line (n). We only really care about large input.
Efficiency classes
So we really just care about the leading term, which determines the shape of the graph. This means that for non-recursive algorithms, what matters is the loops.
Analyze this…
AllUnique(A[1..n])
    for i <- 1 to n
        for j <- 1 to n
            if A[i] = A[j] and i ≠ j
                return false
    return true

Best case? Worst case? Average case?
Average case: on average we quit about halfway through, after n·n/2 steps – still O(n²). Often, average case = worst case.
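A direct Python rendering of this quadratic version (my translation of the pseudocode):

def all_unique(a):
    # Check every ordered pair (i, j) with i != j: O(n^2) comparisons.
    n = len(a)
    for i in range(n):
        for j in range(n):
            if a[i] == a[j] and i != j:
                return False
    return True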
AllUnique(A[1..n])
    for i <- 1 to n
        for j <- i to n
            if A[i] = A[j] and i ≠ j
                return false
    return true

The inner loop now starts at i (the i ≠ j check keeps A[i] from matching itself), so the worst-case number of inner iterations is
n + (n-1) + (n-2) + … + 3 + 2 + 1 = n(n+1)/2
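The same idea in Python (my translation); counting the inner iterations confirms the triangular sum:

def all_unique_triangular(a):
    # Inner loop starts at i: n + (n-1) + ... + 1 = n(n+1)/2 iterations
    # in the worst case, which is still O(n^2).
    n = len(a)
    for i in range(n):
        for j in range(i, n):
            if a[i] == a[j] and i != j:
                return False
    return True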
MatrixMultiply(A[n×n], B[n×n])
    initialize empty C[n×n]
    for i <- 1 to n
        for j <- 1 to n
            C[i,j] <- 0
            for k <- 1 to n
                C[i,j] <- C[i,j] + A[i,k] · B[k,j]
    return C
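The same triple loop in runnable Python (a sketch using nested lists; no matrix library assumed):

def matrix_multiply(A, B):
    # Three nested loops over an n-by-n problem: O(n^3) multiplications.
    n = len(A)
    C = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            for k in range(n):
                C[i][j] += A[i][k] * B[k][j]
    return C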
MethodA(n)
    answer <- 1
    for i <- 1 to n/2: answer <- answer + i
    for i <- 1 to n: answer <- answer + 1
    return answer

MethodB(n)
    answer <- 1
    for i <- 1 to lg n: answer <- answer + MethodA(n)
    return answer
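A Python rendering (my translation): MethodA does about n/2 + n additions, so it is O(n); MethodB runs it lg n times, so it is O(n log n).

import math

def method_a(n):
    # n/2 + n additions: O(n).
    answer = 1
    for i in range(1, n // 2 + 1):
        answer += i
    for _ in range(n):
        answer += 1
    return answer

def method_b(n):
    # About lg n iterations, each doing O(n) work: O(n log n).
    answer = 1
    for _ in range(int(math.log2(n))):
        answer += method_a(n)
    return answer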
Big-oh Definition
Let f(n) and g(n) be functions from positive integers to positive reals. We say f = O(g) if there exists some constant c > 0 such that f(n) ≤ c·g(n) for all n.
Say what?
Big-O
f(n) = O(g(n)), think: f(n) ≤ g(n)
n = O(n²) because I can find a big enough constant to multiply the parabola by so that it lies completely above the line. I can't do that in reverse.
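A small numeric illustration (my own, not from the slides): c = 1 witnesses n = O(n²), while no constant works in the reverse direction.

# n = O(n^2): c = 1 already works for every n >= 1.
assert all(n <= 1 * n * n for n in range(1, 10000))

# n^2 is not O(n): for any fixed c, n*n > c*n as soon as n > c.
c = 10 ** 6
n = c + 1
assert n * n > c * n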
Know the shapes: constant, logarithmic, linear, quadratic, exponential
Theta Notation
Theta = tight bound
Multiply g by suitable constants and the results stay (asymptotically) above and below f.
What it means: if you have a parabola, you can always find parabolas that stay above and below the given parabola. The same goes for lines and other curves.
It just tells us the efficiency category.
Practice – True or False?
½n(n-1) = Θ(n)
if f(n) = Θ(g(n)), then g(n) = Θ(f(n))
n – 100000 = Θ(n)
O-notation
f = O(g) if f grows slower than or the same as g
That is, asymptotically, g is not below f
g is an upper bound
Think: O is “≤”
Practice – True or False?
n = O(n²)
n³ = O(n²)
0.00000001·n³ = O(n²)
100n + 5 = O(n²)
½n(n-1) = O(n²)
n⁴ + n + 1 = O(n²)
Ω-Notation
The opposite of big-oh: f = Ω(g) means f is always (asymptotically) above or equal to g. It gives an asymptotic lower bound.
Think: Ω is “≥”
Practice – True or False?
n³ = Ω(n²)
½n(n-1) = Ω(n²)
100n + 5 = Ω(n²)
o and ω
f = O(g) means g is above or equal to f
f = o(g) means g is strictly above f – there is a better, tighter bound
ω is analogous for Ω – there is a tighter bound available
Function Growth Rates

Say              Meaning                       Write
“big oh of g”    f is no faster than g (≤)     f = O(g)
“theta g”        f is about as fast as g (=)   f = Θ(g)
“omega g”        f is no slower than g (≥)     f = Ω(g)
Polynomials are Easy – What about other functions?
constant (1): very few examples of algorithms that don't grow with input size
logarithmic (lg n): usually the result of cutting the input size by a constant factor each time through the loop
linear (n): look at each input element a constant number of times
n lg n: divide and conquer
quadratic (n²): two nested loops
cubic (n³): three nested loops
exponential (2ⁿ): generate all subsets of the input elements
factorial (n!): generate all permutations of the input
nⁿ: generate all permutations of the input, allowing repetitions
Other standard functions
Polylog – log³n means (log n)³
Any log grows slower than any polynomial: log^a n = o(n^b) for any a, b > 0
Any polynomial grows slower than any exponential with base c > 1: n^b = o(cⁿ) for any c > 1
n! = o(nⁿ)
n! = ω(2ⁿ)
lg(n!) = Θ(n lg n)
Relative Growths – O, Ω, or Θ?
lg²n vs 2n
n⁴ + 3n vs 2ⁿ
√n vs n^(2/3)
4ⁿ vs 4^(n/2)
n + 2ⁿ vs 3ⁿ
100n + lg n vs n + (lg n)²
lg(n²) vs lg(n³)