CHAPTER 2: ANALYSIS OF ALGORITHMS, Part 1
Big Oh and Other Notations
- Introduction
- Classifying functions by their asymptotic growth
- Theta, little oh, little omega
- Big Oh, Big Omega
- Rules to manipulate Big-Oh expressions
- Typical growth rates
Introduction
The work done by an algorithm, i.e. its complexity, is determined by the number of basic operations necessary to solve the problem. The number of basic operations depends on the size of the input.
Basic Operations – Example 1
Problem: find x in an array.
Basic operation: comparison of x with an entry of the array.
Size of input: the number of elements in the array.
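The counting of the basic operation can be made concrete in code. The sketch below (a hypothetical helper, not from the slides) performs a linear search and counts every comparison it makes:

```python
def linear_search(arr, x):
    """Return (index of x, comparison count), or (-1, comparison count)."""
    comparisons = 0
    for i, entry in enumerate(arr):
        comparisons += 1          # basic operation: compare x with an entry
        if entry == x:
            return i, comparisons
    return -1, comparisons

# Worst case: x is absent, so every one of the N elements is compared once.
idx, ops = linear_search([4, 8, 15, 16], 99)
```

In the worst case the number of comparisons equals the number of elements, which is exactly the F(N) the following slides analyze.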
Basic Operations – Example 2
Problem: sort an array of numbers.
Basic operation: comparison of two array entries, plus moving elements in the array.
Size of input: the number of elements in the array.
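As a sketch of counting both kinds of basic operation (assuming selection sort as the concrete algorithm, which the slide does not specify), comparisons and moves can be tallied explicitly:

```python
def selection_sort(arr):
    """Sort a copy of arr; return (sorted list, comparisons, moves)."""
    a = list(arr)
    comparisons = moves = 0
    n = len(a)
    for i in range(n - 1):
        min_idx = i
        for j in range(i + 1, n):
            comparisons += 1              # basic operation: compare two entries
            if a[j] < a[min_idx]:
                min_idx = j
        if min_idx != i:
            a[i], a[min_idx] = a[min_idx], a[i]
            moves += 1                    # basic operation: move elements
    return a, comparisons, moves
```

For n elements this always makes n(n-1)/2 comparisons, so F(N) here grows quadratically with the input size.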
The Task
Determine how the number of operations depends on the size of the input:
N - size of input
F(N) - number of operations
Asymptotic Growth of Functions
Asymptotic growth: the rate at which a function grows. Given a particular function f(n), the functions we compare it against fall into three classes: growing at the same rate, growing faster, or growing slower.
Asymptotic Growth of Functions
The rate of growth is determined by the highest-order term of the function.
Examples:
n^3 + n^2 grows faster than n^2 + 1000, n log(n), ...
n^3 + n^2 has the same rate of growth as n^3, n^3 + n, ...
n^3 + n^2 grows slower than n^4 + n, n^3 log(n), ...
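One informal way to see these three classes (a numeric sketch, not a proof) is to look at the ratio f(n)/g(n) for large n: it approaches a positive constant when the rates match, grows without bound when f grows faster, and shrinks toward 0 when f grows slower.

```python
f = lambda n: n**3 + n**2

n = 10**6
r_same   = f(n) / (n**3)          # same rate: ratio approaches 1
r_faster = f(n) / (n**2 + 1000)   # f grows faster: ratio is huge
r_slower = f(n) / (n**4 + n)      # f grows slower: ratio approaches 0
```

This is the same observation as "the highest-order term dominates": the lower-order n^2 contributes almost nothing to the ratio at large n.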
Theta: Same Rate of Growth
f(n) = Θ(g(n)) if f(n) and g(n) are of the same order.
Examples:
10n^3 + n^2 = Θ(n^3) = Θ(5n^3 + n) = Θ(n^3 + log(n))
Note: constant coefficients can be disregarded.
Little oh: Lower Rate of Growth
f(n) = o(g(n)) if f(n) has a lower rate of growth, i.e. f(n) is of lower order than g(n).
Examples:
10n^3 + n^2 = o(n^4)
10n^3 + n^2 = o(5n^5 + n)
10n^3 + n^2 = o(n^6 + log(n))
Little omega: Higher Rate of Growth
f(n) = ω(g(n)) if f(n) has a higher rate of growth, i.e. f(n) is of higher order than g(n).
Examples:
10n^3 + n^2 = ω(n^2)
10n^3 + n^2 = ω(5n^2 + n)
10n^3 + n^2 = ω(n + log(n))
Little oh and Little omega
If g(n) = o(f(n)), then f(n) = ω(g(n)).
Example: compare n and n^2:
n = o(n^2)
n^2 = ω(n)
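The duality can be checked numerically with the same ratio idea (a quick sketch): n/n^2 shrinks toward 0, which is what n = o(n^2) asserts, while the flipped ratio n^2/n grows without bound, which is what n^2 = ω(n) asserts.

```python
# As n grows, n/n**2 = 1/n tends to 0 and n**2/n = n tends to infinity.
for n in (10, 1_000, 100_000):
    print(n, n / n**2, n**2 / n)
```

The two statements are the same fact viewed from either side of the ratio.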
The Big-Oh Notation
f(n) = O(g(n)) if f(n) grows at the same rate as or slower than g(n), i.e. f(n) = Θ(g(n)) or f(n) = o(g(n)).
Example
Both n + 5 = Θ(n) and n + 5 = O(n) are true. The closest estimate is n + 5 = Θ(n), but the general practice is to use the Big-Oh notation: n + 5 = O(n).
The Big-Omega Notation
The inverse of Big-Oh is Ω: if g(n) = O(f(n)), then f(n) = Ω(g(n)).
f(n) = Ω(g(n)) means f(n) grows faster than or at the same rate as g(n).
Example: n^2 = O(n^3), and n^3 = Ω(n^2).
Rules to Manipulate Big-Oh Expressions
Rule 1a: If T1(N) = O(f(N)) and T2(N) = O(g(N)), then
T1(N) + T2(N) = max(O(f(N)), O(g(N)))
Rules to Manipulate Big-Oh Expressions
Rule 1b: If T1(N) = O(f(N)) and T2(N) = O(g(N)), then
T1(N) * T2(N) = O(f(N) * g(N))
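Rule 1b is what makes nested loops multiply: if the outer loop runs O(f(N)) times and each iteration costs O(g(N)), the total is O(f(N) * g(N)). A minimal sketch:

```python
def count_ops(n):
    """Count the basic operations executed by two nested O(n) loops."""
    ops = 0
    for i in range(n):        # outer loop: runs n times
        for j in range(n):    # inner loop: runs n times per outer iteration
            ops += 1          # one basic operation
    return ops

# Total is n * n operations, i.e. O(n) * O(n) = O(n**2).
```

Rule 1a corresponds to running the loops one after the other instead of nesting them: the slower of the two then dominates the sum.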
Rules to Manipulate Big-Oh Expressions
Rule 2: If T(N) is a polynomial of degree k, then T(N) = Θ(N^k), and in particular T(N) = O(N^k).
Rule 3: (log(N))^k = O(N) for any constant k.
Examples
O(n^2) + O(n) = O(n^2) - we disregard any lower-order term
O(n log(n)) + O(n) = O(n log(n))
O(n^2) + O(n log(n)) = O(n^2)
Examples
O(n^2) * O(n) = O(n^3)
O(n log(n)) * O(n) = O(n^2 log(n))
O(n^2 + n + 4) = O(n^2)
(log(n))^5 = O(n)
Typical Growth Rates
O(1)       constant
log N      logarithmic
(log N)^2  log-squared
N          linear
N log N
N^2        quadratic
N^3        cubic
2^N        exponential
N!         factorial
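The gap between these classes can be seen by evaluating each function at a modest N (a quick sketch; the exact numbers are only illustrative, and the exponential and factorial rows are omitted because they overflow any sensible table at this N):

```python
import math

N = 1024
rows = [
    ("constant",    1),
    ("log N",       math.log2(N)),
    ("(log N)^2",   math.log2(N) ** 2),
    ("N",           N),
    ("N log N",     N * math.log2(N)),
    ("N^2",         N ** 2),
    ("N^3",         N ** 3),
]
for name, value in rows:
    print(f"{name:10} {value:>16,.0f}")
```

Each row dominates the one above it, and the gaps widen as N grows, which is why an algorithm's growth class matters far more than its constant factors.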
Problems
N^2 = O(N^2)   true
2N = O(N^2)    true
N = O(N^2)     true
N^2 = O(N)     false
2N = O(N)      true
N = O(N)       true
Problems
N^2 = Θ(N^2)   true
2N = Θ(N^2)    false
N = Θ(N^2)     false
N^2 = Θ(N)     false
2N = Θ(N)      true
N = Θ(N)       true