CHAPTER 1: Algorithm Analysis
Compiled by: Dr. Mohammad Omar Alhawarat
What is an Algorithm?
An algorithm is a set of clear, unambiguous instructions for solving a problem. Algorithms are usually written in an English-like notation called pseudocode. Algorithms are not programs!
Why Does Performance Matter?
Suppose N = 10^6 and a PC can read/process N records in 1 second. If some algorithm performs N×N computations, it needs 10^12 operations, i.e. 10^6 seconds ≈ 11.6 days!
For the 100-city Traveling Salesman Problem, a supercomputer checking 100 billion tours per second would still require on the order of 10^100 years!
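The arithmetic behind the first claim can be checked directly; a minimal sketch (the rate of 10^6 records per second is the slide's assumption):

```python
# Slide's assumption: the machine processes 1e6 records per second.
RATE = 10**6           # operations per second
N = 10**6              # input size

linear_ops = N         # an O(N) algorithm touches each record once
quadratic_ops = N * N  # an O(N^2) algorithm performs N*N operations

linear_seconds = linear_ops / RATE        # 1 second
quadratic_seconds = quadratic_ops / RATE  # 1,000,000 seconds
quadratic_days = quadratic_seconds / (60 * 60 * 24)

print(linear_seconds)   # 1.0
print(quadratic_days)   # about 11.57 days
```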
Factors Affecting the Choice of an Algorithm
- Running time (speed)
- Storage usage
- Graphical user interface (GUI) / ease of use
- Security
- Maintenance
- Portability
- Open source
Algorithm Performance
Performance = efficiency = complexity. The performance of an algorithm is usually measured by one of two approaches: the empirical method or the analytical method.
Empirical Method
Suppose we have two algorithms, A and B, that both solve problem X. To compare them empirically, we implement each as a program in some programming language, run both programs on the same inputs, and see which one runs faster.
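A minimal sketch of such an empirical comparison in Python (the two candidate functions, `sum_loop` and `sum_formula`, are hypothetical stand-ins for algorithms A and B):

```python
import time

def sum_loop(n):
    """Algorithm A: sum 1..n by iterating (O(n))."""
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

def sum_formula(n):
    """Algorithm B: sum 1..n by closed formula (O(1))."""
    return n * (n + 1) // 2

def timed(f, n):
    """Run f(n) and return (result, elapsed seconds)."""
    start = time.perf_counter()
    result = f(n)
    return result, time.perf_counter() - start

n = 10**6
res_a, t_a = timed(sum_loop, n)
res_b, t_b = timed(sum_formula, n)
assert res_a == res_b          # both must solve the same problem
print(f"A: {t_a:.4f}s  B: {t_b:.4f}s")
```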
Why Is the Empirical Method Not Feasible?
Because when a program runs on a real computer, many external factors affect its speed:
- CPU speed
- RAM size
- Hard disk drive (HDD)
- Graphics card
- Operating system (OS)
- Compiler
- Environment (backup jobs, antivirus, etc.)
Analytical Method
In this method, each algorithm is mapped to a mathematical function. The functions are then compared in terms of their growth rates to decide which algorithm is better. First, some mathematical review is needed, covering exponents, logarithms, and the concept of growth rate.
Mathematical Review: Exponents
X^0 = 1 (by definition)
X^a · X^b = X^(a+b)
X^a / X^b = X^(a−b)
(X^a)^b = X^(ab)
Exercise: show that X^(−n) = 1 / X^n.
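The exercise follows directly from the quotient rule above:

```latex
X^{-n} = X^{0-n} = \frac{X^{0}}{X^{n}} = \frac{1}{X^{n}}
```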
Mathematical Review: Logarithms
log_a X = Y  ⟺  a^Y = X, where a > 0 and X > 0.
Example: log_2 8 = 3, because 2^3 = 8.
log_a 1 = 0, because a^0 = 1.
Conventions: log X means log_2 X; lg X means log_10 X; ln X means log_e X, where e is the natural number.
Mathematical Review: Logarithms
log_a(XY) = log_a X + log_a Y
log_a(X/Y) = log_a X − log_a Y
log_a(X^n) = n · log_a X
log_a b = (log_2 b) / (log_2 a)   (change of base)
a^(log_a x) = x
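These identities can be spot-checked numerically; a small sketch using Python's standard `math` module:

```python
import math

a, x, y, n = 2.0, 8.0, 4.0, 3

# log_a(XY) = log_a X + log_a Y
assert math.isclose(math.log(x * y, a), math.log(x, a) + math.log(y, a))
# log_a(X/Y) = log_a X - log_a Y
assert math.isclose(math.log(x / y, a), math.log(x, a) - math.log(y, a))
# log_a(X^n) = n * log_a X
assert math.isclose(math.log(x**n, a), n * math.log(x, a))
# change of base: log_a b = log_2 b / log_2 a
b = 10.0
assert math.isclose(math.log(b, a), math.log2(b) / math.log2(a))
# a^(log_a x) = x
assert math.isclose(a ** math.log(x, a), x)
print("all identities hold")
```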
Typical Function Families (figure)
Typical Functions: Examples (figure)
Typical Functions: Examples (figure comparing the curves of n^2, n log n, n^3, and 2^n)
Complexity & Computability
Note: assume the computer performs 1 billion operations per second.
Growth Rate: Cross-over Point (figure)
Typical Growth Rates
c        Constant, written O(1)
log N    Logarithmic
N        Linear
N log N  Linearithmic
N^2      Quadratic
N^3      Cubic
2^N      Exponential
N!       Factorial
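To get a feel for how differently these families scale, a quick sketch tabulating several of them for a few values of N:

```python
import math

def growth_table(ns):
    """Tabulate a few typical growth-rate functions for each n in ns."""
    rows = []
    for n in ns:
        rows.append({
            "N": n,
            "log N": round(math.log2(n), 1),
            "N log N": round(n * math.log2(n)),
            "N^2": n**2,
            "N^3": n**3,
            "2^N": 2**n,
        })
    return rows

for row in growth_table([4, 16, 64]):
    print(row)   # 2^N quickly dwarfs every polynomial column
```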
Function Growth
lim_{n→∞} n = ∞
lim_{n→∞} n^a = ∞, for a > 0
lim_{n→∞} 1/n = 0
lim_{n→∞} 1/n^a = 0, for a > 0
lim_{n→∞} log n = ∞
lim_{n→∞} a^n = ∞, for a > 1
Function Growth: Limit Rules
lim (f(x) + g(x)) = lim f(x) + lim g(x)
lim (f(x) · g(x)) = lim f(x) · lim g(x)
lim (f(x) / g(x)) = lim f(x) / lim g(x), provided lim g(x) ≠ 0
lim (f(x) / g(x)) = lim (f′(x) / g′(x))   (L'Hôpital's rule, for indeterminate forms)
Examples
lim_{n→∞} n / n^2 = 0
lim_{n→∞} n^2 / n = ∞
lim_{n→∞} n^2 / n^3 = 0
lim_{n→∞} n^3 / n^2 = ∞
lim_{n→∞} n / ((n+1)/2) = 2
Classifying Functions by Their Asymptotic Growth
Asymptotic growth: the rate at which a function grows as its argument tends to infinity. Given a particular differentiable function f(n), every other differentiable function falls into one of three classes relative to f(n): growing at the same rate, growing faster, or growing slower.
Asymptotic Notations in Algorithm Analysis
Classifying functions by their asymptotic growth:
- Little-oh (o) and Little-omega (ω)
- Theta (Θ)
- Big-Oh (O) and Big-Omega (Ω)
Theta
f(n) and g(n) have the same rate of growth if
lim_{n→∞} f(n) / g(n) = c, for some constant c with 0 < c < ∞.
Notation: f(n) = Θ(g(n)), pronounced "theta".
Little-oh
f(n) grows slower than g(n) (equivalently, g(n) grows faster than f(n)) if
lim_{n→∞} f(n) / g(n) = 0.
Notation: f(n) = o(g(n)), pronounced "little-oh".
Little-omega
f(n) grows faster than g(n) (equivalently, g(n) grows slower than f(n)) if
lim_{n→∞} f(n) / g(n) = ∞.
Notation: f(n) = ω(g(n)), pronounced "little-omega".
Little-omega and Little-oh
If g(n) = o(f(n)), then f(n) = ω(g(n)).
Example: compare n and n^2.
lim_{n→∞} n / n^2 = 0, so n = o(n^2).
lim_{n→∞} n^2 / n = ∞, so n^2 = ω(n).
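The defining limits can be illustrated numerically; a sketch evaluating the two ratios at growing n:

```python
def ratio(f, g, n):
    """Evaluate f(n) / g(n) at a single point n."""
    return f(n) / g(n)

f = lambda n: n        # f(n) = n
g = lambda n: n * n    # g(n) = n^2

for n in [10, 1000, 100000]:
    # f/g shrinks toward 0 (f = o(g)); g/f grows without bound (g = omega(f))
    print(n, ratio(f, g, n), ratio(g, f, n))
```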
Algorithms with the Same Complexity
Two algorithms have the same complexity if the functions representing their numbers of operations have the same rate of growth. Among all functions with the same rate of growth, we choose the simplest one to represent the complexity.
Examples
Compare n and (n+1)/2:
lim_{n→∞} n / ((n+1)/2) = 2, so they have the same rate of growth.
(n+1)/2 = Θ(n): the rate of growth of a linear function.
Examples
Compare n^2 and n^2 + 6n:
lim_{n→∞} n^2 / (n^2 + 6n) = 1, so they have the same rate of growth.
n^2 + 6n = Θ(n^2): the rate of growth of a quadratic function.
Examples
Compare log n and log n^2:
lim_{n→∞} (log n) / (log n^2) = 1/2, so they have the same rate of growth (note that log n^2 = 2 log n).
log n^2 = Θ(log n): a logarithmic rate of growth.
Examples
Θ(n^3):   n^3,   5n^3 + 4n,   105n^3 + 4n^2 + 6n
Θ(n^2):   n^2,   5n^2 + 4n + 6,   n^2 + 5
Θ(log n): log n,   log n^2,   log(n + n^3)
Comparing Functions
Same rate of growth: g(n) = Θ(f(n)).
Different rates of growth: either
- g(n) = o(f(n)): g(n) grows slower than f(n), and hence f(n) = ω(g(n)); or
- g(n) = ω(f(n)): g(n) grows faster than f(n), and hence f(n) = o(g(n)).
Big-Oh Notation
f(n) = O(g(n)) if f(n) grows at the same rate as or slower than g(n), i.e. if
f(n) = Θ(g(n)) or f(n) = o(g(n)).
Example
n + 5 = Θ(n) = O(n) = O(n^2) = O(n^3) = O(n^5)
The closest estimate is n + 5 = Θ(n), but the general practice is to use Big-Oh notation: n + 5 = O(n).
Big-Omega Notation
Big-Omega (Ω) is the inverse of Big-Oh: if g(n) = O(f(n)), then f(n) = Ω(g(n)).
f(n) = Ω(g(n)) means f(n) grows faster than, or at the same rate as, g(n).
Big-Oh Rules
Disregard lower-order terms and keep only the highest-order term:
n^2 + n = O(n^2)
1000 + n^2 + n log(n) = O(n^2)
Disregard constant coefficients of the highest-order term:
5n^2 + 3n = O(n^2)
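Both rules can be illustrated by checking that the ratio of each function to its claimed order approaches a finite, nonzero constant as n grows; a sketch:

```python
import math

# (function, claimed order) pairs from the rules above.
# f(n) / g(n) should approach a finite, nonzero constant as n grows.
cases = [
    (lambda n: n**2 + n,                       lambda n: n**2),  # ratio -> 1
    (lambda n: 1000 + n**2 + n * math.log2(n), lambda n: n**2),  # ratio -> 1
    (lambda n: 5 * n**2 + 3 * n,               lambda n: n**2),  # ratio -> 5
]

for f, g in cases:
    ratios = [f(n) / g(n) for n in (10**3, 10**6)]
    print(ratios)   # each pair settles toward its constant
```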
Problems
N^2 = O(N^2)   true
2N  = O(N^2)   true
N   = O(N^2)   true
N^2 = O(N)     false
2N  = O(N)     true
N   = O(N)     true
Problems
N^2 = Θ(N^2)   true
2N  = Θ(N^2)   false
N   = Θ(N^2)   false
N^2 = Θ(N)     false
2N  = Θ(N)     true
N   = Θ(N)     true
Examples
T(N) = 6N + 4: take n0 = 4 and c = 7 with f(N) = N; then T(N) = 6N + 4 ≤ 7N for all N ≥ 4, so T(N) = O(N).
Similarly, 7N + 4 = O(N) and 15N + 20 = O(N).
Exercises:
- N^2 = O(N)?  N log N = O(N)?  N log N = O(N^2)?  N^2 = O(N log N)?  N^10 = O(2^N)?
- 6N + 4 = Ω(N)? Ω(7N)? Ω(N + 4)? Ω(N^2)? Ω(N log N)?  N log N = Ω(N^2)?
- 3 = O(1), and 1000000 = O(1).  Σ i = O(N)?
(The slide also showed a figure of T(N), f(N), and c·f(N) with cross-over point n0, illustrating T(N) = O(f(N)).)
Worst Case, Best Case, and Average Case
When sorting a list of numbers in ascending order, one of the following cases applies:
- Worst case: the list is sorted in descending order, so the maximum number of swaps is performed.
- Average case: the list is in arbitrary order, so an average number of swaps is performed.
- Best case: the list is already sorted in ascending order, so the minimum number of swaps is performed.
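These cases can be made concrete by counting swaps in a simple sort; a sketch using bubble sort (the slide does not name a specific sorting algorithm, so this choice is illustrative):

```python
def bubble_sort_swaps(a):
    """Bubble-sort a copy of `a` and return the number of swaps performed."""
    a = list(a)
    swaps = 0
    for i in range(len(a) - 1):
        for j in range(len(a) - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swaps += 1
    return swaps

n = list(range(10))
print(bubble_sort_swaps(n))        # best case: ascending input -> 0 swaps
print(bubble_sort_swaps(n[::-1]))  # worst case: descending -> n(n-1)/2 = 45 swaps
```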
Big-Oh Definition
f(N) = O(g(N)) if there exist positive constants c and n0 such that f(N) ≤ c·g(N) for all N ≥ n0.
Mathematical stuff (figure)
Examples for Big-Oh: A Loop

sum = 0;
for( i = 0; i < n; i++ )
    sum = sum + i;

The running time is O(n).
Examples for Big-Oh: Nested Loops

sum = 0;
for( i = 0; i < n; i++ )
    for( j = 0; j < n; j++ )
        sum++;

The running time is O(n^2).
Examples for Big-Oh: Sequential Statements

sum = 0;                 // O(n)
for( i = 0; i < n; i++ )
    sum = sum + i;

sum = 0;                 // O(n^2)
for( i = 0; i < n; i++ )
    for( j = 0; j < 2*n; j++ )
        sum++;

The running time of the sequence is the maximum of the two: O(n^2).
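The counts behind these annotations can be verified by instrumenting the loops; a Python sketch translated from the slides' C-like pseudocode:

```python
def count_sequential(n):
    """Count inner-statement executions of the two sequential fragments."""
    ops = 0
    for i in range(n):          # first fragment: n operations
        ops += 1
    for i in range(n):          # second fragment: n * 2n = 2n^2 operations
        for j in range(2 * n):
            ops += 1
    return ops

# Total is n + 2n^2; the 2n^2 term dominates, so the sequence is O(n^2).
print(count_sequential(100))   # 20100
```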
Examples for Big-Oh: If Statement

if( C )
    S1;
else
    S2;

The running time is the maximum of the running times of S1 and S2.
More Examples

O(n^3):
sum = 0;
for( i = 0; i < n; i++ )
    for( j = 0; j < n*n; j++ )
        sum++;
More Examples

O(n^2):
sum = 0;
for( i = 0; i < n; i++ )
    for( j = 0; j < i; j++ )
        sum++;

The inner statement runs 0 + 1 + ... + (n−1) = n(n−1)/2 times.
More Examples

O(n^3 log n):
for( j = 0; j < n*n; j++ )
    compute_val(j);

The complexity of compute_val(x) is given to be O(n log n); the loop body runs n^2 times, so the total is O(n^2 · n log n) = O(n^3 log n).
More Examples

for( i = 0; i < n; i++ )
    for( j = 0; j < m; j++ )
        if( a[i][j] == x )
            return 1;
return -1;

The running time is O(n·m) in the worst case (when x is not found).