Asymptotic Notations Dr. Munesh Singh.


Problem generalization

Asymptotic Notations
O notation - upper bound on growth (used for the worst-case complexity of an algorithm)
Ω notation - lower bound on growth (used for the best-case complexity of an algorithm)
Θ notation - tight bound on growth (used for the average-case complexity of an algorithm)
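For reference, the standard formal definitions behind these three notations (added here for completeness; the constants c and n0 are the usual witnesses, not part of the slide):

f(n) \in O(g(n))      \iff \exists\, c > 0,\ n_0 : 0 \le f(n) \le c\,g(n) \ \text{for all } n \ge n_0
f(n) \in \Omega(g(n)) \iff \exists\, c > 0,\ n_0 : 0 \le c\,g(n) \le f(n) \ \text{for all } n \ge n_0
f(n) \in \Theta(g(n)) \iff f(n) \in O(g(n)) \ \text{and} \ f(n) \in \Omega(g(n))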

Few series
Linear series (arithmetic series): for n ≥ 0,  1 + 2 + 3 + ... + n = n(n+1)/2
Quadratic series: for n ≥ 0,  1^2 + 2^2 + 3^2 + ... + n^2 = n(n+1)(2n+1)/6

Few series [cont.]
Cubic series: for n ≥ 0,  1^3 + 2^3 + 3^3 + ... + n^3 = (n(n+1)/2)^2
Geometric series: for real x ≠ 1,  1 + x + x^2 + ... + x^n = (x^(n+1) - 1)/(x - 1); for |x| < 1, the infinite sum 1 + x + x^2 + ... = 1/(1 - x)

Few series [cont.]
Linear-geometric series: for n ≥ 0 and real c ≠ 1,  sum (i = 1 to n) of i*c^i = (n*c^(n+2) - (n+1)*c^(n+1) + c) / (c - 1)^2
Harmonic series: the nth harmonic number, for n ∈ Z+, is H_n = 1 + 1/2 + 1/3 + ... + 1/n ≈ ln n

How to calculate complexity
Normally the worst-case time complexity is the one that matters. If the worst-case bound (Big-O) matches the best-case bound (Big-Omega), we can also state the Big-Theta (average-case) bound. If nested loops are independent of each other, the complexity is the product of their iteration counts.

A()
{
    int i, j;
    for (i = 1 to n)        % outer loop runs n times
        for (j = 1 to n)    % inner loop runs n times
            pf("ravi");
}

Time complexity = n x n, so O(n^2)
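A minimal runnable C sketch of the same nested-loop count (the sample value of n and the step counter are illustrative, not from the slide):

#include <stdio.h>

int main(void) {
    int n = 1000;                 /* sample input size */
    long long steps = 0;

    for (int i = 1; i <= n; i++)      /* outer loop: n iterations */
        for (int j = 1; j <= n; j++)  /* inner loop: n iterations */
            steps++;                  /* stands in for pf("ravi") */

    /* steps == n * n, i.e. the O(n^2) behaviour claimed above */
    printf("n = %d, steps = %lld\n", n, steps);
    return 0;
}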

How to calculate (cont.)
If the loops depend on each other, we have to unroll them and work out the complexity from the pattern of the last loop variable, i.e.

A()
{
    int i = 1, s = 1;
    while (s <= n)
    {
        i++;
        s = s + i;
        pf("ravi");
    }
}

Unroll it:
i: 1  2  3   4   5   6  ...
s: 1  3  6  10  15  21  ...

How to calculate (cont.)
Here the loop variable does not grow by a fixed amount, so we assume the loop finishes after k iterations. Looking at the pattern of s and picking the matching series: after k iterations, s is the sum of the first k natural numbers, i.e. k(k+1)/2. The loop terminates once s grows beyond n, so
k(k+1)/2 > n
k^2 + k = 2n
k = O(√n)
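A small runnable C check of this bound (the sample n and the comparison with sqrt(2n) are added illustrations):

#include <math.h>
#include <stdio.h>

int main(void) {
    int n = 1000000;              /* sample input size */
    long long i = 1, s = 1, k = 0;

    while (s <= n) {              /* same loop as on the slide */
        i++;
        s = s + i;
        k++;                      /* count iterations instead of printing */
    }

    /* k is close to sqrt(2n), i.e. Theta(sqrt(n)) iterations */
    printf("n = %d, iterations = %lld, sqrt(2n) = %.1f\n", n, k, sqrt(2.0 * n));
    return 0;
}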

Types of algorithm
An algorithm can be iterative or recursive.

Iterative:
A()
{
    for i = 1 to n
        max(a, b);
}

Recursive:
A(n)
{
    if ()
        A(n/2);
}

Examples
A()
{
    int i;
    for (i = 1 to n)    % count the number of times the loop executes
        pf("ravi");
}
What will be the complexity of this algorithm?
Time complexity = O(n)

Example 3
Find out the time complexity:
A()
{
    int i = 1;
    for (i = 1; i^2 <= n; i++)   % count the number of times the loop runs
        pf("ravi");
}
Time complexity = O(√n) = Ω(√n) = Θ(√n)

Example 2
A()
{
    int i, j;
    for (i = 1 to n)        % count the number of times the loop runs
        for (j = 1 to n)    % count the number of times the loop runs
            pf("ravi");
}
What will be the time complexity?
Time complexity = O(n^2)

Example 4
A()
{
    int i, j, k, n;
    for (i = 1; i <= n; i++)
        for (j = 1; j <= i; j++)          % inner bound depends on i
            for (k = 1; k <= 100; k++)    % constant 100 iterations
                pf("ravi");
}

Unroll it (the loops are dependent):
i:            1      2      3      ...  n
j iterations: 1      2      3      ...  n
work:         1x100  2x100  3x100  ...  nx100

Total = 1x100 + 2x100 + 3x100 + ... + nx100
      = 100(1 + 2 + 3 + ... + n)
      = 100 x n(n+1)/2
Time complexity = O(n^2)
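A runnable C sketch that checks the count against the closed form (the sample n is an illustrative assumption):

#include <stdio.h>

int main(void) {
    int n = 200;                      /* sample input size */
    long long steps = 0;

    for (int i = 1; i <= n; i++)
        for (int j = 1; j <= i; j++)          /* inner bound depends on i */
            for (int k = 1; k <= 100; k++)    /* constant 100 iterations */
                steps++;

    /* expected: 100 * n * (n + 1) / 2, which is Theta(n^2) */
    printf("steps = %lld, formula = %lld\n", steps, 100LL * n * (n + 1) / 2);
    return 0;
}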

Example 5
A()
{
    int i, j, k, n;
    for (i = 1; i <= n; i++)
        for (j = 1; j <= i^2; j++)
            for (k = 1; k <= n/2; k++)
                pf("ravi");
}

Unroll it:
i:            1        2        3        ...  n
j iterations: 1        4        9        ...  n^2
work:         1 x n/2  4 x n/2  9 x n/2  ...  n^2 x n/2

Time complexity
= n/2 (1 + 4 + 9 + 16 + ... + n^2)
= n/2 (n(n+1)(2n+1)/6)
= O(n^4)
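A runnable C check of this count (the small sample n is an assumption, kept small because the count grows as n^4):

#include <stdio.h>

int main(void) {
    int n = 50;                       /* sample input size */
    long long steps = 0;

    for (long long i = 1; i <= n; i++)
        for (long long j = 1; j <= i * i; j++)    /* inner bound depends on i^2 */
            for (long long k = 1; k <= n / 2; k++)
                steps++;

    /* expected: (n/2) * n(n+1)(2n+1)/6, i.e. Theta(n^4) */
    long long formula = (long long)(n / 2) * n * (n + 1) * (2 * n + 1) / 6;
    printf("steps = %lld, formula = %lld\n", steps, formula);
    return 0;
}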

Example 6
A()
{
    for (i = 1; i < n; i = i*2)
        pf("ravi");
}
Questions to ask: is the increment linear or non-linear? Are the loops dependent or independent? If the increment is non-linear, we assume the loop completes after k iterations and solve for k.

Unroll it:
i = 1   = 2^0
i = 2   = 2^1
i = 4   = 2^2
i = 8   = 2^3
i = 16  = 2^4
...
after k iterations, i = 2^k

We have to express the complexity in terms of n. The loop stops when i reaches n, i.e. 2^k = n, so k = log2(n) (base 2 because i doubles each time).
Time complexity = O(log2(n))
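A minimal runnable C check of the doubling loop (sample n is illustrative):

#include <math.h>
#include <stdio.h>

int main(void) {
    long long n = 1000000;            /* sample input size */
    long long k = 0;

    for (long long i = 1; i < n; i = i * 2)   /* i doubles each iteration */
        k++;

    /* k is about log2(n): the loop runs roughly ceil(log2(n)) times */
    printf("n = %lld, iterations = %lld, log2(n) = %.2f\n", n, k, log2((double)n));
    return 0;
}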

Example 6
A()
{
    int i, j, k;
    for (i = n/2; i <= n; i++)           % runs n/2 times
        for (j = 1; j <= n/2; j++)       % runs n/2 times
            for (k = 1; k <= n; k = k*2) % runs log2(n) times
                pf("ravi");
}
Calculate the time complexity. The loops are independent, so the counts multiply:
n/2 x n/2 x log2(n) = O(n^2 log2(n))
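A runnable C sketch of this count (sample n and the order-of-magnitude comparison are illustrative):

#include <math.h>
#include <stdio.h>

int main(void) {
    long long n = 512;                /* sample input size */
    long long steps = 0;

    for (long long i = n / 2; i <= n; i++)           /* ~n/2 iterations */
        for (long long j = 1; j <= n / 2; j++)       /* n/2 iterations */
            for (long long k = 1; k <= n; k = k * 2) /* ~log2(n) iterations */
                steps++;

    /* expected order: (n/2) * (n/2) * log2(n) = Theta(n^2 log n) */
    printf("steps = %lld, (n/2)*(n/2)*log2(n) = %.0f\n",
           steps, (n / 2.0) * (n / 2.0) * log2((double)n));
    return 0;
}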

Example 7
A()
{
    int i, j, k;
    for (i = n/2; i <= n; i++)           % runs n/2 times
        for (j = 1; j <= n; j = 2*j)     % runs log2(n) times
            for (k = 1; k <= n; k = k*2) % runs log2(n) times
                pf("ravi");
}
All the loops are independent, so
Time complexity = n/2 * log2(n) * log2(n) = O(n (log2 n)^2)

Example 8
A()
{
    for (i = 1; i <= n; i++)
        for (j = 1; j <= n; j = j+i)
            pf("ravi");
}

Unroll it:
i:                1     2     3     ...  k     ...  n
j range:          1..n  1..n  1..n  ...  1..n  ...  1..n
inner iterations: n     n/2   n/3   ...  n/k   ...  n/n

Hence, the time complexity will be
= n(1 + 1/2 + 1/3 + 1/4 + ... + 1/k + ... + 1/n)
= n log n
= O(n log n)
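A runnable C sketch of this harmonic-series count (the sample n and the comparison with n*ln(n) are illustrative):

#include <math.h>
#include <stdio.h>

int main(void) {
    long long n = 10000;              /* sample input size */
    long long steps = 0;

    for (long long i = 1; i <= n; i++)
        for (long long j = 1; j <= n; j = j + i)   /* about n/i iterations */
            steps++;

    /* expected order: n * (1 + 1/2 + ... + 1/n) = Theta(n log n) */
    printf("steps = %lld, n*ln(n) = %.0f\n", steps, n * log((double)n));
    return 0;
}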

Example 9
A()
{
    int i, j;
    for (i = 1; i <= n; i++)
    {
        j = 2;
        while (j <= n)
            j = j^2;      % j squares each step: 2, 4, 16, 256, ...
        pf("ravi");
    }
}

Unroll the inner loop: after k squarings, j = 2^(2^k).
k:         1       2          3              ...
n:         4       16         256            ...  2^(2^k)
j values:  2, 4    2, 4, 16   2, 4, 16, 256  ...
work:      n x 2   n x 3      n x 4          ...  n x (k+1)

So, in terms of n: the inner loop stops when 2^(2^k) reaches n, i.e. 2^k = log2(n), so k = loglog2(n). Substituting, the inner while loop runs about loglog2(n) times for each of the n outer iterations.
Time complexity = O(n loglog2(n))
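A runnable C check of the loglog behaviour (sample n and the counter are illustrative):

#include <math.h>
#include <stdio.h>

int main(void) {
    long long n = 1000000;            /* sample input size */
    long long inner = 0;

    for (long long i = 1; i <= n; i++) {
        long long j = 2;
        while (j <= n) {              /* j squares: 2, 4, 16, 256, ... */
            j = j * j;
            inner++;
        }
    }

    /* inner / n is about log2(log2(n)), so the total is Theta(n loglog n) */
    printf("total inner steps = %lld, n*log2(log2(n)) = %.0f\n",
           inner, n * log2(log2((double)n)));
    return 0;
}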

Example 10
A()
{
    int i, j, count = 0;
    for (i = 0; i < n; i++)       % i = 0, 1, 2, ..., n-1
        for (j = 0; j < i; j++)   % inner loop runs i times
            count++;
}
Hence, the time complexity is the sum of the first n natural numbers:
= 0 + 1 + 2 + 3 + 4 + ... + (n-1)
= n(n-1)/2
= O(n^2)

Example 11
A()
{
    int count = 0, i, j;
    for (i = N; i > 0; i /= 2)     % i = N, N/2, N/4, ..., 1
        for (j = 0; j < i; j++)    % inner loop runs i times
            count++;
}

Unroll it:
i:                N  N/2  N/4  ...  1
inner iterations: N  N/2  N/4  ...  1

Hence the total work is a geometric series, not a harmonic one:
= N + N/2 + N/4 + ... + 1
= N(1 + 1/2 + 1/4 + ...)
<= 2N
= O(N)
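A runnable C check of this geometric-series count (the sample N is an assumption):

#include <stdio.h>

int main(void) {
    long long N = 1000000;            /* sample input size */
    long long count = 0;

    for (long long i = N; i > 0; i /= 2)      /* i = N, N/2, N/4, ..., 1 */
        for (long long j = 0; j < i; j++)
            count++;

    /* count = N + N/2 + N/4 + ... + 1 < 2N, i.e. O(N), not O(N log N) */
    printf("N = %lld, count = %lld, 2N = %lld\n", N, count, 2 * N);
    return 0;
}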

Example 12
// Here the loop bound (100) is a constant
A()
{
    for (int i = 1; i <= 100; i++)
        pf("deepa");
}
As the bound is a constant independent of n, the time complexity is constant:
= O(1)

Example 13
A()
{
    int count = 0;
    for (int i = n; i > 0; i -= c)   % c is a positive constant step
        count++;
}
The loop runs about n/c times; since c is a constant, the time complexity is O(n).

Example 14
1) int x = 0;
   for (int i = 1; i < n; i++)
       for (int j = 1; j < i; j++)
           x += i + j;

2) int x = 0;
   for (int j = i; j < 100; j++)
       x += i + j;

3) int x = 0;
   for (int j = n; j > i; j /= 3)
       x += i + j;

Example 15
4) int x = 0;
   for (int i = 1; i < n * n; i++)
       for (int j = 1; j < i; j++)
           x += i + j;
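For reference (worked out here; the slides do not state the answers), the counts for snippets 1-4 above:
1) the inner loop runs 0 + 1 + ... + (n-2) times, so Θ(n^2)
2) at most 100 iterations regardless of n, so O(1)
3) j is divided by 3 each step, so O(log n)
4) same shape as snippet 1 but with n^2 in place of n, so Θ(n^4)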

Recursive algorithm
A recursive algorithm is one in which a function calls itself, and the repeated calls play the role of a loop. Take an example:
A(n)
{
    if (n > 1)
        return A(n/2) + A(n/2);
}
The recurrence for the above algorithm is
T(n) = 2T(n/2) + c
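A minimal runnable C sketch of this recurrence (the base case, the sample n, and the global call counter are illustrative assumptions):

#include <stdio.h>

static long long calls = 0;

/* Same shape as the slide: two recursive calls on n/2 plus constant work,
   so T(n) = 2T(n/2) + c, which solves to Theta(n). */
long long A(long long n) {
    calls++;                          /* the constant work per call */
    if (n <= 1)
        return n;                     /* assumed base case */
    return A(n / 2) + A(n / 2);
}

int main(void) {
    long long n = 1 << 20;            /* sample input size */
    A(n);
    /* the number of calls is about 2n, matching the Theta(n) solution */
    printf("n = %lld, calls = %lld\n", n, calls);
    return 0;
}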

Complexity analysis for a recursive algorithm
Example:
A(n)
{
    if (n > 1)
        return A(n-1);
}
Recurrence:
T(n) = c + T(n-1),  n > 1
T(n) = 1,           n = 1
To solve it, we use the back-substitution method.

Back substitution
T(n)   = 1 + T(n-1)        ....(1)
T(n-1) = 1 + T(n-2)        ....(2)
T(n-2) = 1 + T(n-3)        ....(3)

Substitute eq (2) in eq (1):
T(n) = 1 + 1 + T(n-2)
T(n) = 2 + T(n-2)          ....(4)

Substitute eq (3) in eq (4):
T(n) = 2 + 1 + T(n-3)
T(n) = 3 + T(n-3)
...
After k substitutions: T(n) = k + T(n-k)

To express this in terms of n, stop at the base case T(1):
n - k = 1  =>  k = n - 1
T(n) = n - 1 + T(1)
T(n) = n - 1 + 1
T(n) = O(n)

Example 2
T(n) = n + T(n-1),  n > 1      ....(1)
T(n) = 1,           n = 1
T(n-1) = (n-1) + T(n-2)        ....(2)
T(n-2) = (n-2) + T(n-3)        ....(3)

Substitute eq (2) in eq (1):
T(n) = n + (n-1) + T(n-2)      ....(4)

Substitute eq (3) in eq (4):
T(n) = n + (n-1) + (n-2) + T(n-3)
...
After k substitutions: T(n) = n + (n-1) + ... + (n-k+1) + T(n-k)

Stop at the base case T(1): n - k = 1  =>  k = n - 1
T(n) = n + (n-1) + (n-2) + ... + 2 + 1
     = n(n+1)/2
     = O(n^2)