
17CS1102 (Data Structures) Topic: Typical Growth Rates, Recurrence Relations (Indicator 2) At the end of this session, the learner will be able to: 1. Understand typical growth rates 2. Compare the growth rates of functions and see how to reduce the running time of a program 3. Define a recurrence relation and its closed form 4. Derive the recurrence relation for an algorithm, derive its closed-form solution, and recognize the results of careless use of recursion.

Basic asymptotic efficiency classes / Typical Growth Rates:
1        constant
log n    logarithmic
n        linear
n log n  n-log-n
n^2      quadratic
n^3      cubic
2^n      exponential
n!       factorial
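To make these classes concrete, here is a small sketch (the class and method names are my own, not from the slides) that evaluates each growth function at a few input sizes; the factorial class is omitted because it overflows almost immediately:

```java
import java.util.Arrays;

// Hypothetical helper: evaluates the growth classes above at n,
// in the order listed (constant, log n, n, n log n, n^2, n^3, 2^n).
public class GrowthRates {
    static double log2(double x) { return Math.log(x) / Math.log(2); }

    static double[] classesFor(int n) {
        return new double[] {
            1,              // constant
            log2(n),        // logarithmic
            n,              // linear
            n * log2(n),    // n-log-n
            Math.pow(n, 2), // quadratic
            Math.pow(n, 3), // cubic
            Math.pow(2, n)  // exponential
        };
    }

    public static void main(String[] args) {
        for (int n : new int[] {8, 16, 32}) {
            System.out.println("n=" + n + ": " + Arrays.toString(classesFor(n)));
        }
    }
}
```

Printing the rows side by side makes it obvious why, past small n, the exponential column dwarfs every polynomial column.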

Order of growth
Most important: order of growth within a constant multiple as n → ∞.
Example: How much faster will an algorithm run on a computer that is twice as fast? How much longer does it take to solve a problem of double the input size?
Example: for cn^2, how much faster on a twice-as-fast computer? (2x) How much longer for input size 2n? (4x, since c(2n)^2 = 4cn^2)

Values of some important functions as n → ∞

Asymptotic order of growth
A way of comparing functions that ignores constant factors and small input sizes:
O(g(n)): class of functions f(n) that grow no faster than g(n)
Θ(g(n)): class of functions f(n) that grow at the same rate as g(n)
Ω(g(n)): class of functions f(n) that grow at least as fast as g(n)

Big-oh

Big-omega

Big-theta

Establishing order of growth using the definition
Definition: f(n) is in O(g(n)) if the order of growth of f(n) ≤ the order of growth of g(n) (within a constant multiple), i.e., there exist a positive constant c and a non-negative integer n0 such that f(n) ≤ c·g(n) for every n ≥ n0.
Examples:
10n is O(n^2), since 10n ≤ 10n^2 for n ≥ 1 (c = 10, n0 = 1), or 10n ≤ n^2 for n ≥ 10 (c = 1, n0 = 10)
5n + 20 is O(n), since 5n + 20 ≤ 10n for n ≥ 4 (c = 10, n0 = 4)
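The definition can be checked mechanically over a finite range. Here is a minimal sketch (class, interface, and method names are my own) that tests f(n) ≤ c·g(n) for all n from n0 up to a limit; this gives evidence, not a proof for all n:

```java
// Hypothetical checker for the big-O definition over a finite range.
public class BigOCheck {
    interface Fn { long apply(long n); }

    // Returns true if f(n) <= c*g(n) for every n in [n0, limit].
    static boolean holds(Fn f, Fn g, long c, long n0, long limit) {
        for (long n = n0; n <= limit; n++) {
            if (f.apply(n) > c * g.apply(n)) return false;
        }
        return true;
    }

    public static void main(String[] args) {
        // 10n <= 10*n^2 for n >= 1
        System.out.println(holds(n -> 10 * n, n -> n * n, 10, 1, 1000));
        // 5n + 20 <= 10n for n >= 4
        System.out.println(holds(n -> 5 * n + 20, n -> n, 10, 4, 1000));
    }
}
```

Note that starting below n0 makes the second check fail, which is exactly why the definition only requires the inequality for n ≥ n0.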

Some properties of asymptotic order of growth
f(n) ∈ O(f(n))
f(n) ∈ O(g(n)) iff g(n) ∈ Ω(f(n))
If f(n) ∈ O(g(n)) and g(n) ∈ O(h(n)), then f(n) ∈ O(h(n)). Note the similarity with a ≤ b.
If f1(n) ∈ O(g1(n)) and f2(n) ∈ O(g2(n)), then f1(n) + f2(n) ∈ O(max{g1(n), g2(n)})

Establishing order of growth using limits
lim (n→∞) T(n)/g(n) =
  0: order of growth of T(n) < order of growth of g(n)
  c > 0: order of growth of T(n) = order of growth of g(n)
  ∞: order of growth of T(n) > order of growth of g(n)
Examples: 10n vs. n^2; n(n+1)/2 vs. n^2

L’Hôpital’s rule and Stirling’s formula
L’Hôpital’s rule: If lim (n→∞) f(n) = lim (n→∞) g(n) = ∞ and the derivatives f′, g′ exist, then
lim (n→∞) f(n)/g(n) = lim (n→∞) f′(n)/g′(n)
Stirling’s formula: n! ≈ (2πn)^(1/2) (n/e)^n
Example: log n vs. n
Example: 2^n vs. n!

Orders of growth of some important functions
All logarithmic functions log_a n belong to the same class Θ(log n), no matter what the logarithm’s base a > 1 is.
All polynomials of the same degree k belong to the same class: a_k n^k + a_(k-1) n^(k-1) + … + a_0 ∈ Θ(n^k).
Exponential functions a^n have different orders of growth for different a’s.
order log n < order n^ε (ε > 0) < order a^n < order n! < order n^n

Analysis of Recursive Algorithms What is a recurrence relation? Forming Recurrence Relations Solving Recurrence Relations Analysis Of Recursive Factorial Analysis Of Recursive Binary Search Analysis Of Recursive Fibonacci

What is a recurrence relation?
A recurrence relation, T(n), is a recursive function of an integer variable n. Like all recursive functions, it has one or more recursive cases and one or more base cases.
Example: T(0) = a; T(n) = b + c·T(n - 1) for n > 0.
The portion of the definition that does not contain T is called the base case of the recurrence relation; the portion that contains T is called the recurrent or recursive case.
Recurrence relations are useful for expressing the running times (i.e., the number of basic operations executed) of recursive algorithms.
The specific values of the constants such as a, b, and c (in the above recurrence) are important in determining the exact solution to the recurrence. Often, however, we are only concerned with finding an asymptotic upper bound on the solution. We call such a bound an asymptotic solution to the recurrence.

Forming Recurrence Relations
For a given recursive method, the base case and the recursive case of its recurrence relation correspond directly to the base case and the recursive case of the method.
Example 1: Write the recurrence relation for the following method:

public void f (int n) {
    if (n > 0) {
        System.out.println(n);
        f(n - 1);
    }
}

The base case is reached when n == 0. The method performs one comparison. Thus, the number of operations when n == 0, T(0), is some constant a.
When n > 0, the method performs two basic operations and then calls itself, using ONE recursive call, with a parameter n - 1. Therefore the recurrence relation is:
T(0) = a for some constant a
T(n) = b + T(n - 1) for some constant b
In general, T(n) is usually a sum of various choices of T(m), the cost of the recursive subproblems, plus the cost of the work done outside the recursive calls:
T(n) = aT(f(n)) + bT(g(n)) + … + c(n)
where a and b are the numbers of subproblems, f(n) and g(n) are the subproblem sizes, and c(n) is the cost of the work done outside the recursive calls. [Note: c(n) may be a constant.]
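One way to sanity-check a recurrence like this is to instrument the method with an operation counter (a sketch; the counter and the choice of which operations to count are my own) and confirm that the count grows linearly, matching T(n) = 2n + 1 when a = 1 and b = 2:

```java
// Instrumented version of f: counts the comparison and the println
// as the basic operations, mirroring T(0) = 1 and T(n) = 2 + T(n - 1).
public class CountF {
    static long ops = 0;

    static void f(int n) {
        ops++;          // the comparison n > 0
        if (n > 0) {
            ops++;      // the println (counted instead of performed)
            f(n - 1);
        }
    }

    static long opsFor(int n) {
        ops = 0;
        f(n);
        return ops;     // expected: 2n + 1
    }

    public static void main(String[] args) {
        System.out.println(opsFor(10));   // 2*10 + 1 = 21
    }
}
```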

Forming Recurrence Relations (Cont’d)
Example 2: Write the recurrence relation for the following method:

public int g (int n) {
    if (n == 1)
        return 2;
    else
        return 3 * g(n / 2) + g(n / 2) + 5;
}

The base case is reached when n == 1. The method performs one comparison and one return statement. Therefore, T(1) is some constant c.
When n > 1, the method performs TWO recursive calls, each with the parameter n / 2, and some constant number of basic operations. Hence, the recurrence relation is:
T(1) = c for some constant c
T(n) = b + 2T(n / 2) for some constant b
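The doubling in 2T(n / 2) can be observed directly by counting calls (a sketch with my own names): for n = 2^k the call count satisfies C(1) = 1 and C(n) = 1 + 2C(n/2), which works out to C(n) = 2n - 1:

```java
// Instrumented version of g: counts calls to verify C(n) = 2n - 1 for n = 2^k.
public class CountG {
    static long calls = 0;

    static int g(int n) {
        calls++;
        if (n == 1)
            return 2;
        else
            return 3 * g(n / 2) + g(n / 2) + 5;
    }

    static long callsFor(int n) {
        calls = 0;
        g(n);
        return calls;
    }

    public static void main(String[] args) {
        System.out.println(callsFor(8));   // 2*8 - 1 = 15
    }
}
```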

Forming Recurrence Relations (Cont’d)
Example 3: Write the recurrence relation for the following method:

long fibonacci (int n) {
    // Recursively calculates the Fibonacci number
    if (n == 1 || n == 2)
        return 1;
    else
        return fibonacci(n - 1) + fibonacci(n - 2);
}

The base case is reached when n == 1 or n == 2. The method performs one comparison and one return statement. Therefore each of T(1) and T(2) is some constant c.
When n > 2, the method performs TWO recursive calls, one with the parameter n - 1, another with the parameter n - 2, and some constant number of basic operations. Hence, the recurrence relation is:
T(n) = c if n = 1 or n = 2
T(n) = T(n - 1) + T(n - 2) + b if n > 2

Forming Recurrence Relations (Cont’d)
Example 4: Write the recurrence relation for the following method:

long power (long x, long n) {
    if (n == 0)
        return 1;
    else if (n == 1)
        return x;
    else if ((n % 2) == 0)
        return power(x, n/2) * power(x, n/2);
    else
        return x * power(x, n/2) * power(x, n/2);
}

The base case is reached when n == 0 or n == 1. The method performs one or two comparisons and one return statement, so T(0) and T(1) are some constant c.
For n > 1, the method makes TWO recursive calls, each on a problem of half the size, plus a constant number of multiplications and comparisons. When the power is an odd number, an additional multiplication is involved; for a worst-case bound we assume the extra multiplication is always needed. Hence, the recurrence relation is:
T(n) = c if n = 0 or n = 1
T(n) = 2T(n / 2) + b if n > 1
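The method as written calls power(x, n/2) twice, which is why the recurrence doubles and the solution is linear. A common improvement (a sketch, not part of the slides) stores the half result once, giving T(n) = T(n/2) + b and hence only O(log n) calls:

```java
// Hypothetical variant of power: one recursive call per level, O(log n) total.
public class FastPower {
    static long calls = 0;

    static long power(long x, long n) {
        calls++;
        if (n == 0) return 1;
        if (n == 1) return x;
        long half = power(x, n / 2);   // computed once, reused twice
        if (n % 2 == 0)
            return half * half;
        else
            return x * half * half;
    }

    public static void main(String[] args) {
        System.out.println(power(2, 10));   // 1024
    }
}
```

For n = 16 this makes only 5 calls (n = 16, 8, 4, 2, 1), versus 2n - 1 for the two-call version.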

Solving Recurrence Relations
To solve a recurrence relation T(n), we need to derive a form of T(n) that is not a recurrence relation. Such a form is called a closed form of the recurrence relation.
There are five methods to solve recurrence relations that represent the running time of recursive methods:
1. Iteration method (unrolling and summing)
2. Substitution method (guess the solution and verify by induction)
3. Recursion tree method
4. Master theorem (master method)
5. Generating functions or characteristic equations
In this course, we will use the iteration method and a simplified master theorem.

Solving Recurrence Relations - Iteration method
Steps:
1. Expand the recurrence.
2. Express the expansion as a summation by plugging the recurrence back into itself until you see a pattern.
3. Evaluate the summation.
In evaluating the summation, one or more of the following summation formulae may be used:
Arithmetic series: 1 + 2 + … + n = n(n + 1)/2
Geometric series: 1 + x + x^2 + … + x^n = (x^(n+1) - 1)/(x - 1) for x ≠ 1
Special case of the geometric series: 1 + 2 + 4 + … + 2^n = 2^(n+1) - 1

Solving Recurrence Relations - Iteration method (Cont’d)
Harmonic series: 1 + 1/2 + 1/3 + … + 1/n ≈ ln n + γ, where γ ≈ 0.577 is Euler’s constant
Others:

Analysis of the Recursive Factorial method
Example 1: Form and solve the recurrence relation for the running time of the factorial method, and hence determine its big-O complexity:

long factorial (int n) {
    if (n == 0)
        return 1;
    else
        return n * factorial(n - 1);
}

T(0) = c (1)
T(n) = b + T(n - 1) (2)
     = b + b + T(n - 2) by substituting T(n - 1) in (2)
     = b + b + b + T(n - 3) by substituting T(n - 2) in (2)
     …
     = kb + T(n - k)
The base case is reached when n - k = 0, i.e., k = n. We then have:
T(n) = nb + T(n - n) = bn + T(0) = bn + c
Therefore the method factorial is O(n).

Analysis of Recursive Binary Search

public int binarySearch (int target, int[] array, int low, int high) {
    if (low > high)
        return -1;
    else {
        int middle = (low + high) / 2;
        if (array[middle] == target)
            return middle;
        else if (array[middle] < target)
            return binarySearch(target, array, middle + 1, high);
        else
            return binarySearch(target, array, low, middle - 1);
    }
}

The recurrence relation for the running time of the method is:
T(1) = a if n = 1 (one-element array)
T(n) = T(n / 2) + b if n > 1

Analysis of Recursive Binary Search (Cont’d)
Without loss of generality, assume n, the problem size, is a power of 2, i.e., n = 2^k.
Expanding:
T(1) = a (1)
T(n) = T(n / 2) + b (2)
     = [T(n / 2^2) + b] + b = T(n / 2^2) + 2b by substituting T(n / 2) in (2)
     = [T(n / 2^3) + b] + 2b = T(n / 2^3) + 3b by substituting T(n / 2^2) in (2)
     = …
     = T(n / 2^k) + kb
The base case is reached when n / 2^k = 1, i.e., n = 2^k, i.e., k = log2 n. We then have:
T(n) = T(1) + b log2 n = a + b log2 n
Therefore, recursive binary search is O(log n).
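The logarithmic bound can be observed directly by counting calls in a worst-case (unsuccessful) search. This is a sketch; the counter and the helper that builds a sorted array are my own additions. For n = 2^k the worst case makes log2 n + 2 calls, matching T(n) = T(n/2) + b:

```java
// Instrumented recursive binary search: counts calls in the worst case.
public class CountBS {
    static long calls = 0;

    static int binarySearch(int target, int[] array, int low, int high) {
        calls++;
        if (low > high)
            return -1;
        int middle = (low + high) / 2;
        if (array[middle] == target)
            return middle;
        else if (array[middle] < target)
            return binarySearch(target, array, middle + 1, high);
        else
            return binarySearch(target, array, low, middle - 1);
    }

    static long worstCaseCalls(int n) {
        int[] a = new int[n];
        for (int i = 0; i < n; i++) a[i] = i;   // sorted array 0..n-1
        calls = 0;
        binarySearch(n + 1, a, 0, n - 1);       // target larger than every element
        return calls;
    }

    public static void main(String[] args) {
        System.out.println(worstCaseCalls(16));   // log2(16) + 2 = 6
    }
}
```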

Analysis of Recursive Fibonacci

long fibonacci (int n) {
    // Recursively calculates the Fibonacci number
    if (n == 1 || n == 2)
        return 1;
    else
        return fibonacci(n - 1) + fibonacci(n - 2);
}

T(n) = c if n = 1 or n = 2 (1)
T(n) = T(n - 1) + T(n - 2) + b if n > 2 (2)
We determine a lower bound on T(n).
Expanding:
T(n) = T(n - 1) + T(n - 2) + b
     ≥ T(n - 2) + T(n - 2) + b = 2T(n - 2) + b
     = 2[T(n - 3) + T(n - 4) + b] + b by substituting T(n - 2) in (2)
     ≥ 2[T(n - 4) + T(n - 4) + b] + b = 2^2 T(n - 4) + 2b + b
     = 2^2 [T(n - 5) + T(n - 6) + b] + 2b + b by substituting T(n - 4) in (2)
     ≥ 2^3 T(n - 6) + (2^2 + 2^1 + 2^0)b
     …
     ≥ 2^k T(n - 2k) + (2^(k-1) + 2^(k-2) + … + 2^1 + 2^0)b = 2^k T(n - 2k) + (2^k - 1)b
The base case is reached when n - 2k = 2, i.e., k = (n - 2)/2.
Hence T(n) ≥ 2^((n-2)/2) T(2) + [2^((n-2)/2) - 1]b = (b + c) 2^((n-2)/2) - b = [(b + c)/2] · 2^(n/2) - b
⇒ Recursive Fibonacci is exponential.
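The exponential blow-up is easy to see by counting calls (a sketch; the counter is my own addition): the call count satisfies calls(n) = 1 + calls(n - 1) + calls(n - 2) with calls(1) = calls(2) = 1, which works out to 2·fib(n) - 1, so it grows as fast as the Fibonacci numbers themselves:

```java
// Instrumented recursive Fibonacci: counts the total number of calls.
public class CountFib {
    static long calls = 0;

    static long fibonacci(int n) {
        calls++;
        if (n == 1 || n == 2)
            return 1;
        else
            return fibonacci(n - 1) + fibonacci(n - 2);
    }

    static long callsFor(int n) {
        calls = 0;
        fibonacci(n);
        return calls;        // equals 2*fib(n) - 1
    }

    public static void main(String[] args) {
        System.out.println(callsFor(10));   // fib(10) = 55, so 2*55 - 1 = 109
    }
}
```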

Mathematical Analysis of Recursive Algorithms
Steps in the mathematical analysis of recursive algorithms:
1. Decide on a parameter n indicating input size.
2. Identify the algorithm’s basic operation.
3. Determine the worst, average, and best cases for input of size n.
4. Set up a recurrence relation and initial condition(s).
5. Solve the recurrence to obtain a closed form, or estimate the order of magnitude of the solution (see Appendix B).

Important Recurrence Types
One (constant) operation reduces problem size by one:
T(n) = T(n - 1) + c, T(1) = d. Solution: T(n) = (n - 1)c + d → linear
A pass through the input reduces problem size by one:
T(n) = T(n - 1) + cn, T(1) = d. Solution: T(n) = [n(n + 1)/2 - 1]c + d → quadratic
One (constant) operation reduces problem size by half:
T(n) = T(n/2) + c, T(1) = d. Solution: T(n) = c log n + d → logarithmic
A pass through the input reduces problem size by half:
T(n) = 2T(n/2) + cn, T(1) = d. Solution: T(n) = cn log n + dn → n log n

Example 1: Factorial
n! = n·(n - 1)!, 0! = 1
Recurrence relation: T(n) = T(n - 1) + 1, T(1) = 1
Telescoping:
T(n) = T(n - 1) + 1
T(n - 1) = T(n - 2) + 1
T(n - 2) = T(n - 3) + 1
…
T(2) = T(1) + 1
Add the equations and cancel equal terms on opposite sides:
T(n) = T(1) + (n - 1) = n

Example 2: Binary Search
Recurrence relation: T(n) = T(n/2) + 1, T(1) = 1
Telescoping:
T(n) = T(n/2) + 1
T(n/2) = T(n/4) + 1
…
T(2) = T(1) + 1
Add the equations and cancel equal terms on opposite sides:
T(n) = T(1) + log(n) = O(log(n))