Performance analysis of algorithms


What is Programming? Programming is representing data and solving a problem using that data. Its two core ingredients: Data Structures + Algorithms.

What is Data Structure? Definition: a way of collecting and organizing data in a computer; an aggregation of atomic and composite data into a set with defined relationships.

Data Structures
  Primitive: Integer, Float, Character
  Non-Primitive
    Linear: Arrays, Stacks, Queues, Lists
    Non-Linear: Trees, Graphs
    Files

What is Algorithm? Definition: a finite set of instructions that satisfies the following:
1) Input: zero or more inputs
2) Output: at least one output
3) Definiteness: clear and unambiguous instructions
4) Finiteness: terminates after a finite number of steps
5) Effectiveness: each instruction is basic enough to be machine-executable
In computational theory, an algorithm and a program are different: a program need not satisfy property 4 (finiteness), e.g., an operating system runs indefinitely.

Algorithm Specification: how to express algorithms.
High-level description: natural language; graphic representation, e.g., flowcharts; pseudocode (an informal, language-independent description).
Implementation description: C, C++, Java, etc.

Natural Language vs. Graphic Chart. Example: Selection Sort. From the integers that are currently unsorted, find the smallest value and place it next in the sorted list. [Figure: the five-element array list[0]..list[4] sorted step by step over Steps 1-6.]

Pseudocode (C-like Language). Example: Selection Sort

for (i = 0; i < n; i++) {
    Examine list[i] to list[n-1].
    Suppose the smallest integer is at list[min].
    Interchange list[i] and list[min].
}

Implementation in C. Example: Selection Sort

void sort(int list[], int n)
{
    int i, j, min;
    for (i = 0; i < n - 1; i++) {
        min = i;
        for (j = i + 1; j < n; j++)
            if (list[j] < list[min])
                min = j;
        SWAP(list[i], list[min]);
    }
}

Algorithm Analysis. How do we evaluate algorithms? Performance analysis: does the algorithm use storage efficiently? Is its running time acceptable for the task? Performance analysis estimates machine-independent time and space.

int search(int arr[], int len, int target) {
    for (int i = 0; i < len; i++) {
        if (arr[i] == target) return i;
    }
    return -1;
}

Space complexity: the amount of memory needed. Time complexity: the amount of time taken for the algorithm to finish.

Space Complexity. Definition: the (machine-independent) space required by an algorithm.

Examples:

int abc(int a, int b, int c) {
    return a + b + b*c + 4.0;
}

char* give_me_memory(int n) {
    char *p = malloc(n);
    return p;
}

abc uses a constant amount of space regardless of its arguments; give_me_memory allocates n bytes, so its space requirement grows with the input.

Space Complexity. Which is better for space complexity?

float sum(float *list, int n) {
    float tempsum = 0;
    for (int i = 0; i < n; i++)
        tempsum += *(list + i);
    return tempsum;
}

float rsum(float *list, int n) {
    if (n > 0)
        return rsum(list, n - 1) + list[n - 1];
    else
        return 0;
}

Time Complexity. Definition: the (machine-independent) time required by an algorithm. Time itself is not easy to estimate, so as an alternative we count the number of program steps instead of time.

10 additions, 10 subtractions, 10 multiplications ⇒ 10·Ca + 10·Cs + 10·Cm
(Ca: time for one addition, Cs: time for one subtraction, Cm: time for one multiplication)
10 additions, 10 subtractions, 10 multiplications ⇒ 30 steps

Time Complexity. A program step is a syntactically or semantically meaningful program segment whose execution time is independent of the number of inputs.
Any one basic operation ⇒ one step: +, -, *, /, assignment, jump, comparison, etc.
Any combination of basic operations ⇒ one step: +=, *=, /=, (a + c*d), etc.

Time Complexity. Examples:

int abc(int a, int b, int c) {
    return a + b + b*c + 4.0;
}

void add(int a[][MAX_SIZE], ...) {
    int i, j;
    for (i = 0; i < rows; i++)
        for (j = 0; j < cols; j++)
            c[i][j] = a[i][j] + b[i][j];
}

Time Complexity. Which is better for time complexity?

float sum(float *list, int n) {
    float tempsum = 0;
    for (int i = 0; i < n; i++)
        tempsum += *(list + i);
    return tempsum;
}

float rsum(float *list, int n) {
    if (n > 0)
        return rsum(list, n - 1) + list[n - 1];
    else
        return 0;
}

Asymptotic Notation. Do we need to calculate exact numbers? The important factor is how fast the cost grows as n increases. The highest-order term is enough to represent the complexity; constants are not important.

Suppose we have three algorithms, A, B, and C, for the same problem:
Time complexity of A: n² + n + 1
Time complexity of B: n²
Time complexity of C: 200·n·log₁₀(n)

n      10       100      1,000      10,000
A      111      10,101   1,001,001  ???
B      100      10,000   1,000,000  100,000,000
C      2,000    40,000   600,000    16,000,000

Asymptotic Notation. Which is better?
(10n + 10) vs. (0.01n² + 10)
(2000n + 3) vs. (n·log n + 1000)
(n³) vs. (10n² + 1,000,000n)

Simple rule: 1. Ignore any constants. 2. Compare only the term of the highest order.

Asymptotic Notation. Three notations for complexity:
Big-O notation, O(f(n)): the complexity grows no faster than f(n); f(n) is an upper bound of the complexity.
Big-Ω notation, Ω(f(n)): the complexity grows no slower than f(n); f(n) is a lower bound of the complexity.
Big-Θ notation, Θ(f(n)): the complexity grows at the same rate as f(n).

Big-O Notation. Definition: f(n) = O(g(n)) if there exist positive constants c and n₀ such that f(n) ≤ c·g(n) for all n > n₀. Intuitively, f(n) grows no faster than g(n): for all sufficiently large n, c·g(n) dominates f(n).

Examples:
3n + log n + 2 = O(n), because 3n + log n + 2 ≤ 5n for n ≥ 2
10n² + 4n + 2 = O(n⁴), because 10n² + 4n + 2 ≤ 10n⁴ for n ≥ 2

Example: Asymptotic Notation. Three asymptotic notations for space complexity:

float sum(float *list, int n) {
    float tempsum = 0;
    for (int i = 0; i < n; i++)
        tempsum += *(list + i);
    return tempsum;
}
// constant space, about 16 bytes: Θ(1), O(1), Ω(1)

float rsum(float *list, int n) {
    if (n > 0)
        return rsum(list, n - 1) + list[n - 1];
    else
        return 0;
}
// one stack frame per call, about 8n bytes: Θ(n), O(n), Ω(n)

Example: Asymptotic Notation. Three asymptotic notations for time complexity:

float sum(float *list, int n) {
    float tempsum = 0;
    for (int i = 0; i < n; i++)
        tempsum += *(list + i);
    return tempsum;
}
// 2n + 3 steps: Θ(n), O(n), Ω(n)

void add(int a[][MAX_SIZE], ...) {
    int i, j;
    for (i = 0; i < r; i++)
        for (j = 0; j < c; j++)
            c[i][j] = a[i][j] + b[i][j];
}
// 2·r·c + 2·r + 1 steps: Θ(r·c), O(r·c), Ω(r·c)

Discussion: Asymptotic Notation. Big-O notation is the most widely used. Big-Θ notation is the most informative, but an exact Θ bound is often hard to establish. Big-Ω notation is the least informative: a lower bound alone says little about actual cost. Big-O is good for a rough description. For example, suppose the exact complexity of algorithm A is very hard to evaluate, but we know n² ≤ complexity ≤ n³; then we can still say the complexity is O(n³).

Comparison: Asymptotic Notation. Which is more costly?
O(1): constant
O(log n): logarithmic
O(n): linear
O(n log n): log-linear
O(n²): quadratic
O(n³): cubic
O(2ⁿ): exponential
O(n!): factorial

Guideline for Asymptotic Analysis. Loops: the running time is the number of iterations times the running time of the statements inside the loop.

// executes n times
for (int i = 0; i < n; i++)
    m = m + 1;  // constant time, c
// Total time = c * n = O(n)

// outer loop executes n times
for (int i = 0; i < n; i++)
    // inner loop executes n times
    for (int j = 0; j < n; j++)
        m = m + 1;  // constant time, c
// Total time = c * n * n = O(n^2)

Guideline for Asymptotic Analysis. Consecutive statements: add the time complexities of each statement.

// executes n times
for (int i = 0; i < n; i++)
    m = m + 1;  // constant time, c1

// outer loop executes n times
for (int i = 0; i < n; i++)
    // inner loop executes n times
    for (int j = 0; j < n; j++)
        k = k + 1;  // constant time, c2

// Total time = c1 * n + c2 * n^2 = O(n^2)

Guideline for Asymptotic Analysis. If-then-else statements: take the worst-case running time among the if, else-if, and else parts (whichever is larger).

if (len > 0)
    // executes n times
    for (int i = 0; i < n; i++)
        m = m + 1;  // constant time, c1
else {
    if (i > 0)
        k = k + 2;  // constant time, c2
    else
        // outer loop executes n times
        for (int i = 0; i < n; i++)
            // inner loop executes n times
            for (int j = 0; j < n; j++)
                k = k + 1;  // constant time, c3
}
// Worst case: total time = n * n * c3 = O(n^2)

Guideline for Asymptotic Analysis. Logarithmic complexity: an algorithm is O(log n) if it takes constant time to cut the problem size by a constant fraction (usually ½).

// At the k-th step, 2^k = n and the loop exits.
for (int i = 1; i < n; i *= 2)
    m = m + 1;  // constant time, c
// Because k = log2(n), total time = O(log n)

// The same holds for a decreasing sequence.
for (int i = n; i > 0; i /= 2)
    m = m + 1;  // constant time, c
// Total time = O(log n)

Recursion

What is Recursion? Definition: a repetitive process in which a function calls itself.

#include <stdio.h>

void Recursive(int n) {
    // Base case: termination condition!
    if (n == 0) return;
    printf("Recursive call: %d\n", n);
    Recursive(n - 1);
}

Example: Summing from 1 to n. Iterative vs. recursive programming.

S(n) = Σᵢ₌₀ⁿ i

int sum(int n) {
    int sum = 0;
    for (int i = 0; i <= n; i++)
        sum = sum + i;
    return sum;
}

S(n) = 0 if n = 0; S(n) = S(n−1) + n otherwise

int rsum(int n) {
    if (n == 0) return 0;
    else return rsum(n - 1) + n;
}

Designing Recursive Programming. Two parts:
Base case: solve the smallest problem directly.
Recursive case: reduce the problem to smaller ones via a recurrence relation.

Σᵢ₌₀ⁿ i = n + (n−1) + … + 1 + 0
Σᵢ₌₀ⁿ i = n + Σᵢ₌₀ⁿ⁻¹ i

S(n) = 0 if n = 0; S(n) = S(n−1) + n otherwise

int S(int n) {
    if (n == 0) return 0;
    else return S(n - 1) + n;
}

Function Call/Return. Is this correct code? No: the version without a base case never terminates normally; it eventually halts with a stack overflow when it runs out of (stack) memory.

// Incorrect: no base case
void Recursive(int n) {
    printf("Recursive call: %d\n", n);
    Recursive(n - 1);
}

// Correct: the base case terminates the recursion
void Recursive(int n) {
    // Base case: termination condition!
    if (n == 0) return;
    printf("Recursive call: %d\n", n);
    Recursive(n - 1);
}

Example: Summing from 1 to n. How does recursive programming work? How many calls/returns happen?

int rsum(int n) {
    if (n == 0) return 0;
    else return rsum(n - 1) + n;
}

call S(3)
  call S(2)        // to compute S(2) + 3
    call S(1)      // to compute S(1) + 2
      call S(0)    // to compute S(0) + 1
      return 0
    return 1
  return 3
return 6

Function Call/Return. How does stack memory change on a call? When a function is called, a frame is pushed onto the system stack: the return address is kept there, and all local variables are newly allocated there. After call S(3) → call S(2) → call S(1) → call S(0), the stack holds frames for S(3), S(2), S(1), and S(0), each waiting to finish its pending addition.

Function Call/Return. How does stack memory change on a return? When a function returns, its frame is popped from the stack: all local variables are removed, and control jumps to the return address kept on the stack. S(0) returns 0, then S(1) returns 1, S(2) returns 3, and finally S(3) returns 6.

Recursion vs. Iteration
Iteration: terminates when a condition becomes false; each iteration does NOT require extra memory; some iterative solutions may not be as obvious as recursive ones.
Recursion: terminates when a base case is reached; each recursive call requires extra space on the stack; some solutions are easier to formulate recursively.

Summing Multiples of Three. Calculate the sum of all multiples of three from 0 to n.

int sum(int n) {
    int sum = 0;
    for (int i = 0; i <= n; i += 3)
        sum = sum + i;
    return sum;
}

int rsum(int n) {
    if (n == 0) return 0;
    else if (n % 3 != 0) return rsum(n - n % 3);
    else return rsum(n - 3) + n;
}

Finding Maximum Number. Search for the maximum number in an array.

int findMax(int* arr, int n) {
    int max = arr[0];
    for (int i = 1; i < n; i++)
        if (arr[i] > max)
            max = arr[i];
    return max;
}

Finding Maximum Number (recursive). Search for the maximum number in an array.

max(v₁, …, vₙ) = v₁ if n = 1; max(vₙ, max(v₁, …, vₙ₋₁)) otherwise

int rfindMax(int* arr, int n) {
    if (n == 1) return arr[0];
    int max = rfindMax(arr, n - 1);
    if (max < arr[n - 1]) return arr[n - 1];
    return max;
}

Printing Reverse String. Print a string in reverse: rprint("abc") → cba, rprint("2a1bc") → cb1a2

void rprint(char* s, int n) {
    for (int i = n - 1; i >= 0; i--)
        printf("%c", s[i]);
}

void rrprint(char* s, int n) {
    if (n == 0) return;
    printf("%c", s[n - 1]);
    rrprint(s, n - 1);
}

Printing Binary Number. Print a binary number using recursion (note: input a positive integer only). binary(1) → 1, binary(3) → 11, binary(10) → 1010, binary(109) → 1101101

void binary(int n) {
    if (n == 0) return;
    binary(n / 2);
    printf("%d", n % 2);
}

Calculating the Power of x. Iterative vs. recursive programming.

int power(int x, int n) {
    int pow = 1;
    for (int i = 0; i < n; i++)
        pow *= x;
    return pow;
}

int rpower(int x, int n) {
    if (n == 0) return 1;
    else return x * rpower(x, n - 1);
}

Calculating the Power of x. How to implement the recursion more efficiently? Square when the exponent is even: x¹⁰ = (x⁵)² = ((x²)²·x)²

int rpower(int x, int n) {
    if (n == 0) return 1;
    else return x * rpower(x, n - 1);
}

int rpower2(int x, int n) {
    if (n == 0) return 1;
    else if (n % 2 == 0) {
        int m = rpower2(x, n / 2);
        return m * m;
    }
    else return x * rpower2(x, n - 1);
}

Calculating Fibonacci Numbers. Every number is the sum of the two preceding ones: 1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89, 144, …

F(n) = 1 if n = 1 or n = 2; F(n) = F(n−1) + F(n−2) otherwise

int fibo(int n) {
    if (n == 1 || n == 2) return 1;
    int prev = 1, cur = 1, next = 1;
    for (int i = 3; i <= n; i++) {
        prev = cur;
        cur = next;
        next = prev + cur;
    }
    return next;
}

int rfibo(int n) {
    if (n == 1 || n == 2) return 1;
    else return rfibo(n - 1) + rfibo(n - 2);
}

Recursion for Fibonacci Numbers. How many calls happen? [Figure: the call tree of fibo(7), with the calls numbered 1-25 in execution order; fibo(5) is computed twice, fibo(4) three times, and fibo(3) five times.]

Binary Search using Recursion. Compare the middle value of the search space to the target; each step eliminates half of the search space, until a single element (or an empty range) remains.

int bsearch(int arr[], int low, int high, int target) {
    if (low > high) return -1;
    int mid = (low + high) / 2;
    if (target == arr[mid]) return mid;
    else if (target < arr[mid]) return bsearch(arr, low, mid - 1, target);
    else return bsearch(arr, mid + 1, high, target);
}

Time Complexity for Recursion. How to calculate time complexity for recursion: let T(n) be the maximum amount of time taken on an input of size n, and formulate a recurrence relation over sub-problems.

int S(int n) {
    if (n == 0) return 0;
    else return S(n - 1) + n;
}

T(n) = T(n−1) + 1 = T(n−2) + 2 = … = T(0) + n = n + 1 = O(n)

Time Complexity for Recursion. Time complexity for binary search:

T(n) = T(n/2) + 1 = T(n/4) + 2 = … = T(n/2ᵏ) + k, where n/2ᵏ = 1
⟹ T(n) = 1 + log₂ n = O(log n)

int bsearch(int arr[], int low, int high, int target) {
    if (low > high) return -1;
    int mid = (low + high) / 2;
    if (target == arr[mid]) return mid;
    else if (target < arr[mid]) return bsearch(arr, low, mid - 1, target);
    else return bsearch(arr, mid + 1, high, target);
}

Recursion Tree. Visualizing how recurrences unfold. The recursion tree for T(n) = 2T(n/2) + n² has the following form: the root costs n², its two children cost (n/2)² each, the four grandchildren cost (n/4)² each, and so on; the height of the tree is log₂ n.

Recursion Tree. In this recursion tree, the depth does not really matter: the amount of work per level decreases so quickly that the total is only a constant factor more than the root.

T(n) = n² + n²/2 + n²/4 + n²/8 + … + n²/2^(log₂ n) = n²·(1 + 1/2 + 1/4 + 1/8 + …) = O(n²)

Example: Recursion Tree. Consider the recurrence T(n) = 2T(n/2) + n. Now every level costs exactly n (the root costs n, the two children n/2 each, the four grandchildren n/4 each, …), and the height is log₂ n, so T(n) = n·log₂ n = O(n log n).

Master Theorem. Consider the recurrence form T(n) = a·T(n/b) + f(n), where a ≥ 1, b > 1.

Case 1: f(n) = O(n^(log_b a − ε)) for some ε > 0. The leaves grow faster than f(n), so T(n) = Θ(n^(log_b a)).
Case 2: f(n) = Θ(n^(log_b a)). The leaves grow at the same rate as f(n), so T(n) = Θ(n^(log_b a) · log n).
Case 3: f(n) = Ω(n^(log_b a + ε)) for some ε > 0. f(n) grows faster than the number of leaves, so T(n) = Θ(f(n)).

Example: Master Theorem. Consider the following recurrences:

T(n) = 4T(n/2) + n: a = 4, b = 2 → n^(log₂ 4) = n², and f(n) = O(n^(2−ε)) for ε = 1. Case 1 applies, so T(n) = Θ(n²).
T(n) = 4T(n/2) + n²: a = 4, b = 2 → n^(log₂ 4) = n², and f(n) = Θ(n²). Case 2 applies, so T(n) = Θ(n² log n).
T(n) = 4T(n/2) + n³: a = 4, b = 2 → n^(log₂ 4) = n², and f(n) = Ω(n^(2+ε)) for ε = 1. Case 3 applies, so T(n) = Θ(n³).

Tail Recursion. A tail call is a subroutine call performed as the final action of a function. Tail-call optimization: the current frame is no longer needed and can be replaced by the frame of the tail-called function.

Not tail-recursive (the addition happens after the call returns):

int rsum(int n) {
    if (n == 0) return 0;
    else return rsum(n - 1) + n;
}

rsum(4) → rsum(3) + 4 → (rsum(2) + 3) + 4 → ((rsum(1) + 2) + 3) + 4 → (((rsum(0) + 1) + 2) + 3) + 4

Tail-recursive (the partial result is carried in an accumulator):

int rsum2(int n, int sum) {
    if (n == 0) return sum;
    else return rsum2(n - 1, sum + n);
}

rsum2(4, 0) → rsum2(3, 4) → rsum2(2, 7) → rsum2(1, 9) → rsum2(0, 10) → return 10

Example: Fibonacci Numbers. Key advantage of tail recursion: tail-call optimization can save both space and time, because each call can reuse the current stack frame instead of pushing a new one.

int fibo(int n) {
    if (n == 1 || n == 2) return 1;
    else return fibo(n - 1) + fibo(n - 2);
}

int rfiboTail(int n, int prev, int cur) {
    if (n == 1 || n == 2) return cur;
    else return rfiboTail(n - 1, cur, prev + cur);
}

Example: Finding Maximum Number. How to implement a tail-recursive version:

int findMax(int* arr, int n) {
    int max = arr[0];
    for (int i = 1; i < n; i++)
        if (arr[i] > max)
            max = arr[i];
    return max;
}

int rfindMaxTail(int* arr, int n, int max) {
    if (n == 1) return max;
    if (max < arr[n - 1]) max = arr[n - 1];
    return rfindMaxTail(arr, n - 1, max);
}