Chapter 5: Algorithms (2)
Introduction to CS, 1st Semester, 2015
Sanghyun Park
Outline
Informal Definition of an Algorithm (previous file)
FindLargest (previous file)
Three Basic Constructs (previous file)
Sorting Algorithms (previous file)
Searching Algorithms (previous file)
Recursion (previous file)
Algorithm Performance
Time Complexity
Asymptotic Notation
Growth Rate
Algorithm Performance
Algorithm performance is the amount of computer memory and time needed to run an algorithm
How is it determined?
Analytically: performance analysis
Experimentally: performance measurement
Space complexity is defined as the amount of memory an algorithm needs to run to completion
Time complexity is defined as the amount of computer time an algorithm needs to run to completion
Time Complexity
How to measure?
Count a particular operation (operation counts)
Count the number of steps (step counts)
Asymptotic complexity
Running Example: Insertion Sort

for (int i = 1; i < n; i++)   // n is the number of elements in an array
{
    // insert a[i] into a[0:i-1]
    int t = a[i];
    int j;
    for (j = i - 1; j >= 0 && t < a[j]; j--)
        a[j + 1] = a[j];
    a[j + 1] = t;
}
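As a sanity check, here is one way to turn the slide's loop into a small, compilable program (the function name insertionSort, the use of std::vector, and the test data in main are illustrative assumptions, not from the slides):

#include <iostream>
#include <vector>

// Insertion sort over a vector, mirroring the loop on the slide.
void insertionSort(std::vector<int>& a)
{
    int n = static_cast<int>(a.size());
    for (int i = 1; i < n; i++)
    {
        // insert a[i] into the already-sorted prefix a[0..i-1]
        int t = a[i];
        int j;
        for (j = i - 1; j >= 0 && t < a[j]; j--)
            a[j + 1] = a[j];   // shift larger elements one slot to the right
        a[j + 1] = t;
    }
}

int main()
{
    std::vector<int> a = {5, 2, 4, 6, 1, 3};   // illustrative test data
    insertionSort(a);
    for (int x : a) std::cout << x << ' ';      // prints: 1 2 3 4 5 6
    std::cout << '\n';
}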
Operation Count
Pick an instance characteristic … n; n = the number of elements in the case of insertion sort
Determine the count as a function of this instance characteristic
Comparison Count
How many comparisons are made?
The number of comparisons depends on the array elements a[] and the value t being inserted, as well as on n
Worst case count = maximum count
Best case count = minimum count
Average case count

for (int i = 1; i < n; i++)
    for (j = i - 1; j >= 0 && t < a[j]; j--)
        a[j + 1] = a[j];
Worst Case Comparison Count

for (j = i - 1; j >= 0 && t < a[j]; j--)
    a[j + 1] = a[j];

a = [1,2,3,4] and t = 0 → 4 comparisons
a = [1,2,3,4,…,i] and t = 0 → i comparisons

for (int i = 1; i < n; i++)
    for (j = i - 1; j >= 0 && t < a[j]; j--)
        a[j + 1] = a[j];

Total comparisons = 1 + 2 + … + (n−1) = (n−1)n/2
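To see the worst-case count concretely, the loop can be instrumented to count every evaluation of t < a[j]; a reverse-sorted array forces the maximum number of comparisons. A minimal sketch (the helper name comparisonCount and the test size n = 8 are illustrative, not from the slides):

#include <iostream>
#include <vector>

// Count the comparisons "t < a[j]" made by insertion sort on one input.
long long comparisonCount(std::vector<int> a)
{
    long long count = 0;
    int n = static_cast<int>(a.size());
    for (int i = 1; i < n; i++)
    {
        int t = a[i];
        int j;
        for (j = i - 1; j >= 0; j--)
        {
            ++count;                    // one comparison t < a[j]
            if (!(t < a[j])) break;     // insertion position found
            a[j + 1] = a[j];            // shift larger element right
        }
        a[j + 1] = t;
    }
    return count;
}

int main()
{
    int n = 8;
    std::vector<int> worst(n);
    for (int i = 0; i < n; i++) worst[i] = n - i;    // reverse-sorted: worst case
    std::cout << comparisonCount(worst) << '\n';      // prints 28
    std::cout << (long long)(n - 1) * n / 2 << '\n';  // closed form (n-1)n/2 = 28
}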
Asymptotic Complexity of Insertion Sort
O(n²)
What does this mean?
Time or number of operations does not exceed c·n² on any input of size n
So, the worst case time is expected to quadruple each time n is doubled
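For example, using the worst-case count from the previous slide: at n = 1,000 the sort makes (999 · 1,000)/2 = 499,500 comparisons, while at n = 2,000 it makes (1,999 · 2,000)/2 = 1,999,000, roughly four times as many.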
Growth Rates
It is often necessary to approximate the number of steps required to execute an algorithm
We can approximate one function with another function using a growth rate
We will examine the asymptotic behavior of two functions f and g by comparing f(n) and g(n) for large positive values of n
Asymptotic Notation
Big O is an example of asymptotic notation
Big O provides an upper bound for an algorithm’s time performance
Other asymptotic notation:
Big Omega notation provides a lower bound for an algorithm’s time performance
Big Theta notation is used when an algorithm can be bounded both from above and below by the same function
Big Oh Notation
This is written f(n) = O(g(n)) and is stated, “f(n) is Big Oh g(n)”
A more mathematical definition of Big Oh is: f(n) = O(g(n)), if there are positive numbers c and m such that f(n) ≤ c·g(n) for all n ≥ m
Since 5n³ + 4n² ≤ 7n³ for all n ≥ 2, 5n³ + 4n² = O(n³)
Big Oh is often called an upper bound
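(Checking the definition with c = 7 and m = 2: for n ≥ 2 we have 4n² ≤ 2n³, so 5n³ + 4n² ≤ 5n³ + 2n³ = 7n³ = c·g(n).)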
Big Omega Notation
“f has an order greater than or equal to g” if there are positive numbers c and m such that f(n) ≥ c·g(n) for all n ≥ m
This is written f(n) = Ω(g(n)) and is stated as “f(n) is Big Omega g(n)”
Big Omega is often called a lower bound
Big Theta Notation
“f has the same growth rate as g” if we can find a number m and two nonzero numbers c and d such that c·g(n) ≤ f(n) ≤ d·g(n) for all n ≥ m
Big Theta (Θ) notation is used when an algorithm can be bounded both from above and below by the same function
Common Growth Rate Functions (1/2)
1 (constant): growth is independent of the problem size N
log₂ N (logarithmic): growth increases slowly compared to the problem size (binary search; see the sketch after this slide)
N (linear): growth is directly proportional to the size of the problem
N · log₂ N: typical of some divide and conquer approaches (merge sort)
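For a concrete picture of logarithmic growth: binary search halves the remaining range on every step, so roughly log₂ N iterations suffice. A minimal sketch, assuming a sorted std::vector<int>; the helper name binarySearch and the test data are illustrative, not from the slides:

#include <iostream>
#include <vector>

// Binary search over a sorted vector; each step halves the search range,
// so at most about log2(N) + 1 iterations are needed.
int binarySearch(const std::vector<int>& a, int target)
{
    int lo = 0, hi = static_cast<int>(a.size()) - 1;
    while (lo <= hi)
    {
        int mid = lo + (hi - lo) / 2;
        if (a[mid] == target) return mid;      // found
        if (a[mid] < target)  lo = mid + 1;    // discard left half
        else                  hi = mid - 1;    // discard right half
    }
    return -1;   // not found
}

int main()
{
    std::vector<int> a = {1, 3, 5, 7, 9, 11, 13};   // must be sorted
    std::cout << binarySearch(a, 9) << '\n';         // prints 4
    std::cout << binarySearch(a, 2) << '\n';         // prints -1
}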
Common Growth Rate Functions (2/2)
N² (quadratic): typical in nested loops (see the sketch below)
N³ (cubic): more nested loops
2^N (exponential): growth is extremely rapid and possibly impractical
N! (factorial)
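For instance, a pair of nested loops over the same n elements executes its body n · n times, which is where quadratic (N²) running times typically come from. A tiny sketch (the step counter and n = 4 are illustrative):

#include <iostream>

int main()
{
    int n = 4;
    long long steps = 0;
    // Two nested loops over n elements: the body runs n * n times.
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
            ++steps;
    std::cout << steps << '\n';   // prints 16 = 4 * 4
}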
Growth Rate: log₂ N (figure)
Growth Rate: N² (figure)
Practical Complexities (table): log N, N, N log N, N², N³, 2^N