Analysis of Algorithms
Aaron Tan
http://www.comp.nus.edu.sg/~tantc/cs1101.html
Introduction to Analysis of Algorithms
After you have read and studied this chapter, you should be able to:
- know what analysis of algorithms (complexity analysis) is
- know the definition and uses of the big-O notation
- analyse the running time of an algorithm
Introduction (1/2)
Two aspects of writing efficient code:
- Programming techniques: implementation of algorithms (the practitioner's viewpoint).
- Asymptotic analysis ("big-O" notation, etc.): analysis and design of algorithms (the theoretician's viewpoint).
Asymptotic analysis keeps the student's head in the clouds, while attention to implementation details keeps his feet on the ground.
Introduction (2/2)
Programming techniques versus algorithm design:

int f1(int n) {
    int a, sum = 0;
    for (a = 1; a <= n; a++)    // sums 1 + 2 + ... + n in O(n) time
        sum += a;
    return sum;
}
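For contrast, a closed-form version (not on the slide; shown here as an illustration of the algorithm-design side) computes the same value in O(1) time using the identity 1 + 2 + … + n = n(n + 1)/2:

int f2(int n) {                 // hypothetical name; not from the slides
    return n * (n + 1) / 2;    // constant time; like f1, subject to int overflow for large n
}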
Sum of Two Elements (1/5)
Given this problem: a sorted list of integers, list, and a value, sum, are given. Write a program to find the indices of (any) two distinct elements in the list whose sum is equal to the given value.
Example:
  list: 2, 3, 8, 12, 15, 19, 22, 24
  sum: 23
  answer: elements 8 (at subscript 2) and 15 (at subscript 4)
Sum of Two Elements (2/5)
Algorithm A:
  list: 2, 3, 8, 12, 15, 19, 22, 24
  sum: 23
  answer: elements 8 (at subscript 2) and 15 (at subscript 4)

  n = size of list
  for x from 0 to n – 2
      for y from x + 1 to n – 1
          if ((list[x] + list[y]) == sum) then
              found! (answers are x and y)
Sum of Two Elements (3/5)
Code for algorithm A:

import java.util.*;

class SumOfTwoA {
    public static void main(String[] args) {
        Scanner scanner = new Scanner(System.in);
        int[] list = { 2, 3, 8, 12, 15, 19, 22, 24 };
        int n = list.length;
        System.out.print("Enter sum: ");
        int sum = scanner.nextInt();
        boolean found = false;

        for (int x = 0; x < n - 1 && !found; x++)
            for (int y = x + 1; y < n && !found; y++)
                if (list[x] + list[y] == sum) {
                    System.out.println("Indices at " + x + " and " + y);
                    found = true;
                }
    }
}
Sum of Two Elements (4/5)
Algorithm A uses nested loops and scans some elements many times.
Algorithm B: can you use a single loop that examines each element at most once? If this can be done, it will be more efficient than algorithm A.
Sum of Two Elements (5/5)
Code for algorithm B:
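The slide's own code for algorithm B is not reproduced in this extract. A possible single-pass implementation (an assumption about what the slide intends), using two indices that move inwards from the ends of the sorted array, might look like this:

import java.util.Scanner;

class SumOfTwoB {
    public static void main(String[] args) {
        Scanner scanner = new Scanner(System.in);
        int[] list = { 2, 3, 8, 12, 15, 19, 22, 24 };
        System.out.print("Enter sum: ");
        int sum = scanner.nextInt();

        int x = 0;                    // left index
        int y = list.length - 1;      // right index
        boolean found = false;
        while (x < y && !found) {     // each element is examined at most once: O(n)
            if (list[x] + list[y] == sum) {
                System.out.println("Indices at " + x + " and " + y);
                found = true;
            } else if (list[x] + list[y] < sum) {
                x++;                  // need a larger sum: move left index right
            } else {
                y--;                  // need a smaller sum: move right index left
            }
        }
    }
}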
The Race
Who will reach the finishing line first? (Figure: a race in which one runner starts 100 metres ahead.)
Complexity Analysis (1/2)
Complexity analysis: to measure and predict the behavior (running time, storage space) of an algorithm. We will focus on running time here.
It is inexact, but it provides a good basis for comparisons. We want a good judgment of how an algorithm will perform when the problem size gets very big.
Complexity Analysis (2/2)
Problem size is defined based on the problem at hand. Examples of problem size:
- Number of elements in an array (for sorting problems).
- Length of the strings in an anagram problem.
- Number of discs in the Tower of Hanoi problem.
- Number of cities in the Traveling Salesman Problem (TSP).
Definition (1/4)
Assume the problem size is n and T(n) is the running time.
Upper bound – big-O notation.
Definition: T(n) = O(f(n)) if there are constants c and n₀ such that T(n) ≤ c·f(n) when n ≥ n₀.
We read the equal sign = as "is a member of" (∈), because O(f(n)) is a set of functions. We may also say that T(n) is bounded above by f(n).
Definition (2/4)
(Figure: graphical meaning of the big-O notation – the curve A(n) lies below c·B(n) for all n beyond some point n₀, so A(n) = O(B(n)).)
Definition (3/4)
The functions' relative rates of growth are compared. For instance, compare f(n) = n² with g(n) = 1000n. Although at some points f(n) is smaller than g(n), f(n) actually grows at a faster rate than g(n). (Hence, an algorithm with running-time complexity f(n) is slower than one with running-time complexity g(n) in this example.) Therefore, g(n) = O(f(n)).
The definition says that eventually there is some point n₀ past which c·f(n) is always larger than or equal to g(n). Here, we can take c = 1 and n₀ = 1000.
Definition (4/4)
Besides big-O (upper bound) analysis, there are Omega (lower bound) analysis, Theta (tight bound) analysis, and others.
Exercises (1/4)
f(n) = 1 + 2 + 3 + … + n. Show that f(n) = O(n²).
Proof: 1 + 2 + 3 + … + n = n(n+1)/2 = n²/2 + n/2 ≤ n²/2 + n²/2 = n².
The above is the running time of basic sorting algorithms such as bubblesort, insertion sort and selection sort.
Exercises (2/4)
f(n) = 17 + n + n/3. Show that f(n) = O(n).
Proof: 17 + n + n/3 ≤ 3n (for n ≥ 11), so f(n) = O(n).
f(n) = n⁴ + n² + 20n + 100. Show that f(n) = O(n⁴).
Proof: n⁴ + n² + 20n + 100 ≤ 4n⁴ (for n ≥ 3), so f(n) = O(n⁴).
From the two examples above, it can be seen that an expression is dominated by its term of highest degree.
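In general (a standard fact, stated here for reference rather than taken from the slide): if f(n) = a_k·n^k + a_(k–1)·n^(k–1) + … + a_1·n + a_0, then for n ≥ 1 every term satisfies |a_i|·n^i ≤ |a_i|·n^k, so
  f(n) ≤ (|a_k| + |a_(k–1)| + … + |a_0|)·n^k,
and hence f(n) = O(n^k).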
Exercises (3/4)
Tower of Hanoi. Algorithm:

tower(n, source, temp, dest) {
    if (n > 0) {
        tower(n-1, source, dest, temp);
        move disc from source to dest;
        tower(n-1, temp, source, dest);
    }
}

Let T(n) = number of moves to solve a tower of n discs.
Exercises (4/4)
Tower of Hanoi (cont.)
Let T(n) = number of moves to solve a tower of n discs. Prove that T(n) = 2ⁿ – 1.
T(0) = 0
T(n) = T(n – 1) + 1 + T(n – 1) = 2·T(n – 1) + 1
By induction, assuming T(n – 1) = 2ⁿ⁻¹ – 1:
T(n) = 2·(2ⁿ⁻¹ – 1) + 1 = 2ⁿ – 2 + 1 = 2ⁿ – 1.
Hence T(n) = O(2ⁿ).
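A small Java sketch (not from the slides) that counts the moves made by the algorithm; it can be used to check that T(n) = 2ⁿ – 1 for small n:

// Counts the moves made by the Tower of Hanoi algorithm above.
// Mirrors the recurrence T(n) = T(n-1) + 1 + T(n-1).
static long countMoves(int n) {
    if (n == 0) return 0;
    return countMoves(n - 1) + 1 + countMoves(n - 1);
}

// Example check: for (int n = 0; n <= 10; n++) assert countMoves(n) == (1L << n) - 1;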
Some Common Series
- 1 + 2 + 4 + 8 + 16 + … + 2ⁿ
- 1 + 2 + 3 + 4 + 5 + … + n
- 1² + 2² + 3² + 4² + 5² + … + n²
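Their closed forms (standard identities, added here for reference; they are not given on this slide) are:
  1 + 2 + 4 + 8 + 16 + … + 2ⁿ = 2ⁿ⁺¹ – 1
  1 + 2 + 3 + 4 + 5 + … + n = n(n + 1)/2
  1² + 2² + 3² + 4² + 5² + … + n² = n(n + 1)(2n + 1)/6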
The Conversation
Boss: Your program is too slow! Rewrite it!
You: But why? All we need to do is buy a faster computer!
Is this really the solution?
Complexity Classes (1/4)
There are some common complexity classes. In analysis of algorithms, log refers to log₂, sometimes written simply as lg. Here n is the problem size.

  Notation        Name
  O(1)            Constant.
  O(log n)        Logarithmic.
  O(n)            Linear.
  O(n log n)      Linearithmic, loglinear, quasilinear or supralinear.
  O(n²)           Quadratic.
  O(nᶜ), c > 1    Polynomial, sometimes called algebraic. Examples: O(n²), O(n³), O(n⁴).
  O(cⁿ)           Exponential, sometimes called geometric. Examples: O(2ⁿ), O(3ⁿ).
  O(n!)           Factorial, sometimes called combinatorial.
Complexity Classes (2/4)
Algorithms with polynomial running times are desirable.
Table 1. Running Times for Different Complexity Classes.
Complexity Classes (3/4)
Table 2. Running Times for Algorithm A in Different Time Units.
Complexity Classes (4/4)
Table 3. Size of the Largest Problem that Algorithm A Can Solve if the Solution Is Computed in Time ≤ T at 1 Microsecond per Step.
Analysing Simple Codes (1/4)
Some rules: basic operations are those that can be computed in O(1), or constant, time. Examples are assignment statements, comparison statements, and simple arithmetic operations.
Analysing Simple Codes (2/4)
Code fragment 1:

temp = x;
x = y;
y = temp;

Running time: 3 statements = O(1).

Code fragment 2:

p = list.length;
for (int i = 0; i < p; ++i) {
    list[i] += 3;
}

Running time: 1 + p statements = O(p).
Analysing Simple Codes (3/4)
Code fragment 3:

if (x < y) {
    a = 1;
    b = 2;
    c = 3;
} else {
    a = 2;
    b = 4;
    c = 8;
    d = 13;
    e = 51;
}

Running time: max{3, 5} statements = 5 = O(1).

In code fragments 2 and 3, we consider only assignment statements as our basic operations. Even if we include the loop test (i < p) and update (++i) in fragment 2, and the if test (x < y) in fragment 3, the final result in big-O notation is not affected, since each of these takes constant time.
Analysing Simple Codes (4/4)
Code fragment 4:

sum = 0.0;
for (int k = 0; k < n; ++k)
    sum += array[k];
avg = sum/n;

Running time: 1 + n + 1 statements = n + 2 = O(n).

Code fragment 5:

for (int i = 0; i < n; i++)
    for (int j = 0; j < i; ++j)
        sum += matrix[i][j];

Running time: 0 + 1 + 2 + … + (n–1) = n(n–1)/2 = O(n²).
General Rules (1/3)
Rule 1: Loops. The running time of a loop is at most the running time of the statements inside the loop times the number of iterations.

for (...) {    // n iterations
    ...;       // m statements
}              // total: n × m statements

Example: if m = 3, then the running time is 3n, or O(n).
General Rules (2/3)
Rule 2: Nested loops. Analyse these inside out. The total running time of a statement inside a group of nested loops is the running time of the statement multiplied by the product of the sizes of all the loops.

for (...) {        // k iterations
    for (...) {    // inner loop: O(n)
        ...;
    }
}                  // total: O(kn)
General Rules (3/3)
Rule 3: Selection statements. For the fragment

if (condition)
    S1;
else
    S2;

the running time of an if-else statement is never more than the running time of the condition test plus the larger of the running times of S1 and S2.

if (...) {
    ...;       // m statements
} else {
    ...;       // n statements
}              // total: condition test + max{m, n}
Worst-case Analysis
We may analyse an algorithm/code based on the best-case, average-case and worst-case scenarios.
Average-case and worst-case analyses are usually better indicators of performance than best-case analysis.
The worst case is usually easier to determine than the average case.
Running Time of Some Known Algorithms
The following are the worst-case running times of some known algorithms on arrays. The problem size, n, is the number of elements in the array.
- Sequential search (linear search) in an array: O(n).
- Binary search in a sorted array: O(lg n).
- Simple sorts (bubblesort, selection sort, insertion sort): O(n²).
- Mergesort: O(n lg n).
Sequential Search vs Binary Search (1/2)
Sequential/linear search: start from the first element and visit each element to see if it matches the search item.

public static int linearSearch(int[] list, int searchValue) {
    for (int i = 0; i < list.length; i++) {
        if (list[i] == searchValue)
            return i;
    }
    return -1;
}
Sequential Search vs Binary Search (2/2)
Binary search: works on a sorted array. Examine the middle element and eliminate half of the array.

public static int binarySearch(int[] list, int searchValue) {
    int left = 0;
    int right = list.length - 1;
    int mid;
    while (left <= right) {
        mid = (left + right) / 2;
        if (list[mid] == searchValue)
            return mid;
        else if (list[mid] < searchValue)
            left = mid + 1;
        else
            right = mid - 1;
    }
    return -1;
}
Analysis of Sequential Search
Assume an array with n elements; the basic operation is the comparison.
- Best case: the key is found at the first element. Running time: O(1).
- Worst case: the key is found at the last element, or the key is not found. Running time: O(n).
- Average case: assuming every element is equally likely to match the key, on average the key is found after n/2 comparisons. Running time: O(n).
Analysis of Binary Search
Assume an array with n elements; the basic operation is the comparison.
- Best case: the key is found at the middle element. Running time: O(1).
- Worst case: running time O(lg n). Why? If you start with the value n, how many times can you halve it until it becomes 1? Examples: starting with 8, it takes 3 halvings to get to 1; starting with 32, it takes 5 halvings; starting with 1024, it takes 10 halvings.
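A small illustrative snippet (not from the slides) that does this counting; it returns about lg n, which is why binary search makes O(lg n) comparisons in the worst case:

// Counts how many times n can be halved (integer division) before reaching 1.
// countHalvings(8) == 3, countHalvings(32) == 5, countHalvings(1024) == 10.
static int countHalvings(int n) {
    int count = 0;
    while (n > 1) {
        n = n / 2;
        count++;
    }
    return count;
}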
Analysis of Sort Algorithms
For comparison-based sorting algorithms, the basic operation used in the analysis is either the number of comparisons or the number of swaps (exchanges).
Worst-case analysis: all three basic sorts – selection sort, bubble sort, and insertion sort – have a worst-case running time of O(n²), where n is the array size.
What is the worst-case scenario for bubble sort? For selection sort? For insertion sort?
Maximum Subsequence Sum (1/6)
Given this problem: given (possibly negative) integers a₀, a₁, a₂, …, aₙ₋₁, find the maximum value of aᵢ + aᵢ₊₁ + … + aⱼ over all 0 ≤ i ≤ j ≤ n–1. (For convenience, the maximum subsequence sum is 0 if all the integers are negative.)
Example:
  list: -2, 11, -4, 13, -5, -2
  answer: 20 (a₁ through a₃).
There are many algorithms to solve this problem.
Maximum Subsequence Sum (2/6)
Algorithm 1:

public static int maxSubseqSum(int[] list) {
    int thisSum, maxSum;
    maxSum = 0;
    for (int i = 0; i < list.length; i++)
        for (int j = 0; j < list.length; j++) {
            thisSum = 0;
            for (int k = i; k <= j; k++)
                thisSum += list[k];    // count this line
            if (thisSum > maxSum)
                maxSum = thisSum;
        }
    return maxSum;
}
Maximum Subsequence Sum (3/6)
Algorithm 1: Analysis
How many times is the line thisSum += list[k]; executed?
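One way to count this (a standard derivation, added here for reference; only the pairs with j ≥ i contribute, since otherwise the inner loop body never runs): for each pair i ≤ j the line executes j – i + 1 times, so the total is
  Σ (i = 0 to n–1) (n – i)(n – i + 1)/2 = Σ (k = 1 to n) k(k + 1)/2 = n(n + 1)(n + 2)/6,
which is O(n³).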
Maximum Subsequence Sum (4/6)
Algorithm 2:
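The slide's code for algorithm 2 is not reproduced in this extract. A possible O(n²) version (an assumption, consistent with the description on the next slide; the method name is illustrative) keeps a running sum for each starting index i instead of recomputing it:

public static int maxSubseqSum2(int[] list) {
    int maxSum = 0;
    for (int i = 0; i < list.length; i++) {
        int thisSum = 0;
        for (int j = i; j < list.length; j++) {
            thisSum += list[j];        // extends the subsequence a_i..a_j in O(1)
            if (thisSum > maxSum)
                maxSum = thisSum;
        }
    }
    return maxSum;                     // two nested loops: O(n^2)
}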
Maximum Subsequence Sum (5/6)
Algorithm 2: Analysis
Algorithm 2 avoids the cubic running time O(n³) by removing the innermost for-k loop in algorithm 1. The new running-time complexity is O(n²).
Maximum Subsequence Sum (6/6)
Algorithm 3:
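Again, the slide's code is not included in this extract. A common linear-time approach (a Kadane-style scan, assumed here; the method name is illustrative) looks like this:

public static int maxSubseqSum3(int[] list) {
    int maxSum = 0, thisSum = 0;
    for (int j = 0; j < list.length; j++) {
        thisSum += list[j];
        if (thisSum > maxSum)
            maxSum = thisSum;
        else if (thisSum < 0)
            thisSum = 0;    // a negative prefix can never start a maximum subsequence
    }
    return maxSum;          // one pass over the array: O(n)
}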
47 End of file