1 Algorithm Analysis
2 Question Suppose you have two programs that each sort a list of student records and let you search for student information. How do you judge which is the better program?
3 How to Measure “Betterness”
Critical resources in a computer:
- Time (faster is better)
- Memory space (less is better)
For most algorithms, running time depends on the size n of the data.
Notation: t(n) -- time t is a function of the data size n.
4 To Determine “Betterness”
Benchmarking (running the programs): results depend on external conditions --
- machine speed
- parallel processing
- machine memory
- data size
Algorithm analysis: estimating intrinsic properties of the algorithms themselves.
5 Algorithm Analysis A methodology for estimating the resource (time and space) consumption of an algorithm. It allows us to compare the relative costs of two or more algorithms for solving the same problem.
6 Estimation Estimation is based on: the size of the input and the number of basic operations. The time to complete a basic operation does not depend on the values of its operands.
7 Example: largest() Find the position of the largest value in an array:
int largest(int a[], int n) {
    int posBig = 0;
    for (int i = 1; i < n; i++)
        if (a[i] > a[posBig])
            posBig = i;
    return posBig;
}
8 Example: largest() The basic operation is the comparison. It takes a fixed amount of time to do one comparison, regardless of the values of the two integers or their positions in the array. The size of the problem is n. The running time is: t(n) = c1 + c2·n.
9 Example: Assignment int x = a[0]; The running time is t(n) = c. This is called constant running time.
10 Example: Sum What is the running time of this code? The basic operation is the addition (sum += a[i]). The cost of one addition can be bundled into a constant c. The loop performs it n times. The total running time is t(n) = c1 + c2·n.
int total(int a[], int n) {
    int sum = 0;
    for (int i = 0; i < n; i++)
        sum += a[i];
    return sum;
}
11 Example: Selection Sort [diagram: array a split into a sorted prefix and an unsorted suffix, with markers current and small]
Loop (for current = 0 to n-1)
    small ← current
    Loop (for i = current to n-1)
        If (a[i] < a[small]) Then
            small ← i
        End If
    End Loop
    swap(a[current], a[small])
End Loop
12 Example: Selection Sort
void selectSort(int a[], int n) {
    for (int i = 0; i < n; i++) {
        int small = i;
        for (int j = i; j < n; j++) {
            if (a[j] < a[small])
                small = j;
        }
        int temp = a[i];
        a[i] = a[small];
        a[small] = temp;
    }
}
13 Example: Selection Sort Total number of comparisons (to find small):
   n + (n-1) + (n-2) + ... + 2 + 1
+  1 + 2 + ... + (n-2) + (n-1) + n
______________________________________
= (n+1) + (n+1) + ... + (n+1)   [n terms]
2(total) = n(n+1)
total = (1/2)(n² + n)
t(n) = c1·n² + c2·n
14 Dominant Term Consider t(n) = c·n² + c·n. If n is large, e.g., n = 1000, then t(n) = c(1000)² + c(1000) = c(1,000,000) + c(1000) ≈ c(1,000,000). The n² term dominates.
15 The Growth Rate The growth rate of an algorithm is the rate at which the cost of the algorithm grows as the data size n grows. Assumptions: growth rates are estimates for comparing the behavior of algorithms (not absolute speeds), and they are meaningful when n is large.
16 Order of Complexity Big-O Notation Suppose t(n) = c1 + c2·n describes the algorithm’s time dependency on n. Then the growth rate of the algorithm is O(n) -- “Big O of n” or “order n”.
17 Big-O Notation
Given t(n) = c, growth rate: O(1) -- constant
Given t(n) = c1 + c2·n + c3·n², growth rate: O(n²) -- quadratic
Given t(n) = c1 + c2·n + c3·n² + c4·n³, growth rate: O(n³) -- cubic
Given t(n) = c1 + c2·2^n, growth rate: O(2^n) -- exponential
18 Growth Rate Graph [figure: growth-rate curves T(n) plotted against n, including c, log n, n, and n log n]
19 Characteristics of Growth Rates
Constant -- t(n) = c: independent of n
Linear -- t(n) = cn: constant slope
Quadratic -- t(n) = cn²: increasing slope
Cubic -- t(n) = cn³: still steeper rise
Exponential -- t(n) = c·2^n: very fast rise
What are their growth rates?
20 Practical Considerations Many problems whose obvious solution requires O(n²) time also have a solution that requires O(n log n) time. Examples: sorting, searching.
21 Practical Considerations There is not a large difference in running time between O(n) and O(n log n). Example, with n = 10,000 (using log base 10):
n = 10,000
n log n = 10,000 × log10(10,000) = 40,000
22 Practical Considerations There is an enormous difference between O(n²) and O(n log n):
n² = 10,000² = 100,000,000
n log n = 10,000 × log10(10,000) = 40,000
23 Big-O Examples Linear search:
bool linearSearch(int a[], int key, int count) {
    for (int i = 0; i < count; i++)
        if (key == a[i])
            return true;
    return false;
}
t(n) = c1 + c2·n. Thus, growth rate: O(n)
24 Big-O Examples Summing row 1 of a 2-D array:
int sum = 0;
for (int col = 0; col < 100; col++) {
    sum += a[1][col];
}
t(n) = c1 + 100·c2. Since the loop bound is a constant independent of n, growth rate: O(1)
25 Big-O Examples Bubble Sort:
void bubbleSort(int a[], int n) {
    for (int i = 0; i < n - 1; i++) {
        for (int j = 0; j < n - i - 1; j++) {
            if (a[j] > a[j + 1])
                swap(a[j], a[j + 1]);
        }
    }
}
t(n) = f(n) × g(n). Thus, growth rate: O(n) × O(n) = O(n²)
26 Simplifying Rules
O(f + g) = the greater of O(f) and O(g)
O(f × g) = O(f) × O(g)
E.g.:
O(c1·n + c2·n²) = O(n) + O(n²) = O(n²)
O(c1·n × c2·n²) = O(n) × O(n²) = O(n³)
27 Your Turn Given a 2-D array of n rows and n columns, what is the growth rate of an algorithm that finds the average of all elements?
sum = 0;
for (i = 0; i < n; i++) {
    for (j = 0; j < n; j++) {
        sum += a[i][j];
    }
}
ave = sum / (n * n);
28 Big-O Examples Binary Search: how many elements are examined in the worst case?
29 Big-O Examples Binary Search:
bool binSearch(int a[], int count, int key) {
    int lo = 0;
    int hi = count - 1;
    bool found = false;
    while (lo <= hi && !found) {
        int mid = (lo + hi) / 2;   // recompute mid on every pass
        if (key == a[mid])
            found = true;
        else if (key < a[mid])
            hi = mid - 1;
        else
            lo = mid + 1;
    }
    return found;
}
30 Big O Examples Binary Search
Comparison   Remaining   E.g. (n = 1000)
0            n           1000
1            n/2         500
2            n/4         250
3            n/8         125
4            n/16        62
5            n/32        31
6            n/64        15
7            n/128       7
8            n/256       3
9            n/512       1
k            n/2^k       1
31 Big O Example Binary Search
n / 2^k = 1
n = 2^k
log2(n) = log2(2^k) = k·(log2 2) = k·1 = k
Thus, growth rate = O(log n)
32 Best, Worst, Average Cases For an algorithm with a given growth rate, we consider the best case, the worst case, and the average case. For example, sequential search for a key K in an array of n integers:
Best case: the first item of the array equals K
Worst case: only the last position of the array equals K (or K is absent)
Average case: a match at position n/2
33 Which Analysis to Use The Best Case Normally, we are not interested in the best case, because it is too optimistic and is not a fair characterization of the algorithm’s running time. It is useful in some rare cases, where the best case has a high probability of occurring.
34 Which Analysis to Use The Worst Case. Useful in many real-time applications. Advantage: predictability -- you know for certain that the algorithm can perform no worse than this bound. Disadvantage: it might not be a representative measure of the behavior of the algorithm on inputs of size n.
35 Which Analysis to Use? The Average Case. Often we prefer to know the average-case running time, because it reveals the typical behavior of the algorithm on inputs of size n. Average-case estimation is not always possible: for the sequential search example, it assumes that the key K is equally likely to appear in any position of the array, and this assumption is not always correct.
36 The moral of the story If we know enough about the distribution of our input, we prefer average-case analysis. If we do not know the distribution, then we must resort to worst-case analysis. For real-time applications, worst-case analysis is the preferred method.
37 Your Turn: insertAtFront(item) with array vector What is the growth rate for the insertAtFront() method of a vector, implemented by an array?
38 Your Turn: insertAtBack(item) with array vector What is the growth rate for the insertAtBack() method of a vector, implemented by an array?
39 Your Turn: insertInOrder(item) with array vector What is the growth rate for the insertInOrder() method of a vector, implemented by an array?
40 Your Turn: push(item) with array stack What is the growth rate for the push() method of a stack, implemented by an array?
41 Your Turn: pop() with array stack What is the growth rate for the pop() method of a stack, implemented by an array?
42 Your Turn: enqueue(item) with array queue What is the growth rate for the enqueue() method of a queue, implemented by an array?
43 Your Turn: dequeue() with array queue What is the growth rate for the dequeue() method of a queue, implemented by an array?
44 Your Turn: insertAtFront(item) with vector using linked list What is the growth rate for the insertAtFront() method of a vector, implemented by a linked list?
45 Your Turn: insertAtBack(item) with vector using linked list What is the growth rate for the insertAtBack() method of a vector, implemented by a linked list?
46 Your Turn: insert(item) with BST using pointers What is the growth rate for the insert() method of a Binary Search Tree, implemented with pointers?
47 Your Turn: search() with BST using pointers What is the growth rate for the search() method of a Binary Search Tree, implemented with pointers?
48 Solution: Vector with array of size n
insertAtFront(item)
    Shift elements right from pos 0 through n-1 ..... c1·n
    v[0] ← item ..... c2
    n++ ..... c3
    t(n) = c1·n + c2 + c3. Growth rate: O(n)
insertAtBack(item)
    v[n] ← item ..... c1
    n++ ..... c2
    t(n) = c1 + c2. Growth rate: O(1)
49 Solution: Vector with array of size n (2)
insertInOrder(item)
    Find the position pos to insert ..... c1·n
    Shift elements right from pos through n-1 ..... c2·n
    v[pos] ← item ..... c3
    n++ ..... c4
    t(n) = c1·n + c2·n + c3 + c4 = n(c1 + c2) + c5. Growth rate: O(n)
50 Solution: Stack with array of size n
push(item)
    top++ ..... c1
    stack[top] ← item ..... c2
    t(n) = c1 + c2. Growth rate: O(1)
pop()
    save stack[top] ..... c1
    top-- ..... c2
    t(n) = c1 + c2. Growth rate: O(1)
51 Solution: Queue with array of size n
enqueue(item)
    back ← (back + 1) mod MAX ..... c1
    queue[back] ← item ..... c2
    t(n) = c1 + c2. Growth rate: O(1)
dequeue()
    save queue[front] ..... c1
    front ← (front + 1) mod MAX ..... c2
    t(n) = c1 + c2. Growth rate: O(1)
52 Solution: Vector as linked list with n nodes
insertAtFront(item)
    t ← new node ..... c1
    t->next ← head ..... c2
    head ← t ..... c3
    t(n) = c1 + c2 + c3. Growth rate: O(1)
insertAtBack(item)
    t ← new node ..... c1
    Find the node s at the end of the list ..... c2·n
    s->next ← t ..... c3
    t(n) = c1 + c2·n + c3. Growth rate: O(n)
53 Solution: BST with pointers
insert(item)
    t ← new node ..... c1
    Find the insertion point, eliminating half of the tree after each comparison ..... c2·log n
    Link in node t ..... c3
    t(n) = c1 + c2·log n + c3. Growth rate: O(log n) for a balanced tree (O(n) in the worst case, when the tree degenerates to a list)
search(key)
    Find a matching node, eliminating half of the tree after each comparison (like binary search)
    t(n) = c1·log n. Growth rate: O(log n) for a balanced tree
54 Running Time Examples (1) Example 1: a = b; This assignment takes constant time, so it is O(1). Example 2: sum = 0; for (i=1; i<=n; i++) sum += n;
55 Running Time Examples (2) Example 3: sum = 0; for (j=1; j<=n; j++) for (i=1; i<=j; i++) sum++; for (k=0; k<n; k++) A[k] = k;
56 Running Time Examples (3) Example 4: sum1 = 0; for (i=1; i<=n; i++) for (j=1; j<=n; j++) sum1++; sum2 = 0; for (i=1; i<=n; i++) for (j=1; j<=i; j++) sum2++;
57 Running Time Examples (4) Example 5: sum1 = 0; for (k=1; k<=n; k*=2) for (j=1; j<=n; j++) sum1++; sum2 = 0; for (k=1; k<=n; k*=2) for (j=1; j<=k; j++) sum2++;
58 Other Control Statements while loop: Analyze like a for loop. if statement: Take greater complexity of then / else clauses. switch statement: Take complexity of most expensive case. Subroutine call: Complexity of the subroutine.
59 Practical Considerations Code tuning can also lead to dramatic improvements in running time. Code tuning is the art of hand-optimizing a program to run faster or require less storage. For many programs, code tuning can reduce running time by a factor of ten.
60 Remarks Most statements in a program do not have much effect on its running time. There is little point in cutting in half the running time of a subroutine that accounts for only 1% of the total. Focus your attention on the parts of the program that have the most impact.
61 Remarks When tuning code, it is important to gather good timing statistics. Be careful not to use tricks that make the program unreadable. Make use of compiler optimizations. Check that your optimizations really improve the program.
62 Remarks Comparative timing of programs is a difficult business: experimental errors arise from uncontrolled factors (system load, language, compiler, etc.), bias towards one program, and unequal code tuning.
63 Remarks The greatest time and space improvements come from a better data structure or algorithm. “FIRST TUNE THE ALGORITHM, THEN TUNE THE CODE”
64 Appendix A - Notes Algorithm Analysis
65 Computational complexity theory Complexity theory is the part of the theory of computation dealing with the resources required during computation to solve a given problem. The most common resources are time (how many steps does it take to solve a problem) and space (how much memory does it take to solve a problem).
66 Computational complexity theory Other resources can also be considered, such as how many parallel processors are needed to solve a problem in parallel. Complexity theory differs from computability theory, which deals with whether a problem can be solved at all, regardless of the resources required.
67 Computational complexity theory If a problem has time complexity O(n²) on one typical computer, then it will also have complexity O(n²) on most other computers. This notation allows us to generalize away from the details of a particular computer.
68 Big Oh Big Oh gives an upper bound on a function. In algorithm analysis, we use it to bound the worst-case running time -- the longest running time possible for any input of size n. We can say that the maximum running time of the algorithm is on the order of the Big-Oh bound.
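For reference, the usual formal definition (standard in textbooks, though not stated on the slides):

```latex
% f is O(g) if f is eventually bounded above by a constant multiple of g:
f(n) = O(g(n)) \iff \exists\, c > 0,\ \exists\, n_0 \ge 0 :\ \forall n \ge n_0,\quad f(n) \le c \cdot g(n)
```

For example, t(n) = c1 + c2·n is O(n), since t(n) ≤ (c1 + c2)·n for all n ≥ 1, i.e., the definition is satisfied with c = c1 + c2 and n0 = 1.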
69 Appendix B - Exercises Algorithm Analysis
70 Write True or False 1. The equation T(n) = 3n + 2 is an example of a linear growth rate. 2. If Algorithm A has a faster growth rate than Algorithm B in the average case, that means Algorithm A is more efficient than Algorithm B on average. 3. When performing asymptotic analysis, we can ignore constants and low order terms.
71 Write True or False 4. The best case for an algorithm occurs when the input size is as small as possible. 5. Asymptotic algorithm analysis is most useful when the input size is small. 6. When performing algorithm analysis, we measure the cost of programs in terms of basic operations. Each operation should require constant time.
72 Write True or False 7. If a program has a growth rate proportional to n² for an input size n, then a computer that runs twice as fast will be able to run in one hour an input that is twice as large as that which can be run in one hour on the slower computer.
73 Write True or False 8. The concepts of asymptotic analysis apply equally well to space costs as they do to time costs. 9. The most reliable method for comparing two approaches to solving a problem is simply to write two programs and compare their running times. 10. We can often make a program faster if we are willing to use more space, and conversely, we can often make a program require less space if we are willing to accept more running time.
74 Exercise Suppose that a particular algorithm has time complexity T(n) = n², and that executing an implementation of it on a particular machine takes T seconds for N inputs. Now suppose we are presented with a machine that is 64 times as fast. How many inputs could we process on the new machine in T seconds?