Presentation transcript:

1 Algorithm Analysis

2 Question Suppose you have two programs that will sort a list of student records and allow you to search for student information. How do you judge which is a better program?

3 How to Measure “Betterness” Critical resources in a computer: time (faster is better) and memory space (less is better). For most algorithms, the running time depends on the size n of the data. Notation: t(n) -- time t is a function of data size n.

4 To Determine “Betterness” Two approaches: (1) running the programs (benchmarks), where the results depend on external conditions such as machine speed, parallel processing, machine memory, and data size; (2) algorithm analysis, which estimates intrinsic properties of the algorithms.
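As an illustration of why benchmark timings are hard to compare across machines (a minimal C++ sketch, not part of the original slides; the sortRecords routine and the input size are placeholders):

    #include <algorithm>
    #include <chrono>
    #include <iostream>
    #include <vector>

    // Stand-in for one of the two record-sorting programs being compared.
    void sortRecords(std::vector<int>& records) {
        std::sort(records.begin(), records.end());
    }

    int main() {
        std::vector<int> records(100000);
        for (int i = 0; i < (int)records.size(); i++)
            records[i] = (int)records.size() - i;   // reverse-ordered input

        auto start = std::chrono::steady_clock::now();
        sortRecords(records);
        auto stop = std::chrono::steady_clock::now();

        // The number printed depends on machine speed, system load, compiler
        // flags, and the chosen input -- the external conditions the slide lists.
        std::cout << "elapsed ms: "
                  << std::chrono::duration_cast<std::chrono::milliseconds>(stop - start).count()
                  << "\n";
        return 0;
    }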

5 Algorithm Analysis Is a methodology for estimating the resource (time and space) consumption of an algorithm. Allows us to compare the relative costs of two or more algorithms for solving the same problem.

6 Estimation Estimation is based on: the size of the input and the number of basic operations. The time to complete a basic operation does not depend on the values of its operands.

7 Example: largest() Find the position of the largest value in an array:

int largest(int a[], int n) {
    int posBig = 0;
    for (int i = 1; i < n; i++)
        if (a[i] > a[posBig])
            posBig = i;
    return posBig;
}

8 Example: largest() The basic operation is the comparison. It takes a fixed amount of time to do one comparison, regardless of the values of the two integers or their positions in the array. The size of the problem is n. The running time is: t(n) = c1 + c2*n.

9 Example: Assignment int x = a[0]; The running time is: t(n) = c. This is called constant running time.

10 Example: Sum What is the running time for this code?

int total(int a[], int n) {
    int sum = 0;
    for (int i = 0; i < n; i++)
        sum += a[i];
    return sum;
}

The basic operation is the addition into sum. The cost of one addition can be bundled into a constant time c, and the loop performs it n times, which takes c*n. The total running time is: t(n) = c1 + c2*n.

11 Example: Selection Sort Pseudocode (the slide also shows an array divided into a sorted part and an unsorted part, with markers for current and small):

Loop (for current = 0 to n-1)
    small <- current
    Loop (for i = current to n-1)
        If (a[i] < a[small]) Then
            small <- i
        End If
    End Loop
    swap(a[current], a[small])
End Loop

12 Example: Selection Sort

void selectSort(int a[], int n) {
    for (int i = 0; i < n; i++) {
        int small = i;
        for (int j = i; j < n; j++) {
            if (a[j] < a[small])
                small = j;
        }
        int temp = a[i];
        a[i] = a[small];
        a[small] = temp;
    }
}

13 Example: Selection Sort Total number of steps (to find small):

total = n + (n-1) + (n-2) + ... + 2 + 1

Writing the same sum in reverse order underneath and adding column by column, each pair sums to (n+1) and there are n pairs:

2 * total = n(n + 1)
total = (1/2)(n^2 + n)
t(n) = c*n^2 + c*n

14 Dominant Term Consider: t(n) = c*n^2 + c*n. If n is large, e.g., n = 1000, then t(n) = c*(1000)^2 + c*(1000) = c*1,000,000 + c*1,000, so the n^2 term dominates.

15 The Growth Rate The growth rate of an algorithm is the rate at which the cost of the algorithm grows as the data size n grows. Assumptions: growth rates are estimates for comparing the behavior of algorithms (not absolute speeds), and they have meaning when n is large.

16 Order of Complexity (Big-O Notation) Suppose t(n) = c1 + c2*n describes the algorithm's time dependency on n. Then the growth rate of the algorithm is O(n) -- "Big O of n" or "order n".

17 Big-O Notation
Given: t(n) = c                              Growth rate: O(1) -- constant
Given: t(n) = c1 + c2*n + c3*n^2             Growth rate: O(n^2) -- quadratic
Given: t(n) = c1 + c2*n + c3*n^2 + c4*n^3    Growth rate: O(n^3) -- cubic
Given: t(n) = c1 + c2*2^n                    Growth rate: O(2^n) -- exponential

18 Growth Rate Graph [Graph of t(n) versus n comparing the common growth-rate curves, such as the constant c, log n, n, and n log n.]

19 Characteristics of Growth Rates
Constant: t(n) = c -- independent of n
Linear: t(n) = c*n -- constant slope
Quadratic: t(n) = c*n^2 -- increasing slope
Cubic: t(n) = c*n^3
Exponential: t(n) = c*2^n -- very fast rise
What are their growth rates?

20 Practical Considerations Many problems whose obvious solution requires O(n^2) time also have a solution that requires O(n log n). Examples: sorting, searching.

21 Practical Considerations There is not a large difference in running time between O1(n) and O2(n log n). Example: O1(10,000) = 10,000, while O2(10,000) = 10,000 * log10(10,000) = 10,000 * 4 = 40,000.

22 Practical Considerations There is an enormous difference between O1(n^2) and O2(n log n). O1(10,000) = 100,000,000, while O2(10,000) = 10,000 * log10(10,000) = 40,000.

23 Big-O Examples Linear search:

bool linearSearch(int a[], int key, int count) {
    for (int i = 0; i < count; i++)
        if (key == a[i])
            return true;
    return false;
}

t(n) = c1 + c2*n. Thus, growth rate: O(n).

24 Big-O Examples Modifying row 1 in a 2-D array:

int sum = 0;
for (int col = 0; col < 100; col++) {
    sum += a[1][col];
}

t(n) = c1 + (c2 * c3). Thus, growth rate: O(1), because the loop bound (100) is a constant that does not depend on n.

25 Big-O Examples Bubble Sort:

void bubbleSort(int a[], int n) {
    for (int i = 0; i < n - 1; i++) {
        for (int j = 0; j < n - 1 - i; j++) {
            if (a[j] > a[j + 1])
                swap(a[j], a[j + 1]);
        }
    }
}

t(n) = f(n) * g(n). Thus, growth rate: O(n) * O(n) = O(n^2).

26 Simplifying Rules
O(f + g) = the greater of O(f) and O(g)
O(f * g) = O(f) * O(g)
E.g.:
O(c1*n + c2*n^2) = O(c1*n) + O(c2*n^2) = O(n^2)
O(c1*n * c2*n^2) = O(c1*n) * O(c2*n^2) = O(n^3)
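A small C++ sketch (not from the original slides) of how the two rules show up in code:

    // Sum rule: the loops run one after the other, so the cost is
    // O(n) + O(n^2), which simplifies to the larger term, O(n^2).
    void sumRule(int a[], int n) {
        long total = 0;
        for (int i = 0; i < n; i++)           // O(n)
            total += a[i];
        for (int i = 0; i < n; i++)           // O(n^2)
            for (int j = 0; j < n; j++)
                total += a[i] * a[j];
    }

    // Product rule: the O(n) inner loop runs once per iteration of the
    // O(n) outer loop, so the cost is O(n) * O(n) = O(n^2).
    void productRule(int a[], int n) {
        long total = 0;
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++)
                total += a[i] * a[j];
    }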

27 Your Turn Given a 2-D array of n rows and n columns, what is the growth rate of an algorithm that finds the average of all elements?

int sum = 0;
for (int i = 0; i < n; i++) {
    for (int j = 0; j < n; j++) {
        sum += a[i][j];
    }
}
ave = sum / (n * n);
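For reference (the answer is not stated on the slide): the innermost statement executes n * n times, so t(n) = c1 + c2*n^2, and the final division adds only constant time. The growth rate is O(n^2).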

28 Big-O Examples Binary Search: how many elements are examined in the worst case?

29 Big-O Examples Binary Search:

bool binSearch(int a[], int count, int key) {
    int lo = 0;
    int hi = count - 1;
    bool found = false;
    while (lo <= hi && !found) {
        int mid = (lo + hi) / 2;
        if (key == a[mid])
            found = true;
        else if (key < a[mid])
            hi = mid - 1;
        else
            lo = mid + 1;
    }
    return found;
}

30 Big O Examples Binary Search: elements remaining after each comparison.

Comparisons    Elements remaining
0              n
1              n/2
2              n/4
3              n/8
4              n/16
5              n/32
6              n/64
7              n/128
8              n/256
9              n/512
...
k              n/2^k (the worst case ends when only 1 element remains)

31 Big O Example Binary Search (continued):

n / 2^k = 1
n = 2^k
log2(n) = log2(2^k) = k * log2(2) = k * 1
log2(n) = k

Thus, growth rate = O(log n).

32 Best, Worst, Average Cases For an algorithm with a given growth rate, we consider the best case, the worst case, and the average case. For example, sequential search for a key K in an array of n integers: Best case: the first item of the array equals K. Worst case: only the last position of the array equals K. Average case: the match occurs at about position n/2.
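A short derivation of the sequential-search average case (not shown on the slide, and assuming K is present and equally likely to be in any of the n positions): the expected number of comparisons is (1 + 2 + ... + n) / n = (n(n+1)/2) / n = (n+1)/2, which is still O(n).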

33 Which Analysis to Use The Best Case. Normally, we are not interested in the best case because it is too optimistic and is not a fair characterization of the algorithm's running time. It is useful in some rare cases, where the best case has a high probability of occurring.

34 Which Analysis to Use The Worst Case. Useful in many real-time applications. Advantage: predictability -- you know for certain that the algorithm must perform at least that well. Disadvantage: it might not be a representative measure of the behavior of the algorithm on inputs of size n.

35 Which Analysis to Use? The average case. Often we prefer to know the average-case running time. The average case reveals the typical behavior of the algorithm on inputs of size n. Average case estimation is not always possible. For the sequential search example, it assumes that the integer value of K is equally likely to appear in any position of the array. This assumption is not always correct.

36 The moral of the story If we know enough about the distribution of our input we prefer the average-case analysis. If we do not know the distribution, then we must resort to worst-case analysis. For real-time applications, the worst-case analysis is the preferred method.

37 Your Turn: insertAtFront(item) with array vector What is the growth rate for the insertAtFront() method of a vector, implemented by an array? solution

38 Your Turn: insertAtBack(item) with array vector What is the growth rate for the insertAtBack() method of a vector, implemented by an array? solution

39 Your Turn: insertInOrder(item) with array vector What is the growth rate for the insertInOrder() method of a vector, implemented by an array? solution

40 Your Turn: push(item) with array stack What is the growth rate for the push() method of a stack, implemented by an array? solution

41 Your Turn: pop() with array stack What is the growth rate for the pop() method of a stack, implemented by an array? solution

42 Your Turn: enqueue(item) with array queue What is the growth rate for the enqueue() method of a queue, implemented by an array? solution

43 Your Turn: dequeue() with array queue What is the growth rate for the dequeue() method of a queue, implemented by an array? solution

44 Your Turn: insertAtFront(item) with vector using linked list What is the growth rate for the insertAtFront() method of a vector, implemented by a linked list? solution

45 Your Turn: insertAtBack with vector using linked list What is the growth rate for the insertAtBack() method of a vector, implemented by a linked list? solution

46 Your Turn: insert(item) with BST using pointers What is the growth rate for the insert() method of a Binary Search Tree, implemented with pointers? solution

47 Your Turn: search() with BST using pointers What is the growth rate for the search() method of a Binary Search Tree, implemented with pointers? solution

48 Solution: Vector with array of size n
insertAtFront(item):
    shift elements one position to the right, from pos = 0 through n-1    c1*n
    v[0] <- item                                                          c2
    n++                                                                   c3
    t(n) = c1*n + c2 + c3
    Growth rate: O(n)
insertAtBack(item):
    v[n] <- item    c1
    n++             c2
    t(n) = c1 + c2
    Growth rate: O(1)
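A minimal C++ sketch of the two operations (an illustration, not slide code; the fixed capacity and the absence of bounds checks are simplifying assumptions):

    struct ArrayVector {
        int data[1000];   // fixed capacity for the sketch
        int n = 0;        // current number of elements

        void insertAtFront(int item) {       // O(n): every element shifts right
            for (int pos = n; pos > 0; pos--)
                data[pos] = data[pos - 1];
            data[0] = item;
            n++;
        }

        void insertAtBack(int item) {        // O(1): no shifting needed
            data[n] = item;
            n++;
        }
    };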

49 Solution: Vector with array of size n (2)
insertInOrder(item):
    find the position to insert (e.g., pos)                 c1*n
    shift elements to the right, from pos through n-1       c2*n
    v[pos] <- item                                          c3
    n++                                                     c4
    t(n) = c1*n + c2*n + c3 + c4 = n(c1 + c2) + c5 = c6*n + c5
    Growth rate: O(n)

50 Solution: Stack with array of size n
push(item):
    top++                 c1
    stack[top] <- item    c2
    t(n) = c1 + c2
    Growth rate: O(1)
pop():
    save <- stack[top]    c1
    top--                 c2
    t(n) = c1 + c2
    Growth rate: O(1)
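A corresponding C++ sketch (an illustration, not slide code; overflow and underflow checks are omitted):

    struct ArrayStack {
        int data[1000];
        int top = -1;            // index of the current top element

        void push(int item) {    // O(1)
            top++;
            data[top] = item;
        }

        int pop() {              // O(1)
            int save = data[top];
            top--;
            return save;
        }
    };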

51 Solution: Queue with array of size n
enqueue(item):
    back <- (back + 1) mod MAX    c1
    queue[back] <- item           c2
    t(n) = c1 + c2
    Growth rate: O(1)
dequeue():
    save <- queue[front]            c1
    front <- (front + 1) mod MAX    c2
    t(n) = c1 + c2
    Growth rate: O(1)
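A circular-array C++ sketch of both operations (an illustration, not slide code; full/empty checks are omitted):

    const int MAX = 1000;

    struct ArrayQueue {
        int data[MAX];
        int front = 0;     // index of the next element to dequeue
        int back = -1;     // index of the most recently enqueued element

        void enqueue(int item) {        // O(1)
            back = (back + 1) % MAX;    // wrap around the fixed-size array
            data[back] = item;
        }

        int dequeue() {                 // O(1)
            int save = data[front];
            front = (front + 1) % MAX;
            return save;
        }
    };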

52 Solution: Vector as linked list with n nodes
insertAtFront(item):
    t <- new node        c1
    t->next <- head      c2
    head <- t            c3
    t(n) = c1 + c2 + c3
    Growth rate: O(1)
insertAtBack(item):
    t <- new node                                  c1
    find the node at the end (pointed to by s)     c2*n
    s->next <- t                                   c3
    t(n) = c1 + c2*n + c3
    Growth rate: O(n)
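A singly linked C++ sketch of the two operations (an illustration, not slide code; the empty-list case for insertAtBack is handled explicitly):

    struct Node {
        int value;
        Node* next;
    };

    struct LinkedVector {
        Node* head = nullptr;

        void insertAtFront(int item) {      // O(1): no traversal
            Node* t = new Node{item, head};
            head = t;
        }

        void insertAtBack(int item) {       // O(n): must walk to the last node
            Node* t = new Node{item, nullptr};
            if (head == nullptr) { head = t; return; }
            Node* s = head;
            while (s->next != nullptr)
                s = s->next;
            s->next = t;
        }
    };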

53 Solution: BST with pointers
insert(item):
    t <- new node                                        c1
    find the node below which item will be inserted      c2*n (worst case)
    attach node t                                        c3
    t(n) = c1 + c2*n + c3
    Growth rate: O(n)
search(key):
    find a node with a match by eliminating half of the (balanced) tree after each comparison, as in binary search
    t(n) = c1*log(n)
    Growth rate: O(log n)
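A compact C++ sketch of both BST operations (an illustration, not slide code; duplicates go to the right, and the cost is proportional to the tree height -- O(log n) when the tree is balanced, O(n) in the degenerate worst case):

    struct TreeNode {
        int key;
        TreeNode* left = nullptr;
        TreeNode* right = nullptr;
        TreeNode(int k) : key(k) {}
    };

    // Walk down from the root until an empty spot is found, then attach.
    TreeNode* insert(TreeNode* root, int item) {
        if (root == nullptr) return new TreeNode(item);
        if (item < root->key)
            root->left = insert(root->left, item);
        else
            root->right = insert(root->right, item);
        return root;
    }

    // Same traversal, stopping when the key is found.
    bool search(TreeNode* root, int key) {
        if (root == nullptr) return false;
        if (key == root->key) return true;
        return (key < root->key) ? search(root->left, key)
                                 : search(root->right, key);
    }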

54 Running Time Examples (1)
Example 1: a = b;
This assignment takes constant time, so it is Θ(1).
Example 2:
sum = 0;
for (i = 1; i <= n; i++)
    sum += n;

55 Running Time Examples (2)
Example 3:
sum = 0;
for (j = 1; j <= n; j++)
    for (i = 1; i <= j; i++)
        sum++;
for (k = 0; k < n; k++)
    A[k] = k;
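For reference (the answer is not stated on the slide): the doubly nested loop executes sum++ a total of 1 + 2 + ... + n = n(n+1)/2 times, which is Θ(n^2); the final loop is Θ(n); by the sum rule the whole fragment is Θ(n^2).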

56 Running Time Examples (3)
Example 4:
sum1 = 0;
for (i = 1; i <= n; i++)
    for (j = 1; j <= n; j++)
        sum1++;
sum2 = 0;
for (i = 1; i <= n; i++)
    for (j = 1; j <= i; j++)
        sum2++;
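For reference (the answer is not stated on the slide): the first pair of loops increments sum1 exactly n^2 times, and the second increments sum2 exactly 1 + 2 + ... + n = n(n+1)/2 times; both fragments are Θ(n^2), differing only in the constant factor.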

57 Running Time Examples (4)
Example 5:
sum1 = 0;
for (k = 1; k <= n; k *= 2)
    for (j = 1; j <= n; j++)
        sum1++;
sum2 = 0;
for (k = 1; k <= n; k *= 2)
    for (j = 1; j <= k; j++)
        sum2++;
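For reference (the answer is not stated on the slide): the outer loop doubles k, so it runs about log2(n) times. For sum1 the inner loop always does n increments, giving Θ(n log n). For sum2 the inner loop does k increments, and 1 + 2 + 4 + ... + n is about 2n, giving Θ(n).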

58 Other Control Statements while loop: Analyze like a for loop. if statement: Take greater complexity of then / else clauses. switch statement: Take complexity of most expensive case. Subroutine call: Complexity of the subroutine.

59 Practical Considerations Code tuning can also lead to dramatic improvements in running time; Code tuning is the art of hand-optimizing a program to run faster or require less storage. For many programs, code tuning can reduce running time by a factor of ten. Code tuning

60 Remarks Most statements in a program do not have much effect on the running time of that program; There is little point to cutting in half the running time of a subroutine that accounts for only 1% of the total. Focus your attention on the parts of the program that have the most impact Code tuning

61 Remarks When tuning code, it is important to gather good timing statistics; Be careful not to use tricks that make the program unreadable; Make use of compiler optimizations; Check that your optimizations really improve the program. Code tuning

62 Remarks Comparative timing of programs is a difficult business: experimental errors from uncontrolled factors (system load, language, compiler, etc.); bias towards a program; unequal code tuning.

63 Remarks The greatest time and space improvements come from a better data structure or algorithm “FIRST TUNE THE ALGORITHM, THEN TUNE THE CODE”

64 Appendix A - Notes Algorithm Analysis

65 Computational complexity theory Complexity theory is part of the theory of computation dealing with the resources required during computation to solve a given problem. The most common resources are time (how many steps does it take to solve a problem) and space (how much memory does it take to solve a problem).

66 Computational complexity theory Other resources can also be considered, such as how many parallel processors are needed to solve a problem in parallel. Complexity theory differs from computability theory, which deals with whether a problem can be solved at all, regardless of the resources required.

67 Computational complexity theory If a problem has time complexity O(n²) on one typical computer, then it will also have complexity O(n²) on most other computers. This notation allows us to generalize away from the details of a particular computer.

68 Big Oh The Big Oh is the upper bound of a function. In the case of algorithm analysis, we use it to bound the worst-case running time, or the longest running time possible for any input of size n. We can say that the maximum running time of the algorithm is in the order of Big Oh.
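A more precise statement of the upper bound (a standard definition, not given on the slide): f(n) is O(g(n)) if there exist positive constants c and n0 such that f(n) <= c * g(n) for all n >= n0. For example, t(n) = 3n + 2 is O(n) because 3n + 2 <= 4n whenever n >= 2.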

69 Appendix B - Exercises Algorithm Analysis

70 Write True or False 1. The equation T(n) = 3n + 2 is an example of a linear growth rate. 2. If Algorithm A has a faster growth rate than Algorithm B in the average case, that means Algorithm A is more efficient than Algorithm B on average. 3. When performing asymptotic analysis, we can ignore constants and low order terms.

71 Write True or False 4. The best case for an algorithm occurs when the input size is as small as possible 5. Asymptotic algorithm analysis is most useful when the input size is small 6. When performing algorithm analysis, we measure the cost of programs in terms of basic operations. Each operation should require constant time.

72 Write True or False 7. If a program has a growth rate proportional to n^2 for an input size n, then a computer that runs twice as fast will be able to run, in one hour, an input that is twice as large as that which can be run in one hour on the slower computer.

73 Write True or False 8. The concepts of asymptotic analysis apply equally well to space costs as they do to time costs. 9. The most reliable method for comparing two approaches to solving a problem is simply to write two programs and compare their running times. 10. We can often make a program faster if we are willing to use more space, and conversely, we can often make a program require less space if we are willing to take more running time.

74 Exercise Suppose that a particular algorithm has time complexity T(n) = n^2, and that executing an implementation of it on a particular machine takes T seconds for N inputs. Now suppose that we are presented with a machine that is 64 times as fast. How many inputs could we process on the new machine in T seconds?