
1 Chapter 2 Algorithm Analysis All sections

2 Complexity Analysis
Measures efficiency (time and memory) of algorithms and programs
– Can be used for the following:
   Compare different algorithms
   See how time varies with the size of the input
Operation count
– Count the number of operations that we expect to take the most time
Asymptotic analysis
– See how fast time increases as the input size approaches infinity

3 Operation Count Examples
Example 1:

for (i = 0; i < n; i++)
    cout << A[i] << endl;

Number of outputs = n

Example 2:

template <class T>
bool IsSorted(T *A, int n)
{
    bool sorted = true;
    for (int i = 0; i < n-1; i++)
        if (A[i] > A[i+1])
            sorted = false;
    return sorted;
}

Number of comparisons = n - 1

Example 3: Triangular matrix-vector multiplication pseudocode:

c_i ← 0, i = 1 to n
for i = 1 to n
    for j = 1 to i
        c_i += a_ij * b_j

Number of multiplications = Σ_{i=1}^{n} i = n(n+1)/2

4 Scaling Analysis
How much will the time increase in Example 1 if n is doubled?
– t(2n)/t(n) = 2n/n = 2, so the time will double
If t(n) = 2n² for some algorithm, how much will the time increase if the input size is doubled?
– t(2n)/t(n) = 2(2n)² / (2n²) = 8n² / (2n²) = 4

5 Comparing Algorithms
Assume that algorithm 1 takes time t1(n) = 100n + n² and algorithm 2 takes time t2(n) = 10n²
– If an application typically has n < 10, which algorithm is faster?
– If an application typically has n > 100, which algorithm is faster?
Assume algorithms with the following times:
– Algorithm 1: insert – n, delete – log n, lookup – 1
– Algorithm 2: insert – log n, delete – n, lookup – log n
Which algorithm is faster if an application has many inserts but few deletes and lookups?

6 Motivation for Asymptotic Analysis - 1
[Plot comparing x² (red line) and x (blue line, almost on the x-axis)]
– x² is much larger than x for large x

7 Motivation for Asymptotic Analysis - 2
[Plot comparing 0.0001x² (red line) and x (blue line, almost on the x-axis)]
– 0.0001x² is much larger than x for large x
– The form (x² versus x) is what matters most for large x

8 Motivation for Asymptotic Analysis - 3
[Plot: red: 0.0001x², blue: x, green: 100 log x, magenta: the sum of these]
– 0.0001x² primarily contributes to the sum for large x

9 Asymptotic Complexity Analysis
Compares the growth of two functions, T = f(n)
– Variable n: non-negative integers (for example, the size of the input data)
– Values: non-negative real numbers (for example, the running time of an algorithm)
Dependent on
– eventual (asymptotic) behavior
Independent of
– constant multipliers
– lower-order effects
Metrics
– "Big O" notation: O()
– "Big Omega" notation: Ω()
– "Big Theta" notation: Θ()

10 Big "O" Notation
f(n) = O(g(n))
– iff there exist constants c > 0 and n0 > 0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0
f(n) is asymptotically upper bounded by g(n)

11 Big "Omega" Notation
f(n) = Ω(g(n))
– iff there exist constants c > 0 and n0 > 0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0
f(n) is asymptotically lower bounded by g(n)

12 Big "Theta" Notation
f(n) = Θ(g(n))
– iff there exist constants c1, c2, n0 > 0 such that 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0
f(n) has the same long-term rate of growth as g(n)

13 Examples
f(n) = 3n²
– Ω(1), Ω(n), Ω(n²) → lower bounds
– O(n²), O(n³), … → upper bounds
– Θ(n²) → exact bound
f(n) = 1000n² + n³
– Ω(?) → lower bounds
– O(?) → upper bounds
– Θ(?) → exact bound

14 Analogous to Real Numbers
f(n) = O(g(n))  ~  a ≤ b
f(n) = Ω(g(n))  ~  a ≥ b
f(n) = Θ(g(n))  ~  a = b
The above analogy is not quite accurate, but it's convenient to think of function complexity in these terms.

15 Transitivity
If f(n) = O(g(n)) and g(n) = O(h(n)), then f(n) = O(h(n))
If f(n) = Ω(g(n)) and g(n) = Ω(h(n)), then f(n) = Ω(h(n))
If f(n) = Θ(g(n)) and g(n) = Θ(h(n)), then f(n) = Θ(h(n))
And many other properties

16 Some Rules of Thumb
If f(N) is a polynomial of degree k
– then f(N) = Θ(N^k)
log^k N = O(N) for any constant k
– Logarithms grow very slowly compared to even linear growth

17 Typical Growth Rates
From slowest- to fastest-growing: c (constant), log N, log² N, N, N log N, N², N³, 2^N

18 Exercise
f(N) = N log N and g(N) = N^1.5
– Which one grows faster?
Note that g(N) = N^1.5 = N · N^0.5
– Hence, between f(N) and g(N), we only need to compare the growth rates of log N and N^0.5
– Equivalently, we can compare the growth rate of log² N with N
– Now refer to the result on the last slide to figure out whether f(N) or g(N) grows faster!

19 How Complexity Affects Running Times

20 Running Time Calculations - Loops

for (j = 0; j < n; ++j) {
    // 3 atomics
}

Number of atomic operations
– Each iteration has 3 atomic operations, so 3n
– Cost of the iteration itself: one initialization assignment, n increments (of j), and n + 1 comparisons (between j and n)
Complexity = Θ(3n) = Θ(n)

21 Loops with Break

for (j = 0; j < n; ++j) {
    // 3 atomics
    if (condition) break;
}

Upper bound = O(4n) = O(n)
Lower bound = Ω(4) = Ω(1)
Complexity = O(n)
Why don't we have a Θ(…) notation here?

22 Sequential Search
Given an unsorted vector a[], find the location of element X.

for (i = 0; i < n; i++) {
    if (a[i] == X)
        return true;
}
return false;

Input size: n = a.size()
Complexity = O(n)

23 If-then-else Statement

if (condition)
    i = 0;
else
    for (j = 0; j < n; j++)
        a[j] = j;

Complexity = ??
= O(1) + max(O(1), O(N))
= O(1) + O(N)
= O(N)

24 Consecutive Statements
Add the complexity of consecutive statements.

for (j = 0; j < n; ++j) {
    // 3 atomics
}
for (j = 0; j < n; ++j) {
    // 5 atomics
}

Complexity = Θ(3n + 5n) = Θ(n)

25 Nested Loop Statements
Analyze such statements inside out.

for (j = 0; j < n; ++j) {
    // 2 atomics
    for (k = 0; k < n; ++k) {
        // 3 atomics
    }
}

Complexity = Θ((2 + 3n)n) = Θ(n²)

26 Recursion

long factorial(int n)
{
    if (n <= 1)
        return 1;
    else
        return n * factorial(n-1);
}

In terms of big-Oh:
t(1) = 1
t(n) = 1 + t(n-1) = 2 + t(n-2) = … = k + t(n-k)
Choose k = n-1:
t(n) = n - 1 + t(1) = n = O(n)

Consider the following time complexity:
t(0) = 1
t(n) = 1 + 2t(n-1)
     = 1 + 2(1 + 2t(n-2)) = 3 + 4t(n-2)
     = 3 + 4(1 + 2t(n-3)) = 7 + 8t(n-3)
     = (2^k - 1) + 2^k t(n-k)
Choose k = n:
t(n) = 2^n - 1 + 2^n t(0) = 2^n - 1 + 2^n = 2^(n+1) - 1

27 Binary Search
Given a sorted vector a[], find the location of element X.

unsigned int binary_search(vector<int> a, int X)
{
    unsigned int low = 0, high = a.size() - 1;
    while (low <= high) {
        int mid = (low + high) / 2;
        if (a[mid] < X)
            low = mid + 1;
        else if (a[mid] > X)
            high = mid - 1;
        else
            return mid;
    }
    return NOT_FOUND;
}

Input size: n = a.size()
Complexity = O(k iterations × (1 comparison + 1 assignment) per iteration) = O(log n)