1 Chapter 2 Algorithm Analysis Reading: Chapter 2.


2 Complexity Analysis
Measures the efficiency (time and memory) of algorithms and programs
–Can be used to:
  Compare different algorithms
  See how time varies with the size of the input
Operation count
–Count the number of operations that we expect to take the most time
Asymptotic analysis
–See how fast the time increases as the input size approaches infinity

3 Operation Count Examples
Example 1
  for(i = 0; i < n; i++)
      cout << A[i] << endl;
  Number of outputs = n
Example 2
  template <class T>
  bool IsSorted(T *A, int n)
  {
      bool sorted = true;
      for(int i = 0; i < n-1; i++)
          if(A[i] > A[i+1])
              sorted = false;
      return sorted;
  }
  Number of comparisons = n - 1

4 Scaling Analysis
How much will the time increase in Example 1 if n is doubled?
–t(2n)/t(n) = 2n/n = 2
–Time will double
If t(n) = 2n² for some algorithm, how much will the time increase if the input size is doubled?
–t(2n)/t(n) = 2(2n)² / (2n²) = 4n² / n² = 4

5 Comparing Algorithms
Assume algorithm 1 takes time t1(n) = 100n + n² and algorithm 2 takes time t2(n) = 10n²
–If an application typically has n < 10, which algorithm is faster?
–If an application typically has n > 100, which algorithm is faster?
Assume algorithms with the following operation costs
–Algorithm 1: insert – n, delete – log n, lookup – 1
–Algorithm 2: insert – log n, delete – n, lookup – log n
Which algorithm is faster if an application has many inserts but few deletes and lookups?

6 Asymptotic Complexity Analysis
Compares the growth rates of two functions
–T = f(n) (estimated running time)
–Variables: non-negative integers
  For example, the size of the input data
–Values: non-negative real numbers
  For example, the running time of an algorithm
Dependent on
–Eventual (asymptotic) behavior
Independent of
–Constant multipliers
–Lower-order effects
Metrics
–“Big O” notation: O()
–“Big Omega” notation: Ω()
–“Big Theta” notation: Θ()

7 Big “O” Notation
f(n) = O(g(n))
–if and only if there exist positive constants c > 0 and n0 > 0 such that
  0 ≤ f(n) ≤ c·g(n) for all n ≥ n0
f(n) is asymptotically upper bounded by g(n)

8 Big “Omega” Notation
f(n) = Ω(g(n))
–if and only if there exist positive constants c > 0 and n0 > 0 such that
  f(n) ≥ c·g(n) ≥ 0 for all n ≥ n0
f(n) is asymptotically lower bounded by g(n)

9 Big “Theta” Notation
f(n) = Θ(g(n))
–if and only if there exist positive constants c1, c2, and n0 > 0 such that
  c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0
f(n) has the same long-term rate of growth as g(n)

10 Example
f(n) = 3n²
–Ω(1), Ω(n), Ω(n²) → lower bounds
–O(n²), O(n³), … → upper bounds
–Θ(n²) → exact bound

11 Analogous to Real Numbers
f(n) = O(g(n))  (a < b)
f(n) = Ω(g(n))  (a > b)
f(n) = Θ(g(n))  (a = b)
The analogy is not quite accurate, but it’s convenient to think of function complexity in these terms.

12 Transitivity
If f(n) = O(g(n)) and g(n) = O(h(n))
–Then f(n) = O(h(n))
If f(n) = Ω(g(n)) and g(n) = Ω(h(n))
–Then f(n) = Ω(h(n))
If f(n) = Θ(g(n)) and g(n) = Θ(h(n))
–Then f(n) = Θ(h(n))
And many other properties

13 Arithmetic Properties
Additive property
–If e(n) = O(g(n)) and f(n) = O(h(n))
–Then e(n) + f(n) = O(g(n) + h(n))
–Less formally: O(g(n) + h(n)) = max(O(g(n)), O(h(n)))
Multiplicative property
–If e(n) = O(g(n)) and f(n) = O(h(n))
–Then e(n) * f(n) = O(g(n) * h(n))

14 Some Rules of Thumb
If f(N) is a polynomial of degree k
–Then f(N) = Θ(N^k)
log^k N = O(N) for any constant k
–Logarithms grow very slowly compared to even linear growth

15 Typical Growth Rates
c – constant
log N – logarithmic
log² N – log-squared
N – linear
N log N
N² – quadratic
N³ – cubic
2^N – exponential

16 Exercise
f(N) = N log N and g(N) = N^1.5
–Which one grows faster?
Note that g(N) = N^1.5 = N · N^0.5
–Hence, between f(N) and g(N), we only need to compare the growth rates of log N and N^0.5
–Equivalently, we can compare the growth rates of log² N and N
–Now, refer to the result on the previous slide to figure out whether f(N) or g(N) grows faster!

17 Maximum Subsequence Sum Problem
Given a sequence of integers A1, A2, …, AN
–Find the maximum value of the subsequence sum (Ai + Ai+1 + … + Ak), where 1 ≤ i ≤ k ≤ N
–For example, for −2, 11, −4, 13, −5, −2
  the answer is 20: (11, −4, 13)
–Many algorithms of differing complexity can be found
–Example running times from one computer are illustrated on the next slide

18 How Complexity Affects Running Times

19 Complexity Analysis Steps
–Find n = the size of the input
–Find the atomic activities to count
  Primitive operations such as +, -, *, /, assumed to finish within one unit of time
–Find f(n) = the number of atomic activities performed on an input of size n
–Complexity of the algorithm = complexity of f(n)

20 Running Time Calculations – Loops
for (j = 0; j < n; ++j) {
    // 3 atomics
}
Number of atomic operations
–Each iteration has 3 atomic operations, so 3n for the body
–Cost of the loop control itself
  One initialization assignment
  n increments (of j)
  n + 1 comparisons (between j and n; the last one fails and ends the loop)
Complexity = Θ(3n) = Θ(n)

21 Loops with Break
for (j = 0; j < n; ++j) {
    // 3 atomics
    if (condition) break;
}
Upper bound = O(4n) = O(n)
Lower bound = Ω(4) = Ω(1)
Complexity = O(n)
Why don’t we have a Θ(…) bound here?

22 Sequential Search
Given an unsorted vector a, determine whether element X is present.
for (size_t i = 0; i < a.size(); i++) {
    if (a[i] == X)
        return true;
}
return false;
Input size: n = a.size()
Complexity = O(n)

23 If-Then-Else Statement
if (condition)
    i = 0;
else
    for (j = 0; j < n; j++)
        a[j] = j;
Complexity = ?
  = O(1) + max(O(1), O(n))
  = O(1) + O(n)
  = O(n)

24 Consecutive Statements
Add the complexities of consecutive statements
for (j = 0; j < n; ++j) {
    // 3 atomics
}
for (j = 0; j < n; ++j) {
    // 5 atomics
}
Complexity = Θ(3n + 5n) = Θ(n)

25 Nested Loop Statements
Analyze such statements inside out
for (j = 0; j < n; ++j) {
    // 2 atomics
    for (k = 0; k < n; ++k) {
        // 3 atomics
    }
}
Complexity = Θ((2 + 3n)n) = Θ(n²)

26 Recursion
long factorial( int n )
{
    if( n <= 1 )
        return 1;
    else
        return n * factorial( n - 1 );
}
–We need to determine how many times the recursive function is called
–This is really a simple loop disguised as recursion
–Complexity = O(n)

long fib( int n )
{
    if ( n <= 1 )
        return 1;
    else
        return fib( n - 1 ) + fib( n - 2 );
}
–Fibonacci series: a terrible way to use recursion
–Complexity = Ω((3/2)^N) – that’s exponential!

27 Logarithms in Running Time
General rules:
–If constant time is required to merely reduce the problem by a constant amount, the algorithm is O(N).
–An algorithm is O(log N) if it takes constant O(1) time to cut the problem size by a constant fraction (usually ½).
Examples:
–Binary search
–Euclid’s algorithm
–Exponentiation

28 Binary Search
Given a sorted vector a, find the location of element X

int binary_search(const vector<int>& a, int X)
{
    int low = 0, high = (int)a.size() - 1;   // signed, so an empty vector is safe
    while (low <= high) {
        int mid = low + (high - low) / 2;    // avoids overflow of low + high
        if (a[mid] < X)
            low = mid + 1;
        else if (a[mid] > X)
            high = mid - 1;
        else
            return mid;
    }
    return NOT_FOUND;
}
Input size: n = a.size()
Each iteration does O(1) work (one or two comparisons plus an assignment) and halves the search range, so there are O(log n) iterations
Complexity = O(log n)

29 Euclid’s Algorithm
Find the greatest common divisor (gcd) of M and N
–Given M > N, repeatedly replace (M, N) with (N, M mod N) until the remainder is 0
For example
–M = 50, N = 15
–Remainders: 5, 0
–So gcd(50, 15) = 5
Complexity = O(log N)
Exercise:
–Why is it O(log N)?
–Hint: if M > N, then (M mod N) < M/2

30 Exponentiation
Calculate x^n
Example:
–x^11 = x^5 * x^5 * x
–x^5 = x^2 * x^2 * x
–x^2 = x * x
Complexity = O(log n)
Why didn’t we implement the recursion as follows?
–pow(x, n/2) * pow(x, n/2) * x