Algorithm Analysis (Big O)


Complexity

In examining algorithm efficiency we must understand the idea of complexity:
- Space complexity
- Time complexity

Space Complexity

When memory was expensive, we focused on making programs as space-efficient as possible, and developed schemes to make memory appear larger than it really was (virtual memory and memory-paging schemes). Space complexity is still important in the field of embedded computing (handheld, computer-based equipment such as cell phones and palm devices).

Time Complexity

- Is the algorithm "fast enough" for my needs?
- How much longer will the algorithm take if I increase the amount of data it must process?
- Given a set of algorithms that accomplish the same thing, which is the right one to choose?

Algorithm Efficiency

A measure of the amount of resources consumed in solving a problem of size n:
- time
- space

Benchmarking: implement the algorithm, run it with some specific input, and measure the time taken. This is better for comparing the performance of processors than for comparing the performance of algorithms.

Big Oh (asymptotic analysis) associates n, the problem size, with t, the processing time required to solve the problem.

Cases to examine

- Best case: when the algorithm is executed, the fewest possible instructions are executed.
- Average case: executing the algorithm produces path lengths that will, on average, be the same.
- Worst case: executing the algorithm produces path lengths that are always a maximum.

Worst case analysis

Of the three cases, the only one that is generally useful (from the standpoint of program design) is the worst case. Worst-case analysis helps answer the software-lifecycle question: if it's good enough today, will it be good enough tomorrow?

Frequency Count

Examine a piece of code and predict the number of instructions to be executed, i.e., for each instruction, predict how many times it will be encountered as the code runs:

    Inst #  Code                           F.C.
    1       for (int i = 0; i < n; i++)    n+1
    2       { cout << i;                   n
    3         p = p + i; }                 n
                                           ----
                                           3n+1

Totaling the counts produces the F.C. (frequency count).
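The frequency count can be checked mechanically. Below is a sketch (the instrumentation is my own, not from the slides) in which every counted instruction, the loop test and each of the two body statements, bumps a counter; for any n the total comes out to 3n + 1.

```cpp
#include <cassert>

// Instrumented version of the loop: each "instruction" the frequency
// count charges increments `count`. The loop test runs n+1 times (n
// passing tests plus the final failing one) and each body statement
// runs n times, for a total of (n+1) + n + n = 3n + 1.
long loopFrequencyCount(long n) {
    long count = 0;
    long p = 0;
    for (long i = 0; ; ++i) {
        ++count;                 // loop test: runs n+1 times
        if (!(i < n)) break;
        ++count;                 // cout << i;   (body statement 1: n times)
        ++count; p = p + i;      // p = p + i;   (body statement 2: n times)
    }
    return count;
}
```

For example, loopFrequencyCount(10) returns 31, which is 3(10) + 1.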

Order of magnitude

In the previous example, best case = average case = worst case, because the example is based on a fixed iteration count n. By itself, the frequency count is relatively meaningless; an order of magnitude is an estimate of performance versus the amount of data. To convert an F.C. to an order of magnitude:
- discard constant terms
- disregard coefficients
- pick the most significant term

The order of magnitude of the worst-case path through the algorithm is the Big O (e.g., O(n)).

Another example

    Inst #  Code                             F.C.
    1       for (int i = 0; i < n; i++)      n+1
    2         for (int j = 0; j < n; j++)    n(n+1) = n^2 + n
    3         { cout << i;                   n^2
    4           p = p + i; }                 n^2
                                             --------------
                                             3n^2 + 2n + 1

Discarding constant terms produces 3n^2 + 2n; disregarding coefficients: n^2 + n; picking the most significant term: n^2. Big O = O(n^2).
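The nested-loop count can be verified the same way. This sketch (instrumentation mine) charges the outer test, the inner test, and the two body statements, and totals 3n^2 + 2n + 1.

```cpp
#include <cassert>

// Instrumented nested loop: the outer test runs n+1 times, the inner
// test runs n+1 times per outer pass (n(n+1) total), and each body
// statement runs n*n times, giving 3n^2 + 2n + 1 overall.
long nestedFrequencyCount(long n) {
    long count = 0;
    long p = 0;
    for (long i = 0; ; ++i) {
        ++count;                     // outer loop test: n+1 times
        if (!(i < n)) break;
        for (long j = 0; ; ++j) {
            ++count;                 // inner loop test: n(n+1) times total
            if (!(j < n)) break;
            ++count;                 // cout << i;   (n*n times)
            ++count; p = p + i;      // p = p + i;   (n*n times)
        }
    }
    return count;
}
```

For example, nestedFrequencyCount(10) returns 321, which is 3(100) + 2(10) + 1.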

What is Big O

Big O is the rate at which algorithm performance degrades as a function of the amount of data it is asked to handle. For example:
- O(n) -> performance degrades at a linear rate
- O(n^2) -> quadratic degradation

Common growth rates

Big Oh - Formal Definition

Definition of "big oh": f(n) = O(g(n)) iff there exist constants c and n0 such that f(n) <= c * g(n) for all n >= n0. Thus, g(n) is an upper bound on f(n).

Note: f(n) = O(g(n)) is NOT the same as O(g(n)) = f(n). The '=' is not the usual mathematical operator "=" (it is not symmetric).
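The definition can be exercised numerically. The sketch below checks a candidate pair of witnesses (c, n0) for f(n) = 3n + 3 and g(n) = n over a finite range; the values c = 4 and n0 = 3 are illustrative choices of mine, not from the slides.

```cpp
#include <cassert>

// Check f(n) <= c * g(n) for all n in [n0, nMax], with f(n) = 3n + 3
// and g(n) = n. A true result over a large range is evidence (not a
// proof) that the witnesses c and n0 satisfy the big-oh definition.
bool boundHolds(long c, long n0, long nMax) {
    for (long n = n0; n <= nMax; ++n) {
        long f = 3 * n + 3;      // f(n)
        long g = n;              // g(n)
        if (f > c * g) return false;
    }
    return true;
}
```

With c = 4, the bound 3n + 3 <= 4n holds exactly when n >= 3, so boundHolds(4, 3, ...) succeeds while boundHolds(4, 1, ...) fails: this is why both constants appear in the definition.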

Big-O Notation: Comparing Algorithms and ADT Data Structures

Algorithm Efficiency

A measure of the amount of resources consumed in solving a problem of size n:
- time
- space

Benchmarking: code the algorithm, run it with some specific input, and measure the time taken. This is better for measuring and comparing the performance of processors than for measuring and comparing the performance of algorithms.

Big Oh (asymptotic analysis) provides a formula that associates n, the problem size, with t, the processing time required to solve the problem.

big Oh

big Oh measures an algorithm's growth rate: how fast does the time required for an algorithm to execute increase as the size of the problem increases? It is an intrinsic property of the algorithm:
- independent of any particular machine or code
- based on the number of instructions executed
- data-dependent for some algorithms
- meaningful for "large" problem sizes

Computing x^n for n >= 0

Iterative definition:
    x^n = x * x * ... * x (n times)

Recursive definition:
    x^0 = 1
    x^n = x * x^(n-1)            (for n > 0)

Another recursive definition:
    x^n = (x^(n/2))^2            (for n > 0 and n even)
    x^n = x * (x^(n/2))^2        (for n > 0 and n odd)

Iterative Power function

    double IterPow (double X, int N)    // F.C.
    {
       double Result = 1;               // 1
       while (N > 0)                    // n+1
       {                                //       <- critical region
          Result *= X;                  // n
          N--;                          // n
       }
       return Result;                   // 1
    }

Total instruction count: 3n + 3. The algorithm's computing time (t) as a function of n is 3n + 3; t is on the order of f(n), written O[f(n)], and O[3n + 3] is O(n).

Recursive Power function

    double RecPow (double X, int N)      // F.C.
    {
       if (N == 0)                       // 1 per call
          return 1;                      // base case:      1
       else
          return X * RecPow(X, N - 1);   // recursive case: 1 + T(n-1)
    }

Base case total: 2. Recursive case total: 2 + T(n-1). The base case is executed once; the recursive case is executed n times. The algorithm's computing time (t) as a function of n is 2n + 2, and O[2n + 2] is O(n).

Another Power Function

    double Pow3 (double X, int N)                 // F.C.
    {
       if (N == 0)                                // 1 per call
          return 1;                               // base case:      1
       else
       {
          double halfPower = Pow3(X, N/2);        // 1 + T(n/2)
          if (N % 2 == 0)                         // 1
             return halfPower * halfPower;        // 1 (even)
          else
             return X * halfPower * halfPower;    // 1 (odd)
       }
    }

Base case total: 2. Recursive case total: 3 + T(n/2). The base case is executed once; the recursive case is executed log2 n times. The algorithm's computing time (t) as a function of n is 3 log2 n + 2, and O[3 log2 n + 2] is O(log2 n).
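To see the logarithmic behavior directly, here is a sketch that restates IterPow and Pow3 from the slides, with a call counter (my addition) on Pow3's recursive case; for n >= 1 the counter lands on floor(log2 n) + 1.

```cpp
#include <cassert>

long pow3Calls = 0;   // counts executions of Pow3's recursive case

double IterPow(double x, int n) {       // O(n) multiplications
    double result = 1;
    while (n > 0) { result *= x; --n; }
    return result;
}

double Pow3(double x, int n) {          // O(log2 n) recursive calls
    if (n == 0) return 1;
    ++pow3Calls;                        // one recursive case per halving of n
    double halfPower = Pow3(x, n / 2);
    if (n % 2 == 0) return halfPower * halfPower;
    else            return x * halfPower * halfPower;
}
```

For n = 10 the recursive case runs for n = 10, 5, 2, 1, i.e., 4 = floor(log2 10) + 1 times, while IterPow performs 10 multiplications for the same result.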

Computational Complexity

The computing time, T(n), of an algorithm is a function of the problem size (based on instruction count):
- T(n) for IterPow is 3n + 3
- T(n) for RecPow is 2n + 2
- T(n) for Pow3 is 3 log2 n + 2

The computational complexity of an algorithm is the rate at which T(n) grows as the problem size grows, and is expressed using "big Oh" notation:
- the growth rate (big Oh) of 3n + 3 and of 2n + 2 is n
- the big Oh of 3 log2 n + 2 is log2 n

Common big Ohs

- constant       O(1)
- logarithmic    O(log2 N)
- linear         O(N)
- n log n        O(N log2 N)
- quadratic      O(N^2)
- cubic          O(N^3)
- exponential    O(2^N)

Comparing Growth Rates

[Chart: T(n) versus problem size, comparing the growth rates n log2 n, n, and log2 n.]

An Experiment

Execution time in seconds (1,000,000 repetitions):

               2^25    2^50    2^100
    IterPow     .71    1.15     2.03
    RecPow     3.63    7.42    15.05
    Pow3        .99    1.15     1.38
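An experiment along these lines can be run with a small timing harness. The sketch below is my own (the slides do not show their harness, and the repetition count here is illustrative): it times repeated calls of IterPow with std::chrono.

```cpp
#include <chrono>

double IterPow(double x, int n) {
    double result = 1;
    while (n > 0) { result *= x; --n; }
    return result;
}

// Run `reps` calls of IterPow(x, n) and return the elapsed wall-clock
// time in seconds.
double timeIterPow(int reps, double x, int n) {
    volatile double sink = 0;   // keep the calls from being optimized away
    auto start = std::chrono::steady_clock::now();
    for (int i = 0; i < reps; ++i) sink = sink + IterPow(x, n);
    auto stop = std::chrono::steady_clock::now();
    return std::chrono::duration<double>(stop - start).count();
}
```

Timing the other two functions the same way and doubling n between runs makes the linear versus logarithmic growth in the table visible.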

Uses of big Oh

- Compare algorithms that perform the same function:
  - search algorithms
  - sorting algorithms
- Compare data structures for an ADT:
  - each operation is an algorithm and has a big Oh
  - the data structure chosen affects the big Oh of the ADT's operations

Comparing algorithms

Sequential search: growth rate is O(n); the average number of comparisons done is n/2.
Binary search: growth rate is O(log2 n); the average number of comparisons done is 2((log2 n) - 1).

       n     n/2    2((log2 n) - 1)
     100      50         12
     500     250         16
    1000     500         18
    5000    2500         24
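The comparison counts can be observed directly. This sketch (function names and counting scheme are mine) runs both searches over a sorted array and returns the number of probes each performs.

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Sequential search over a sorted vector, returning the number of
// element comparisons made before the key is found (or the end of the
// vector is reached). Worst case: a.size() comparisons.
long seqSearchComparisons(const std::vector<int>& a, int key) {
    long comps = 0;
    for (std::size_t i = 0; i < a.size(); ++i) {
        ++comps;
        if (a[i] == key) break;
    }
    return comps;
}

// Binary search over a sorted vector, counting one probe per loop
// iteration (each iteration examines one middle element). Worst case:
// about log2(a.size()) probes.
long binSearchComparisons(const std::vector<int>& a, int key) {
    long comps = 0;
    std::size_t lo = 0, hi = a.size();
    while (lo < hi) {
        std::size_t mid = lo + (hi - lo) / 2;
        ++comps;
        if (a[mid] == key) return comps;
        if (a[mid] < key) lo = mid + 1;
        else              hi = mid;
    }
    return comps;
}
```

On a 1000-element sorted array, searching for the last element costs the sequential search 1000 probes but the binary search no more than about 10, matching the O(n) versus O(log2 n) rows of the table.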