E.G.M. Petrakis Algorithm Analysis 1
- Algorithms that are equally correct can vary in their utilization of computational resources: time and memory.
- A slow program is likely not to be used.
- A program that demands too much memory may not be executable on the machines that are available.

E.G.M. Petrakis Algorithm Analysis 2 Memory – Time
- It is important to be able to predict how an algorithm behaves under a range of possible conditions (usually different inputs).
- The emphasis is on time rather than on space efficiency: memory is cheap!

E.G.M. Petrakis Algorithm Analysis 3 Running Time
- The time to solve a problem increases with the size of the input: measure the rate at which the time increases (e.g., linearly, quadratically, exponentially).
- This measure focuses on intrinsic characteristics of the algorithm rather than on incidental factors (e.g., computer speed, coding tricks, optimizations, quality of the compiler).

E.G.M. Petrakis Algorithm Analysis 4 Additional Considerations
- Sometimes, simple but less efficient algorithms are preferred over more efficient ones, because:
  - the more efficient algorithms are not always easy to implement
  - the program is to be changed soon
  - time may not be critical
  - the program will run only a few times (e.g., once a year)
  - the size of the input is always very small

E.G.M. Petrakis Algorithm Analysis 5 Bubble sort
- Count the number of steps executed by the algorithm.
- Step: basic operation; ignore operations that are executed a constant number of times.

    for (i = 1; i <= n - 1; i++)
        for (j = n; j >= i + 1; j--)
            if (A[j - 1] > A[j])
                swap(A[j], A[j - 1]);   // step

  T(n) = c n^2
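A minimal runnable C sketch of the same count (the names bubble_sort and steps, and the test array, are illustrative, not from the slides): it sorts a 0-indexed array and counts comparisons, which total n(n-1)/2, i.e., on the order of n^2.

    #include <stdio.h>

    /* Sort a 0-indexed array with bubble sort and count the comparisons performed. */
    static long bubble_sort(int A[], int n) {
        long steps = 0;
        for (int i = 0; i < n - 1; i++) {
            for (int j = n - 1; j > i; j--) {
                steps++;                       /* one basic operation          */
                if (A[j - 1] > A[j]) {         /* neighbours out of order: swap */
                    int tmp = A[j - 1];
                    A[j - 1] = A[j];
                    A[j] = tmp;
                }
            }
        }
        return steps;
    }

    int main(void) {
        int A[] = {5, 2, 9, 1, 7, 3};
        int n = sizeof A / sizeof A[0];
        printf("steps = %ld\n", bubble_sort(A, n));   /* n(n-1)/2 = 15 for n = 6 */
        return 0;
    }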

E.G.M. Petrakis Algorithm Analysis 6 Asymptotic Algorithm Analysis
- Measures the efficiency of a program as the size of the input becomes large.
- Focuses on the growth rate: the rate at which the running time of an algorithm grows as the size of the input becomes big.
- Complexity => growth rate: focuses on the upper and lower bounds of the growth rate and ignores all constant factors.

E.G.M. Petrakis Algorithm Analysis 7 Growth Rate
- Depending on the order of n, an algorithm can be of:
  - linear complexity: T(n) = c n
  - quadratic complexity: T(n) = c n^2
  - polynomial complexity: T(n) = c n^k (or even exponential, e.g., T(n) = c 2^n)
- The difference between an algorithm whose running time is T(n) = 10n and another whose running time is T(n) = 2n^2 is tremendous (see the quick check below):
  - for n > 5 the linear algorithm is much faster, despite its larger constant
  - it is slower only for small n, but for such n both algorithms are very fast
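A quick numeric check of this claim: 10n and 2n^2 are equal at n = 5 (both 50); at n = 100 they are 1,000 versus 20,000, and at n = 1,000 they are 10,000 versus 2,000,000.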

E.G.M. Petrakis Algorithm Analysis 8 [figure only; no text]

E.G.M. Petrakis Algorithm Analysis 9 Upper – Lower Bounds
- Several terms are used to describe the running time of an algorithm:
  - the upper bound on its growth rate, expressed by the Big-Oh notation: f(n) is an upper bound of T(n), and T(n) is in O(f(n))
  - the lower bound on its growth rate, expressed by the Omega notation: g(n) is a lower bound of T(n), and T(n) is in Ω(g(n))
  - if the upper and lower bounds are the same, T(n) is in Θ(g(n))

E.G.M. Petrakis Algorithm Analysis 10 Big O (upper bound)
- Definition: T(n) is in O(f(n)) if there exist constants c > 0 and n0 ≥ 1 such that T(n) ≤ c f(n) for all n ≥ n0.
- Example: see the worked example below.
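An illustrative worked example (not necessarily the one on the original slide): to show that T(n) = 3n^2 + 5n is in O(n^2), note that 5n ≤ 5n^2 for n ≥ 1, so T(n) ≤ 3n^2 + 5n^2 = 8n^2; the definition is satisfied with c = 8 and n0 = 1.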

E.G.M. Petrakis Algorithm Analysis 11 Properties of O
- O is transitive: if f(n) ∈ O(g(n)) and g(n) ∈ O(h(n)), then f(n) ∈ O(h(n)).
- If f(n) ∈ O(g(n)), then c f(n) ∈ O(g(n)) for any constant c > 0 (constant factors are absorbed, as the next slide notes).

E.G.M. Petrakis Algorithm Analysis 12 More on Upper Bounds
- T(n) ∈ O(g(n)): g(n) is an upper bound for T(n).
- There exist many upper bounds, e.g., T(n) = n^2 ∈ O(n^2), O(n^3), ..., O(n^100), ...
- Find the smallest upper bound!
- The O notation "hides" the constants, e.g., c1 n^2, c2 n^2, ..., ck n^2 ∈ O(n^2).

E.G.M. Petrakis Algorithm Analysis 13 O in Practice
- Among different algorithms solving the same problem, prefer the algorithm with the smaller rate of growth O: it will be faster for large n.
- O(1) < O(log n) < O(n) < O(n log n) < O(n^2) < O(n^3) < ... < O(2^n) < O(n^n)
- O(log n): logarithmic
- O(n), O(n log n), O(n^2), O(n^3), and in general O(n^k): polynomial
- O(2^n), O(n!), O(n^n): exponential!!

E.G.M. Petrakis Algorithm Analysis 14 [table: running times for algorithms of complexity n, n log n, n^2 and 2^n at various input sizes, assuming a fixed time per instruction; the faster-growing complexities climb from minutes and hours to days, years and centuries]

E.G.M. Petrakis Algorithm Analysis 15 Omega Ω (lower bound)
- Definition: T(n) is in Ω(g(n)) if there exist constants c > 0 and n0 ≥ 1 such that T(n) ≥ c g(n) for all n ≥ n0.
- g(n) is a lower bound on the rate of growth of T(n): T(n) increases at least as fast as g(n) as n increases.
- e.g., T(n) = 6n^2 − 2n + 7 => T(n) ∈ Ω(n^2)
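A worked check of this example (the particular constants are one possible choice, not taken from the slide): for n ≥ 1, 2n ≤ 2n^2, so T(n) = 6n^2 − 2n + 7 ≥ 6n^2 − 2n^2 = 4n^2; the definition is satisfied with c = 4 and n0 = 1, hence T(n) ∈ Ω(n^2).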

E.G.M. Petrakis Algorithm Analysis 16 More on Lower Bounds
- Ω provides a lower bound on the growth rate of T(n).
- There may exist many lower bounds, e.g., T(n) = n^4 ∈ Ω(n), Ω(n^2), Ω(n^3), Ω(n^4).
- Find the largest one => T(n) ∈ Ω(n^4).

E.G.M. Petrakis Algorithm Analysis 17 Theta Θ
- If the lower and upper bounds are the same, T(n) ∈ Θ(g(n)), i.e., T(n) ∈ O(g(n)) and T(n) ∈ Ω(g(n)).
- e.g., T(n) = n^3 + n => T(n) ∈ Θ(n^3)

E.G.M. Petrakis Algorithm Analysis 18 Theorem: T(n) = Σ α_i n^i ∈ Θ(n^m)
(a) T(n) = α_m n^m + α_{m-1} n^{m-1} + ... + α_1 n + α_0 ∈ O(n^m)
Proof:
  T(n) ≤ |T(n)| ≤ |α_m| n^m + |α_{m-1}| n^{m-1} + ... + |α_1| n + |α_0|
       = n^m { |α_m| + (1/n)|α_{m-1}| + ... + (1/n^{m-1})|α_1| + (1/n^m)|α_0| }
       ≤ n^m { |α_m| + |α_{m-1}| + ... + |α_1| + |α_0| }   for n ≥ 1
  => T(n) ∈ O(n^m)

E.G.M. Petrakis Algorithm Analysis 19 Theorem (cont.)
(b) Assuming all coefficients α_i > 0:
  T(n) = α_m n^m + α_{m-1} n^{m-1} + ... + α_1 n + α_0
       ≥ c n^m + c n^{m-1} + ... + c n + c ≥ c n^m,   where c = min{α_m, α_{m-1}, ..., α_1, α_0}
  => T(n) ∈ Ω(n^m)
(a), (b) => T(n) ∈ Θ(n^m)
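A concrete instance of the theorem (the polynomial is illustrative): for T(n) = 3n^2 + 5n + 7, part (a) gives T(n) ≤ (3 + 5 + 7) n^2 = 15 n^2 for n ≥ 1, and part (b) with c = min{3, 5, 7} = 3 gives T(n) ≥ 3n^2, so T(n) ∈ Θ(n^2).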

E.G.M. Petrakis Algorithm Analysis 20 Theorem: (a) ... (b) ... (either ... or ...)

E.G.M. Petrakis Algorithm Analysis 21 Recursive Relations (1)

E.G.M. Petrakis Algorithm Analysis 22 Fibonacci Relation: F(n) = F(n − 1) + F(n − 2), with F(0) = 0 and F(1) = 1.

E.G.M. Petrakis Algorithm Analysis 23 Fibonacci relation (cont.)
- From (a), (b): it can be proven that F(n) ∈ Θ(φ^n), where φ = (1 + √5)/2 ≈ 1.618 is the golden ratio.
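A small C sketch (illustrative, not from the slides) of the naive recursive computation whose cost obeys the same kind of recurrence, T(n) = T(n − 1) + T(n − 2) + c; the number of calls, like F(n) itself, grows as Θ(φ^n):

    /* Naive recursive Fibonacci: the call tree has Θ(φ^n) nodes. */
    long fib(int n) {
        if (n <= 1) return n;             /* base cases F(0) = 0, F(1) = 1 */
        return fib(n - 1) + fib(n - 2);   /* two recursive calls per step  */
    }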

E.G.M. Petrakis Algorithm Analysis 24 Binary Search

    int BSearch(int T[], int a, int b, int key) {
        if (a > b) return -1;                  /* empty range: key not found */
        int middle = (a + b) / 2;
        if (T[middle] == key) return middle;
        else if (key < T[middle]) return BSearch(T, a, middle - 1, key);
        else return BSearch(T, middle + 1, b, key);
    }
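A brief usage sketch (the array contents and main function are illustrative assumptions): with the BSearch above and #include <stdio.h>, note that the array must already be sorted.

    int main(void) {
        int T[] = {1, 3, 4, 7, 9, 11, 15};           /* sorted, n = 7 = 2^3 - 1 */
        int n = sizeof T / sizeof T[0];
        printf("%d\n", BSearch(T, 0, n - 1, 9));     /* prints 4  (T[4] == 9)   */
        printf("%d\n", BSearch(T, 0, n - 1, 8));     /* prints -1 (not present) */
        return 0;
    }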

E.G.M. Petrakis Algorithm Analysis 25 Analysis of Binary Search
- Let n = b − a + 1 be the size of the array, and assume n = 2^k − 1 for some k (e.g., 1, 3, 7, 15, ...).
- The size of each sub-array is 2^(k−1) − 1.
- c: execution time of the first command; d: execution time outside the recursion; T: total execution time.

E.G.M. Petrakis Algorithm Analysis 26 Binary Search (cont.)
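A sketch of the recurrence that follows from the previous slide's setup (the exact constants are an assumption, not taken from the original slide):

    T(0) = c
    T(n) = T((n − 1)/2) + d   for n ≥ 1

With n = 2^k − 1, each recursive call reduces 2^j − 1 elements to 2^(j−1) − 1, so unrolling k times gives T(n) = k d + c = d log2(n + 1) + c, hence T(n) ∈ O(log n), which matches the bound d log(2n + 1) + c used on the next slide.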

E.G.M. Petrakis Algorithm Analysis 27 Binary Search for Random n
- T(n) ≤ T(n + 1) ≤ ... ≤ T(u(n)), where u(n) = 2^k − 1 for some k with n ≤ u(n) < 2n.
- Then T(n) ≤ d log(2n + 1) + c => T(n) ∈ O(log n).

E.G.M. Petrakis Algorithm Analysis 28 Merge sort

    list mergesort(list L, int n) {
        if (n == 1) return L;
        L1 = lower half of L;
        L2 = upper half of L;
        return merge(mergesort(L1, n/2), mergesort(L2, n/2));
    }

- n: size of array L (assume n is a power of 2)
- merge: merges the sorted L1, L2 into a sorted array
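A minimal runnable C sketch of the same scheme (the helper names merge_sort and merge_halves, the temporary buffer, and the test array are illustrative assumptions, not from the slides):

    #include <stdio.h>
    #include <string.h>

    /* Merge the sorted halves A[lo..mid] and A[mid+1..hi] through buffer tmp. */
    static void merge_halves(int A[], int tmp[], int lo, int mid, int hi) {
        int i = lo, j = mid + 1, k = lo;
        while (i <= mid && j <= hi)
            tmp[k++] = (A[i] <= A[j]) ? A[i++] : A[j++];
        while (i <= mid) tmp[k++] = A[i++];
        while (j <= hi)  tmp[k++] = A[j++];
        memcpy(A + lo, tmp + lo, (size_t)(hi - lo + 1) * sizeof(int));
    }

    /* Sort A[lo..hi]: split in half, sort each half, merge (O(n) work per level). */
    static void merge_sort(int A[], int tmp[], int lo, int hi) {
        if (lo >= hi) return;                  /* one element: already sorted */
        int mid = (lo + hi) / 2;
        merge_sort(A, tmp, lo, mid);
        merge_sort(A, tmp, mid + 1, hi);
        merge_halves(A, tmp, lo, mid, hi);
    }

    int main(void) {
        int A[] = {8, 3, 5, 1, 7, 2, 6, 4};    /* n = 8, a power of 2 */
        int tmp[8];
        merge_sort(A, tmp, 0, 7);
        for (int i = 0; i < 8; i++) printf("%d ", A[i]);
        printf("\n");                          /* prints 1 2 3 4 5 6 7 8 */
        return 0;
    }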

E.G.M. Petrakis Algorithm Analysis 29 [figure only; no text]

E.G.M. Petrakis Algorithm Analysis 30 Analysis of Merge sort
[Recursion tree: T(n) splits into two T(n/2), then four T(n/4), and so on for log n levels; the merge cost at each level is O(n), giving complexity O(n log n).]

E.G.M. Petrakis Algorithm Analysis 31 Analysis of Merge sort
- Two ways: recursion analysis and induction.

E.G.M. Petrakis Algorithm Analysis 32 Induction
- Claim: T(n) ∈ O(n log n), i.e., T(n) ≤ a n log n + b for constants a, b > 0 (here T(1) = c1 and c2 n is the cost of merging, so T(n) = 2 T(n/2) + c2 n).
- Proof:
  (1) For n = 1 the claim holds if b ≥ c1.
  (2) Assume T(k) ≤ a k log k + b for k = 1, 2, ..., n − 1.
  (3) For k = n, show that T(n) ≤ a n log n + b.
- For k = n/2: T(n/2) ≤ a (n/2) log(n/2) + b, therefore
  T(n) = 2 T(n/2) + c2 n ≤ 2 { a (n/2) log(n/2) + b } + c2 n = a n (log n − 1) + 2b + c2 n =

E.G.M. Petrakis Algorithm Analysis 33 Induction (cont.)
- T(n) ≤ a n log n − a n + 2b + c2 n.
- Let b = c1 and a = c1 + c2; then −a n + 2b + c2 n ≤ b for n ≥ 1, so
  T(n) ≤ a n log n + b = (c1 + c2) n log n + c1 => T(n) ∈ O(n log n).

E.G.M. Petrakis Algorithm Analysis 34 Recursion analysis of mergesort
- T(n) = 2 T(n/2) + c2 n
       ≤ 4 T(n/4) + 2 c2 n
       ≤ 8 T(n/8) + 3 c2 n
       ≤ ...
       ≤ 2^i T(n/2^i) + i c2 n
- With n = 2^i (i.e., i = log n): T(n) ≤ n T(1) + c2 n log n = c1 n + c2 n log n => T(n) ∈ O(n log n).

E.G.M. Petrakis Algorithm Analysis 35 Best, Worst, Average Cases
- For some algorithms, different inputs require different amounts of time, e.g., sequential search in an array of size n:
  - worst case: the element is not in the array => T(n) = n comparisons
  - best case: the element is first in the array => T(n) = 1
  - average case: the element is equally likely to be in any of the n positions => T(n) ≈ n/2
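A small C sketch of the sequential search being analyzed (the name seq_search is illustrative): the three cases correspond to how far the loop runs before returning.

    /* Return the index of key in A[0..n-1], or -1 if absent.
       Best case: 1 comparison; worst case: n; average about n/2 for a present key. */
    int seq_search(const int A[], int n, int key) {
        for (int i = 0; i < n; i++)
            if (A[i] == key) return i;
        return -1;
    }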

E.G.M. Petrakis Algorithm Analysis 36 Comparison
- The best case is not representative: it happens only rarely; it is useful only when it has high probability.
- The worst case is very useful: the algorithm performs at least that well; very important for real-time applications.
- The average case represents the typical behavior of the algorithm: it is not always possible to compute, and knowledge of the data distribution is necessary.