Asymptotic Algorithm Analysis

Asymptotic Algorithm Analysis (Lecture 9: Complexity continued)

The asymptotic analysis of an algorithm determines its running time in big-Oh (big O) notation. To perform the asymptotic analysis:
– We find the worst-case number of primitive operations executed as a function of the input size
– We express this function with big-Oh notation

Example:
– We determine that algorithm arrayMax executes at most 8n − 5 primitive operations
– We say that algorithm arrayMax "runs in O(n) time"
– We say that algorithm arrayMax is "linear in n"

Since constant factors and lower-order terms are eventually dropped anyhow, we can disregard them when counting primitive operations.
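As a rough illustration, here is arrayMax in Python (a minimal sketch; the exact count 8n − 5 comes from the slide's own cost model, not from this code):

    def array_max(data):
        """Return the largest element of a non-empty list.

        The loop body runs n - 1 times and does a constant amount of
        work per iteration, so the total work is O(n).
        """
        current_max = data[0]
        for value in data[1:]:       # n - 1 iterations
            if value > current_max:  # one comparison per iteration
                current_max = value
        return current_max

    print(array_max([3, 2, 1, 6, 4, 5]))  # prints 6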

Big-O Rules (1)

If f(n) is a polynomial of degree d, then f(n) is O(n^d), i.e.:
1. Drop lower-order terms
2. Drop constant factors

Use the smallest possible class of functions:
– Say "2n is O(n)" instead of "2n is O(n²)"

Use the simplest expression of the class:
– Say "3n + 5 is O(n)" instead of "3n + 5 is O(3n)"

Big-O Rules (2)

1. If T₁(n) = O(f(n)) and T₂(n) = O(g(n)), then
   (a) T₁(n) + T₂(n) = O(max(f(n), g(n)))
   (b) T₁(n) * T₂(n) = O(f(n) * g(n))
2. If T(n) = (log n)^k for a constant k, then T(n) = O(n)

Computing Prefix Averages

We further illustrate asymptotic analysis with two algorithms for prefix averages. The i-th prefix average of an array X is the average of the first (i + 1) elements of X:

A[i] = (X[0] + X[1] + … + X[i]) / (i + 1)

Computing the array A of prefix averages of another array X has applications to financial analysis.

Prefix Averages (Quadratic): a dumb algorithm

The following algorithm computes prefix averages in quadratic time by applying the definition (note: this time we ignore constants):

Algorithm prefixAverages1(X, n)
  Input: array X of n integers
  Output: array A of prefix averages of X    # operations
    A ← new array of n integers              n
    for i ← 0 to n − 1 do                    n
      s ← X[0]                               n
      for j ← 1 to i do                      1 + 2 + … + (n − 1)
        s ← s + X[j]                         1 + 2 + … + (n − 1)
      A[i] ← s / (i + 1)                     n
    return A                                 1
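A direct Python transcription of this algorithm may help (a sketch; the function name is mine, and it assumes a non-empty list):

    def prefix_averages_quadratic(x):
        """Prefix averages by applying the definition directly: O(n^2) time."""
        n = len(x)
        a = [0.0] * n
        for i in range(n):
            s = x[0]
            for j in range(1, i + 1):  # runs 1 + 2 + ... + (n - 1) times in total
                s += x[j]
            a[i] = s / (i + 1)         # average of x[0..i]
        return a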

Aside: Arithmetic Progression

The running time of prefixAverages1 is O(1 + 2 + … + n). The sum of the first n integers is n(n + 1)/2 (there is a simple visual proof of this fact). Thus, algorithm prefixAverages1 runs in O(n²) time.

Prefix Averages (Linear): a smart algorithm

The following algorithm computes prefix averages in linear time by keeping a running sum:

Algorithm prefixAverages2(X, n)
  Input: array X of n integers
  Output: array A of prefix averages of X    # operations
    A ← new array of n integers              n
    s ← 0                                    1
    for i ← 0 to n − 1 do                    n
      s ← s + X[i]                           n
      A[i] ← s / (i + 1)                     n
    return A                                 1

Algorithm prefixAverages2 runs in O(n) time, so it is more efficient.
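The same idea in Python (again a sketch with invented names); the expected output is easy to verify by hand:

    def prefix_averages_linear(x):
        """Prefix averages with a running sum: O(n) time."""
        a = [0.0] * len(x)
        s = 0
        for i, value in enumerate(x):
            s += value            # s is now x[0] + ... + x[i]
            a[i] = s / (i + 1)
        return a

    # prefix sums of [3, 2, 1, 6, 4, 5] are 3, 5, 6, 12, 16, 21
    assert prefix_averages_linear([3, 2, 1, 6, 4, 5]) == [3.0, 2.5, 2.0, 3.0, 3.2, 3.5]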

Some More Growth Rates

The most common growth rates:
– O(1), or constant
– O(log n), or logarithmic (to base 2)
– O(n), or linear
– O(n log n) (usually just called "n log n")
– O(n²), or quadratic
– O(n³), or cubic
– O(2ⁿ), exponential
– O(n!), n factorial

[Figure: growth-rate curves for 1, log n, n and n log n plotted against n]

General Rules for Finding Running Times

Rule 1 – Loops
– The running time of a loop is at most the running time of the statements inside the loop (including tests) multiplied by the number of iterations.

Rule 2 – Nested loops
– These should be analysed inside out: the total running time of a statement inside a group of nested loops is the running time of the statement multiplied by the product of the sizes of all the loops. For example, the following is O(n³):

Algorithm Test1(n):
  for i = 1 to n do
    for j = 1 to n do
      for k = 1 to n do
        increment x
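As a sanity check, here is Test1 in Python (a sketch): the innermost statement executes exactly n³ times, matching the product-of-loop-sizes rule:

    def test1(n):
        """Triply nested loop over ranges of size n: O(n^3)."""
        x = 0
        for i in range(n):
            for j in range(n):
                for k in range(n):
                    x += 1          # executed n * n * n times
        return x

    assert test1(5) == 5 ** 3       # 125 increments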

General Rules (continued)

Rule 3 – Consecutive statements
– These just add (so take the maximum O; see the sum rule in Big-O Rules (2) above).

Rule 4 – If-then-else
– For code of the following form, the running time is never more than the time of the test (condition) plus the larger of the running times of S1 and S2. Similarly for multiple/nested else statements.

Algorithm Test2(n):
  if condition1 then
    S1
  else
    S2

Examples

Algorithm Squares1(n):
  Input: integer n
  Output: the sum of the squares of the first n positive integers
    i ← 0
    sum ← 0
    while (i < n)
      increment i
      sum ← sum + i*i
    return sum

Squares1 is O(n). With some mathematical knowledge we can rewrite the program so that it is faster, using ∑i² = n(n+1)(2n+1)/6:

Algorithm Squares2(n):
  Input: integer n
  Output: the sum of the squares of the first n positive integers
    sum ← (n*(n+1)*(2*n+1))/6
    return sum

Squares2 is O(1).
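Both versions in Python (a sketch; the assertion checks that the closed form agrees with the loop):

    def squares1(n):
        """Sum of the squares of 1..n by looping: O(n)."""
        total = 0
        for i in range(1, n + 1):
            total += i * i
        return total

    def squares2(n):
        """Sum of the squares of 1..n via the closed form: O(1)."""
        return n * (n + 1) * (2 * n + 1) // 6

    assert squares1(100) == squares2(100) == 338350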

Example: Integer Square Root

Algorithm IntSqrt1(n):
  Input: integer n
  Output: the integer part of the square root of n
    i ← 0
    while ((i+1)*(i+1) ≤ n) do
      increment i
    return i

IntSqrt1 is O(√n).

A faster version "homes in" on √n by repeatedly halving an interval that contains it:

Algorithm IntSqrt2(n):
  Input: integer n
  Output: the integer part of the square root of n
    L ← 0
    H ← n + 1
    while H > L + 1 do
      D ← (L + H) / 2      (integer division)
      if (D*D ≤ n) then L ← D
      else H ← D
    return L

IntSqrt2 is O(log n).
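A runnable Python version of the binary-search variant (a sketch; starting the upper bound at n + 1 keeps the invariant low² ≤ n < high² even for n = 0 or 1):

    def int_sqrt(n):
        """Integer part of the square root of n by binary search: O(log n)."""
        low, high = 0, n + 1     # invariant: low*low <= n < high*high
        while high > low + 1:
            mid = (low + high) // 2
            if mid * mid <= n:
                low = mid
            else:
                high = mid
        return low

    assert [int_sqrt(n) for n in range(10)] == [0, 1, 1, 1, 2, 2, 2, 2, 2, 3]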

Complexity of the Simple Search and Sorting Algorithms

In JP2 you saw that:

Search
– LinearSearch: about n/2 "comparisons" on average
– BinarySearch: about log₂ n "comparisons"

Sorting
– InsertionSort, SelectionSort: about n²/2 "comparisons"
– BubbleSort: also about n²/2 (see later this lecture!)

More formal analysis (like we are doing here) would give: LinearSearch is O(n), BinarySearch is O(log₂ n), InsertionSort is O(n²), and so on.

(More) Definitions

If T(n) is a function of n, then:
– T(n) = O(f(n)) if there are constants c and n₀ such that T(n) ≤ c·f(n) when n ≥ n₀
– T(n) = Ω(g(n)) ("big omega") if there are constants c and n₀ such that T(n) ≥ c·g(n) when n ≥ n₀
– T(n) = Θ(h(n)) ("big theta") if and only if T(n) = O(h(n)) and T(n) = Ω(h(n))
– T(n) = o(p(n)) ("little o") if T(n) = O(p(n)) and T(n) ≠ Θ(p(n))

Don't panic! We will really only use the first of these, and we will translate them into English first…
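A worked instance (my example, not from the slides): T(n) = 3n + 5 is O(n), because 3n + 5 ≤ 4n whenever n ≥ 5, so the constants c = 4 and n₀ = 5 witness the definition. The same T(n) is also Ω(n) (take c = 1 and n₀ = 1, since 3n + 5 ≥ n), and therefore Θ(n).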

In English?

T(n) = O(f(n))
– Can we find a constant c such that, for large enough n, T(n) ≤ c·f(n)?

T(n) = Ω(g(n))
– Can we find a constant c such that, for large enough n, T(n) ≥ c·g(n)?

T(n) = Θ(h(n))
– Is the bound as tight as we can get?

T(n) = o(p(n))
– Could we perhaps have done better?

A Note About Logarithms in Running Times

IntSqrt2 was O(log₂ n) because every time we iterated the loop, H − L was divided by 2 (similarly for binary search). Consider the following example:

while n ≥ 1 do
  n := n/3

Do we say that this fragment is O(log₃ n)? No, we say it is O(log₂ n). Why? Because log₃ n = log₂ n / log₂ 3, so changing the base only changes a constant factor, which big-Oh ignores. Traditionally, we only use logs to the base 2.

Two Other Complexities (Both Impossibly Slow)

Exponential – O(2ⁿ)
– E.g. print out all of the combinations/subsets of a set of size n
– (Every element is either in or out of a subset)

Factorial – O(n!)
– E.g. print out all of the permutations of the n elements of a set; could use recursion here
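To make the counts concrete, a small Python sketch (using the standard library, with an arbitrary three-element set):

    from itertools import permutations
    from math import factorial

    items = ["a", "b", "c"]
    n = len(items)

    # Every element is in or out of a subset, so there are 2^n subsets.
    subsets = [[x for i, x in enumerate(items) if (mask >> i) & 1]
               for mask in range(2 ** n)]
    assert len(subsets) == 2 ** n          # 8 subsets for n = 3

    # There are n! orderings of n distinct elements.
    assert len(list(permutations(items))) == factorial(n)   # 6 for n = 3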

A Handy Lookup Table

Function                   O            Description    How good?
A                          O(1)         constant       nearly immediate
A + B log₂ n               O(log n)     logarithmic    stupendously fast
A + B √n                   O(√n)        square root    very fast
A + B n                    O(n)         linear         fast
A + B log₂ n + C n         O(n)         linear         fast
A + B n log₂ n             O(n log n)   n log n        fairly fast
A + B n + C n log₂ n       O(n log n)   n log n        fairly fast
A + B n + C n²             O(n²)        quadratic      slow for large n
A + B n + C n² + D n³      O(n³)        cubic          slow for most n
A + B 2ⁿ                   O(2ⁿ)        exponential    impossibly slow
A n!                       O(n!)        factorial      impossibly slow

The faster rows are flagged "good"; the exponential and factorial rows are flagged "intractable".

Best, Average and Worst Cases

The analysis so far depends only on the size of the problem and not on the disposition of the data. In linear search we could be:
– lucky, and find the item immediately: complexity O(1)
– unlucky, and find the item last: complexity O(n)
– otherwise we have the average case, which here is still O(n)

A full complexity analysis should include the best case, the worst case and the average case. Note that in some cases (like most of our examples) these would all be equal.
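Where those cases come from, in a Python sketch of linear search (names mine):

    def linear_search(data, target):
        """Index of target in data, or -1 if absent.

        Best case: target is data[0], one comparison, O(1).
        Worst case: target is last or absent, len(data) comparisons, O(n).
        """
        for i, value in enumerate(data):
            if value == target:
                return i
        return -1

    items = [3, 2, 1, 6, 4, 5]
    print(linear_search(items, 3))   # best case: found immediately at index 0
    print(linear_search(items, 9))   # worst case: scans all six items, prints -1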

Space Complexity

We can also analyse algorithms for space complexity. Space complexity is as important as time complexity: we can buy memory, but… we still have to initialise/read/write it!

A Note About Logs

Need to revise logs? Look in GoTa p. We will do some examples in the examples class.

If x = b^c then log_b x = c. So log₂ 8 = 3 and log₂ 1024 = 10.

For complexity analysis we generally use log x to mean log₂ x.

The main rules we use about logs are:

log(xy) = log x + log y    and    log(x^y) = y log x

So log₂ 1024 = log₂(32 * 32) = log₂ 32 + log₂ 32 = 5 + 5 = 10.

Demos, an example and an exercise for you

Look at our demo page

Some Examples for You to Try

In all cases give an analysis of the running time (big-Oh will do, but make it as small as possible).

Fragment 1:
  sum ← 0
  for i = 1 to N do
    for j = 1 to i do
      sum = sum + 1

Fragment 2:
  sum ← 0
  for i = 1 to N do
    for j = 1 to i*i do
      for k = 1 to j do
        sum = sum + 1

Fragment 3:
  sum ← 0
  for i = 1 to N do
    for j = 1 to i*i do
      if j mod i = 0 then
        for k = 1 to j do
          sum = sum + 1

Bubble Sort

Simple but inefficient. To sort the following list into ascending order:

3 2 1 6 4 5

First stage: select the first item, 3. Is it bigger than the next, 2? Yes: swap. Select the second item, 3. Is it bigger than the next, 1? Yes: swap. Repeat until the biggest item is rightmost.

original        3 2 1 6 4 5
3,2 compared    2 3 1 6 4 5
3,1 compared    2 1 3 6 4 5
3,6 compared    2 1 3 6 4 5
6,4 compared    2 1 3 4 6 5
6,5 compared    2 1 3 4 5 6

After 5 comparisons in total, the list of 6 items is split into a list of one item (the biggest) on the right and a list of 5 items (not necessarily in their original order) on the left.

Second stage: take the leftmost list of 5 items and do the same thing. There is no change in 5th place, but it takes 4 comparisons (you check!), leaving a list of size 4 on the left and a sorted list of size 2 on the right.

Complexity

For N items in the list:
– number of comparisons at the first stage is N − 1
– number of comparisons at the second stage is N − 2
– …
– Total = (N − 1) + … + 2 + 1 = (N − 1)N/2 ≈ N²/2 for large N

But we could stop as soon as a stage makes zero swaps…

For comparison, the number of comparisons made by the other simple sorts:

             Insertion_Sort                            Selection_Sort
Minimum      n − 1                                     (n − 1) + (n − 2) + … + 1 = n(n − 1)/2
Maximum      (n − 1) + (n − 2) + … + 1 = n(n − 1)/2    n(n − 1)/2
Average      ((n − 1) + … + 1)/2 = n(n − 1)/4          n(n − 1)/2

So bubble sort is as bad as insertion sort and selection sort!
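A Python sketch of bubble sort with the early exit the slide hints at (stop when a full pass makes zero swaps); the names are mine:

    def bubble_sort(data):
        """Sort data in place: worst case ~N^2/2 comparisons, but a pass
        with zero swaps means the list is sorted, so best case is O(N)."""
        n = len(data)
        for stage in range(n - 1):
            swapped = False
            for i in range(n - 1 - stage):   # largest of data[0..n-1-stage] bubbles right
                if data[i] > data[i + 1]:
                    data[i], data[i + 1] = data[i + 1], data[i]
                    swapped = True
            if not swapped:                  # zero swaps: already sorted, stop early
                break
        return data

    print(bubble_sort([3, 2, 1, 6, 4, 5]))  # [1, 2, 3, 4, 5, 6]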