Growth of Functions
David Meredith (dave@create.aau.dk)



Source Chapter 3 of Cormen, T. H., Leiserson, C. E., Rivest, R. L. and Stein, C. (2001). Introduction to Algorithms (Second Edition). MIT Press, Cambridge, MA.

Introduction The order of growth of the running time of an algorithm indicates the algorithm's efficiency When n, the input size, is large enough, merge sort, with its Θ(n lg n) worst-case running time, beats insertion sort, whose worst-case running time is Θ(n²) When processing lots of data (e.g., in many multimedia applications), we only worry about the highest-order term in the order of growth We ignore constants and we ignore the lower-order terms
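To make the crossover concrete, here is a small Python sketch comparing two hypothetical cost models, c·n² for insertion sort and c·n·lg n for merge sort. The constants (1 and 8) are illustrative assumptions, not measured values; they only show that the quadratic algorithm can win for small n while n lg n dominates for large n.

```python
import math

def insertion_cost(n, c=1.0):
    """Hypothetical worst-case cost model c * n^2 (constant is illustrative)."""
    return c * n * n

def merge_cost(n, c=8.0):
    """Hypothetical cost model c * n * lg(n); the constant 8 is an assumption."""
    return c * n * math.log2(n) if n > 1 else c

# For small n the quadratic model can be cheaper; for large n it loses.
for n in [4, 16, 64, 1024]:
    print(n, insertion_cost(n) < merge_cost(n))
```

Whatever constants we pick, there is always some n₀ beyond which the Θ(n lg n) algorithm wins; the constants only move the crossover point.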

Asymptotic behaviour We are generally concerned only with how the running time (or space) increases in the limit That is, with the asymptotic behaviour of the algorithm

Asymptotic notation Asymptotic running time is usually stated in terms of functions whose domains are the set of natural numbers, N = {0, 1, 2, ...} The worst-case running time, T(n), is usually defined on integral input sizes The notation is often abused, e.g., by extending it to the domain of real numbers But we should be careful not to misuse the notation

Θ-notation The worst-case running time of insertion sort is Θ(n²) If g(n) is a function, then Θ(g(n)) is the set of functions f(n) such that 0 ≤ c1g(n) ≤ f(n) ≤ c2g(n) for all n ≥ n0, for some positive constants c1, c2 and n0

Θ-notation Θ(g(n)) is a set, so we can write f(n) ∈ Θ(g(n)) In fact, we usually write f(n) = Θ(g(n)) This may be confusing, but we'll see later that it has its advantages

Θ-notation Graphically, f(n) lies between c1g(n) and c2g(n) for all n ≥ n0 g(n) is an asymptotically tight bound for f(n) Every member of Θ(g(n)) must be asymptotically non-negative, i.e., f(n) must be non-negative when n is large enough

Θ-notation Let's show that (1/2)n² - 3n = Θ(n²) We need to find c1, c2 and n0 such that c1n² ≤ (1/2)n² - 3n ≤ c2n² for all n ≥ n0

Θ-notation Dividing by n² gives c1 ≤ 1/2 - 3/n ≤ c2 The right-hand inequality holds for all n ≥ 1 if c2 ≥ 1/2 The left-hand inequality holds for all n ≥ 7 if c1 ≤ 1/14 So if c1 = 1/14, c2 = 1/2 and n0 = 7, we can verify that (1/2)n² - 3n = Θ(n²)
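These constants can be checked numerically. The sketch below verifies both inequalities over a sampled range of n ≥ n0, using exact rational arithmetic so no floating-point rounding can disturb the boundary case at n = 7 (a spot check, not a proof, since the asymptotic claim concerns all n ≥ n0).

```python
from fractions import Fraction

def f(n):
    # The function being bounded: f(n) = n^2/2 - 3n, computed exactly
    return Fraction(n * n, 2) - 3 * n

c1, c2, n0 = Fraction(1, 14), Fraction(1, 2), 7

# Check c1*n^2 <= f(n) <= c2*n^2 for n0 <= n < 10000
for n in range(n0, 10000):
    assert c1 * n * n <= f(n) <= c2 * n * n
print("bounds hold on the sampled range")
```

Note that at n = 7 the lower bound is met with equality: c1·49 = 49/14 = 7/2 = f(7), which is exactly why n0 = 7 is the smallest threshold that works with c1 = 1/14.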

Θ-notation Let's now show that 6n³ ≠ Θ(n²) Suppose c2 and n0 exist such that 6n³ ≤ c2n² for all n ≥ n0 But then n ≤ c2/6, which cannot hold for all n greater than some number n0

Θ-notation We can generally ignore all but the highest-order term in the growth function If f(n) = a_d n^d + a_{d-1} n^{d-1} + ... + a_0, with a_d > 0, then f(n) = Θ(n^d), where c1 can be chosen slightly less than a_d and c2 slightly greater than a_d
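This can be illustrated numerically: for a sample cubic (the coefficients below are chosen arbitrarily for illustration), the ratio f(n)/n³ approaches the leading coefficient as n grows, which is why the lower-order terms are asymptotically irrelevant.

```python
def poly(n):
    # An arbitrary example polynomial: 5n^3 + 100n^2 + 7
    return 5 * n**3 + 100 * n**2 + 7

# The ratio poly(n)/n^3 tends to the leading coefficient, 5
for n in [10, 100, 10000]:
    print(n, poly(n) / n**3)
```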

Θ-notation A constant function, c, is said to be Θ(n⁰) = Θ(1) If the running time of an algorithm is Θ(1), we say it runs in constant time

O-notation Θ-notation gives tight asymptotic bounds on a function from above and below When we only have an asymptotic upper bound, we use O-notation O(g(n)) is the set of functions f(n) such that 0 ≤ f(n) ≤ cg(n) for all n ≥ n0, for some positive constants c and n0


O-notation and loop structure We can often infer an upper bound on the worst-case running time just by inspecting the loop structure of an algorithm A single loop over n elements gives a linear algorithm: O(n) Three nested loops, each over n elements, give a cubic algorithm: O(n³)
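A sketch of the idea (the loops below are illustrative stand-ins, not the algorithms from the original figures): counting primitive operations directly shows one loop growing linearly and three nested loops growing cubically.

```python
def linear_count(n):
    # One loop over n elements: O(n) primitive operations
    ops = 0
    for _ in range(n):
        ops += 1
    return ops

def cubic_count(n):
    # Three nested loops over n elements: O(n^3) primitive operations
    ops = 0
    for _ in range(n):
        for _ in range(n):
            for _ in range(n):
                ops += 1
    return ops

print(linear_count(10), cubic_count(10))
```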

O-notation If we use O-notation to bound the worst-case running time, then we have an upper bound on the time for every input This is because the bound is not necessarily tight Thus, the running time of insertion sort on every input is O(n²), EVEN THE LINEAR BEST CASE! Note that the running time for every input is NOT given by Θ-notation, e.g., the best-case running time of insertion sort is Θ(n), which is NOT Θ(n²)

Ω-notation Ω-notation gives an asymptotic lower bound, cf. O-notation, which gives an asymptotic upper bound Ω(g(n)) is the set of functions f(n) such that 0 ≤ cg(n) ≤ f(n) for all n ≥ n0, for some positive constants c and n0

Theorem relating Θ, O and Ω If f(n) is Θ(g(n)), this means f(n) has tight upper and lower bounds of the form cg(n) If f(n) is Ω(g(n)), then f(n) has a lower bound of the form cg(n) If f(n) is O(g(n)), then f(n) has an upper bound of the form cg(n) If both the upper and lower bounds are of the form cg(n), then these bounds must be tight, so f(n) is Θ(g(n)) In other words, f(n) = Θ(g(n)) if and only if f(n) = O(g(n)) and f(n) = Ω(g(n))

Ω-notation When Ω is used to bound the best case, then it bounds the running time for all inputs For example, the best-case running time of insertion sort is Ω(n), so the running time of insertion sort is Ω(n) (for all inputs) The worst-case running time of insertion sort is Ω(n²) (and Ω(n)) But the running time of insertion sort is NOT Ω(n²), because the best case is Θ(n)

Asymptotic notation in equations When asymptotic notation appears in a formula, it stands for some anonymous function in the set For example, 2n² + 3n + 1 = 2n² + Θ(n) means that 2n² + 3n + 1 = 2n² + f(n) for some f(n) ∈ Θ(n)

o-notation The bound provided by O-notation may or may not be tight The bound provided by o-notation is necessarily not tight o(g(n)) is the set of functions f(n) such that for every positive constant c > 0, there exists a constant n0 > 0 such that 0 ≤ f(n) < cg(n) for all n ≥ n0 For example, 2n = o(n²) but 2n² ≠ o(n²) The difference between o-notation and O-notation is that the O bound only has to hold for at least one constant c > 0, whereas the o bound must hold for every constant c > 0
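One way to see the distinction (a numerical heuristic, not the formal definition): when f(n) = o(g(n)) the ratio f(n)/g(n) tends to 0, so f(n) eventually drops below c·g(n) for any c > 0, however small; when the O bound is tight, the ratio stays bounded away from 0.

```python
# 2n = o(n^2): the ratio (2n)/n^2 = 2/n tends to 0, so 2n < c*n^2
# eventually, for ANY c > 0.
# 2n^2 is O(n^2) but NOT o(n^2): the ratio is constant at 2, so it
# never drops below, say, c = 1.
for n in [10, 1000, 100000]:
    print(n, (2 * n) / n**2, (2 * n**2) / n**2)
```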

ω-notation ω-notation is to Ω-notation what o-notation is to O-notation ω-notation gives a lower bound that is not asymptotically tight ω(g(n)) is the set of functions f(n) such that for every positive constant c > 0, there exists a constant n0 > 0 such that 0 ≤ cg(n) < f(n) for all n ≥ n0

Transitivity f(n) = Θ(g(n)) and g(n) = Θ(h(n)) imply f(n) = Θ(h(n)) The same holds for O, Ω, o and ω

Reflexivity f(n) = Θ(f(n)), f(n) = O(f(n)) and f(n) = Ω(f(n))

Symmetry and transpose symmetry Symmetry: f(n) = Θ(g(n)) if and only if g(n) = Θ(f(n)) Transpose symmetry: f(n) = O(g(n)) if and only if g(n) = Ω(f(n)), and f(n) = o(g(n)) if and only if g(n) = ω(f(n))

Analogy with comparison between numbers f(n) = O(g(n)) is like a ≤ b, f(n) = Ω(g(n)) is like a ≥ b, and f(n) = Θ(g(n)) is like a = b f(n) is asymptotically smaller than g(n) if f(n) = o(g(n)) (like a < b) f(n) is asymptotically larger than g(n) if f(n) = ω(g(n)) (like a > b)

Trichotomy For real numbers, exactly one of a = b, a < b or a > b must hold However, some functions are not asymptotically comparable, that is, f(n) is neither O(g(n)) nor Ω(g(n)) For example, the functions n and n^(1+sin n) are not comparable, because the exponent of n^(1+sin n) oscillates between 0 and 2, taking all values in between
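The oscillation can be observed numerically. This sketch evaluates the exponent 1 + sin n at integer n and shows it swinging between values near 0 and values near 2, so n^(1+sin n) repeatedly dips below n and rises above it.

```python
import math

# The exponent of n^(1 + sin n) oscillates between 0 and 2, so the
# function is neither O(n) nor Omega(n): no single bound of the form
# c*n holds on one side for all large n.
for n in range(1, 15):
    e = 1 + math.sin(n)
    print(n, round(e, 3), round(n ** e, 3))
```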