13 February 2016 Asymptotic Notation, Review of Functions & Summations.


asymp - 1 Asymptotic Complexity
- Running time of an algorithm as a function of input size n, for large n.
- Expressed using only the highest-order term in the expression for the exact running time.
- Instead of an exact running time, say Θ(n²).
- Describes the behavior of a function in the limit.
- Written using asymptotic notation.

asymp - 2 Asymptotic Notation
- Θ, O, Ω, o, ω
- Defined for functions over the natural numbers. Ex: f(n) = Θ(n²). Describes how f(n) grows in comparison to n².
- Each notation defines a set of functions; in practice, used to compare two function sizes.
- The notations describe different rate-of-growth relations between the defining function and the defined set of functions.

asymp - 3 Θ-notation
For a function g(n), we define Θ(g(n)), big-Theta of g of n, as the set:
Θ(g(n)) = { f(n) : ∃ positive constants c1, c2, and n0 such that ∀ n ≥ n0, 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) }
g(n) is an asymptotically tight bound for f(n).
Intuitively: the set of all functions that have the same rate of growth as g(n).

asymp - 4 Θ-notation
Θ(g(n)) = { f(n) : ∃ positive constants c1, c2, and n0 such that ∀ n ≥ n0, 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) }
- Technically, f(n) ∈ Θ(g(n)); older usage writes f(n) = Θ(g(n)). I'll accept either.
- f(n) and g(n) are nonnegative for large n.

asymp - 5 Example
Θ(g(n)) = { f(n) : ∃ positive constants c1, c2, and n0 such that ∀ n ≥ n0, 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) }
- 10n² − 3n = Θ(n²). What constants for c1, c2, and n0 will work?
- Make c1 a little smaller than the leading coefficient, and c2 a little bigger.
- To compare orders of growth, look at the leading term.
- Exercise: Prove that n²/2 − 3n = Θ(n²).
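The claim can be sanity-checked numerically (a finite check over a range of n, not a proof); the witnesses c1 = 9, c2 = 11, n0 = 3 below are one assumed choice that happens to work:

```python
# Numeric sanity check (not a proof) that 10n^2 - 3n = Theta(n^2),
# using the assumed witnesses c1 = 9, c2 = 11, n0 = 3.

def f(n):
    return 10 * n * n - 3 * n

def g(n):
    return n * n

c1, c2, n0 = 9, 11, 3

# Definition of Theta: 0 <= c1*g(n) <= f(n) <= c2*g(n) for all n >= n0.
for n in range(n0, 10_000):
    assert 0 <= c1 * g(n) <= f(n) <= c2 * g(n)
print("Theta inequalities hold for 3 <= n < 10000")
```

Here c1 = 9 sits just below the leading coefficient 10 and c2 = 11 just above it, matching the rule of thumb on this slide.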

asymp - 6 Example
Θ(g(n)) = { f(n) : ∃ positive constants c1, c2, and n0 such that ∀ n ≥ n0, 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) }
- Is 3n³ ∈ Θ(n⁴)?
- How about 2^(2n) ∈ Θ(2^n)?

asymp - 7 O-notation
For a function g(n), we define O(g(n)), big-O of g of n, as the set:
O(g(n)) = { f(n) : ∃ positive constants c and n0 such that ∀ n ≥ n0, 0 ≤ f(n) ≤ c·g(n) }
g(n) is an asymptotic upper bound for f(n).
Intuitively: the set of all functions whose rate of growth is the same as or lower than that of g(n).
f(n) = Θ(g(n)) ⇒ f(n) = O(g(n)). That is, Θ(g(n)) ⊆ O(g(n)).

asymp - 8 Examples
O(g(n)) = { f(n) : ∃ positive constants c and n0 such that ∀ n ≥ n0, 0 ≤ f(n) ≤ c·g(n) }
- Any linear function an + b (with a > 0) is in O(n²). How?
- Show that 3n³ = O(n⁴) for appropriate c and n0.
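Both examples can be tested with a small checker (the helper `in_big_o` is hypothetical, not from the slides); c = a + |b| with n0 = 1 is one assumed witness for the linear case, since an + b ≤ a·n² + |b|·n² for n ≥ 1:

```python
def in_big_o(f, g, c, n0, n_max=10_000):
    """Finite check of 0 <= f(n) <= c*g(n) for n0 <= n < n_max (not a proof)."""
    return all(0 <= f(n) <= c * g(n) for n in range(n0, n_max))

# an + b = O(n^2): for n >= 1, an + b <= a*n^2 + |b|*n^2, so c = a + |b| works.
a, b = 5, 7
assert in_big_o(lambda n: a * n + b, lambda n: n * n, c=a + abs(b), n0=1)

# 3n^3 = O(n^4): c = 3, n0 = 1 suffice, since n^3 <= n^4 for n >= 1.
assert in_big_o(lambda n: 3 * n**3, lambda n: n**4, c=3, n0=1)
print("both O-memberships hold on the tested range")
```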

asymp - 9 Ω-notation
For a function g(n), we define Ω(g(n)), big-Omega of g of n, as the set:
Ω(g(n)) = { f(n) : ∃ positive constants c and n0 such that ∀ n ≥ n0, 0 ≤ c·g(n) ≤ f(n) }
g(n) is an asymptotic lower bound for f(n).
Intuitively: the set of all functions whose rate of growth is the same as or higher than that of g(n).
f(n) = Θ(g(n)) ⇒ f(n) = Ω(g(n)). That is, Θ(g(n)) ⊆ Ω(g(n)).

asymp - 10 Example
Ω(g(n)) = { f(n) : ∃ positive constants c and n0 such that ∀ n ≥ n0, 0 ≤ c·g(n) ≤ f(n) }
- √n = Ω(lg n). Choose c and n0.
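One assumed witness pair is c = 1, n0 = 16; note that a small n0 fails, since sqrt(8) ≈ 2.83 < lg 8 = 3. A finite numeric check (not a proof):

```python
import math

# Check 0 <= c * lg(n) <= sqrt(n) for n >= n0, with assumed c = 1, n0 = 16.
# (sqrt(8) ~ 2.83 < lg 8 = 3, so too small an n0 would not work.)
c, n0 = 1, 16
for n in range(n0, 100_000):
    assert 0 <= c * math.log2(n) <= math.sqrt(n)
print("sqrt(n) >= lg n holds for 16 <= n < 100000")
```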

asymp - 11 Relations Between Θ, O, Ω

asymp - 12 Relations Between Θ, O, Ω
Theorem: For any two functions g(n) and f(n), f(n) = Θ(g(n)) iff f(n) = O(g(n)) and f(n) = Ω(g(n)).
- I.e., Θ(g(n)) = O(g(n)) ∩ Ω(g(n)).
- In practice, asymptotically tight bounds are obtained from asymptotic upper and lower bounds.

asymp - 13 Running Times
- "Running time is O(f(n))" ⇒ worst case is O(f(n)).
- An O(f(n)) bound on the worst-case running time gives an O(f(n)) bound on the running time of every input.
- A Θ(f(n)) bound on the worst-case running time does NOT give a Θ(f(n)) bound on the running time of every input.
- "Running time is Ω(f(n))" ⇒ best case is Ω(f(n)).
- Can still say "Worst-case running time is Ω(f(n))": it means the worst-case running time is given by some unspecified function g(n) ∈ Ω(f(n)).

asymp - 14 Example
- Insertion sort takes Θ(n²) in the worst case, so sorting (as a problem) is O(n²). Why?
- Any sort algorithm must look at each item, so sorting is Ω(n).
- In fact, using (e.g.) merge sort, sorting is Θ(n lg n) in the worst case.
- Later, we will prove that no comparison sort can do better in the worst case.

asymp - 15 Asymptotic Notation in Equations
- Can use asymptotic notation in equations to replace expressions containing lower-order terms.
- For example, 4n³ + 3n² + 2n + 1 = 4n³ + 3n² + Θ(n) = 4n³ + Θ(n²) = Θ(n³). How to interpret?
- In equations, Θ(f(n)) always stands for an anonymous function g(n) ∈ Θ(f(n)).
- In the example above, Θ(n²) stands for 3n² + 2n + 1.
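The chain of equalities can be made concrete by watching f(n)/n³ settle toward the leading coefficient 4; the witnesses c1 = 4, c2 = 5, n0 = 10 below are assumed values, not from the slides:

```python
def f(n):
    return 4 * n**3 + 3 * n**2 + 2 * n + 1

# f(n)/n^3 = 4 + 3/n + 2/n^2 + 1/n^3, which sinks toward 4 as n grows.
for n in (10, 100, 1000, 100000):
    print(n, f(n) / n**3)

# For n >= 10 the ratio lies in [4, 5), so c1 = 4, c2 = 5, n0 = 10
# witness f(n) = Theta(n^3).
assert all(4 <= f(n) / n**3 < 5 for n in range(10, 10_000))
```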

asymp - 16 o-notation
For a given function g(n), the set little-o:
o(g(n)) = { f(n) : for every constant c > 0, ∃ a constant n0 > 0 such that ∀ n ≥ n0, 0 ≤ f(n) < c·g(n) }
f(n) becomes insignificant relative to g(n) as n approaches infinity: lim n→∞ [f(n)/g(n)] = 0.
g(n) is an upper bound for f(n) that is not asymptotically tight.
Observe how this definition differs from the previous ones. Why?

asymp - 17 ω-notation
For a given function g(n), the set little-omega:
ω(g(n)) = { f(n) : for every constant c > 0, ∃ a constant n0 > 0 such that ∀ n ≥ n0, 0 ≤ c·g(n) < f(n) }
f(n) becomes arbitrarily large relative to g(n) as n approaches infinity: lim n→∞ [f(n)/g(n)] = ∞.
g(n) is a lower bound for f(n) that is not asymptotically tight.

asymp - 18 Comparison of Functions
Analogy between comparing functions f, g and comparing numbers a, b:
- f(n) = O(g(n)) ≈ a ≤ b
- f(n) = Ω(g(n)) ≈ a ≥ b
- f(n) = Θ(g(n)) ≈ a = b
- f(n) = o(g(n)) ≈ a < b
- f(n) = ω(g(n)) ≈ a > b

asymp - 19 Limits
- lim n→∞ [f(n)/g(n)] = 0 ⇒ f(n) ∈ o(g(n))
- lim n→∞ [f(n)/g(n)] < ∞ ⇒ f(n) ∈ O(g(n))
- 0 < lim n→∞ [f(n)/g(n)] < ∞ ⇒ f(n) ∈ Θ(g(n))
- 0 < lim n→∞ [f(n)/g(n)] ⇒ f(n) ∈ Ω(g(n))
- lim n→∞ [f(n)/g(n)] = ∞ ⇒ f(n) ∈ ω(g(n))
- lim n→∞ [f(n)/g(n)] undefined ⇒ can't say
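These limit tests can be approximated numerically by evaluating the ratio at one large n; the sample point and thresholds below are assumptions, and this heuristic sketch is no substitute for actually taking the limit:

```python
def classify(f, g, n=10**8, eps=1e-6):
    """Heuristic: guess the asymptotic relation from f(n)/g(n) at one large n."""
    r = f(n) / g(n)
    if r < eps:
        return "f = o(g)"        # ratio heading to 0
    if r > 1 / eps:
        return "f = omega(g)"    # ratio heading to infinity
    return "f = Theta(g)"        # ratio near a positive constant

print(classify(lambda n: n,         lambda n: n * n))   # f = o(g)
print(classify(lambda n: 5 * n * n, lambda n: n * n))   # f = Theta(g)
print(classify(lambda n: n**3,      lambda n: n))       # f = omega(g)
```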

asymp - 20 Properties
- Transitivity:
  f(n) = Θ(g(n)) and g(n) = Θ(h(n)) ⇒ f(n) = Θ(h(n))
  f(n) = O(g(n)) and g(n) = O(h(n)) ⇒ f(n) = O(h(n))
  f(n) = Ω(g(n)) and g(n) = Ω(h(n)) ⇒ f(n) = Ω(h(n))
  f(n) = o(g(n)) and g(n) = o(h(n)) ⇒ f(n) = o(h(n))
  f(n) = ω(g(n)) and g(n) = ω(h(n)) ⇒ f(n) = ω(h(n))
- Reflexivity:
  f(n) = Θ(f(n)), f(n) = O(f(n)), f(n) = Ω(f(n))

asymp - 21 Properties
- Symmetry: f(n) = Θ(g(n)) iff g(n) = Θ(f(n))
- Complementarity:
  f(n) = O(g(n)) iff g(n) = Ω(f(n))
  f(n) = o(g(n)) iff g(n) = ω(f(n))

13 February 2016 Common Functions

asymp - 23 Monotonicity
f(n) is:
- monotonically increasing if m ≤ n ⇒ f(m) ≤ f(n)
- monotonically decreasing if m ≤ n ⇒ f(m) ≥ f(n)
- strictly increasing if m < n ⇒ f(m) < f(n)
- strictly decreasing if m < n ⇒ f(m) > f(n)
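These definitions translate directly into finite checks; a scan over a sample range can refute monotonicity but never prove it for all n (the helper names below are hypothetical):

```python
def monotonically_increasing(f, lo, hi):
    # m <= n implies f(m) <= f(n), checked on [lo, hi)
    return all(f(m) <= f(n) for m in range(lo, hi) for n in range(m, hi))

def strictly_increasing(f, lo, hi):
    # m < n implies f(m) < f(n), checked on [lo, hi)
    return all(f(m) < f(n) for m in range(lo, hi) for n in range(m + 1, hi))

half = lambda n: n // 2                        # increasing, but with plateaus
assert monotonically_increasing(half, 0, 30)
assert not strictly_increasing(half, 0, 30)    # plateaus break strictness
assert strictly_increasing(lambda n: n * n, 0, 30)
print("monotonicity checks pass")
```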

asymp - 24 Exponentials
- Useful identities: a⁻¹ = 1/a, (a^m)^n = a^(mn) = (a^n)^m, a^m · a^n = a^(m+n).
- Exponentials and polynomials: for all real constants a > 1 and b, lim n→∞ (n^b / a^n) = 0, so n^b = o(a^n); any exponential with base greater than 1 grows faster than any polynomial.

asymp - 25 Logarithms
- x = log_b a is the exponent for which a = b^x.
- Natural log: ln a = log_e a
- Binary log: lg a = log_2 a
- lg² a = (lg a)²
- lg lg a = lg (lg a)

asymp - 26 Logarithms and exponentials – Bases
- If the base of a logarithm is changed from one constant to another, the value changes by only a constant factor. Ex: log_10 n · log_2 10 = log_2 n.
- Hence the base of a logarithm is not an issue in asymptotic notation.
- Exponentials with different bases differ by an exponential factor, not a constant factor. Ex: 2^n = (2/3)^n · 3^n.
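Both claims are easy to confirm with the math module (a small numeric illustration, not a proof):

```python
import math

# Base change: log_2 n = log_10 n * log_2 10, a fixed constant factor.
for n in (10, 1000, 10**6):
    assert math.isclose(math.log2(n), math.log10(n) * math.log2(10))

# Exponential bases: 2^n / 3^n = (2/3)^n collapses toward 0, so the two
# differ by an exponential factor, not a constant one.
assert (2**50) / (3**50) < 1e-8
print("log bases differ by a constant factor; exponential bases do not")
```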

asymp - 27 Polylogarithms
- For a ≥ 0, b > 0, lim n→∞ (lg^a n / n^b) = 0, so lg^a n = o(n^b) and n^b = ω(lg^a n).
- Prove using L'Hôpital's rule repeatedly.
- lg(n!) = Θ(n lg n). Prove using Stirling's approximation (in the text) for lg(n!).
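The lg(n!) = Θ(n lg n) bound can be previewed numerically via math.lgamma, since lg(n!) = lgamma(n + 1) / ln 2; this is only a numeric hint, and the actual proof uses Stirling's approximation:

```python
import math

def lg_factorial(n):
    # lg(n!) computed stably: ln(n!) = lgamma(n + 1), then convert to base 2.
    return math.lgamma(n + 1) / math.log(2)

# The ratio lg(n!) / (n lg n) creeps upward toward 1 as n grows,
# consistent with lg(n!) = Theta(n lg n).
for n in (10, 100, 10_000, 1_000_000):
    print(n, lg_factorial(n) / (n * math.log2(n)))
```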

asymp - 28 Exercise
Express each function in column A in asymptotic notation using the corresponding function in column B.

- A = 5n² + 100n, B = 3n² + 2: A ∈ Θ(n²) and n² ∈ Θ(B), so A ∈ Θ(B).
- A = log_3(n²), B = log_2(n³): using log_b a = log_c a / log_c b, A = 2 lg n / lg 3 and B = 3 lg n, so A/B = 2/(3 lg 3), a constant, and A ∈ Θ(B).
- A = n^(lg 4), B = 3^(lg n): using a^(log b) = b^(log a), B = 3^(lg n) = n^(lg 3), so A/B = n^(lg(4/3)) → ∞ as n → ∞, and A ∈ ω(B).
- A = lg² n, B = n^(1/2): lim (lg^a n / n^b) = 0 (here a = 2 and b = 1/2), so A ∈ o(B).

asymp - 29 THETOPPERSWAY.COM