
Introduction to Algorithms (2nd edition) by Cormen, Leiserson, Rivest & Stein. Chapter 3: Growth of Functions (slides enhanced by N. Adlai A. DePano)

Overview  Order of growth of functions provides a simple characterization of efficiency  Allows for comparison of relative performance between alternative algorithms  Concerned with asymptotic efficiency of algorithms  Best asymptotic efficiency usually is best choice except for smaller inputs  Several standard methods to simplify asymptotic analysis of algorithms

Asymptotic Notation  Applies to functions whose domains are the set of natural numbers: N = {0,1,2,…}  If time resource T(n) is being analyzed, the function’s range is usually the set of non- negative real numbers: T(n)  R +  If space resource S(n) is being analyzed, the function’s range is usually also the set of natural numbers: S(n)  N

Asymptotic Notation  Depending on the textbook, asymptotic categories may be expressed in terms of -- a.set membership (our textbook): functions belong to a family of functions that exhibit some property; or b.function property (other textbooks): functions exhibit the property  Caveat: we will formally use (a) and informally use (b)

The Θ-Notation
Θ(g(n)) = { f(n) : ∃ c1, c2 > 0, n0 > 0 s.t. ∀ n ≥ n0 : c1 · g(n) ≤ f(n) ≤ c2 · g(n) }
[figure: f(n) lies between c1 · g(n) and c2 · g(n) for all n ≥ n0]
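As a worked example (the example and its witness constants are a standard illustration, not text from the slide), one can show that (1/2)n² − 3n ∈ Θ(n²) by exhibiting explicit constants:

```latex
% Worked example (not from the slide): show (1/2)n^2 - 3n = \Theta(n^2).
% We need c_1, c_2 > 0 and n_0 > 0 with
%   c_1 n^2 \le \tfrac{1}{2}n^2 - 3n \le c_2 n^2 \quad \text{for all } n \ge n_0.
% Dividing by n^2 gives  c_1 \le \tfrac{1}{2} - \tfrac{3}{n} \le c_2.
% The right inequality holds for all n \ge 1 with c_2 = \tfrac{1}{2};
% the left inequality holds for all n \ge 7 with c_1 = \tfrac{1}{14}.
\frac{1}{14}\,n^2 \;\le\; \frac{1}{2}n^2 - 3n \;\le\; \frac{1}{2}n^2
\qquad \text{for all } n \ge 7 .
```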

The O-Notation
O(g(n)) = { f(n) : ∃ c > 0, n0 > 0 s.t. ∀ n ≥ n0 : f(n) ≤ c · g(n) }
[figure: f(n) lies below c · g(n) for all n ≥ n0]
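A small Python sketch (the helper name check_big_o and the sample functions are my own, not from the slides) that sanity-checks a claimed witness pair (c, n0) for an O-bound over a finite range; passing such a check is evidence, not a proof, which must argue for all n ≥ n0:

```python
# Minimal sketch (assumed helper, not from the slides): empirically check a
# claimed O-bound witness (c, n0) on a finite range of n.

def check_big_o(f, g, c, n0, n_max=10_000):
    """Return True if f(n) <= c * g(n) for every n in [n0, n_max]."""
    return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))

if __name__ == "__main__":
    f = lambda n: 2 * n * n + 3 * n + 7     # claim: f(n) = O(n^2)
    g = lambda n: n * n
    print(check_big_o(f, g, c=3, n0=10))    # True: 2n^2 + 3n + 7 <= 3n^2 for n >= 10
```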

The Ω-Notation
Ω(g(n)) = { f(n) : ∃ c > 0, n0 > 0 s.t. ∀ n ≥ n0 : f(n) ≥ c · g(n) }
[figure: f(n) lies above c · g(n) for all n ≥ n0]

The o-Notation
o(g(n)) = { f(n) : ∀ c > 0 ∃ n0 > 0 s.t. ∀ n ≥ n0 : f(n) < c · g(n) }
[figure: for every constant c, f(n) falls below c · g(n) beyond some threshold n0]

The ω-Notation
ω(g(n)) = { f(n) : ∀ c > 0 ∃ n0 > 0 s.t. ∀ n ≥ n0 : f(n) > c · g(n) }
[figure: for every constant c, f(n) rises above c · g(n) beyond some threshold n0]

Comparison of Functions
Transitivity:
• f(n) = O(g(n)) and g(n) = O(h(n)) ⇒ f(n) = O(h(n))
• f(n) = Ω(g(n)) and g(n) = Ω(h(n)) ⇒ f(n) = Ω(h(n))
• f(n) = Θ(g(n)) and g(n) = Θ(h(n)) ⇒ f(n) = Θ(h(n))
Reflexivity:
• f(n) = O(f(n)), f(n) = Ω(f(n)), f(n) = Θ(f(n))

Comparison of Functions
Symmetry:
• f(n) = Θ(g(n)) ⇔ g(n) = Θ(f(n))
Transpose symmetry:
• f(n) = O(g(n)) ⇔ g(n) = Ω(f(n))
• f(n) = o(g(n)) ⇔ g(n) = ω(f(n))
Theorem 3.1:
• f(n) = Θ(g(n)) if and only if f(n) = O(g(n)) and f(n) = Ω(g(n))

Asymptotic Analysis and Limits
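The body of this slide was not preserved in the transcript; the following summary of the standard limit tests (a sketch of the usual formulation, not text recovered from the slide) connects the notations above to the limit of f(n)/g(n), assuming that limit exists and g(n) > 0 for large n:

```latex
% Standard limit tests (assumes the limit exists and g(n) > 0 eventually):
\lim_{n\to\infty}\frac{f(n)}{g(n)} = 0
    \;\Longrightarrow\; f(n) = o(g(n)) \ \text{(and hence } f(n) = O(g(n))\text{)} \\[4pt]
\lim_{n\to\infty}\frac{f(n)}{g(n)} = L, \quad 0 < L < \infty
    \;\Longrightarrow\; f(n) = \Theta(g(n)) \\[4pt]
\lim_{n\to\infty}\frac{f(n)}{g(n)} = \infty
    \;\Longrightarrow\; f(n) = \omega(g(n)) \ \text{(and hence } f(n) = \Omega(g(n))\text{)}
```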

Comparison of Functions
• f1(n) = O(g1(n)) and f2(n) = O(g2(n)) ⇒ f1(n) + f2(n) = O(g1(n) + g2(n))
• f(n) = O(g(n)) ⇒ f(n) + g(n) = O(g(n))
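A one-line justification of the second rule (a sketch added here, not slide text), assuming the functions are asymptotically non-negative: if f(n) ≤ c · g(n) for all n ≥ n0, then

```latex
% Why f(n) = O(g(n)) implies f(n) + g(n) = O(g(n)):
f(n) + g(n) \;\le\; c\,g(n) + g(n) \;=\; (c + 1)\,g(n)
\qquad \text{for all } n \ge n_0,
% so the constant c' = c + 1 (with the same n_0) witnesses the O-bound.
```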

Standard Notation and Common Functions  Monotonicity A function f(n) is monotonically increasing if m  n implies f(m)  f(n). A function f(n) is monotonically decreasing if m  n implies f(m)  f(n). A function f(n) is strictly increasing if m < n implies f(m) < f(n). A function f(n) is strictly decreasing if m f(n).

Standard Notation and Common Functions  Floors and ceilings For any real number x, the greatest integer less than or equal to x is denoted by  x . For any real number x, the least integer greater than or equal to x is denoted by  x . For all real numbers x, x  1 <  x   x   x  < x+1. Both functions are monotonically increasing.

Standard Notation and Common Functions  Exponentials For all n and a  1, the function a n is the exponential function with base a and is monotonically increasing.  Logarithms Textbook adopts the following convention lg n = log 2 n (binary logarithm), ln n = log e n (natural logarithm), lg k n = (lg n) k (exponentiation), lg lg n = lg(lg n) (composition), lg n + k = (lg n)+k (precedence of lg). ai

Standard Notation and Common Functions
• Important relationships
For all real constants a and b such that a > 1, n^b = o(a^n); that is, any exponential function with a base strictly greater than 1 grows faster than any polynomial function.
For all real constants a and b such that a > 0, lg^b n = o(n^a); that is, any positive polynomial function grows faster than any polylogarithmic function.
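A numerical illustration (not a proof) of the first relationship, showing that even a slowly growing exponential such as 1.1^n eventually overtakes n^3; the specific sample values are my own choice:

```python
# Numerical illustration (not a proof): n^3 / 1.1^n tends to 0, so the
# exponential 1.1^n eventually dominates the polynomial n^3.
for n in (10, 100, 200, 500, 1000):
    ratio = n**3 / (1.1 ** n)
    print(f"n = {n:5}:  n^3 / 1.1^n = {ratio:.3e}")
# The ratio grows for moderate n, then shrinks toward 0 as n increases.
```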

• Factorials
For all n ≥ 0, the function n! ("n factorial") is given by
n! = n · (n − 1) · (n − 2) · (n − 3) · … · 2 · 1, with 0! = 1.
It can be established that
n! = o(n^n)
n! = ω(2^n)
lg(n!) = Θ(n lg n)
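These bounds follow from Stirling's approximation, stated here for reference in its standard form (not text from the slide):

```latex
% Stirling's approximation (standard statement, not from the slide):
n! \;=\; \sqrt{2\pi n}\,\left(\frac{n}{e}\right)^{n}
        \left(1 + \Theta\!\left(\frac{1}{n}\right)\right)
% Taking logarithms gives
\lg(n!) \;=\; n\lg n \;-\; n\lg e \;+\; O(\lg n) \;=\; \Theta(n \lg n).
```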

Standard Notation and Common Functions
• Functional iteration
The notation f^(i)(n) represents the function f applied iteratively i times to an initial value n, or, recursively:
f^(i)(n) = n if i = 0
f^(i)(n) = f(f^(i−1)(n)) if i > 0
Example: If f(n) = 2n, then
f^(2)(n) = f(2n) = 2(2n) = 2^2 n,
f^(3)(n) = f(f^(2)(n)) = 2(2^2 n) = 2^3 n,
and in general f^(i)(n) = 2^i n.
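A minimal Python sketch of functional iteration (the helper name iterate is my own) that confirms the 2^i · n example:

```python
# Minimal sketch (helper name is my own): apply f to n iteratively i times.
def iterate(f, n, i):
    """Return f^(i)(n): n when i == 0, otherwise f applied i times."""
    for _ in range(i):
        n = f(n)
    return n

double = lambda n: 2 * n
print(iterate(double, 5, 3))        # 2^3 * 5 = 40
print(all(iterate(double, 7, i) == (2 ** i) * 7 for i in range(10)))  # True
```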

Standard Notation and Common Functions
• Iterated logarithm function
The notation lg* n, which reads "log star of n", is defined as
lg* n = min { i ≥ 0 : lg^(i) n ≤ 1 }
Examples:
lg* 2 = 1
lg* 4 = 2
lg* 16 = 3
lg* 65536 = 4
lg* 2^65536 = 5
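A small Python sketch computing lg* (the function name log_star is my own) that reproduces the first few examples above:

```python
# Sketch (function name is my own): compute lg* n, the number of times
# the binary logarithm must be applied before the result drops to <= 1.
import math

def log_star(n):
    i = 0
    while n > 1:
        n = math.log2(n)
        i += 1
    return i

for n in (2, 4, 16, 65536):
    print(n, "->", log_star(n))   # 1, 2, 3, 4
```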

Asymptotic Running Time of Algorithms
• We consider algorithm A better than algorithm B if T_A(n) = o(T_B(n))
• Why is it acceptable to ignore the behavior of algorithms for small inputs?
• Why is it acceptable to ignore the constants?
• What do we gain by using asymptotic notation?

Things to Remember
• Asymptotic analysis studies how the values of functions compare as their arguments grow without bound.
• It ignores constant factors and the behavior of the functions for small arguments.
• This is acceptable because virtually all algorithms are fast enough on small inputs, and the growth of the running time matters more than constant factors.

Things to Remember
• By ignoring the usually unimportant details, we obtain a representation that succinctly describes the growth of a function as its argument grows, and thus allows us to compare algorithms in terms of their efficiency.