Growth of Functions (CIS 606, Spring 2010)

Asymptotic Notation: O-notation. O(g(n)) = { f(n): there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c g(n) for all n ≥ n0 }.

Asymptotic Notation. g(n) is an asymptotic upper bound for f(n). If f(n) ∈ O(g(n)), we write f(n) = O(g(n)). Example: 2n^2 = O(n^3) with c = 1 and n0 = 2. Examples of functions in O(n^2) include n^2, n^2 + n, and 1000n^2 + 1000n, and also slower-growing functions such as n and n^1.5.
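
A minimal numeric sanity check of this example (an addition, not part of the original slides; the helper name holds_O is illustrative, and sampling a finite range only illustrates the bound rather than proving it):

    # Check the O-notation witnesses for 2n^2 = O(n^3): c = 1, n0 = 2.
    def holds_O(f, g, c, n0, n_max=10_000):
        """True if 0 <= f(n) <= c * g(n) for every n0 <= n <= n_max."""
        return all(0 <= f(n) <= c * g(n) for n in range(n0, n_max + 1))

    print(holds_O(lambda n: 2 * n**2, lambda n: n**3, c=1, n0=2))  # True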

Ω-Notation Ω(g(n)) = { f(n): there exist positive constants c and n0 such that 0 ≤ c g(n) ≤ f(n) for all n ≥ n0 }.

Ω-Notation. g(n) is an asymptotic lower bound for f(n). Example: √n = Ω(lg n) with c = 1 and n0 = 16. Examples of functions in Ω(n^2) include n^2, n^2 + n, and also faster-growing functions such as n^3 and 2^n.
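
A similar added sketch checks the Ω-notation witnesses for √n = Ω(lg n) and shows that the inequality first holds at the quoted n0 = 16 (it fails just below):

    import math

    # Check c * lg n <= sqrt(n) for c = 1 from n0 = 16 onward, and show it fails just below n0.
    def holds_Omega(f, g, c, n0, n_max=10_000):
        """True if 0 <= c * g(n) <= f(n) for every n0 <= n <= n_max."""
        return all(0 <= c * g(n) <= f(n) for n in range(n0, n_max + 1))

    print(holds_Omega(math.sqrt, math.log2, c=1, n0=16))  # True
    print(math.sqrt(15) >= math.log2(15))                 # False: 3.873 < 3.907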

Θ-notation. Θ(g(n)) = { f(n): there exist positive constants c1, c2, and n0 such that 0 ≤ c1 g(n) ≤ f(n) ≤ c2 g(n) for all n ≥ n0 }.

Θ-Notation. g(n) is an asymptotically tight bound for f(n). Example: n^2/2 - 2n = Θ(n^2) with c1 = 1/4, c2 = 1/2, and n0 = 16. Theorem: f(n) = Θ(g(n)) if and only if f(n) = O(g(n)) and f(n) = Ω(g(n)). Consequently, leading constants and lower-order terms don't matter.
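
A corresponding added check for the Θ example, using the constants quoted on the slide (again a finite-range sanity check, not a proof):

    # Check c1 * n^2 <= n^2/2 - 2n <= c2 * n^2 for c1 = 1/4, c2 = 1/2, from n0 = 16 onward.
    def holds_Theta(f, g, c1, c2, n0, n_max=10_000):
        """True if 0 <= c1 * g(n) <= f(n) <= c2 * g(n) for every n0 <= n <= n_max."""
        return all(0 <= c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, n_max + 1))

    print(holds_Theta(lambda n: n**2 / 2 - 2 * n, lambda n: n**2,
                      c1=0.25, c2=0.5, n0=16))  # True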

Asymptotic Notations in Equations. When asymptotic notation appears on the right-hand side of an equation, it stands for some anonymous function in the set. For example, 2n^2 + 3n + 1 = 2n^2 + Θ(n) means 2n^2 + 3n + 1 = 2n^2 + f(n) for some f(n) ∈ Θ(n); in particular, f(n) = 3n + 1. When asymptotic notation appears on the left-hand side: no matter how the anonymous functions on the left-hand side are chosen, there is a way to choose the anonymous functions on the right-hand side to make the equation valid.

Asymptotic Notations in Equations. Interpret 2n^2 + Θ(n) = Θ(n^2) as meaning: for every function f(n) ∈ Θ(n), there exists a function g(n) ∈ Θ(n^2) such that 2n^2 + f(n) = g(n). Such equations can be chained together: 2n^2 + 3n + 1 = 2n^2 + Θ(n) = Θ(n^2).

Asymptotic Notations in Equations. Interpretation of the chain: the first equation says there exists f(n) ∈ Θ(n) such that 2n^2 + 3n + 1 = 2n^2 + f(n); the second says that for all g(n) ∈ Θ(n) (such as the f(n) used to make the first equation hold), there exists h(n) ∈ Θ(n^2) such that 2n^2 + g(n) = h(n).

o-notation. o(g(n)) = { f(n): for all constants c > 0, there exists a constant n0 > 0 such that 0 ≤ f(n) < c g(n) for all n ≥ n0 }. Another view, probably easier to use: f(n) = o(g(n)) if and only if the limit of f(n)/g(n) as n → ∞ is 0.

ω-notation. ω(g(n)) = { f(n): for all constants c > 0, there exists a constant n0 > 0 such that 0 ≤ c g(n) < f(n) for all n ≥ n0 }. Another view, probably easier to use: f(n) = ω(g(n)) if and only if the limit of f(n)/g(n) as n → ∞ is ∞.
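
To see the limit view of o and ω numerically, the short added sketch below (the example pair f(n) = n lg n and g(n) = n^2 is an assumption chosen for illustration) prints the ratio f(n)/g(n), which heads to 0, and its reciprocal, which grows without bound:

    import math

    # n lg n = o(n^2): the ratio tends to 0.  n^2 = omega(n lg n): the reciprocal tends to infinity.
    for n in (10, 10**3, 10**6, 10**9):
        ratio = (n * math.log2(n)) / n**2
        print(f"n = {n:>10}:  n lg n / n^2 = {ratio:.2e},  n^2 / (n lg n) = {1 / ratio:.2e}")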

Comparisons of Functions. Relational properties. Transitivity: f(n) = Θ(g(n)) and g(n) = Θ(h(n)) imply f(n) = Θ(h(n)); the same holds for O, Ω, o, and ω. Reflexivity: f(n) = Θ(f(n)); the same holds for O and Ω. Symmetry: f(n) = Θ(g(n)) if and only if g(n) = Θ(f(n)). Transpose symmetry: f(n) = O(g(n)) if and only if g(n) = Ω(f(n)), and f(n) = o(g(n)) if and only if g(n) = ω(f(n)).

Comparisons of Functions. f(n) is asymptotically larger than g(n) if f(n) = ω(g(n)), and asymptotically smaller than g(n) if f(n) = o(g(n)). There is no trichotomy: although intuitively we can liken O to ≤ and Ω to ≥, unlike real numbers, where exactly one of a < b, a = b, or a > b holds, two functions may be asymptotically incomparable. Example: n^(1 + sin n) versus n, since the exponent 1 + sin n oscillates between 0 and 2.
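
A small added sketch makes the oscillation concrete: for integer n the exponent 1 + sin n keeps swinging between values near 0 and values near 2, so n^(1 + sin n) repeatedly dips below and climbs far above n, and no constant-factor bound relative to n can hold for all large n:

    import math

    # The exponent 1 + sin n oscillates in [0, 2], so n^(1 + sin n) alternates between
    # roughly constant values and roughly quadratic values.
    for n in (8, 11, 33, 36):
        e = 1 + math.sin(n)
        print(f"n = {n:2d}:  exponent = {e:.3f},  n^(1+sin n) = {n**e:10.2f}  (compare n = {n})")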

Standard Notations and Common Functions. Monotonicity: f(n) is monotonically increasing if m ≤ n ⇒ f(m) ≤ f(n); monotonically decreasing if m ≤ n ⇒ f(m) ≥ f(n); strictly increasing if m < n ⇒ f(m) < f(n); strictly decreasing if m < n ⇒ f(m) > f(n).

Standard Notations and Common Functions. Exponentials. Useful identities: a^(-1) = 1/a, (a^m)^n = a^(mn), and a^m a^n = a^(m+n). We can relate the growth rates of exponentials and polynomials: for all real constants a and b such that a > 1, the limit of n^b / a^n as n → ∞ is 0, which implies that n^b = o(a^n). Also, for all real x, e^x ≥ 1 + x.
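
The following added sketch illustrates that limit with a deliberately small base and large degree (a = 1.1 and b = 10 are assumptions chosen only for illustration): the polynomial dominates at first, but the ratio n^b / a^n eventually collapses toward 0:

    # n^b / a^n -> 0 for any real a > 1 and b, illustrated with a = 1.1 and b = 10.
    a, b = 1.1, 10
    for n in (10, 100, 500, 2_000, 5_000):
        print(f"n = {n:>5}:  n^{b} / {a}^n = {n**b / a**n:.3e}")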

Standard Notations and Common Functions. Logarithms. lg n = log_2 n (binary logarithm); ln n = log_e n (natural logarithm); lg^k n = (lg n)^k (exponentiation); lg lg n = lg(lg n) (composition). Useful identities, for all real a > 0, b > 0, c > 0, and n, where logarithm bases are not 1: a = b^(log_b a), log_c(ab) = log_c a + log_c b, log_b(a^n) = n log_b a, log_b a = (log_c a)/(log_c b), log_b(1/a) = -log_b a, log_b a = 1/(log_a b), and a^(log_b c) = c^(log_b a).

Standard Notations and Common Functions. Factorials. n! = 1 ⋅ 2 ⋅ 3 ⋅ … ⋅ n, with the special case 0! = 1. Stirling's approximation, n! = √(2πn) (n/e)^n (1 + Θ(1/n)), can be used to derive that lg(n!) = Θ(n lg n).
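
An added numeric check of the Θ(n lg n) claim (a sketch using math.lgamma, which returns ln Γ(n + 1) = ln(n!)): lg(n!) stays within a constant factor of n lg n, and the ratio climbs toward 1 as n grows:

    import math

    # lg(n!) versus n lg n; lgamma(n + 1) = ln(n!), so divide by ln 2 to get a base-2 log.
    for n in (10, 10**3, 10**6):
        lg_fact = math.lgamma(n + 1) / math.log(2)
        n_lg_n = n * math.log2(n)
        print(f"n = {n:>8}:  lg(n!) = {lg_fact:14.1f},  n lg n = {n_lg_n:14.1f},  "
              f"ratio = {lg_fact / n_lg_n:.3f}")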