Asymptotic Notation

Faculty Name: Ruhi Fatima

Topics Covered: Theta Notation, Big-Oh Notation, Omega Notation, Standard Notations and Common Functions

CCSE-ICS353, prepared by: Shamiel H., Natalia A.

Asymptotic notations

Asymptotic notations are primarily used to describe the running times of algorithms. The running time of an algorithm is defined as the time needed by the algorithm to deliver its output when presented with legal input. In asymptotic notation, the running time of an algorithm is treated as a function of the input size. Here we consider asymptotic notations that are well suited to characterizing running times no matter what the input. There are three standard asymptotic notations:

O-notation: asymptotic "less than": f(n) = O(g(n)) implies f(n) "≤" g(n)
Ω-notation: asymptotic "greater than": f(n) = Ω(g(n)) implies f(n) "≥" g(n)
Θ-notation: asymptotic "equality": f(n) = Θ(g(n)) implies f(n) "=" g(n)

Θ (Theta)-Notation (Tight Bound)

DEFINITION: For a given function g(n), Θ(g(n)) is defined as the set
Θ(g(n)) = { f(n) : there exist positive constants c1, c2, and n0 such that 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0 }.

Consider an intuitive picture of functions f(n) and g(n) where f(n) = Θ(g(n)): for all values of n at and to the right of n0, the value of f(n) lies at or above c1·g(n) and at or below c2·g(n). So g(n) is an asymptotically tight bound for f(n).

Example: Is 10n² + 20n = Θ(n²)?
Answer: Let f(n) = 10n² + 20n, and take n0 = 1, c1 = 10, and c2 = 30. Then for all n ≥ 1, 10n² ≤ f(n) ≤ 30n² (the upper bound holds because 20n ≤ 20n² for n ≥ 1). So 10n² + 20n = Θ(n²) is true.
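As a quick numeric sanity check (an illustration of the definition, not a proof), the sandwich inequality from this example can be sampled in Python; a minimal sketch, assuming the constants c1 = 10, c2 = 30, n0 = 1 chosen above:

```python
# Spot-check the Theta bound 10*n^2 <= f(n) <= 30*n^2 for sampled n >= n0.
def f(n):
    return 10 * n**2 + 20 * n

c1, c2, n0 = 10, 30, 1  # constants from the worked example

for n in range(n0, 1000):
    assert c1 * n**2 <= f(n) <= c2 * n**2, f"bound fails at n={n}"
print("c1*g(n) <= f(n) <= c2*g(n) holds for all sampled n >= n0")
```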

O (Big-Oh)-Notation (Upper Bound)

DEFINITION: For a given function g(n), we denote by O(g(n)) (pronounced "big-oh of g of n" or sometimes just "oh of g of n") the set of functions
O(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0 }.

For all values of n at and to the right of n0, the value of f(n) lies at or below c·g(n). So g(n) is an asymptotic upper bound for f(n).

Example: Suppose f(n) = 4n² + 5n + 3. Is f(n) = O(n²)?
Solution:
f(n) = 4n² + 5n + 3
≤ 4n² + 5n² + 3n² for all n ≥ 1
= 12n².
So 4n² + 5n + 3 ≤ 12n² for all n ≥ 1, and hence f(n) = O(n²) with c = 12 and n0 = 1.
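The derived constant can be spot-checked the same way; a minimal sketch with c = 12 and n0 = 1 from the solution above:

```python
# Spot-check 4*n^2 + 5*n + 3 <= 12*n^2 for sampled n >= 1.
def f(n):
    return 4 * n**2 + 5 * n + 3

assert all(f(n) <= 12 * n**2 for n in range(1, 1000))
print("f(n) <= 12*n^2 on all sampled n >= 1, consistent with f(n) = O(n^2)")
```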

Ω (Omega)-Notation (Lower Bound)

DEFINITION: For a given function g(n), we denote by Ω(g(n)) (pronounced "big-omega of g of n" or sometimes just "omega of g of n") the set of functions
Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0 }.

For all values of n at and to the right of n0, the value of f(n) lies at or above c·g(n). So g(n) is an asymptotic lower bound for f(n).

Example: Let f(n) = 10n² + 20n. For all n ≥ 1, f(n) = 10n² + 20n ≥ 10n². Therefore f(n) = Ω(n²), as there exist a natural number n0 = 1 and a constant c = 10 > 0 such that f(n) ≥ c·g(n) for all n ≥ n0.
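And likewise for the lower bound, with c = 10 and n0 = 1 (sampled values only, not a proof):

```python
# Spot-check 10*n^2 + 20*n >= 10*n^2 for sampled n >= 1.
def f(n):
    return 10 * n**2 + 20 * n

assert all(f(n) >= 10 * n**2 for n in range(1, 1000))
print("f(n) >= 10*n^2 on all sampled n >= 1, consistent with f(n) = Omega(n^2)")
```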

Asymptotic notation in equations and inequalities

Theorem: For any two functions f(n) and g(n), we have f(n) = Θ(g(n)) if and only if f(n) = O(g(n)) and f(n) = Ω(g(n)).

In general, when asymptotic notation appears in a formula, we interpret it as standing for some anonymous function that we do not care to name. For example, the formula 2n² + 3n + 1 = 2n² + Θ(n) means that 2n² + 3n + 1 = 2n² + f(n), where f(n) is some function in the set Θ(n). In this case we may let f(n) = 3n + 1, which indeed is in Θ(n).

Using asymptotic notation in this manner helps eliminate inessential detail and clutter from an equation. For example, the worst-case running time of merge sort can be written as the recurrence T(n) = 2T(n/2) + Θ(n). Since our interest is only in the asymptotic behavior of T(n), the lower-order terms are all understood to be included in the anonymous function denoted by the term Θ(n).
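To see concretely that the lower-order terms hidden in Θ(n) do not affect the solution's growth, the recurrence can be evaluated numerically. A minimal sketch, assuming the concrete instance T(n) = 2T(n/2) + n with T(1) = 1 on powers-of-two inputs; the ratio T(n)/(n·lg n) settling near a constant is consistent with T(n) = Θ(n lg n):

```python
import math

def T(n):
    # T(n) = 2*T(n/2) + n, T(1) = 1 -- merge sort's shape, with the Theta(n)
    # term instantiated as exactly n.
    if n <= 1:
        return 1
    return 2 * T(n // 2) + n

for k in range(1, 16):
    n = 2**k
    print(n, T(n), round(T(n) / (n * math.log2(n)), 3))  # ratio approaches 1
```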

Cont…

The number of anonymous functions in an expression is understood to be equal to the number of times the asymptotic notation appears. When asymptotic notation appears on the left-hand side of an equation, as in 2n² + Θ(n) = Θ(n²), the equation is interpreted using the following rule: "No matter how the anonymous functions are chosen on the left of the equal sign, there is a way to choose the anonymous functions on the right of the equal sign to make the equation valid."

Thus our example means that for any function f(n) ∈ Θ(n), there is some function g(n) ∈ Θ(n²) such that 2n² + f(n) = g(n) for all n. In other words, the right-hand side of an equation provides a coarser level of detail than the left-hand side. We can chain together a number of such relationships, as in
2n² + 3n + 1 = 2n² + Θ(n) = Θ(n²).

o (little-oh)-Notation

We use o-notation to denote an upper bound that is not asymptotically tight. We formally define o(g(n)) ("little-oh of g of n") as the set
o(g(n)) = { f(n) : for any positive constant c > 0, there exists a constant n0 > 0 such that 0 ≤ f(n) < c·g(n) for all n ≥ n0 }.

Example: 2n = o(n²), but 2n² ≠ o(n²).

Intuitively, f(n) becomes insignificant relative to g(n) as n approaches infinity:
lim (n→∞) f(n)/g(n) = 0.
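The limit criterion can be checked symbolically; a short sketch using SymPy (assuming it is available) for the two cases in the example:

```python
from sympy import symbols, limit, oo

n = symbols('n', positive=True)
print(limit((2*n) / n**2, n, oo))      # 0 -> 2n = o(n^2)
print(limit((2*n**2) / n**2, n, oo))   # 2 (nonzero) -> 2n^2 != o(n^2)
```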

ω (little-omega)-Notation

For a given function g(n), we define the set
ω(g(n)) = { f(n) : for any positive constant c > 0, there exists a constant n0 > 0 such that 0 ≤ c·g(n) < f(n) for all n ≥ n0 }.

Example: n²/2 = ω(n), but n²/2 ≠ ω(n²).

Intuitively, f(n) becomes arbitrarily large relative to g(n) as n approaches infinity:
lim (n→∞) f(n)/g(n) = ∞.

Here g(n) is a lower bound for f(n) that is not asymptotically tight.
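The same SymPy check works for ω: the ratio must diverge to infinity, so a finite limit rules ω out:

```python
from sympy import symbols, limit, oo

n = symbols('n', positive=True)
print(limit((n**2 / 2) / n, n, oo))     # oo -> n^2/2 = w(n)
print(limit((n**2 / 2) / n**2, n, oo))  # 1/2 (finite) -> n^2/2 != w(n^2)
```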

Relational Properties

Transitivity:
f(n) = Θ(g(n)) and g(n) = Θ(h(n)) imply f(n) = Θ(h(n))
f(n) = O(g(n)) and g(n) = O(h(n)) imply f(n) = O(h(n))
f(n) = Ω(g(n)) and g(n) = Ω(h(n)) imply f(n) = Ω(h(n))
f(n) = o(g(n)) and g(n) = o(h(n)) imply f(n) = o(h(n))
f(n) = ω(g(n)) and g(n) = ω(h(n)) imply f(n) = ω(h(n))

Reflexivity:
f(n) = Θ(f(n))
f(n) = O(f(n))
f(n) = Ω(f(n))

Symmetry:
f(n) = Θ(g(n)) if and only if g(n) = Θ(f(n))

Transpose symmetry (complementarity):
f(n) = O(g(n)) if and only if g(n) = Ω(f(n))
f(n) = o(g(n)) if and only if g(n) = ω(f(n))

Comparison of Functions

The relations behave like comparisons of two real numbers a and b:
f(n) = O(g(n)) is like a ≤ b
f(n) = o(g(n)) is like a < b
f(n) = Ω(g(n)) is like a ≥ b
f(n) = ω(g(n)) is like a > b
f(n) = Θ(g(n)) is like a = b

Limits (all limits taken as n → ∞):
lim f(n)/g(n) = 0 ⟹ f(n) ∈ o(g(n))
lim f(n)/g(n) < ∞ ⟹ f(n) ∈ O(g(n))
0 < lim f(n)/g(n) < ∞ ⟹ f(n) ∈ Θ(g(n))
lim f(n)/g(n) > 0 ⟹ f(n) ∈ Ω(g(n))
lim f(n)/g(n) = ∞ ⟹ f(n) ∈ ω(g(n))
lim f(n)/g(n) undefined ⟹ can't say
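These limit rules can be bundled into a small classifier for well-behaved ratios; a sketch (the helper name `compare` and the use of SymPy are my own choices, not part of the slides):

```python
from sympy import symbols, limit, oo

n = symbols('n', positive=True)

def compare(f, g):
    """Classify f vs g by lim f/g as n -> oo, per the rules above."""
    L = limit(f / g, n, oo)
    if L == 0:
        return "f = o(g), hence also f = O(g)"
    if L == oo:
        return "f = w(g), hence also f = Omega(g)"
    return "f = Theta(g)"  # finite nonzero limit

print(compare(2*n + 3, n**2))         # o
print(compare(10*n**2 + 20*n, n**2))  # Theta
print(compare(n**2 / 2, n))           # w
```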

Exercise

Let f(n) and g(n) be asymptotically nonnegative functions. Using the basic definition of Θ-notation, prove that max(f(n), g(n)) = Θ(f(n) + g(n)).

Show that for any real constants a and b, where b > 0, (n + a)^b = Θ(n^b).

Standard notations and common functions

Monotonicity: A function f(n) is monotonically increasing if m ≤ n implies f(m) ≤ f(n). Similarly, it is monotonically decreasing if m ≤ n implies f(m) ≥ f(n). A function f(n) is strictly increasing if m < n implies f(m) < f(n), and strictly decreasing if m < n implies f(m) > f(n).

Floors and ceilings: For any real number x, we denote the greatest integer less than or equal to x by ⌊x⌋ (read "the floor of x") and the least integer greater than or equal to x by ⌈x⌉ (read "the ceiling of x"). For all real x,
x − 1 < ⌊x⌋ ≤ x ≤ ⌈x⌉ < x + 1.

Cont…

For any integer n,
⌈n/2⌉ + ⌊n/2⌋ = n.
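Both floor/ceiling facts are easy to sample with Python's math module (a sketch; sampled values only):

```python
import math

# x - 1 < floor(x) <= x <= ceil(x) < x + 1 on a few sampled reals.
for x in [-2.5, -1.0, 0.0, 0.3, 1.7, 4.0]:
    assert x - 1 < math.floor(x) <= x <= math.ceil(x) < x + 1

# ceil(n/2) + floor(n/2) = n on sampled integers.
for n in range(-10, 11):
    assert math.ceil(n / 2) + math.floor(n / 2) == n

print("floor/ceiling identities hold on all sampled values")
```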

Exponentials & Logarithms

Exponentials: For all real a > 0, m, and n, we have the following identities:
a^0 = 1, a^1 = a, a^(−1) = 1/a, (a^m)^n = a^(mn) = (a^n)^m, a^m · a^n = a^(m+n).

Logarithms: x = log_b a is the exponent for a = b^x.
Natural log: ln a = log_e a
Binary log: lg a = log_2 a
Exponentiation: lg² a = (lg a)²
Composition: lg lg a = lg (lg a)
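The notations can be illustrated with the math module (values chosen arbitrarily for the sketch):

```python
import math

a = 64.0
print(math.log(a))               # natural log: ln a
print(math.log2(a))              # binary log:  lg a = 6.0
print(math.log2(a) ** 2)         # lg^2 a = (lg a)^2 = 36.0
print(math.log2(math.log2(a)))   # lg lg a = lg 6, about 2.585
print(math.log(a, 2))            # change of base: log_b a via math.log(a, b)
```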

Functional iteration

We use the notation f^(i)(n) to denote the function f(n) iteratively applied i times to an initial value of n. Formally, let f(n) be a function over the reals. For nonnegative integers i, we recursively define
f^(0)(n) = n, and f^(i)(n) = f(f^(i−1)(n)) for i > 0.

For example, if f(n) = 2n, then f^(i)(n) = 2^i · n.
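The recursive definition transcribes directly into code; a minimal sketch using the doubling function from the example (the helper name `f_iter` is my own):

```python
def f_iter(f, i, n):
    """Compute f^(i)(n): f applied i times to the initial value n."""
    for _ in range(i):
        n = f(n)
    return n

double = lambda n: 2 * n
# With f(n) = 2n, f^(i)(n) = 2^i * n, as stated above.
assert f_iter(double, 5, 3) == 2**5 * 3
print(f_iter(double, 5, 3))  # 96
```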