O, , and  Notations Lecture 39 Section 9.2 Wed, Mar 30, 2005.

Ω Notation. Let g : R → R be a function. A function f : R → R is "of order at least g," written "f(x) is Ω(g(x))," if there exist a positive real number M and a real number x₀ such that |f(x)| ≥ M|g(x)| for all x ≥ x₀.

"Big-Oh" Notation. A function f : R → R is "of order at most g," written "f(x) is O(g(x))," if there exist a positive real number M and a real number x₀ such that |f(x)| ≤ M|g(x)| for all x ≥ x₀.

Θ Notation. A function f : R → R is "of order g," written "f(x) is Θ(g(x))," if there exist positive real numbers M₁, M₂ and a real number x₀ such that M₁|g(x)| ≤ |f(x)| ≤ M₂|g(x)| for all x ≥ x₀.
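These definitions can be sanity-checked numerically (this is evidence, not a proof). The sketch below picks concrete witnesses M and x₀ for an illustrative f and g of our own choosing and verifies the defining inequalities on a sample of points.

```python
# Numeric sanity check of the O / Omega definitions: pick witnesses M and x0
# and verify the defining inequality on sampled points x >= x0.
# f, g, and the witness constants are illustrative choices, not from the slides.

def satisfies_big_oh(f, g, M, x0, samples=range(1, 10001)):
    """Check |f(x)| <= M * |g(x)| at every sampled x >= x0."""
    return all(abs(f(x)) <= M * abs(g(x)) for x in samples if x >= x0)

def satisfies_big_omega(f, g, M, x0, samples=range(1, 10001)):
    """Check |f(x)| >= M * |g(x)| at every sampled x >= x0."""
    return all(abs(f(x)) >= M * abs(g(x)) for x in samples if x >= x0)

f = lambda x: 3 * x**2 + 5 * x
g = lambda x: x**2

# f is O(x^2) with witnesses M = 4, x0 = 5, and Omega(x^2) with M = 3, x0 = 1,
# hence Theta(x^2).
print(satisfies_big_oh(f, g, M=4, x0=5))     # True
print(satisfies_big_omega(f, g, M=3, x0=1))  # True
```

Exhibiting one pair of witnesses suffices for the sampled range; a real proof must argue the inequality for all x ≥ x₀.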

Growth Rates. If f(x) is O(g(x)), then the growth rate of f is no greater than the growth rate of g, and may be less. If f(x) is Ω(g(x)), then the growth rate of f is no less than the growth rate of g, and may be greater.

Growth Rates. Theorem: If f(x) is O(g(x)) and g(x) is O(f(x)), then f(x) is Θ(g(x)). Proof: If f(x) is O(g(x)), then there exist M₁, x₁ ∈ R such that |f(x)| ≤ M₁|g(x)| for all x ≥ x₁. If g(x) is O(f(x)), then there exist M₂, x₂ ∈ R such that |g(x)| ≤ M₂|f(x)| for all x ≥ x₂.

Growth Rates. Let x₃ = max(x₁, x₂). Then (1/M₂)|g(x)| ≤ |f(x)| ≤ M₁|g(x)| for all x ≥ x₃. Therefore, f(x) is Θ(g(x)). We call Θ(f(x)) the growth rate of f.

Growth Rates. Theorem: f(x) is Θ(g(x)) if and only if f(x) is Ω(g(x)) and f(x) is O(g(x)). Proof: Show that if f(x) is Ω(g(x)), then g(x) is O(f(x)). Then apply the previous theorem.

Common Growth Rates

  Growth Rate      Example
  Θ(1)             Access an array element
  Θ(log₂ x)        Binary search
  Θ(x)             Sequential search
  Θ(x log₂ x)      Quick sort
  Θ(x²)            Bubble sort
  Θ(2ˣ)            Factor an integer
  Θ(x!)            Traveling salesman problem
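The first few rows of the table can be illustrated by counting comparisons. The two search routines below are standard textbook versions written for this sketch, not code from the lecture; each returns the number of comparisons performed in the worst case (target absent).

```python
# Sequential search is Theta(n): it may examine every element.
# Binary search is Theta(log2 n): it halves the search interval each step.

def sequential_search_steps(data, target):
    steps = 0
    for item in data:
        steps += 1
        if item == target:
            break
    return steps

def binary_search_steps(data, target):
    """data must be sorted; counts loop iterations."""
    steps, lo, hi = 0, 0, len(data) - 1
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if data[mid] == target:
            break
        elif data[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return steps

n = 1_000_000
data = list(range(n))
print(sequential_search_steps(data, -1))  # 1000000 comparisons (worst case)
print(binary_search_steps(data, -1))      # about log2(1000000) ~ 20
```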

Transitivity of O(f). Theorem: Let f, g, and h be functions from R to R. If f(x) is O(g(x)) and g(x) is O(h(x)), then f(x) is O(h(x)). Proof: If f(x) is O(g(x)), then |f(x)| ≤ M₁|g(x)| for all x ≥ x₁, for some M₁ and x₁. If g(x) is O(h(x)), then |g(x)| ≤ M₂|h(x)| for all x ≥ x₂, for some M₂ and x₂.

Transitivity of O(f). Let x₃ = max(x₁, x₂). Then |f(x)| ≤ M₁M₂|h(x)| for all x ≥ x₃. Therefore, f(x) is O(h(x)).

Power Functions. Theorem: If 0 ≤ a < b, then xᵃ is O(xᵇ), but xᵇ is not O(xᵃ). Proof: Since a < b, for all x > 1, x^(a-b) < 1. Therefore, xᵃ < xᵇ for all x > 1, so xᵃ is O(xᵇ). On the other hand, suppose xᵇ ≤ Mxᵃ for some M and for all x > x₀, for some x₀.

Power Functions. Then x^(b-a) ≤ M for all x > x₀. But b - a > 0, so x^(b-a) increases without bound. This is a contradiction, so xᵇ is not O(xᵃ).

Application. Any polynomial of degree d is Ω(xᵐ), if m < d; Θ(xᵈ); and O(xⁿ), if n > d. For example, 16x³ - 10x² + 3x - 12 is Ω(x²), Θ(x³), and O(x⁴).
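The Θ(x³) claim for the example polynomial can be checked numerically by sandwiching it between constant multiples of x³. The witnesses M₁ = 15, M₂ = 16, x₀ = 10 below are our own choices, not the slides'.

```python
# Numeric check (not a proof) that p(x) = 16x^3 - 10x^2 + 3x - 12 is Theta(x^3):
# 15*x^3 <= p(x) <= 16*x^3 holds at every sampled x >= 10.

p = lambda x: 16 * x**3 - 10 * x**2 + 3 * x - 12

sandwiched = all(15 * x**3 <= p(x) <= 16 * x**3 for x in range(10, 10001))
print(sandwiched)  # True
```

The upper bound holds because -10x² + 3x - 12 < 0 for x ≥ 1; the lower bound because x²(x - 10) + 3x - 12 ≥ 0 for x ≥ 10.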

Logarithmic Functions. Lemma: a^(log_a x) = x for all x > 0. Proof: It is true by the definition of the log_a function. It also follows from the fact that the functions f(x) = aˣ and g(x) = log_a x are inverses of each other.

Logarithmic Functions. Theorem: For all a, b > 1, log_a x is Θ(log_b x). Proof: By the change-of-base formula, log_a x = (log_b x)/(log_b a), so log_a x is a constant multiple of log_b x.

Application. Since all logarithmic functions are of the same order, it doesn't matter which base we use.
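The change-of-base relationship is easy to see numerically: the ratio of two logarithms is the same constant at every point, which is exactly why the base is irrelevant up to Θ.

```python
import math

# log2(x) / log10(x) equals the constant log(10)/log(2) ~ 3.3219 for all x > 1,
# so log2 and log10 are constant multiples of each other.

ratios = [math.log2(x) / math.log10(x) for x in (10, 1000, 10**6, 10**9)]
print(ratios)  # every entry ~ 3.3219...
```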

Logarithmic Functions vs. Power Functions. Theorem: For all a > 1 and b > 0, log_a x is O(xᵇ), but xᵇ is not O(log_a x). Proof (using Calc II): By L'Hôpital's rule, the limit of (log_a x)/xᵇ as x → ∞ equals the limit of 1/(b xᵇ ln a), which is 0.

Application. Every polynomial function has a faster growth rate than every logarithmic function. If a function mixes polynomial terms and logarithmic terms, then the polynomial terms "dominate." For example, x² + 3x log₁₀(5x + 2) is Θ(x²).
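The domination of even a tiny power over a logarithm can be observed directly: the ratio log₂(x)/xᵇ eventually heads toward 0. The exponent b = 0.1 below is an arbitrary small choice.

```python
import math

# For any b > 0, log2(x) / x**b -> 0 as x -> infinity. With b = 0.1 the decay
# is slow, so we sample very large x to make it visible.

xs = [10**k for k in (6, 12, 18, 24, 30)]
ratios = [math.log2(x) / x**0.1 for x in xs]
print(ratios)  # strictly decreasing, heading toward 0
```

Note that for small b the ratio first rises before falling; the limit statement only concerns sufficiently large x.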

Exponential Functions. Theorem: If 1 < a < b, then aˣ is O(bˣ), but bˣ is not O(aˣ). Proof: Clearly, aˣ < bˣ for all x ≥ 1. Therefore, aˣ is O(bˣ). We must show that bˣ is not O(aˣ). Suppose that it is. Then bˣ ≤ Maˣ for all x > x₀, for some M, x₀.

Exponential Functions. Then (b/a)ˣ ≤ M for all x > x₀. But b/a > 1, so (b/a)ˣ increases without bound. This contradicts the assumption that the inequality holds for all x > x₀.

Application. Any exponential function with base a is Ω(bˣ), if b < a; Θ(aˣ); and O(bˣ), if b > a. For example, 5 · 8ˣ is Ω(2ˣ), Θ(8ˣ), and O(10ˣ).
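Both claims for the example 5 · 8ˣ show up in simple ratios: dividing by 8ˣ gives a constant, while dividing by 10ˣ gives a quantity that shrinks toward 0.

```python
# f(x) / 8**x stays at the constant 5  -> f is Theta(8^x).
# f(x) / 10**x shrinks toward 0        -> f is O(10^x) but not Omega(10^x).

f = lambda x: 5 * 8**x

theta_ratios = [f(x) / 8**x for x in range(1, 40)]
oh_ratios = [f(x) / 10**x for x in (10, 20, 30)]
print(theta_ratios[0], oh_ratios)
```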

Power Functions vs. Exponential Functions. Theorem: For all a > 0 and b > 1, xᵃ is O(bˣ), but bˣ is not O(xᵃ).

Application. If a function is a mix of polynomial terms, logarithmic terms, and exponential terms, then the exponential term with the highest base "dominates." For example, 5x log₂ x · 3ˣ + 28 · 5ˣ is Θ(5ˣ).
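Dividing the mixed expression by 5ˣ makes the domination visible: the 3ˣ term, even with its x log x factor, fades, and the ratio settles at the constant 28 (assuming the expression as reconstructed above).

```python
import math

# (5x log2(x) * 3**x + 28 * 5**x) / 5**x: the first term behaves like
# 5x log2(x) * (3/5)**x, which tends to 0, leaving the constant 28.

ratio = lambda x: (5 * x * math.log2(x) * 3**x + 28 * 5**x) / 5**x
print(ratio(10), ratio(100))  # ~29.0 at x=10, essentially 28 by x=100
```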

Benefits of O, Ω, and Θ. Clearly, the Θ-notation is useful because it singles out the term that best represents the growth rate of the function (for large x). The O-notation is useful when we can't pin down the exact growth rate, but we can put an upper bound on it. Similarly, the Ω-notation expresses a lower bound.

Multiplicativity of O(f). Theorem: Let f, g, h, and k be functions from R to R. If f(x) is O(h(x)) and g(x) is O(k(x)), then f(x)g(x) is O(h(x)k(x)). Proof: Suppose |f(x)| ≤ M₁|h(x)| for all x ≥ x₁ and |g(x)| ≤ M₂|k(x)| for all x ≥ x₂, for some M₁, M₂, x₁, x₂. Then |f(x)g(x)| ≤ M₁M₂|h(x)k(x)| for all x ≥ max(x₁, x₂).

Application. We know that log x is O(x) and that x is O(x), so x log x is O(x²). More generally, xᵈ is O(xᵈ log x), and xᵈ log x is O(x^(d+1)).
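The x log x versus x² comparison can be seen in their ratio, which is log₂(x)/x and shrinks toward 0:

```python
import math

# (x * log2(x)) / x**2 simplifies to log2(x) / x, which tends to 0,
# consistent with x log x being O(x^2) (and of strictly smaller order).

ratios = [(x * math.log2(x)) / x**2 for x in (10, 100, 1000, 10000)]
print(ratios)  # strictly decreasing
```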