1 Section 5.6 Comparing Rates of Growth

We often need to compare functions ƒ and g to see whether ƒ(n) and g(n) are about the same or whether one grows faster as n increases. The functions might represent running times of algorithms that we need to compare. They might also be approximations whose closed forms are easier to evaluate. To compare functions we need to give precise meanings to the phrases “about the same” and “grows faster.”

Big Oh. The growth rate of ƒ is bounded above by the growth rate of g if there are constants d > 0 and m such that |ƒ(n)| ≤ d|g(n)| for n ≥ m. If this is the case we write ƒ(n) = O(g(n)) and say, “ƒ(n) is big oh of g(n).”
Examples/Quizzes. ƒ(n) = O(ƒ(n)), 200n = O((1/100)n), n = O(n²).

Big Omega. The growth rate of ƒ is bounded below by the growth rate of g if there are constants c > 0 and m such that c|g(n)| ≤ |ƒ(n)| for n ≥ m. If this is the case we write ƒ(n) = Ω(g(n)) and say, “ƒ(n) is big omega of g(n).”
Examples/Quizzes. ƒ(n) = Ω(ƒ(n)), (1/100)n = Ω(200n), n² = Ω(n).
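The big-oh and big-omega definitions lend themselves to a direct numeric spot check: pick candidate witnesses (d and m, or c and m) and verify the inequality over a range of n. The following is a minimal sketch in Python, assuming the witness constants shown in the comments; the helper name holds_big_oh is ours, not from the text.

# Sketch: spot-checking big-oh witnesses d and m for f(n) = O(g(n)).
def holds_big_oh(f, g, d, m, n_max=10_000):
    # Finite check of |f(n)| <= d*|g(n)| for m <= n <= n_max (evidence, not a proof).
    return all(abs(f(n)) <= d * abs(g(n)) for n in range(m, n_max + 1))

# n = O(n^2): witnesses d = 1, m = 1, since n <= n^2 for n >= 1.
print(holds_big_oh(lambda n: n, lambda n: n * n, d=1, m=1))                # True

# 200n = O((1/100)n): witnesses d = 20000, m = 1, since 200n = 20000*(n/100).
print(holds_big_oh(lambda n: 200 * n, lambda n: n / 100, d=20000, m=1))    # True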

2 Big Theta. ƒ has the same growth rate as g if there are constants c > 0, d > 0 and m such that c|g(n)| ≤ |ƒ(n)| ≤ d|g(n)| for n ≥ m (or, if g(n) ≠ 0, then c ≤ |ƒ(n)/g(n)| ≤ d for n ≥ m). If this is the case we write ƒ(n) = Θ(g(n)) and say, “ƒ(n) is big theta of g(n).”

Examples.
1. ⌊n⌋ = Θ(⌈n⌉). Proof: (1/2)·⌈n⌉ ≤ ⌊n⌋ ≤ 1·⌈n⌉ for n ≥ 1. QED.
2. n(n + 1)/2 = Θ(n²). Proof: (1/2)·n² ≤ n(n + 1)/2 ≤ 2·n² for n ≥ 0. QED.

Theorem. If lim (n→∞) |ƒ(n)/g(n)| = c and c ≠ 0 and c ≠ ∞, then ƒ(n) = Θ(g(n)).

Example. lim (n→∞) [n(n + 1)/2]/n² = 1/2. So n(n + 1)/2 = Θ(n²).

Example. The converse of the theorem is false. For example, let ƒ(n) = (1 + n mod 2)n² and let g(n) = n². Then ƒ(n) = Θ(g(n)) by letting c = 1, d = 2, and m = 0. But the following limit does not exist: lim (n→∞) ƒ(n)/g(n) = lim (n→∞) (1 + n mod 2), which alternates between 1 (even n) and 2 (odd n).
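The limit theorem also suggests a numeric heuristic: tabulate ƒ(n)/g(n) at increasing n and see whether the ratio settles at a nonzero constant. A minimal Python sketch, assuming the sample points chosen here (the helper name ratio_samples is ours):

# Sketch: tabulating f(n)/g(n) to see whether the ratio settles.
def ratio_samples(f, g, ns):
    return [round(f(n) / g(n), 4) for n in ns]

ns = [10, 100, 1001, 10_000, 100_001]   # a mix of even and odd sample points

# n(n+1)/2 versus n^2: the ratio tends to 1/2, so the theorem gives Theta(n^2).
print(ratio_samples(lambda n: n * (n + 1) / 2, lambda n: n * n, ns))

# (1 + n mod 2)*n^2 versus n^2: the ratio alternates between 1 and 2,
# so the limit does not exist even though f(n) = Theta(g(n)) holds directly.
print(ratio_samples(lambda n: (1 + n % 2) * n * n, lambda n: n * n, ns))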

3 Little oh. ƒ has lower growth rate than g if lim (n→∞) ƒ(n)/g(n) = 0. If this is the case we write ƒ(n) = o(g(n)) and say, “ƒ(n) is little oh of g(n).”

Example. lim (n→∞) log(log n)/log n = lim (n→∞) 1/log n = 0 (use base e and L’Hospital’s rule). Therefore, log(log n) = o(log n).

Quiz. Show that log(n + 1) = Θ(log n).
Solution (using the definition). Since n < n + 1 < n² for n ≥ 2, it follows that log n < log(n + 1) < log n² = 2 log n for n ≥ 2. So let c = 1, d = 2, and m = 2. Therefore, log(n + 1) = Θ(log n). QED.
Solution (using the theorem). lim (n→∞) log(n + 1)/log n = lim (n→∞) n/(n + 1) = 1 (use base e and L’Hospital’s rule). Therefore, log(n + 1) = Θ(log n). QED.
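As a numeric companion to these two limits, a short Python sketch (using the standard math.log, which is base e; the sample points are arbitrary): log(log n)/log n drifts toward 0, while log(n + 1)/log n approaches 1.

import math

# Sketch: the two limits from this slide, evaluated at increasing n.
for n in [10, 10**3, 10**6, 10**12]:
    little_oh_ratio = math.log(math.log(n)) / math.log(n)   # tends to 0: log(log n) = o(log n)
    theta_ratio = math.log(n + 1) / math.log(n)             # tends to 1: log(n + 1) = Theta(log n)
    print(n, round(little_oh_ratio, 4), round(theta_ratio, 6))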

4 Example/Quiz. Show that if ƒ(n) = o(g(n)), then ƒ(n) = O(g(n)).
Proof: Since ƒ(n) = o(g(n)), it follows that lim (n→∞) ƒ(n)/g(n) = 0. So for any ε > 0 there is an m such that |ƒ(n)/g(n)| < ε for n ≥ m. In other words, |ƒ(n)| ≤ ε|g(n)| for n ≥ m. So ƒ(n) = O(g(n)). QED.

Example. Let ƒ and g be defined by
ƒ(n) = if n is odd then 1 else n
g(n) = n.
Show that ƒ(n) = O(g(n)), but ƒ(n) ≠ o(g(n)) and ƒ(n) ≠ Θ(g(n)).
Proof: We have ƒ(n) ≤ g(n) for n ≥ 1. So ƒ(n) = O(g(n)). The limit as n approaches infinity of ƒ(n)/g(n) does not exist. So ƒ(n) ≠ o(g(n)). Assume, by way of contradiction, that ƒ(n) = Θ(g(n)). Then there are constants c > 0, d > 0 and m such that c|g(n)| ≤ |ƒ(n)| ≤ d|g(n)| for n ≥ m. So c|n| ≤ |1| = 1 if n is odd and n ≥ m, which is a contradiction. So ƒ(n) ≠ Θ(g(n)). QED.
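A brief Python sketch (the lambdas simply transcribe the definitions of ƒ and g above) of why this example behaves that way: the ratio ƒ(n)/g(n) jumps between 1/n at odd n and 1 at even n, so it neither converges nor stays bounded below by a positive constant.

# Sketch: f(n)/g(n) for f(n) = 1 if n is odd else n, and g(n) = n.
f = lambda n: 1 if n % 2 == 1 else n
g = lambda n: n

print([round(f(n) / g(n), 4) for n in range(1, 11)])
# [1.0, 1.0, 0.3333, 1.0, 0.2, 1.0, 0.1429, 1.0, 0.1111, 1.0]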

5 Example/Quiz. Show that if ƒ(n) = o(g(n)), then g(n) = Ω(ƒ(n)).
Proof: Since ƒ(n) = o(g(n)), it follows that lim (n→∞) ƒ(n)/g(n) = 0. So for any ε > 0 there is an m such that |ƒ(n)/g(n)| < ε for n ≥ m. In other words, |ƒ(n)| ≤ ε|g(n)| for n ≥ m. So (1/ε)|ƒ(n)| ≤ |g(n)| for n ≥ m. So g(n) = Ω(ƒ(n)). QED.

Example/Quiz. ƒ(n) = Ω(g(n)) iff g(n) = O(ƒ(n)).
Proof: c|g(n)| ≤ |ƒ(n)| for n ≥ m iff |g(n)| ≤ (1/c)|ƒ(n)| for n ≥ m. QED.

Example. We saw earlier that a divide and conquer algorithm that splits a problem of size n = b^k into b smaller problems of size n/b, and does this recursively, uses aₙ = tn + tn·log_b n operations, where tn is the number of operations needed to split up the problem of size n and put a solution back together. We can now observe that aₙ = Θ(n·log_b n), since lim (n→∞) aₙ/(n·log_b n) = lim (n→∞) (t/log_b n + t) = t, which is nonzero and finite.
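A closing Python sketch: the closed form above can be checked against a recurrence of the form aₙ = b·a(n/b) + tn with a(1) = t. That recurrence form and the constants b = 2, t = 3 are our assumptions for the illustration (the slide states only the closed form); the last column of the output drifts toward t, matching aₙ = Θ(n·log_b n).

import math

# Sketch: closed form a(n) = t*n + t*n*log_b(n) versus an assumed recurrence
# a(n) = b*a(n/b) + t*n with a(1) = t, evaluated at n = b^k.
def a_rec(n, b, t):
    return t if n == 1 else b * a_rec(n // b, b, t) + t * n

def a_closed(n, b, t):
    return t * n + t * n * math.log(n, b)

b, t = 2, 3   # illustrative choices
for k in range(1, 7):
    n = b ** k
    ratio = a_closed(n, b, t) / (n * math.log(n, b))   # tends to t as n grows
    print(n, a_rec(n, b, t), round(a_closed(n, b, t), 1), round(ratio, 3))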