The Growth of Functions Rosen 2.2

Basic Rules of Logarithms
log_z(xy) = log_z(x) + log_z(y)
log_z(x/y) = log_z(x) - log_z(y)
log_z(x^y) = y·log_z(x)
If x = y, then log_z(x) = log_z(y)
If x < y, then log_z(x) < log_z(y)
log_z(-|x|) is undefined
If log_z(x) = a, then x = z^a
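These identities are easy to spot-check numerically; here is a minimal Python sketch (the helper function log is my own, not from the slides):

```python
import math

def log(z, x):
    """Logarithm of x to base z."""
    return math.log(x) / math.log(z)

z, x, y = 2, 8, 4
assert math.isclose(log(z, x * y), log(z, x) + log(z, y))   # product rule: 5 = 3 + 2
assert math.isclose(log(z, x / y), log(z, x) - log(z, y))   # quotient rule: 1 = 3 - 2
assert math.isclose(log(z, x ** y), y * log(z, x))          # power rule: 12 = 4 * 3
a = log(z, x)
assert math.isclose(z ** a, x)                              # definition: log_z(x) = a  <=>  x = z^a
```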

Growth If f is a function from Z or R to R, how can we quantify its rate of growth and compare the growth rates of different functions? Possible problem: whether f(n) or g(n) is larger at a given point may depend on the value of n. For example, n^2 > 100n only when n > 100.
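As a quick illustration (this loop is my own, not from the slides), we can locate the crossover point where n^2 overtakes 100n:

```python
# Find the first n where n^2 exceeds 100n.
n = 1
while n * n <= 100 * n:
    n += 1
print(n)  # prints 101: n^2 > 100n exactly when n > 100
```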

How to quantify growth as n gets bigger?

Big-O Notation Let f and g be functions from the set of integers or the set of real numbers to the set of real numbers. We say that f(x) is O(g(x)) if there are constants C ∈ ℕ and k ∈ ℝ such that |f(x)| ≤ C|g(x)| whenever x > k. We say "f(x) is big-oh of g(x)". The intuitive meaning is that as x gets large, the values of f(x) are no larger than a constant times the values of g(x); that is, f(x) grows no faster than g(x). The supposition is that x gets large, so only this limiting behavior matters and f can be compared against a simpler function g.
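The definition lends itself to a numeric spot-check: given candidate witnesses C and k, sample |f(x)| ≤ C|g(x)| for x > k. A minimal sketch (the function name and sample range are my own):

```python
def check_big_o_witness(f, g, C, k, xs):
    """Spot-check the Big-O definition: |f(x)| <= C*|g(x)| for sampled x > k.

    This cannot prove f is O(g) -- it tests only finitely many points --
    but it will catch a bad choice of witnesses C and k.
    """
    return all(abs(f(x)) <= C * abs(g(x)) for x in xs if x > k)

# Example: f(x) = x^2 + x + 1 is O(x^2) with witnesses C = 3, k = 1.
print(check_big_o_witness(lambda x: x**2 + x + 1,
                          lambda x: x**2,
                          C=3, k=1, xs=range(2, 10_000)))  # True
```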

Show that 3x^3 + 2x^2 + 7x + 9 is O(x^3) Proof: We must show that there exist constants C ∈ ℕ and k ∈ ℝ such that |3x^3 + 2x^2 + 7x + 9| ≤ C|x^3| whenever x > k. Choose k = 1. Then, for x ≥ 1, 3x^3 + 2x^2 + 7x + 9 ≤ 3x^3 + 2x^3 + 7x^3 + 9x^3 = 21x^3. So let C = 21. Then 3x^3 + 2x^2 + 7x + 9 ≤ 21x^3 when x ≥ 1.
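The chosen witnesses C = 21, k = 1 can be spot-checked exactly with integer arithmetic (a finite check, not a proof):

```python
# Verify 3x^3 + 2x^2 + 7x + 9 <= 21x^3 on a sample of x >= 1.
print(all(3*x**3 + 2*x**2 + 7*x + 9 <= 21 * x**3 for x in range(1, 10_000)))  # True
```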

Show that n! is O(n^n) Proof: We must show that there exist constants C ∈ ℕ and k ∈ ℝ such that |n!| ≤ C|n^n| whenever n > k. n! = n(n-1)(n-2)(n-3)…(3)(2)(1) ≤ n(n)(n)(n)…(n)(n)(n) (n times) = n^n. So choose k = 0 and C = 1.
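Again, a quick exact spot-check of the witnesses C = 1, k = 0:

```python
import math

# Verify n! <= n^n on a sample of n >= 1.
print(all(math.factorial(n) <= n**n for n in range(1, 300)))  # True
```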

General Rules Multiplication by a constant does not change the rate of growth: if f(n) = k·g(n) where k is a constant, then f is O(g) and g is O(f). This means there are infinitely many pairs C, k that satisfy the Big-O definition. Addition of smaller terms does not change the rate of growth: if f(n) = g(n) + smaller-order terms, then f is O(g) and g is O(f). Ex.: f(n) = 4n^6 + 3n^3 + n is O(n^6).

General Rules (cont.) If f1(x) is O(g1(x)) and f2(x) is O(g2(x)), then f1(x)f2(x) is O(g1(x)g2(x)). Examples: 10x·log_2 x is O(x log_2 x); n!·6n^3 is O(n!·n^3) = O(n^(n+3)).

Examples f(x) = 10 is O(1). f(x) = x^2 + x + 1 is O(x^2). f(x) = 2x^5 + x^3 + x·log x is O(x^5). f(x) = 2^n + n^10 is O(2^n). How would you prove this?

Prove that n^10 is O(2^n) Proof: We must show that there exist constants C ∈ ℕ and k ∈ ℝ such that |n^10| ≤ C|2^n| whenever n > k. Take log_2 of both expressions: log_2 2^n = n·log_2 2 = n, and log_2 n^10 = 10·log_2 n. So when is 10·log_2 n ≤ n, i.e., when is n/log_2 n ≥ 10? Try successive powers of 2: 2/1 = 2, 4/2 = 2, 8/3 ≈ 2.67, 16/4 = 4, 32/5 = 6.4, 64/6 ≈ 10.67. For n = 64, 2^64 > 64^10 = 2^60. So if we choose k = 64 and C = 1, then |n^10| ≤ 1·|2^n| whenever n > 64.
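Python's arbitrary-precision integers let us check this exactly (a finite spot-check over a sample range of my own choosing):

```python
# Confirm the witness point and check n^10 <= 2^n beyond it.
print(2**64 > 64**10)                               # True: 64^10 = 2^60 < 2^64
print(all(n**10 <= 2**n for n in range(65, 2000)))  # True
```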

Example: Big-Oh Is Not Symmetric Order matters in big-oh. Sometimes f is O(g) and g is O(f), but in general big-oh is not symmetric. Consider f(n) = 4n and g(n) = n^2. f is O(g). Can we prove that g is O(f)? Formally: do there exist constants C ∈ ℕ and k ∈ ℝ such that |n^2| ≤ C|4n| whenever n > k? No. To show this, we must prove that the negation holds for all C and k: ∀C ∈ ℕ, ∀k ∈ ℝ, ∃n > k such that n^2 > C|4n|.

 C  N,  k  R,  n>k such that n 2 > 4nC. To prove that negation is true, start with arbitrary C and k. Must show/construct an n>k such that n 2 > 4nC Easy to satisfy n > k, then To satisfy n 2 >4nC, divide both sides by n to get n>4C. Pick n = max(4C+1,k+1), which proves the negation.

Is 2^n O(n!)? We must show that there exist constants C ∈ ℕ and k ∈ ℤ such that |2^n| ≤ C|n!| whenever n > k. 2^n = 2(2)(2)…(2)(2)(2) (n times) ≤ n(n-1)(n-2)…(3)(2)(1) = n! when n ≥ 4. So let C = 1 and k = 3.

Is 2^n O(n!)? Note that we could also choose k = 1 and C = 2, since |2^0| ≤ 2·|0!| = 2, |2^1| ≤ 2·|1!| = 2, |2^2| ≤ 2·|2!| = 4, |2^3| ≤ 2·|3!| = 12.
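Both witness pairs can be spot-checked exactly:

```python
import math

# C = 1, k = 3: 2^n <= n! for n >= 4.  C = 2: 2^n <= 2*n! for all n >= 0.
print(all(2**n <= math.factorial(n) for n in range(4, 100)))      # True
print(all(2**n <= 2 * math.factorial(n) for n in range(0, 100)))  # True
```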

Is f(x) = (x^2 + 1)/(x + 1) O(x)? We must show that there exist constants C ∈ ℕ and k ∈ ℝ such that |f(x)| ≤ C|x| whenever x > k. When x > 1 we have 1 < x, so x^2 + 1 < x^2 + x = x(x + 1), and dividing by x + 1 gives (x^2 + 1)/(x + 1) < x. Therefore let k = 1 and C = 1: |(x^2 + 1)/(x + 1)| ≤ |x| when x > 1.

Hierarchy of functions 1, log_2 n, ∛n, √n, n, n·log_2 n, n√n, n^2, n^3, 2^n, n!, n^n. Each one is big-oh of any function to its right.
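One way to see the ordering without computing astronomically large values is to compare log_2(f(n)) for each function in the hierarchy (an illustration of my own; the formulas follow from the log rules above):

```python
import math

# Print log2(f(n)) for each function in the hierarchy; a larger log2 value
# means a larger function. lgamma(n + 1) = ln(n!), so dividing by ln 2 gives log2(n!).
funcs = [
    ("1",        lambda n: 0.0),
    ("log2 n",   lambda n: math.log2(math.log2(n))),
    ("n^(1/3)",  lambda n: math.log2(n) / 3),
    ("sqrt n",   lambda n: math.log2(n) / 2),
    ("n",        lambda n: math.log2(n)),
    ("n log2 n", lambda n: math.log2(n) + math.log2(math.log2(n))),
    ("n sqrt n", lambda n: 1.5 * math.log2(n)),
    ("n^2",      lambda n: 2 * math.log2(n)),
    ("n^3",      lambda n: 3 * math.log2(n)),
    ("2^n",      lambda n: float(n)),
    ("n!",       lambda n: math.lgamma(n + 1) / math.log(2)),
    ("n^n",      lambda n: n * math.log2(n)),
]

n = 1_000_000
for name, log2f in funcs:
    print(f"{name:9s} log2(f(n)) = {log2f(n):,.1f}")  # strictly increasing down the list
```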

Prove that log_10 x is O(log_2 x) First we will prove the following lemma. Lemma: log_10 x = c·log_2 x where c is a constant. Proof: Let y = log_2 x. Then 2^y = x and log_10 2^y = log_10 x. Since log_10 2^y = y·log_10 2, we get y·log_10 2 = log_10 x. But y = log_2 x, so (log_2 x)(log_10 2) = log_10 x. Therefore c = log_10 2.

To Prove that log_10 x is O(log_2 x) We must show that there exist constants C ∈ ℕ and k ∈ ℝ such that |log_10 x| ≤ C|log_2 x| whenever x > k. From the lemma, (log_10 2)·log_2 x = log_10 x; so choose C = log_10 2 and k = 0. (Since log_10 2 < 1, C = 1 also works if C is required to be a natural number.)

Prove log(n!) is O(n·log n) We must show that there exist constants C ∈ ℕ and k ∈ ℝ such that |log n!| ≤ C|n·log n| whenever n > k. We know that n! ≤ n^n, so log(n!) ≤ log(n^n) = n·log n. So choose k = 1 and C = 1.
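A numeric spot-check using lgamma(n + 1) = ln(n!) (the sample range is my own choice):

```python
import math

# Verify ln(n!) <= n * ln(n) on a sample of n >= 2.
print(all(math.lgamma(n + 1) <= n * math.log(n) for n in range(2, 10_000)))  # True
```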

Time Complexity We can use big-O to describe the time complexity of algorithms, i.e., how long they take in terms of the number of operations performed. Two types of complexity are normally considered. Worst-case complexity: the largest number of operations needed for a problem of a given size; this is the number of operations that will guarantee a solution. Average-case complexity: the average number of operations used to solve a problem over all inputs of a given size.

Complexity of the Linear Search Algorithm: Worst-case complexity The algorithm loops over all the values in a list of n values.
– At each step, two comparisons are made: one to see whether the end of the loop has been reached, and one to compare the search element x with the current list element.
– If x equals list element a_i, then 2i comparisons are made.
– If x is not in the list, then 2n comparisons are made.
– The worst-case complexity is thus 2n, which is O(n).
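A sketch of linear search instrumented to count comparisons, matching the 2i / 2n accounting above (the counting convention is the slide's; the code is my own illustration):

```python
def linear_search(x, lst):
    """Return (index, comparisons): 1-based index of x in lst, or 0 if absent.

    Counts two comparisons per iteration: one loop-bound test, one equality test.
    """
    comparisons = 0
    for i, a in enumerate(lst, start=1):
        comparisons += 2           # one "end of loop?" test + one "x == a_i?" test
        if x == a:
            return i, comparisons  # found at position i after 2i comparisons
    return 0, comparisons          # not found: 2n comparisons

data = [7, 3, 9, 1, 5]
print(linear_search(9, data))  # (3, 6)  -> 2i comparisons with i = 3
print(linear_search(8, data))  # (0, 10) -> 2n comparisons with n = 5
```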

Complexity of the Linear Search Algorithm: Average-case complexity The algorithm loops over all the values in a list of n values.
– At each step, two comparisons are again made.
– If x is equally likely to be at any of the n positions, the average number of comparisons is (2 + 4 + 6 + … + 2n)/n. What's the numerator? It is 2(1 + 2 + … + n) = n(n + 1), so the average is n + 1.
– The average case is therefore O(n).
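Checking the closed form against a direct computation:

```python
n = 100
average = sum(2 * i for i in range(1, n + 1)) / n  # (2 + 4 + ... + 2n) / n
print(average, n + 1)  # 101.0 101 -- the average is n + 1, which is O(n)
```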

Complexity of Pair-wise Correlation Assume that there are n elements to correlate