Real-Valued Functions of a Real Variable and Their Graphs


Section 9.1: Real-Valued Functions of a Real Variable and Their Graphs

Let f be a real-valued function of a real variable. The graph of f is the set of all points (x, y) in the Cartesian coordinate plane with the property that x is in the domain of f and y = f(x). For applications in computer science, we are almost invariably concerned with situations where x and f(x) are nonnegative.

The Linear Function f(x) = x

The Quadratic Function f(x) = x²

The Cubic Function f(x) = x³

Power Functions x^a, a ≥ 1 The higher the power of x, the faster the function grows: x^a grows faster than x^b if a > b.
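The rule above can be spot-checked numerically. This is an illustrative sketch only, with a = 3 and b = 2 chosen as examples:

```python
# Spot-check: x^3 pulls ahead of x^2, and the gap itself grows like x.
# (A few sample points illustrate the rule; they do not prove it.)
for x in (10, 100, 1000):
    assert x ** 3 > x ** 2
    assert (x ** 3) // (x ** 2) == x   # the ratio x^3 / x^2 is exactly x
```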

The Square-Root Function

The Cube-Root Function

The Fourth-Root Function

Power Functions x^a, 0 < a < 1 The lower the power of x, the more slowly the function grows: x^a grows more slowly than x^b if a < b. This is the same rule as before, stated in the contrapositive.

Power Functions [graph comparing x³, x², and x]

Multiples of Functions [graph comparing x², 3x, 2x, and x]

Multiples of Functions Notice that x² eventually exceeds any constant multiple of x. Generally, if f(x) grows faster than g(x), then f(x) also grows faster than c·g(x), for any real number c. In other words, we think of g(x) and c·g(x) as growing at the “same rate.” If the growth rate of one function is a constant times the growth rate of another, the two functions are said to have the same growth rate.
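The claim that x² eventually exceeds c·x for any constant c can be illustrated with a small sketch (the name `crossover` is ours, chosen for this example):

```python
def crossover(c):
    """Return the smallest integer x >= 1 with x**2 > c * x.

    Since x**2 > c*x exactly when x > c, the crossover point is
    c + 1 for any positive integer c -- so no constant multiple
    of x stays ahead of x**2 forever.
    """
    x = 1
    while x * x <= c * x:
        x += 1
    return x
```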

The Logarithmic Function f(x) = log₂ x

Growth of the Logarithmic Function The logarithmic functions grow more and more slowly as x gets larger and larger.

f(x) = log₂ x vs. g(x) = x^(1/n) [graph comparing x^(1/2), x^(1/3), and log₂ x]

Logarithmic Functions vs. Power Functions The logarithmic functions grow more slowly than any power function x^a, 0 < a < 1.
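A quick numeric illustration of this domination (a sketch, not a proof):

```python
import math

# At x = 2**60, log2(x) is only 60, while even the small power x**0.1
# is already about 2**6 = 64 -- the power function is ahead, and the
# gap only widens as x grows.
x = 2 ** 60
assert math.log2(x) == 60.0
assert x ** 0.1 > math.log2(x)
```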

f(x) = x vs. g(x) = x log₂ x [graph comparing x log₂ x and x]

f(x) vs. f(x) log₂ x The growth rate of log x is between the growth rates of 1 and x. Therefore, the growth rate of f(x) log x is between the growth rates of f(x) and x·f(x).

f(x) vs. f(x) log₂ x [graph comparing x² log₂ x, x², x log₂ x, and x]

The Exponential Function f(x) = 2^x

Growth of the Exponential Function The exponential functions grow faster and faster as x gets larger and larger. Every exponential function grows faster than every power function.
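The claim that every exponential beats every power function can be illustrated by finding the crossover point; `first_win` is a hypothetical helper written for this sketch:

```python
def first_win(power):
    """Smallest integer x >= 2 with 2**x > x**power.

    The crossover exists for every exponent: even against x**10,
    the exponential 2**x eventually takes the lead.
    """
    x = 2
    while 2 ** x <= x ** power:
        x += 1
    return x
```

For example, 2^x overtakes x² at x = 5, and overtakes even x¹⁰ once x reaches 59.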

f(x) = 2x vs. Power Functions (Small Values of x)

f(x) = 2x vs. Power Functions (Large Values of x)

Section 9.2 O-Notation

“Big-Oh” Notation: O(f) Let g : R → R be a function. A function f : R → R is “of order g,” written “f(x) is O(g(x)),” if there exist constants M, x₀ ∈ R such that f(x) ≤ M·g(x) for all x ≥ x₀.
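The definition can be demonstrated with a finite "witness check" (illustrative only: code can sample finitely many points, not prove the inequality for all x ≥ x₀):

```python
def satisfies_witness(f, g, M, x0, xs):
    """Check f(x) <= M*g(x) at every sample point x >= x0."""
    return all(f(x) <= M * g(x) for x in xs if x >= x0)

def f(x):
    return 3 * x * x + 5 * x   # 3x^2 + 5x

def g(x):
    return x * x               # x^2

# f is O(g): the witnesses M = 4, x0 = 5 work, since 5x <= x^2 once x >= 5.
# M = 3 does not, because 3x^2 + 5x always exceeds 3x^2.
```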

Growth Rates If f(x) is O(g(x)), then the growth rate of f is no greater than the growth rate of g. If f(x) is O(g(x)) and g(x) is O(f(x)), then f and g have the same growth rate.

Common Growth Rates

  Growth rate   Example
  O(1)          Access an array element
  O(log₂ x)     Binary search
  O(x)          Sequential search
  O(x log₂ x)   Quick sort
  O(x²)         Bubble sort
  O(2^x)        Factor an integer
  O(x!)         Traveling salesman problem
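Binary search, the table's O(log₂ x) example, earns its growth rate because each probe halves the remaining search range. A standard sketch:

```python
def binary_search(items, target):
    """Return an index of target in sorted list items, or -1 if absent."""
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2        # probe the middle of the range
        if items[mid] == target:
            return mid
        elif items[mid] < target:   # discard the lower half
            lo = mid + 1
        else:                       # discard the upper half
            hi = mid - 1
    return -1
```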

Estimating Run Times Suppose a program has growth rate O(f), for a specific function f(x). Then the run time is of the form t = c·f(x) for some real number c. Suppose the program runs in t₀ seconds when the input size is x₀. Then c = t₀/f(x₀).

Estimating Run Times Thus, the run time is given by t = (t₀/f(x₀)) · f(x) = t₀ · (f(x)/f(x₀)).
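The extrapolation formula is a one-liner; this sketch uses an illustrative helper name:

```python
import math

def estimate_time(f, t0, x0, x):
    """Estimated run time at input size x, given a measured run of
    t0 seconds at input size x0, via t = t0 * f(x) / f(x0)."""
    return t0 * f(x) / f(x0)

# A quadratic-time program measured at 1 s for x = 100 should take
# about (1000/100)**2 = 100 s for x = 1000.
```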

Comparison of Growth Rates Consider programs with growth rates of O(1), O(log x), O(x), O(x log x), and O(x²). Assume that each program runs in 1 second when the input size is 100. Calculate the run times for input sizes 10², 10³, 10⁴, 10⁵, 10⁶, and 10⁷.

Comparison of Growth Rates

  x      O(1)    O(log x)   O(x)       O(x log x)   O(x²)
  10²    1 sec   1 sec      1 sec      1 sec        1 sec
  10³    1 sec   1.5 sec    10 sec     15 sec       1.7 min
  10⁴    1 sec   2 sec      1.7 min    3.3 min      2.8 hrs
  10⁵    1 sec   2.5 sec    17 min     42 min       11 days
  10⁶    1 sec   3 sec      2.8 hrs    8.3 hrs      3.2 yrs
  10⁷    1 sec   3.5 sec    1.2 days   4.1 days     320 yrs
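The entries above can be regenerated from the estimate t = f(x)/f(100), i.e. the 1-second calibration run at x = 100 (the function name `seconds` is ours, for illustration):

```python
import math

def seconds(f, x):
    """Run time in seconds at input size x for a program
    calibrated to 1 second at input size 100."""
    return f(x) / f(100)

# e.g. the O(x^2) entry at x = 10**3 is 100 s (about 1.7 min), and
# the O(x log x) entry at x = 10**5 is 2500 s (about 42 min).
```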

Comparison with O(2^x) Now consider a program with growth rate O(2^x). Assume that the program runs in 1 second when the input size is 100. Calculate the run times for input sizes 100, 110, 120, 130, 140, and 150.

Comparison with O(2^x)

  x     O(x²)     O(2^x)
  100   1 sec     1 sec
  110   1.2 sec   17 min
  120   1.4 sec   12 days
  130   1.7 sec   34 yrs
  140   2.0 sec   35,000 yrs
  150   2.3 sec   36,000,000 yrs
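The O(2^x) column follows t = 2^(x − 100) seconds under the same calibration, so every +10 in input size multiplies the run time by 2¹⁰ = 1024. A sketch (helper name is ours):

```python
def exp_seconds(x):
    """Seconds for the O(2^x) program calibrated to 1 s at x = 100."""
    return 2 ** (x - 100)

# x = 110 gives 1024 s (about 17 min); x = 120 gives 2**20 s (about 12 days).
```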

Comparison with O(x!) Now consider a program with growth rate O(x!). Assume that the program runs in 1 second when the input size is 100. Calculate the run times for input sizes 100, 101, 102, 103, 104, and 105.

Comparison with (n!) x O(2x) O(x!) 100 1 sec 101 2 sec 1.7 min 102 2.9 hrs 103 8 sec 12 days 104 16 sec 3.5 yrs 105 32 sec 367 yrs

Comparing Orders Let f and g be two functions. Define O(f) ≤ O(g) to mean that f is O(g). Define O(f) = O(g) to mean that f is O(g) and g is O(f). Define O(f) < O(g) to mean that f is O(g), but g is not O(f).

Comparing Orders Theorem:
  O(x^a) < O(x^b) if 0 < a < b.
  O(log_a x) = O(log_b x) for all a, b > 1.
  O(1) < O(log x) < O(x^a) for all a > 0.
  O(a^x) < O(b^x) if 1 < a < b.
  O(x^a) < O(b^x) for all a > 0, b > 1.
  O(a^x) < O(x!) for all a > 1.
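These orderings can be spot-checked via ratios: when O(f) < O(g), the ratio f(x)/g(x) shrinks toward 0 as x grows. A few sample points suggest the trend; they do not prove the theorem:

```python
import math

def ratio(f, g, x):
    """Value of f(x) / g(x) at a single sample point."""
    return f(x) / g(x)

# O(log x) < O(x**0.5): log2(x) / sqrt(x) falls as x grows.
# O(2^x)   < O(x!):     2**x / x! falls as x grows.
```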

Comparing Orders Theorem: For all functions f(x), g(x), and h(x), if O(f) < O(g), then O(f·h) < O(g·h).

Feasible vs. Infeasible Algorithms with polynomial growth rates or less are considered feasible. Algorithms with growth rates greater than polynomial are considered infeasible. Of course, all algorithms are feasible for small input sizes. It is feasible to factor small integers (< 50 digits). It is infeasible to factor large integers (> 200 digits).