Chapter 3: Growth of Functions

About this lecture Introduce asymptotic notation: Θ( ), O( ), Ω( ), o( ), ω( )

Dominating Term Recall that for input size n, Insertion Sort's running time is An² + Bn + C (A, B, C are constants), and Merge Sort's running time is Dn log n + En + F (D, E, F are constants). To compare their running times for large n, we can just focus on the dominating term (the term that grows fastest when n increases): An² vs Dn log n

Dominating Term If we look more closely, the leading constant of the dominating term does not affect this comparison much, so we may as well compare n² vs n log n (instead of An² vs Dn log n). As a result, we conclude that Merge Sort is better than Insertion Sort when n is sufficiently large, as the sketch below illustrates.
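
To see this concretely, here is a small Python sketch (the constants A, B, C, D, E, F and the sample sizes are made up for illustration, not taken from any measured implementation) that tabulates both running-time formulas as n grows:

```python
import math

# Hypothetical constants: insertion sort has a cheap inner loop (small A),
# merge sort has more per-element overhead (large D).
A, B, C = 1, 100, 1000     # insertion sort: A*n^2 + B*n + C
D, E, F = 50, 100, 1000    # merge sort:     D*n*lg(n) + E*n + F

for n in [10, 100, 1_000, 10_000, 100_000]:
    insertion = A * n**2 + B * n + C
    merge = D * n * math.log2(n) + E * n + F
    print(f"n={n:>7}:  insertion={insertion:>15,.0f}  merge={merge:>15,.0f}")
```

Even with merge sort handicapped by a 50x larger leading constant, the n² term overtakes n log n once n is between 100 and 1,000 here.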

Asymptotic Efficiency The previous comparison studies the asymptotic efficiency of two algorithms. If algorithm P is asymptotically faster than algorithm Q, P is often the better choice. To aid (and simplify) our study of asymptotic efficiency, we now introduce some useful asymptotic notation.

Big-O notation Definition: Given a function g(n), we denote O(g(n)) to be the set of functions { f(n) | there exist positive constants c and n₀ such that 0 ≤ f(n) ≤ c g(n) for all n ≥ n₀ } Rough meaning: O(g(n)) includes all functions that are upper bounded by g(n)

Big-O notation (examples)
4n ∈ O(5n) [proof: c = 1, n ≥ 1]
4n ∈ O(n) [proof: c = 4, n ≥ 1]
4n + 3 ∈ O(n) [proof: c = 5, n ≥ 3]
n ∈ O(0.001n²) [proof: c = 10, n ≥ 100]
logₑ n ∈ O(lg n) [proof: c = 1, n ≥ 1]
lg n ∈ O(logₑ n) [proof: c = lg e, n ≥ 1]
Remark: Usually, we slightly abuse the notation and write f(n) = O(g(n)) to mean f(n) ∈ O(g(n))
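
The witnesses c and n₀ in these proofs can be sanity-checked numerically. Below is a minimal sketch (the helper name `holds_O` and the test range are my own, not from the slides); checking a finite range supports, but of course does not prove, the claim:

```python
import math

def holds_O(f, g, c, n0, n_max=10_000):
    """Check that 0 <= f(n) <= c * g(n) for every integer n0 <= n <= n_max."""
    return all(0 <= f(n) <= c * g(n) for n in range(n0, n_max + 1))

# 4n + 3 in O(n), witnesses c = 5, n0 = 3
print(holds_O(lambda n: 4 * n + 3, lambda n: n, c=5, n0=3))  # True
# lg n in O(log_e n); c = lg e ~ 1.4427 works on paper, c = 1.45
# leaves a little slack for floating-point rounding
print(holds_O(math.log2, math.log, c=1.45, n0=1))            # True
```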

Big-Omega notation Definition: Given a function g(n), we denote Ω(g(n)) to be the set of functions { f(n) | there exist positive constants c and n₀ such that 0 ≤ c g(n) ≤ f(n) for all n ≥ n₀ } Rough meaning: Ω(g(n)) includes all functions that are lower bounded by g(n)

Big-Ω notation (examples)
5n ∈ Ω(4n) [proof: c = 1, n ≥ 1]
n ∈ Ω(4n) [proof: c = 1/4, n ≥ 1]
4n + 3 ∈ Ω(n) [proof: c = 1, n ≥ 1]
0.001n² ∈ Ω(n) [proof: c = 1/10, n ≥ 100]
logₑ n ∈ Ω(lg n) [proof: c = 1/lg e, n ≥ 1]
lg n ∈ Ω(logₑ n) [proof: c = 1, n ≥ 1]
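
The dual check for Ω (again a sketch with a hypothetical helper name) just flips the inequality; it confirms, for example, the witnesses for 0.001n² ∈ Ω(n):

```python
def holds_Omega(f, g, c, n0, n_max=10_000):
    """Check that 0 <= c * g(n) <= f(n) for every integer n0 <= n <= n_max."""
    return all(0 <= c * g(n) <= f(n) for n in range(n0, n_max + 1))

# 0.001*n^2 in Omega(n), witnesses c = 1/10, n0 = 100:
# (1/10)*n <= 0.001*n^2 holds exactly when n >= 100
print(holds_Omega(lambda n: 0.001 * n**2, lambda n: n, c=0.1, n0=100))  # True
```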

Big-O and Big-Omega Similar to Big-O, we slightly abuse the notation and write f(n) = Ω(g(n)) to mean f(n) ∈ Ω(g(n)). Relationship between Big-O and Big-Ω: f(n) = Ω(g(n)) ⟺ g(n) = O(f(n))

Θ notation (Big-O ∩ Big-Ω) Definition: Given a function g(n), we denote Θ(g(n)) to be the set of functions { f(n) | there exist positive constants c₁, c₂, and n₀ such that 0 ≤ c₁ g(n) ≤ f(n) ≤ c₂ g(n) for all n ≥ n₀ } Meaning: those functions that can be both upper bounded and lower bounded by g(n)

Big-O, Big-Ω, and Θ Similarly, we write f(n) = Θ(g(n)) to mean f(n) ∈ Θ(g(n)). Relationship between Big-O, Big-Ω, and Θ: f(n) = Θ(g(n)) ⟺ f(n) = Ω(g(n)) and f(n) = O(g(n))

Θ notation (examples)
4n = Θ(n) [c₁ = 1, c₂ = 4, n ≥ 1]
logₑ n = Θ(lg n) [c₁ = 1/lg e, c₂ = 1, n ≥ 1]
Running time of Insertion Sort = Θ(n²) (if not specified, running time refers to the worst-case running time)
Running time of Merge Sort = Θ(n lg n)
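
A Θ claim needs both witnesses at once. A sandwich check in the same hedged style (the helper `holds_Theta` is again my own illustration):

```python
def holds_Theta(f, g, c1, c2, n0, n_max=10_000):
    """Check that 0 <= c1*g(n) <= f(n) <= c2*g(n) for every n0 <= n <= n_max."""
    return all(0 <= c1 * g(n) <= f(n) <= c2 * g(n)
               for n in range(n0, n_max + 1))

# 4n = Theta(n), witnesses c1 = 1, c2 = 4, n0 = 1
print(holds_Theta(lambda n: 4 * n, lambda n: n, c1=1, c2=4, n0=1))  # True
```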

Little-o notation Definition: Given a function g(n), we denote o(g(n)) to be the set of functions { f(n) | for any positive constant c, there exists a positive constant n₀ such that 0 ≤ f(n) < c g(n) for all n ≥ n₀ } Note the similarities and differences with Big-O: the bound must now hold for every positive c, not just for some c

Little-o (equivalent definition) Definition: Given a function g(n), o(g(n)) is the set of functions { f(n) | lim_{n→∞} f(n)/g(n) = 0 } Examples:
4n = o(n²)
n lg n = o(n^1.000001)
n lg n = o(n lg² n)
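
The limit definition invites a quick numerical experiment (illustrative only): for 4n = o(n²), the ratio f(n)/g(n) = 4/n visibly tends to 0.

```python
f = lambda n: 4 * n    # f(n) = 4n
g = lambda n: n**2     # g(n) = n^2

# The ratio f(n)/g(n) = 4/n shrinks toward 0 as n grows
for n in [10, 100, 1_000, 10_000, 100_000]:
    print(f"n={n:>7}:  f(n)/g(n) = {f(n) / g(n):.6f}")
```

Beware that for slowly separating pairs such as n lg n vs n^1.000001 the ratio also tends to 0, but far too slowly to observe at any feasible n.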

Little-omega notation Definition: Given a function g(n), we denote ω(g(n)) to be the set of functions { f(n) | for any positive constant c, there exists a positive constant n₀ such that 0 ≤ c g(n) < f(n) for all n ≥ n₀ } Note the similarities and differences with the Big-Omega definition

Little-omega (equivalent definition) Definition: Given a function g(n), ω(g(n)) is the set of functions { f(n) | lim_{n→∞} g(n)/f(n) = 0 } Relationship between little-o and little-ω: f(n) = ω(g(n)) ⟺ g(n) = o(f(n))
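
The same experiment with the ratio flipped illustrates n² = ω(n lg n) (again numerical evidence, not a proof):

```python
import math

f = lambda n: n**2                 # f(n) = n^2
g = lambda n: n * math.log2(n)     # g(n) = n lg n

# g(n)/f(n) = lg(n)/n tends to 0, so f grows strictly faster than g
for n in [10, 100, 1_000, 10_000, 100_000]:
    print(f"n={n:>7}:  g(n)/f(n) = {g(n) / f(n):.6f}")
```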

To remember the notation:
O is like ≤ : f(n) = O(g(n)) means f(n) ≤ cg(n) for some c > 0
Ω is like ≥ : f(n) = Ω(g(n)) means f(n) ≥ cg(n) for some c > 0
Θ is like = : f(n) = Θ(g(n)) ⟺ g(n) = Θ(f(n))
o is like < : f(n) = o(g(n)) means f(n) < cg(n) for every c > 0
ω is like > : f(n) = ω(g(n)) means f(n) > cg(n) for every c > 0
Note: not every pair of functions can be compared asymptotically (e.g., sin x vs. cos x)

What's wrong with it? Your friend, after this lecture, has tried to prove 1+2+…+n = O(n). His proof is by induction:
First, 1 = O(n) {n = 1}
Assume 1+2+…+k = O(n) {n = k}
Then, 1+2+…+k+(k+1) = O(n) + (k+1) {n = k+1} = O(n) + O(n) = O(2n) = O(n)
So, 1+2+…+n = O(n) [where is the bug??]
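
A numerical hint (my own illustration, not from the slides): the smallest constant c that makes 1+2+…+n ≤ c·n true is (n+1)/2, which grows without bound, so no single pair (c, n₀) can witness 1+2+…+n = O(n). The induction quietly changes c at every step.

```python
# Smallest c with 1 + 2 + ... + n <= c*n is (n+1)/2 -- unbounded,
# so the sum is Theta(n^2), not O(n).
for n in [1, 10, 100, 1_000, 10_000]:
    total = n * (n + 1) // 2
    print(f"n={n:>6}:  sum={total:>12,}  smallest c = {total / n:,.1f}")
```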

Homework: Problem 3.1; Exercises: 3.1-1