1
Growth of functions CSC317
2
Growth of functions. We have already been talking about "grows as" for the sorting examples, but what does this really mean? We already know that:
• We ignore constants and low-order terms; why?
• Asymptotic analysis: we focus on large input sizes, i.e., the growth of the function for large inputs; why?
Complexity petting zoo (see the notes of Burt Rosenberg):
3
Why are we going to the zoo? Complexity classes
(we are just scratching the surface)
Constant time: T(1). Example: access the first number in an array (also the second number, ...).
T(log n). Example: binary search. Given a sorted array A, find value v between indices low and high. A = [ ], find v = 4. Solution: look at the middle of the array: either the value is found, or we recurse on the left half, or we recurse on the right half (sound familiar?).
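As a quick illustration of the T(log n) behavior, here is a minimal iterative binary-search sketch in Python (the function name and the example array are illustrative, not from the slides):

```python
def binary_search(A, v):
    """Return an index i with A[i] == v in the sorted list A, or -1 if absent.

    Each step halves the remaining range, so the loop runs O(log n) times.
    """
    low, high = 0, len(A) - 1
    while low <= high:
        mid = (low + high) // 2
        if A[mid] == v:
            return mid            # value found
        elif A[mid] < v:
            low = mid + 1         # continue in the right half
        else:
            high = mid - 1        # continue in the left half
    return -1

print(binary_search([1, 2, 4, 7, 9, 11], 4))  # -> 2
```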
4
T(n). Example: the largest number in a sequence, or the sum of a fixed sequence; in general, whenever you step through the entire sequence or array, even if you have to do this 20 times.
T(n log n). Example (you should know one by now)?
T(n²). Example (you should know one by now)?
T(n³). Example: naïve matrix multiplication (for an n × n matrix).
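For concreteness, a minimal Python sketch of two of these classes, a linear scan (T(n)) and naïve matrix multiplication (T(n³)); the helper names are illustrative only:

```python
def largest(seq):
    """Largest element of a sequence: one pass, T(n)."""
    best = seq[0]
    for x in seq[1:]:
        if x > best:
            best = x
    return best

def matmul(A, B):
    """Naive n x n matrix multiplication: three nested loops, T(n^3)."""
    n = len(A)
    C = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            for k in range(n):
                C[i][j] += A[i][k] * B[k][j]
    return C

print(largest([3, 9, 2, 7]))                       # -> 9
print(matmul([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # -> [[19, 22], [43, 50]]
```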
5
Complexity classes
Polynomial time (class P): T(n), T(n log n), T(n²), T(n³), ..., T(nᵏ), k ≥ 0
More than polynomial: exponential, T(2ⁿ)
6
What about this: the subset sum problem? How long does it take to find a solution?
Input: a set of n integers, A = [1, 4, -3, 2, 9, 7]. Output: is there a subset that sums to 0? It might take exponential time if we have to go through every possible subset (brute force). But what if I handed you a subset, say [1, -3, 2]? How long would it take to verify that this sums to 0? Polynomial, in fact linear, time.
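A minimal Python sketch of that verification step (the function name is illustrative): checking that the proposed elements come from A and that they sum to 0 takes only linear time.

```python
from collections import Counter

def verify_subset(A, subset):
    """Verify a proposed certificate: a subset drawn from A that sums to 0.

    Both checks (multiset membership and the sum) run in linear time.
    """
    available = Counter(A)
    proposed = Counter(subset)
    is_subset = all(proposed[x] <= available[x] for x in proposed)
    return is_subset and sum(subset) == 0

A = [1, 4, -3, 2, 9, 7]
print(verify_subset(A, [1, -3, 2]))  # -> True
```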
7
Complexity classes
Problems whose candidate solutions can be verified in polynomial time (good) form the class NP. Actually finding a solution may require going through an exponential number of possibilities (possibly bad).
Example: the subset sum problem, A = [1, 4, -3, 2, 9, 7]. Is there a subset that sums to 0? The subset [1, -3, 2] is quickly verifiable to sum to 0.
8
Example: Subset sum problem A=[1, 4, -3, 2, 9, 7]
Is there a subset that sums to 0? [1, -3, 2] is quickly verifiable to sum to 0. An algorithm that solves this problem: form, one by one, each and every one of the 2ⁿ subsets of A, and check whether the subset sums to zero. How long do we need to run through all of those? Exponential time: there are 2ⁿ subsets, even though each individual check is only polynomial (linear). However, if we could guess the right subset and then just check it, the check itself is polynomial; hence NP: nondeterministic polynomial.
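A minimal brute-force sketch in Python (illustrative, not from the slides): it enumerates all 2ⁿ subsets with itertools, so the search is exponential even though each individual check is linear.

```python
from itertools import combinations

def has_zero_subset(A):
    """Brute-force subset sum: try all 2^n subsets (exponential),
    checking each one in linear time."""
    for r in range(1, len(A) + 1):
        for subset in combinations(A, r):
            if sum(subset) == 0:
                return True, subset
    return False, None

print(has_zero_subset([1, 4, -3, 2, 9, 7]))  # -> (True, (1, -3, 2))
```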
9
NP class: nondeterministic = random = as if I were magically handed a solution. The name originally comes from the nondeterministic Turing machine.
P = NP? Can every problem that is quickly verifiable (i.e., in polynomial time) also be quickly solved (i.e., in polynomial time)? Unknown; this is a Millennium Prize problem.
12
Low asymptotic run time = faster
b: T(n) = 0.2n²; a: T(n) = n log(n). For large enough n, a grows more slowly than b, so the algorithm with the lower asymptotic run time is eventually faster, even though the constant 0.2 favors b for small n.
13
Big Oh notation
Definition: Big O notation describes the limiting behavior of a function as its argument tends towards a particular value or infinity, usually in terms of simpler functions. For us, Big Oh allows us to classify algorithms in terms of how their response (e.g., processing time or working space requirements) changes with input size.
Asymptotic upper bound: f(n) is bounded from above by g(n) for large enough n.
More formal definition: f(n) = O(g(n)) if there exist positive constants c and n₀ such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀.
14
Asymptotic upper bound; bounded from above by g(n) for large enough n
More formal definition: f(n) = O(g(n)) if there exist positive constants c and n₀ such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀. To show that the relation holds, we just need to find one such pair c and n₀ …
15
Asymptotic upper bound; bounded from above by g(n) for large enough n
More formal definition: f(n) = O(g(n)) if there exist positive constants c and n₀ such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀. Example: f(n) = n² + 10n is O(n²).
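A worked instance of the definition for this example (the constants below are one valid choice, not the only one):

```latex
\[
  n^2 + 10n \;\le\; n^2 + 10n^2 \;=\; 11\,n^2 \qquad \text{for all } n \ge 1,
\]
so $c = 11$ and $n_0 = 1$ witness $n^2 + 10n = O(n^2)$.
```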
16
Definition: f(n) = O(g(n))
Definition: f(n) = O(g(n)) if there exist positive constants c and n₀ such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀.
Examples of functions in O(n²): f(n) = n²; f(n) = n² + n; f(n) = n√n.
All are bounded asymptotically by n². Intuitively, constants and lower-order terms don't matter ...
17
Question: What about f(n) = n ?
Definition: f(n) = O(g(n)) if there exist positive constants c and n₀ such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀.
Question: what about f(n) = n? g(n) = n² is not a tight upper bound, but it is an upper bound, since n ≤ 1·n² for all n ≥ 1 (i.e., c = 1, n₀ = 1). So f(n) = n is O(n²).
18
Definition: f(n) = O(g(n))
Definition: f(n) = O(g(n)) if there exist positive constants c and n₀ such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀.
What about this: f(n) = aₖnᵏ + … + a₁n + a₀. Is f(n) = O(nᵏ)?
Intuition: yes (we can ignore lower-order terms and constants), but we need a proof.
Proof: we want to find n₀ and c such that f(n) ≤ c·nᵏ.
19
f(n) = aₖnᵏ + … + a₁n + a₀, with aₖ > 0.
Proof: we want to find n₀ and c such that 0 ≤ f(n) ≤ c·nᵏ for all n ≥ n₀. For n ≥ 1:
f(n) = aₖnᵏ + … + a₁n + a₀ ≤ |aₖ|nᵏ + … + |a₁|n + |a₀| ≤ |aₖ|nᵏ + … + |a₁|nᵏ + |a₀|nᵏ = (|aₖ| + … + |a₁| + |a₀|)·nᵏ.
What are n₀ and c? We can take c = |aₖ| + … + |a₁| + |a₀| and n₀ = 1 (enlarging n₀ if necessary so that f(n) ≥ 0, which holds for large enough n since aₖ > 0).
20
Recap: Why are we talking about Big Oh? Most commonly used!
Asymptotic upper bound: f(n) is bounded from above by g(n) for large enough n. However, there are other bounds too!
21
Big Omega
Asymptotic lower bound: f(n) is bounded from below by g(n) for large enough n.
Definition: f(n) = Ω(g(n)) if there exist positive constants c and n₀ such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n₀.
Why is this less often used?
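A small worked instance of the Ω definition (the constants are one valid choice, given here only for illustration):

```latex
\[
  n^2 + n \;\ge\; 1 \cdot n^2 \qquad \text{for all } n \ge 1,
\]
so $c = 1$ and $n_0 = 1$ witness $n^2 + n = \Omega(n^2)$.
```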
22
Big Theta
Asymptotic tight bound: f(n) is bounded from below and above by g(n) for large enough n.
Definition: f(n) = Θ(g(n)) if there exist positive constants c₁, c₂, and n₀ such that 0 ≤ c₁·g(n) ≤ f(n) ≤ c₂·g(n) for all n ≥ n₀.
This is a stronger statement. (Note that the literature is sometimes sloppy and says Oh when it actually means Theta.)
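A worked instance of the Θ definition (again, the constants are one illustrative choice):

```latex
\[
  2n^2 \;\le\; 2n^2 + 3n \;\le\; 5n^2 \qquad \text{for all } n \ge 1,
\]
so $c_1 = 2$, $c_2 = 5$, and $n_0 = 1$ witness $2n^2 + 3n = \Theta(n^2)$.
```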
23
Examples of Oh, Omega, Theta
Functions f(n) in O(n²): f(n) = n²; f(n) = n² + n; f(n) = n
Functions f(n) in Ω(n²): f(n) = n²; f(n) = n² + n; f(n) = n⁵
Functions f(n) in Θ(n²): f(n) = n²; f(n) = n² − n
24
Recap: Oh, Omega, Theta
Oh (like ≤): O(g(n)) is an asymptotic upper bound, 0 ≤ f(n) ≤ c·g(n)
Omega (like ≥): Ω(g(n)) is an asymptotic lower bound, 0 ≤ c·g(n) ≤ f(n)
Theta (like =): Θ(g(n)) is an asymptotic tight bound, 0 ≤ c₁·g(n) ≤ f(n) ≤ c₂·g(n)
25
More on Oh, Omega, Theta
Theorem: f(n) = Θ(g(n)) if and only if (iff) f(n) = O(g(n)) and f(n) = Ω(g(n)).
Question: is f(n) = n² + 5 in Ω(n³)?
Answer: NO (why?)!
26
Question: Is f(n) = n² + 5 in Ω(n³)?
Answer: NO!
Definition: f(n) = Ω(g(n)) if there exist positive constants c and n₀ such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n₀.
Therefore, if f(n) = Ω(n³), there would exist n₀ and c such that n² + 5 ≥ c·n³ for all n ≥ n₀.
Remember from before: n² + 5n² ≥ n² + 5 for n ≥ 1, so we would need 6n² ≥ n² + 5 ≥ c·n³, i.e., 6 ≥ c·n.
This cannot be true for all n ≥ n₀ (n grows without bound while 6/c is fixed), so f(n) is not Ω(n³).
27
Some properties of Oh, Omega, Theta
Transitivity: if f(n) = Θ(g(n)) and g(n) = Θ(h(n)), then f(n) = Θ(h(n)) (same for O and Ω).
Reflexivity: f(n) = Θ(f(n)) (same for O and Ω).
Symmetry: f(n) = Θ(g(n)) iff g(n) = Θ(f(n)).