CS420 Lecture Two: Fundamentals
Wim Bohm, CS, CSU
Asymptotics
Asymptotics show the relative growth of functions by comparing them to other functions. There are different notations (f and g are positive functions):
f(x) ~ g(x): lim as x goes to infinity of f(x)/g(x) = 1
f(x) = o(g(x)): lim as x goes to infinity of f(x)/g(x) = 0
Big O, big omega, big theta
f(x) = O(g(x)) iff there are positive constants c and n0 such that f(x) <= c·g(x) for all x > n0
f(x) = Ω(g(x)) iff there are positive constants c and n0 such that f(x) >= c·g(x) for all x > n0
f(x) = Θ(g(x)) iff f(x) = O(g(x)) and f(x) = Ω(g(x))
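To make the definition concrete, here is a small sanity check in Python. The pair f(n) = 3n + 5 and g(n) = n, and the witnesses c = 4 and n0 = 5, are hypothetical examples, not from the slides:

```python
# Claim: f(n) = 3n + 5 is O(n), witnessed by c = 4 and n0 = 5,
# since 3n + 5 <= 4n exactly when n >= 5.
def f(n):
    return 3 * n + 5

def g(n):
    return n

c, n0 = 4, 5
# The definition requires f(n) <= c*g(n) for all n > n0; we spot-check a range.
ok = all(f(n) <= c * g(n) for n in range(n0 + 1, 10_000))
print(ok)  # True
```

A spot-check is of course not a proof; here the algebra (3n + 5 <= 4n iff n >= 5) does the proving.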
Big O etc. Big O is used in upper bounds, i.e. the worst or average case complexity of an algorithm. Big Theta is a tight bound, i.e. stronger than an upper bound. Big Omega is used in lower bounds, i.e. the complexity of a problem. (So we were sloppy in the last lecture.)
Closed / open problems
A closed problem has a lower bound Ω(f(x)) and an algorithm with upper bound O(f(x)), e.g. searching and sorting. What about matrix multiply? An open problem has lower bound < upper bound.
lower bounds
An easy lower bound on a problem is the size of the output it needs to produce, or the number of inputs it has to access.
Generate all permutations of size n: lower bound?
Towers of Hanoi: lower bound?
Sum n input integers: lower bound? but... sum integers 1 to n?
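The contrast in the last question can be sketched in Python. Summing n arbitrary inputs must touch every input (an Ω(n) lower bound), while summing the integers 1 to n needs no input scan at all, thanks to Gauss's closed form. The function names are illustrative:

```python
# Omega(n): every element must be read, or the answer could be wrong.
def sum_arbitrary(xs):
    total = 0
    for x in xs:
        total += x
    return total

# O(1): the inputs are implicit, so the closed form n(n+1)/2 suffices.
def sum_1_to_n(n):
    return n * (n + 1) // 2

print(sum_arbitrary(list(range(1, 101))))  # 5050
print(sum_1_to_n(100))                     # 5050
```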
growth rates
f(n) = O(1): constant.
Scalar operations (+,-,*,/) when input size is not measured in #bits. Straight-line code of simple assignments (x = simple expression) and conditionals with simple sub-expressions. Function calls: discuss.
f(n) = log(n)
definition: b^x = a  iff  x = log_b(a), e.g. 2^3 = 8, log_2(8) = 3
log(x·y) = log x + log y, because b^x · b^y = b^(x+y)
log(x/y) = log x − log y
log(x^a) = a·log x
log x is a 1-to-1 monotonically growing function, so log x = log y iff x = y
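The rules above can be spot-checked numerically in Python (the particular values of x, y, a and the base b are arbitrary):

```python
import math

x, y, a, b = 8.0, 32.0, 3.0, 2.0

# log(x*y) = log x + log y
assert math.isclose(math.log(x * y, b), math.log(x, b) + math.log(y, b))
# log(x/y) = log x - log y
assert math.isclose(math.log(x / y, b), math.log(x, b) - math.log(y, b))
# log(x^a) = a * log x
assert math.isclose(math.log(x ** a, b), a * math.log(x, b))
print("all log rules hold")
```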
more log stuff
log_a(x) = log_b(x) / log_b(a), because x = a^(log_a x) = (b^(log_b a))^(log_a x) = b^(log_b a · log_a x), so log_b(x) = log_b(a) · log_a(x).
and more log stuff
log n and algorithms
In algorithm analysis we often use log n when we should use floor(log(n)). Is that OK?
When each step of an algorithm halves the size of the problem (or divides it by k), it takes log n steps to get to the base case.
Notice that log_b1(n) = O(log_b2(n)) for any b1 and b2, so the base does not matter in O analysis.
Does that work for exponents too? Is 2^n = O(3^n)? Is 3^n = O(2^n)?
log n and algorithms
In algorithm analysis we often use log n when we should use floor(log(n)). That's OK: floor(log(n)) = O(log(n)).
When each step of an algorithm halves the size of the problem (or divides it by k), it takes log n steps to get to the base case.
Notice that log_b1(n) = O(log_b2(n)) for any b1 and b2, so the base does not matter in O analysis.
Does that work for exponents too? Is 2^n = O(3^n)? Is 3^n = O(2^n)?
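The exponent question resolves asymmetrically, which a quick Python check illustrates: 2^n = O(3^n) because the ratio (2/3)^n stays bounded (by 1), but 3^n ≠ O(2^n) because (3/2)^n grows without bound, so no constant c can satisfy 3^n <= c·2^n for all large n:

```python
# (2/3)^n is bounded, so c = 1 witnesses 2^n = O(3^n).
# (3/2)^n is unbounded, so 3^n is NOT O(2^n).
for n in (1, 10, 50):
    print(n, (2 / 3) ** n, (3 / 2) ** n)
```

So unlike log bases, exponent bases do matter in O analysis.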
log n and algorithms
Algorithms with O(log n) complexity are often of a simple divide and conquer variety (base, solve-direct, and split are O(1)):
DivCo(P) {
  if base(P) solve-direct(P)
  else split(P) → P1 ... Pn; DivCo(Pi)
}
e.g. Binary Search
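A runnable sketch of the Binary Search example in Python (the interface — a sorted list and a target, returning an index or -1 — is an assumption, not from the slides). Each iteration halves the interval, so the loop runs O(log n) times:

```python
def binary_search(a, target):
    """Return an index of target in sorted list a, or -1 if absent."""
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2        # split: pick the middle element
        if a[mid] == target:
            return mid              # solve-direct: found it
        elif a[mid] < target:
            lo = mid + 1            # recurse on the right half
        else:
            hi = mid - 1            # recurse on the left half
    return -1

print(binary_search([1, 3, 5, 7, 9, 11], 7))   # 3
print(binary_search([1, 3, 5, 7, 9, 11], 4))   # -1
```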
log n and algorithms
Algorithms with O(log n) complexity are often of a simple divide and conquer variety (base, solve-direct, and split are O(1)):
DivCo(P) {
  if base(P) solve-direct(P)
  else split(P) → P1 ... Pn; DivCo(Pi)
}
Solve f(n) = f(n/2) + 1, f(1) = 1, by repeated substitution.
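Repeated substitution gives f(n) = f(n/2) + 1 = f(n/4) + 2 = ... = f(n/2^k) + k = f(1) + log2(n) = log2(n) + 1, for n a power of 2. A direct check of that closed form in Python:

```python
import math

def f(n):
    """The recurrence f(n) = f(n/2) + 1, f(1) = 1, evaluated directly."""
    if n == 1:
        return 1
    return f(n // 2) + 1

# For n = 2^k the closed form is f(n) = log2(n) + 1.
for k in range(11):
    n = 2 ** k
    assert f(n) == math.log2(n) + 1
print("f(n) = log2(n) + 1 for powers of 2")
```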
General Divide and Conquer
DivCo(P) {
  if base(P) solve-direct(P)
  else split(P) → P1 ... Pn; combine(DivCo(P1) ... DivCo(Pn))
}
Depending on the costs of base, solve-direct, split and combine we get different complexities (later).
f(n)=O(n) Linear complexity
e.g. Linear Search in an unsorted list. Also: polynomial evaluation.
A(x) = a_n x^n + a_(n-1) x^(n-1) + ... + a_1 x + a_0   (a_n != 0). Evaluate A(x0).
How not to do it: a_n*exp(x,n) + a_(n-1)*exp(x,n-1) + ... + a_1*x + a_0. Why not?
How to do it: Horner's rule
y = a[n];
for (i = n-1; i >= 0; i--)
  y = y*x + a[i];
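The slide's loop, translated to runnable Python (the coefficient-list convention — a[i] holds the coefficient of x^i — is an assumption). Horner's rule uses only n multiplications and n additions:

```python
def horner(a, x):
    """Evaluate a[0] + a[1]*x + ... + a[n]*x^n by Horner's rule."""
    y = a[-1]                           # y = a[n]
    for i in range(len(a) - 2, -1, -1):
        y = y * x + a[i]                # fold in the next coefficient
    return y

# A(x) = 2x^2 + 3x + 1 at x = 4:  2*16 + 3*4 + 1 = 45
print(horner([1, 3, 2], 4))  # 45
```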
Horner complexity
Lower bound: Ω(n), because we need to access each a[i] at least once. Upper bound: O(n). Closed problem. But what if A(x) = x^n? Horner is not optimal for x^n.
A(x) = x^n
Recurrence: x^(2n) = x^n · x^n, x^(2n+1) = x · x^(2n)
y = 1;
while (n != 0) {
  if (odd(n)) y = y*x;
  x = x*x;
  n = n/2;
}
Complexity?
O(n log(n))
Often resulting from divide and conquer algorithms where split & combine are O(n) and we divide in nearly equal halves.
mergesort(A) {
  if size(A) <= 1 return A
  else return merge(mergesort(left half(A)), mergesort(right half(A)))
}
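A runnable Python version of the pseudocode (helper name `merge` follows the slide; the rest of the interface is an assumption). The merge of two sorted halves is O(n), and there are O(log n) levels of halving:

```python
def merge(left, right):
    """Merge two sorted lists into one sorted list in O(n)."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]   # one side may have leftovers

def mergesort(a):
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    return merge(mergesort(a[:mid]), mergesort(a[mid:]))

print(mergesort([7, 3, 2, 9, 1, 6, 4, 5]))  # [1, 2, 3, 4, 5, 6, 7, 9]
```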
Merge Sort - Split
{7,3,2,9,1,6,4,5}
{7,3,2,9} {1,6,4,5}
{7,3} {2,9} {1,6} {4,5}
{7} {3} {2} {9} {1} {6} {4} {5}
(Presenter note: Give out cards, order suits: hearts, diamonds, spades, clubs. Ask them to sort this way. Point out physical ease.)
Merge Sort - Merge
{7} {3} {2} {9} {1} {6} {4} {5}
{3,7} {2,9} {1,6} {4,5}
{2,3,7,9} {1,4,5,6}
{1,2,3,4,5,6,7,9}
Merge sort complexity
Time: Total cost of all splits? Cost of each merge level in the tree? How many merge levels in the tree?
Space: Is extra space needed (outside the array A)? If so, how much?
Series
Geometric Series
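The formulas on this slide did not survive extraction; the standard geometric series results, presumably what was shown, are:

```latex
\sum_{i=0}^{n} x^i = \frac{x^{n+1} - 1}{x - 1} \quad (x \neq 1),
\qquad
\sum_{i=0}^{\infty} x^i = \frac{1}{1 - x} \quad (|x| < 1).
```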
Harmonic series why?
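This slide's formula is missing from the transcript; the standard fact (the "why?" is answered by the integral bounds a few slides later) is:

```latex
H_n = \sum_{i=1}^{n} \frac{1}{i} = \ln n + O(1) = \Theta(\log n).
```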
Products why?
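This slide's content is also missing; a likely candidate, given the sorting lower bound at the end of the deck, is the product n! and its logarithm (logs turn products into sums):

```latex
\log \prod_{i=1}^{n} a_i = \sum_{i=1}^{n} \log a_i,
\qquad
\log(n!) = \sum_{i=1}^{n} \log i = \Theta(n \log n),
```

since (n/2)^(n/2) <= n! <= n^n gives (n/2)·log(n/2) <= log(n!) <= n·log n.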
Using integrals to bound sums
If f is a monotonically growing function, then:
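The inequality itself did not survive extraction; for a monotonically growing f, each term f(i) satisfies f(i) <= integral of f over [i, i+1] <= f(i+1), and summing gives the standard bound:

```latex
\sum_{i=m}^{n-1} f(i) \;\le\; \int_{m}^{n} f(x)\,dx \;\le\; \sum_{i=m+1}^{n} f(i).
```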
(Figure: rectangles under and over the curve of an increasing f, with x-axis ticks x1, x2, x3, x4, x5, illustrating f(x1)+f(x2)+f(x3)+f(x4) <= f(x2)+f(x3)+f(x4)+f(x5).)
some formula manipulation....
Example of use
using the integral bounds
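The worked example here is missing from the transcript; a standard application of the integral bounds, and the one the final slide needs, is bounding ln(n!) = sum of ln i. Taking f(x) = ln x (monotonically growing, and ln 1 = 0):

```latex
\int_{1}^{n} \ln x\,dx \;\le\; \sum_{i=1}^{n} \ln i = \ln(n!) \;\le\; \int_{1}^{n+1} \ln x\,dx.
```

With the antiderivative x ln x − x this gives n ln n − n + 1 <= ln(n!) <= (n+1) ln(n+1) − n, hence ln(n!) = Θ(n log n).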
concluding...
Sorting lower bound
We will use this in proving a lower bound on sorting.