Analysis of Algorithms: Methods and Examples


1 Analysis of Algorithms: Methods and Examples
CSE 2320 – Algorithms and Data Structures
Alexandra Stefan
Based on presentations by Vassilis Athitsos and Bob Weems
University of Texas at Arlington
Updated 9/11/2016

2 Reading
CLRS: Chapter 3 (all)
Notation: f(n) = Θ(n^2) (if the dominant term is n^2).
Given a function, e.g. f(n) = 15n^3 + 7n^2 + 3n + 20, find Theta:
Find the dominant term: 15n^3
Ignore the constant factor: n^3
=> f(n) = Θ(n^3)

3 Counting instructions – part 1
Count in detail the total number of instructions executed by each of the following pieces of code:

// Example A. Notice the ; at the end of the for loop.
for (i = 0; i < n; i++)
    ;

// Example B (source: Dr. Bob Weems)
for (i = 0; i < n; i++)
    for (j = 0; j < s; j++) {
        c[i][j] = 0;
        for (k = 0; k < t; k++)
            c[i][j] += a[i][k]*b[k][j];
    }

4 Counting instructions – part 1
A for loop that iterates n times costs about 2 + 2n + n*(body_instr_count) instructions: 1 for the init, n+1 condition checks (the last one is false), n updates, plus the body executed n times.

// Example A. Notice the ; at the end of the for loop.
for (i = 0; i < n; i++)
    ;
Total: 2 + 2n instructions (loop overhead only; the body is empty).

// Example B (source: Dr. Bob Weems)
for (i = 0; i < n; i++)
    for (j = 0; j < s; j++) {
        c[i][j] = 0;
        for (k = 0; k < t; k++)
            c[i][j] += a[i][k]*b[k][j];
    }
Innermost k-loop: 2 + 3t instructions per execution.
j-loop (the assignment c[i][j] = 0 plus the k-loop): 2 + 5s + 3st instructions per execution.
Total for Example B: 2 + 2n + 5ns + 3nst instructions.
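The statement counts above can be sanity-checked empirically. Below is a small sketch (not part of the original slides; the sizes N, S, T are arbitrary choices for the experiment) that counts how many times each statement body of Example B actually runs.

#include <stdio.h>

#define N 4
#define S 3
#define T 5

int main(void) {
    double a[N][T], b[T][S], c[N][S];
    long zero_count = 0, update_count = 0;   /* executions of the two statements */
    int i, j, k;

    /* fill a and b so the arithmetic is well defined */
    for (i = 0; i < N; i++) for (k = 0; k < T; k++) a[i][k] = 1.0;
    for (k = 0; k < T; k++) for (j = 0; j < S; j++) b[k][j] = 1.0;

    for (i = 0; i < N; i++)
        for (j = 0; j < S; j++) {
            c[i][j] = 0;  zero_count++;          /* runs n*s times   */
            for (k = 0; k < T; k++) {
                c[i][j] += a[i][k] * b[k][j];
                update_count++;                  /* runs n*s*t times */
            }
        }

    printf("c[i][j]=0 executed %ld times (n*s   = %d)\n", zero_count, N * S);
    printf("+= update executed %ld times (n*s*t = %d)\n", update_count, N * S * T);
    return 0;
}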

5 Counting instructions – part 2
(n^2 time complexity and its justification - review)

// Example G. T(n) = ….
for (i = 0; i < n; i++)
    for (j = 0; j < n; j = j+1)
        printf("A");
The inner loop runs n times for each of the n values of i: n * n = Θ(n^2).

// Example H. (similar to selection sort processing) T(n) = ….
for (i = 0; i < (n-1); i++)
    for (j = i+1; j < n; j = j+1)
Inner iterations: (n-1) + (n-2) + … + 2 + 1 = n(n-1)/2 = Θ(n^2).

// Example I. (similar to insertion sort processing) T(n) = ….
for (i = 1; i < n; i++)
    for (j = i-1; j >= 0; j = j-1)
Inner iterations: 1 + 2 + … + (n-2) + (n-1) = n(n-1)/2 = Θ(n^2).
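As a quick check of the sums above, the sketch below (not from the slides; the value of n is arbitrary) counts the inner-loop iterations of Examples H and I and compares them with n(n-1)/2.

#include <stdio.h>

int main(void) {
    int n = 10;
    long countH = 0, countI = 0;
    int i, j;

    for (i = 0; i < n - 1; i++)        /* Example H */
        for (j = i + 1; j < n; j++)
            countH++;                  /* (n-1) + (n-2) + ... + 1 iterations */

    for (i = 1; i < n; i++)            /* Example I */
        for (j = i - 1; j >= 0; j--)
            countI++;                  /* 1 + 2 + ... + (n-1) iterations */

    printf("H: %ld   I: %ld   n(n-1)/2 = %d\n", countH, countI, n * (n - 1) / 2);
    return 0;
}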

6 Time complexity – part 1
// Example C (source: Dr. Bob Weems) T(N) = …
for (i = 0; i < N; i++)
    for (j = 1; j < N; j = j+j)
        printf("A");

// Example D. T(N) = ….
for (i = 0; i < N; i++)
    for (j = 1; j < N; j = j*2)
        printf("A");

// Example E. T(N) = ….
for (i = 0; i < N; i++)
    for (j = N; j >= 1; j = j/2)
        printf("A");

// Example F. T(N) = ….
for (i = 0; i < N; i++)
    for (j = 1; j < N; j = j+2)
        printf("A");

7 Time complexity – part 1
// Example C (source: Dr. Bob Weems)
for (i = 0; i < N; i++)
    for (j = 1; j < N; j = j+j)
        printf("A");
Outer loop: N iterations. Inner loop: j = 1, 2, 4, …, N/2, about lg N values. T(N) = Θ(N lg N).

// Example D.
for (i = 0; i < N; i++)
    for (j = 1; j < N; j = j*2)
        printf("A");
Same as C (j+j and j*2 both double j): j = 1, 2, 4, …, N/2, lg N values. T(N) = Θ(N lg N).

// Example E.
for (i = 0; i < N; i++)
    for (j = N; j >= 1; j = j/2)
        printf("A");
Inner loop: j = N, N/2, …, 2, 1, that is (1 + lg N) values. T(N) = Θ(N lg N).

// Example F.
for (i = 0; i < N; i++)
    for (j = 1; j < N; j = j+2)
        printf("A");
Inner loop: j = 1, 3, 5, 7, …, N-4, N-2, that is N/2 values. T(N) = Θ(N^2).
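One way to convince yourself of the lg N counts is to measure them. The sketch below (not from the slides; the N values are arbitrary) counts the inner iterations of Example D for a single pass of the outer loop and prints lg N next to the count.

#include <stdio.h>
#include <math.h>

int main(void) {
    int N;
    for (N = 16; N <= 1024; N *= 4) {
        long inner = 0;
        int j;
        for (j = 1; j < N; j = j * 2)   /* inner loop of Example D */
            inner++;
        printf("N = %5d   inner iterations = %2ld   lg N = %4.1f\n",
               N, inner, log2((double)N));
    }
    return 0;
}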

8 Time complexity – part 2
Source: code section from the LU decomposition code from notes02.pdf by Dr. Weems.
1. for (k=1; k<=n-1; k++) {
2.    …
3.    for (j=k+1; j<=n; j++) {
4.       temp=ab[ell][j];
5.       ab[ell][j]=ab[k][j];
6.       ab[k][j]=temp;
7.       for (i=k+1; i<=n; i++)
8.          ab[i][j] -= ab[i][k]*ab[k][j];
      }
   }

Exercise: fill in, for each value of k, how many times line 8 executes (nested loops 3 & 7).
k     | i, j (loop 7) | Line 8 (nested 3 & 7)
1     |               |
2     |               |
…     |               |
n-2   |               |
n-1   |               |
total |               |

9 Time complexity – part 2
Source: code section from the LU decomposition code: "Notes 2: Growth of functions" by Dr. Weems.
1. for (k=1; k<=n-1; k++) {
2.    …
3.    for (j=k+1; j<=n; j++) {
4.       temp=ab[ell][j];
5.       ab[ell][j]=ab[k][j];
6.       ab[k][j]=temp;
7.       for (i=k+1; i<=n; i++)
8.          ab[i][j] -= ab[i][k]*ab[k][j];
      }
   }

How many times does line 8 execute (nested loops 3 & 7)?
k     | i, j (loop 7)        | Line 8 (nested 3 & 7)
1     | 2 -> n (n-1 values)  | (n-1)^2
2     | 3 -> n (n-2 values)  | (n-2)^2
…     | …                    | …
n-2   | n-1 -> n (2 values)  | 2^2
n-1   | n -> n (1 value)     | 1^2
total |                      | 1^2 + 2^2 + … + (n-1)^2
See the approximation of summations by integrals example for the solution to 1^2 + 2^2 + … + (n-1)^2.
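The table can also be verified by running the same loop structure with a counter in place of line 8. This is only a sketch (not from the notes); the matrix contents are irrelevant to the count, so no array is needed.

#include <stdio.h>

int main(void) {
    int n = 8;
    long line8 = 0, sum = 0;
    int i, j, k;

    for (k = 1; k <= n - 1; k++)
        for (j = k + 1; j <= n; j++)
            for (i = k + 1; i <= n; i++)
                line8++;                      /* stands in for line 8 */

    for (k = 1; k <= n - 1; k++)
        sum += (long)(n - k) * (n - k);       /* (n-1)^2 + (n-2)^2 + ... + 1^2 */

    printf("line 8 executions: %ld   sum of squares: %ld\n", line8, sum);
    return 0;
}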

10 Estimate runtime
Problem: The total number of instructions in a program (or a piece of a program) is 10^12 and it runs on a computer that executes 10^9 instructions per second. How long will it take to run this program? Give the answer in seconds. If it is very large, convert it to larger units (hours, days, years).
Summary:
Total instructions: 10^12
Speed: 10^9 instructions/second
Answer:
Time = (total instructions) / speed = (10^12 instructions) / (10^9 instr/sec) = 10^3 seconds ≈ 17 minutes
Note that this computation is similar to computing the time it takes to travel a certain distance (e.g. 120 miles) given the speed (e.g. 60 miles/hour).

11 Estimate runtime A slightly different way to formulate the same problem: total number of instructions in a program (or a piece of a program) is 1012 and it runs on a computer that executes one instruction in one nanosecond (10-9 seconds) How long will it take to run this program? Give the answer in seconds. If it is very large, transform it in larger units (hours, days, years) Summary: 1012 total instructions 10-9 seconds per instruction Answer: Time = (total instructions) * (seconds per instruction) = (1012 instructions)* (10-9 sec/instr) = 103 seconds ~ 15 minutes

12 Motivation for Big-Oh Notation
Scenario: a client requests an application for sorting his records. Which one of the 3 sorting algorithms that we discussed will you implement?
Selection sort
Insertion sort
Merge sort

13 Comparing algorithms
Comparing linear, N lg N, and quadratic complexity. Quadratic-time algorithms become impractical (too slow) much sooner than linear and N lg N algorithms. Of course, what we consider "impractical" depends on the application. Some applications are more tolerant of longer running times.

N                  | N lg N        | N^2
10^6 (1 million)   | ≈ 20 million  | 10^12 (one trillion)
10^9 (1 billion)   | ≈ 30 billion  | 10^18 (one quintillion)
10^12 (1 trillion) | ≈ 40 trillion | 10^24 (one septillion)

N                  | lg N
10^3               | ≈ 10
10^6 = (10^3)^2    | ≈ 2*10 = 20
10^9 = (10^3)^3    | ≈ 3*10 = 30
10^12 = (10^3)^4   | ≈ 4*10 = 40
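The table entries follow directly from lg(10^3) ≈ 10. A small sketch (not part of the slides) that recomputes them:

#include <stdio.h>
#include <math.h>

int main(void) {
    double ns[] = {1e6, 1e9, 1e12};
    int i;
    printf("%-10s %-14s %-10s\n", "N", "N lg N", "N^2");
    for (i = 0; i < 3; i++) {
        double N = ns[i];
        printf("%-10.0e %-14.2e %-10.0e\n", N, N * log2(N), N * N);
    }
    return 0;
}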

14 Motivation for Big-Oh Notation
Given an algorithm, we want to find a function that describes the time performance of the algorithm.
Computing the number of instructions in detail is NOT desired:
It is complicated and the details are not relevant.
The number of machine instructions and the runtime depend on factors other than the algorithm:
Programming language
Compiler optimizations
Performance of the computer it runs on (CPU, memory)
There are some details that we would actually NOT want this function to include, because they can make the function unnecessarily complicated.
When comparing two algorithms we want to see which one is better for very large data – asymptotic behavior.
It is not very important what happens for small-size data.
Asymptotic behavior = rate of growth = order of growth.
The Big-Oh notation describes the asymptotic behavior and greatly simplifies algorithmic analysis.

15 Asymptotic Notation Goal: we want to be able to say things like:
Selection sort will take time strictly proportional to n^2: Θ(n^2)
Insertion sort will take time at most proportional to n^2: O(n^2)
Note that we can still say that insertion sort is Θ(n^2) (in the worst case). Use big-Oh for upper bounding complex functions of n.
Any sorting algorithm will take time at least proportional to n: Ω(n)
Math functions that are:
Θ(n^2):
O(n^2):
Ω(n^2):
Informal definition: f(n) grows "proportional" to g(n) if lim_{n→∞} f(n)/g(n) = c ≠ 0 (c is a non-zero constant).
Θ — tight bound (plays the role of "=")
O — upper bound (big-Oh – bigger; plays the role of "≤")
Ω — lower bound (plays the role of "≥")
Abuse of notation: see next slide.

16 Abuse of notation
Instead of: f(n) ∈ O(g(n)) (f is a member of the set of functions O(g(n)))
We may use: f(n) = O(g(n))
Similarly for Θ and Ω.

17 Big-Oh
A function f(n) is said to be O(g(n)) if there exist constants c0 and n0 such that: f(n) ≤ c0*g(n) for all n ≥ n0.
Theorem: if lim_{n→∞} f(n)/g(n) = c is a constant, then f(n) = O(g(n)).
Typically, f(n) is the running time of an algorithm. This can be a rather complicated function. We try to find a g(n) that is simple (e.g. n^2), and such that f(n) = O(g(n)).

18 Asymptotic Bounds and Notation (CLRS chapter 3)
f(n) is O(g(n)) if there exist positive constants c0 and n0 such that: f(n) ≤ c0*g(n) for all n ≥ n0.
Theorem: if lim_{n→∞} f(n)/g(n) = c is a constant, then f(n) = O(g(n)).
g(n) is an asymptotic upper bound for f(n).

f(n) is Ω(g(n)) if there exist positive constants c0 and n0 such that: c0*g(n) ≤ f(n) for all n ≥ n0.
Theorem: if lim_{n→∞} g(n)/f(n) = c is a constant, then f(n) = Ω(g(n)).
g(n) is an asymptotic lower bound for f(n).

f(n) is Θ(g(n)) if there exist positive constants c0, c1 and n0 such that: c0*g(n) ≤ f(n) ≤ c1*g(n) for all n ≥ n0.
Theorem: if lim_{n→∞} f(n)/g(n) = c ≠ 0 is a constant, then f(n) = Θ(g(n)).
g(n) is an asymptotic tight bound for f(n).

19 Asymptotic Bounds and Notation (CLRS chapter 3)
"little-oh": o
f(n) is o(g(n)) if for any positive constant c0, there exists n0 s.t.: f(n) ≤ c0*g(n) for all n ≥ n0.
Theorem: if lim_{n→∞} f(n)/g(n) = 0, then f(n) = o(g(n)).
g(n) is an asymptotic upper bound for f(n) (but NOT tight).
E.g.: n = o(n^2), n = o(n lg n), n^2 = o(n^4), …

"little-omega": ω
f(n) is ω(g(n)) if for any positive constant c0, there exists n0 s.t.: c0*g(n) ≤ f(n) for all n ≥ n0.
Theorem: if lim_{n→∞} f(n)/g(n) = ∞, then f(n) = ω(g(n)).
g(n) is an asymptotic lower bound for f(n) (but NOT tight).
E.g.: n^2 = ω(n), n lg n = ω(n), n^3 = ω(n^2), …

20 L'Hospital's Rule
If lim_{n→∞} f(n) and lim_{n→∞} g(n) are both 0 or both ∞,
and if lim_{n→∞} f'(n)/g'(n) is a constant or ∞,
then lim_{n→∞} f(n)/g(n) = lim_{n→∞} f'(n)/g'(n).
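As an illustration of the rule (not on the slide), it gives the standard fact that lg n grows more slowly than n, using (lg n)' = 1/(n ln 2):

\lim_{n\to\infty}\frac{\lg n}{n}
  = \lim_{n\to\infty}\frac{(\lg n)'}{(n)'}
  = \lim_{n\to\infty}\frac{1/(n\ln 2)}{1} = 0,
\qquad\text{so } \lg n = o(n) \text{ and hence } \lg n = O(n).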

21

22 Theta vs Big-Oh
The Theta notation is more strict than the Big-Oh notation:
TRUE:  n^2 = O(n^100)
FALSE: n^2 = Θ(n^100)

23 Simplifying Big-Oh Notation
Suppose that we are given this running time: f(n) = 35n^2 + 41n + lg(n).
How can we express f(n) in Big-Oh notation?

24 Asymptotic Notation
Let f(n) = 35n^2 + 41n + lg(n) + 1532.
We say that f(n) = O(n^2).
Also correct, but too detailed (do not use them):
f(n) = O(n^2 + n)
f(n) = O(35n^2 + 41n + lg(n))
In recurrence formulas and proofs, you may see these notations (see CLRS, page 49):
f(n) = 2n^2 + Θ(n) means: there is a function h(n) in Θ(n) s.t. f(n) = 2n^2 + h(n).
2n^2 + Θ(n) = Θ(n^2) means: for any function h(n) in Θ(n), there is a function g(n) in Θ(n^2) s.t. 2n^2 + h(n) = g(n); i.e., for any function h(n) in Θ(n), 2n^2 + h(n) is in Θ(n^2).

25 Proofs using the definition: O
Let f(n) = 35n^2 + 41n + lg(n) + 1532.
Show (using the definition) that f(n) = O(n^2).
Proof: Want to find n0 and c0 s.t., for all n ≥ n0: f(n) ≤ c0*n^2.
Version 1: Upper bound each of the lower-order terms (41n, lg(n), 1532) by n^2 for large enough n:
f(n) = 35n^2 + 41n + lg(n) + 1532 ≤ 35n^2 + n^2 + n^2 + n^2 = 38n^2
Use: c0 = 38, n0 = 1536
f(n) ≤ 38n^2, for all n ≥ 1536
Version 2: You can also pick c0 large enough to cover the coefficients of all the terms:
c0 = 1609 = (35 + 41 + 1 + 1532), n0 = 1

26 Proofs using the definition: Ω, Θ
Let f(n) = 35n^2 + 41n + lg(n) + 1532.
Show (using the definition) that f(n) = Ω(n^2) and f(n) = Θ(n^2).
Proof of Ω: Want to find n1 and c1 s.t., for all n ≥ n1: f(n) ≥ c1*n^2.
Use: c1 = 1, n1 = 1
f(n) = 35n^2 + 41n + lg(n) + 1532 ≥ n^2, for all n ≥ 1
Proof of Θ:
Version 1: We have proved f(n) = O(n^2) and f(n) = Ω(n^2), and so f(n) = Θ(n^2) (property 4, page 26).
Version 2: We found c0 = 38, n0 = 1536 and c1 = 1, n1 = 1 s.t.:
f(n) ≤ 38n^2 for all n ≥ 1536, and f(n) ≥ n^2 for all n ≥ 1
=> n^2 ≤ f(n) ≤ 38n^2, for all n ≥ 1536 => f(n) = Θ(n^2)

27 Polynomial functions
If f(n) is a polynomial function, then it is Θ of its dominant term (with the constant coefficient dropped).
E.g. f(n) = 15n^3 + 7n^2 + 3n + 20; find g(n) s.t. f(n) = Θ(g(n)):
Find the dominant term: 15n^3
Ignore the constant coefficient, left with: n^3
=> g(n) = n^3 => f(n) = Θ(n^3)

28 Properties of O, Ω and Θ
1. f(n) = O(g(n)) => g(n) = Ω(f(n))
2. f(n) = Ω(g(n)) => g(n) = O(f(n))
3. f(n) = Θ(g(n)) => g(n) = Θ(f(n))
4. If f(n) = O(g(n)) and f(n) = Ω(g(n)) => f(n) = Θ(g(n))
5. If f(n) = Θ(g(n)) => f(n) = O(g(n)) and f(n) = Ω(g(n))
Transitivity (proved in later slides):
6. If f(n) = O(g(n)) and g(n) = O(h(n)), then f(n) = O(h(n)).
7. If f(n) = Ω(g(n)) and g(n) = Ω(h(n)), then f(n) = Ω(h(n)).

29 Using Limits
If lim_{n→∞} f(n)/g(n) = c is a non-zero constant, then g(n) = ____(f(n)). In this definition, both zero and infinity are excluded. In this case we can also say that f(n) = Θ(g(n)). This can easily be proved using the limit or the symmetry property of Θ.
If lim_{n→∞} f(n)/g(n) = c is a constant, then g(n) = ____(f(n)). "Constant" includes zero, but not infinity.
If lim_{n→∞} f(n)/g(n) = ∞, then g(n) = ____(f(n)). f(n) grows much faster than g(n).
If lim_{n→∞} g(n)/f(n) is a constant, then g(n) = ____(f(n)). "Constant" includes zero, but does NOT include infinity.

30 Using Limits
If lim_{n→∞} f(n)/g(n) = c is a non-zero constant, then f(n) = Θ(g(n)). In this definition, both zero and infinity are excluded. In this case we can also say that g(n) = Θ(f(n)). This can easily be proved using the limit or the symmetry property of Θ.
If lim_{n→∞} g(n)/f(n) = c is a constant, then f(n) = Ω(g(n)). "Constant" includes zero, but not infinity.
If lim_{n→∞} g(n)/f(n) = ∞, then f(n) = O(g(n)). g(n) grows much faster than f(n).
If lim_{n→∞} f(n)/g(n) is a constant, then f(n) = O(g(n)). "Constant" includes zero, but does NOT include infinity.

31 Using Limits: Example 1
Suppose that we are given this running time: f(n) = 35n^2 + 41n + lg(n).
Use the limits theorem to show that f(n) = O(n^2).
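A possible solution using the limit theorem from slides 29–30 (a sketch, not transcribed from the deck):

\lim_{n\to\infty}\frac{f(n)}{n^2}
  = \lim_{n\to\infty}\frac{35n^2 + 41n + \lg n}{n^2}
  = \lim_{n\to\infty}\left(35 + \frac{41}{n} + \frac{\lg n}{n^2}\right) = 35,

a constant, so f(n) = O(n^2); since the limit is also non-zero, in fact f(n) = Θ(n^2).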

32 Using Limits: Example 2
Show that (n^5 + 3n^4 + 2n^3 + n^2 + n + 12) / (5n^3 + n + 3) = Θ(???).

33 Using Limits: An Example
Show that (n^5 + 3n^4 + n^3 + 2n^2 + n + 12) / (5n^3 + n + 3) = Θ(n^2).
Proof: Here f(n) = (n^5 + 3n^4 + n^3 + 2n^2 + n + 12) / (5n^3 + n + 3). Let g(n) = n^2.
We want to show that lim_{n→∞} f(n)/g(n) = c ≠ 0, and so f(n) = Θ(g(n)).
lim_{n→∞} f(n)/g(n) = lim_{n→∞} (n^5 + 3n^4 + n^3 + 2n^2 + n + 12) / ((5n^3 + n + 3) * n^2)
                    = lim_{n→∞} (n^5 + 3n^4 + n^3 + 2n^2 + n + 12) / (5n^5 + n^3 + 3n^2)
Solution 1 (dominant terms): the limit is the ratio of the n^5 coefficients: 1/5.
Solution 2 (L'Hospital): lim_{n→∞} f(n)/g(n) = lim_{n→∞} f'(n)/g'(n)
                    = lim_{n→∞} (5n^4 + 12n^3 + 3n^2 + 4n + 1) / (25n^4 + 3n^2 + 6n)
                    = … (apply L'Hospital repeatedly) …
                    = lim_{n→∞} (5*4*3*2*n) / (5*5*4*3*2*n) = 1/5

34 Big-Oh Hierarchy
1 = O(lg(n)), lg(n) = O(n), n = O(n^2), …
If 0 ≤ c ≤ d, then n^c = O(n^d). Higher-order polynomials always get larger than lower-order polynomials, eventually.
For any d, if c > 1, then n^d = O(c^n). Exponential functions always get larger than polynomial functions, eventually.
You can use these facts in your assignments.
You can apply transitivity to derive other facts, e.g., that lg(n) = O(n^2).
Hierarchy: O(1), O(lg(n)), O(√n), O(n), O(n lg n), O(n^2), …, O(n^c), O(c^n)

35 Big-Oh Transitivity
If f(n) = O(g(n)) and g(n) = O(h(n)), then f(n) = O(h(n)).
Proof:

36 Big-Oh Transitivity
If f(n) = O(g(n)) and g(n) = O(h(n)), then f(n) = O(h(n)).
Proof: We want to find c3 and n3 s.t. f(n) ≤ c3*h(n), for all n ≥ n3.
We know:
f(n) = O(g(n)) => there exist c1, n1 s.t. f(n) ≤ c1*g(n), for all n ≥ n1
g(n) = O(h(n)) => there exist c2, n2 s.t. g(n) ≤ c2*h(n), for all n ≥ n2
For all n ≥ max(n1, n2): f(n) ≤ c1*g(n) ≤ c1*c2*h(n)
Use: c3 = c1*c2 and n3 = max(n1, n2).

37 Using Substitutions
If lim_{x→∞} h(x) = ∞, then:
f(x) = O(g(x)) ⇒ f(h(x)) = O(g(h(x))). (This can be proved.)
How do we use that? For example, prove that lg(n) = O(√n).

38 Using Substitutions
If lim_{x→∞} h(x) = ∞, then:
f(x) = O(g(x)) ⇒ f(h(x)) = O(g(h(x))).
How do we use that? For example, prove that lg(n) = O(√n).
Use h(n) = √n. We get: lg(n) = O(n) ⇒ lg(√n) = O(√n).
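To finish the argument (a step the transcript leaves implicit), use the fact that lg(√n) = (1/2) lg n:

\lg\sqrt{n} = \tfrac{1}{2}\lg n = O(\sqrt{n})
\;\Rightarrow\; \lg n = 2\,\lg\sqrt{n} = O(\sqrt{n}).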

39 Example Problem 1
Is n = O(sin(n) * n^2)?
Answer:

40 Example Problem 2
Show that max(f(n), g(n)) is Θ(f(n) + g(n)).
Show O:
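One way to fill in both directions (a sketch, assuming f(n) and g(n) are non-negative for large n):

\max(f(n), g(n)) \;\le\; f(n) + g(n) \;\le\; 2\max(f(n), g(n)),

so max(f(n), g(n)) ≤ 1*(f(n) + g(n)) gives O(f(n) + g(n)), and max(f(n), g(n)) ≥ (1/2)*(f(n) + g(n)) gives Ω(f(n) + g(n)); together they give Θ(f(n) + g(n)).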

41 Asymptotic notation for two parameters (CLRS)
f(n,m) is O(g(n,m)) if there exist constants c0, n0 and m0 such that: f(n,m) ≤ c0*g(n,m) for all pairs (n,m) s.t. either n ≥ n0 or m ≥ m0.

42 Useful logarithm properties
c^(lg(n)) = n^(lg(c))
Proof: apply lg to both sides and you get two equal terms:
lg(c^(lg(n))) = lg(n) * lg(c) and lg(n^(lg(c))) = lg(c) * lg(n)
Can we also say that c^n = n^c? NO!
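A quick numeric check (not from the slide): take c = 4 and n = 8, so lg 8 = 3 and lg 4 = 2, and both sides agree:

4^{\lg 8} = 4^{3} = 64, \qquad 8^{\lg 4} = 8^{2} = 64.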

43 Summary
Definitions
Properties: transitivity, reflexivity, …
Using limits
Big-Oh hierarchy
Substitution
Example problems
Asymptotic notation for two parameters
c^(lg(n)) = n^(lg(c)) (note the lg in the exponent); c^n ≠ n^c

