
Slide 1: Introduction to Computer Science 2 - Asymptotic Complexity
©2008 DEEDS Group - TU Darmstadt, SS 08
Prof. Neeraj Suri, Constantin Sarbu, Brahim Ayari, Dan Dobre, Abdelmajid Khelil

Slide 2: Remember: Sequential Search

Given: an array A of integers and a constant c.
Question: is c in A?

```java
boolean contains(int[] A, int c) {
    int n = A.length;
    boolean found = false;
    for (int i = 0; i < n; i++)
        if (A[i] == c)
            found = true;
    return found;
}
```

Time complexity (counting operations):

Input A             | c | n | Assignments | Comparisons | Array accesses | Increments
[1,4,2,7]           | 6 | 4 | 1+1+1+0     | 4+4         | 4              | 4
[2,7,6,1]           | 2 | 4 | 1+1+1+1     | 4+4         | 4              | 4
[2,1,8,4,19,7,16,3] | 5 | 8 | 1+1+1+0     | 8+8         | 8              | 8
[4,4,4,4,4,4]       | 4 | 6 | 1+1+1+6     | 6+6         | 6              | 6

Memory complexity (in Java): an int takes 4 bytes, a boolean 1 byte.
Memory used: size(A) + size(c) + size(n) + size(i) + size(found) = n*4 + 13 bytes
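As a sketch, the operation counts in the table can be reproduced by instrumenting the search with counters. The class name and counter fields below are illustrative additions, not part of the slide; following the table's convention, the loop condition i < n is counted once per iteration.

```java
// Instrumented sequential search: reproduces the operation counts from
// the table above. Counter fields are additions for illustration only.
public class CountingSearch {
    static int assignments, comparisons, arrayAccesses, increments;

    static boolean contains(int[] A, int c) {
        assignments = comparisons = arrayAccesses = increments = 0;
        int n = A.length;      assignments++;   // n = A.length
        boolean found = false; assignments++;   // found = false
        assignments++;                          // i = 0
        for (int i = 0; i < n; i++) {
            comparisons++;                      // i < n (counted per iteration)
            arrayAccesses++;                    // A[i]
            comparisons++;                      // A[i] == c
            if (A[i] == c) { found = true; assignments++; }
            increments++;                       // i++
        }
        return found;
    }

    public static void main(String[] args) {
        contains(new int[]{2, 7, 6, 1}, 2);  // second row of the table
        System.out.println("assignments=" + assignments      // 4  (1+1+1+1)
                + " comparisons=" + comparisons              // 8  (4+4)
                + " arrayAccesses=" + arrayAccesses          // 4
                + " increments=" + increments);              // 4
    }
}
```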

Slide 3: Why Asymptotic Complexity?

Time complexity:
- gives a simple characterization of an algorithm's efficiency
- allows comparison with alternative algorithms

In the last lecture we determined exact running times, but the extra precision is usually not worth the effort of computing it. For large input sizes, constants and lower-order terms do not matter. This means we study the asymptotic complexity of algorithms: we are interested in how the running time grows with the size of the input, in the limit.

Note: an asymptotically more efficient algorithm is not always the best choice for very small inputs.

Slide 4: Today: Efficiency Metrics - Complexity

Upper bounds: O ("big O") notation
- properties, proving f ∈ O(g), sum and product rules
- loops, conditional statements, conditions, procedures
- examples: sequential search, selection sort
Lower bounds: Ω (Omega) notation
Exact bounds: Θ (Theta) notation

Slide 5: Asymptotic Time Complexity: Upper Bound

[Figure: plot of T(n) against n, showing the curves c·g(n) and f(n); beyond n0, c·g(n) lies above f(n).]
Condition: ∃ c > 0, ∃ n0, ∀ n > n0: c·g(n) ≥ f(n)

Slide 6: O-Notation (pronounced "big Oh")

Given f: N → R+ and g: N → R+.
Definition: O(g) = { f | ∃ n0 ∈ N, ∃ c ∈ R, c > 0: ∀ n ≥ n0: f(n) ≤ c·g(n) }
Intuitively: O(g) is the set of all functions f that grow at most as fast as g.
One says: "if f ∈ O(g), then g is an asymptotic upper bound for f."
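The definition can be checked mechanically for concrete witnesses c and n0 over a finite range; this only illustrates membership, it does not prove it. The helper name witnessHolds is an assumption for this sketch, not from the slides.

```java
import java.util.function.LongUnaryOperator;

// Finite-range check of the O-notation definition: does f(n) <= c*g(n)
// hold for every n from n0 up to some limit? (Illustration only; a real
// proof must cover all n >= n0.)
public class ONotationWitness {
    static boolean witnessHolds(LongUnaryOperator f, LongUnaryOperator g,
                                long c, long n0, long upTo) {
        for (long n = n0; n <= upTo; n++)
            if (f.applyAsLong(n) > c * g.applyAsLong(n)) return false;
        return true;
    }

    public static void main(String[] args) {
        // f(n) = n + 3, g(n) = n, witnesses c = 2, n0 = 3
        System.out.println(witnessHolds(n -> n + 3, n -> n, 2, 3, 100000));
        // the same c fails if we try to start at n0 = 1 (e.g. n = 1: 4 > 2)
        System.out.println(witnessHolds(n -> n + 3, n -> n, 2, 1, 100000));
    }
}
```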

Slide 7: Example

O(n^4) contains {…, n, n^2, n log n, n^3, n^4, 3n^4, c·n^3, …}
- n^3 ∈ O(n^4)
- n log n ∈ O(n^4)
- n^4 ∈ O(n^4)
In general: "slower growth ∈ O(faster growth)"

Slide 8: O-Notation

Often one writes f = O(g) as shorthand for f ∈ O(g).
- But f = O(g) is not an equality in the usual sense; it can only be read from left to right!
Normally, for the analysis of algorithms:
- f: N → N and g: N → N,
- since the argument is the size of the input data and the value is the number of elementary operations.
For average-case analysis the set R+ is also used:
- f: N → R+ and g: N → R+

Slide 9: Example O-Notation

T1(n) = n + 3 ∈ O(n), because n + 3 ≤ 2n for all n ≥ 3
T2(n) = 3n + 7 ∈ O(n)
T3(n) = 1000n ∈ O(n)
T4(n) = 695n^2 + 397n + 6148 ∈ O(n^2)

The functions considered are mostly monotonically increasing and ≥ 0.

Criterion for showing f ∈ O(g): if f(n)/g(n) ≤ c for all n ≥ some n0, then f ∈ O(g). In particular, if lim_{n→∞} f(n)/g(n) exists, it provides such a constant c.

Example: lim_{n→∞} T4(n)/n^2 = lim_{n→∞} (695n^2 + 397n + 6148)/n^2 = 695
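The limit can be illustrated numerically: the ratio T4(n)/n^2 settles toward 695 as n grows. The class below is a sketch for this lecture example, not part of the original material.

```java
// Numerical illustration of the limit criterion for
// T4(n) = 695n^2 + 397n + 6148: the ratio T4(n)/n^2 tends to 695.
public class LimitCriterion {
    static double ratio(long n) {
        return (695.0 * n * n + 397.0 * n + 6148.0) / ((double) n * n);
    }

    public static void main(String[] args) {
        for (long n : new long[]{10, 100, 10000, 1000000})
            System.out.println("n = " + n + ", T4(n)/n^2 = " + ratio(n));
        // the lower-order terms 397n and 6148 become negligible,
        // so T4 is in O(n^2) with any constant c > 695
    }
}
```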

Slide 10: Proving that f ∈ O(g)

The proof has two parts:
- finding a closed form
- solving the inequality f(n) ≤ c·g(n) from the definition

Illustration using an example:
- A is an algorithm that sorts a set of numbers in increasing order
- Assumption: A performs f(n) = 3 + 6 + 9 + … + 3n operations
- Proposition: A has complexity O(n^2)
- Closed form: f(n) = 3(1 + 2 + 3 + … + n) = 3n(n+1)/2

Slide 11: Proving that f ∈ O(g)

Task: find a value c for which 3n(n+1)/2 ≤ c·n^2 (for all n beyond some n0).
Try c = 3:
3n(n+1)/2 ≤ 3n^2
n^2 + n ≤ 2n^2
n ≤ n^2
1 ≤ n, which holds for all n ≥ 1. Q.E.D.
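The algebra above is the actual proof; as a sanity check, the inequality can also be spot-checked over a finite range:

```java
// Finite-range sanity check of the bound 3n(n+1)/2 <= 3n^2 proved above.
public class BoundCheck {
    static boolean holds(long n) {
        return 3 * n * (n + 1) / 2 <= 3 * n * n;
    }

    public static void main(String[] args) {
        for (long n = 1; n <= 1000; n++)
            if (!holds(n)) throw new AssertionError("bound fails at n = " + n);
        System.out.println("3n(n+1)/2 <= 3n^2 holds for n = 1..1000");
    }
}
```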

Slide 12: Consequences of the O-Notation

O-notation is a simplification:
- It eliminates constants: O(n) = O(n/2) = O(17n)
- It forms an upper bound, i.e., from f(n) ∈ O(n log_2 n) it follows that f(n) ∈ O(n^2)
- For O-notation the base of logarithms is irrelevant, since log_a n = log_b n / log_b a: changing the base only changes a constant factor
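The constant-factor relation between logarithm bases is easy to observe numerically: the ratio log_2 n / log_10 n is the same for every n.

```java
// The base of a logarithm only contributes a constant factor:
// log_a(n) = log_b(n) / log_b(a), so log_2(n) / log_10(n) is constant.
public class LogBase {
    static double log(double base, double x) {
        return Math.log(x) / Math.log(base);
    }

    public static void main(String[] args) {
        for (double n : new double[]{16, 1024, 1e6})
            System.out.println("n = " + n
                    + ", log2(n)/log10(n) = " + log(2, n) / log(10, n));
        // the ratio is always ln(10)/ln(2) ~ 3.3219, independent of n,
        // which is why O(log n) needs no base
    }
}
```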

Slide 13: Properties of O-Notation

Inclusion relations of the O-notation:
O(1) ⊂ O(log n) ⊂ O(n) ⊂ O(n log n) ⊂ O(n^2) ⊂ O(n^3) ⊂ O(2^n) ⊂ O(10^n) ⊂ …
We try to set the bounds as tight as possible.

Slide 14: Pronunciation

O(1)           constant
O(log n)       logarithmic
O(n)           linear
O(n log n)     "n log n"
O(n^2)         quadratic
O(n^3)         cubic
O(n^k), k ≥ 2  polynomial
O(2^n)         exponential

Slide 15: Calculating the Time Complexity

The time complexity of a program follows from the complexities of its parts.
The complexity of an elementary operation is O(1) (elementary operations: e.g. assignment, comparison, arithmetic operation, array access, …).
A fixed sequence of elementary operations (whose length is independent of the input size n) also has complexity O(1).

Slide 16: Sum and Product Rules

Given the time complexities of two algorithms, T1 ∈ O(f) and T2 ∈ O(g):
Summation rule: for the execution of T1 followed by T2, T1 + T2 ∈ O(max(f, g)).
Product rule: for the nested execution of T1 and T2, T1 · T2 ∈ O(f · g).

Slide 17: Loops in Series

Loops in series (n and m are the problem sizes):

```java
for (int i = 0; i < n; i++) {
    operation;   // placeholder for an O(1) operation
}
for (int j = 0; j < m; j++) {
    operation;
}
```

Complexity: O(n + m) = O(max(n, m)) (sum rule)

Slide 18: Nested Loops

Nested loops (n is the problem size):
- When the number of inner-loop iterations does not depend on the problem size, e.g.:

```java
for (int i = 0; i < n; i++)
    for (int j = 0; j < 17; j++)
        operation;   // placeholder for an O(1) operation
```

Complexity: O(17n) = O(n) (product rule)
- Otherwise:

```java
for (int i = 0; i < n; i++)
    for (int j = 0; j < n; j++)
        operation;
```

Complexity: O(n^2) (product rule)
Example: just reading the data of an n x n matrix is already expensive: O(n^2)!
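The two loop shapes can be compared by counting executions of the dominant operation; with n = 100, the constant inner bound gives 17n = 1700 executions, while the n-dependent bound gives n^2 = 10000.

```java
// Counting the dominant operation for the two nested-loop shapes above:
// a constant inner bound gives 17n executions (O(n)), an n-dependent
// inner bound gives n^2 executions (O(n^2)).
public class NestedLoopCount {
    static long constantInner(int n) {
        long ops = 0;
        for (int i = 0; i < n; i++)
            for (int j = 0; j < 17; j++)
                ops++;               // the "operation"
        return ops;
    }

    static long dependentInner(int n) {
        long ops = 0;
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++)
                ops++;
        return ops;
    }

    public static void main(String[] args) {
        System.out.println(constantInner(100));   // 1700  = 17 * 100
        System.out.println(dependentInner(100));  // 10000 = 100^2
    }
}
```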

Slide 19: Conditional Statement

Conditional statement: if B then T1 else T2
- The cost of evaluating the "if" is constant and therefore negligible
- T(n) = T1(n) or T(n) = T2(n)
- Good (if decidable): assume the longer branch is taken, i.e., use the dominant branch
- An upper-bound estimate is also possible: T(n) ≤ T1(n) + T2(n) ∈ O(g1(n) + g2(n))

Slide 20: Condition Example

Loop with a condition (n is the problem size):

```java
for (int i = 0; i < n; i++) {
    if (i == 0)
        block1;
    else
        block2;
}
```

block1 is executed only once => not relevant
- (as long as T(block1) does not dominate n·T(block2))
block2 is dominant.
Complexity: O(n · T(block2))

Slide 21: Procedure Calls

Procedures are analyzed separately, and their execution times are inserted at each call site.
For recursive procedure calls, a recurrence relation for T(n) must be found.
Once again: find a closed form for the recurrence relation (an example follows shortly).

Slide 22: Analysis of Simple Algorithms

Iterative algorithms (today):
- composed of smaller parts => sum rule
- loops => product rule
Recursive algorithms (next lecture):
- time factors:
  - breaking a problem into several smaller ones
  - solving the sub-problems
  - recursive calls of the method solving the problem
  - combining the solutions of the sub-problems

Slide 23: Example 1: Sequential Search

The cost consists of a part a (outside the loop) and a part b (inside the loop) which is repeated n times:
T(n) = a + bn, hence T(n) ∈ O(n)

```java
boolean contains(int[] A, int c) {
    int n = A.length;              // a: outside the loop
    boolean found = false;
    for (int i = 0; i < n; i++)
        if (A[i] == c)             // b: inside the loop
            found = true;
    return found;
}
```

Slide 24: Example 2: Selection Sort

The outer loop is executed n times, at constant cost b per iteration; the inner loop is executed i times, with i bounded by n, at cost c per iteration.
Cost: n·(b + cn) = bn + cn^2 => O(n^2)

```java
void SelectionSort(int[] A) {
    int n = A.length;
    int MinPosition, temp, i, j;
    for (i = n - 1; i > 0; i--) {        // b: outer loop body
        MinPosition = i;
        for (j = 0; j < i; j++)          // c: inner loop body
            if (A[j] < A[MinPosition])
                MinPosition = j;
        temp = A[i];
        A[i] = A[MinPosition];
        A[MinPosition] = temp;
    }
}
```
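Counting the inner-loop comparisons confirms the quadratic cost: the inner loop runs i times for i = n-1 down to 1, i.e. n(n-1)/2 comparisons in total. Note that with the comparison exactly as on the slide, each pass moves the minimum of the remaining prefix to the end, so the array comes out in decreasing order. The counter field is an illustrative addition.

```java
// Selection sort as on the slide, instrumented with a comparison counter.
// For n elements the inner loop performs (n-1) + (n-2) + ... + 1
// = n(n-1)/2 comparisons, matching the O(n^2) bound.
public class SelectionSortCount {
    static long comparisons;

    static void selectionSort(int[] A) {
        comparisons = 0;
        int n = A.length;
        for (int i = n - 1; i > 0; i--) {
            int minPosition = i;
            for (int j = 0; j < i; j++) {
                comparisons++;
                if (A[j] < A[minPosition]) minPosition = j;
            }
            int temp = A[i];                 // swap the prefix minimum to the end
            A[i] = A[minPosition];
            A[minPosition] = temp;
        }
    }

    public static void main(String[] args) {
        int[] A = {5, 2, 9, 1, 7, 3};
        selectionSort(A);
        System.out.println(java.util.Arrays.toString(A)); // [9, 7, 5, 3, 2, 1]
        System.out.println(comparisons);                  // 6*5/2 = 15
    }
}
```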

Slide 25: Ω (Omega) - Notation

Analogous to O(f) we have:
Ω(g) = { h | ∃ c > 0: ∃ n0 > 0: ∀ n > n0: h(n) ≥ c·g(n) }
Intuitively: Ω(g) is the set of all functions that grow at least as fast as g.
One says: "if f ∈ Ω(g), then g is a lower bound for f."
Note: f ∈ O(g) ⟺ g ∈ Ω(f)

Slide 26: Example: Ω-Notation

[Figure: plot of T(n) against n, showing f(n) lying above c2·g(n) beyond n0.]
There exist c2, n0 > 0 such that f(n) ≥ c2·g(n) for all n > n0, i.e. f(n) = Ω(g(n)).
"g(n) sets a lower bound for f(n)."

Slide 27: Θ (Theta) - Notation

With the sets O(g) and Ω(g) we can define:
Θ(g) = O(g) ∩ Ω(g)
Intuitively: Θ(g) is the set of functions that grow exactly as fast as g.
Meaning: if f ∈ O(g) and f ∈ Ω(g), then f ∈ Θ(g).
In this case one speaks of an exact (tight) bound.
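For the earlier example f(n) = 3n(n+1)/2, the constants c1 = 1 and c2 = 3 (with n0 = 1) sandwich f between c1·n^2 and c2·n^2, so f ∈ Θ(n^2). The finite-range check below merely illustrates this; the c2 direction was proved algebraically on the earlier slide.

```java
// Theta as the intersection of O and Omega: for f(n) = 3n(n+1)/2 and
// g(n) = n^2, the constants c1 = 1 and c2 = 3 give
// c1*g(n) <= f(n) <= c2*g(n) for all n >= 1, so f is in Theta(n^2).
public class ThetaCheck {
    static boolean sandwiched(long n) {
        long f = 3 * n * (n + 1) / 2;
        long g = n * n;
        return g <= f && f <= 3 * g;   // c1 = 1, c2 = 3
    }

    public static void main(String[] args) {
        boolean ok = true;
        for (long n = 1; n <= 10000; n++) ok &= sandwiched(n);
        System.out.println("n^2 <= 3n(n+1)/2 <= 3n^2 for n = 1..10000: " + ok);
    }
}
```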

Slide 28: Example: Θ-Notation

[Figure: plot of T(n) against n, showing f(n) lying between c1·g(n) and c2·g(n) beyond n0.]
There exist c1, c2, n0 > 0 such that c1·g(n) ≤ f(n) ≤ c2·g(n) for all n > n0, i.e. f(n) = Θ(g(n)).
"g(n) sets an exact bound for f(n)."

Slide 29: Non-Asymptotic Execution Time

Algorithms with a higher asymptotic complexity can be more efficient for smaller problem sizes: the asymptotic running time only wins beyond certain values of n, and for smaller input sets the constants do make a difference.

Algorithm | T(n)           | Good for n = …
A1        | 186182 log_2 n | n > 2048
A2        | 1000 n         | 1024 ≤ n ≤ 2048
A3        | 100 n log_2 n  | 59 ≤ n ≤ 1024
A4        | 10 n^2         | 10 ≤ n ≤ 58
A5        | n^3            | n = 10
A6        | 2^n            | 2 ≤ n ≤ 9
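A small sketch (not from the slides) that evaluates the six running-time functions from the table and picks the cheapest algorithm for a given n reproduces the crossover pattern:

```java
// Evaluating the running-time functions from the table above at a few
// input sizes: the best algorithm changes with n, even though A1 has
// the best asymptotic complexity.
public class Crossover {
    static double[] costs(double n) {
        double log2n = Math.log(n) / Math.log(2);
        return new double[] {
            186182 * log2n,        // A1
            1000 * n,              // A2
            100 * n * log2n,       // A3
            10 * n * n,            // A4
            n * n * n,             // A5
            Math.pow(2, n)         // A6
        };
    }

    // returns the algorithm number (1..6) with the smallest T(n)
    static int best(double n) {
        double[] c = costs(n);
        int b = 0;
        for (int i = 1; i < c.length; i++)
            if (c[i] < c[b]) b = i;
        return b + 1;
    }

    public static void main(String[] args) {
        for (double n : new double[]{5, 30, 500, 1500, 4096})
            System.out.println("n = " + n + ": best is A" + best(n));
    }
}
```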

Slide 30: Complexity and Recursion

So far we have only seen iterative algorithms. What about recursive algorithms?
Following week: refreshing recursion.
Then: complexity analysis with recurrence relations.

