Published by Logan Patterson. Modified over 9 years ago.
1 Chapter 2 Program Performance – Part 2
2 Step Counts Instead of accounting only for the time spent on selected operations, the step-count method accounts for the time spent in all parts of the program/function. A program step is loosely defined as a syntactically or semantically meaningful segment of a program whose execution time is independent of the instance characteristics. Examples of single steps: return a+b*c/(a-b)*4; and x = y;
3 Use a global variable to count program steps. For the iterative sum function, count = 2n + 3.
4 Counting steps in a recursive function
t_Rsum(n) = 2, n = 0
t_Rsum(n) = 2 + t_Rsum(n-1), n > 0
          = 2 + 2 + t_Rsum(n-2), n > 1
          ...
          = 2(n+1), n >= 0
5 Matrix Addition
6 Count steps in Matrix Addition: count = 2·rows·cols + 2·rows + 1
7 Using a Step Table: Sum
8 Rsum
9 Matrix Addition
10 Matrix Transpose
template <class T>
void transpose(T** a, int rows)
{
   for (int i = 0; i < rows; i++)
      for (int j = i + 1; j < rows; j++)
         swap(a[i][j], a[j][i]);
}
11 Matrix Transpose
12 Inefficient way to compute the prefix sums b[j] for j = 0, 1, …, n-1. Note: the number of steps per execution (S/E) of sum() varies with its parameters.
13 Steps Per Execution
sum(a, n) requires 2n + 3 steps
sum(a, j + 1) requires 2(j+1) + 3 = 2j + 5 steps
Assignment statement b[j] = sum(…): 2j + 6 steps
Total: Σ_{j=0}^{n-1} (2j + 6) = n(n + 5)
14 Prefix sums
15 Sequential Search - Best case
16 Sequential Search - Worst case
17 Average for successful searches X has equal probability of being any one element of a. Step count if X is a[j]
18 Average for successful searches
19 Insertion in a Sorted Array – Best Case
20 Insertion – Worst Case
21 Insertion - Average The step count for inserting into position j is 2n - 2j + 3. Average count is: (1/(n+1)) · Σ_{j=0}^{n} (2n - 2j + 3) = (n+1)(n+3)/(n+1) = n + 3
22 Asymptotic Notation Objectives of performance evaluation: –Compare the time complexities of two programs that perform the same function –Predict the growth in run time as the instance characteristics change The operation-count and step-count methods are not accurate for either objective –Op count: counts some operations and ignores others –Step count: the definition of a step is inexact
23 Asymptotic Notation Given two programs: –Program A with complexity C₁n² + C₂n –Program B with complexity C₃n Program B is faster than program A for sufficiently large values of n. For small values of n, either could be faster, and it may not matter anyway. There is a break-even point for n beyond which B is always faster than A.
24 Asymptotic Notation Describes the behavior of the space and time complexities of programs for LARGE instance characteristics –To establish a relative order among functions –To compare their relative rates of growth Allows us to make meaningful, though inexact, statements about the complexity of programs
25 Mathematical background T(n) denotes the time or space complexity of a program. Big-Oh: the growth rate of T(n) is <= that of f(n). T(n) = O(f(n)) iff positive constants c and n₀ exist such that T(n) <= c·f(n) for all n >= n₀. f is an upper bound function for T. Example: "Algorithm A is O(n²)" means that, for data sets big enough (n > n₀), algorithm A executes fewer than c·n² steps (c a positive constant).
26 The Idea Example: –1000n is larger than n² for small values of n –n² grows at a faster rate, so n² will eventually be the larger function –Here we have T(n) = 1000n, f(n) = n², n₀ = 1000, and c = 1: T(n) <= c·f(n) for all n >= n₀ –Thus we say that 1000n = O(n²) –Note that we can get a tighter upper bound
27 Example Suppose T(n) = 10n² + 4n + 2. For n >= 2, T(n) <= 10n² + 5n; for n >= 5, T(n) <= 11n². Hence T(n) = O(n²).
28 Big Oh Ratio Theorem T(n) = O(f(n)) iff lim_{n→∞} (T(n)/f(n)) < c for some finite constant c. f(n) dominates T(n).
29 Examples Suppose T(n) = 10n² + 4n + 2. Then T(n)/n² = 10 + 4/n + 2/n², so lim_{n→∞} (T(n)/n²) = 10 and T(n) = O(n²).
30 Common Orders of Magnitude
Function   Name
1          Constant
log n      Logarithmic
log² n     Log-squared
n          Linear
n log n    n log n
n²         Quadratic
n³         Cubic
2ⁿ         Exponential
n!         Factorial
31 Loose Bounds Suppose T(n) = 10n² + 4n + 2. Since 10n² + 4n + 2 <= 11n³ for n >= 2, T(n) = O(n³). This bound is loose; we need the smallest upper bound.
32 Polynomials If T(n) = aₘnᵐ + … + a₁n + a₀, then T(n) = O(nᵐ).
33 Omega Notation – Lower Bound Omega: T(n) = Ω(g(n)) iff positive constants c and n₀ exist such that T(n) >= c·g(n) for all n >= n₀. Establishes a lower bound. E.g.: T(n) = C₁n² + C₂n >= C₁n² for all n >= 1, so T(n) is Ω(n²). Note: T(n) is also Ω(n) and Ω(1). Need to get the largest lower bound.
34 Omega Ratio Theorem T(n) = Ω(f(n)) iff lim_{n→∞} (f(n)/T(n)) <= c for some finite constant c.
35 Lower Bound of Polynomials If T(n) = aₘnᵐ + … + a₁n + a₀, then T(n) = Ω(nᵐ). Example: T(n) = n⁴ + 3500n³ + 400n² + 1 is Ω(n⁴).
36 Theta Notation Theta: when O and Ω meet, we indicate that with Θ notation. Definition: T(n) = Θ(h(n)) iff positive constants c₁, c₂ and n₀ exist such that c₁·h(n) <= T(n) <= c₂·h(n) for all n >= n₀. Equivalently, T(n) = Θ(h(n)) iff T(n) = O(h(n)) and T(n) = Ω(h(n)). E.g. T(n) = 3n + 8: since 3n <= 3n + 8 <= 4n for n >= 8, T(n) = Θ(n). Similarly T(n) = 20·log₂(n) + 8 = Θ(log₂(n)), since 20·log₂(n) <= 20·log₂(n) + 8 <= 21·log₂(n) once log₂(n) >= 8.
37 Theta Notation cntd T(n) = 1000n: T(n) = O(n²) but T(n) != Θ(n²), because T(n) != Ω(n²).
38 Theta of Polynomials If T(n) = aₘnᵐ + … + a₁n + a₀ (aₘ > 0), then T(n) = Θ(nᵐ).
39 Little o Notation Little-Oh: the growth rate of T(n) is < that of p(n). T(n) = o(p(n)) iff T(n) = O(p(n)) and T(n) != Ω(p(n)). Example: T(n) = 1000n is o(n²).
40 Simplifying Rules If f(n) is O(g(n)) and g(n) is O(h(n)), then f(n) is O(h(n)). If f(n) is O(k·g(n)) for any constant k > 0, then f(n) is O(g(n)). If f₁(n) = O(g₁(n)) and f₂(n) = O(g₂(n)), then (a) (f₁ + f₂)(n) = O(max(g₁(n), g₂(n))), (b) f₁(n) · f₂(n) = O(g₁(n) · g₂(n)).
41 Some Points DO NOT include constants or low-order terms inside a Big-Oh. For example: –T(n) = O(2n 2 ) or –T(n) = O(n 2 + n) are the same as: –T(n) = O(n 2 )
42 Examples Example 1: a = b; This assignment takes constant time, so it is Θ(1). Example 2: sum = 0; for (i = 0; i <= n; i++) sum += n; The time complexity is Θ(n).
43 Examples CNTD
a = 0;
for (i = 1; i <= n; i++)
   for (j = 1; j <= n; j++)
      a++;
The time complexity is Θ(n²).
44 Examples CNTD
a = 0;
for (i = 1; i <= n; i++)
   for (j = 1; j <= i; j++)
      a++;
The a++ statement executes n(n+1)/2 times; the time complexity is Θ(n²).
45 Examples CNTD
a = 0;                       // Θ(1)
for (i = 1; i <= n; i++)     // the two nested loops: Θ(n²)
   for (j = 1; j <= i; j++)
      a++;
for (k = 1; k <= n; k++)     // Θ(n)
   A[k] = k - 1;
The time complexity is Θ(n²).
46 Examples CNTD Not all doubly nested loops execute n² times:
a = 0;
for (i = 1; i <= n; i++)
   for (j = 1; j <= n; j *= 2)
      a++;
The inner loop executes about log₂(n) times (exactly ⌊log₂ n⌋ + 1), the outer loop executes n times, so the time complexity is Θ(n log₂ n).
47 Useful asymptotic identities
48 Inference rules
49 First determine the asymptotic complexity of each statement and then add up
50 Asymptotic complexity of Rsum
51 Asymptotic complexity of Matrix Addition
52 Asymptotic complexity of Transpose
53 Asymptotic complexity of Inef
54 Asymptotic complexity of Sequential Search
55 Binary Search Worst-case complexity is Θ(log n)
56 Performance Measurement Chapter 2 Section 6
57 Run time on a pseudo machine
58 Conclusions The utility of a program with exponential complexity is limited to small n (typically n <= 40). Programs whose complexity is a high-degree polynomial are also of limited utility. Linear complexity is desirable in practice.
59 Performance Measurement Obtain the actual space and time requirements of a program: –Choosing instance size –Developing the test data: data that exhibit the best-, worst-, and average-case time complexity (using randomly generated data) –Setting up the experiment: write a program that measures the desired run times
60 Measuring the performance of Insertion Sort Program
61 Measuring the performance of Insertion Sort Program (continued)
62 Experimental results - Insertion Sort
63 Measuring with repeated runs
64 Do without overhead
65 Do without overhead (continued)
66 Overhead
67 End of Chapter 2