(Complexity) Analysis of Algorithms
An algorithm takes an input and produces an output.
How long does this take to open? 1) you know the combination, 2) you don't know it.
If you know the combination: O(n), where n is the number of rings; if the alphabet has size m, O(nm).
If you don't know the combination: O(2^n), or O(2^(nm)) with an alphabet of size m.
With a combination lock the idea is: if you know the combination, you can open it fast. If you do not know the combination, you can still open it, but it will take you a very long time.
Knapsack and Travelling Salesman problems
Knapsack: maximize the value without exceeding the weight limit.
Travelling Salesman: minimize the route length while visiting every city.
Knapsack and Travelling Salesman problems
Knapsack (maximize the value without exceeding the weight limit): O(2^n).
Travelling Salesman (minimize the route length while visiting every city): O(n!), where n! = n·(n-1)·…·2·1.
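To make the O(2^n) for knapsack concrete, here is a minimal brute-force sketch in Java; the method name bestValue and its signature are illustrative, not from the slides. Every item is either skipped or taken, so the recursion explores up to 2^n subsets.

    // Brute-force 0/1 knapsack: every item is either skipped or taken,
    // so the recursion tree has up to 2^n leaves.
    static int bestValue(int[] weight, int[] value, int capacity, int i) {
        if (i == weight.length) return 0;                      // no items left
        int skip = bestValue(weight, value, capacity, i + 1);  // leave item i
        int take = 0;
        if (weight[i] <= capacity) {                           // item i fits: take it
            take = value[i] + bestValue(weight, value, capacity - weight[i], i + 1);
        }
        return Math.max(skip, take);
    }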
These are so-called combinatorial problems. We do not know a fast way to solve them; we essentially have to go through all possible combinations.
Measuring Complexity
We can measure the runtime of an algorithm with a clock, but this is unreliable: different machines and software, and different inputs (sorted, random, reverse), give different times.
We can instead theoretically analyse the runtime of an algorithm by counting how many primitive instructions are executed. This is a reflection of the actual runtime.
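As an illustration (a sketch, not from the slides; the loop body is an arbitrary example), the same loop can be both timed with the clock and instrumented with a counter. The elapsed time varies between machines and runs, while the operation count does not.

    int n = 1_000_000;
    long start = System.nanoTime();      // wall-clock measurement (unreliable)
    long operations = 0;                 // machine-independent measurement
    long sum = 0;
    for (int i = 0; i < n; i++) {
        sum += i;                        // the "interesting" work
        operations++;                    // count one primitive step
    }
    long elapsedNs = System.nanoTime() - start;
    System.out.println("elapsed ns = " + elapsedNs + "  (changes from run to run)");
    System.out.println("operations = " + operations + "  (always exactly n)");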
Which lines of code are costly?
1. Obviously, some lines in an algorithm are executed more often than others.
2. Often there is one line (or body) of code that is executed repeatedly.
3. This lies at the heart of the algorithm and is responsible for most of the (time) cost.
4. We could probe the code and put a counter at each line to see how many times it is executed.
For loop analysis
    for (int i = 0; i < n; i++) {
        // do some interesting stuff in Java
    }
Single for loop with counters
    int n = 10;
    int counter1 = 0, counter2 = 0, counter3 = 0;
    // counter1 counts loop-condition checks that pass, counter2 counts the
    // update step, counter3 counts executions of the loop body.
    for (int i = 0; (i < n) && (counter1++ >= 0); i++, counter2++) {
        counter3++;
        // do some interesting stuff in Java
    }
    System.out.println("counter1 = " + counter1);
    System.out.println("counter2 = " + counter2);
    System.out.println("counter3 = " + counter3);
What is the complexity, O(?)
counter1 = 10, counter2 = 10, counter3 = 10
The loop body executes n times, so this single loop is O(n).
Nested for loop
    int n = 10;
    int counter1 = 0, counter2 = 0, counter3 = 0, counter5 = 0, counter6 = 0;
    for (int i = 0; (i < n) && (counter1++ >= 0); i++, counter2++) {
        counter3++;
        for (int j = 0; j < n; j++, counter5++) {
            // do some even more interesting stuff in Java
            counter6++;
        }
        // do some interesting stuff in Java
    }
What is the complexity, O(?)
For n = 10, the outer-loop counters reach 10 while the inner-loop counters reach 100.
The inner body executes n × n times, so two nested loops give O(n^2).
Grouping Complexities
1. In order to compare runtimes, it is convenient to have a grouping scheme.
2. Think of runtimes as a function of the LENGTH of the input.
3. How do mathematicians group functions?
Functions
How might you "describe" these functions? What "group" do they belong to?
f(x) = 7
f(x) = log(x + 3)
f(x) = 3x + 5
f(x) = 4x^2 + 15x + 90
f(x) = 10x^3 - 30
f(x) = 2^(3x + 3)
Functions
How might you "describe" these functions? What "group" do they belong to?
f(x) = 7                  Constant
f(x) = log(x + 3)         Logarithmic
f(x) = 3x + 5             Linear
f(x) = 4x^2 + 15x + 90    Quadratic
f(x) = 10x^3 - 30         Cubic
f(x) = 2^(3x + 3)         Exponential
Big-O Notation
1. Instead of using terms like linear, quadratic, and cubic, computer scientists use big-O notation to discuss runtimes.
2. A function with linear runtime is said to be of order n.
3. The shorthand looks like this: O(n)
Functions
If the following are runtimes expressed as functions of the number of inputs (x), we would label them as follows:
1. f(x) = 7
2. f(x) = log(x + 3)
3. f(x) = 3x + 5
4. f(x) = 4x^2 + 15x + 90
5. f(x) = 10x^3 - 30
6. f(x) = 2^(3x + 3)
Functions
If the following are runtimes expressed as functions of the number of inputs (x), we would label them as follows:
1. f(x) = 7                  O(1)
2. f(x) = log(x + 3)         O(log n)
3. f(x) = 3x + 5             O(n)
4. f(x) = 4x^2 + 15x + 90    O(n^2)
5. f(x) = 10x^3 - 30         O(n^3)
6. f(x) = 2^(3x + 3)         O(2^n)
The common Big-O values
1. O(1): constant
2. O(log n): logarithmic
3. O(n): linear
4. O(n log n): n log n
5. O(n^2): quadratic
6. O(n^3): cubic
7. O(2^n): exponential
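A few one-line Java fragments matching the most common classes (illustrative sketches, not from the slides; an int array a and its size n are assumed to already be defined):

    int x = a[0];                                  // O(1): one array access

    for (int i = 0; i < n; i++) { /* ... */ }      // O(n): n iterations

    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++) { /* ... */ }  // O(n^2): n * n iterations

    for (int i = n; i > 1; i /= 2) { /* ... */ }   // O(log n): i is halved each step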
Merge sort vs. Insertion sort
Comparison of Two Algorithms (slide by Matt Stallmann, included with permission)
Insertion sort takes about n^2 / 4 operations; merge sort takes about 2n log n.
Sorting a million items: insertion sort takes roughly 70 hours, merge sort roughly 40 seconds.
This is a slow machine, but even on one 100 times as fast it is 40 minutes versus less than 0.5 seconds.
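The slide's figures can be reproduced with a quick back-of-the-envelope calculation (a sketch; the assumed speed of about one million operations per second is chosen to match the "slow machine"):

    long n = 1_000_000L;
    double insertionOps = (double) n * n / 4.0;               // about n^2 / 4 operations
    double mergeOps = 2.0 * n * (Math.log(n) / Math.log(2));  // about 2 n log2 n operations
    double opsPerSecond = 1e6;                                // assumed slow machine
    System.out.println("insertion sort: " + insertionOps / opsPerSecond / 3600.0 + " hours");   // ~69 hours
    System.out.println("merge sort:     " + mergeOps / opsPerSecond + " seconds");              // ~40 seconds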
Which number is 'bigger', 12 or 13? It is easy to compare numbers. What about functions?
Which function is 'bigger'? (growth-rate plot; source: Wikipedia)
Constant Factors don't Matter!
The growth rate is not affected by constant factors or lower-order terms.
Examples:
10^2 n + 10^5 is a linear function
10^5 n^2 + 10^8 n is a quadratic function
Big-Oh Notation
Given functions f(n) and g(n), we say that f(n) is O(g(n)) if there are positive constants c and n0 such that f(n) ≤ c·g(n) for n ≥ n0.
Example: 2n + 10 is O(n)
    2n + 10 ≤ cn
    (c - 2) n ≥ 10
    n ≥ 10 / (c - 2)
    Pick c = 3 and n0 = 10
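A tiny Java spot-check of the definition for this example (a sketch, not from the slides): with c = 3 and n0 = 10, the inequality 2n + 10 ≤ 3n should hold for every n ≥ n0.

    int c = 3, n0 = 10;
    boolean holds = true;
    for (int n = n0; n <= 1_000_000; n++) {   // spot-check a range of n >= n0
        if (2 * n + 10 > c * n) {
            holds = false;                    // would be a counterexample
        }
    }
    System.out.println("2n + 10 <= 3n for all checked n >= 10: " + holds);   // prints true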
Big-Oh Example
Example: the function n^2 is not O(n):
    n^2 ≤ cn
    n ≤ c
The above inequality cannot be satisfied, since c must be a constant.
Pseudocode
Example: find the max element of an array.
Algorithm arrayMax(A, n)
    Input: array A of n integers
    Output: maximum element of A
    currentMax ← A[0]
    for i ← 1 to n - 1 do
        if A[i] > currentMax then
            currentMax ← A[i]
    return currentMax
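The pseudocode translates directly into Java; this is a sketch of one possible translation, not code from the slides.

    // Returns the maximum element of A; assumes n >= 1 and A has at least n elements.
    static int arrayMax(int[] A, int n) {
        int currentMax = A[0];
        for (int i = 1; i <= n - 1; i++) {
            if (A[i] > currentMax) {
                currentMax = A[i];
            }
        }
        return currentMax;
    }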
Primitive Operations
1. Basic computations performed by an algorithm.
2. Identifiable in pseudocode.
3. Largely independent of the programming language.
4. Exact definition not important.
5. Assumed to take a constant amount of time in the RAM model.
Examples: evaluating an expression, assigning a value to a variable, indexing into an array, calling a method, returning from a method.
Counting Primitive Operations
By inspecting the pseudocode, we can determine the maximum number of primitive operations executed by an algorithm, as a function of the input size.
Algorithm arrayMax(A, n)              # operations
    currentMax ← A[0]                 2
    for i ← 1 to n - 1 do             2n
        if A[i] > currentMax then     2(n - 1)
            currentMax ← A[i]         2(n - 1)
        { increment counter i }       2(n - 1)
    return currentMax                 1
Total                                 8n - 2
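The same counting can be done empirically by instrumenting the Java version (a sketch, not from the slides; the bookkeeping below uses a slightly simplified convention, so the constant differs from the slide's total, but the count still grows linearly in n):

    // Counts primitive operations while finding the maximum (worst case:
    // the assignment in the if-branch may run on every iteration).
    static long arrayMaxOpCount(int[] A, int n) {
        long ops = 2;                        // currentMax <- A[0]: index + assign
        int currentMax = A[0];
        for (int i = 1; i <= n - 1; i++) {
            ops += 2;                        // loop test and increment of i
            ops += 2;                        // index A[i] and compare
            if (A[i] > currentMax) {
                currentMax = A[i];
                ops += 2;                    // index A[i] and assign
            }
        }
        ops += 1;                            // return
        return ops;                          // roughly a constant times n
    }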
Computing Prefix Averages
We further illustrate asymptotic analysis with two algorithms for prefix averages.
The i-th prefix average of an array X is the average of the first (i + 1) elements of X:
    A[i] = (X[0] + X[1] + … + X[i]) / (i + 1)
Computing the array A of prefix averages of another array X has applications to financial analysis.
Prefix Averages (Quadratic)
The following algorithm computes prefix averages in quadratic time by applying the definition.
Algorithm prefixAverages1(X, n)       # operations
    Input: array X of n integers
    Output: array A of prefix averages of X
    A ← new array of n integers       n
    for i ← 0 to n - 1 do             n
        s ← X[0]                      n
        for j ← 1 to i do             1 + 2 + … + (n - 1)
            s ← s + X[j]              1 + 2 + … + (n - 1)
        A[i] ← s / (i + 1)            n
    return A                          1
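A direct Java translation of the quadratic version (a sketch; it follows the pseudocode line by line):

    // Quadratic prefix averages: the prefix sum is recomputed from scratch
    // for every position i, giving 1 + 2 + ... + (n - 1) inner steps overall.
    static double[] prefixAverages1(int[] X, int n) {
        double[] A = new double[n];
        for (int i = 0; i < n; i++) {
            double s = X[0];
            for (int j = 1; j <= i; j++) {
                s = s + X[j];
            }
            A[i] = s / (i + 1);
        }
        return A;
    }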
Arithmetic Progression
The running time of prefixAverages1 is O(1 + 2 + … + n).
The sum of the first n integers is n(n + 1) / 2; there is a simple visual proof of this fact.
Thus, algorithm prefixAverages1 runs in O(n^2) time.
Prefix Averages (Linear)
The following algorithm computes prefix averages in linear time by keeping a running sum.
Algorithm prefixAverages2(X, n)       # operations
    Input: array X of n integers
    Output: array A of prefix averages of X
    A ← new array of n integers       n
    s ← 0                             1
    for i ← 0 to n - 1 do             n
        s ← s + X[i]                  n
        A[i] ← s / (i + 1)            n
    return A                          1
Algorithm prefixAverages2 runs in O(n) time.
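And the corresponding Java translation of the linear version (again a sketch following the pseudocode):

    // Linear prefix averages: a single running sum s is reused,
    // so each element of X is added exactly once.
    static double[] prefixAverages2(int[] X, int n) {
        double[] A = new double[n];
        double s = 0;
        for (int i = 0; i < n; i++) {
            s = s + X[i];
            A[i] = s / (i + 1);
        }
        return A;
    }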
General Guidelines
The worst-case instructions determine the worst-case behaviour overall (e.g. a single O(n^2) statement means the whole algorithm is O(n^2)).
Instant recognition: assignments/arithmetic are O(1), a loop is O(n), two nested loops are O(n^2). How about three nested loops?
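For the closing question: three nested loops over n give O(n^3), as in this small sketch (n is assumed to be defined):

    int count = 0;
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
            for (int k = 0; k < n; k++)
                count++;             // executes n * n * n times, so O(n^3)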