Problem of the Day
I am thinking of a question and propose 3 possible answers. Exactly one of the following is the solution. Which is it?
A. Answer 1
B. Answers 1 or 2
C. Answer 2
D. Answers 2 or 3
Problem of the Day (solution)
If answer 1 or answer 2 were the solution, more than one of the choices above would be correct, so we could not select exactly one. Therefore answer 3, and with it choice D, must be right.
CSC 212 – Data Structures
Analysis Techniques
Running time is critical, but comparing raw times is impossible in many cases
A single problem may have many possible solutions
Each solution may have many possible implementations
Pseudo-Code
Written only for human eyes
Unimportant & implementation-specific details are ignored
Serves a very real purpose, even though it is not a real language
Useful for tasks like outlining, designing, & analyzing
Describes the system in a language-like manner, though not formally
Pseudo-Code
Only needs to include the details needed for tracing: loops, assignments, calls to methods, etc.
Include anything that would be helpful in analyzing the algorithm
Feel free to ignore punctuation & other formalisms
Understanding & analyzing the algorithm are the only goals of using pseudo-code
Pseudo-Code Example
Algorithm factorial(int n)
    returnVariable = 1
    while (n > 0)
        returnVariable = returnVariable * n
        n = n - 1
    endwhile
    return returnVariable
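As a rough, runnable counterpart to the pseudo-code above (a sketch, not part of the original slides; the class name, method name, and use of long are assumptions for illustration):

    public class FactorialExample {
        // Iterative factorial, mirroring the pseudo-code line by line
        static long factorial(int n) {
            long returnVariable = 1;                 // returnVariable = 1
            while (n > 0) {                          // while (n > 0)
                returnVariable = returnVariable * n; // returnVariable = returnVariable * n
                n = n - 1;                           // n = n - 1
            }                                        // endwhile
            return returnVariable;                   // return returnVariable
        }

        public static void main(String[] args) {
            System.out.println(factorial(5));        // prints 120
        }
    }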
“Anything that can go wrong…”
We express an algorithm’s worst-case complexity
Worst-case analysis of algorithm performance is usually closely correlated with execution time
Not always right to consider only the worst case: there may be situations where the worst case is very rare
Closely related approaches for the other cases come later
“Should I Even Bother?”
Compare algorithms using big-Oh notation
Could also use it to compare implementations
Saves the time of implementing all the algorithms
Biases like CPU speed, typing speed, & cosmic rays are ignored
Algorithmic Analysis
Algorithm Analysis
Execution time with n inputs on a 4 GHz machine:

               | n = 10    | n = 50           | n = 100         | n = 1000         | n = 10^6
O(n log n)     | 9 ns      | 50 ns            | 175 ns          | 2500 ns          | 5 ms
O(n^2)         | 25 ns     | 625 ns           | 2250 ns         | 250000 ns        | 4 min
O(n^5)         | 25000 ns  | 72.5 ms          | 2.7 s           | 2.9 days         | 1 x 10^13 yrs
O(2^n)         | 2500 ns   | 3.25 days        | 1 x 10^14 yrs   | 1 x 10^285 yrs   | Too long!
O(n!)          | 1 ms      | 1.4 x 10^58 yrs  | 7 x 10^141 yrs  | Too long!        | Too long!
Big-Oh Notation
Want results for large data sets; nobody cares about a 2-minute-long program
Limit considerations to only the major details
Ignore multipliers: O(5n) = O(2n) = O(n)
  Multipliers are usually implementation-specific
  Who cares about 20 ms after waiting 4 minutes?
Ignore lesser terms: O(n^5 + n^2) = O(n^5)
  Who cannot tolerate an extra 17 minutes after waiting 3 x 10^13 years?
What is n?
Big-Oh analysis is always relative to the input size, but determining the input size is not always clear
Quick rules of thumb (consider what the algorithm is processing):
  Analyzing values below x: n = x
  Analyzing data in an array: n = size of the array
  Analyzing a linked list: n = size of the linked list
  Analyzing 2 arrays: n = sum of the array sizes
Analyzing an Algorithm
Big-Oh counts the primitive operations executed:
  Assignments
  Calling a method
  Performing an arithmetic operation
  Comparing two values
  Getting an entry from an array
  Following a reference
  Returning a value from a method
  Accessing a field
Primitive Statements
The basis of programming; each takes constant time: O(1)
O(1) is the fastest possible big-Oh notation
A fixed sequence of primitive statements also takes O(1) time, but only if the input does not affect the sequence
Ignore the constant multiplier: O(5) = O(5 * 1) = O(1)
Simple Loops
for (int i = 0; i < n.length; i++) { }
-or-
while (i < n) { i++; }
Each loop executes n times
Only primitive statements appear within the body of the loop
Big-Oh complexity of a single loop iteration: O(1)
Either loop runs O(n) iterations
So the loop has O(n) * O(1) = O(n) complexity total
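A small sketch (not from the slides; the variable names and the choice of an int input size are assumptions) that counts how many times a simple loop body runs, confirming it is exactly n iterations:

    public class SimpleLoopCount {
        public static void main(String[] args) {
            int n = 1000;                   // hypothetical input size
            int iterations = 0;
            for (int i = 0; i < n; i++) {   // loop variable grows by a constant: O(n) iterations
                iterations++;               // O(1) work per iteration
            }
            System.out.println(iterations); // prints 1000, i.e. exactly n
        }
    }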
Loops In a Row
for (int i = 0; i < n.length; i++) { }
int i = 0;
while (i < n) { i++; }
Add the complexities of sequential code to compute the total
For this example, the total big-Oh complexity is:
  O(n) + O(1) + O(n)
  = O(2 * n + 1)
  = O(n)
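One way to see the addition rule concretely (an illustrative sketch, not from the slides; the names and input size are assumptions): count the steps taken by two loops in a row plus one assignment.

    public class LoopsInARow {
        public static void main(String[] args) {
            int n = 1000;                 // hypothetical input size
            long steps = 0;

            for (int i = 0; i < n; i++) { // first loop: n iterations -> O(n)
                steps++;
            }
            int i = 0;                    // one assignment -> O(1)
            steps++;
            while (i < n) {               // second loop: n iterations -> O(n)
                i++;
                steps++;
            }

            System.out.println(steps);    // 2 * n + 1 = 2001 steps, still O(n)
        }
    }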
More Complicated Loops
for (int i = 0; i < n; i += 2) { }
i takes the values 0, 2, 4, 6, ..., n
In the above example, the loop executes n / 2 iterations
Each iteration takes O(1) time, so the total complexity is:
  O(n / 2) * O(1) = O(n * 1/2 * 1) = O(n)
Really Complicated Loops
for (int i = 1; i < n; i *= 2) { }
i takes the values 1, 2, 4, 8, ..., n
In the above code, the loop executes log₂ n iterations
Each iteration takes O(1) time, so the total complexity is:
  O(log₂ n) * O(1) = O(log₂ n * 1) = O(log₂ n)
Really Complicated Loops
for (int i = 1; i < n; i *= 3) { }
i takes the values 1, 3, 9, 27, ..., n
In the above code, the loop executes log₃ n iterations
Each iteration takes O(1) time, so the total complexity is:
  O(log₃ n) * O(1) = O(log₃ n * 1) = O(log₃ n)
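A quick empirical check of the iteration counts (a sketch, not from the slides; the input size n = 10^6 is an assumption): count how many passes each multiplicative loop makes.

    public class LogLoopCount {
        public static void main(String[] args) {
            int n = 1_000_000;                  // hypothetical input size

            int doublings = 0;
            for (int i = 1; i < n; i *= 2) {    // i: 1, 2, 4, 8, ...
                doublings++;
            }
            int triplings = 0;
            for (int i = 1; i < n; i *= 3) {    // i: 1, 3, 9, 27, ...
                triplings++;
            }

            // For n = 10^6 this prints 20 and 13, roughly log2(n) and log3(n)
            System.out.println(doublings + " " + triplings);
        }
    }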
Math Moment
All logarithms are related, no matter the base
Changing the base only multiplies the answer by a constant
But big-Oh notation ignores constant multiples
So we can consider all O(log n) solutions identical
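For reference, the standard change-of-base identity behind this point (not shown on the original slide), written in LaTeX:

\[
\log_b n = \frac{\log_2 n}{\log_2 b}
\quad\Longrightarrow\quad
O(\log_b n) = O\!\left(\tfrac{1}{\log_2 b} \cdot \log_2 n\right) = O(\log_2 n) = O(\log n)
\]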
Nested Loops
for (int i = 0; i < n; i++) {
    for (int j = 0; j < n; j++) { }
}
The program executes the outer loop n times
The inner loop runs n times during each iteration of the outer loop
O(n) iterations doing O(n) work each iteration
So the loop has O(n) * O(n) = O(n^2) complexity total
Loop complexities multiply when nested
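A sketch confirming the multiplication rule (illustrative only; the names and input size are assumptions): count how many times the inner-loop body of the nested loops above executes.

    public class NestedLoopCount {
        public static void main(String[] args) {
            int n = 1000;                        // hypothetical input size
            long innerRuns = 0;

            for (int i = 0; i < n; i++) {        // outer loop: n iterations
                for (int j = 0; j < n; j++) {    // inner loop: n iterations per outer pass
                    innerRuns++;
                }
            }

            System.out.println(innerRuns);       // 1000000 = n * n, i.e. O(n^2)
        }
    }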
Only care about approximate behavior on huge data sets: ignore constant multiples & drop lesser terms
  (n! > 2^n > n^5 > n^2 > n > log n > 1)
O(1) time for primitive statements to execute
O(n) time if the loop variable changes by a constant amount each iteration
O(log n) time if the loop variable is multiplied by a constant each iteration (it does not matter what the constant is)
When code is sequential, add the complexities
When code is nested, multiply the complexities
Your Turn
Get into your groups and complete the activity
For Next Lecture
Read GT4.3 for class on Wednesday: how do we prove big-Oh complexities?
Week #7 weekly assignment is available now
Angel also has programming assignment #1, which pulls everything together and shows off your stuff