
Big O Notation.




1 Big O Notation

2 Big O notation – Calculating Algorithm Efficiency
Big O notation is the language we use to describe how long an algorithm takes to run, and how that time grows with the size of the input. It is how we compare the efficiency of different approaches to a problem. Imagine you have a list of 10 objects and you want to sort them in order. There is a whole range of algorithms you can use to make that happen, but not all algorithms are created equal. Big O is simply a common way of comparing algorithms and how long they will take to run.

3 Big O notation
A simplified analysis of algorithm efficiency. It considers:
Complexity in terms of input size, N
Machine independence (it doesn't consider the machine's hardware)
Basic computer steps
Time & space (working storage)
Originally used by the German number theorist Paul Bachmann. O means "order of" (German: "Ordnung von"). n is the input size, in units of bits needed to represent the input.

4 General Rules
Ignore constants (consider long-term growth only): 5n becomes O(n).
Ignore lower-order terms: they are dominated by the higher-order ones.
Growth rates from fastest to slowest: O(1) < O(log n) < O(n) < O(n log n) < O(n²) < O(2ⁿ) < O(n!)
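
The rules above can be sketched in Java (the language the slides' other examples use). This is a minimal illustration, not from the slides: a hypothetical routine that performs 5n + 3 basic steps is still O(n), because the constant factor 5 and the lower-order term 3 are dropped.

```java
// Minimal sketch: counting the basic steps of a hypothetical 5n + 3 routine.
// The exact step counts are illustrative assumptions; the point is that
// doubling n roughly doubles the total, so growth is linear: O(n).
public class DropConstants {
    static long steps(int n) {
        long count = 3;                 // 3 setup steps (lower-order term)
        for (int i = 0; i < n; i++) {
            count += 5;                 // 5 steps per element (constant factor)
        }
        return count;
    }

    public static void main(String[] args) {
        System.out.println(steps(1000)); // 5003
        System.out.println(steps(2000)); // 10003 - roughly double, i.e. linear
    }
}
```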

5 Constant Time Algorithms – O(1)
Constant time means that the number of operations the algorithm needs to perform to complete a given task is independent of the input size. In Big O notation we write this as O(1). No matter how much data you add, the amount of time taken to perform the task will not change. This is the quickest class of algorithm.
[Graph: time taken against input size, showing how running time increases for each complexity class.]

6 Constant time O(1) "Big oh of one"
X = 10 + (5 * 2);
The input size is irrelevant here: it does not affect how long this code takes to run. So it's O(1).

7 Constant time O(1) "Big oh of one"
X = 10 + (5 * 2);          // O(1)
Y = 20 - 2;                // O(1)
System.out.print(X + Y);   // O(1)
Total time = O(1) + O(1) + O(1) = 3 * O(1)
Because we drop constants, we would call this code O(1), "big oh of one".

8 Constant Time Algorithms – O(1) Example
Consider these two bits of code:

int n = 5000;
System.out.println("Hey - your input is: " + n);

int n = 5000;
System.out.println("Your input is: " + n);
System.out.println("Good stuff with: " + n);
System.out.println("And more: " + n);

It doesn't matter what n is above: each piece of code takes a constant amount of time to run. Neither is dependent on the size of n.

9 Logarithmic Time Algorithms – O(log n)
Logarithmic time means that an algorithm's running time is proportional to the logarithm of the input size. In Big O notation this is usually written O(log n). It is the next fastest class after constant time. Code that uses a divide-and-conquer strategy tends to be logarithmic; the binary search algorithm is O(log n).
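
The slide names binary search as the classic O(log n) algorithm; a minimal Java sketch of it follows. Each step discards half of the remaining range, so a sorted array of n elements needs at most about log₂(n) + 1 comparisons.

```java
// Minimal binary search sketch: O(log n) because every iteration
// halves the search range in a sorted array.
public class BinarySearchDemo {
    static int binarySearch(int[] sorted, int target) {
        int lo = 0, hi = sorted.length - 1;
        while (lo <= hi) {
            int mid = lo + (hi - lo) / 2;            // midpoint, overflow-safe
            if (sorted[mid] == target) return mid;   // found it
            if (sorted[mid] < target) lo = mid + 1;  // discard the left half
            else hi = mid - 1;                       // discard the right half
        }
        return -1;  // not found
    }

    public static void main(String[] args) {
        int[] data = {1, 3, 5, 7, 9, 11, 13, 15};
        System.out.println(binarySearch(data, 7));   // 3
        System.out.println(binarySearch(data, 4));   // -1
    }
}
```

For the 8-element array above, at most log₂(8) = 3 halvings are needed before the range is empty.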

10 Logarithmic Time Algorithms – O(log n) - Example
Logarithm - basic principle: how many 2s do we multiply together to get 8? 2 x 2 x 2 = 8, so the logarithm is 3. We would write this as log2(8) = 3.
What is important here is that the running time grows in proportion to the logarithm of the input (in this case, log to the base 2):

int n = 8;
for (int i = 1; i < n; i = i * 2) {
    System.out.println("Hey - I'm busy looking at: " + i);
}

As n is 8, the output will be the following:

Hey - I'm busy looking at: 1
Hey - I'm busy looking at: 2
Hey - I'm busy looking at: 4

"Log base two of eight is three": our simple algorithm ran log2(8) = 3 times. Because we drop constants, we would call this algorithm O(log n). Any algorithm that discards a fixed proportion of the input data at each step can be considered O(log n). Here n is the input value, e.g. the number of elements in the array.

11 Linear Time Algorithms – O(n)
The next quickest class of algorithm after logarithmic time. Linear time means the running time grows directly in proportion to the size of the input: double the input and you roughly double the running time.

12 Linear time O(n) example
This loop prints the numbers 0 to n:

Y = 5 + (20 * 20)      // O(1)
For x in range(0, n):  // O(n)
    Print x

Total time = O(1) + O(n) = O(n). Because linear time O(n) takes so much longer than constant time O(1), it dominates, and the constant term effectively becomes irrelevant.

13 Linear Time Algorithms – O(n) - Example
1 2 3 for (int i = 0; i < n; i++) {     System.out.println("Hey - I'm busy looking at: " + i); } This code runs N times. We don’t know exactly how long it will take for this to run – and we don’t worry about that. 1 2 3 4 5 for (int i = 0; i < n; i++) {     System.out.println("Hey - I'm busy looking at: " + i);     System.out.println("Hmm.. Let's have another look at: " + i);     System.out.println("And another: " + i); } In this second code the runtime would still be linear in the size of its input, n. We denote linear algorithms as follows: O(n).

14 Polynomial Time Algorithms – O(nᵖ)
Quadratic time, O(n²), is the most common polynomial case. Polynomial algorithms are normally produced by nested for loops. If your array has 10 items and your algorithm runs in quadratic time, you are going to be doing 10 * 10 = 100 steps.

15 Polynomial Time Algorithms – O(np) - Example
For x in range (O, n); //O(N) Print x; For x in range(0, n); //O(N2) For y in range (0, n); Print x * y; // O(1) Three nested for loops? O(N3) Four? O(N4) and so on

16 Polynomial Time Algorithms – O(nᵖ) - Example
Polynomial is a general term which covers quadratic (n²), cubic (n³), quartic (n⁴), etc. functions.

for (int i = 1; i <= n; i++) {
    for (int j = 1; j <= n; j++) {
        System.out.println("Hey - I'm busy looking at: " + i + " and " + j);
    }
}

Assuming n = 8, this algorithm will run 8² = 64 times. It is an O(n²) algorithm because it has two nested loops over n. If we were to nest another for loop, it would become an O(n³) algorithm. O(n²) is faster than O(n³), which is faster than O(n⁴), etc.

17 Bubble Sort - Polynomial Time Algorithm
Polynomial complexity: the more data you have, the more checks the algorithm has to make and the more time it takes to run. Assuming the data in the array is not already sorted, bubble sort is considered polynomial, O(n²), because it has two nested loops. If the data is already sorted, the complexity drops to linear, O(n), because the outer loop only has to run once.

Enter length of array. (int) 100
Array contains '99' items. I performed '4950' checks.
Execution time: '0' minutes '0' seconds '0' milliseconds.

Enter length of array. (int)
Array contains '59999' items. I performed ' ' checks.
Execution time: '0' minutes '14' seconds '344' milliseconds.

Note the big difference in execution time based on array length: bubble sort is not very efficient for larger data arrays, and its efficiency decreases dramatically as they grow.
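
The slide describes bubble sort's behaviour without showing code; here is one minimal Java sketch of it, with a comparison counter standing in for the slide's "checks" (the counter is my addition, not part of the slide's program). The nested loops give O(n²) comparisons in general; the early-exit flag makes an already-sorted array O(n).

```java
// Minimal bubble sort sketch with a comparison counter ("checks").
// Worst case (e.g. reversed input): n(n-1)/2 comparisons -> O(n²).
// Best case (already sorted): the first pass sees no swaps and we
// stop after n-1 comparisons -> O(n).
public class BubbleSortDemo {
    static long sortAndCount(int[] a) {
        long checks = 0;
        for (int pass = 0; pass < a.length - 1; pass++) {
            boolean swapped = false;
            for (int i = 0; i < a.length - 1 - pass; i++) {
                checks++;                          // one comparison ("check")
                if (a[i] > a[i + 1]) {             // out of order: swap
                    int tmp = a[i];
                    a[i] = a[i + 1];
                    a[i + 1] = tmp;
                    swapped = true;
                }
            }
            if (!swapped) break;                   // already sorted: stop early
        }
        return checks;
    }

    public static void main(String[] args) {
        System.out.println(sortAndCount(new int[]{5, 4, 3, 2, 1})); // 10 = 5*4/2
        System.out.println(sortAndCount(new int[]{1, 2, 3, 4, 5})); // 4 = n-1
    }
}
```

With a 100-element array the worst case is 100 * 99 / 2 = 4950 checks, matching the figure on the slide.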

18 Cost of running time, acknowledging hardware performance as a factor
Big O notation does not consider the performance of hardware. Processor speed, memory (usually RAM), bandwidth, latency, storage space and power all affect how fast an algorithm runs in practice. The efficiency of the algorithm of course still plays a part, but we can't ignore raw hardware performance.

19 Big O shows how an algorithm's running time will scale as its input grows, from small inputs to big ones.

