1 Efficiency of Algorithms

2 UML "Has-A" Relationship

Node
  - data : Object   - link : Node
  + createNode()  + getData()  + setData()  + getLink()  + setLink()
  + addNodeAfter()  + removeNodeAfter()

LinkedList  (has 0..* Nodes, reached through one head reference)
  - head : Node
  + createLinkedList()  + add(Object)  + add(Object, int)  + clear()
  + get(int) : Node  + isEmpty() : boolean  + size() : int
  + find(Object) : int  + remove(Object)  + remove(int)
  - getReference(int) : Node
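
A minimal Java sketch of this has-a relationship, using the field and method names from the class boxes above (the method bodies are illustrative, not taken from the slides):

    // Has-A: a LinkedList holds a reference to the first of its 0..* Nodes.
    class Node {
        private Object data;   // the value stored in this node
        private Node link;     // the next node in the chain, or null at the end

        public Node(Object data, Node link) { this.data = data; this.link = link; }
        public Object getData()         { return data; }
        public void setData(Object d)   { data = d; }
        public Node getLink()           { return link; }
        public void setLink(Node l)     { link = l; }
    }

    class LinkedList {
        private Node head;     // the has-a link: the list owns a chain of Nodes

        public boolean isEmpty() { return head == null; }

        // add at the front of the list: O(1), no traversal needed
        public void add(Object item) { head = new Node(item, head); }

        // walk the chain to the node at the given position: O(n)
        public Node get(int index) {
            Node current = head;
            for (int i = 0; i < index; i++) current = current.getLink();
            return current;
        }

        // unlink the node at the given position: O(n) because of the traversal
        public void remove(int index) {
            if (index == 0) { head = head.getLink(); return; }
            Node previous = get(index - 1);
            previous.setLink(previous.getLink().getLink());
        }

        public int size() {
            int count = 0;
            for (Node n = head; n != null; n = n.getLink()) count++;
            return count;
        }
    }

The list does not inherit anything from Node; it merely holds references to Node objects, which is what has-a means.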

3 Is-A Relationship

LinkedList
  - head : Node
  + createLinkedList()  + add(Object)  + add(Object, int)  + clear()
  + get(int) : Node  + isEmpty() : boolean  + size() : int
  + find(Object) : int  + remove(Object)  + remove(int)
  - getReference(int) : Node

Stack  (is a LinkedList)
  + createStack()  + push(Object)  + pop() : Object
  + peek() : Object  + isEmpty() : boolean
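
Building on the LinkedList sketch above, the is-a relationship can be expressed with inheritance; push, pop, and peek are written here in terms of the inherited list operations (an illustrative design, not the slides' code):

    // Is-A: a Stack is a LinkedList, so it inherits add, get, remove, isEmpty, ...
    class Stack extends LinkedList {
        // push adds at the front of the inherited list: O(1)
        public void push(Object item) { add(item); }

        // peek returns the top element (the front of the list) without removing it
        public Object peek() {
            if (isEmpty()) throw new java.util.EmptyStackException();
            return get(0).getData();
        }

        // pop removes and returns the top element
        public Object pop() {
            Object top = peek();
            remove(0);
            return top;
        }
    }

Because every Stack is a LinkedList, any code that accepts a LinkedList will also accept a Stack; the reverse is not true.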

4 Car and Truck, each holding its own copy of the shared attributes:

Car
  - make : String  - model : String  - year : int  - vin : String
  - color : int    - tag : String    - cost : double

Truck
  - make : String  - model : String  - year : int  - vin : String
  - tag : String   - cost : double   - netWeight : int  - grossWeight : int

5 With a Vehicle superclass, the shared attributes live in one place:

Vehicle
  - make : String  - model : String  - year : int  - vin : String
  - tag : String   - cost : double

Car (is a Vehicle)
  - color : int

Truck (is a Vehicle)
  - netWeight : int  - grossWeight : int
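
A hedged Java sketch of the refactoring shown on slides 4 and 5 (field names come from the diagrams; constructors and accessors are omitted for brevity):

    // Vehicle holds everything Car and Truck have in common.
    class Vehicle {
        protected String make, model, vin, tag;
        protected int year;
        protected double cost;
    }

    // Car is-a Vehicle, adding only what is specific to cars.
    class Car extends Vehicle {
        protected int color;
    }

    // Truck is-a Vehicle, adding only truck-specific data.
    class Truck extends Vehicle {
        protected int netWeight, grossWeight;
    }

The duplicated attributes from slide 4 now appear exactly once, and both subclasses inherit them.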

6 Efficiency of Algorithms
There are always multiple ways to implement a data structure or algorithm. We can judge the efficiency of each and choose the one that best meets our needs. We can gauge the efficiency of each in two ways:
  ◦ The amount of memory used (space complexity)
  ◦ The amount of time it takes (time complexity)
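
As a concrete illustration of the trade-off (my example, not from the slides), here are two ways to detect a duplicate value in an array: one minimizes memory, the other minimizes time.

    // O(n^2) time, O(1) extra space: compare every pair of elements.
    static boolean hasDuplicateNestedLoops(int[] a) {
        for (int i = 0; i < a.length; i++)
            for (int j = i + 1; j < a.length; j++)
                if (a[i] == a[j]) return true;
        return false;
    }

    // O(n) expected time, O(n) extra space: remember every value seen so far.
    static boolean hasDuplicateWithSet(int[] a) {
        java.util.HashSet<Integer> seen = new java.util.HashSet<>();
        for (int value : a)
            if (!seen.add(value)) return true;   // add() returns false if the value was already present
        return false;
    }

Neither version is simply "better"; the right choice depends on whether memory or time is the scarcer resource.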

7 What to Compare?
You don't compare programs; you compare algorithms.
  ◦ You don't have to worry about the quality of the coding.
  ◦ You don't have to worry about the type of computer or the programming language used.
  ◦ You don't have to worry about whether the data adequately represents the problem.
Analyzing the algorithms themselves avoids all of this.

8 How to Compare?
Counting an algorithm's operations is a way to determine its efficiency. An algorithm's time requirements can be measured as a function of the problem size. An algorithm's growth rate enables comparison of one algorithm to another. Typically, we only worry about efficiency when dealing with large problems.

9 Algorithm Growth Rates

10 Three Comparisons
Best Case – When searching a list for a value, it is in the first node we look at. When sorting, the list is already in sorted order.
Average Case – Difficult to determine.
Worst Case – Normally used for all comparisons. The number of operations can easily be determined.
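
A linear search over an array makes the cases concrete (an illustrative method, not code from the slides): the best case finds the target immediately, the worst case examines every element.

    // Returns the index of target in a, or -1 if it is absent.
    static int find(int[] a, int target) {
        for (int i = 0; i < a.length; i++) {
            if (a[i] == target)
                return i;      // best case: target is at index 0, one comparison, O(1)
        }
        return -1;             // worst case: all n elements are compared, O(n)
    }

The average case depends on where the target tends to be, which is exactly why it is hard to pin down.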

11 Big O Notation
This "order of magnitude" analysis is expressed in Big O notation.
  ◦ Uses the upper-case O to specify an algorithm's order.
  ◦ Example: O(f(n)) or O(n)
Algorithm A is said to be of order f(n) if constants k and n₀ exist such that A requires no more than k * f(n) time units to solve a problem of size n ≥ n₀.
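
For example (the numbers are mine, not the slide's): an algorithm that performs 3n + 2 operations on a problem of size n is O(n), because 3n + 2 ≤ 4 * n whenever n ≥ 2, so the definition is satisfied with k = 4 and n₀ = 2.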

12 Traversing a Linked List

    int total = 0;
    Node current = head;
    while( current != null ) {
        total += (Integer) current.getData();   // data is stored as an Object, so cast before adding
        current = current.getLink();
    }

On each pass of the loop, 2 statements are executed, plus the loop test — roughly 3 operations per node. The loop executes once for each node in the list, so the total work is about 3 * n operations, and O(3 * n) simplifies to O(n).

13 Bubble Sort

    for( int i = 0; i < array.length - 1; i++ ) {
        if( array[i] > array[i + 1] ) {
            swap( i, i + 1 );
            // bubble the displaced value back toward the front of the sorted portion
            for( int j = i; j > 0; j-- ) {
                if( array[j - 1] > array[j] )
                    swap( j - 1, j );
            }
        }
    }

The main loop executes n – 1 times. The inner loop executes up to n – 2 times. Dropping the constant factors and lower-order terms, we get O(n * n), or O(n²).
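
The slide assumes a swap helper exists; a self-contained version might look like this (the method and parameter names are mine):

    // Sorts the array in place using the swap-based approach from the slide.
    static void bubbleSort(int[] array) {
        for (int i = 0; i < array.length - 1; i++) {
            if (array[i] > array[i + 1]) {
                swap(array, i, i + 1);
                // bubble the displaced value back to its place among array[0..i]
                for (int j = i; j > 0; j--) {
                    if (array[j - 1] > array[j])
                        swap(array, j - 1, j);
                }
            }
        }
    }

    // Exchanges two elements of the array.
    static void swap(int[] array, int a, int b) {
        int temp = array[a];
        array[a] = array[b];
        array[b] = temp;
    }

For example, applied to {5, 2, 4, 1} it yields {1, 2, 4, 5}; when the input is badly out of order, nearly every outer pass triggers the inner loop, which is where the n² behavior comes from.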

14 Some Example Growth Rates

Order     Comments
1         Constant time: no matter how much data there is, the algorithm runs in some constant time k. Example: retrieving the value of an array element.
log n     Logarithmic time: the complexity grows slowly as the input size increases. Example: binary search (any algorithm that cuts the data size in half on each iteration).
n         Linear time: the complexity is proportional to the size of the input by some constant k. Example: finding a particular node in a linked list.
n log n   This peculiar time complexity arises in a number of sorting algorithms.
n²        Quadratic time: grows much more quickly than n. Examples: some less efficient sorting algorithms, and algorithms requiring two nested for loops.
n³        Cubic time: found in some graph algorithms, or three nested loops.
2ⁿ        Exponential time: the time doubles with each additional input element. By far the worst complexity. Examples include Tower of Hanoi and many AI searching/backtracking algorithms.
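
As one concrete row from the table, binary search halves the remaining range on every iteration, which is what makes it O(log n) (a standard textbook version, not code from the slides):

    // Returns the index of target in a sorted array, or -1 if it is absent.
    static int binarySearch(int[] sorted, int target) {
        int low = 0;
        int high = sorted.length - 1;
        while (low <= high) {
            int mid = low + (high - low) / 2;   // middle of the remaining range
            if (sorted[mid] == target)
                return mid;
            else if (sorted[mid] < target)
                low = mid + 1;                  // discard the lower half
            else
                high = mid - 1;                 // discard the upper half
        }
        return -1;                              // the range shrank to nothing: not found
    }

Doubling the array size adds only one more iteration, which is the signature of logarithmic growth.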

15 Big O Notation

16 Keeping Things in Perspective
Throughout the course of an analysis, keep in mind that you are interested only in significant differences in efficiency.
When choosing an ADT's implementation, consider how frequently particular ADT operations occur in a given application.

17 Keeping Things in Perspective
If the problem size is always small, you can probably ignore an algorithm's efficiency.
Weigh the trade-offs between an algorithm's time requirements and its memory requirements.
Order-of-magnitude analysis focuses on large problems.

