The Efficiency of Algorithms
Chapter 9
Carrano, Data Structures and Abstractions with Java, Second Edition, (c) 2007 Pearson Education, Inc. All rights reserved
Chapter Contents
– Motivation
– Measuring an Algorithm's Efficiency
– Big Oh Notation
– Formalities
– Picturing Efficiency
– The Efficiency of Implementations of the ADT List
  – An Array-Based Implementation
  – A Linked Implementation
  – Comparing Implementations
Motivation
Even a simple program can be noticeably inefficient:

long firstOperand = 7562;
long secondOperand = 423;
long product = 0;
while (secondOperand > 0)
{
    product = product + firstOperand;
    secondOperand--;
}
System.out.println(product);

When 423 is changed to 100,000,000, there is a significant delay before the result appears.
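As a sketch of the point (the class and method names here are mine, not from the book), the loop above computes firstOperand × secondOperand by repeated addition, so its running time grows linearly with the second operand, while the built-in `*` operator does the same work in constant time:

```java
// Sketch: multiplication by repeated addition is O(n) in the second
// operand; the built-in * operator is O(1). Both give the same product.
public class RepeatedAddition {
    static long multiplyBySumming(long firstOperand, long secondOperand) {
        long product = 0;
        while (secondOperand > 0) {   // loop body runs secondOperand times: O(n)
            product = product + firstOperand;
            secondOperand--;
        }
        return product;
    }

    public static void main(String[] args) {
        System.out.println(multiplyBySumming(7562, 423));  // 3198726, same as 7562 * 423
    }
}
```

With a second operand of 100,000,000 the loop body runs one hundred million times, which is the delay the slide describes; `7562L * 100_000_000L` needs no loop at all.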
Measuring Algorithm Efficiency
Types of complexity:
– Space complexity
– Time complexity
Analysis of algorithms: measuring the complexity of an algorithm
We cannot compute the actual time for an algorithm, so we usually measure worst-case time
Measuring Algorithm Efficiency
Fig. 9-1 Three algorithms for computing the sum 1 + 2 + … + n for an integer n > 0
Measuring Algorithm Efficiency
Fig. 9-2 The number of operations required by the algorithms in Fig. 9-1
Measuring Algorithm Efficiency Fig. 9-3 The number of operations required by the algorithms in Fig. 9-1 as a function of n
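The three algorithms of Fig. 9-1 are typically written as follows (a sketch; the method names are mine): Algorithm A uses a single loop, Algorithm B uses nested loops that add 1 repeatedly, and Algorithm C uses the closed-form formula n(n + 1)/2.

```java
public class SumCounting {
    // Algorithm A: one loop over 1..n, so roughly n additions: O(n)
    static long sumA(int n) {
        long sum = 0;
        for (int i = 1; i <= n; i++)
            sum += i;
        return sum;
    }

    // Algorithm B: nested loops adding 1 at a time, about n^2/2 additions: O(n^2)
    static long sumB(int n) {
        long sum = 0;
        for (int i = 1; i <= n; i++)
            for (int j = 1; j <= i; j++)
                sum += 1;
        return sum;
    }

    // Algorithm C: the formula n(n + 1) / 2, a fixed number of operations: O(1)
    static long sumC(int n) {
        return (long) n * (n + 1) / 2;
    }

    public static void main(String[] args) {
        System.out.println(sumA(100));  // 5050
        System.out.println(sumB(100));  // 5050
        System.out.println(sumC(100));  // 5050
    }
}
```

All three produce the same answer; only their operation counts, and hence their growth rates, differ.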
Big Oh Notation
To say "Algorithm A has a worst-case time requirement proportional to n," we say A is O(n), read "Big Oh of n"
For the other two algorithms:
– Algorithm B is O(n²)
– Algorithm C is O(1)
Big Oh Notation Fig. 9-4 Typical growth-rate functions evaluated at increasing values of n
Big Oh Notation
Fig. 9-5 The number of digits in an integer n compared with the integer portion of log₁₀ n
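The relationship Fig. 9-5 illustrates can be checked directly (a sketch, not from the slides): a positive integer n has floor(log₁₀ n) + 1 decimal digits, which is why algorithms that process one digit at a time are O(log n).

```java
// Sketch: the decimal digit count of a positive integer n is
// floor(log10 n) + 1, matching the relationship shown in Fig. 9-5.
public class DigitCount {
    static int digits(long n) {
        return (int) Math.floor(Math.log10((double) n)) + 1;
    }

    public static void main(String[] args) {
        System.out.println(digits(9));       // 1 digit
        System.out.println(digits(100));     // 3 digits
        System.out.println(digits(123456));  // 6 digits
    }
}
```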
Big Oh Notation Fig. 9-6 The values of two logarithmic growth-rate functions for various ranges of n.
Formalities
Formal definition of Big Oh: an algorithm's time requirement f(n) is of order at most g(n), written f(n) = O(g(n)), if a positive real number c and a positive integer N exist such that f(n) ≤ c·g(n) for all n ≥ N
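A short worked example (not on the slide) of applying this definition: to show that f(n) = 5n + 3 is O(n), choose c = 6 and N = 3. Then

```latex
f(n) = 5n + 3 \le 5n + n = 6n = c \cdot g(n) \quad \text{for all } n \ge 3,
```

since n ≥ 3 implies 3 ≤ n. The constants c and N are not unique; any larger c with a suitable N works just as well.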
Formalities Fig. 9-7 An illustration of the definition of Big Oh
Formalities
The following identities hold for Big Oh notation:
– O(k·f(n)) = O(f(n)) for a constant k
– O(f(n)) + O(g(n)) = O(f(n) + g(n))
– O(f(n)) · O(g(n)) = O(f(n) · g(n))
Picturing Efficiency
Fig. 9-8 An O(n) algorithm.
Picturing Efficiency
Fig. 9-9 An O(n²) algorithm.
Picturing Efficiency
Fig. 9-10 Another O(n²) algorithm.
Picturing Efficiency
Fig. 9-11 The effect of doubling the problem size on an algorithm's time requirement.
Picturing Efficiency
Fig. 9-12 The time to process one million items by algorithms of various orders at the rate of one million operations per second.
Comments on Efficiency
A programmer can use an O(n²), O(n³), or O(2ⁿ) algorithm as long as the problem size is small
At one million operations per second, it would take about 1 second:
– For a problem size of 1000 with O(n²)
– For a problem size of 100 with O(n³)
– For a problem size of 20 with O(2ⁿ)
Efficiency of Implementations of ADT List
For the array-based implementation (AList from Chapter 5):
– Add to end of list: O(1)
– Add to list at given position: O(n)
For the linked implementation (LList from Chapter 6):
– Add to end of list: O(n)
– Add to list at given position: O(n)
– Retrieving an entry: O(n)
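A simplified sketch (not Carrano's AList code) of why the array-based add at the end is O(1) while an add at a given position is O(n): inserting in the middle must shift every later entry one slot to the right before the new entry can be placed.

```java
import java.util.Arrays;

// Simplified array-backed list (resizing and bounds checks omitted)
// illustrating the costs of the two add operations.
public class ArrayListSketch {
    private int[] entries = new int[10];
    private int length = 0;

    // O(1): write into the next free slot.
    void addAtEnd(int entry) {
        entries[length] = entry;
        length++;
    }

    // O(n): shift every entry at or after 'position' right by one slot,
    // then drop the new entry into the gap.
    void addAt(int position, int entry) {
        for (int i = length; i > position; i--)
            entries[i] = entries[i - 1];
        entries[position] = entry;
        length++;
    }

    int[] toArray() {
        return Arrays.copyOf(entries, length);
    }

    public static void main(String[] args) {
        ArrayListSketch list = new ArrayListSketch();
        list.addAtEnd(1);                // no shifting
        list.addAtEnd(3);                // no shifting
        list.addAt(1, 2);                // shifts the 3, then inserts 2
        System.out.println(Arrays.toString(list.toArray()));  // [1, 2, 3]
    }
}
```

In the worst case (position 0), the shifting loop touches all n existing entries, which is where the O(n) bound comes from; adding at the end touches none of them.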
Comparing Implementations
Fig. 9-13 The time efficiencies of the ADT list operations for two implementations, expressed in Big Oh notation