1
The Efficiency of Algorithms Chapter 9 Carrano, Data Structures and Abstractions with Java, Second Edition, (c) 2007 Pearson Education, Inc. All rights reserved. 0-13-237045-X
2
Chapter Contents
Motivation
Measuring an Algorithm's Efficiency
– Big Oh Notation
Formalities
Picturing Efficiency
The Efficiency of Implementations of the ADT List
– An Array-Based Implementation
– A Linked Implementation
– Comparing Implementations
3
Motivation
Even a simple program can be noticeably inefficient:

long firstOperand = 7562;
long secondOperand = 423;
long product = 0;
while (secondOperand > 0)
{
    product = product + firstOperand;
    secondOperand--;
}
System.out.println(product);

When 423 is changed to 100,000,000, there is a significant delay in seeing the result.
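The loop above computes a product by repeated addition, so its running time grows in proportion to the second operand. A minimal sketch contrasting it with the constant-time alternative (the method names are mine, not the book's):

```java
// The slide's repeated-addition loop next to the O(1) alternative.
public class ProductDemo {
    // O(n) in the second operand: one addition per loop iteration
    static long slowProduct(long firstOperand, long secondOperand) {
        long product = 0;
        while (secondOperand > 0) {
            product = product + firstOperand;
            secondOperand--;
        }
        return product;
    }

    // O(1): a single multiplication, regardless of operand size
    static long fastProduct(long firstOperand, long secondOperand) {
        return firstOperand * secondOperand;
    }

    public static void main(String[] args) {
        System.out.println(slowProduct(7562, 423)); // 3198726
        System.out.println(fastProduct(7562, 423)); // 3198726
    }
}
```

Both methods return the same value; only the number of operations differs, which is exactly the distinction the rest of the chapter formalizes.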
4
Measuring Algorithm Efficiency
Types of complexity
– Space complexity
– Time complexity
Analysis of algorithms: measuring the complexity of an algorithm
We cannot compute the actual time an algorithm takes, so we usually measure its worst-case time requirement.
5
Measuring Algorithm Efficiency
Fig. 9-1 Three algorithms for computing 1 + 2 + … + n for an integer n > 0
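Fig. 9-1 itself is not reproduced here. A plausible sketch of three such algorithms, chosen to match the growth rates the later slides assign to A, B, and C (the exact code is the book's; this version is an assumption):

```java
// Three ways to compute 1 + 2 + ... + n, with different growth rates.
public class SumAlgorithms {
    // Algorithm A: a single loop, O(n) additions
    static long sumA(int n) {
        long sum = 0;
        for (int i = 1; i <= n; i++) {
            sum += i;
        }
        return sum;
    }

    // Algorithm B: nested loops, O(n^2) additions
    static long sumB(int n) {
        long sum = 0;
        for (int i = 1; i <= n; i++) {
            for (int j = 1; j <= i; j++) {
                sum += 1;
            }
        }
        return sum;
    }

    // Algorithm C: the closed form n(n+1)/2, O(1)
    static long sumC(int n) {
        return (long) n * (n + 1) / 2;
    }

    public static void main(String[] args) {
        System.out.println(sumA(100)); // 5050
        System.out.println(sumB(100)); // 5050
        System.out.println(sumC(100)); // 5050
    }
}
```

All three return the same answer; the point of Figs. 9-2 and 9-3 is how differently their operation counts grow with n.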
6
Measuring Algorithm Efficiency
Fig. 9-2 The number of operations required by the algorithms in Fig. 9-1
7
Measuring Algorithm Efficiency Fig. 9-3 The number of operations required by the algorithms in Fig. 9-1 as a function of n
8
Big Oh Notation
To say "Algorithm A has a worst-case time requirement proportional to n," we say A is O(n), read "Big Oh of n."
For the other two algorithms:
– Algorithm B is O(n²)
– Algorithm C is O(1)
9
Big Oh Notation Fig. 9-4 Typical growth-rate functions evaluated at increasing values of n
10
Big Oh Notation
Fig. 9-5 The number of digits in an integer n compared with the integer portion of log₁₀ n
11
Big Oh Notation Fig. 9-6 The values of two logarithmic growth-rate functions for various ranges of n.
12
Formalities
Formal definition of Big Oh: an algorithm's time requirement f(n) is of order at most g(n), written f(n) = O(g(n)), if a positive real number c and a positive integer N exist such that f(n) ≤ c·g(n) for all n ≥ N.
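The definition can be checked numerically for a concrete case. The choices below are illustrative, not from the slides: f(n) = n² + 3n is O(n²) with witnesses c = 2 and N = 3, because n² ≥ 3n whenever n ≥ 3.

```java
// Numerically checking the Big Oh definition f(n) <= c * g(n) for n >= N,
// with the illustrative choices f(n) = n^2 + 3n, g(n) = n^2, c = 2, N = 3.
public class BigOhCheck {
    static long f(long n) { return n * n + 3 * n; }
    static long g(long n) { return n * n; }

    // True when f(n) <= c * g(n)
    static boolean withinBound(long n, long c) {
        return f(n) <= c * g(n);
    }

    public static void main(String[] args) {
        boolean holds = true;
        for (long n = 3; n <= 1000; n++) {  // spot-check n = N .. 1000
            if (!withinBound(n, 2)) {
                holds = false;
                System.out.println("bound fails at n = " + n);
            }
        }
        if (holds) {
            System.out.println("f(n) <= 2 g(n) holds for 3 <= n <= 1000");
        }
    }
}
```

A finite check is not a proof, but it makes the roles of c and N concrete: below N = 3 the bound may fail (f(2) = 10 > 2·g(2) = 8), and from N on it holds.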
13
Formalities Fig. 9-7 An illustration of the definition of Big Oh
14
Formalities
The following identities hold for Big Oh notation:
– O(k·f(n)) = O(f(n)) for a constant k
– O(f(n)) + O(g(n)) = O(f(n) + g(n))
– O(f(n))·O(g(n)) = O(f(n)·g(n))
15
Picturing Efficiency
Fig. 9-8 An O(n) algorithm.
16
Picturing Efficiency
Fig. 9-9 An O(n²) algorithm.
17
Picturing Efficiency
Fig. 9-10 Another O(n²) algorithm.
18
Picturing Efficiency Fig. 9-11 The effect of doubling the problem size on an algorithm's time requirement.
19
Picturing Efficiency Fig. 9-12 The time to process one million items by algorithms of various orders at the rate of one million operations per second.
20
Comments on Efficiency
A programmer can use an O(n²), O(n³), or O(2ⁿ) algorithm as long as the problem size is small.
At one million operations per second, it would take about 1 second:
– For a problem size of 1000 with O(n²)
– For a problem size of 100 with O(n³)
– For a problem size of 20 with O(2ⁿ)
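The estimates above are just arithmetic: divide the operation count by the machine's rate. A small sketch of that calculation (the one-million-operations-per-second rate is the slide's assumption):

```java
// Estimating running time from a growth-rate function's operation count,
// at the slide's assumed rate of one million operations per second.
public class TimeEstimate {
    static final double OPS_PER_SECOND = 1_000_000.0;

    // Seconds needed to perform the given number of basic operations
    static double seconds(double operations) {
        return operations / OPS_PER_SECOND;
    }

    public static void main(String[] args) {
        System.out.println(seconds(Math.pow(1000, 2))); // n = 1000, O(n^2): 1.0 s
        System.out.println(seconds(Math.pow(100, 3)));  // n = 100,  O(n^3): 1.0 s
        System.out.println(seconds(Math.pow(2, 20)));   // n = 20,   O(2^n): about 1.05 s
    }
}
```

The same arithmetic shows why the exponential case collapses so quickly: at n = 40, O(2ⁿ) needs about 2⁴⁰ ≈ 10¹² operations, which is roughly 12.7 days at this rate.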
21
Efficiency of Implementations of the ADT List
For the array-based implementation (AList from Chapter 5):
– Add to end of list: O(1)
– Add to list at given position: O(n)
For the linked implementation (LList from Chapter 6):
– Add to end of list: O(n)
– Add to list at given position: O(n)
– Retrieving an entry: O(n)
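The array-based costs come from shifting: adding at a given position must move every later entry one slot to the right. A minimal sketch of that shift, not the book's AList implementation:

```java
import java.util.Arrays;

// Why add-at-position is O(n) for an array-based list: entries after
// the insertion point must each shift right by one. Illustrative only.
public class ArrayInsertDemo {
    // Returns a copy of `entries` with `newEntry` inserted at `position`;
    // `size` is the number of occupied slots (capacity must exceed size).
    static int[] insertAt(int[] entries, int size, int position, int newEntry) {
        int[] result = Arrays.copyOf(entries, entries.length);
        // Shift entries at position..size-1 right by one: up to n moves
        for (int i = size; i > position; i--) {
            result[i] = result[i - 1];
        }
        result[position] = newEntry;
        return result;
    }

    public static void main(String[] args) {
        int[] list = {10, 20, 30, 0};   // size 3, capacity 4
        System.out.println(Arrays.toString(insertAt(list, 3, 1, 15)));
        // [10, 15, 20, 30]
    }
}
```

Adding at the end needs no shifting, which is why that operation is O(1) for the array; the linked implementation avoids shifting but pays O(n) to traverse node by node to the insertion point instead.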
22
Comparing Implementations Fig. 9-13 The time efficiencies of the ADT list operations for two implementations, expressed in Big Oh notation