The Efficiency of Algorithms


1 The Efficiency of Algorithms
Chapter 9

2 Chapter Contents
Motivation
Measuring an Algorithm's Efficiency
Big Oh Notation
Formalities
Picturing Efficiency
The Efficiency of Implementations of the ADT List
  The Array-Based Implementation
  The Linked Implementation
  Comparing Implementations

3 Motivation
Even a simple program can be noticeably inefficient.
When the 423 is changed to 100,000,000, there is a significant delay in seeing the result:

long firstOperand = 7562;
long secondOperand = 423;
long product = 0;
for (; secondOperand > 0; secondOperand--)
    product = product + firstOperand;
System.out.println(product);
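The slide's fragment can be made runnable as follows (the class and method names are mine, not the slide's). The delay comes from performing one addition per unit of secondOperand rather than a single multiplication, so the running time grows linearly with secondOperand:

```java
// Runnable version of the slide's multiplication-by-repeated-addition loop.
// With secondOperand = 100_000_000 the loop body runs 100 million times,
// whereas the '*' operator would finish in constant time.
public class SlowProduct {
    static long productByAddition(long firstOperand, long secondOperand) {
        long product = 0;
        for (; secondOperand > 0; secondOperand--) {
            product = product + firstOperand;   // one addition per iteration
        }
        return product;
    }

    public static void main(String[] args) {
        System.out.println(productByAddition(7562, 423));   // same value as 7562 * 423
    }
}
```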

4 Measuring Algorithm Efficiency
Types of complexity: space complexity and time complexity.
Analysis of algorithms: the measuring of the complexity of an algorithm.
We cannot compute an algorithm's actual running time, so we usually measure its worst-case time.

5 Measuring Algorithm Efficiency
Fig. 9-1 Three algorithms for computing the sum 1 + 2 + … + n for an integer n > 0
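The three algorithms of Fig. 9-1 can be sketched in Java as follows (a sketch of mine, not the figure's code; the method names are assumptions). A uses one loop, B computes each addend by repeated increments in nested loops, and C uses the closed-form formula n(n+1)/2:

```java
// Three ways to compute 1 + 2 + ... + n.
// sumA: single loop, O(n).  sumB: nested loops, O(n^2).  sumC: formula, O(1).
public class SumAlgorithms {
    static long sumA(int n) {              // Algorithm A: add 1, 2, ..., n in one loop
        long sum = 0;
        for (int i = 1; i <= n; i++)
            sum += i;
        return sum;
    }

    static long sumB(int n) {              // Algorithm B: build each addend i as 1+1+...+1
        long sum = 0;
        for (int i = 1; i <= n; i++)
            for (int j = 1; j <= i; j++)
                sum += 1;
        return sum;
    }

    static long sumC(int n) {              // Algorithm C: closed form n(n+1)/2
        return (long) n * (n + 1) / 2;
    }
}
```

All three return the same value; only the number of operations differs, which is exactly what Figs. 9-2 and 9-3 tabulate.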

6 Measuring Algorithm Efficiency
Fig. 9-2 The number of operations required by the algorithms for Fig 9-1

7 Measuring Algorithm Efficiency
Fig. 9-3 The number of operations required by the algorithms in Fig. 9-1 as a function of n

8 Big Oh Notation
To say "Algorithm A has a worst-case time requirement proportional to n," we say A is O(n), read "Big Oh of n."
For the other two algorithms: Algorithm B is O(n²) and Algorithm C is O(1).

9 Big Oh Notation
Fig. 9-4 Typical growth-rate functions evaluated at increasing values of n

10 Big Oh Notation
Fig. 9-5 The number of digits in an integer n compared with the integer portion of log₁₀ n

11 Big Oh Notation
Fig. 9-6 The values of two logarithmic growth-rate functions for various ranges of n.

12 Formalities
Formal definition of Big Oh: an algorithm's time requirement f(n) is of order at most g(n), written f(n) = O(g(n)), if a positive real number c and a positive integer N exist such that f(n) ≤ c·g(n) for all n ≥ N.
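The definition can be checked numerically. As an example of mine (not from the text), take f(n) = 2n + 3 and g(n) = n: with c = 3 and N = 3, the inequality f(n) ≤ c·g(n) holds for every n ≥ N, so f(n) = O(n):

```java
// Numeric check of the Big Oh definition for f(n) = 2n + 3, g(n) = n.
// Witnesses: c = 3, N = 3.  Then 2n + 3 <= 3n exactly when n >= 3.
public class BigOhCheck {
    public static void main(String[] args) {
        int c = 3, N = 3;
        for (int n = N; n <= 1_000_000; n++) {
            long f = 2L * n + 3;
            long cg = (long) c * n;
            if (f > cg)
                throw new AssertionError("definition fails at n = " + n);
        }
        System.out.println("f(n) <= 3n for all tested n >= 3");
    }
}
```

Note that the inequality fails for n < N (for instance 2·2 + 3 = 7 > 3·2 = 6), which is exactly why the definition only requires it to hold from some N onward.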

13 Formalities
Fig. 9-7 An illustration of the definition of Big Oh

14 Formalities
The following identities hold for Big Oh notation:
O(k·f(n)) = O(f(n))
O(f(n)) + O(g(n)) = O(f(n) + g(n))
O(f(n))·O(g(n)) = O(f(n)·g(n))
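The identities justify keeping only the dominant term when an algorithm has several phases. A sketch of mine (not from the text): a loop that does n operations followed by nested loops that do n² operations totals n + n², and O(n) + O(n²) = O(n + n²) = O(n²):

```java
// Counting operations for a two-phase algorithm: a linear phase followed
// by a quadratic phase.  The total is exactly n + n*n, and the identities
// collapse O(n) + O(n^2) to O(n^2): the fastest-growing term dominates.
public class DominantTerm {
    static long countOps(int n) {
        long ops = 0;
        for (int i = 0; i < n; i++)
            ops++;                          // linear phase: n operations
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++)
                ops++;                      // quadratic phase: n*n operations
        return ops;                         // exactly n + n*n
    }
}
```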

15 Picturing Efficiency
Fig. 9-8 An O(n) algorithm.

16 Picturing Efficiency
Fig. 9-9 An O(n²) algorithm.

17 Picturing Efficiency
Fig. 9-10 Another O(n²) algorithm.

18 Picturing Efficiency
Fig. 9-11 The effect of doubling the problem size on an algorithm's time requirement.
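The doubling effect the slide describes can be seen by counting loop iterations directly (a sketch of mine): doubling n doubles the work of an O(n) loop but quadruples the work of an O(n²) pair of nested loops:

```java
// Doubling the problem size: an O(n) algorithm does twice the work,
// an O(n^2) algorithm does four times the work.
public class DoublingEffect {
    static long linearOps(int n) {
        long ops = 0;
        for (int i = 0; i < n; i++)
            ops++;                          // n operations
        return ops;
    }

    static long quadraticOps(int n) {
        long ops = 0;
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++)
                ops++;                      // n*n operations
        return ops;
    }

    public static void main(String[] args) {
        int n = 500;
        System.out.println(linearOps(2 * n) / linearOps(n));        // 2
        System.out.println(quadraticOps(2 * n) / quadraticOps(n));  // 4
    }
}
```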

19 Picturing Efficiency
Fig. 9-12 The time to process one million items by algorithms of various orders at the rate of one million operations per second.

20 Comments on Efficiency
A programmer can use an O(n²), O(n³), or O(2ⁿ) algorithm as long as the problem size is small.
At one million operations per second it would take 1 second …
for a problem size of 1000 with O(n²)
for a problem size of 1000 with O(n³)
for a problem size of 20 with O(2ⁿ)
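The arithmetic behind the slide can be worked out directly (the values below are computed by me at the stated rate, not taken from the slide): at one million operations per second, 1000² = 10⁶ operations take about 1 second, 1000³ = 10⁹ operations take about 1000 seconds (roughly 17 minutes), and 2²⁰ ≈ 1.05 × 10⁶ operations take about 1 second:

```java
// Converting operation counts to running time at one million ops/second.
public class RunningTimes {
    static double seconds(double operations) {
        return operations / 1_000_000.0;    // one million operations per second
    }

    public static void main(String[] args) {
        System.out.println(seconds(Math.pow(1000, 2)));  // n = 1000, O(n^2): 1.0 s
        System.out.println(seconds(Math.pow(1000, 3)));  // n = 1000, O(n^3): 1000.0 s
        System.out.println(seconds(Math.pow(2, 20)));    // n = 20,  O(2^n): ~1.05 s
    }
}
```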

21 Efficiency of Implementations of the ADT List
For the array-based implementation:
  Add to end of list: O(1)
  Add to list at a given position: O(n)
For the linked implementation:
  Add to end of list: O(n)
  Retrieving an entry: O(n)
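Why the array-based implementation adds to the end in O(1) but adds at a given position in O(n): an insertion mid-array must shift every later entry right by one. A minimal sketch of mine (not the book's ADT code; it assumes the array's capacity is never exceeded, so there is no resizing logic):

```java
// Array-based list sketch illustrating the two costs above.
public class ArrayListSketch {
    private int[] entries = new int[100];   // assumed large enough (no resizing)
    private int length = 0;

    void addToEnd(int entry) {              // O(1): no shifting needed
        entries[length++] = entry;
    }

    void addAt(int position, int entry) {   // O(n): shift later entries right
        for (int i = length; i > position; i--)
            entries[i] = entries[i - 1];
        entries[position] = entry;
        length++;
    }

    int get(int position) {                 // O(1) for an array
        return entries[position];
    }
}
```

By contrast, a singly linked implementation with only a head reference must traverse node by node to reach the end or a given position, which is why its add-to-end and retrieval are both O(n).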

22 Comparing Implementations
Fig. 9-13 The time efficiencies of the ADT list operations for two implementations, expressed in Big Oh notation

