Analysis of Algorithms
It is often important to understand how efficient a particular algorithm is. Chapter 12 focuses on:
what it means for an algorithm to be efficient
the concept of algorithm analysis
comparing algorithmic growth functions
asymptotic complexity
Big-Oh notation
Outline
Algorithm Efficiency
Growth Functions and Big-Oh Notation
Comparing Growth Functions
12.1 – Algorithm Efficiency
An algorithm's efficiency is a major factor in determining how fast a program executes. Algorithm analysis can be performed relative to the amount of memory a program uses, but the amount of CPU time is usually the more interesting issue. An everyday example: washing dishes by hand.
12.1 – Dishes Example
If washing a dish takes 30 seconds and drying a dish takes an additional 30 seconds, then it takes 60n seconds to wash and dry n dishes. This computation can be expressed as:

t(n) = 30n + 30n = 60n
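As a quick illustration (not part of the original slides), the efficient algorithm can be sketched in Java by counting seconds rather than actually washing anything; the class and method names are invented for the example:

    // Minimal sketch: count the seconds the efficient algorithm spends.
    public class Dishes
    {
        static int efficientTime(int n)
        {
            int seconds = 0;
            for (int dish = 1; dish <= n; dish++)
            {
                seconds += 30;   // wash this dish
                seconds += 30;   // dry this dish, exactly once
            }
            return seconds;      // equals 60n
        }

        public static void main(String[] args)
        {
            System.out.println(efficientTime(10));   // prints 600
        }
    }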
12.1 – Dishes Example
A second (sloppy) version: while washing the dishes, we splash around a lot of water, so each time we dry a dish we must re-dry all previously dried dishes as well. Washing still takes 30 seconds per dish; the last dish is dried once (30 seconds), the next-to-last dish twice (60 seconds), the third-to-last dish three times (90 seconds), and so on. This computation can be expressed as:

t(n) = 30n + 30(1 + 2 + ... + n) = 30n + 15n(n + 1) = 15n² + 45n
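The sloppy version can be sketched the same way (again an invented example, not from the slides); the nested loop mirrors the re-drying of every previously washed dish:

    // Minimal sketch: count the seconds the sloppy algorithm spends.
    public class SloppyDishes
    {
        static int sloppyTime(int n)
        {
            int seconds = 0;
            for (int dish = 1; dish <= n; dish++)
            {
                seconds += 30;               // wash this dish
                for (int d = 1; d <= dish; d++)
                    seconds += 30;           // dry every dish washed so far
            }
            return seconds;                  // equals 15n^2 + 45n
        }

        public static void main(String[] args)
        {
            System.out.println(sloppyTime(10));   // prints 1950 = 15(100) + 45(10)
        }
    }

The nested re-drying loop is what pushes the growth function from linear to quadratic.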
12.1 – Dishes Example
Both algorithms accomplish the same result, but one is much less efficient than the other. All things being equal, we'll take the more efficient version. The efficiency can be expressed as a formula, and that formula is a function of the size of the problem – in this case, the number of dishes to be washed.
Outline
Algorithm Efficiency
Growth Functions and Big-Oh Notation
Comparing Growth Functions
12.2 – Growth Functions and Big-Oh Notation
For every algorithm we want to analyze, we need to define the size of the problem. When downloading a file, the size of the problem is the size of the file. When searching for a target value, the size of the problem is the size of the search pool. For a sorting algorithm, the size of the problem is the number of elements to be sorted.
12.2 – Growth Functions and Big-Oh Notation
We need to identify the key processing step that dominates our use of CPU time – the step we want to perform as few times as possible. The overall amount of time spent on the task is directly related to how many times that key processing step is performed, so algorithm efficiency can be defined in terms of the problem size and the key processing step. A growth function shows the relationship between the size of the problem (n) and the value we hope to optimize. For example, in a linear search the key step is the comparison against the target, and it may be performed up to n times.
12.2 – Growth Functions and Big-Oh Notation
A growth function represents the time complexity (or space complexity) of an algorithm. The growth function of the second dishwashing algorithm is t(n) = 15n² + 45n. It is not typically necessary to know the exact growth function for an algorithm; we are mainly interested in its asymptotic complexity – the general nature of the algorithm as n gets bigger.
12.2 – Growth Functions and Big-Oh Notation
Asymptotic complexity is based on the dominant term of the growth function – the term that increases most quickly as n increases. Asymptotic complexity is called the order of the algorithm. Our second dishwashing algorithm is said to have order n² time complexity, written as O(n²). This is referred to as Big-Oh (or Big-O) notation.
12.2 – Growth Functions and Big-Oh Notation
Growth Function              Order         Label
t(n) = 17                    O(1)          constant
t(n) = 20n - 4               O(n)          linear
t(n) = 12n log n + 100n      O(n log n)    n log n
t(n) = 3n² + 5n - 2          O(n²)         polynomial (quadratic)
t(n) = 2ⁿ + 18n² + 3n        O(2ⁿ)         exponential
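To see why only the dominant term matters, a short sketch (my own, not from the slides) compares one growth function above with its dominant term alone:

    // Sketch: compare t(n) = 3n^2 + 5n - 2 with its dominant term 3n^2.
    public class Dominance
    {
        public static void main(String[] args)
        {
            for (int n = 10; n <= 100000; n *= 10)
            {
                long t = 3L * n * n + 5L * n - 2;   // full growth function
                long dominant = 3L * n * n;         // dominant term only
                System.out.printf("n=%-7d t(n)=%-16d t(n)/3n^2=%.5f%n",
                                  n, t, (double) t / dominant);
            }
        }
    }

The ratio approaches 1 as n grows, which is why the lower-order terms are dropped.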
Outline
Algorithm Efficiency
Growth Functions and Big-Oh Notation
Comparing Growth Functions
12.3 – Comparing Growth Functions
Optimizing an algorithm – tweaking it so that it is more efficient – is good, but finding a better algorithm, one that moves you into a different Big-Oh category, has a far bigger impact. Similarly, a faster processor is nice, but it will never eliminate the need for better algorithms: faster processors and most optimizations improve the growth function only by constant factors, which are discarded anyway when determining Big-Oh efficiency.
12.3 – Comparing Growth Functions
[Graph comparing the growth functions as n increases – image not preserved.]
12.3 – Analyzing Loop Execution
How do we determine an algorithm's efficiency? Let's start with the basics: loops. To analyze a loop, determine the order of the body of the loop, then multiply it by the number of times the loop executes:

    for (int count = 0; count < n; count++)
    {
        // some sequence of O(1) steps
    }

The loop executes n times, and the body of the loop is O(1). Thus the overall order is O(n) * O(1) = O(n).
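A complete, runnable version of this pattern (an invented example, not from the slides) sums the integers 0 through n-1:

    // A single loop whose O(1) body runs n times: O(n) overall.
    public class LoopDemo
    {
        static long sumFirstN(int n)
        {
            long total = 0;
            for (int count = 0; count < n; count++)
                total += count;    // one O(1) step per iteration
            return total;
        }

        public static void main(String[] args)
        {
            System.out.println(sumFirstN(100));   // prints 4950
        }
    }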
12.3 – Analyzing Loop Execution
When loops are nested, we multiply the complexity of the outer loop by the complexity of the inner loop:

    for (int count = 0; count < n; count++)
        for (int count2 = 0; count2 < n; count2++)
        {
            // some sequence of O(1) steps
        }

Both the inner and outer loops have complexity O(n). Multiplied together, the order becomes O(n²).
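A runnable counterpart (again invented for illustration) makes the n * n executions of the body visible by counting them:

    // Nested loops: the O(n) inner loop runs once per outer iteration.
    public class NestedLoopDemo
    {
        static int countBodyExecutions(int n)
        {
            int count = 0;
            for (int i = 0; i < n; i++)
                for (int j = 0; j < n; j++)
                    count++;       // the O(1) body executes n * n times
            return count;          // equals n^2
        }

        public static void main(String[] args)
        {
            System.out.println(countBodyExecutions(8));   // prints 64
        }
    }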
12.3 – Analyzing Recursive Algorithms
Determine the order of the recursion (the number of times the recursive definition is followed) and multiply it by the order of the body of the recursive method. Consider the recursive method to compute the sum of the integers from 1 to some positive value:

    public int sum(int num)
    {
        int result;
        if (num == 1)
            result = 1;
        else
            result = num + sum(num - 1);
        return result;
    }

The size of the problem is the number of values to be summed, and the operation of interest is the addition. The body of the method performs one addition and is therefore O(1). Each invocation decreases num by 1, so the method is called num times. The order of the entire algorithm is therefore O(n).
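A small harness (not in the slides) confirms both the result and the linear number of calls; the calls counter is added purely for demonstration:

    // The recursive sum from the slide, instrumented to count invocations.
    public class RecursionDemo
    {
        static int calls = 0;

        static int sum(int num)
        {
            calls++;                       // one invocation per value of num
            if (num == 1)
                return 1;
            return num + sum(num - 1);     // one addition per call: O(1) body
        }

        public static void main(String[] args)
        {
            System.out.println(sum(100));  // prints 5050
            System.out.println(calls);     // prints 100: n calls -> O(n)
        }
    }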
Chapter 12 – Summary
Chapter 12 focused on:
the nature of algorithmic efficiency
growth functions
Big-Oh notation for asymptotic complexity
determining an algorithm's efficiency