Published by Benedict Carr. Modified over 9 years ago.
1. Algorithms
- An algorithm is simply a list of steps required to solve some particular problem.
- Algorithms are designed as abstractions of processes carried out by computer programs.
- Examples include:
  - Sorting a list
  - Determining whether a student qualifies for financial aid
  - Determining the steps to set up a dating service
2. Algorithms
- In some cases we have only one algorithm for a problem, or the problem is so straightforward that there is no need to consider anything other than the obvious solution.
- Other problems have many known algorithms; we obviously want to choose the "best" one.
- Still other problems have no known algorithm!
3. What Is the "Best" Algorithm?
Traditionally we focused on two questions:
1. How fast does it run?
   - In the early days this was measured by timing an implementation of the algorithm.
   - It was common to hear that a "new" SuperDuper Sort could sort a list of 1 million integers in 17 seconds, whereas Junk Sort required 43 seconds.
2. How much memory does it require?
4. Analysis of Algorithms
- A program's running time depends on the operating system, machine, compiler/interpreter used, etc.
- Analysis of algorithms compares algorithms, not programs.
- It is based on the premise that the more work an algorithm does, the longer its implementation will run: sorting 1 million items ought to take longer than sorting 1,000.
- But if we are comparing algorithms that have not yet been implemented, how can we express their performance? How can we "measure" the performance of an algorithm?
5. Analysis of Algorithms
- We want an expression that can be applied to any computer.
- This is only possible by stating efficiency in terms of some critical operation.
- The critical operation depends on the problem. In a sorting algorithm, for instance, we could count the number of times two elements are compared.
6. Analysis of Algorithms
- In general we analyze algorithms using the RAM (Random Access Machine) model:
  - Instructions are executed one after the other; there is no concurrency.
  - Basic operations take the same, constant time.
- We normally say that each line (step) in the algorithm takes time 1 (one).
7. Analysis of Algorithms
- But you could be asking: if each line takes constant time, then the whole algorithm (any algorithm) will take constant time, right?
- Wrong! Although some algorithms may take constant time, most algorithms vary their number of steps based on the size of the instance we are trying to solve.
8. Number of Steps
It is easy to see that most algorithms vary their number of steps. Compare:

    for i = 1 .. N
        a = a + 2
        i = i + 1

    for i = 1 .. N
        a = a + 2
        i = i + 2

So we must also consider the number of steps it takes to process the N items.
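The effect of the two loops above can be checked by counting iterations directly. A minimal Python sketch (the helper name `count_steps` is ours, not from the slides):

```python
def count_steps(step, n):
    """Count iterations of: for i = 1..n, where i also advances by `step` in the body."""
    steps = 0
    i = 1
    while i <= n:
        steps += 1
        i += step
    return steps

n = 1000
print(count_steps(1, n))  # 1000 iterations: grows linearly with n
print(count_steps(2, n))  # 500 iterations: still linear, just half the constant
```

Both loops do a number of steps proportional to N; incrementing `i` by 2 only halves the constant factor, which is exactly what big-O analysis will later ignore.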
9. Analysis of Algorithms
- Therefore the efficiency of an algorithm is normally stated as a function of the problem size.
- We generally use the variable n to represent the problem size.
- From an implementation we might find that our SuperDuper Sort takes 0.6n² + 0.3n + 0.45 seconds on a Pentium: plug in a value for n and you know how long it takes.
10. Analysis of Algorithms
- But we are not yet independent of the machine: remember that the formula for SuperDuper Sort is valid only for a Pentium.
- We need to identify the most important aspect of the function that represents the running time of an algorithm.
- Which one is the "best"?
  - f(n) = 10000000n
  - g(n) = n² + n
11. Asymptotic Analysis
- Asymptotic analysis describes the relative efficiency of an algorithm as n gets very large.
- In the example above it is easy to see that for very large n, g(n) grows faster than f(n); take for instance the value n = 20000000.
- Remember that the goal here is to compare algorithms.
- In practice, if you are writing small programs, asymptotic analysis may not be that important: when the input size is small, most algorithms will do. When the input size is very large, things change.
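The crossover between the two functions from the previous slide can be checked numerically. A small Python sketch (function names ours):

```python
def f(n):
    # huge constant, but only linear growth
    return 10_000_000 * n

def g(n):
    # tiny constants, but quadratic growth
    return n * n + n

# For small n the constant in f dominates, so f is larger;
# past n = 10_000_000 the n^2 term in g takes over.
for n in (1_000, 1_000_000, 20_000_000):
    print(n, "f > g" if f(n) > g(n) else "g > f")
```

Solving n² + n > 10000000n shows g overtakes f for every n above 10⁷ − 1, which is why the slide's choice of n = 20000000 makes g the larger function.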
12. A Simple Comparison
- Assume you have three algorithms to sort a list:
  - f(n) = n log₂ n
  - g(n) = n²
  - h(n) = n³
- Assume also that each step takes 1 microsecond (10⁻⁶ s).
- (Table of resulting running times not reproduced.)
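The running times that the slide's table would contain can be recomputed directly. A sketch assuming, as the slide does, one microsecond per step:

```python
import math

STEP = 1e-6  # assumed cost of one step: 1 microsecond

for n in (1_000, 1_000_000):
    t_nlogn = n * math.log2(n) * STEP
    t_n2 = n**2 * STEP
    t_n3 = n**3 * STEP
    # At n = 1,000,000: n*log2(n) is about 20 seconds, n^2 about 10^6 seconds
    # (~11.6 days), and n^3 about 10^12 seconds (tens of thousands of years).
    print(f"n={n}: n*log2(n) = {t_nlogn:.4g} s, n^2 = {t_n2:.4g} s, n^3 = {t_n3:.4g} s")
```

The gap between the three curves, negligible at n = 1000, becomes the difference between seconds, days, and millennia at n = 1000000.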
13. Higher-Order Term
- Most of the algorithms discussed here will be given in terms of common functions: polynomials, logarithms, exponentials, and products of these functions.
- Analyzing the table given earlier, we can see that in an efficiency function we are interested in the term of highest order.
- If we have a function f(n) = n³ + n², then for n = 100000 the running time of the algorithm is 31.7 years + 2.8 hours.
- It's clear that a couple of hours does not make much difference if the program is to run for 31.7 years!
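The slide's figures for n = 100000 can be verified directly (one microsecond per step assumed, as before):

```python
n = 100_000
step = 1e-6                    # seconds per step

cubic_seconds = n**3 * step    # contribution of the n^3 term: 1e9 seconds
square_seconds = n**2 * step   # contribution of the n^2 term: 1e4 seconds

years = cubic_seconds / (365 * 24 * 3600)
hours = square_seconds / 3600

print(round(years, 1))   # 31.7  -- years from the n^3 term
print(round(hours, 1))   # 2.8   -- hours from the n^2 term
```

The n² term contributes only 0.001% of the total, which is why big-O analysis keeps just the highest-order term.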
14. Higher-Order Term
- In the case above we say that f(n) is O(n³), meaning that f(n) is of the order of n³. This is called big-O notation.
- Big-O disregards any constant multiplying the term of highest order, and any term of smaller order:
  f(n) = 10000000000000n³ is O(n³)
15. Common Functions
- Constant, O(1): Very fast. Some hash table algorithms can look up one item in a table of n items in average time that is constant (independent of the table size).
- Logarithmic, O(log₂ n): Also very fast. Typical of many algorithms that use (binary) trees.
- Linear, O(n): Typical of fast algorithms on a single-processor computer, e.g. when all of the input of size n has to be read.
- Poly-logarithmic, O(n log n): Typical of the best sorting algorithms. Considered a good solution.
- Polynomial, O(nᵏ): When a problem of size n can be solved in time nᵏ, where k is a constant. Small exponents (k ≤ 3) are OK.
- Exponential, O(2ⁿ): These problems cannot be solved in a reasonable time (see next slide).
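As an example of the logarithmic class above, binary search on a sorted list halves the search range at every step, so it does at most about log₂ n comparisons. A minimal Python sketch:

```python
def binary_search(items, target):
    """Return the index of target in the sorted list items, or -1 if absent."""
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2      # inspect the middle element
        if items[mid] == target:
            return mid
        if items[mid] < target:
            lo = mid + 1          # discard the lower half
        else:
            hi = mid - 1          # discard the upper half
    return -1

data = list(range(0, 2_000_000, 2))  # 1,000,000 sorted even numbers
print(binary_search(data, 123456))   # found after at most ~20 probes
print(binary_search(data, 7))        # -1: odd numbers are absent
```

One million items need at most about 20 probes, whereas scanning the list linearly could need a million.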
16. Common Functions
- Exponential algorithms are those that take time kⁿ, where k is a constant. Algorithms that grow at this rate are suitable only for small problems.
- Unfortunately, the best algorithms known for many problems take exponential time.
- Much of the work on developing algorithms today is focused on these problems, because they take a huge amount of time to execute (even for reasonably small input sizes).
- There is a large variation in the size of different exponential functions (compare 2^(0.0001n) and 2^n), but for large n they all become huge.
17. Comparison of Algorithms: Big-O Notation
- Big-O notation expresses computing time (complexity) as the term in a function that increases most rapidly relative to the size of the problem.
- If f(N) = N⁴ + 100N² + 10N + 50, then f(N) is O(N⁴).
- N represents the size of the problem.
18. Worst Case and Best Case
- If we return to our original question, "how fast does a program run?", we can see that this question is not enough.
- Inputs vary in the way they are organized, and this can influence the number of critical operations performed.
- Suppose we are searching for an element in an ordered list:
  - If the target key is the first in the list, our function takes constant time.
  - If the target key is not in the list, our function takes O(n) time, where n is the size of the list.
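The two cases above can be made concrete with a linear search that also reports how many comparisons it performed. A sketch (the instrumented return value is ours, added for illustration):

```python
def linear_search(items, target):
    """Return (index, comparisons); index is -1 if target is absent."""
    comparisons = 0
    for i, x in enumerate(items):
        comparisons += 1
        if x == target:
            return i, comparisons
    return -1, comparisons

data = [2, 4, 6, 8, 10, 12, 14, 16, 18, 20]
print(linear_search(data, 2))    # best case: (0, 1)  -- one comparison
print(linear_search(data, 99))   # worst case: (-1, 10) -- all n comparisons
```

The same function thus runs in constant time on its best input and in time proportional to n on its worst, which is exactly why a single "how fast?" number is not enough.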
19. Worst Case and Best Case
- The examples above are referred to as best-case analysis and worst-case analysis.
- Which is the really relevant case? Worst case is more important, because it gives us a bound on how long the function might have to run.
20. Average Case
- In some situations neither best-case nor worst-case analysis expresses the performance of an algorithm well.
- Average-case analysis can be used if necessary.
- Still, average-case analysis is uncommon because:
  - It may be cumbersome to do an average-case analysis of non-trivial algorithms.
  - In most cases the "order" of the average case is the same as that of the worst case.
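For linear search the average case can be computed exactly by averaging the comparison count over every possible target position. A short sketch showing that the average is (n + 1) / 2 comparisons, the same O(n) order as the worst case:

```python
def comparisons_to_find(items, target):
    """Number of comparisons a linear search makes before finding target."""
    for count, x in enumerate(items, start=1):
        if x == target:
            return count
    return len(items)

n = 1_000
data = list(range(n))

# Average over all n equally likely target positions.
average = sum(comparisons_to_find(data, t) for t in data) / n
print(average)   # 500.5 == (n + 1) / 2 -- still linear in n
```

This illustrates the slide's last point: the average case (n + 1)/2 and the worst case n are both O(n), so the average analysis adds little here.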
21. Comparison of Rates of Growth
(Table comparing growth rates for N, log₂N, N log₂N, N², N³, and 2^N; values not reproduced.)
22. Comparison of Linear and Binary Searches
(Comparison chart not reproduced.)
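The comparison this slide makes can be reproduced numerically: in the worst case linear search does n comparisons, while binary search does about ⌊log₂ n⌋ + 1. A minimal sketch:

```python
import math

for n in (10, 1_000, 1_000_000):
    linear_worst = n                             # may have to check every element
    binary_worst = math.floor(math.log2(n)) + 1  # halves the range at each probe
    print(f"n={n}: linear {linear_worst} comparisons, binary {binary_worst}")
```

At n = 1000000 this is 1000000 comparisons versus 20, which is why binary search dominates the comparison once lists get large.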
23. Big-O Comparison of List Operations
(Table with columns Operation, Unsorted List, Sorted List; entries not reproduced apart from one O(log₂N) cell.)
24. Review Questions
1. What problems arise when we "measure" the performance of an algorithm? What problems arise if we time it?
2. What is a "critical operation"?
3. How, then, do we measure the efficiency of an algorithm?
   - The efficiency of an algorithm is stated as a function of the problem size; we generally use the variable N to represent the problem size.
   - We must also consider the number of steps it takes to process the N items.
4. What is big-O notation? What are the common functions? Give an example of each: constant O(1), logarithmic O(log₂ N), linear O(n), poly-logarithmic O(n log n), polynomial O(n²), exponential O(2ⁿ).
5. What is worst-case analysis?