1 Foundations of Software Design Fall 2002 Marti Hearst Lecture 11: Analysis of Algorithms, cont.


1 Foundations of Software Design Fall 2002 Marti Hearst Lecture 11: Analysis of Algorithms, cont.

2 Function Pecking Order In increasing order: constant, log n, n, n^2, n^3, 2^n (adapted from Goodrich & Tamassia). Where does n log n fit in?

3 Plot them! Plot both x and y on linear scales, then convert the y axis to a log scale (the jump for large n happens because the last number is out of range). Notice how much bigger 2^n is than n^k. This is why exponential growth is BAD BAD BAD!!
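The same effect can be seen numerically without a plot (a minimal sketch of mine, not from the slides; the choice of n^2, n^3, and 2^n as comparison functions is illustrative):

```python
# Tabulate a few growth functions to see how quickly 2^n overtakes
# polynomial growth. By n = 30, 2^n is already about a billion while
# n^3 is only 27,000.
def growth_table(ns):
    """Return rows of (n, n^2, n^3, 2^n) for each n in ns."""
    return [(n, n**2, n**3, 2**n) for n in ns]

for n, sq, cube, exp in growth_table([1, 5, 10, 20, 30]):
    print(f"n={n:>2}  n^2={sq:>4}  n^3={cube:>6}  2^n={exp:>10}")
```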

4 More Plots

5 Let’s Count Some Beer A well-known “song”: –“100 bottles of beer on the wall, 100 bottles of beer; you take one down, pass it around, 99 bottles of beer on the wall.” –“99 bottles of beer on the wall, 99 bottles of beer; you take one down, pass it around, 98 bottles of beer on the wall.” –… –“1 bottle of beer on the wall, 1 bottle of beer; you take it down, pass it around, no bottles of beer on the wall.” –HALT. Let’s change the song to “N bottles of beer on the wall”. The number of bottles of beer passed around is n. Order what?
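The verse-by-verse count can be sketched in code (an illustration of mine, not from the lecture): one bottle is passed per verse, and there are n verses, so the count grows linearly, i.e. O(n).

```python
def bottles_passed(n):
    """Count bottles passed around: one per verse, for verses n down to 1."""
    count = 0
    for _ in range(n, 0, -1):  # n, n-1, ..., 1
        count += 1
    return count

print(bottles_passed(100))  # one bottle per verse: 100 in total
```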

6 Let’s Count Some Ants Another song: –The ants go marching 1 by 1 –The ants go marching 2 by 2 –The ants go marching 3 by 3 How many ants are marching in each wave? 1 + 2 + 3 + … + n in total. Does this remind you of anything?
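A quick check that the wave total matches Gauss’s closed form n(n+1)/2 (a sketch of mine, not from the lecture):

```python
def ants_total(n):
    """Sum 1 + 2 + ... + n by an explicit loop, one wave at a time."""
    total = 0
    for wave in range(1, n + 1):
        total += wave
    return total

# The loop agrees with the closed form n(n+1)/2:
print(ants_total(100))       # 5050
print(100 * 101 // 2)        # 5050
```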

7 Graph it! Let’s plot beer(n) versus ants(n)

8 Definition of Big-Oh A running time is O(g(n)) if there exist constants n0 > 0 and c > 0 such that for all problem sizes n > n0, the running time for a problem of size n is at most c*g(n). In other words, c*g(n) is an upper bound on the running time for sufficiently large n. http://www.cs.dartmouth.edu/~farid/teaching/cs15/cs5/lectures/0519/0519.html

9 The Crossover Point Adapted from http://www.cs.sunysb.edu/~algorith/lectures-good/node2.html One function starts out faster for small values of n. But for n > n0, the other function is always faster.

10 More formally Let f(n) and g(n) be functions mapping nonnegative integers to real numbers. f(n) is O(g(n)) if there exist positive constants n0 and c such that for all n >= n0, f(n) <= c*g(n). Other ways to say this: –f(n) is order g(n) –f(n) is big-Oh of g(n) –f(n) is Oh of g(n) –f(n) ∈ O(g(n)) (set notation)
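The definition can be spot-checked numerically (a sketch of mine; `witnesses_big_oh` is a made-up name, and a finite loop can only refute the inequality, not prove it, since the definition quantifies over all n >= n0):

```python
def witnesses_big_oh(f, g, c, n0, upto=10_000):
    """Check f(n) <= c*g(n) for all n0 <= n <= upto.

    A finite spot-check of the big-Oh definition's inequality for
    proposed witnesses c and n0 -- not a proof.
    """
    return all(f(n) <= c * g(n) for n in range(n0, upto + 1))

# f(n) = 3n + 10 is O(n): c = 4 and n0 = 10 work, since 3n + 10 <= 4n
# exactly when n >= 10. c = 3 does not work for any n0.
print(witnesses_big_oh(lambda n: 3*n + 10, lambda n: n, c=4, n0=10))   # True
print(witnesses_big_oh(lambda n: 3*n + 10, lambda n: n, c=3, n0=10))   # False
```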

11 Comparing Running Times Adapted from Goodrich & Tamassia

12 Analysis Example: Phonebook Given: –A physical phone book, organized in alphabetical order –A name you want to look up –An algorithm in which you search through the book sequentially, from first page to last What is the order of: –The best case running time? –The worst case running time? –The average case running time? What is: –A better algorithm? –The worst case running time for that algorithm?
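The sequential algorithm described above might look like this (a sketch of mine; the sample names are made up). The best case is O(1) (the name is on the first page), while the worst and average cases are O(n):

```python
def sequential_search(names, target):
    """Scan entries one by one, first to last; return index or -1."""
    for i, name in enumerate(names):
        if name == target:
            return i
    return -1

book = ["Adams", "Baker", "Chen", "Diaz", "Evans"]
print(sequential_search(book, "Adams"))   # best case: found immediately
print(sequential_search(book, "Zimmer"))  # worst case: scan everything, not found
```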

13 Analysis Example (Phonebook) This better algorithm is called Binary Search. What is its running time? –First you look in the middle of n elements –Then you look in the middle of n/2 = (1/2)*n elements –Then you look in the middle of (1/2)*(1/2)*n elements –… –Continue until there is only 1 element left –Say you did this m times: (1/2)*(1/2)*…*(1/2)*n, with m factors of 1/2 –Then the number of repetitions is the smallest integer m such that (1/2)^m * n <= 1
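The halving process can be sketched as the standard iterative binary search over a sorted list (my code, not the slide's):

```python
def binary_search(names, target):
    """Repeatedly halve the search range of a sorted list: O(log n)."""
    lo, hi = 0, len(names) - 1
    while lo <= hi:
        mid = (lo + hi) // 2          # look in the middle
        if names[mid] == target:
            return mid
        elif names[mid] < target:
            lo = mid + 1              # discard the lower half
        else:
            hi = mid - 1              # discard the upper half
    return -1

book = ["Adams", "Baker", "Chen", "Diaz", "Evans"]
print(binary_search(book, "Chen"))    # found on the first probe
```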

14 Analyzing Binary Search –In the worst case, the number of repetitions is the smallest integer m such that (1/2)^m * n <= 1 –We can rewrite this as follows: Multiply both sides by 2^m: n <= 2^m Take the log of both sides: log2(n) <= m Since m is the worst case number of repetitions, the algorithm is O(log n)

15 Analysis Example: “prefix averages” You want this mapping from an array of numbers to an array of averages of the preceding numbers (who knows why – not my example): 5 10 15 20 25 30 → 5/1 15/2 30/3 50/4 75/5 105/6 There are two straightforward algorithms: one is easy but wasteful; the other is more efficient, but requires insight into the problem. Adapted from Goodrich & Tamassia

16 Analysis Example Adapted from Goodrich & Tamassia
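The slide's code does not survive in the transcript; here is a sketch of the easy-but-wasteful O(n^2) algorithm it describes (the function name is mine):

```python
def prefix_averages_quadratic(A):
    """For each position i, re-sum A[0..i] from scratch: O(n^2) overall."""
    result = []
    for i in range(len(A)):
        s = 0
        for j in range(i + 1):        # i+1 additions on iteration i
            s += A[j]
        result.append(s / (i + 1))    # average of the first i+1 elements
    return result

print(prefix_averages_quadratic([5, 10, 15, 20, 25, 30]))
# [5.0, 7.5, 10.0, 12.5, 15.0, 17.5]  i.e. 5/1, 15/2, 30/3, 50/4, 75/5, 105/6
```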

17 Analysis Example For each position i in A, you look at the values of all the elements that came before. What is the number of positions in the largest part? –When i=n, you look at n positions –When i=n-1, you look at n-1 positions –When i=n-2, you look at n-2 positions –… –When i=2, you look at 2 positions –When i=1, you look at 1 position This should look familiar…

18 Analysis Example A useful tool: store partial information in a variable! This uses space to save time. The key – don’t divide s itself; keep it as the running sum. Eliminates one for loop – always a good thing to do. Adapted from Goodrich & Tamassia
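The slide's code is likewise missing from the transcript; a sketch of the O(n) version that stores the partial sum in a variable s and divides only when producing each output (names are mine):

```python
def prefix_averages_linear(A):
    """Keep a running sum s instead of recomputing it: O(n) overall."""
    result = []
    s = 0
    for i, x in enumerate(A):
        s += x                        # partial information stored in s
        result.append(s / (i + 1))    # divide a copy for output; s stays a sum
    return result

print(prefix_averages_linear([5, 10, 15, 20, 25, 30]))
# [5.0, 7.5, 10.0, 12.5, 15.0, 17.5]  -- same answers, one loop instead of two
```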

19 Summary: Analysis of Algorithms A method for determining, in an abstract way, the asymptotic running time of an algorithm –Here asymptotic means as n gets very large Useful for comparing algorithms Useful also for determining tractability –Meaning, a way to determine whether a problem is intractable (effectively unsolvable for large inputs) or not –Exponential time algorithms are usually intractable. We’ll revisit these ideas throughout the rest of the course.

20 Next Time: Stacks and Queues

