15-211 Fundamental Data Structures and Algorithms
Sorting
Peter Lee
February 20, 2003
Announcements
Homework #4 is out. Get started today!
Reading: Chapter 8
Quiz #2 available on Tuesday
Objects in calendar are closer than they appear.
Introduction to Sorting
Comparison-based sorting
We assume:
Items are stored in an array.
Items can be moved around in the array.
Any two array elements can be compared.
A comparison has 3 possible outcomes: <, =, >
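For concreteness, the three-outcome comparison can be sketched in Java (an illustrative fragment, not from the lecture; Java's own convention is a negative, zero, or positive int from a comparison):

    class Comparisons {
        // The three possible outcomes of comparing two array elements.
        static char compare(int[] a, int i, int j) {
            int c = Integer.compare(a[i], a[j]);
            if (c < 0) return '<';
            if (c > 0) return '>';
            return '=';
        }
        public static void main(String[] args) {
            int[] a = {24, 47, 13};
            System.out.println(compare(a, 0, 1));  // prints <
            System.out.println(compare(a, 1, 2));  // prints >
        }
    }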
Flips and inversions
An unsorted array: 24 47 13 99 105 222
The pair (47, 13) is an inversion: a pair of elements that is out of order. Since 47 and 13 are adjacent, it is also a flip.
Naïve sorting algorithms
Bubble sort: keep scanning for flips, fixing them by swapping, until none remain.
Example: 24 47 13 99 105 222 → 24 13 47 99 105 222 → 13 24 47 99 105 222 (swap 47 and 13, then 24 and 13).
What is the running time?
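A minimal Java sketch of bubble sort as just described (names are illustrative, not the course's reference code):

    static void bubbleSort(int[] a) {
        boolean swapped = true;
        while (swapped) {                      // keep scanning until no flips remain
            swapped = false;
            for (int i = 0; i + 1 < a.length; i++) {
                if (a[i] > a[i + 1]) {         // found a flip: fix it by swapping
                    int t = a[i]; a[i] = a[i + 1]; a[i + 1] = t;
                    swapped = true;
                }
            }
        }
    }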
Insertion sort
Maintain a sorted sublist at the front of the array and insert each new element into it.
Example: 105 47 13 99 30 222 → 47 105 13 99 30 222 (after inserting 47 into the sorted sublist).
Insertion sort:
for i = 2 to n do
    insert a[i] in the proper place in a[1:i-1]
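A direct Java translation of this pseudocode, shifted to 0-based indexing (a sketch under that assumption):

    static void insertionSort(int[] a) {
        for (int i = 1; i < a.length; i++) {   // the pseudocode's i = 2..n
            int x = a[i];
            int j = i - 1;
            while (j >= 0 && a[j] > x) {       // shift larger elements of the
                a[j + 1] = a[j];               // sorted sublist one slot right
                j--;
            }
            a[j + 1] = x;                      // insert a[i] in its proper place
        }
    }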
How fast is insertion sort?
It takes O(#inversions) steps, which is very fast if the array is nearly sorted to begin with.
How many inversions?
Consider the worst case, a reverse-sorted array:
n n-1 … 2 1
In this case, there are
(n-1) + (n-2) + … + 1 = n(n-1)/2
inversions, which is O(n^2).
How many inversions?
What about the average case? Consider a permutation:
p = x1 x2 x3 … xn-1 xn
For any such p, let rev(p) be its reversal. Then each pair (xi, xj) is an inversion in exactly one of p and rev(p). There are n(n-1)/2 pairs (xi, xj), hence the average number of inversions in a permutation is n(n-1)/4, which is O(n^2).
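This average can be checked empirically with a brute-force inversion counter (a hypothetical helper, not part of the lecture); averaging it over many random permutations of length n should approach n(n-1)/4:

    static long countInversions(int[] a) {
        long inv = 0;
        for (int i = 0; i < a.length; i++)
            for (int j = i + 1; j < a.length; j++)
                if (a[i] > a[j]) inv++;        // pair (i, j) is out of order
        return inv;
    }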
How long does it take to sort?
Can we do better than O(n^2)? In the worst case? In the average case?
Heapsort
Remember heaps:
buildHeap has O(n) worst-case running time.
deleteMin has O(log n) worst-case running time.
Heapsort:
1. Build heap: O(n)
2. DeleteMin until empty: O(n log n)
Total worst case: O(n log n)
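A Java sketch of the two heapsort steps, using java.util.PriorityQueue as the heap (illustrative; note that adding elements one at a time is O(n log n), whereas a true buildHeap is O(n)):

    import java.util.PriorityQueue;

    static int[] heapSort(int[] a) {
        PriorityQueue<Integer> heap = new PriorityQueue<>();
        for (int x : a) heap.add(x);           // build the heap
        int[] out = new int[a.length];
        for (int i = 0; i < out.length; i++)
            out[i] = heap.poll();              // deleteMin until empty
        return out;
    }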
N^2 vs. N log N
(chart comparing the growth of N^2 and N log N)
Sorting in O(n log n)
Heapsort establishes the fact that sorting can be accomplished in O(n log n) worst-case running time.
Heapsort in practice
The average-case analysis for heapsort is somewhat complex. In practice, heapsort consistently tends to use nearly n log n comparisons. So, while the worst case is better than n^2, other algorithms sometimes work better.
Shellsort
Shellsort, like insertion sort, is based on swapping inverted pairs. It achieves O(n^(4/3)) running time. [See your book for details.]
Shellsort
Example with gap sequence 3, 1:
105 47 13 99 30 222
 99 47 13 105 30 222   (gap 3: swap 105 and 99)
 99 30 13 105 47 222   (gap 3: swap 47 and 30)
Several inverted pairs are fixed in one exchange.
Then gap 1 (ordinary insertion sort):
 30 99 13 105 47 222
 30 13 99 105 47 222
 13 30 99 105 47 222
 …
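A Java sketch of shellsort with an explicit gap sequence (the 3, 1 sequence above is for illustration only; the advertised bound depends on the gap sequence chosen):

    static void shellSort(int[] a, int[] gaps) {   // e.g. gaps = {3, 1}; must end with 1
        for (int gap : gaps) {
            // gap-insertion sort: one exchange can fix several inverted pairs
            for (int i = gap; i < a.length; i++) {
                int x = a[i];
                int j = i;
                while (j >= gap && a[j - gap] > x) {
                    a[j] = a[j - gap];
                    j -= gap;
                }
                a[j] = x;
            }
        }
    }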
Recursive Sorting
Recursive sorting
Intuitively, divide the problem into pieces and then recombine the results.
If array is length 1, then done.
If array is length N > 1, then split in half and sort each half. Then combine the results.
An example of a divide-and-conquer algorithm.
Divide-and-conquer
(diagram: the problem is divided into subproblems, each is solved, and the results are combined)
Why divide-and-conquer works
Suppose the amount of work required to divide and recombine is linear, that is, O(n). Suppose also that the amount of work to complete each step directly is greater than O(n). Then each dividing step reduces the amount of work by more than a linear amount, while requiring only linear work to do so.
Divide-and-conquer is big
We will see several examples of divide-and-conquer in this course.
Recursive sorting
If array is length 1, then done.
If array is length N > 1, then split in half and sort each half. Then combine the results.
Analysis of recursive sorting
Suppose it takes time T(N) to sort N elements. Suppose also it takes time N to combine the two sorted arrays. Then:
T(1) = 1
T(N) = 2T(N/2) + N, for N > 1
Solving for T gives the running time of the recursive sorting algorithm.
Remember recurrence relations?
Systems of equations such as
T(1) = 1
T(N) = 2T(N/2) + N, for N > 1
are called recurrence relations (or sometimes recurrence equations).
A solution
A solution for
T(N) = 2T(N/2) + N
is given by
T(N) = N log N + N
which is O(N log N).
How do we solve such equations?
Recurrence relations
There are several methods for solving recurrence relations. It is also useful sometimes to check that a solution is valid. This is done by induction.
Checking a solution
Base case:
T(1) = 1 log 1 + 1 = 1
Inductive case: assume T(M) = M log M + M for all M < N. Then
T(N) = 2T(N/2) + N
     = 2((N/2) log(N/2) + N/2) + N
     = N(log N - log 2) + 2N
     = N log N - N + 2N     (logs are base 2, so log 2 = 1)
     = N log N + N
Logarithms
Some useful equalities:
x^A = B iff log_x B = A
log 1 = 0
log(AB) = log A + log B, if A, B > 0
log(A/B) = log A - log B, if A, B > 0
log(A^B) = B log A
Upper bounds for recurrence relations
Divide-and-conquer algorithms are very useful in practice. Furthermore, they all tend to generate similar recurrence relations. As a result, approximate upper-bound solutions are well-known for recurrence relations derived from divide-and-conquer algorithms.
Divide-and-Conquer Theorem
Theorem: Let a, b, c ≥ 0. The recurrence relation
T(1) = b
T(N) = aT(N/c) + bN
for any N which is a power of c has upper-bound solutions
T(N) = O(N)            if a < c
T(N) = O(N log N)      if a = c
T(N) = O(N^(log_c a))  if a > c
For recursive sorting, a = 2, b = 1, c = 2, so T(N) = O(N log N).
Upper-bounds
Corollary: Dividing a problem into p pieces, each of size N/p, using only a linear amount of work, results in an O(N log N) algorithm.
Upper-bounds
Proof of this theorem later in the semester.
Exact solutions
Recall from earlier in the semester that it is sometimes possible to derive closed-form solutions to recurrence relations. Several methods exist for doing this. As an example, consider again our current equations:
T(1) = 1
T(N) = 2T(N/2) + N, for N > 1
Repeated substitution method
One technique is to use repeated substitution:
T(N) = 2T(N/2) + N
2T(N/2) = 2(2T(N/4) + N/2) = 4T(N/4) + N
so T(N) = 4T(N/4) + 2N
4T(N/4) = 4(2T(N/8) + N/4) = 8T(N/8) + N
so T(N) = 8T(N/8) + 3N
…
T(N) = 2^k T(N/2^k) + kN
Repeated substitution, cont'd
We end up with
T(N) = 2^k T(N/2^k) + kN, for all k ≥ 1
Let's use k = log N. Note that 2^(log N) = N. So:
T(N) = N T(1) + N log N = N log N + N
Other methods
There are also other methods for solving recurrence relations. For example, the "telescoping method"…
Telescoping method
We start with
T(N) = 2T(N/2) + N
Divide both sides by N to get
T(N)/N = (2T(N/2) + N)/N = T(N/2)/(N/2) + 1
This is valid for any N that is a power of 2, so we can write the following:
Telescoping method, cont'd
Additional equations:
T(N)/N = T(N/2)/(N/2) + 1
T(N/2)/(N/2) = T(N/4)/(N/4) + 1
T(N/4)/(N/4) = T(N/8)/(N/8) + 1
…
T(2)/2 = T(1)/1 + 1
What happens when we sum all the left-hand and right-hand sides?
Telescoping method, cont'd
Summing all the equations, the intermediate terms cancel, and we are left with:
T(N)/N = T(1)/1 + log(N)
Telescoping method, cont'd
We are left with
T(N)/N = T(1)/1 + log(N)
Multiplying both sides by N gives
T(N) = N log(N) + N
Mergesort
Mergesort
Mergesort is the most basic recursive sorting algorithm.
Divide array in halves A and B.
Recursively mergesort each half.
Combine A and B by successively looking at the first elements of A and B and moving the smaller one to the result array.
Note: one should be careful to avoid creating lots of result arrays.
Mergesort
(diagram of the recursive splits and merges)
But we don't actually want to create all of these arrays!
Mergesort
Use simple indexes to perform the split.
Use a single extra array to hold each intermediate result.
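A Java sketch following this advice, with index-based splits and one shared temporary array (illustrative, not the course's reference implementation):

    static void mergeSort(int[] a) {
        mergeSort(a, new int[a.length], 0, a.length - 1);
    }

    static void mergeSort(int[] a, int[] tmp, int lo, int hi) {
        if (lo >= hi) return;                  // length 0 or 1: done
        int mid = (lo + hi) / 2;
        mergeSort(a, tmp, lo, mid);            // sort each half...
        mergeSort(a, tmp, mid + 1, hi);
        int i = lo, j = mid + 1;               // ...then merge into tmp
        for (int k = lo; k <= hi; k++)
            tmp[k] = (j > hi || (i <= mid && a[i] <= a[j])) ? a[i++] : a[j++];
        System.arraycopy(tmp, lo, a, lo, hi - lo + 1);
    }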
Analysis of mergesort
Mergesort generates almost exactly the same recurrence relation shown before:
T(1) = 1
T(N) = 2T(N/2) + N - 1, for N > 1
Thus, mergesort is O(N log N).
Quicksort
Quicksort
Quicksort was invented in 1960 by Tony Hoare.
Although it has O(N^2) worst-case performance, on average it is O(N log N). More importantly, it is the fastest known comparison-based sorting algorithm in practice.
Quicksort idea
1. Choose a pivot.
2. Rearrange so that the pivot is in the "right" spot.
3. Recurse on each half and conquer!
Quicksort algorithm
If array A has 1 (or 0) elements, then done. Otherwise:
Choose a pivot element x from A.
Divide A - {x} into two arrays:
B = {y ∈ A - {x} | y ≤ x}
C = {y ∈ A - {x} | y ≥ x}
Quicksort arrays B and C.
Result is B + {x} + C.
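A direct, not-in-place Java rendering of this description (illustrative; duplicates of the pivot are sent to B here, one of the choices the slides leave open):

    import java.util.ArrayList;
    import java.util.List;

    static List<Integer> quickSort(List<Integer> a) {
        if (a.size() <= 1) return a;                 // 1 (or 0) elements: done
        int x = a.get(0);                            // naive pivot choice: first element
        List<Integer> b = new ArrayList<>(), c = new ArrayList<>();
        for (int y : a.subList(1, a.size()))
            (y <= x ? b : c).add(y);                 // B gets y <= x, C gets y > x
        List<Integer> result = new ArrayList<>(quickSort(b));
        result.add(x);                               // result is B + {x} + C
        result.addAll(quickSort(c));
        return result;
    }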
Quicksort algorithm
Example: 105 47 13 17 30 222 5 19 with pivot 19 partitions into (5 17 13) 19 (47 30 222 105); quicksort then recurses on each side (diagram).
Quicksort algorithm
In practice, insertion sort is used once the subarrays get "small enough".
Doing quicksort in place
(diagram: two indexes L and R scan inward from the ends of the array, swapping out-of-place elements, until L and R cross)
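A Java sketch of this two-pointer scheme, one standard Hoare-style variant (getting these boundary conditions right is exactly what made quicksort hard to implement correctly; an illustration, not the lecture's reference code):

    static void quickSort(int[] a, int lo, int hi) {
        if (lo >= hi) return;
        int pivot = a[(lo + hi) / 2];          // one common pivot choice: middle element
        int L = lo, R = hi;
        while (L <= R) {
            while (a[L] < pivot) L++;          // L scans right for an element >= pivot
            while (a[R] > pivot) R--;          // R scans left for an element <= pivot
            if (L <= R) {                      // swap the out-of-place pair
                int t = a[L]; a[L] = a[R]; a[R] = t;
                L++; R--;
            }
        }
        quickSort(a, lo, R);                   // recurse on each half
        quickSort(a, L, hi);
    }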
Quicksort is fast but hard to do
Quicksort, in the early 1960's, was famous for being incorrectly implemented many times. (More about invariants next time.)
Quicksort is very fast in practice, faster than mergesort, because quicksort can be done "in place".
Informal analysis
If there are duplicate elements, then the algorithm does not specify which subarray, B or C, should get them. Ideally, they split down the middle.
Also, it is not specified how to choose the pivot. Ideally it would be the median value of the array, but this would be expensive to compute.
As a result, it is possible that quicksort will show O(N^2) behavior.
Worst-case behavior
Example: starting from 105 47 13 17 30 222 5 19, if the pivot is always the smallest remaining element, each partition peels off only one element, giving N levels of recursion (diagram).
Analysis of quicksort
Assume a random pivot.
T(0) = 1
T(1) = 1
T(N) = T(i) + T(N-i-1) + cN, for N > 1
where i is the size of the left subarray.
Worst-case analysis
If the pivot is always the smallest element, then:
T(N) = T(0) + T(N-1) + cN, for N > 1
     ≈ T(N-1) + cN
     = O(N^2)
See the book for details on this solution.
Best-case analysis
In the best case, the pivot is always the median element. In that case, the splits are always "down the middle". Hence, the behavior is the same as mergesort: O(N log N).
Average-case analysis
Consider the quicksort tree for 105 47 13 17 30 222 5 19 (diagram).
Average-case analysis
The time spent at each level of the tree is O(N). So, on average, how many levels are there? That is, what is the expected height of the tree? If on average there are O(log N) levels, then quicksort is O(N log N) on average.
Average-case analysis
We'll answer this question next time…
Summary of quicksort
A fast sorting algorithm in practice.
Can be implemented in-place.
But it is O(N^2) in the worst case.
Average-case performance?