Quicksort CS 3358 Data Structures
Sorting II/ Slide 2 Introduction
* Fastest known sorting algorithm in practice
  - Average case: O(N log N)
  - Worst case: O(N^2), but the worst case seldom happens
* Another divide-and-conquer recursive algorithm, like mergesort
Sorting II/ Slide 3 Quicksort
* Divide step:
  - Pick any element (pivot) v in S
  - Partition S – {v} into two disjoint groups
    S1 = {x ∈ S – {v} | x <= v}
    S2 = {x ∈ S – {v} | x >= v}
* Conquer step: recursively sort S1 and S2
* Combine step: the sorted S1 (by the time we return from the recursion), followed by v, followed by the sorted S2 (i.e., nothing extra needs to be done)
Sorting II/ Slide 4 Example: Quicksort
Sorting II/ Slide 5 Example: Quicksort...
Sorting II/ Slide 6 Pseudocode
Input: an array A[p..r]

Quicksort (A, p, r) {
    if (p < r) {
        q = Partition (A, p, r)  // q is the position of the pivot element
        Quicksort (A, p, q-1)
        Quicksort (A, q+1, r)
    }
}
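A runnable version of this pseudocode might look as follows. This is a sketch using a simple last-element pivot (a Lomuto-style partition); the slides later refine the pivot choice and partitioning scheme, so treat the pivot rule here as an assumption for illustration only.

```cpp
#include <algorithm>
#include <cassert>
#include <utility>
#include <vector>

// Partition A[p..r] around the pivot A[r]; return the pivot's final index q.
int Partition(std::vector<int>& A, int p, int r) {
    int pivot = A[r];
    int q = p;                          // next slot for an element <= pivot
    for (int k = p; k < r; ++k)
        if (A[k] <= pivot)
            std::swap(A[k], A[q++]);
    std::swap(A[q], A[r]);              // put the pivot between the two groups
    return q;
}

// Sort A[p..r] exactly as in the pseudocode: partition, then recurse on both sides.
void Quicksort(std::vector<int>& A, int p, int r) {
    if (p < r) {
        int q = Partition(A, p, r);     // q is the position of the pivot element
        Quicksort(A, p, q - 1);
        Quicksort(A, q + 1, r);
    }
}
```

Note that, unlike mergesort, there is no work after the recursive calls return: the combine step is free.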
Sorting II/ Slide 7 Partitioning
* Partitioning
  - Key step of the quicksort algorithm
  - Goal: given the picked pivot, partition the remaining elements into two smaller sets
  - Many ways to implement
  - Even the slightest deviations may cause surprisingly bad results
* We will learn an easy and efficient partitioning strategy here
* How to pick a pivot will be discussed later
Sorting II/ Slide 8 Partitioning Strategy
* Want to partition an array A[left..right]
* First, get the pivot element out of the way by swapping it with the last element (swap pivot and A[right])
* Let i start at the first element and j start at the next-to-last element (i = left, j = right – 1)
Sorting II/ Slide 9 Partitioning Strategy
* Want to have
  - A[p] <= pivot, for p < i
  - A[p] >= pivot, for p > j
* While i < j
  - Move i right, skipping over elements smaller than the pivot
  - Move j left, skipping over elements greater than the pivot
  - When both i and j have stopped:
    A[i] >= pivot
    A[j] <= pivot
Sorting II/ Slide 10 Partitioning Strategy
* When i and j have stopped and i is to the left of j
  - Swap A[i] and A[j]: the large element is pushed to the right and the small element is pushed to the left
  - After swapping:
    A[i] <= pivot
    A[j] >= pivot
  - Repeat the process until i and j cross
Sorting II/ Slide 11 Partitioning Strategy
* When i and j have crossed
  - Swap A[i] and the pivot
* Result:
  - A[p] <= pivot, for p < i
  - A[p] >= pivot, for p > i
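The strategy on slides 8–11 can be sketched as a single function. This assumes the pivot has already been swapped to A[right], as slide 8 describes; the exact stopping and swapping details follow the slides, but the code is an illustrative sketch, not the course's official implementation.

```cpp
#include <cassert>
#include <utility>
#include <vector>

// Partition A[left..right], where the pivot was already swapped to A[right].
// Returns the pivot's final position.
int partition(std::vector<int>& A, int left, int right) {
    int pivot = A[right];
    int i = left, j = right - 1;         // i = left, j = right - 1 (slide 8)
    for (;;) {
        while (i <= j && A[i] < pivot) ++i;  // skip elements smaller than pivot
        while (j >= i && A[j] > pivot) --j;  // skip elements greater than pivot
        if (i >= j) break;                   // i and j have crossed: stop
        std::swap(A[i], A[j]);               // push large right, small left
        ++i; --j;
    }
    std::swap(A[i], A[right]);               // swap A[i] and the pivot
    return i;
}
```

After the final swap, everything left of the returned position is <= pivot and everything right of it is >= pivot, exactly the result stated above.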
Sorting II/ Slide 12 Small arrays
* For very small arrays, quicksort does not perform as well as insertion sort
  - How small depends on many factors, such as the time spent making a recursive call, the compiler, etc.
* Do not use quicksort recursively for small arrays
  - Instead, use a sorting algorithm that is efficient for small arrays, such as insertion sort
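For reference, the small-array routine mentioned here can be sketched as a range-based insertion sort (the `left`/`right` parameters are chosen so it can drop into a quicksort recursion; the function name is ours, not the slides'):

```cpp
#include <cassert>
#include <vector>

// Insertion sort on A[left..right]; efficient for small or nearly-sorted ranges.
void insertionSort(std::vector<int>& A, int left, int right) {
    for (int i = left + 1; i <= right; ++i) {
        int tmp = A[i];
        int j = i;
        while (j > left && A[j - 1] > tmp) {  // shift larger elements right
            A[j] = A[j - 1];
            --j;
        }
        A[j] = tmp;                            // insert A[i] into place
    }
}
```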
Sorting II/ Slide 13 Picking the Pivot
* Use the first element as pivot
  - If the input is random: OK
  - If the input is presorted (or in reverse order), all the elements go into S2 (or S1)
    - This happens consistently throughout the recursive calls
    - Results in O(N^2) behavior (we analyze this case later)
* Choose the pivot randomly
  - Generally safe
  - Random number generation can be expensive
Sorting II/ Slide 14 Picking the Pivot
* Use the median of the array
  - Partitioning always cuts the array roughly in half
  - An optimal quicksort (O(N log N))
  - However, it is hard to find the exact median (e.g., sorting the array just to pick the middle value)
Sorting II/ Slide 15 Pivot: median of three
* We will use the median of three
  - Compare just three elements: the leftmost, rightmost and center
  - Swap these elements if necessary so that
    A[left] = smallest
    A[right] = largest
    A[center] = median of three
  - Pick A[center] as the pivot
  - Swap A[center] and A[right – 1] so that the pivot is at the second-to-last position (why?)
Sorting II/ Slide 16 Pivot: median of three
(Figure: worked example on a small array)
* A[left] = 2, A[center] = 13, A[right] = 6
* Swap A[center] and A[right]
* Choose A[center] as the pivot
* Swap the pivot and A[right – 1]
* Note we only need to partition A[left + 1 .. right – 2]. Why?
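The median-of-three step can be sketched as follows; the test array below is an illustrative reconstruction from the slide's values (A[left] = 2, A[center] = 13, A[right] = 6), not the slide's exact figure.

```cpp
#include <cassert>
#include <utility>
#include <vector>

// Order A[left], A[center], A[right] so that A[left] <= A[center] <= A[right],
// then hide the pivot (the median) at A[right - 1]. Returns the pivot value.
int median3(std::vector<int>& A, int left, int right) {
    int center = (left + right) / 2;
    if (A[center] < A[left])   std::swap(A[center], A[left]);
    if (A[right]  < A[left])   std::swap(A[right],  A[left]);
    if (A[right]  < A[center]) std::swap(A[right],  A[center]);
    std::swap(A[center], A[right - 1]);  // pivot goes to the second-to-last slot
    return A[right - 1];
}
```

Because A[left] <= pivot and A[right] >= pivot after this step, those two cells act as sentinels, which is why only A[left + 1 .. right – 2] still needs partitioning.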
Sorting II/ Slide 17 Main Quicksort Routine
(Code figure: the routine uses a cutoff for small arrays, chooses the pivot, partitions, then recurses.)
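The main routine on this slide can be sketched as follows, combining the pieces developed so far: a small-array cutoff, median-of-three pivot selection, the sentinel-based partition loop, and the two recursive calls. The CUTOFF value and helper names are our assumptions, not necessarily the course's code.

```cpp
#include <algorithm>
#include <cassert>
#include <utility>
#include <vector>

const int CUTOFF = 10;  // assumed cutoff; slide 26 suggests N = 10

// Insertion sort on A[left..right], used below the cutoff (for small arrays).
void insertionSort(std::vector<int>& A, int left, int right) {
    for (int i = left + 1; i <= right; ++i) {
        int tmp = A[i];
        int j = i;
        for (; j > left && A[j - 1] > tmp; --j)
            A[j] = A[j - 1];
        A[j] = tmp;
    }
}

// Median-of-three: order A[left], A[center], A[right], hide pivot at right - 1.
int median3(std::vector<int>& A, int left, int right) {
    int center = (left + right) / 2;
    if (A[center] < A[left])   std::swap(A[center], A[left]);
    if (A[right]  < A[left])   std::swap(A[right],  A[left]);
    if (A[right]  < A[center]) std::swap(A[right],  A[center]);
    std::swap(A[center], A[right - 1]);
    return A[right - 1];
}

void quicksort(std::vector<int>& A, int left, int right) {
    if (right - left + 1 <= CUTOFF) {        // for small arrays
        insertionSort(A, left, right);
        return;
    }
    int pivot = median3(A, left, right);     // choose pivot
    int i = left, j = right - 1;             // partitioning
    for (;;) {
        while (A[++i] < pivot) {}            // A[right - 1] = pivot stops i
        while (pivot < A[--j]) {}            // A[left] <= pivot stops j
        if (i < j)
            std::swap(A[i], A[j]);
        else
            break;
    }
    std::swap(A[i], A[right - 1]);           // restore pivot
    quicksort(A, left, i - 1);               // recursion on S1
    quicksort(A, i + 1, right);              // recursion on S2
}
```

Note how the sentinels from median3 let the two inner while loops omit bounds checks, which is part of why this inner loop is so fast.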
Sorting II/ Slide 18 Partitioning Part
* Works only if the pivot is picked as the median of three
  - Then A[left] <= pivot and A[right] >= pivot
  - Thus, we only need to partition A[left + 1 .. right – 2]
* j will not run past the beginning
  - because A[left] <= pivot
* i will not run past the end
  - because A[right – 1] = pivot
Sorting II/ Slide 19 Quicksort Faster than Mergesort
* Both quicksort and mergesort take O(N log N) in the average case
* Why is quicksort faster than mergesort?
  - Its inner loop consists of an increment/decrement (by 1, which is fast), a test and a jump
  - There is no extra juggling as in mergesort
Sorting II/ Slide 20 Analysis
* Assumptions:
  - A random pivot (no median-of-three partitioning)
  - No cutoff for small arrays
* Running time
  - Pivot selection: constant time, i.e. O(1)
  - Partitioning: linear time, i.e. O(N)
  - Plus the running time of the two recursive calls
* T(N) = T(i) + T(N – i – 1) + cN, where c is a constant
  - i: number of elements in S1
Sorting II/ Slide 21 Worst-Case Analysis
* What is the worst case?
  - The pivot is the smallest element, every time
  - Partition is always maximally unbalanced
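In this case S1 is always empty, so the recurrence from slide 20 collapses to T(N) = T(N – 1) + cN; expanding it gives the quadratic bound:

```latex
T(N) = T(N-1) + cN
     = T(N-2) + c(N-1) + cN
     = \dots
     = T(1) + c\sum_{k=2}^{N} k
     = O(N^2)
```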
Sorting II/ Slide 22 Best-Case Analysis
* What is the best case?
  - Partition is perfectly balanced
  - The pivot is always in the middle (the median of the array)
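Here the recurrence becomes T(N) = 2T(N/2) + cN, the same as mergesort's; dividing through by N and telescoping gives the sketch:

```latex
\frac{T(N)}{N} = \frac{T(N/2)}{N/2} + c
             = \frac{T(N/4)}{N/4} + 2c
             = \dots
             = T(1) + c\log_2 N
\quad\Rightarrow\quad T(N) = cN\log_2 N + N\,T(1) = O(N\log N)
```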
Sorting II/ Slide 23 Average-Case Analysis
* Assume each of the sizes for S1 is equally likely
* This assumption is valid for our pivoting (median-of-three) strategy
* On average, the running time is O(N log N)
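Under this assumption each split size i has probability 1/N, so on average each recursive call costs (1/N) Σ T(j); a sketch of the standard solution (subtract the recurrence at N – 1, then telescope):

```latex
T(N) = \frac{2}{N}\sum_{j=0}^{N-1} T(j) + cN
\;\Rightarrow\; N\,T(N) - (N-1)\,T(N-1) = 2\,T(N-1) + 2cN - c
\;\Rightarrow\; \frac{T(N)}{N+1} \approx \frac{T(N-1)}{N} + \frac{2c}{N+1}
\;\Rightarrow\; T(N) = O(N \log N)
```

The last step follows because the telescoped sum of 2c/(k+1) terms is a harmonic series, which is O(log N).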
Sorting II/ Slide 24 Quicksort * Quicksort’s reputation * Quicksort’s implementation * Quicksort for distinct elements * Quicksort for duplicate elements * Quicksort for small arrays * Comparison of Quicksort and Mergesort
Sorting II/ Slide 25 Quicksort’s reputation
* "For many years, [Quicksort] had the reputation of being an algorithm that could in theory be highly optimized but in practice was impossible to code correctly"
Sorting II/ Slide 26 Quicksort for Small Arrays
* For very small arrays (N <= 20), quicksort does not perform as well as insertion sort
* A good cutoff range is N = 10
* Switching to insertion sort for small arrays can save about 15% of the running time
Sorting II/ Slide 27 Comparisons of Mergesort and Quicksort
* Both run in O(N log N) on average
* Compared with quicksort, mergesort performs fewer comparisons but moves more elements
* In Java, an element comparison is expensive but moving elements is cheap; therefore, mergesort is used in the standard Java library for generic (object) sorting
Sorting II/ Slide 28 Comparisons of Mergesort and Quicksort
* In C++, copying objects can be expensive while comparing objects is often relatively cheap; therefore, quicksort is the sorting routine commonly used in C++ libraries