1 Quick Sort (11.2) CSE 2011 Winter 2011

2 Quick Sort
- Fastest known sorting algorithm in practice
  - Average case: O(N log N)
  - Worst case: O(N^2), but the worst case can be made exponentially unlikely
- Another divide-and-conquer recursive algorithm, like merge sort

3 Quick Sort: Main Idea
1. If the number of elements in S is 0 or 1, then return (base case).
2. Pick any element v in S (called the pivot).
3. Partition the elements of S except v into two disjoint groups:
   - S1 = {x ∈ S - {v} | x ≤ v}
   - S2 = {x ∈ S - {v} | x ≥ v}
4. Return {QuickSort(S1) + v + QuickSort(S2)}
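The four steps translate almost directly into code. Below is a minimal, not-in-place sketch of the idea (the class and method names, and the use of Integer lists, are my own illustrative choices, not from the slides); the in-place version used in the rest of the lecture is developed on the partitioning slides that follow.

    import java.util.ArrayList;
    import java.util.List;

    public class QuickSortIdea {
        // Direct translation of the four steps above: base case, pick a pivot,
        // split the remaining elements into S1 (<= pivot) and S2 (>= pivot),
        // then recurse and concatenate.
        static List<Integer> quickSort(List<Integer> s) {
            if (s.size() <= 1) return s;                 // step 1: base case
            int v = s.get(0);                            // step 2: pick any element as the pivot
            List<Integer> s1 = new ArrayList<>();
            List<Integer> s2 = new ArrayList<>();
            for (int k = 1; k < s.size(); k++) {         // step 3: partition S - {v}
                if (s.get(k) <= v) s1.add(s.get(k));     // ties may go to either group
                else s2.add(s.get(k));
            }
            List<Integer> result = new ArrayList<>(quickSort(s1));  // step 4:
            result.add(v);                                           // QuickSort(S1) + v + QuickSort(S2)
            result.addAll(quickSort(s2));
            return result;
        }
    }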

4 Quick Sort: Example

5 Example of Quick Sort...

6 Issues To Consider
- How to pick the pivot?
  - Many methods (discussed later)
- How to partition?
  - Several methods exist.
  - The one we consider is known to give good results and to be easy and efficient.
  - We discuss the partition strategy first.

7 Partitioning Strategy
- For now, assume that pivot = A[(left+right)/2]. We want to partition array A[left .. right].
- First, get the pivot element out of the way by swapping it with the last element (swap pivot and A[right]).
- Let i start at the first element and j start at the next-to-last element (i = left, j = right - 1).
- (Diagram: pivot now at A[right]; i and j mark the two ends of the region to scan.)

8 Partitioning Strategy
- Want to have:
  - A[k] ≤ pivot, for k < i
  - A[k] ≥ pivot, for k > j
- While i < j:
  - Move i right, skipping over elements smaller than the pivot.
  - Move j left, skipping over elements greater than the pivot.
  - When both i and j have stopped:
    - A[i] ≥ pivot and A[j] ≤ pivot
    - A[i] and A[j] should now be swapped.

9 Partitioning Strategy (2)
- When i and j have stopped and i is to the left of j (thus legal):
  - Swap A[i] and A[j].
  - The large element is pushed to the right and the small element is pushed to the left.
- After swapping:
  - A[i] ≤ pivot
  - A[j] ≥ pivot
- Repeat the process until i and j cross.

10 Partitioning Strategy (3)
- When i and j have crossed:
  - Swap A[i] and the pivot (currently at A[right]).
- Result:
  - A[k] ≤ pivot, for k < i
  - A[k] ≥ pivot, for k > i
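Put together, slides 7-10 describe the following in-place partition. This is a hedged sketch under the slide-7 convention (pivot already swapped into A[right]); the method name partition, the explicit bound check on j, and returning the pivot's final index are my additions. The median-of-three variant on slide 19 parks the pivot at A[right-1] instead and makes the bound check unnecessary.

    public class Partitioning {
        // Scan-and-swap partition of a[left .. right], assuming the chosen pivot
        // has already been swapped into a[right] (slide 7).
        static int partition(int[] a, int left, int right) {
            int pivot = a[right];
            int i = left;               // scans right, stops at an element >= pivot
            int j = right - 1;          // scans left, stops at an element <= pivot
            while (true) {
                while (a[i] < pivot) i++;                  // a[right] == pivot stops this scan
                while (j > left && a[j] > pivot) j--;      // explicit bound; not needed with median-of-three
                if (i >= j) break;                         // i and j have crossed: stop (slide 10)
                int tmp = a[i]; a[i] = a[j]; a[j] = tmp;   // push the large element right, the small one left (slide 9)
                i++;
                j--;
            }
            int tmp = a[i]; a[i] = a[right]; a[right] = tmp;  // swap a[i] and the pivot (slide 10)
            return i;   // now a[k] <= pivot for k < i and a[k] >= pivot for k > i
        }
    }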

11 Picking the Pivot
- There are several ways to pick a pivot.
- Objective: choose a pivot so that we get two partitions of (almost) equal size.

12 Picking the Pivot (2)
- Use the first element as pivot:
  - If the input is random: OK.
  - If the input is presorted (or in reverse order):
    - All the elements go into S2 (or S1).
    - This happens consistently throughout the recursive calls.
    - Results in O(N^2) behavior (we analyze this case later).
- Choose the pivot randomly:
  - Generally safe,
  - but random number generation can be expensive and does not reduce the running time of the rest of the algorithm.

13 Picking the Pivot (3)
- Use the median of the array (ideal pivot):
  - The ⌈N/2⌉th largest element.
  - Partitioning always cuts the array roughly in half.
  - Gives an optimal quick sort (O(N log N)).
  - However, it is hard to find the exact median efficiently.
- Median-of-three partitioning:
  - Eliminates the bad case for sorted input.
  - Reduces the number of comparisons by 14%.

14 Median of Three Method
- Compare just three elements: the leftmost, rightmost and center.
- Swap these elements if necessary so that:
  - A[left] = smallest
  - A[center] = median of the three
  - A[right] = largest
- Pick A[center] as the pivot.
- Swap A[center] and A[right - 1] so that the pivot is at the second-to-last position (why?)

15 Median of Three: Example
- A[left] = 2, A[center] = 13, A[right] = 6
- Swap A[center] and A[right], so that A[right] = 13 (largest) and A[center] = 6 (median).
- Choose A[center] as pivot.
- Swap pivot and A[right - 1].
- We only need to partition A[left + 1, ..., right - 2]. Why?
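A sketch of median-of-three selection following slides 14-15; the helper names median3 and swap are illustrative, not from the slides. It orders A[left], A[center], A[right], then parks the median (the pivot) at A[right-1].

    public class MedianOfThree {
        static void swap(int[] a, int x, int y) { int t = a[x]; a[x] = a[y]; a[y] = t; }

        // After this call: a[left] = smallest and a[right] = largest of the three,
        // and the median (the pivot) sits at a[right - 1].
        static int median3(int[] a, int left, int right) {
            int center = (left + right) / 2;
            if (a[center] < a[left])   swap(a, left, center);
            if (a[right]  < a[left])   swap(a, left, right);
            if (a[right]  < a[center]) swap(a, center, right);
            swap(a, center, right - 1);     // move the pivot to the second-to-last position
            return a[right - 1];            // only a[left + 1 .. right - 2] still needs partitioning
        }
    }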

16 Quick Sort Summary
- Recursive case:
    QuickSort( a, left, right )
      pivot = median3( a, left, right );
      Partition a[left ... right] into a[left ... i-1], i, a[i+1 ... right];
      QuickSort( a, left, i-1 );
      QuickSort( a, i+1, right );
- Base case: when do we stop the recursion?
  - In theory, when left >= right.
  - In practice, …

17 Small Arrays
- For very small arrays, quick sort does not perform as well as insertion sort.
- Do not use quick sort recursively for small arrays:
  - Use a sorting algorithm that is efficient for small arrays, such as insertion sort.
- When using quick sort recursively, switch to insertion sort when the sub-arrays have between 5 and 20 elements (10 is usually good). This:
  - saves about 15% in the running time;
  - avoids taking the median of three when the sub-array has only 1 or 2 elements.

18 Quick Sort: Pseudo-code
- (Code figure not reproduced in this transcript; the slide annotates its pseudo-code with four parts: handling small arrays, recursion, choosing the pivot, and partitioning.)
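Since the pseudo-code itself did not survive the transcript, the following sketch reconstructs the structure the slide's labels suggest (small-array cutoff, pivot choice, partitioning, recursion), in the style of the median-of-three scheme from slide 14. The CUTOFF value and the insertionSort helper are assumptions; treat this as a sketch rather than the original slide code.

    public class QuickSort {
        static final int CUTOFF = 10;   // assumed; slide 17 suggests anything from 5 to 20

        static void swap(int[] a, int x, int y) { int t = a[x]; a[x] = a[y]; a[y] = t; }

        static void quickSort(int[] a) { quickSort(a, 0, a.length - 1); }

        static void quickSort(int[] a, int left, int right) {
            // For small arrays: fall back to insertion sort (slide 17).
            if (left + CUTOFF > right) {
                insertionSort(a, left, right);
                return;
            }
            // Choose pivot: median-of-three (slide 14); pivot ends up at a[right - 1].
            int center = (left + right) / 2;
            if (a[center] < a[left])   swap(a, left, center);
            if (a[right]  < a[left])   swap(a, left, right);
            if (a[right]  < a[center]) swap(a, center, right);
            swap(a, center, right - 1);
            int pivot = a[right - 1];

            // Partitioning: only a[left + 1 .. right - 2] needs to be scanned;
            // a[left] and a[right - 1] act as sentinels (slide 19).
            int i = left, j = right - 1;
            for (;;) {
                while (a[++i] < pivot) { }
                while (a[--j] > pivot) { }
                if (i >= j) break;
                swap(a, i, j);
            }
            swap(a, i, right - 1);          // restore the pivot to its final position

            // Recursion on the two partitions.
            quickSort(a, left, i - 1);
            quickSort(a, i + 1, right);
        }

        // Assumed helper: simple insertion sort for small sub-arrays.
        static void insertionSort(int[] a, int left, int right) {
            for (int p = left + 1; p <= right; p++) {
                int tmp = a[p];
                int k = p;
                while (k > left && a[k - 1] > tmp) { a[k] = a[k - 1]; k--; }
                a[k] = tmp;
            }
        }
    }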

19 Partitioning Part
- The partitioning code we just saw works only if the pivot is picked as median-of-three:
  - A[left] ≤ pivot and A[right] ≥ pivot
  - Need to partition only A[left + 1, ..., right - 2]
- j will not run past the beginning, because A[left] ≤ pivot.
- i will not run past the end, because A[right - 1] = pivot.

20 Homework
- Assume the pivot is chosen as the middle element of the array: pivot = a[(left+right)/2].
- Rewrite the partitioning code and the whole quick sort algorithm.

21 Quick Sort Faster Than Merge Sort
- Both quick sort and merge sort take O(N log N) in the average case.
- But quick sort is faster in the average case:
  - The inner loop consists of an increment/decrement (by 1, which is fast), a test and a jump.
  - There is no extra juggling as in merge sort.

22 Analysis
- Assumptions:
  - A random pivot (no median-of-three partitioning).
  - No cutoff for small arrays (to keep it simple).
- Algorithm analyzed:
  1. If the number of elements in S is 0 or 1, then return (base case).
  2. Pick an element v in S (called the pivot).
  3. Partition the elements of S except v into two disjoint groups:
     - S1 = {x ∈ S - {v} | x ≤ v}
     - S2 = {x ∈ S - {v} | x ≥ v}
  4. Return {QuickSort(S1) + v + QuickSort(S2)}

23 Analysis (2)
- Running time:
  - pivot selection: constant time, i.e. O(1)
  - partitioning: linear time, i.e. O(N)
  - plus the running time of the two recursive calls
- T(N) = T(i) + T(N - i - 1) + cN
  - i: number of elements in S1
  - c is a constant

24 Worst-Case Scenario
- What will be the worst case?
  - The pivot is the smallest element, every time.
  - Partition is always unbalanced.
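For example, if the pivot is always the smallest element, then i = 0 in the recurrence from the previous slide (the T(0) term is a constant and can be absorbed into cN), and the recurrence telescopes to a quadratic bound:

    T(N) = T(N - 1) + cN
         = T(N - 2) + c(N - 1) + cN
         = ...
         = T(1) + c(2 + 3 + ... + N)
         = O(N^2)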

25 Best-Case Scenario
- What will be the best case?
  - Partition is perfectly balanced.
  - Pivot is always in the middle (the median of the array).
- T(N) = T(N/2) + T(N/2) + cN = 2T(N/2) + cN
- This recurrence is similar to the merge sort recurrence. The result is O(N log N).
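Expanding this recurrence level by level (exactly as for merge sort) shows where the log factor comes from:

    T(N) = 2T(N/2) + cN
         = 4T(N/4) + 2cN
         = 8T(N/8) + 3cN
         = ...
         = 2^k T(N/2^k) + k cN
         = N T(1) + cN log N        (when 2^k = N, i.e. k = log N)
         = O(N log N)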

26 Average-Case Analysis
- Assume that each possible size for S1 is equally likely, i.e. has probability 1/N.
- This assumption is valid for the pivoting and partitioning strategy just discussed (but may not be for some others).
- On average, the running time is O(N log N).
- Proof: pp. 272-273, Data Structures and Algorithm Analysis by M. A. Weiss, 2nd edition.
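For reference (this step is from Weiss, not shown on the slide): averaging the recurrence T(N) = T(i) + T(N - i - 1) + cN over all N equally likely values of i gives T(N) = (2/N)[T(0) + T(1) + ... + T(N-1)] + cN, which solves to O(N log N).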

27 Next time …
- Arrays (review)
- Linked Lists (3.2, 3.3)
- Comparing sorting algorithms
- Stacks, queues (Chapter 5)