Quick-Sort
© 2004 Goodrich, Tamassia
[Title slide: quick-sort tree on the example 7 4 9 6 2 → 2 4 6 7 9]

Quick-Sort Inventor
Invented by C. A. R. Hoare in 1962.
Sir Charles Antony Richard Hoare (abbreviated C. A. R. Hoare, born 11 January 1934 in Colombo, Sri Lanka) is a British computer scientist and Turing Award winner. He designed the quick-sort algorithm for fast sorting, proposed Hoare logic for verifying program correctness, and introduced Communicating Sequential Processes (CSP), a framework for specifying the interaction of concurrent processes (as in the dining philosophers problem). (Photo and description excerpted from Wikipedia.)

About Quick-Sort
- Fastest known sorting algorithm in practice
- Caveat: not stable
- In-place
- Complexity
  - Average-case complexity: O(n log n)
  - Worst-case complexity: O(n²), which rarely occurs if the algorithm is implemented carefully

Quick-Sort Example
- Select pivot
- Partition
- Recursive calls
- Join the sorted parts

Quick-Sort
Quick-sort is a sorting algorithm based on divide-and-conquer:
- Divide: pick an element x (called the pivot) and partition S into
  - L: elements less than x
  - E: elements equal to x
  - G: elements greater than x
- Conquer: sort L and G recursively
- Combine: join L, E and G
[Figure: S split around the pivot x into L, E, G]
Keys to the success of quick-sort:
- Selecting a good pivot
- Partitioning in place

Partition
We partition an input sequence as follows:
- We remove, in turn, each element y from S, and
- We insert y into L, E or G, depending on the result of the comparison with the pivot x
Each insertion and removal is at the beginning or at the end of a sequence, and hence takes O(1) time.
Thus, the partition step of quick-sort takes O(n) time.

Algorithm partition(S, p)
  Input: sequence S, position p of the pivot
  Output: subsequences L, E, G of the elements of S less than, equal to, or greater than the pivot, respectively
  L, E, G ← empty sequences
  x ← S.erase(p)
  while ¬S.empty()
    y ← S.eraseFront()
    if y < x
      L.insertBack(y)
    else if y = x
      E.insertBack(y)
    else  { y > x }
      G.insertBack(y)
  return L, E, G
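
Below is a minimal, illustrative Java sketch of this list-based scheme (partition into L, E, G, sort L and G recursively, then join), assuming java.util.LinkedList in place of the textbook's sequence ADT; the class and method names (ListQuickSort, quickSortList) are made up for this example.

```java
import java.util.LinkedList;
import java.util.List;

public class ListQuickSort {

    // Sort a list of integers by the divide-and-conquer scheme above:
    // pick a pivot, split into L (< pivot), E (= pivot), G (> pivot),
    // sort L and G recursively, then join L, E, G.
    public static List<Integer> quickSortList(List<Integer> s) {
        if (s.size() <= 1) return s;                  // base case: size 0 or 1

        LinkedList<Integer> work = new LinkedList<>(s);
        int pivot = work.removeFirst();               // pivot: first element (an arbitrary choice for this sketch)

        LinkedList<Integer> l = new LinkedList<>();   // elements less than the pivot
        LinkedList<Integer> e = new LinkedList<>();   // elements equal to the pivot
        LinkedList<Integer> g = new LinkedList<>();   // elements greater than the pivot
        e.add(pivot);

        while (!work.isEmpty()) {                     // each removal/insertion is O(1)
            int y = work.removeFirst();
            if (y < pivot)       l.addLast(y);
            else if (y == pivot) e.addLast(y);
            else                 g.addLast(y);
        }

        List<Integer> result = new LinkedList<>(quickSortList(l));  // conquer L
        result.addAll(e);                                           // combine
        result.addAll(quickSortList(g));                            // conquer G
        return result;
    }

    public static void main(String[] args) {
        System.out.println(quickSortList(List.of(7, 2, 9, 4, 3, 7, 6, 1)));
        // prints [1, 2, 3, 4, 6, 7, 7, 9]
    }
}
```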

Quick-Sort Tree
An execution of quick-sort is depicted by a binary tree:
- Each node represents a recursive call of quick-sort and stores
  - the unsorted sequence before the execution, and its pivot
  - the sorted sequence at the end of the execution
- The root is the initial call
- The leaves are calls on subsequences of size 0 or 1
[Figure: a quick-sort tree]

Execution Example
Pivot selection
[Figure: quick-sort tree at this step]

Execution Example (cont.)
Partition, recursive call, pivot selection
[Figure: quick-sort tree at this step]

Execution Example (cont.)
Partition, recursive call, base case
[Figure: quick-sort tree at this step]

Execution Example (cont.)
Recursive call, …, base case, join
[Figure: quick-sort tree at this step]

Execution Example (cont.)
Recursive call, pivot selection
[Figure: quick-sort tree at this step]

Execution Example (cont.)
Partition, …, recursive call, base case
[Figure: quick-sort tree at this step]

Execution Example (cont.)
Join, join
[Figure: quick-sort tree with the fully sorted sequence]

Worst-case Running Time
- The worst case for quick-sort occurs when the pivot is the unique minimum or maximum element
- One of L and G then has size n − 1 and the other has size 0
- The running time is proportional to the sum n + (n − 1) + … + 2
- Thus, the worst-case running time of quick-sort is O(n²)

depth   time
0       n
1       n − 1
…       …
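
As an added check (not on the slide), the arithmetic series above sums in closed form, which is where the quadratic bound comes from:

```latex
% Each worst-case level removes only the pivot, so level i processes n - i elements.
\sum_{i=2}^{n} i \;=\; n + (n-1) + \cdots + 2 \;=\; \frac{n(n+1)}{2} - 1 \;=\; O(n^2)
```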

Expected Running Time
- Consider a recursive call of quick-sort on a sequence of size s
  - Good call: the sizes of L and G are each less than 3s/4
  - Bad call: one of L and G has size greater than 3s/4
- A call is good with probability 1/2
  - 1/2 of the possible pivots cause good calls
[Figure: a good call vs. a bad call, and the ranges of good and bad pivots]

Expected Running Time, Part 2
- Probabilistic fact: the expected number of coin tosses required to get k heads is 2k
- For a node of depth i, we expect i/2 ancestors to be good calls
  - The size of the input sequence for the current call is at most (3/4)^(i/2) · n
- Therefore, we have
  - For a node of depth 2 log_{4/3} n, the expected input size is one
  - The expected height of the quick-sort tree is O(log n)
- The amount of work done at the nodes of the same depth is O(n)
- Thus, the expected running time of quick-sort is O(n log n)
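
An added one-line derivation (not on the slide) of the 2 log_{4/3} n depth bound: set the input-size bound at depth i to 1 and solve for i.

```latex
% The expected input size at depth i is at most (3/4)^{i/2} n; it reaches 1 when
\left(\tfrac{3}{4}\right)^{i/2} n = 1
\;\Longleftrightarrow\;
\left(\tfrac{4}{3}\right)^{i/2} = n
\;\Longleftrightarrow\;
i = 2\log_{4/3} n = O(\log n)
```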

In-Place Quick-Sort

Partitioning Algorithm Illustrated
[Animation: index l scans from the left and index r scans from the right; the scan alternates move, swap, move, swap, … (elements are swapped when both indices stop) until l and r have crossed, and finally S[i] at the crossing point is swapped with the pivot]
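
A minimal Java sketch of this two-index, in-place scheme, assuming the common variant in which the pivot is parked at the right end of the range before scanning; the class and method names are illustrative.

```java
import java.util.Arrays;

public class InPlaceQuickSort {

    // Sort a[left..right] in place. The pivot rule here (last element)
    // is an illustrative choice, not the one prescribed by the slides.
    public static void quickSort(int[] a, int left, int right) {
        if (left >= right) return;                   // subarray of size 0 or 1

        int pivotIndex = partition(a, left, right);
        quickSort(a, left, pivotIndex - 1);          // recurse on the left part
        quickSort(a, pivotIndex + 1, right);         // recurse on the right part
    }

    // Two-index partition: l scans right, r scans left, swapping elements
    // that sit on the wrong side, until the indices cross; then the pivot
    // (parked at a[right]) is swapped into its final position.
    private static int partition(int[] a, int left, int right) {
        int pivot = a[right];                        // pivot parked at the end
        int l = left;
        int r = right - 1;
        while (true) {
            while (l <= r && a[l] < pivot) l++;      // move l right
            while (r >= l && a[r] > pivot) r--;      // move r left
            if (l >= r) break;                       // l and r have crossed
            swap(a, l, r);                           // put both elements on the correct side
            l++;
            r--;
        }
        swap(a, l, right);                           // swap a[l] with the pivot
        return l;
    }

    private static void swap(int[] a, int i, int j) {
        int t = a[i]; a[i] = a[j]; a[j] = t;
    }

    public static void main(String[] args) {
        int[] data = {7, 2, 9, 4, 3, 7, 6, 1};
        quickSort(data, 0, data.length - 1);
        System.out.println(Arrays.toString(data));   // [1, 2, 3, 4, 6, 7, 7, 9]
    }
}
```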

How to Pick the Pivot (1/2)
- Strategy 1: pick the first element
  - Works only if the input is random
  - O(n²) if the input S is sorted or almost sorted
- Strategy 2: pick a random element (see the snippet below)
  - Usually works well
  - Extra computation for random-number generation
- Strategy 3: randomly permute the input S first
  - Usually works well
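
A small illustration of Strategy 2 (random pivot), assuming the chosen element is then parked at the right end of the range as expected by the in-place partition sketched above; the helper name chooseRandomPivot is made up here.

```java
import java.util.Random;

public class RandomPivot {
    private static final Random RNG = new Random();

    // Strategy 2: choose a uniformly random index in [left, right] and park
    // that element at a[right], where the partition above expects the pivot.
    static void chooseRandomPivot(int[] a, int left, int right) {
        int p = left + RNG.nextInt(right - left + 1);
        int t = a[p]; a[p] = a[right]; a[right] = t;
    }
}
```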

How to Pick the Pivot (2/2)
- Strategy 4: median of three
  - Ideally, the pivot should be the median of the input S, which divides the input into two sequences of almost the same length
  - However, computing the median takes O(n)
  - So we find an approximate median instead: pivot = median of the left-most, right-most, and center elements of the array S

Example of Median of Three
- Let the input be S = {6, 1, 4, 9, 0, 3, 5, 2, 7, 8}
- left = 0, so S[left] = 6
- right = 9, so S[right] = 8
- center = (left + right) / 2 = 4, so S[center] = 0
- Pivot = median of {6, 8, 0} = 6
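
A small Java sketch of median-of-three selection matching this example; the helper name medianOfThree is illustrative.

```java
public class MedianOfThree {

    // Return the index of the median of a[left], a[center], a[right],
    // where center = (left + right) / 2, as in the example above.
    static int medianOfThree(int[] a, int left, int right) {
        int center = (left + right) / 2;
        int m = left;                                              // index of the median so far
        if ((a[center] < a[left]) != (a[center] < a[right])) m = center;   // center lies between the other two
        else if ((a[right] < a[left]) != (a[right] < a[center])) m = right; // right lies between the other two
        return m;
    }

    public static void main(String[] args) {
        int[] s = {6, 1, 4, 9, 0, 3, 5, 2, 7, 8};
        int p = medianOfThree(s, 0, s.length - 1);
        System.out.println("pivot index = " + p + ", pivot = " + s[p]);  // pivot = 6
    }
}
```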

Dealing with Small Arrays
- For small arrays (say, N ≤ 20), insertion sort is faster than quicksort
- Quicksort is recursive, so it can spend a lot of time sorting many small subarrays
- Hybrid algorithm: switch to insertion sort when the problem size is small (say, N ≤ 20); a sketch follows below
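
A hedged sketch of the hybrid idea, reusing the two-index partition from the in-place sketch above; the cutoff of 20 follows the slide, though real implementations tune it empirically.

```java
public class HybridQuickSort {
    private static final int CUTOFF = 20;    // threshold below which insertion sort is used

    // Hybrid quicksort: recurse while the range is large, otherwise fall back
    // to insertion sort, which has low overhead on small inputs.
    static void sort(int[] a, int left, int right) {
        if (right - left + 1 <= CUTOFF) {
            insertionSort(a, left, right);
            return;
        }
        int p = partition(a, left, right);    // e.g. the two-index partition shown earlier
        sort(a, left, p - 1);
        sort(a, p + 1, right);
    }

    static void insertionSort(int[] a, int left, int right) {
        for (int i = left + 1; i <= right; i++) {
            int key = a[i];
            int j = i - 1;
            while (j >= left && a[j] > key) { a[j + 1] = a[j]; j--; }
            a[j + 1] = key;
        }
    }

    // Same last-element partition as in the in-place sketch above.
    static int partition(int[] a, int left, int right) {
        int pivot = a[right], l = left, r = right - 1;
        while (true) {
            while (l <= r && a[l] < pivot) l++;
            while (r >= l && a[r] > pivot) r--;
            if (l >= r) break;
            int t = a[l]; a[l] = a[r]; a[r] = t;
            l++; r--;
        }
        int t = a[l]; a[l] = a[right]; a[right] = t;
        return l;
    }

    public static void main(String[] args) {
        int[] data = {7, 2, 9, 4, 3, 7, 6, 1};
        sort(data, 0, data.length - 1);
        System.out.println(java.util.Arrays.toString(data));  // [1, 2, 3, 4, 6, 7, 7, 9]
    }
}
```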

Summary of Sorting Algorithms

Algorithm        Time                 Notes
selection-sort   O(n²)                in-place, stable; slow (good for small inputs)
insertion-sort   O(n²)                in-place, stable; slow (good for small inputs)
quick-sort       O(n log n) expected  in-place, unstable, randomized; fastest (good for large inputs)
heap-sort        O(n log n)           in-place, unstable; fast (good for large inputs)
merge-sort       O(n log n)           not in-place, stable; fast (good for huge inputs)