Chapter 7 Quicksort.

About this lecture Introduce Quicksort Running time of Quicksort: Worst-Case and Average-Case

Cinderella’s New Problem You have to sort the bolts and sort the nuts I see…

Fairy Godmother’s Proposal Pick one of the nuts Compare this nut with all the bolts  Find those which are larger, and find those which are smaller

Fairy Godmother’s Proposal picked nut Done ! Bolts larger than nut Bolts smaller than nut Bolt equal to nut

Fairy Godmother’s Proposal Pick the bolt that is equal to the selected nut Compare this bolt with all other nuts  Find those which are larger, and find those which are smaller picked nut Bolts larger than nut Bolts smaller than nut Bolt equal to nut

Fairy Godmother’s Proposal Done ! Nuts smaller than bolt Nuts larger than bolt

Fairy Godmother’s Proposal Sort left part (recursively) Sort right part (recursively) ^^ This is all of my proposal ^^ Nuts smaller than bolt Nuts larger than bolt

Fairy Godmother’s Proposal Can you see why Fairy Godmother’s proposal is a correct algorithm? What is the running time? Worst case: Θ(n²) comparisons, no better than the brute-force approach! Though the worst case runs badly, the average case is good: Θ(n log n) comparisons

Quicksort uses Partition The previous algorithm is exactly Quicksort, which makes use of a Partition function:

Partition(A,p,r)   /* to partition array A[p..r] */
  Pick an element, say A[t] (called the pivot)
  Let q = #elements less than the pivot
  Put the elements less than the pivot in A[p..p+q-1]
  Put the pivot in A[p+q]
  Put the remaining elements in A[p+q+1..r]
  Return q
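A direct, not-in-place sketch of this Partition contract in Python; the function name `partition_simple` and the choice of the last element as pivot are illustrative assumptions, not part of the slides:

```python
def partition_simple(A, p, r):
    """Partition A[p..r] around a pivot; return q = #elements less than the pivot.

    Not in-place: builds the smaller/larger groups in side lists.
    The pivot choice A[r] is an assumption for illustration.
    """
    pivot = A[r]
    smaller = [x for x in A[p:r] if x < pivot]    # elements going to A[p..p+q-1]
    larger = [x for x in A[p:r] if x >= pivot]    # elements going to A[p+q+1..r]
    q = len(smaller)
    A[p:r+1] = smaller + [pivot] + larger         # pivot lands at A[p+q]
    return q
```

For example, on [1, 3, 7, 8, 2, 6, 4, 5] with p = 0, r = 7 it returns q = 4 and places the pivot 5 at index 4.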

More on Partition After calling Partition(A,p,r), we obtain the value q and know that the pivot is now at A[p+q] Before A[p+q]: elements smaller than the pivot After A[p+q]: elements larger than the pivot There are many ways to perform Partition; one way is shown in the next slides It will be an in-place algorithm (using O(1) extra space in addition to the input array)

Ideas for In-Place Partition Idea 1: Use A[r] (the last element) as the pivot Idea 2: Process A[p..r] from left to right The prefix (the beginning part) of A stores all elements less than the pivot seen so far Use two counters: one for the length of the prefix, and one for the element we are looking at

In-Place Partition in Action before running Length of prefix = 0 1 3 7 8 2 6 4 5 pivot next element Because next element is less than pivot, we shall extend the prefix by 1

In-Place Partition in Action after 1 step Length of prefix = 1 1 3 7 8 2 6 4 5 pivot next element Because next element is smaller than pivot, and is adjacent to the prefix, we extend the prefix

In-Place Partition in Action after 2 steps Length of prefix = 2 1 3 7 8 2 6 4 5 pivot next element Because next element is larger than pivot, no change to prefix

In-Place Partition in Action after 3 steps Length of prefix = 2 1 3 7 8 2 6 4 5 pivot next element Again, the next element is larger than the pivot, so no change to prefix

In-Place Partition in Action after 4 steps Length of prefix = 2 1 3 7 8 2 6 4 5 pivot next element Because next element is less than pivot, we shall extend the prefix by swapping

In-Place Partition in Action after 5 steps Length of prefix = 3 1 3 2 8 7 6 4 5 pivot next element Because next element is larger than pivot, no change to prefix

In-Place Partition in Action after 6 steps Length of prefix = 3 1 3 2 8 7 6 4 5 pivot next element Because next element is less than pivot, we shall extend the prefix by swapping

In-Place Partition in Action after 7 steps Length of prefix = 4 1 3 2 4 7 6 8 5 pivot next element When next element is the pivot, we put it after the end of the prefix by swapping

In-Place Partition in Action after 8 steps Length of prefix = 4 1 3 2 4 5 6 8 7 pivot Partition is done, and return length of prefix
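The walkthrough above is the standard prefix-based (Lomuto-style) partition; a minimal in-place Python sketch, assuming 0-based indexing and the last element A[r] as pivot:

```python
def partition(A, p, r):
    """In-place prefix partition of A[p..r] with pivot A[r].

    Returns q = number of elements less than the pivot;
    the pivot ends up at A[p+q]. Uses O(1) extra space.
    """
    pivot = A[r]
    q = 0                        # length of the "less than pivot" prefix
    for i in range(p, r):        # i scans the next unexamined element
        if A[i] < pivot:
            A[p + q], A[i] = A[i], A[p + q]   # extend the prefix by swapping
            q += 1
    A[p + q], A[r] = A[r], A[p + q]           # place the pivot after the prefix
    return q
```

Running it on the slides' example [1, 3, 7, 8, 2, 6, 4, 5] yields [1, 3, 2, 4, 5, 6, 8, 7] and returns 4, matching the trace above.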

The Quicksort algorithm works as follows:

Quicksort(A,p,r)   /* to sort array A[p..r] */
  if (p ≥ r) return;
  q = Partition(A,p,r);
  Quicksort(A, p, p+q-1);
  Quicksort(A, p+q+1, r);

To sort A[1..n], we just call Quicksort(A,1,n)
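Putting the pseudocode together, a self-contained Python sketch (0-based indexing; the prefix partition is repeated here so the block stands alone):

```python
def partition(A, p, r):
    """Prefix partition of A[p..r] with pivot A[r]; returns #elements < pivot."""
    pivot, q = A[r], 0
    for i in range(p, r):
        if A[i] < pivot:
            A[p + q], A[i] = A[i], A[p + q]
            q += 1
    A[p + q], A[r] = A[r], A[p + q]
    return q

def quicksort(A, p, r):
    """Sort A[p..r] in place."""
    if p >= r:
        return
    q = partition(A, p, r)
    quicksort(A, p, p + q - 1)       # elements smaller than the pivot
    quicksort(A, p + q + 1, r)       # elements larger than the pivot

# To sort the whole array: quicksort(A, 0, len(A) - 1)
```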

Randomized Versions of Partition

Randomized-Partition(A, p, r)
  i = Random(p, r)
  Exchange A[r] with A[i]
  Return Partition(A, p, r)

Randomized-Quicksort(A, p, r)
  if p < r
    q = Randomized-Partition(A, p, r)
    Randomized-Quicksort(A, p, q-1)
    Randomized-Quicksort(A, q+1, r)

(Here q denotes the pivot's final index, following the CLRS convention; with the Partition above, which returns the prefix length, the recursive calls would use p+q-1 and p+q+1.)
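A runnable sketch of the randomized variant in Python (0-based indexing; here `partition` returns the pivot's final index, and the bundled helper keeps the block self-contained):

```python
import random

def partition(A, p, r):
    """Prefix partition of A[p..r] with pivot A[r]; returns the pivot's final index."""
    pivot, q = A[r], p
    for i in range(p, r):
        if A[i] < pivot:
            A[q], A[i] = A[i], A[q]
            q += 1
    A[q], A[r] = A[r], A[q]
    return q                         # index of the pivot (CLRS convention)

def randomized_partition(A, p, r):
    i = random.randint(p, r)         # pick a random pivot position
    A[r], A[i] = A[i], A[r]          # move it to the end, then partition as usual
    return partition(A, p, r)

def randomized_quicksort(A, p, r):
    if p < r:
        q = randomized_partition(A, p, r)
        randomized_quicksort(A, p, q - 1)
        randomized_quicksort(A, q + 1, r)
```

Randomizing the pivot makes no single input consistently bad: the expected Θ(n log n) bound holds for every input.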

Worst-Case Running Time The worst-case running time of Quicksort can be expressed by: T(n) = max_{q=0..n-1} ( T(q) + T(n-q-1) ) + Θ(n) We prove T(n) = O(n²) by the substitution method: Guess T(n) ≤ cn² for some constant c Next, verify our guess by induction

Worst-Case Running Time Inductive case:
T(n) = max_{q=0..n-1} ( T(q) + T(n-q-1) ) + Θ(n)
     ≤ max_{q=0..n-1} ( cq² + c(n-q-1)² ) + Θ(n)
     ≤ c(n-1)² + Θ(n)
     = cn² - 2cn + c + Θ(n)
     ≤ cn²   when c is large enough
Why is the max attained at q = 0 or q = n-1? Write
q² + (n-q-1)² = (n-1)² + 2q(q-n+1)    (1)
Since q(q-n+1) ≤ 0 for 0 ≤ q ≤ n-1, expression (1) is maximized when q(q-n+1) = 0, i.e., when q = 0 or q = n-1. The inductive case is OK now. How about the base case? T(1) = Θ(1) ≤ c·1² for c large enough.

Worst-Case Running Time Conclusion: T(n) = O(n²) However, we can also show T(n) = Ω(n²) by exhibiting a worst-case input: on an already-sorted array (with the last element as pivot), every partition is maximally unbalanced, so T(n) = T(n-1) + Θ(n)  T(n) = Θ(n²)
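The Ω(n²) lower bound can be checked empirically: on an already-sorted input, the prefix partition with last-element pivot performs exactly n(n-1)/2 element comparisons. A small Python sketch (the comparison counter is an instrumentation detail, not part of the slides):

```python
def quicksort_count(A, p, r):
    """Quicksort with last-element pivot; returns #element comparisons performed."""
    if p >= r:
        return 0
    pivot, q, comps = A[r], p, 0
    for i in range(p, r):
        comps += 1                       # one comparison per scanned element
        if A[i] < pivot:
            A[q], A[i] = A[i], A[q]
            q += 1
    A[q], A[r] = A[r], A[q]
    return comps + quicksort_count(A, p, q - 1) + quicksort_count(A, q + 1, r)

n = 100
A = list(range(n))                       # already sorted: the worst case here
assert quicksort_count(A, 0, n - 1) == n * (n - 1) // 2
```

Each level of recursion removes only the pivot, giving (n-1) + (n-2) + … + 1 = n(n-1)/2 comparisons.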

Balanced partitioning Quicksort’s average running time is much closer to the best case than to the worst case Imagine that PARTITION always produces a 9-to-1 split; we get the recurrence T(n) = T(9n/10) + T(n/10) + cn Any split of constant proportionality yields a recursion tree of depth Θ(lg n) It’s like the one for T(n) = T(n/3) + T(2n/3) + O(n) in Section 4.4
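The Θ(lg n) depth claim can be sanity-checked numerically: the deepest path in the 9-to-1 recursion tree shrinks n by a factor of 9/10 at each level, so the depth is ⌈log_{10/9} n⌉. A quick Python check (n = 10⁶ is an arbitrary illustration):

```python
import math

def tree_depth(n):
    """Length of the deepest path in the T(9n/10) + T(n/10) recursion tree."""
    depth = 0
    while n > 1:
        n *= 9 / 10              # follow the larger subproblem at each level
        depth += 1
    return depth

n = 10**6
assert tree_depth(n) == math.ceil(math.log(n) / math.log(10 / 9))
```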

[Recursion-tree figure for the 9-to-1 split: the tree has depth log_{10/9} n = (log_{10/9} 2) · lg n = Θ(lg n), with cost at most cn per level.]

Average-Case Running Time So, Quicksort runs badly for some inputs… But suppose that when we store a set of n numbers into the input array, each of the n! permutations is equally likely The running time varies with the input What will be the “average” running time?

Average Running Time Let X = #comparisons in all Partition calls Later, we will show that the running time is O(n + X), which varies with the input Finding the average of X (i.e., of #comparisons) gives the average running time Our first target: compute the average of X

Average # of Comparisons We define some notation to help the analysis: Let a1, a2, …, an denote the set of n numbers initially placed in the array Further, we assume a1 ≤ a2 ≤ … ≤ an (so a1 may not be the element in A[1] originally) Let Xij = #comparisons between ai and aj in all Partition calls

Average # of Comparisons Then, X = #comparisons in all Partition calls = X12 + X13 + … + Xn-1,n Average #comparisons = E[X] = E[X12 + X13 + … + Xn-1,n] = E[X12] + E[X13] + … + E[Xn-1,n] (by linearity of expectation)

Average # of Comparisons The next slides will prove: E[Xij] = 2/(j-i+1) Using this result,
E[X] = Σ_{i=1..n-1} Σ_{j=i+1..n} 2/(j-i+1)
     = Σ_{i=1..n-1} Σ_{k=1..n-i} 2/(k+1)    (substituting k = j-i)
     ≤ Σ_{i=1..n-1} Σ_{k=1..n} 2/k
     = Σ_{i=1..n-1} O(log n)
     = O(n log n)
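The double sum can be evaluated numerically to confirm the O(n log n) behavior (n = 1000 is an arbitrary test size):

```python
import math

def expected_comparisons(n):
    """Evaluate E[X] = sum over all pairs i < j of 2 / (j - i + 1) exactly."""
    return sum(2 / (j - i + 1)
               for i in range(1, n)
               for j in range(i + 1, n + 1))

n = 1000
ex = expected_comparisons(n)
# E[X] grows like 2 n ln n; for n = 1000 it sits comfortably below that bound
assert ex < 2 * n * math.log(n)
```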

Comparison between ai and aj Question: How many times can ai be compared with aj? Answer: At most once, which happens only if ai or aj is chosen as a pivot 1 3 2 4 5 6 8 7 pivot After that, the pivot is fixed and is never compared with the others

Comparison between ai and aj Question: Will ai always be compared with aj? Answer: No. E.g., after the Partition in the earlier example: 1 3 2 4 5 6 8 7 pivot we separately Quicksort the first 4 elements, and then the last 3 elements  3 is never compared with 8

Comparison between ai and aj Observation: Consider the elements ai, ai+1, …, aj-1, aj If ai or aj is the first of these chosen as a pivot, then ai is compared with aj Else, if any element of ai+1, …, aj-1 is chosen as a pivot first, then ai is never compared with aj (the pivot separates them into different subarrays)

Comparison between ai and aj When the n! permutations are equally likely to be the input, each of the j-i+1 elements ai, …, aj is equally likely to be chosen first as a pivot, so Pr(ai compared with aj once) = 2/(j-i+1) Pr(ai never compared with aj) = (j-i-1)/(j-i+1)  E[Xij] = 1 · 2/(j-i+1) + 0 · (j-i-1)/(j-i+1) = 2/(j-i+1)
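This probability can be checked by simulation: under a uniformly random permutation, ai is compared with aj exactly when, among the ranks i..j, rank i or rank j appears first in the array. A Python sketch (the ranks i = 3, j = 6 and the trial count are arbitrary choices; the seed just makes the run reproducible):

```python
import random

def compare_probability(n, i, j, trials, seed=1):
    """Estimate Pr(a_i is compared with a_j) over random permutations of ranks 1..n."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        perm = list(range(1, n + 1))
        rng.shuffle(perm)
        # the first rank in [i, j] to appear is the first pivot chosen among a_i..a_j
        first = next(x for x in perm if i <= x <= j)
        hits += first in (i, j)
    return hits / trials

est = compare_probability(n=10, i=3, j=6, trials=20000)
assert abs(est - 2 / (6 - 3 + 1)) < 0.03   # theory: 2/(j-i+1) = 0.5
```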

Proof: Running time = O(n+X) Observe that in the Quicksort algorithm: Each Partition call fixes the position of one pivot  at most n Partition calls After each Partition call, we make 2 Quicksort calls Also, every Quicksort call (except the first one, Quicksort(A,1,n)) is invoked after a Partition call  Θ(n) Quicksort calls in total

Proof: Running time = O(n+X) So, if we ignore the comparison time in all Partition calls, the total time used is O(n) Thus, when we include back the comparison time in all Partition calls, Running time = O(n + X)

Homework Problem 7-2 Practice at home: 7.2-5, 7.3-2, 7.4-3 Problem 7-3 (bonus: 1 point)