The Substitution Method

T(n) = 2T(n/2) + cn

Guess: T(n) = O(n log n)

Proof by mathematical induction: prove that T(n) ≤ d·n log n for some constant d > 0.

T(n) ≤ 2(d·(n/2) log(n/2)) + cn   (where T(n/2) ≤ d·(n/2) log(n/2) by the induction hypothesis)
     = dn log(n/2) + cn
     = dn log n - dn + cn
     = dn log n + (c - d)n
     ≤ dn log n,   provided d ≥ c

Therefore, T(n) = O(n log n).
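As a quick numerical sanity check of the guess (not from the slides; the base case T(1) = 0 and the constants c = d = 1 are assumed values), the recurrence can be unrolled for powers of two and compared against d·n·log2(n):

```java
public class SubstitutionCheck {
    // T(n) = 2*T(n/2) + c*n, with assumed base case T(1) = 0 and c = 1
    static long t(long n) {
        if (n <= 1) return 0;
        return 2 * t(n / 2) + n;
    }

    // the claimed bound d * n * log2(n), with d = 1
    static double bound(long n) {
        return n * (Math.log(n) / Math.log(2));
    }

    public static void main(String[] args) {
        for (long n = 2; n <= (1 << 20); n *= 2) {
            // the inductive bound should hold at every power of two
            if (t(n) > bound(n) + 1e-6)
                throw new AssertionError("bound violated at n = " + n);
        }
        System.out.println("T(n) <= n*log2(n) holds for all tested n");
    }
}
```

With c = d = 1 and T(1) = 0 the bound in fact holds with equality, which matches the (c - d)n term vanishing in the proof.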

Quick Sort – Partitioning – algorithm

public int partition(Comparable[] arr, int low, int high) {
    Comparable pivot = arr[high];  // choose the last element as the pivot
    int l = low;
    int r = high - 1;
    while (l <= r) {
        // find a bigger item on the left
        while (l <= r && arr[l].compareTo(pivot) <= 0) l++;
        // find a smaller item on the right
        while (l <= r && arr[r].compareTo(pivot) >= 0) r--;
        if (l < r) {
            swap(arr, l, r);
            l++;
            r--;
        }
    }
    // put the pivot into its correct location
    swap(arr, l, high);
    return l;
}
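A minimal standalone sketch of this two-pointer partitioning scheme, specialized to int[] for brevity (the swap helper and the sample array are illustrative assumptions):

```java
import java.util.Arrays;

public class TwoPointerPartition {
    static void swap(int[] a, int i, int j) { int t = a[i]; a[i] = a[j]; a[j] = t; }

    // Partition a[low..high] around the pivot a[high]; returns the final pivot index.
    static int partition(int[] a, int low, int high) {
        int pivot = a[high];
        int l = low, r = high - 1;
        while (l <= r) {
            while (l <= r && a[l] <= pivot) l++;   // find a bigger item on the left
            while (l <= r && a[r] >= pivot) r--;   // find a smaller item on the right
            if (l < r) { swap(a, l, r); l++; r--; }
        }
        swap(a, l, high);  // put the pivot into its final position
        return l;
    }

    public static void main(String[] args) {
        int[] a = {5, 1, 9, 3, 7, 4};
        int p = partition(a, 0, a.length - 1);
        // elements left of p are <= 4, elements right of p are >= 4
        System.out.println("pivot index: " + p + ", array: " + Arrays.toString(a));
    }
}
```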

Quick Sort – Partitioning – algorithm – proof of correctness

Loop invariant (at the beginning/end of each iteration):
1. arr[low]..arr[l-1] contains elements <= pivot
2. arr[r+1]..arr[high-1] contains elements >= pivot

When the loop is finished we have l = r+1, i.e.,
1. arr[low]..arr[r] are <= pivot
2. arr[r+1]..arr[high-1] are >= pivot
3. arr[high] = pivot

By swapping arr[high] with arr[l] (= arr[r+1]) we get a proper partitioning.

Quick Sort – Partitioning – another algorithm (textbook)

The pivot is chosen to be the first element of the array (the exact choice does not really matter).
The array is divided into four parts (see below): the pivot, "< p", ">= p", and "?"; initially the "< p" and ">= p" parts are empty.
Invariant for the partition algorithm: the items in region S1 are all less than the pivot, and those in S2 are all greater than or equal to the pivot.
In each step the first element of the "?" part is added either to the "< p" part or to the ">= p" part.

Quick Sort – Partitioning – another algorithm (textbook)

Initial state of the array:
S1: arr[first+1]..arr[lastS1] => empty
S2: arr[lastS1+1]..arr[firstUnknown-1] => empty
?:  arr[firstUnknown]..arr[last] => all elements but the pivot

Quick Sort – Partitioning – another algorithm (textbook)

Processing arr[firstUnknown], case "< pivot":
Move arr[firstUnknown] into S1 by swapping it with arr[lastS1+1] and by incrementing both lastS1 and firstUnknown.

Quick Sort – Partitioning – another algorithm (textbook)

Processing arr[firstUnknown], case ">= pivot":
Move arr[firstUnknown] into S2 simply by incrementing firstUnknown.

public int partition(Comparable[] arr, int first, int last) {
    Comparable pivot = arr[first];  // choose the first element as the pivot
    // initially everything but the pivot is unknown
    int lastS1 = first;
    for (int firstUnknown = first + 1; firstUnknown <= last; firstUnknown++) {
        if (arr[firstUnknown].compareTo(pivot) < 0) {
            // item should be moved to S1
            lastS1++;
            swap(arr, lastS1, firstUnknown);
        }
        // else the item belongs to S2, which grows simply by
        // incrementing firstUnknown in the loop
    }
    // put the pivot into its correct location
    swap(arr, first, lastS1);
    return lastS1;
}
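The same textbook scheme can be tried out with this self-contained int[] sketch (the swap helper and the sample values are illustrative):

```java
import java.util.Arrays;

public class TextbookPartition {
    static void swap(int[] a, int i, int j) { int t = a[i]; a[i] = a[j]; a[j] = t; }

    // Pivot is the first element; S1 holds items < pivot, S2 holds items >= pivot.
    static int partition(int[] a, int first, int last) {
        int pivot = a[first];
        int lastS1 = first;
        for (int firstUnknown = first + 1; firstUnknown <= last; firstUnknown++) {
            if (a[firstUnknown] < pivot) {   // item belongs to S1
                lastS1++;
                swap(a, lastS1, firstUnknown);
            }
            // else: item belongs to S2, which grows by itself
        }
        swap(a, first, lastS1);  // put the pivot between S1 and S2
        return lastS1;
    }

    public static void main(String[] args) {
        int[] a = {4, 5, 1, 9, 3, 7};
        int p = partition(a, 0, a.length - 1);
        System.out.println("pivot index: " + p + ", array: " + Arrays.toString(a));
    }
}
```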

Quick Sort – Selection of pivot

In the algorithms above we selected the pivot to be the last or the first element of the subarray we want to partition. It turns out that the selection of the pivot is crucial for the performance of Quick Sort – see the best and worst cases. Other strategies used:
- select 3 (or more) elements and pick their median
- select the pivot randomly (especially useful when the arrays might already be sorted)
- select an element "close to the median" of the subarray (there is a recursive linear-time algorithm for this)
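The median-of-three strategy can be sketched as follows (the function name and index arithmetic are illustrative; real implementations usually also move the chosen pivot to one end before partitioning):

```java
public class MedianOfThree {
    // Return the index of the median of a[low], a[mid], a[high].
    static int medianOfThreeIndex(int[] a, int low, int high) {
        int mid = low + (high - low) / 2;
        int x = a[low], y = a[mid], z = a[high];
        if ((x <= y && y <= z) || (z <= y && y <= x)) return mid;
        if ((y <= x && x <= z) || (z <= x && x <= y)) return low;
        return high;
    }

    public static void main(String[] args) {
        int[] sorted = {1, 2, 3, 4, 5, 6, 7};
        // on a sorted array the median-of-three is the middle element,
        // which avoids the worst-case split caused by a first/last pivot
        System.out.println(medianOfThreeIndex(sorted, 0, 6));  // 3
    }
}
```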


[Figure: best-case partitioning at various levels]

Analysis of Quick Sort: Best Case

How much time do we need to partition an array of size n? O(n), using either of the two algorithms.

Best case: suppose each partition operation divides the array almost exactly in half.

When could the best case happen? For example, when the array is sorted and the pivot is selected to be the middle element of the subarray.

Analysis of Quick Sort: Best Case

Best case: suppose each partition operation divides the array almost exactly in half. The running time (time cost) can then be expressed with the following recurrence:

T(n) = 2·T(n/2) + T(partitioning an array of size n) = 2·T(n/2) + O(n)

This is the same recurrence as for merge sort, i.e., T(n) is of order O(n log n).


[Figure: worst-case partitioning]

Analysis of Quick Sort: Worst Case

In the worst case, partitioning always divides the size-n array into these three parts:
- a length-one part, containing the pivot itself
- a length-zero part, and
- a length-(n-1) part, containing everything else

When could this happen? Example: the array is sorted and the pivot is selected to be the first or the last element.

Analysis of Quick Sort: Worst Case

The recurrence for the time cost of Quick Sort in the worst case:

T(n) = T(0) + T(n-1) + O(n) = T(n-1) + O(n)

By repeated substitution we get that the running time of Quick Sort in the worst case is O(n^2). This is a similar situation as for Insertion Sort. Does it mean that the performance of Quick Sort is bad on average?
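The quadratic behaviour is easy to observe by counting comparisons on a sorted input: each partition strips off only the pivot, so the total is (n-1) + (n-2) + ... + 1 = n(n-1)/2. A small instrumented sketch (the counter and the Lomuto-style partition are illustrative, not from the slides):

```java
public class WorstCaseCount {
    static long comparisons = 0;

    static void swap(int[] a, int i, int j) { int t = a[i]; a[i] = a[j]; a[j] = t; }

    // Lomuto-style partition with the last element as pivot, counting comparisons.
    static int partition(int[] a, int low, int high) {
        int pivot = a[high];
        int lastS1 = low - 1;
        for (int i = low; i < high; i++) {
            comparisons++;
            if (a[i] < pivot) swap(a, ++lastS1, i);
        }
        swap(a, lastS1 + 1, high);
        return lastS1 + 1;
    }

    static void quicksort(int[] a, int low, int high) {
        if (low < high) {
            int p = partition(a, low, high);
            quicksort(a, low, p - 1);
            quicksort(a, p + 1, high);
        }
    }

    public static void main(String[] args) {
        int n = 1000;
        int[] sorted = new int[n];
        for (int i = 0; i < n; i++) sorted[i] = i;
        comparisons = 0;
        quicksort(sorted, 0, n - 1);
        // on a sorted array: n(n-1)/2 comparisons
        System.out.println(comparisons);  // 499500 for n = 1000
    }
}
```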

Quick Sort: Average Case

If the array is sorted to begin with, Quick Sort's running time is terrible: O(n^2). (Remark: this can be avoided by random selection of the pivot.) It is possible to construct other bad cases. However, Quick Sort usually runs (on average) in time O(n log n) -> see CMPT 307 for a detailed analysis. The constant in front of n log n is so small that Quick Sort is generally the fastest comparison-based sorting algorithm in practice. Most real-world sorting is done by Quick Sort.
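The random-pivot remedy mentioned above can be sketched like this (a minimal illustration; the partition scheme is the Lomuto-style one with a random swap added, and the helper names are assumptions):

```java
import java.util.Arrays;
import java.util.Random;

public class RandomizedQuicksort {
    static final Random RNG = new Random();

    static void swap(int[] a, int i, int j) { int t = a[i]; a[i] = a[j]; a[j] = t; }

    static int partition(int[] a, int low, int high) {
        // move a uniformly random element into the pivot position first;
        // this makes the O(n^2) behaviour on sorted input extremely unlikely
        swap(a, low + RNG.nextInt(high - low + 1), high);
        int pivot = a[high];
        int lastS1 = low - 1;
        for (int i = low; i < high; i++)
            if (a[i] < pivot) swap(a, ++lastS1, i);
        swap(a, lastS1 + 1, high);
        return lastS1 + 1;
    }

    static void quicksort(int[] a, int low, int high) {
        if (low < high) {
            int p = partition(a, low, high);
            quicksort(a, low, p - 1);
            quicksort(a, p + 1, high);
        }
    }

    public static void main(String[] args) {
        int[] a = {5, 3, 8, 1, 9, 2, 7};
        quicksort(a, 0, a.length - 1);
        System.out.println(Arrays.toString(a));  // [1, 2, 3, 5, 7, 8, 9]
    }
}
```

With a random pivot, no particular input (sorted or otherwise) is consistently bad; the expected running time is O(n log n) for every input.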

Exercise Problem on Quick Sort

What is the running time of QUICKSORT when
a) all elements of array A have the same value?
b) the array A contains distinct elements sorted in decreasing order?

Answer – 1st algorithm

The pivot is chosen to be the last element of the subarray.
a) Whatever pivot you choose in each subarray, the result is worst-case partitioning (l = high), and hence the running time is O(n^2).
b) The same. Since you always pick the minimum element of the subarray as the pivot, each partition is a worst-case partition, and hence the running time is O(n^2) again.

Answer – 2nd algorithm

The pivot is chosen to be the first element of the subarray.
a) Whatever pivot you choose in each subarray, the result is worst-case partitioning (everything is put into the S2 part), and hence the running time is O(n^2).
b) The same. Since you always pick the maximum element of the subarray as the pivot, each partition is a worst-case partition, and hence the running time is O(n^2) again.

A Comparison of Sorting Algorithms
[Table: approximate growth rates of the time required by eight sorting algorithms]

Finding the k-th Smallest Element in an Array (the Selection Problem)

One possible strategy: sort the array and just take the k-th element. This requires O(n log n) time if we use an efficient sorting algorithm. Question: could we use the partitioning idea from Quicksort instead?

Finding the k-th Smallest Element in an Array

Assume we have partitioned the subarray as before:
- If S1 contains k or more items, then S1 contains the k-th smallest item.
- If S1 contains exactly k-1 items, then the k-th smallest item is the pivot p.
- If S1 contains fewer than k-1 items, then S2 contains the k-th smallest item.

Finding the k-th Smallest Element in an Array

public Comparable select(int k, Comparable[] arr, int low, int high)
// pre: low <= high and
//      k <= high-low+1 (the number of elements in the subarray)
// returns the k-th smallest element of the subarray arr[low..high]
{
    int pivotIndex = partition(arr, low, high);
    // Note: pivotIndex - low is the local index of the pivot in the subarray
    if (k == pivotIndex - low + 1) {
        // the pivot is the k-th element of the subarray
        return arr[pivotIndex];
    } else if (k < pivotIndex - low + 1) {
        // the k-th element must be in the S1 partition
        return select(k, arr, low, pivotIndex - 1);
    } else { // k > pivotIndex - low + 1
        // the k-th element must be in the S2 partition.
        // Note: there are pivotIndex - low elements in S1, plus the pivot,
        // all smaller than the elements in S2, so we have to recalculate k
        return select(k - (pivotIndex - low + 1), arr, pivotIndex + 1, high);
    }
} // end select
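A self-contained int[] version of the same partition-based selection, useful for experimenting (the class name, Lomuto-style partition helper, and sample array are illustrative assumptions):

```java
public class QuickSelect {
    static void swap(int[] a, int i, int j) { int t = a[i]; a[i] = a[j]; a[j] = t; }

    // Partition a[low..high] around the pivot a[high]; returns the pivot index.
    static int partition(int[] a, int low, int high) {
        int pivot = a[high];
        int lastS1 = low - 1;
        for (int i = low; i < high; i++)
            if (a[i] < pivot) swap(a, ++lastS1, i);
        swap(a, lastS1 + 1, high);
        return lastS1 + 1;
    }

    // k-th smallest (1-based) element of a[low..high]; recurses into one side only.
    static int select(int k, int[] a, int low, int high) {
        int p = partition(a, low, high);
        int rank = p - low + 1;        // local rank of the pivot in the subarray
        if (k == rank) return a[p];
        if (k < rank)  return select(k, a, low, p - 1);
        return select(k - rank, a, p + 1, high);  // re-index k into S2
    }

    public static void main(String[] args) {
        int[] a = {7, 2, 9, 4, 1, 8};
        System.out.println(select(3, a, 0, a.length - 1));  // 4: the 3rd smallest
    }
}
```

Note that, unlike Quicksort, only one of the two partitions is processed recursively, which is what drops the best-case cost from O(n log n) to O(n).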

Finding the k-th Smallest Item in an Array

The running time in the best case: T(n) = T(n/2) + O(n). It can be shown by repeated substitution that T(n) is of order O(n).

The running time in the worst case: T(n) = T(n-1) + O(n), which gives time O(n^2).

The average case is O(n). By selecting a pivot close to the median (using a recursive linear-time algorithm), we can achieve O(n) time in the worst case as well.