Advanced Sorting.

Types of Sorting Algorithms There are many different sorting algorithms, but the primary ones are: Bubble Sort, Selection Sort, Insertion Sort, Merge Sort, Quick Sort, Heap Sort, Shell Sort, Radix Sort, and Swap Sort.

Overview For now, we’ll consider the following: Divide and Conquer Merge Sort Quick Sort

Divide and Conquer
1. Base case: solve the problem directly if it is small enough.
2. Divide the problem into two or more similar, smaller subproblems.
3. Recursively solve the subproblems.
4. Combine the solutions to the subproblems.
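As a rough illustration (not from the slides), the pattern can be sketched in Python; the names is_small, solve_directly, split, and combine below are hypothetical placeholders for the problem-specific pieces:

def divide_and_conquer(problem, is_small, solve_directly, split, combine):
    # Base case: solve small problems directly.
    if is_small(problem):
        return solve_directly(problem)
    # Divide into smaller subproblems, solve each recursively, then combine.
    subproblems = split(problem)
    subsolutions = [divide_and_conquer(p, is_small, solve_directly, split, combine)
                    for p in subproblems]
    return combine(subsolutions)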

Divide and Conquer - Sort
Base case: at most one element (left ≥ right), return.
Divide: split A into two subarrays, FirstPart and SecondPart.
Conquer: recursively sort FirstPart and recursively sort SecondPart.
Combine: combine the sorted FirstPart and the sorted SecondPart.

Overview Divide and Conquer Merge Sort Quick Sort

Merge sort An example of recursion. The idea in merge sort is to divide the array in half, sort each half, and then merge the two halves into a single sorted array; each half is itself sorted by recursion. It is a much more efficient sorting technique than bubble, insertion, and selection sort, which take O(N²) time: merge sort is O(N log N). Merge sort is also fairly easy to implement, and it is conceptually simpler than quicksort, Shell sort, etc. The downside of merge sort is that it requires an additional array in memory, equal in size to the one being sorted.

Merge Sort: Idea
Divide: if the initial array A has at least two elements (nothing needs to be done if A has zero or one element), divide A into two subarrays, FirstPart and SecondPart, each containing about half of the elements of A.
Conquer: recursively sort the two subarrays using Merge Sort.
Combine: merge the two sorted subarrays into one sorted array. A is sorted!

Merge Sort: Algorithm
Merge-Sort(A, left, right)
  if left ≥ right
    return                                // base case
  else
    middle ← ⌊(left + right) / 2⌋
    Merge-Sort(A, left, middle)           // recursive call
    Merge-Sort(A, middle + 1, right)      // recursive call
    Merge(A, left, middle, right)
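A minimal Python sketch of the same recursion, assuming a merge(A, left, middle, right) helper like the one given on the Merge slide below (the slide's pseudocode is the reference; this is just one way to write it):

def merge_sort(A, left, right):
    # Sort A[left..right] in place.
    if left >= right:                    # base case: at most one element
        return
    middle = (left + right) // 2
    merge_sort(A, left, middle)          # recursively sort the first half
    merge_sort(A, middle + 1, right)     # recursively sort the second half
    merge(A, left, middle, right)        # combine the two sorted halves

It would be called on a whole list as merge_sort(A, 0, len(A) - 1).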

Merge-Sort: Merge (figure): the sorted FirstPart A[left..middle] and the sorted SecondPart A[middle+1..right] are merged into a single sorted run A[left..right].

Merge-Sort: Merge Example
The temporary arrays are L = [2, 3, 7, 8] (the sorted FirstPart) and R = [1, 4, 5, 6] (the sorted SecondPart). Starting from i = 0, j = 0, k = 0, each step compares L[i] with R[j], copies the smaller into A[k], and advances that index together with k:
k=0: copy 1 from R; k=1: copy 2 from L; k=2: copy 3 from L; k=3: copy 4 from R; k=4: copy 5 from R; k=5: copy 6 from R (R is now exhausted); k=6 and k=7: copy the remaining 7 and 8 from L.
Result: A = [1, 2, 3, 4, 5, 6, 7, 8].

Merge(A, left, middle, right)
  n1 ← middle – left + 1
  n2 ← right – middle
  create arrays L[n1], R[n2]
  for i ← 0 to n1 – 1 do L[i] ← A[left + i]
  for j ← 0 to n2 – 1 do R[j] ← A[middle + 1 + j]
  k ← left; i ← j ← 0
  while i < n1 and j < n2
    if L[i] < R[j] then A[k++] ← L[i++]
    else A[k++] ← R[j++]
  while i < n1 do A[k++] ← L[i++]       // copy any leftovers of L
  while j < n2 do A[k++] ← R[j++]       // copy any leftovers of R
With n = n1 + n2 = right – left + 1: Time: cn for some constant c. Extra space: n.
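The merge step written out in Python, following the pseudocode above (a sketch; it allocates the two temporary arrays L and R on every call):

def merge(A, left, middle, right):
    L = A[left:middle + 1]          # copy of the sorted first part
    R = A[middle + 1:right + 1]     # copy of the sorted second part
    i = j = 0
    k = left
    # Repeatedly copy the smaller front element back into A.
    while i < len(L) and j < len(R):
        if L[i] < R[j]:
            A[k] = L[i]
            i += 1
        else:
            A[k] = R[j]
            j += 1
        k += 1
    # Copy whatever remains of L, then whatever remains of R.
    while i < len(L):
        A[k] = L[i]
        i += 1
        k += 1
    while j < len(R):
        A[k] = R[j]
        j += 1
        k += 1

For example, with A = [2, 3, 7, 8, 1, 4, 5, 6], merge(A, 0, 3, 7) leaves A = [1, 2, 3, 4, 5, 6, 7, 8], matching the walkthrough above.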

Merge-Sort(A, 0, 7): Trace
Starting array A = [6, 2, 8, 4, 3, 7, 5, 1].
Merge-Sort(A, 0, 7): divide into A[0..3] = [6, 2, 8, 4] and A[4..7] = [3, 7, 5, 1].
Merge-Sort(A, 0, 3): divide into [6, 2] and [8, 4].
Merge-Sort(A, 0, 1): divide into [6] and [2].
Merge-Sort(A, 0, 0): base case, return. Merge-Sort(A, 1, 1): base case, return.
Merge(A, 0, 0, 1): A = [2, 6, 8, 4, 3, 7, 5, 1]. Merge-Sort(A, 0, 1) returns.
Merge-Sort(A, 2, 3): divide into [8] and [4].
Merge-Sort(A, 2, 2): base case, return. Merge-Sort(A, 3, 3): base case, return.
Merge(A, 2, 2, 3): A = [2, 6, 4, 8, 3, 7, 5, 1]. Merge-Sort(A, 2, 3) returns.
Merge(A, 0, 1, 3): A = [2, 4, 6, 8, 3, 7, 5, 1]. Merge-Sort(A, 0, 3) returns.
Merge-Sort(A, 4, 7): the same process on [3, 7, 5, 1], ending with Merge(A, 4, 5, 7): A = [2, 4, 6, 8, 1, 3, 5, 7]. Merge-Sort(A, 4, 7) returns.
Merge(A, 0, 3, 7): A = [1, 2, 3, 4, 5, 6, 7, 8]. Merge-Sort(A, 0, 7) is done!

Merge-Sort Analysis
Recursion tree: the top level does cn work; the next level does 2 × cn/2 = cn work on two subproblems of size n/2; the next does 4 × cn/4 = cn work on four subproblems of size n/4; and so on for log n levels, down to n/2 subproblems of size 2 doing (n/2) × 2c = cn work. Total: cn log n.
Total running time: Θ(n log n). Total space: Θ(n).
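The same counting can be stated as a recurrence (a standard derivation, written in LaTeX here and consistent with the tree above; n is taken as a power of 2):

T(n) = 2\,T(n/2) + cn, \qquad T(1) = c
\;\Rightarrow\; T(n) = cn\log_2 n + cn = \Theta(n \log n)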

Merge-Sort Summary
Approach: divide and conquer. Most of the work is in the merging.
Running time: Θ(n log n); best case: Ω(n log n); worst case: O(n log n).
Space: Θ(n), more space than other sorts.
This is typically the method you would naturally use when sorting a pile of books, CDs, cards, etc.
Advantages: stable (consistent) running time, fast running time.
Disadvantage: needs extra memory space for the merge step.

Overview Divide and Conquer Merge Sort Quick Sort

Quick Sort
Quicksort is undoubtedly the most popular sorting algorithm, and for good reason: in the majority of situations it is the fastest, operating in O(N log N) time. Quicksort was discovered by C.A.R. Hoare in 1962. The quicksort algorithm operates by partitioning an array into two subarrays and then calling itself recursively to quicksort each of these subarrays. There are three basic steps:
1. Partition the array or subarray into a left group (smaller keys) and a right group (larger keys).
2. Call itself recursively to sort the left group.
3. Call itself recursively again to sort the right group.

Partitioning Partitioning is the underlying mechanism of quicksort, but it’s also a useful operation on its own. To partition data is to divide it into two groups, so that all the items with a key value higher than a specified amount are in one group, and all the items with a lower key value are in another. You can easily imagine situations in which you would want to partition data. You may want to divide your personnel records into two groups: employees who live within 15 miles of the office and those who live farther away. Or a school administrator might want to divide students into those with grade point averages higher and lower than 3.5, so as to know who deserves to be on the Dean’s list.
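As a side illustration of partitioning by a threshold, here is a minimal Python sketch of the simple two-group split described here (this is not the in-place Partition that quicksort uses below; the 3.5 GPA cutoff and the sample students are just made-up values echoing the text's example):

def partition_by(items, key, threshold):
    # Split items into (below, at_or_above) by comparing key(item) to threshold.
    below, at_or_above = [], []
    for item in items:
        if key(item) < threshold:
            below.append(item)
        else:
            at_or_above.append(item)
    return below, at_or_above

# Example: students below / at-or-above a 3.5 grade point average.
students = [("Ann", 3.9), ("Bob", 3.2), ("Cal", 3.6)]
others, deans_list = partition_by(students, key=lambda s: s[1], threshold=3.5)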

The Partition Algorithm

Quick Sort
Divide: pick any element p as the pivot, e.g., the first element, and partition the remaining elements into FirstPart, which contains all elements < p, and SecondPart, which contains all elements ≥ p.
Conquer: recursively sort FirstPart and SecondPart.
Combine: no work is necessary, since the sorting is done in place.

Quick Sort (figure): Partition places the pivot p so that FirstPart holds the elements x < p and SecondPart holds the elements with p ≤ x; recursive calls then sort FirstPart and SecondPart, leaving the whole array A sorted.

Quick Sort
Quick-Sort(A, left, right)
  if left ≥ right
    return
  else
    middle ← Partition(A, left, right)
    Quick-Sort(A, left, middle – 1)
    Quick-Sort(A, middle + 1, right)
  end if
Divide: partition the array into 2 subarrays such that elements in the lower part ≤ elements in the higher part:
1. If the number of elements to be sorted is 0 or 1, then return.
2. Pick any element v (this is called the pivot).
3. Partition the other elements into two disjoint sets, S1 of elements ≤ v and S2 of elements > v.
Conquer: recursively sort the 2 subarrays.
Combine: trivial, since the sorting is done in place; the result is QuickSort(S1) followed by v followed by QuickSort(S2).
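The recursion in Python, assuming the partition(A, left, right) routine given after the example below; a sketch using the first element of each subarray as the pivot, as on these slides:

def quick_sort(A, left, right):
    # Sort A[left..right] in place.
    if left >= right:                    # 0 or 1 elements: already sorted
        return
    middle = partition(A, left, right)   # pivot ends up at index `middle`
    quick_sort(A, left, middle - 1)      # sort the elements < pivot
    quick_sort(A, middle + 1, right)     # sort the elements >= pivot

It would be called on a whole list as quick_sort(A, 0, len(A) - 1).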

Partition Example
A = [4, 8, 6, 3, 5, 1, 7, 2]; pivot x = A[0] = 4, i = 0.
j=1: A[1] = 8 is not < 4, no change. j=2: A[2] = 6 is not < 4, no change.
j=3: A[3] = 3 < 4, so i = 1 and swap A[1], A[3]: A = [4, 3, 6, 8, 5, 1, 7, 2].
j=4: 5 is not < 4, no change.
j=5: 1 < 4, so i = 2 and swap A[2], A[5]: A = [4, 3, 1, 8, 5, 6, 7, 2].
j=6: 7 is not < 4, no change.
j=7: 2 < 4, so i = 3 and swap A[3], A[7]: A = [4, 3, 1, 2, 5, 6, 7, 8].
Finally swap A[i] = A[3] with A[left] = A[0]: A = [2, 3, 1, 4, 5, 6, 7, 8]. The pivot is in its correct position and Partition returns i = 3.

Partition(A, left, right)
  x ← A[left]                     // pivot: first element of the subarray
  i ← left
  for j ← left + 1 to right
    if A[j] < x then
      i ← i + 1
      swap(A[i], A[j])
    end if
  end for
  swap(A[i], A[left])             // move the pivot into its final position
  return i
With n = right – left + 1: Time: cn for some constant c. Space: constant.
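The same Partition in Python (a direct transcription of the pseudocode above; pivot is the first element of the subarray):

def partition(A, left, right):
    x = A[left]                    # pivot value
    i = left                       # boundary of the "< pivot" region
    for j in range(left + 1, right + 1):
        if A[j] < x:
            i += 1
            A[i], A[j] = A[j], A[i]    # grow the "< pivot" region
    A[i], A[left] = A[left], A[i]      # put the pivot in its final place
    return i

For A = [4, 8, 6, 3, 5, 1, 7, 2], partition(A, 0, 7) rearranges A to [2, 3, 1, 4, 5, 6, 7, 8] and returns 3, matching the example above.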

Quick-Sort(A, 0, 7): Trace
Partition(A, 0, 7) with pivot 4: A = [2, 3, 1, 4, 5, 6, 7, 8], pivot index 3.
Quick-Sort(A, 0, 2): Partition with pivot 2 gives A[0..2] = [1, 2, 3], pivot index 1; Quick-Sort(A, 0, 0) and Quick-Sort(A, 2, 2) are base cases and return; Quick-Sort(A, 0, 2) returns.
Quick-Sort(A, 4, 7): Partition with pivot 5 gives A[4..7] = [5, 6, 7, 8], pivot index 4.
Quick-Sort(A, 5, 7): Partition with pivot 6, pivot index 5.
Quick-Sort(A, 6, 7): Partition with pivot 7, pivot index 6.
Quick-Sort(A, 7, 7): base case, return; then Quick-Sort(A, 6, 7), Quick-Sort(A, 5, 7), and Quick-Sort(A, 4, 7) return.
Quick-Sort(A, 0, 7) is done: A = [1, 2, 3, 4, 5, 6, 7, 8].

Quick-Sort: Best Case (even partition)
Recursion tree: the top level does cn work on a problem of size n; the next level does 2 × cn/2 = cn work on two subproblems of size n/2; the next does 4 × cn/4 = cn work on four subproblems of size n/4; and so on for about log n levels, down to roughly n/3 subproblems of size 3 doing (n/3) × 3c = cn work in total.
Total time: Θ(n log n).

Quick-Sort: Worst Case (unbalanced partition)
Recursion tree: cn work on a problem of size n, then c(n–1) on size n–1, then c(n–2) on size n–2, and so on down to 3c and 2c.
With the first element as the pivot, this happens when the input is already sorted or is reversely sorted.
Total time: Θ(n²).
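Written as a recurrence (standard; by contrast, the best case gives the balanced recurrence T(n) = 2T(n/2) + cn = Θ(n log n), the same as merge sort):

T(n) = T(n-1) + cn, \qquad T(1) = c
\;\Rightarrow\; T(n) = c\sum_{k=1}^{n} k = \frac{c\,n(n+1)}{2} = \Theta(n^2)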

Quick-Sort Summary
Time: most of the work is done in partitioning. Best/average case takes Θ(n log n) time; worst case takes Θ(n²) time.
Space: sorts in place, i.e., does not require additional space.
With a randomized pivot, no specific input triggers the worst-case behavior; the worst case is determined only by the output of the random-number generator.
Disadvantages: unstable (unpredictable) running time; finding a good "pivot" element is a big issue!
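One common way to get the "no specific input triggers the worst case" property mentioned here is to pick the pivot at random before partitioning. A sketch on top of the partition routine above (an illustrative variant, not part of the slides' algorithm):

import random

def randomized_partition(A, left, right):
    # Swap a randomly chosen element into the pivot position, then partition as usual.
    r = random.randint(left, right)
    A[left], A[r] = A[r], A[left]
    return partition(A, left, right)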

Summary
Divide and Conquer.
Merge-Sort: most of the work is done in merging; Θ(n log n) time; Θ(n) extra space.
Quick-Sort: most of the work is done in partitioning; best/average case takes Θ(n log n) time; worst case takes Θ(n²) time; Θ(1) extra space. With a randomized pivot, no specific input triggers the worst-case behavior; the worst case is determined only by the output of the random-number generator. Disadvantages: unstable (unpredictable) running time; finding a good "pivot" element is a big issue!

Homework
1. What is the running time of Merge-Sort if the array is already sorted? What is the best-case running time of Merge-Sort?
2. Demonstrate the working of Partition on the sequence (13, 5, 14, 11, 16, 12, 1, 15). What is the value of i returned at the completion of Partition?