
1 SORTING Dan Barrish-Flood

2 Heapsort: covered in the separate file “3-Sorting-Intro-Heapsort.ppt”

3 Quicksort Worst-case running time is Θ(n^2) on an input array of n numbers. Expected running time is Θ(n lg n). Constants hidden by Θ are quite small. Sorts in place. Probably the best sorting algorithm for large input arrays. Maybe.

4 How does Quicksort work? Based on the “divide and conquer” paradigm (as is merge sort). Divide: Partition (re-arrange) the array A[p..r] into two (possibly empty) sub-arrays A[p..q-1] and A[q+1..r] such that each element of A[p..q-1] is ≤ A[q], which is, in turn, ≤ each element of A[q+1..r]. Compute the index q as part of this partitioning procedure. Conquer: Sort the two sub-arrays A[p..q-1] and A[q+1..r] by recursive calls to quicksort. Combine: No combining needed; the entire array A[p..r] is now sorted!
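The divide-and-conquer scheme above can be sketched in Python. This is a hypothetical illustration using the last-element (Lomuto-style) partition from CLRS; the slide itself does not fix a pivot rule:

```python
def partition(A, p, r):
    """Rearrange A[p..r] around the pivot A[r]; return the pivot's final index q."""
    pivot = A[r]
    i = p - 1                        # boundary of the "<= pivot" region
    for j in range(p, r):
        if A[j] <= pivot:
            i += 1
            A[i], A[j] = A[j], A[i]
    A[i + 1], A[r] = A[r], A[i + 1]  # place the pivot between the two regions
    return i + 1

def quicksort(A, p, r):
    if p < r:
        q = partition(A, p, r)   # Divide: A[p..q-1] <= A[q] <= A[q+1..r]
        quicksort(A, p, q - 1)   # Conquer the left sub-array
        quicksort(A, q + 1, r)   # Conquer the right; no Combine step needed

A = [2, 8, 7, 1, 3, 5, 6, 4]
quicksort(A, 0, len(A) - 1)
print(A)  # [1, 2, 3, 4, 5, 6, 7, 8]
```

The sort is in place: partition only swaps within A[p..r], so no auxiliary array is needed.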

5 Quicksort

6 Partition in action

7 Quicksort Running Time, worst-case The worst case occurs when partitioning yields one subproblem of size n-1 and one of size 0. Assume this “bad split” occurs at each recursive call. Partitioning costs Θ(n). The recursive call to QS on an array of size 0 just returns, so T(0) = Θ(1), and we get: T(n) = T(n-1) + T(0) + Θ(n), the same as... T(n) = T(n-1) + n, just an arithmetic series! So... T(n) = Θ(n^2) (worst-case) Under what circumstances do you suppose we get this worst-case behavior?
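Unrolling T(n) = T(n-1) + n gives the arithmetic series n + (n-1) + ... + 1 = n(n+1)/2 = Θ(n^2). (With a last-element pivot, an already-sorted array triggers exactly this behavior.) A small self-contained check, with a hypothetical helper name, counting comparisons when every split is maximally uneven:

```python
def worst_case_comparisons(n):
    """Comparisons made by quicksort when each partition splits into (n-1, 0)."""
    total = 0
    while n > 1:
        total += n - 1   # partitioning n items costs n-1 comparisons
        n -= 1           # then we recurse on a subproblem of size n-1
    return total

# Matches the closed form n(n-1)/2, i.e. Theta(n^2):
for n in [10, 100, 1000]:
    assert worst_case_comparisons(n) == n * (n - 1) // 2
print(worst_case_comparisons(1000))  # 499500
```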

8 Quicksort, best-case In the most even possible split, PARTITION yields two subproblems each of size no more than n/2, since one is of size floor(n/2) and one is ceiling(n/2)-1. We get this recurrence, with some acceptable sloppiness: T(n) = 2T(n/2) + n (look familiar?) T(n) = O(n lg n) This is asymptotically superior to the worst case, but this ideal scenario is not likely...

9 Quicksort, Average-case Suppose great and awful splits alternate between levels in the tree. The running time for QS, when levels alternate between great and awful splits, is the same as when all levels yield great splits! (with a slightly larger constant hidden by the big-O notation). So, on average... T(n) = O(n lg n)

10 A lower bound for sorting (Sorting, part 2) We will show that any sorting algorithm based only on comparisons of the input values must run in Ω(n lg n) time in the worst case.

11 Decision Trees Tree of comparisons made by a sorting algorithm. Each comparison reduces the number of possible orderings. Eventually, only one must remain. A decision tree is a “full” (not “complete”) binary tree; each node is a leaf or has degree 2.

12 Q. How many leaves does a decision tree have? A. There is one leaf for each permutation of n elements. There are n! permutations. Q. What is the height of the tree? A. # of leaves = n! ≤ 2^h Note the height is the worst-case number of comparisons that might be needed.

13 Show we can’t beat n lg n Recall n! ≤ 2^h... now take logs: lg(n!) ≤ lg(2^h) lg(n!) ≤ h lg 2 lg(n!) ≤ h... just flip it over: h ≥ lg(n!) Since lg(n!) = Θ(n lg n) (Stirling; CLRS p. 55), h = Ω(n lg n). QED In the worst case, Ω(n lg n) comparisons are needed to sort n items.
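The bound h ≥ lg(n!) can be checked numerically. A small sketch (helper name is mine) computing the smallest height h with n! ≤ 2^h, alongside n lg n for comparison:

```python
import math

def min_height(n):
    """Information-theoretic lower bound: smallest integer h with n! <= 2^h."""
    return math.ceil(math.log2(math.factorial(n)))

for n in [4, 8, 16]:
    # min_height(n) grows like n lg n, as Stirling's approximation predicts
    print(n, min_height(n), round(n * math.log2(n), 1))
```

For example, sorting 4 items requires at least ceil(lg 24) = 5 comparisons in the worst case, no matter how clever the comparison-based algorithm.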

14 Sorting in Linear Time !!! The Ω(n lg n) bound does not apply if we use information other than comparisons. Like what other info? 1. Use the item as an array index. 2. Examine the digits (or bits) of the item.

15 Counting Sort Good for sorting integers in a narrow range. Assume the input numbers (keys) are in the range 0..k. Use an auxiliary array C[0..k] to hold, for each i with 0 ≤ i ≤ k, the number of keys less than or equal to i. If k = O(n), then the running time is Θ(n). Counting sort is stable; it keeps records with equal keys in their original order.
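A minimal stable counting sort in Python, my sketch of the standard CLRS procedure, assuming integer keys in 0..k:

```python
def counting_sort(A, k):
    """Stably sort a list of integers in the range 0..k; returns a new list."""
    C = [0] * (k + 1)
    for key in A:                # after this loop, C[i] = number of keys == i
        C[key] += 1
    for i in range(1, k + 1):    # prefix sums: C[i] = number of keys <= i
        C[i] += C[i - 1]
    B = [None] * len(A)
    for key in reversed(A):      # right-to-left scan preserves stability
        C[key] -= 1
        B[C[key]] = key
    return B

print(counting_sort([2, 5, 3, 0, 2, 3, 0, 3], 5))  # [0, 0, 2, 2, 3, 3, 3, 5]
```

Both loops over A are Θ(n) and the prefix-sum loop is Θ(k), giving Θ(n+k) total, which is Θ(n) when k = O(n).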


17 Counting Sort in action

18 Radix Sort How IBM made its money: punch-card readers for census tabulation in the early 1900s. Card sorters worked on one column at a time. Sort on each digit (or field) separately. Start with the least-significant digit. Must use a stable sort.

RADIX-SORT(A, d)
1  for i ← 1 to d
2      do use a stable sort to sort array A on digit i
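RADIX-SORT can be sketched in Python, using a stable counting sort on each decimal digit, least-significant first (function and variable names are mine):

```python
def radix_sort(A, d):
    """Sort non-negative integers with at most d decimal digits."""
    for i in range(d):                      # least-significant digit first
        digit = lambda x: (x // 10**i) % 10
        C = [0] * 10
        for x in A:                         # count occurrences of each digit
            C[digit(x)] += 1
        for j in range(1, 10):              # prefix sums: positions per digit
            C[j] += C[j - 1]
        B = [None] * len(A)
        for x in reversed(A):               # right-to-left keeps the pass stable
            C[digit(x)] -= 1
            B[C[digit(x)]] = x
        A = B
    return A

print(radix_sort([329, 457, 657, 839, 436, 720, 355], 3))
# [329, 355, 436, 457, 657, 720, 839]
```

Each pass is the Θ(n+k) counting sort with k = 9, so d passes cost Θ(d(n+10)) = Θ(dn).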

19 Radix Sort in Action

20 Correctness of Radix Sort Induction on the number of passes. Base case: the low-order digit is sorted correctly. Inductive step: show that a stable sort on digit i leaves digits 1..i sorted: –if two keys differ in position i, ordering by position i is correct, and positions 1..i-1 are irrelevant; –if two keys are equal in position i, they are already in the right order (by the inductive hypothesis), and the stable sort on digit i leaves them in that order. Radix sort must invoke a stable sort.

21 Running Time of Radix Sort Use counting sort as the invoked stable sort, if the range of digit values is not large. If the digit range is 0..k, then each pass takes Θ(n+k) time. There are d passes, for a total of Θ(d(n+k)). If k = O(n), the time is Θ(dn); when d is constant, we have Θ(n), linear!