CompSci 100 18.1 Sorting: From Theory to Practice


Sorting: From Theory to Practice
- Why do we study sorting?
  - Because we have to
  - Because sorting is beautiful
  - Example of algorithm analysis in a simple, useful setting
- There are n sorting algorithms; how many should we study?
  - O(n)? O(log n)? …
- Why do we study more than one algorithm?
  - Some are good, some are bad, some are very, very sad
  - Paradigms of trade-offs and algorithmic design
- Which sorting algorithm is best?
- Which sort should you call from code you write?

Sorting out sorts
- Simple, O(n^2) sorts, for sorting n elements
  - Selection sort: n^2 comparisons, n swaps, easy to code
  - Insertion sort: n^2 comparisons, n^2 moves, stable; very fast on nearly sorted vectors: O(n)
  - Bubble sort: n^2 everything, slower
- Divide-and-conquer sorts, faster: O(n log n) for n elements
  - Quicksort: fast in practice, O(n^2) worst case
  - Merge sort: good worst case, great for linked lists, stable, uses extra storage for vectors/arrays
- Other sorts:
  - Heapsort: basically priority-queue sorting
  - Radix sort: doesn't compare keys, uses digits/characters
  - Shellsort: quasi-insertion, fast in practice, non-recursive

Selection sort: summary
- Simple-to-code O(n^2) sort: n^2 comparisons, n swaps

    void selectSort(String[] a) {
        for (int k = 0; k < a.length; k++) {
            int minIndex = findMin(a, k);  // index of smallest element in a[k..]
            swap(a, k, minIndex);
        }
    }

- # comparisons: sum of k for k = 1..n = 1 + 2 + … + n = n(n+1)/2 = O(n^2)
- Swaps? n, one per pass
- Invariant: a[0..k-1] is sorted and in final position (won't move); a[k..] is unexamined (?????)
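The slide's code relies on findMin and swap helpers that aren't shown. A minimal sketch of what they presumably look like (the names come from the slide; the bodies are assumptions):

    // assumed helper: index of the smallest element in a[from..]
    static int findMin(String[] a, int from) {
        int minIndex = from;
        for (int k = from + 1; k < a.length; k++) {
            if (a[k].compareTo(a[minIndex]) < 0) minIndex = k;
        }
        return minIndex;
    }

    // assumed helper: exchange a[i] and a[j]
    static void swap(String[] a, int i, int j) {
        String t = a[i]; a[i] = a[j]; a[j] = t;
    }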

Insertion Sort: summary
- Stable sort, O(n^2), good on nearly sorted vectors
  - Stable sorts maintain the order of equal keys
  - Good for sorting on two criteria: name, then age

    void insertSort(String[] a) {
        int k, loc;
        String elt;
        for (k = 1; k < a.length; k++) {
            elt = a[k];
            loc = k;
            // shift until spot for elt is found
            while (0 < loc && elt.compareTo(a[loc-1]) < 0) {
                a[loc] = a[loc-1];  // shift right
                loc--;
            }
            a[loc] = elt;
        }
    }

- Invariant: a[0..k-1] is sorted relative to each other (not necessarily in final position); a[k..] is unexamined (?????)

Bubble sort: summary of a dog
- For completeness you should know about this sort
- Few, if any, redeeming features; really, really slow
- Can code it to recognize an already sorted vector (see insertion)
  - Not worth it for bubble sort: much slower than insertion

    void bubbleSort(String[] a) {
        for (int j = a.length - 1; j >= 0; j--) {
            for (int k = 0; k < j; k++) {
                if (a[k].compareTo(a[k+1]) > 0) swap(a, k, k+1);
            }
        }
    }

- "Bubble" elements down the vector/array
- Invariant: a[j+1..] is sorted, in final position; the rest is unexamined (?????)

Summary of simple sorts
- Selection sort has n swaps: good for "heavy" data
  - Moving objects with lots of state, e.g., …
  - In C or C++ this is an issue
  - In Java everything is a pointer/reference, so swapping is fast: it's just pointer assignment
- Insertion sort is good on nearly sorted data; it's stable, it's fast
  - Also the foundation for Shellsort, a very fast non-recursive sort
  - More complicated to code, but relatively simple, and fast
- Bubble sort is a travesty? But it's fast to code if you know it!
  - Can be parallelized, but on one machine don't go near it

Quicksort: fast in practice
- Invented in 1962 by C.A.R. Hoare, who at the time didn't fully understand recursion
- Worst case is O(n^2), but avoidable in nearly all cases
  - In 1997 Introsort was published (Musser, introspective sort)
    - Like quicksort in practice, but recognizes when it will be bad and changes to heapsort

    void quick(String[] a, int left, int right) {
        if (left < right) {
            int pivot = partition(a, left, right);
            quick(a, left, pivot - 1);
            quick(a, pivot + 1, right);
        }
    }

- Recurrence?
- After partition: [ elements <= X | X | elements > X ], with pivot X at the returned index

Partition code for quicksort
- Easy to develop partition

    int partition(String[] a, int left, int right) {
        String pivot = a[left];
        int k, pIndex = left;
        for (k = left + 1; k <= right; k++) {
            if (a[k].compareTo(pivot) <= 0) {
                pIndex++;
                swap(a, k, pIndex);
            }
        }
        swap(a, left, pIndex);
        return pIndex;
    }

- Loop invariant: a statement true each time the loop test is evaluated, used to verify correctness of the loop
  - What we want (after the loop): a[left..pIndex-1] <= pivot, a[pIndex] == pivot, a[pIndex+1..right] > pivot
  - What we have (during the loop, the invariant): a[left+1..pIndex] <= pivot, a[pIndex+1..k-1] > pivot, a[k..right] unexamined (???)
- Can swap a chosen pivot into a[left] before the loop
- Nearly sorted data is still OK

Analysis of Quicksort
- Average-case and worst-case analysis
  - Recurrence for worst case: T(n) = T(n-1) + T(1) + O(n), which is O(n^2)
  - What about average? T(n) = 2T(n/2) + O(n), which is O(n log n)
- Reason informally:
  - Two calls on vectors of size n/2
  - Four calls on vectors of size n/4
  - … How many calls? Work done on each call?
- Partition: typically find the median of the left, middle, and right elements, swap it into place, go (see the sketch below)
  - Avoids bad performance on nearly sorted data
- In practice: remove some (all?) recursion, avoid lots of "clones"
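A minimal sketch of the median-of-three step named above, reusing the assumed swap helper; the slides don't show this code, so treat it as one plausible version:

    // order a[left], a[mid], a[right], then move the median into a[left],
    // where the partition code above expects to find its pivot
    static void medianOfThree(String[] a, int left, int right) {
        int mid = (left + right) / 2;
        if (a[mid].compareTo(a[left]) < 0)   swap(a, left, mid);
        if (a[right].compareTo(a[left]) < 0) swap(a, left, right);
        if (a[right].compareTo(a[mid]) < 0)  swap(a, mid, right);
        swap(a, left, mid);  // median is now at mid; pivot position is left
    }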

Tail recursion elimination
- If the last statement is a recursive call, recursion can be replaced with iteration
  - The call cannot be part of an expression
  - Some compilers do this automatically

    // recursive
    void foo(int n) {
        if (0 < n) {
            System.out.println(n);
            foo(n - 1);  // tail call: the last action
        }
    }

    // iterative equivalent
    void foo2(int n) {
        while (0 < n) {
            System.out.println(n);
            n = n - 1;
        }
    }

- What if the print and the recursive call were switched?
- What about recursive factorial? return n * factorial(n-1); — the call is part of an expression, so it is not a tail call
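A common way to make factorial tail recursive is to thread an accumulator through the calls; this illustration is not from the slides:

    // acc carries the product so far, so the recursive call is the last action
    static long factHelper(int n, long acc) {
        if (n <= 1) return acc;
        return factHelper(n - 1, acc * n);  // tail call
    }

    static long factorial(int n) { return factHelper(n, 1); }

    // the tail-call form then becomes a loop mechanically:
    static long factorialIter(int n) {
        long acc = 1;
        while (n > 1) { acc *= n; n--; }
        return acc;
    }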

Merge sort: worst case O(n log n)
- Divide and conquer: a recursive sort
  - Divide the list/vector into two halves
  - Sort each half
  - Merge the sorted halves together
- What is the complexity of merging two sorted lists?
- What is the recurrence relation for merge sort as described? T(n) = 2T(n/2) + O(n)
- What is the advantage of an array over a linked list for merge sort?
  - What about merging: the advantage of a linked list?
  - An array requires auxiliary storage (or very fancy coding)
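Unrolling the recurrence shows where the n log n comes from (a standard expansion, not from the slides):

    T(n) = 2T(n/2) + cn
         = 4T(n/4) + 2cn
         = 8T(n/8) + 3cn
         ...
         = 2^k T(n/2^k) + k*cn

At k = log2 n the subproblems reach size 1, giving n*T(1) + cn*log2 n = O(n log n).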

Merge sort: lists or vectors
- Mergesort for vectors:

    void mergesort(String[] a, int left, int right) {
        if (left < right) {
            int mid = (left + right) / 2;
            mergesort(a, left, mid);
            mergesort(a, mid + 1, right);
            merge(a, left, mid, right);
        }
    }

- What's different when linked lists are used?
- Do the differences affect complexity? Why?
- How does merge work?

Mergesort continued
- Array code for merge isn't pretty, but it's not hard
- Mergesort itself is elegant

    void merge(String[] a, int left, int middle, int right)
    // pre:  left <= middle <= right,
    //       a[left] <= … <= a[middle],
    //       a[middle+1] <= … <= a[right]
    // post: a[left] <= … <= a[right]

- Why is this prototype potentially simpler for linked lists?
- What will the prototype be? What is its complexity? (see the sketch below)
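For comparison, a minimal linked-list merge sketch; the Node type and this code are assumptions for illustration (the slides only ask what the prototype would be). Merging relinks nodes, so it needs no auxiliary array:

    static class Node {
        String value;
        Node next;
        Node(String value, Node next) { this.value = value; this.next = next; }
    }

    // merge two sorted lists in O(n) time and O(1) extra space
    static Node merge(Node a, Node b) {
        Node dummy = new Node(null, null);  // placeholder head
        Node tail = dummy;
        while (a != null && b != null) {
            if (a.value.compareTo(b.value) <= 0) { tail.next = a; a = a.next; }
            else                                 { tail.next = b; b = b.next; }
            tail = tail.next;
        }
        tail.next = (a != null) ? a : b;  // append whatever remains
        return dummy.next;
    }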

Mergesort continued

    void merge(String[] a, int left, int middle, int right) {
        String[] b = new String[right - left + 1];  // auxiliary storage
        int k = 0, kl = left, kr = middle + 1;
        for (; kl <= middle && kr <= right; k++) {  // take smaller front element
            if (a[kl].compareTo(a[kr]) <= 0) b[k] = a[kl++];
            else                             b[k] = a[kr++];
        }
        for (; kl <= middle; kl++) b[k++] = a[kl];  // leftovers from left half
        for (; kr <= right; kr++)  b[k++] = a[kr];  // leftovers from right half
        for (k = 0; k < b.length; k++) a[left + k] = b[k];  // copy back
    }

Summary of O(n log n) sorts
- Quicksort is relatively straightforward to code, very fast
  - Worst case is very unlikely, but possible, therefore …
  - But if lots of elements are equal, performance will be bad
    - e.g., one million integers from the range 0 to 10,000
    - How can we change partition to handle this? (see the 3-way sketch below)
- Merge sort is stable, it's fast, good for linked lists, harder to code?
  - Worst-case performance is O(n log n); compare quicksort
  - Needs extra storage for an array/vector
- Heapsort: more complex to code, good worst case, not stable
  - Basically a heap-based priority queue in a vector
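One standard answer to the equal-elements question is 3-way partitioning; this sketch is not from the slides (it's the usual Dutch-national-flag scheme, written over ints to match the example above):

    // partition a[left..right] into  < pivot | == pivot | > pivot,
    // then recurse only on the strictly-smaller and strictly-larger bands
    static void quick3(int[] a, int left, int right) {
        if (left >= right) return;
        int pivot = a[left];
        int lt = left, gt = right, k = left + 1;
        while (k <= gt) {
            if (a[k] < pivot)      { int t = a[lt]; a[lt] = a[k]; a[k] = t; lt++; k++; }
            else if (a[k] > pivot) { int t = a[gt]; a[gt] = a[k]; a[k] = t; gt--; }
            else k++;  // equal to pivot: leave it in the middle band
        }
        quick3(a, left, lt - 1);
        quick3(a, gt + 1, right);
    }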

Sorting in practice
- Rarely will you need to roll your own sort, but when you do …
  - What are the key issues?
- If you use a library sort, you need to understand the interface
  - In C++ we have the STL: it has sort and stable_sort
  - In C the generic sort (qsort) is complex to use because arrays are ugly
  - In Java, guarantees and worst case are important
    - Why won't quicksort be used?
- Comparators permit sorting criteria to change simply (see the example below)
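A small illustration of the Comparator point, using the standard java.util API; the Person type and the name-then-age criteria (echoing the insertion-sort slide) are assumptions for the example:

    import java.util.Arrays;
    import java.util.Comparator;

    public class SortDemo {
        // hypothetical record for the name-then-age example
        record Person(String name, int age) { }

        public static void main(String[] args) {
            Person[] people = {
                new Person("Ana", 30), new Person("Bo", 25), new Person("Ana", 22)
            };
            // sort by name, breaking ties by age; swapping in a different
            // Comparator changes the criteria without touching the sort
            Arrays.sort(people, Comparator.comparing(Person::name)
                                          .thenComparingInt(Person::age));
            System.out.println(Arrays.toString(people));
        }
    }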

Other N log N Sorts
- Binary tree sort
  - Basic recipe:
    - Insert into a binary search tree (BST)
    - Do an inorder traversal
  - Complexity:
    - Create: O(N log N)
    - Traversal: O(N)
  - Not usually used for sorting unless you need the BST for other reasons (see the sketch below)
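A minimal BST-sort sketch, assuming the simplest unbalanced tree (not from the slides; note the O(N log N) create cost assumes the tree stays roughly balanced, and degrades to O(N^2) on already sorted input):

    static class TreeNode {
        String value;
        TreeNode left, right;
        TreeNode(String v) { value = v; }
    }

    // insert v; duplicates go to the right subtree
    static TreeNode insert(TreeNode root, String v) {
        if (root == null) return new TreeNode(v);
        if (v.compareTo(root.value) < 0) root.left = insert(root.left, v);
        else                             root.right = insert(root.right, v);
        return root;
    }

    // inorder traversal writes the values back in sorted order
    static int inorder(TreeNode root, String[] out, int i) {
        if (root == null) return i;
        i = inorder(root.left, out, i);
        out[i++] = root.value;
        return inorder(root.right, out, i);
    }

    static void treeSort(String[] a) {
        TreeNode root = null;
        for (String s : a) root = insert(root, s);
        inorder(root, a, 0);
    }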

Other N log N Sorts
- Heapsort
  - Basic recipe:
    - Create a heap (priority queue)
    - Remove items one at a time (in sorted order!)
  - Complexity:
    - Create heap: N * O(1) = O(N)
    - Remove N items: N * O(log N) = O(N log N)
  - To make it a sort:
    - Use a max-heap on the array
    - Put removed items into the space vacated as the heap shrinks
    - Thus sort "in place": no extra array needed (see the sketch below)
  - Not a widely used sort; not stable
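An in-place heapsort sketch over ints, following the max-heap recipe above (this code isn't from the slides):

    // max-heap in a[0..n-1]; repeatedly move the max into the vacated tail
    static void heapSort(int[] a) {
        int n = a.length;
        for (int i = n / 2 - 1; i >= 0; i--) siftDown(a, i, n);  // O(N) build
        for (int end = n - 1; end > 0; end--) {
            int t = a[0]; a[0] = a[end]; a[end] = t;  // max to its final spot
            siftDown(a, 0, end);                      // restore heap on prefix
        }
    }

    static void siftDown(int[] a, int i, int n) {
        while (2 * i + 1 < n) {
            int child = 2 * i + 1;
            if (child + 1 < n && a[child + 1] > a[child]) child++;  // larger child
            if (a[i] >= a[child]) return;
            int t = a[i]; a[i] = a[child]; a[child] = t;
            i = child;
        }
    }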

Shellsort
- Uses insertion sorts with gaps (or skips)
- "Diminishing gap sort" (Donald Shell, 1959)

    [ 17 | 13 | 29 | 21 | 20 | 24 | 11 | 15 | 14 | 18 | 23 | 27 | 12 ]

- Gap = 5 (5 insertion sorts, each over every 5th element)

    [ 17 | 11 | 12 | 14 | 18 | 23 | 13 | 15 | 21 | 20 | 24 | 27 | 29 ]

- Gap = 3 (3 insertion sorts, each over every 3rd element)

    [ 13 | 11 | 12 | 14 | 15 | 21 | 17 | 18 | 23 | 20 | 24 | 27 | 29 ]

- Gap = 1 (standard insertion sort)

    [ 11 | 12 | 13 | 14 | 15 | 17 | 18 | 20 | 21 | 23 | 24 | 27 | 29 ]

- Complexity
  - Very hard to analyze: depends on the gaps used
  - O(N^(3/2)) fairly easy to achieve; can do better
- Easy to program (see the sketch below)
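A minimal Shellsort sketch; the slides use gaps 5, 3, 1, while this version uses the common halving sequence N/2, N/4, …, 1, which is an assumption:

    static void shellSort(int[] a) {
        for (int gap = a.length / 2; gap > 0; gap /= 2) {
            // gapped insertion sort: each element sinks within its own gap-chain
            for (int i = gap; i < a.length; i++) {
                int elt = a[i];
                int j = i;
                while (j >= gap && a[j - gap] > elt) {
                    a[j] = a[j - gap];  // shift right along the chain
                    j -= gap;
                }
                a[j] = elt;
            }
        }
    }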

Non-comparison-based sorts
- Lower bound: Ω(n log n) for comparison-based sorts (like the searching lower bound)
  - Bucket sort/radix sort are not comparison-based; faster asymptotically and in practice
  - Sort a vector of ints, all ints in the range …, how? (use extra storage; see the sketch below)
- Radix: examine each digit of the numbers being sorted
  - One pass per digit
  - Sort based on that digit
  - What order should the passes be in?
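A counting-sort sketch for the small-range int question; since the slide's actual range was lost, the bound k is left as a parameter here:

    // counting sort: ints in [0, k); O(n + k) time, O(k) extra storage
    static void countingSort(int[] a, int k) {
        int[] count = new int[k];
        for (int x : a) count[x]++;      // tally each value
        int i = 0;
        for (int v = 0; v < k; v++)      // write the values back in order
            while (count[v]-- > 0) a[i++] = v;
    }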

External Sorting
- Large memories on modern machines mean the techniques discussed so far usually apply
- Sometimes the data does not fit into memory
  - This used to be a common data-processing problem
- Usual recipe:
  - Chop the data into chunks that will fit into memory
  - Sort the chunks in memory using the best programs
    - Use quicksort for speed, or merge sort for a stable sort
    - Write the sorted chunks back to disk files
  - Merge the disk files (see the sketch below)
    - Read the front of 2 or more files
    - Merge
    - Write to the final disk file as you merge
    - Only part of the data needs to be in memory at any time
- Historically all done with tapes (disks were too small)
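A sketch of the merge phase, assuming one record per line in each sorted chunk file; the file layout, the Source wrapper, and the method names are all assumptions for illustration:

    import java.io.*;
    import java.util.PriorityQueue;

    public class ExternalMerge {
        // one open chunk file plus its current front record
        static class Source {
            final BufferedReader in;
            String front;
            Source(BufferedReader in) throws IOException {
                this.in = in;
                front = in.readLine();
            }
        }

        public static void merge(String[] chunkFiles, String outFile) throws IOException {
            // priority queue keyed on each file's front record
            PriorityQueue<Source> pq =
                new PriorityQueue<>((x, y) -> x.front.compareTo(y.front));
            for (String f : chunkFiles) {
                Source s = new Source(new BufferedReader(new FileReader(f)));
                if (s.front != null) pq.add(s);
            }
            try (PrintWriter out = new PrintWriter(new BufferedWriter(new FileWriter(outFile)))) {
                while (!pq.isEmpty()) {
                    Source s = pq.poll();        // smallest front record of any file
                    out.println(s.front);
                    s.front = s.in.readLine();   // advance that file
                    if (s.front != null) pq.add(s);
                    else s.in.close();
                }
            }
        }
    }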