Sorting: From Theory to Practice (CompSci 100e presentation transcript)
CompSci 100e 11.1  Sorting: From Theory to Practice

- Why do we study sorting?
  - Because we have to
  - Because sorting is beautiful
  - Example of algorithm analysis in a simple, useful setting
- There are n sorting algorithms; how many should we study?
  - O(n), O(log n), ...
  - Why do we study more than one algorithm?
    - Some are good, some are bad, some are very, very sad
    - Paradigms of trade-offs and algorithmic design
  - Which sorting algorithm is best?
  - Which sort should you call from code you write?

CompSci 100e 11.2  Sorting out sorts

- Simple, O(n^2) sorts, for sorting n elements
  - Selection sort: n^2 comparisons, n swaps, easy to code
  - Insertion sort: n^2 comparisons, n^2 moves, stable, fast
  - Bubble sort: n^2 everything, slow, slower, and ugly
- Divide-and-conquer faster sorts: O(n log n) for n elements
  - Quicksort: fast in practice, O(n^2) worst case
  - Merge sort: good worst case, great for linked lists, uses extra storage for vectors/arrays
- Other sorts:
  - Heapsort: basically priority-queue sorting
  - Radix sort: doesn't compare keys, uses digits/characters
  - Shell sort: quasi-insertion, fast in practice, non-recursive

CompSci 100e 11.3  Selection sort: summary

- Simple-to-code O(n^2) sort: n^2 comparisons, n swaps (getMinIndex and swap are sketched below)

    void selectSort(String[] a) {
        int len = a.length;
        for (int k = 0; k < len; k++) {
            int mindex = getMinIndex(a, k, len);
            swap(a, k, mindex);
        }
    }

- # comparisons: sum_{k=1..n} k = 1 + 2 + ... + n = n(n+1)/2 = O(n^2)
- Swaps?
- Invariant:
  [diagram: prefix "sorted, won't move, final position" | suffix "?????"]
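The code above relies on two helpers the slide does not show. A minimal sketch of what they might look like; the names getMinIndex and swap come from the slide, but the bodies here are an assumption:

    // Hypothetical helpers matching the calls in selectSort above.
    // Index of the smallest element in a[first..last-1].
    static int getMinIndex(String[] a, int first, int last) {
        int mindex = first;
        for (int j = first + 1; j < last; j++) {
            if (a[j].compareTo(a[mindex]) < 0) {
                mindex = j;
            }
        }
        return mindex;
    }

    // Exchange a[i] and a[j].
    static void swap(String[] a, int i, int j) {
        String tmp = a[i];
        a[i] = a[j];
        a[j] = tmp;
    }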

CompSci 100e 11.4  Insertion sort: summary

- Stable sort, O(n^2), good on nearly sorted vectors
  - Stable sorts maintain the order of equal keys
  - Good for sorting on two criteria: name, then age

    void insertSort(String[] a) {
        int k, loc;
        String elt;
        for (k = 1; k < a.length; ++k) {
            elt = a[k];
            loc = k;
            // shift until spot for elt is found
            while (0 < loc && elt.compareTo(a[loc-1]) < 0) {
                a[loc] = a[loc-1];   // shift right
                loc = loc - 1;
            }
            a[loc] = elt;
        }
    }

  [diagram: prefix "sorted relative to each other" | suffix "?????"]

CompSci 100e 11.5  Bubble sort: summary of a dog

- For completeness you should know about this sort
  - Really, really slow (to run); really, really fast (to code)
  - Can code it to recognize an already-sorted vector (see insertion)
    - Not worth it for bubble sort, much slower than insertion

    void bubbleSort(String[] a) {
        for (int j = a.length - 1; j >= 0; j--) {
            for (int k = 0; k < j; k++) {
                if (a[k].compareTo(a[k+1]) > 0) {
                    swap(a, k, k+1);
                }
            }
        }
    }

- "Bubble" elements down the vector/array
  [diagram: "sorted, in final position" | "?????"]

CompSci 100e 11.6  Summary of simple sorts

- Selection sort has n swaps, good for "heavy" data
  - Moving objects with lots of state, e.g., ...
    - In C or C++ this is an issue
    - In Java everything is a pointer/reference, so swapping is fast since it's pointer assignment
- Insertion sort is good on nearly sorted data; it's stable, it's fast
  - Also the foundation for Shell sort, very fast and non-recursive
    - More complicated to code, but relatively simple, and fast
- Bubble sort is a travesty? But it's fast to code if you know it!
  - Can be parallelized, but on one machine don't go near it (see quotes at end of slides)

CompSci 100e 11.7  Quicksort: fast in practice

- Invented in 1962 by C.A.R. Hoare, who at the time didn't understand recursion
  - Worst case is O(n^2), but avoidable in nearly all cases
  - In 1997 Introsort was published (Musser, introspective sort)
    - Like quicksort in practice, but recognizes when it will be bad and changes to heapsort

    void quick(String[] a, int left, int right) {
        if (left < right) {
            int pivot = partition(a, left, right);
            quick(a, left, pivot-1);
            quick(a, pivot+1, right);
        }
    }

- Recurrence?
  [diagram: partitioned array "<= X" | "X" at pivot index | "> X"]
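For reference, the usual analysis behind the "Recurrence?" prompt above (not spelled out on the slide) distinguishes balanced from maximally unbalanced partitions:

    % Balanced partitions (best/average case):
    T(n) = 2\,T(n/2) + O(n) \;\Longrightarrow\; T(n) = O(n \log n)

    % Maximally unbalanced partitions (e.g., already-sorted input with a[left] as pivot):
    T(n) = T(n-1) + O(n) \;\Longrightarrow\; T(n) = O(n^2)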

CompSci 100e 11.8  Partition code for quicksort

- Easy to develop partition

    int partition(String[] a, int left, int right) {
        String pivot = a[left];
        int k, pIndex = left;
        for (k = left+1; k <= right; k++) {
            if (a[k].compareTo(pivot) <= 0) {
                pIndex++;
                swap(a, k, pIndex);
            }
        }
        swap(a, left, pIndex);
        return pIndex;
    }

- Loop invariant:
  - A statement true each time the loop test is evaluated, used to verify correctness of the loop
- Can swap into a[left] before the loop (see the sketch below)
  - Nearly sorted data still ok

  [diagrams: "what we want" (a[left..pIndex] <= pivot, a[pIndex+1..right] > pivot),
   "what we have" (?????), and the loop invariant (<= pivot | > pivot | ??? with markers pIndex and k)]
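One way to realize "can swap into a[left] before the loop": pick a random index in the range and swap that element to the front, so the partition code above still uses a[left] as the pivot. A sketch; the name randomizePivot and the use of java.util.Random are assumptions, not from the slide:

    import java.util.Random;

    // Before calling partition(a, left, right), move a randomly chosen
    // element into a[left]. The partition code above then works unchanged,
    // and nearly sorted or reverse-sorted input no longer triggers the
    // worst case (with high probability).
    static final Random RNG = new Random();

    static void randomizePivot(String[] a, int left, int right) {
        int r = left + RNG.nextInt(right - left + 1); // uniform in [left, right]
        String tmp = a[left];
        a[left] = a[r];
        a[r] = tmp;
    }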

CompSci 100e 11.9 Tail recursion elimination l If the last statement is a recursive call, recursion can be replaced with iteration  Call cannot be part of an expression  Some compilers do this automatically void foo(int n) void foo2(int n) { if (0 < n) { while (0 < n) { System.out.println(n); System.out.println(n); foo(n-1); n = n-1; } } } l What if print and recursive call switched? What about recursive factorial? return n*factorial(n-1);

CompSci 100e  Merge sort: worst case O(n log n)

- Divide and conquer: a recursive sort
  - Divide the list/vector into two halves
    - Sort each half
    - Merge the sorted halves together
  - What is the complexity of merging two sorted lists?
  - What is the recurrence relation for merge sort as described?
    - T(n) = 2T(n/2) + O(n)
- What is the advantage of an array over a linked list for merge sort?
  - What about merging, is there an advantage to a linked list?
  - An array requires auxiliary storage (or very fancy coding)
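For reference, unrolling the recurrence above (with the O(n) merge cost written as cn) shows where the n log n comes from; this expansion is standard and not on the slide itself:

    \begin{aligned}
    T(n) &= 2T(n/2) + cn \\
         &= 4T(n/4) + 2cn \\
         &= \cdots \\
         &= 2^k\,T(n/2^k) + k\,cn \\
         &= n\,T(1) + cn\log_2 n \qquad (\text{taking } 2^k = n) \\
         &= O(n \log n)
    \end{aligned}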

CompSci 100e  Merge sort: lists or arrays or ...

- Mergesort for arrays

    void mergesort(String[] a, int left, int right) {
        if (left < right) {
            int mid = (right + left) / 2;
            mergesort(a, left, mid);
            mergesort(a, mid+1, right);
            merge(a, left, mid, right);
        }
    }

- What's different when linked lists are used?
  - Do the differences affect complexity? Why?
- How does merge work?

CompSci 100e  Merge for LinkedList

    public static LinkedList<String> merge(LinkedList<String> a,
                                           LinkedList<String> b) {
        LinkedList<String> result = new LinkedList<String>();
        while (a.size() != 0 && b.size() != 0) {
            String as = a.getFirst();
            String bs = b.getFirst();
            if (as.compareTo(bs) <= 0) {
                result.add(a.remove());
            } else {
                result.add(b.remove());
            }
        }
        // what's missing here??
    }
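The slide leaves the ending as a question. One possible way to finish it (an assumption about the intended answer, not shown on the slide): after the loop, at most one list still has elements; those elements are already sorted and no smaller than anything in result, so append them and return.

    // Possible replacement for the "what's missing here??" line above.
    result.addAll(a);   // append whatever remains of a (may be empty)
    result.addAll(b);   // append whatever remains of b (may be empty)
    return result;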

CompSci 100e  Merge for arrays

- Array code for merge isn't pretty, but it's not hard
  - Mergesort itself is elegant

    void merge(String[] a, int left, int middle, int right)
    // pre:  left <= middle <= right,
    //       a[left] <= ... <= a[middle],
    //       a[middle+1] <= ... <= a[right]
    // post: a[left] <= ... <= a[right]

- Needs extra storage; can't easily merge in place
  - Can alternate between two arrays: one is merged into, then the roles swap
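The slide gives only the contract. A sketch of one standard way to meet it, using a temporary auxiliary array; the variable names (aux, i, j, k) are assumptions, not from the slide:

    // Merge a[left..middle] and a[middle+1..right], both already sorted,
    // into sorted order in a[left..right], using an auxiliary array.
    static void merge(String[] a, int left, int middle, int right) {
        String[] aux = new String[right - left + 1];
        int i = left;        // index into the left half
        int j = middle + 1;  // index into the right half
        int k = 0;           // index into aux
        while (i <= middle && j <= right) {
            if (a[i].compareTo(a[j]) <= 0) {   // <= keeps the merge stable
                aux[k++] = a[i++];
            } else {
                aux[k++] = a[j++];
            }
        }
        while (i <= middle) aux[k++] = a[i++]; // copy any leftovers
        while (j <= right)  aux[k++] = a[j++];
        System.arraycopy(aux, 0, a, left, aux.length); // copy back into a
    }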

CompSci 100e  Summary of O(n log n) sorts

- Quicksort is relatively straightforward to code, very fast
  - The worst case is very unlikely, but possible, therefore ...
  - But if lots of elements are equal, performance will be bad
    - One million integers from the range 0 to 10,000
    - How can we change partition to handle this? (see the sketch below)
- Merge sort is stable, it's fast, good for linked lists, harder to code?
  - Worst-case performance is O(n log n); compare quicksort
  - Extra storage for an array/vector
- Heapsort: more complex to code, good worst case, not stable
  - Basically a heap-based priority queue in a vector
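One common answer to "how can we change partition" (my addition, not from the slide) is a 3-way partition that groups keys equal to the pivot together so they are excluded from both recursive calls; this is Dijkstra's "Dutch national flag" scheme, as used in 3-way quicksort. A sketch, with an illustrative name quick3:

    // After the partition loop: a[left..lt-1] < pivot,
    // a[lt..gt] == pivot, a[gt+1..right] > pivot.
    static void quick3(String[] a, int left, int right) {
        if (left >= right) return;
        String pivot = a[left];
        int lt = left, gt = right, k = left + 1;
        while (k <= gt) {
            int cmp = a[k].compareTo(pivot);
            if (cmp < 0)      swap(a, lt++, k++);   // move smaller key left
            else if (cmp > 0) swap(a, k, gt--);     // move larger key right
            else              k++;                  // equal to pivot: leave it
        }
        quick3(a, left, lt - 1);   // keys < pivot
        quick3(a, gt + 1, right);  // keys > pivot
    }

    static void swap(String[] a, int i, int j) {
        String t = a[i]; a[i] = a[j]; a[j] = t;
    }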

CompSci 100e  Sorting in practice

- Rarely will you need to roll your own sort, but when you do ...
  - What are the key issues?
- If you use a library sort, you need to understand the interface
  - In C++ we have the STL
    - The STL has sort, and stable_sort
  - In C the generic sort is complex to use because arrays are ugly
  - In Java guarantees and worst-case behavior are important
    - Why won't quicksort be used?
- Comparators permit sorting criteria to change simply (illustrated below)
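A small illustration of the last point: the same array is sorted by different criteria just by passing a different Comparator, while the sort itself never changes. The Person class, its fields, and the sample data are made up for this example; only Arrays.sort and Comparator come from the standard library:

    import java.util.Arrays;
    import java.util.Comparator;

    class Person {
        String name; int age;
        Person(String name, int age) { this.name = name; this.age = age; }
    }

    class ComparatorDemo {
        public static void main(String[] args) {
            Person[] people = {
                new Person("Ada", 36), new Person("Alan", 41), new Person("Ada", 20)
            };
            // Sort by name, then age, mirroring the "name, then age" example
            // from the insertion-sort slide; swapping in a different Comparator
            // changes the criteria without touching the sort call.
            Arrays.sort(people, new Comparator<Person>() {
                public int compare(Person x, Person y) {
                    int byName = x.name.compareTo(y.name);
                    return byName != 0 ? byName : Integer.compare(x.age, y.age);
                }
            });
        }
    }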

CompSci 100e  Non-comparison-based sorts

- Lower bound: Ω(n log n) for comparison-based sorts (like the searching lower bound)
- Bucket sort / radix sort are not comparison-based; faster asymptotically and in practice
- Sort a vector of ints, all ints in a known range, how? (use extra storage; see the sketch below)
- Radix: examine each digit of the numbers being sorted
  - One pass per digit
  - Sort based on the digit
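A minimal counting-sort sketch for the "ints in a known range" case described above. The method name countingSort and the parameter maxValue are placeholders I introduced, since the slide's concrete range did not survive extraction:

    // Counting sort: counts[v] = how many times v occurs, then rewrite the
    // array in increasing order. O(n + maxValue) time, O(maxValue) extra
    // storage. Assumes every element is in 0..maxValue.
    static void countingSort(int[] a, int maxValue) {
        int[] counts = new int[maxValue + 1];
        for (int v : a) {
            counts[v]++;
        }
        int idx = 0;
        for (int v = 0; v <= maxValue; v++) {
            for (int c = 0; c < counts[v]; c++) {
                a[idx++] = v;
            }
        }
    }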

CompSci 100e  Scoreboard (the slide leaves the cells blank)

    Algorithm    stable / in-place    worst    avg    best    remarks
    selection
    insertion
    shell
    heap
    quick
    merge
    radix

CompSci 100e  A Rose by any other name ... C or Java?

- Why do we use Java in our courses (royal we?)
  - Object oriented
  - Large collection of libraries
  - Safe for advanced programming and for beginners
  - Harder to shoot ourselves in the foot
- Why don't we use C++ (or C)?
  - Standard libraries are weak or non-existent (comparatively)
  - Easy to make mistakes when beginning
  - No GUIs, complicated compilation model

CompSci 100e  Why do we learn other languages?

- Perl, Python, PHP, mySQL, C, C++, Java, Scheme, ML, ...
  - Can we do something different in one language?
    - Depends on what "different" means. In theory: no; in practice: yes
  - What languages do you know? All of them.
  - In what languages are you fluent? None of them.
- In later courses why do we use C or C++?
  - Closer to the machine; we want to understand the machine at many levels, from the abstract to the ridiculous
    - Or at all levels of hardware and software
  - Some problems are better suited to one language
    - What about writing an operating system? Linux?

CompSci 100e  C++ on two slides

- Classes are similar to Java; the compilation model is different
  - Classes have public and private sections/areas
  - Typically the declaration goes in a .h file and the implementation in a .cpp file
    - Separates the interface from the actual implementation
    - Good in theory, hard to get right in practice
  - One .cpp file compiles to one .o file
    - To create an executable, we link .o files with libraries
    - Hopefully someone else takes care of the details
- We #include rather than import; this is a preprocessing step
  - Literally sucks in an entire header file, which can take a while for standard libraries like iostream, string, etc.
  - No abbreviation similar to java.util.*;

CompSci 100e  C++ on a second slide

- We don't have to call new to create objects; they can be created "on the stack"
  - Using new creates memory "on the heap"
  - In C++ we need to do our own garbage collection, or avoid doing so and run out of memory (is this an issue?)
- Vectors are similar to ArrayLists; pointers are similar to arrays
  - Unfortunately, C/C++ equate arrays with memory allocation
  - To access via a pointer, we don't use '.', we use '->'
- Streams are used for I/O; iterators are used to access the begin/end of a collection
  - ifstream and cout correspond to Readers and System.out

CompSci 100e  How do we read a file in C++ and Java?

    // Java
    Scanner s = new Scanner(new File("data.txt"));
    TreeSet<String> set = new TreeSet<String>();
    while (s.hasNext()) {
        String str = s.next();
        set.add(str);
    }
    myWordsAsList = new ArrayList<String>(set);

    // C++
    string word;
    set<string> unique;
    ifstream input("data.txt");
    while (input >> word) {
        unique.insert(word);
    }
    myWordsAsVector = vector<string>(unique.begin(), unique.end());

- What are the similarities? Differences?

CompSci 100e  How do we read a file in C?

    FILE * file = fopen("/u/ola/data/poe.txt", "r");
    char buf[1024];
    char ** words = (char **) malloc(5000 * sizeof(char **));
    int count = 0;
    int k;
    while (fscanf(file, "%s", buf) != EOF) {
        int found = 0;
        // look for word just read
        for (k = 0; k < count; k++) {
            if (strcmp(buf, words[k]) == 0) {
                found = 1;
                break;
            }
        }
        if (!found) { // not found, add to list
            words[count] = (char *) malloc(strlen(buf) + 1);
            strcpy(words[count], buf);
            count++;
        }
    }

- What if there are more than 5000 words? What if a string's length > 1024? What if ... ?
  - What is the complexity of this code?

CompSci 100e  The exam

- Wednesday, April 29, 7pm-10pm in White Lecture Hall
- Open book / open note
- ~40% multiple choice / short answer
- Cumulative
- By Friday, April 24:
  - All grades up (except huff)
  - Test solutions out
  - Grade problems: submit assignment issues
- Final grades up Saturday, May 2
- Available by appointment throughout reading period and exam week
- Help session Wednesday in class