CS100A Lecture 15, 17 22, 29 October 1998 Sorting


Selection sort works by, at each step, placing the next smallest value into position:

    4 8 3 7 2 1    Start
    1 8 3 7 2 4    After step 1
    1 2 3 7 8 4    After step 2
    1 2 3 7 8 4    After step 3
    1 2 3 4 8 7    After step 4
    1 2 3 4 7 8    After step 5

At each step, the value to be placed next is swapped with the value currently in the array element where it belongs.

Sorting algorithm: Selection sort

    // Sort b (into ascending order)
    public static void selectionSort(int[] b) {
        int k= 0;
        // inv: b[0..k-1] is sorted and b[0..k-1] <= b[k..]
        while (k < b.length-1) {
            // Set j so that b[j] is the minimum of b[k..b.length-1]
            int j= k;
            int h= k+1;
            // inv: b[j] is the minimum of b[k..h-1]
            while (h < b.length) {
                if (b[h] < b[j]) j= h;
                h= h+1;
            }
            // Swap b[k] with b[j]
            int t= b[k]; b[k]= b[j]; b[j]= t;
            k= k+1;
        }
    }

The invariants in pictures:

    outer loop:                              inner loop:
      0            k            b.length       k                        h        b.length
    b | sorted, <= |     >=     |            b | b[j] is min. of these  |   ?    |

How fast is selectionSort?

We want a general idea of its speed that is independent of the computer on which it executes, so we don't give times in seconds or milliseconds. The operation that is performed the most is the array comparison b[h] < b[j]; we take the number of such comparisons performed as a measure of speed. Abbreviate b.length as n:

    Iteration of      No. of array comparisons
    outer loop        during this iteration
    0                 n-1
    1                 n-2
    2                 n-3
    ...               ...
    last              1

So the number of comparisons is

    1 + 2 + 3 + ... + (n-2) + (n-1)  =  n*(n-1)/2  =  n^2/2 - n/2

As n gets large, the term n^2 dominates. We say the number of comparisons is proportional to n^2 and that this is a quadratic algorithm.
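The count can be checked empirically. The following sketch (not from the lecture; class and counter names are made up) instruments the inner-loop comparison of selectionSort and checks the total against n*(n-1)/2:

```java
// Counts the array comparisons b[h] < b[j] made by selection sort
// and checks the total against the formula n*(n-1)/2.
public class CountComparisons {
    static long count;  // number of array comparisons performed

    static void selectionSort(int[] b) {
        int k = 0;
        while (k < b.length - 1) {
            int j = k;
            int h = k + 1;
            while (h < b.length) {
                count++;                  // one array comparison
                if (b[h] < b[j]) j = h;
                h = h + 1;
            }
            int t = b[k]; b[k] = b[j]; b[j] = t;
            k = k + 1;
        }
    }

    public static void main(String[] args) {
        int n = 100;
        int[] b = new int[n];
        for (int i = 0; i < n; i++) b[i] = n - i;  // reverse order
        count = 0;
        selectionSort(b);
        // selection sort makes n*(n-1)/2 comparisons regardless of input order
        if (count != (long) n * (n - 1) / 2) throw new AssertionError(count);
        System.out.println(count);  // 4950 for n = 100
    }
}
```

Note that the count is the same for any input of size n; only the number of swaps-in-place varies.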

Insertion sort -- another algorithm to sort b[0..]

    int k= 0;
    // inv: b[0..k-1] is sorted; b[k..] is unchanged
    while (k < b.length) {
        // Place b[k] in its sorted position in b[0..k]
        int temp= b[k];   // Save b[k] in temp
        int h= k;
        // inv: b[0..k] is sorted except for position b[h];
        //      temp <= b[h+1..k]
        while (h != 0 && b[h-1] >= temp) {
            b[h]= b[h-1];
            h= h-1;
        }
        // Note: b[0..h-1] <= temp <= b[h+1..k]
        b[h]= temp;
        k= k+1;
    }

The invariant in a picture:

      0                                        k              b.length
    b | values originally in b[0..k-1], sorted |  unchanged   |

The number of array comparisons is proportional to n^2.
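The fragment above can be wrapped in a method and run on the array from the selection-sort example (a sketch; the class and method names are ours, not the lecture's):

```java
// Insertion sort as a complete method, following the fragment above.
public class InsertionSortDemo {
    static void insertionSort(int[] b) {
        int k = 0;
        // inv: b[0..k-1] is sorted
        while (k < b.length) {
            int temp = b[k];   // save b[k]
            int h = k;
            // shift values >= temp one place right to make room
            while (h != 0 && b[h - 1] >= temp) {
                b[h] = b[h - 1];
                h = h - 1;
            }
            b[h] = temp;
            k = k + 1;
        }
    }

    public static void main(String[] args) {
        int[] b = {4, 8, 3, 7, 2, 1};
        insertionSort(b);
        if (!java.util.Arrays.equals(b, new int[]{1, 2, 3, 4, 7, 8}))
            throw new AssertionError(java.util.Arrays.toString(b));
        System.out.println(java.util.Arrays.toString(b));  // [1, 2, 3, 4, 7, 8]
    }
}
```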

Algorithm partition

We are going to develop a sorting algorithm called quicksort. An important piece of it is an algorithm called partition, which starts with an array segment that looks like this (x is the initial value in b[h]):

    (0)      h   h+1                    k
          b  | x |          ?           |

and permutes its elements so that it looks like this:

    (1)      h           j              k
          b  |   <= x    | x |   > x    |

In rearranging the array in this fashion, the order of the final values of b[h..j-1] and b[j+1..k] doesn't matter. We will find it advantageous to use the following loop invariant:

    (2)      h   h+1        i       j           k
          b  | x |   <= x   |   ?   |    > x    |

When the loop terminates, the array segment looks as shown below, and the last step is to swap b[h] and b[j]:

    (3)      h   h+1            j           k
          b  | x |     <= x     |    > x    |

Algorithm partition (continued)

    // b[h..k] contains at least 2 elements. Using the name x
    // for the original value in b[h], permute b[h..k] so that
    // it looks like (1) on the previous slide and return j.
    public static int partition(int[] b, int h, int k) {
        int x= b[h];
        int i= h;
        int j= k;
        // inv: (2) on the previous slide
        while (i <= j) {
            if (b[i] <= x) i= i+1;
            else if (b[j] > x) j= j-1;
            else { // Swap b[i] and b[j]
                int t= b[i]; b[i]= b[j]; b[j]= t;
            }
        }
        // b[h..k] looks like (3) on the previous slide
        // Swap b[h] and b[j]
        int s= b[h]; b[h]= b[j]; b[j]= s;
        return j;
    }
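As a sanity check, partition can be run on the 13-element array used later in the lecture's quicksort trace, verifying the postcondition b[h..j-1] <= x, b[j] == x, b[j+1..k] > x (a sketch; the class name is ours):

```java
import java.util.Arrays;

// Runs partition on the lecture's example array and checks the
// postcondition: b[h..j-1] <= x, b[j] == x, b[j+1..k] > x.
public class PartitionDemo {
    static int partition(int[] b, int h, int k) {
        int x = b[h];        // pivot: the original value in b[h]
        int i = h, j = k;
        // inv: b[h+1..i-1] <= x, b[i..j] unknown, b[j+1..k] > x
        while (i <= j) {
            if (b[i] <= x) i = i + 1;
            else if (b[j] > x) j = j - 1;
            else { int t = b[i]; b[i] = b[j]; b[j] = t; }
        }
        // now b[h+1..j] <= x and b[j+1..k] > x; put the pivot in place
        int s = b[h]; b[h] = b[j]; b[j] = s;
        return j;
    }

    public static void main(String[] args) {
        int[] b = {3, 6, 8, 2, 7, 1, 9, 4, 6, 8, 7, 5, 4};
        int j = partition(b, 0, 12);
        if (j != 2 || b[j] != 3) throw new AssertionError(j);
        for (int m = 0; m < j; m++)
            if (b[m] > b[j]) throw new AssertionError(m);
        for (int m = j + 1; m <= 12; m++)
            if (b[m] <= b[j]) throw new AssertionError(m);
        System.out.println(j + " " + Arrays.toString(b));
        // prints: 2 [2, 1, 3, 8, 7, 6, 9, 4, 6, 8, 7, 5, 4]
    }
}
```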

Algorithm quicksort -- recursive version

Suppose we want to sort b[h..k]. Let's use algorithm partition to make it look like this, where x is the value initially in b[h]:

      h           j              k
    b |   <= x    | x |   > x    |

What remains to be done in order to sort b[h..k]? Two things:

    1. Sort b[h..j-1]
    2. Sort b[j+1..k]

We present on the next slide a recursive version of quicksort -- a version that calls itself. We don't expect you to fully comprehend it yet, because recursion -- the idea of a method calling itself -- is new to you. Subsequently we present a non-recursive version. Quicksort sorts very small sections -- size 2 or less -- directly.

Recursive quicksort

    // Sort b[h..k]
    public static void quicksort(int[] b, int h, int k) {
        if (k+1-h <= 1) return;
        if (k+1-h == 2) { // b[h..k] has exactly 2 elements
            if (b[h] <= b[k]) return;
            // Swap b[h] and b[k]
            int t= b[h]; b[h]= b[k]; b[k]= t;
            return;
        }
        int j= partition(b, h, k);
        // b[h..k] looks like:
        //     h           j              k
        //   b |   <= x    | x |   > x    |
        // Sort b[h..j-1]
        quicksort(b, h, j-1);   // recursive call
        // Sort b[j+1..k]
        quicksort(b, j+1, k);   // recursive call
    }
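Combined with partition, the recursive version can be run on the example array that the next slides trace by hand (a sketch; the class name is ours):

```java
import java.util.Arrays;

// Recursive quicksort plus partition, run on the lecture's example array.
public class QuicksortDemo {
    static int partition(int[] b, int h, int k) {
        int x = b[h], i = h, j = k;
        while (i <= j) {
            if (b[i] <= x) i = i + 1;
            else if (b[j] > x) j = j - 1;
            else { int t = b[i]; b[i] = b[j]; b[j] = t; }
        }
        int s = b[h]; b[h] = b[j]; b[j] = s;
        return j;
    }

    static void quicksort(int[] b, int h, int k) {
        if (k + 1 - h <= 1) return;
        if (k + 1 - h == 2) {              // exactly 2 elements
            if (b[h] <= b[k]) return;
            int t = b[h]; b[h] = b[k]; b[k] = t;
            return;
        }
        int j = partition(b, h, k);
        quicksort(b, h, j - 1);            // sort b[h..j-1]
        quicksort(b, j + 1, k);            // sort b[j+1..k]
    }

    public static void main(String[] args) {
        int[] b = {3, 6, 8, 2, 7, 1, 9, 4, 6, 8, 7, 5, 4};
        quicksort(b, 0, b.length - 1);
        if (!Arrays.equals(b, new int[]{1, 2, 3, 4, 4, 5, 6, 6, 7, 7, 8, 8, 9}))
            throw new AssertionError(Arrays.toString(b));
        System.out.println(Arrays.toString(b));
        // prints: [1, 2, 3, 4, 4, 5, 6, 6, 7, 7, 8, 8, 9]
    }
}
```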

Execution of quicksort on an array

    quicksort(b, 0, 12);

    start:            b:  3 6 8 2 7 1 9 4 6 8 7 5 4

    frame:            b | h 0 | k 12 | j ?

    after partition:  b:  2 1 3 8 7 6 9 4 6 8 7 5 4
                              ^
                              j = 2

    frame:            b | h 0 | k 12 | j 2

At this point, the call quicksort(b,h,j-1); has to be executed. If we follow the rules for executing a method call, everything works out fine!

    1. Draw a frame for the call (place it where?)
    2. Write in the parameters and local variables
    3. Assign arguments to parameters
    4. Execute the method body
    5. Erase the frame for the call

Execution of quicksort on an array (continued)

Here's the state of affairs just after the second frame has been constructed and the arguments have been assigned to the parameters. We call the first one frame0 and the new one frame1.

    b:  2 1 3 8 7 6 9 4 6 8 7 5 4

    frame0:  b | h 0 | k 12 | j 2
    frame1:  b | h 0 | k 1  | j ?

Executing the method body, using frame1, results in

    b:  1 2 3 8 7 6 9 4 6 8 7 5 4

    frame0:  b | h 0 | k 12 | j 2
    frame1:  b | h 0 | k 1  | j ?

Execution of quicksort on an array (continued)

frame1 is now deleted, resulting in

    b:  1 2 3 8 7 6 9 4 6 8 7 5 4

    frame0:  b | h 0 | k 12 | j 2

The call quicksort(b,h,j-1); is complete. Now the call quicksort(b,j+1,k); is executed. Thus another frame will be constructed, to be erased when the call is completed; when that call completes, the situation is:

    b:  1 2 3 4 4 5 6 6 7 7 8 8 9

    frame0:  b | h 0 | k 12 | j 2

The call is completed, so the frame disappears:

    b:  1 2 3 4 4 5 6 6 7 7 8 8 9

You can see that the recursive calls of quicksort execute correctly by executing the algorithm yourself, carefully using the rules for executing method calls. This may explain why we want you to know exactly how to execute method calls.

When trying to understand a method with a recursive call, or to write a recursive call yourself, don't go through the exercise of executing it. Instead, do what we did in quicksort. In the situation

      h           j              k
    b |   <= x    | x |   > x    |

we see that the array can be sorted simply by sorting the two array segments b[h..j-1] and b[j+1..k]. We can sort these two segments by calling any sorting method we wish, including the one we are currently writing. If we call the method we are currently writing, we are using a recursive call.

In the worst case, quicksort makes on the order of n^2 array comparisons. In the average case, n * log(n).

Iterative version of quicksort -- no recursion

After partitioning the array, it looks like:

      h           j              k
    b |   <= x    | x |   > x    |

There are now two sections to sort, b[h..j-1] and b[j+1..k], and while one is being sorted, it must be remembered to sort the other. Sorting b[h..j-1] will result in partitioning and the creation of two other sections to sort; these must also be "remembered".

    // An instance represents the bounds f and l of an
    // array section b[f..l] (for some array)
    public class Bounds {
        public int f;
        public int l;
        // Constructor: an instance with f= fp and l= lp
        public Bounds(int fp, int lp) { f= fp; l= lp; }
    }

    // Sort array section b[h..k]
    public static void Quicksort(int[] b, int h, int k) {
        Bounds[] c= new Bounds[k+1-h];
        c[0]= new Bounds(h, k);
        int i= 1;
        // inv: b[h..k] is sorted iff all its subsegments
        //      defined by elements of c[0..i-1] are sorted
        while (i > 0) {
            i= i-1;
            int f= c[i].f;
            int l= c[i].l;
            // Process segment b[f..l]
            if (l-f == 1) { // b[f..l] has two elements
                if (b[f] > b[l]) {
                    // Swap b[f] and b[l]
                    int t= b[f]; b[f]= b[l]; b[l]= t;
                }
            }
            else if (l-f > 1) { // b[f..l] has > 2 elements
                // Add bounds of b[f..j-1] and b[j+1..l] to c
                int j= partition(b, f, l);
                c[i]= new Bounds(f, j-1);
                i= i+1;
                c[i]= new Bounds(j+1, l);
                i= i+1;
            }
        }
    }

How big can array c get?

Let b[h..k] have n values, and let b be already sorted. At each step, b[f..j-1] would be empty and b[j+1..l] would have all but one of the elements. After 3 loop iterations we would have:

    c[0] represents a segment of 0 elements
    c[1] represents a segment of 0 elements
    c[2] represents a segment of 0 elements
    c[3] represents a segment of n-3 elements

In the worst case, array c needs almost n array elements!

Fix: put the larger of the two segments b[f..j-1], b[j+1..l] on c first, then the smaller. Then we can show that if c[0] represents a segment of m elements, c looks like:

    c[0]   represents m elements
    c[1]   represents < m/2 elements
    c[2]   represents < m/4 elements
    c[3]   represents < m/8 elements
    ...
    c[i-1] represents < m/2^(i-1) elements

so c has at most 1 + log m elements. So c has at most 1 + log n elements. Much better!

Changes to ensure that array c never gets bigger than about 1 + log n. If the array has 2^50 elements, array c need have no more than 50 elements.

1. Change the allocation of c to

    Bounds[] c= new Bounds[50];

2. Change the implementation of "Add bounds ..." to the following:

    // Add bounds of b[f..j-1] and b[j+1..l] to c
    // -- put the larger segment on first
    int j= partition(b, f, l);
    if (j-f > l-j) {
        c[i]= new Bounds(f, j-1);
        i= i+1;
        c[i]= new Bounds(j+1, l);
        i= i+1;
    } else {
        c[i]= new Bounds(j+1, l);
        i= i+1;
        c[i]= new Bounds(f, j-1);
        i= i+1;
    }
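The claim can be checked by running the iterative version with the larger-first change and recording the high-water mark of the stack index i. This is a self-contained sketch (class, field, and bound-check names are ours); the asserted bound 2*log2(n) + 2 is deliberately looser than the slide's 1 + log n:

```java
import java.util.Arrays;
import java.util.Random;

// Iterative quicksort, larger segment pushed first; tracks the
// high-water mark of the stack index i to check it stays logarithmic.
public class IterativeQuicksort {
    static class Bounds {
        int f, l;
        Bounds(int f, int l) { this.f = f; this.l = l; }
    }

    static int maxI;  // high-water mark of the stack index

    static int partition(int[] b, int h, int k) {
        int x = b[h], i = h, j = k;
        while (i <= j) {
            if (b[i] <= x) i = i + 1;
            else if (b[j] > x) j = j - 1;
            else { int t = b[i]; b[i] = b[j]; b[j] = t; }
        }
        int s = b[h]; b[h] = b[j]; b[j] = s;
        return j;
    }

    static void quicksort(int[] b, int h, int k) {
        Bounds[] c = new Bounds[64];   // about 1 + log2 n entries suffice
        c[0] = new Bounds(h, k);
        int i = 1;
        maxI = 1;
        while (i > 0) {
            i = i - 1;
            int f = c[i].f, l = c[i].l;
            if (l - f == 1) {          // two elements: sort directly
                if (b[f] > b[l]) { int t = b[f]; b[f] = b[l]; b[l] = t; }
            } else if (l - f > 1) {    // > 2 elements: partition and push
                int j = partition(b, f, l);
                // larger segment first, smaller on top
                if (j - f > l - j) {
                    c[i] = new Bounds(f, j - 1); i = i + 1;
                    c[i] = new Bounds(j + 1, l); i = i + 1;
                } else {
                    c[i] = new Bounds(j + 1, l); i = i + 1;
                    c[i] = new Bounds(f, j - 1); i = i + 1;
                }
                if (i > maxI) maxI = i;
            }
        }
    }

    public static void main(String[] args) {
        int n = 1000;
        int[] b = new Random(42).ints(n, 0, 1000000).toArray();
        int[] expected = b.clone();
        Arrays.sort(expected);
        quicksort(b, 0, n - 1);
        if (!Arrays.equals(b, expected)) throw new AssertionError();
        // log2(1000) < 10, so the stack stays far below n
        if (maxI > 2 * 10 + 2) throw new AssertionError(maxI);
        System.out.println("sorted; max stack size = " + maxI);
    }
}
```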

Exponentiation

    // Given b >= 0, return a^b.
    public static long exp(long a, long b) {
        long z= 1;
        long x= a;
        long y= b;
        // inv: z * x^y = a^b and y >= 0
        while (y > 0) {
            if (y % 2 == 0) { x= x*x; y= y/2; }
            else            { z= z*x; y= y-1; }
        }
        return z;
    }

Think of the binary representation of y. E.g. if y = 7, its binary representation is 111. One iteration changes the right-most bit from 1 to 0; the next one deletes the bit. The algorithm looks at each bit at most twice. If y = 2^10, in binary y is a 1 followed by 10 zeros: 10000000000. The loop takes at most 2*10 + 1 iterations. A logarithmic algorithm.
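The method above can be checked on a few small cases (a sketch; the class name is ours):

```java
// Logarithmic exponentiation as above, with a few spot checks.
public class ExpDemo {
    // Given b >= 0, return a^b.
    static long exp(long a, long b) {
        long z = 1, x = a, y = b;
        // inv: z * x^y == a^b and y >= 0
        while (y > 0) {
            if (y % 2 == 0) { x = x * x; y = y / 2; }
            else            { z = z * x; y = y - 1; }
        }
        return z;
    }

    public static void main(String[] args) {
        if (exp(2, 10) != 1024) throw new AssertionError();
        if (exp(3, 7) != 2187) throw new AssertionError();
        if (exp(5, 0) != 1) throw new AssertionError();
        System.out.println(exp(2, 10));  // 1024
    }
}
```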

Why did I get a Christmas card on Halloween?

    decimal   binary   octal   hexadecimal
       0          0      0     0
       1          1      1     1
       2         10      2     2
       3         11      3     3
       4        100      4     4
       5        101      5     5
       6        110      6     6
       7        111      7     7
       8       1000     10     8
       9       1001     11     9
      10       1010     12     A
      11       1011     13     B
      12       1100     14     C
      13       1101     15     D
      14       1110     16     E
      15       1111     17     F
      16      10000     20    10
      17      10001     21    11
      18      10010     22    12
      19      10011     23    13
      20      10100     24    14
      21      10101     25    15
      22      10110     26    16
      23      10111     27    17
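The table (and the riddle: octal 31 is decimal 25, so OCT 31 = DEC 25) can be checked with Java's standard base-conversion methods Integer.toString(n, base) and Integer.parseInt(s, base) -- a sketch, noting that toString prints hex digits in lowercase:

```java
// Prints rows of the base table with Integer.toString(n, base)
// and checks the riddle: octal 31 is decimal 25 (OCT 31 = DEC 25).
public class Bases {
    public static void main(String[] args) {
        for (int n = 0; n <= 23; n++)
            System.out.printf("%3d %6s %3s %3s%n",
                n,
                Integer.toString(n, 2),    // binary
                Integer.toString(n, 8),    // octal
                Integer.toString(n, 16));  // hexadecimal (lowercase)
        if (Integer.parseInt("31", 8) != 25) throw new AssertionError();
        System.out.println("OCT 31 = DEC " + Integer.parseInt("31", 8));  // OCT 31 = DEC 25
    }
}
```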