Searching and Sorting Hint at Asymptotic Complexity Lecture 10 CS2110 – Fall 2016

Miscellaneous

- A3 due Monday night. Group early! Only 325 views of the Piazza A3 FAQ yesterday morning. Everyone should look at it.
- Pinned Piazza note on supplemental study material (@281). Contains material that may help you study certain topics. It also talks about how to study.
- Prelim next Thursday evening! Make sure you complete P1Conflict on the CMS if you have to. See the Exams page of the course website for details. 58 people so far did P1Conflict!
- Next week's recitation: review for the Prelim.
- Sunday, 1-3PM, B11 Kimball: review session. Holds about 220 people.

Developing methods

We use Eclipse to show the development of A2 function evaluate. Here are the important points to take away from it:

- If similar code will appear in two or more places, consider writing a method to avoid that duplication.
- If you introduce a new method, write a specification for it!
- Before writing a loop, write a loop invariant for it.
- Have a loop exploit the structure of the data it processes.
- Don't expect your first attempt to be perfect. Just as you rewrite and rewrite an essay, we rewrite programs.

Search as in problem set: b is sorted

pre:  b[0..b.length-1] is all ?
post: b[0..h] <= v and b[h+1..b.length-1] > v
inv:  b[0..h] <= v, b[h+1..t-1] is ?, b[t..b.length-1] > v

Methodology: draw the invariant as a combination of the pre- and postcondition, then develop the loop using the 4 loopy questions. Practice doing this!

h= -1; t= b.length;
while (h+1 != t) {
    if (b[h+1] <= v) h= h+1;
    else t= h+1;
}

Search as in problem set: b is sorted

h= -1; t= b.length;
while (h+1 != t) {
    if (b[h+1] <= v) h= h+1;
    else t= h+1;
}

If b[0] > v: one iteration. If b[b.length-1] <= v: b.length iterations. Worst case: time is proportional to the size of b.

Since b is sorted, we can instead cut the ? segment in half each iteration, as in a dictionary search.
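Assembled into a single compilable method, the linear version might read as follows; a minimal sketch assuming int elements (the name linearSearch is ours, not the slides'):

/** Return the largest h such that b[0..h] <= v (h is -1 if b[0] > v).
 *  Precondition: b is sorted in ascending order. Worst case O(n). */
public static int linearSearch(int[] b, int v) {
    int h= -1;
    int t= b.length;
    // inv: b[0..h] <= v and b[t..b.length-1] > v
    while (h+1 != t) {
        if (b[h+1] <= v) h= h+1;
        else t= h+1;
    }
    return h;
}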

Search as in problem set: b is sorted

pre:  b[0..b.length-1] is all ?
post: b[0..h] <= v and b[h+1..b.length-1] > v
inv:  b[0..h] <= v, b[h+1..t-1] is ?, b[t..b.length-1] > v

Instead of shrinking the ? segment by one element per iteration, pick a midpoint e with h < e < t and test b[e]:

h= -1; t= b.length;
while (h != t-1) {
    int e= (h + t) / 2;   // h < e < t
    if (b[e] <= v) h= e;  // now b[0..e] <= v
    else t= e;            // now b[e..b.length-1] > v
}

Binary search: an O(log n) algorithm

inv: b[0..h] <= v, b[h+1..t-1] is ? (size n at the start), b[t..b.length-1] > v

h= -1; t= b.length;
while (h != t-1) {
    int e= (h+t)/2;
    if (b[e] <= v) h= e;
    else t= e;
}

Each iteration cuts the size of the ? segment in half. If n = 2^k, it takes about k iterations. Time taken is proportional to k, that is, to log n: a logarithmic algorithm. We write this as O(log n) [notation explained next lecture].
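A complete, compilable version of the same algorithm, as a sketch under the same assumptions (the name binarySearch is ours):

/** Return the largest h such that b[0..h] <= v (h is -1 if b[0] > v).
 *  Precondition: b is sorted in ascending order. O(log n) iterations. */
public static int binarySearch(int[] b, int v) {
    int h= -1;
    int t= b.length;
    // inv: b[0..h] <= v and b[t..b.length-1] > v
    while (h != t-1) {
        int e= (h+t)/2;  // h < e < t, so the ? segment shrinks each iteration
        if (b[e] <= v) h= e;
        else t= e;
    }
    return h;
}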

Looking at execution speed

Process an array of size n. [Chart: number of operations executed versus size n, with curves for n*n ops, 2n+2 ops, n+2 ops, n ops, and constant time.]

2n+2, n+2, and n are all "order n": O(n), called linear in n, proportional to n.

InsertionSort

pre:  b[0..b.length-1] is all ?
post: b[0..b.length-1] is sorted
inv:  b[0..i-1] is sorted, b[i..b.length-1] is ?

A loop that processes elements of an array in increasing order has this invariant: b[0..i-1] processed, b[i..b.length-1] ?.

for (int i= 0; i < b.length; i= i+1) {
    maintain invariant
}

Each iteration, i= i+1. How do we keep the invariant true?

inv: b[0..i-1] is sorted, b[i..b.length-1] is ?

e.g. before: 2 5 5 5 7 | 3 ?   (b[i] is the 3)
     after:  2 3 5 5 5 7 | ?

Push b[i] down to its sorted position in b[0..i], then increase i. This will take time proportional to the number of swaps needed.

What to do in each iteration?

inv: b[0..i-1] is sorted, b[i..b.length-1] is ?

e.g. push b[i] = 3 down to its sorted position in b[0..i], then increase i. The loop body keeps inv true before and after:

2 5 5 5 7 3 ?
2 5 5 5 3 7 ?
2 5 5 3 5 7 ?
2 5 3 5 5 7 ?
2 3 5 5 5 7 ?

InsertionSort

// sort b[], an array of int
// inv: b[0..i-1] is sorted
for (int i= 0; i < b.length; i= i+1) {
    Push b[i] down to its sorted position in b[0..i]
}

Note the English statement in the body. It is an abstraction: it says what to do, not how. This is the best way to present the algorithm, and we expect you to present it this way when asked. Later we show how to implement that statement with a loop.

Many people sort cards this way. It works well when the input is nearly sorted.

InsertionSort

// Q: b[0..i-1] is sorted
// Push b[i] down to its sorted position in b[0..i]
// R: b[0..i] is sorted

Use the invariant P: b[0..i] is sorted except that b[k] may be < b[k-1].

e.g. 2 5 3 5 5 7 ?   (here i = 5 and k = 2: the 3 is still being pushed down)

Answer the loopy questions: how to start? when to stop? how to make progress? how to maintain the invariant?

int k= i;                          // start: P holds with k = i
while (k > 0 && b[k] < b[k-1]) {   // stop: k = 0 or b[k-1] <= b[k]
    Swap b[k] and b[k-1];          // maintain invariant P
    k= k-1;                        // progress
}

How to write nested loops

Present the algorithm like this:

// sort b[], an array of int
// inv: b[0..i-1] is sorted
for (int i= 0; i < b.length; i= i+1) {
    Push b[i] down to its sorted position in b[0..i]
}

If you are going to show the implementation, put in WHAT IT DOES as a comment:

// sort b[], an array of int
// inv: b[0..i-1] is sorted
for (int i= 0; i < b.length; i= i+1) {
    // Push b[i] down to its sorted
    // position in b[0..i]
    int k= i;
    while (k > 0 && b[k] < b[k-1]) {
        swap b[k] and b[k-1];
        k= k-1;
    }
}

InsertionSort

// sort b[], an array of int
// inv: b[0..i-1] is sorted
for (int i= 0; i < b.length; i= i+1) {
    Push b[i] down to its sorted position in b[0..i]
}

Let n = b.length. Pushing b[i] down can take i swaps, so the worst case takes 1 + 2 + 3 + … + (n-1) = (n-1)*n/2 swaps.

Worst-case: O(n^2) (reverse-sorted input)
Best-case: O(n) (sorted input)
Expected case: O(n^2)

O(f(n)): takes time proportional to f(n). Formal definition later.
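Putting the nested loops together, a runnable sketch of the full method might read (the name insertionSort and the int[] element type are our choices, not the slides'):

/** Sort b, an array of int, in place using insertion sort.
 *  Best case O(n) (sorted input); worst case O(n^2). */
public static void insertionSort(int[] b) {
    // inv: b[0..i-1] is sorted
    for (int i= 0; i < b.length; i= i+1) {
        // Push b[i] down to its sorted position in b[0..i]
        int k= i;
        while (k > 0 && b[k] < b[k-1]) {
            int t= b[k]; b[k]= b[k-1]; b[k-1]= t;  // swap b[k] and b[k-1]
            k= k-1;
        }
    }
}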

SelectionSort

pre:  b[0..b.length-1] is all ?
post: b[0..b.length-1] is sorted
inv:  b[0..i-1] is sorted, and b[0..i-1] <= b[i..b.length-1]
      (note the additional term in the invariant)

e.g. b = 1 2 3 4 5 6 | 9 9 9 7 8 6 9   (i just after the first 6)

How do we keep the invariant true while making progress? Increasing i by 1 keeps inv true only if b[i] is the minimum of b[i..].

SelectionSort

// sort b[], an array of int
// inv: b[0..i-1] sorted AND
//      b[0..i-1] <= b[i..]
for (int i= 0; i < b.length; i= i+1) {
    int m= index of minimum of b[i..];
    Swap b[i] and b[m];
}

Each iteration swaps the minimum value of section b[i..] into b[i]: b[0..i-1] is sorted and holds the smaller values, b[i..] holds the larger values.

Another common way for people to sort cards.

Runtime:
Worst-case O(n^2)
Best-case O(n^2)
Expected-case O(n^2)
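A compilable sketch of the algorithm with the English statements filled in (the name selectionSort is ours, not the slides'):

/** Sort b, an array of int, in place using selection sort.
 *  O(n^2) in all cases: finding the minimum of b[i..] is a full scan. */
public static void selectionSort(int[] b) {
    // inv: b[0..i-1] is sorted and b[0..i-1] <= b[i..]
    for (int i= 0; i < b.length; i= i+1) {
        // Set m to the index of the minimum of b[i..]
        int m= i;
        for (int j= i+1; j < b.length; j= j+1) {
            if (b[j] < b[m]) m= j;
        }
        // Swap b[i] and b[m]
        int t= b[i]; b[i]= b[m]; b[m]= t;
    }
}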

Swapping b[i] and b[m]

// Swap b[i] and b[m]
int t= b[i];
b[i]= b[m];
b[m]= t;

Partition algorithm of quicksort

Swap array values around until b[h..k] looks like this:

pre:  b[h] = x, b[h+1..k] is ?        (x is called the pivot)
post: b[h..j-1] <= x, b[j] = x, b[j+1..k] >= x

Example, with pivot 20:

before partition: 20 31 24 19 45 56 4 20 5 72 14 99
after partition:  19 4 5 14 | 20 | 31 24 45 56 20 72 99

Not yet sorted: the values left of the pivot can be in any order, and the values right of the pivot can be in any order. The second 20 could be in the other partition.

Partition algorithm

pre:  b[h] = x, b[h+1..k] is ?
post: b[h..j-1] <= x, b[j] = x, b[j+1..k] >= x

Combine pre and post to get an invariant; it needs at least 4 sections:

inv: b[h..j-1] <= x, b[j] = x, b[j+1..t] is ?, b[t+1..k] >= x

Partition algorithm

inv: b[h..j-1] <= x, b[j] = x, b[j+1..t] is ?, b[t+1..k] >= x

Initially, with j = h and t = k, this diagram looks like the start diagram. Terminate when j = t, so the ? segment is empty and the diagram looks like the result diagram.

j= h; t= k;
while (j < t) {
    if (b[j+1] <= b[j]) {
        Swap b[j+1] and b[j]; j= j+1;
    } else {
        Swap b[j+1] and b[t]; t= t-1;
    }
}

Takes linear time: O(k+1-h).
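Filling in the swaps gives a compilable sketch; taking b[h] as the pivot x follows the precondition above, and the method name matches the one the QuickSort slides call:

/** Partition b[h..k] around the pivot x = b[h]: permute b[h..k] and
 *  return j so that b[h..j-1] <= b[j] = x <= b[j+1..k].
 *  Linear time: O(k+1-h). */
public static int partition(int[] b, int h, int k) {
    int j= h; int t= k;
    // inv: b[h..j-1] <= x, b[j] = x, b[j+1..t] is ?, b[t+1..k] >= x
    while (j < t) {
        if (b[j+1] <= b[j]) {
            int tmp= b[j+1]; b[j+1]= b[j]; b[j]= tmp;  // swap b[j+1] and b[j]
            j= j+1;
        } else {
            int tmp= b[j+1]; b[j+1]= b[t]; b[t]= tmp;  // swap b[j+1] and b[t]
            t= t-1;
        }
    }
    return j;
}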

QuickSort procedure

/** Sort b[h..k]. */
public static void QS(int[] b, int h, int k) {
    if (b[h..k] has < 2 elements) return;   // base case
    // partition does the partition algorithm and
    // returns the position j of the pivot
    int j= partition(b, h, k);
    // We know b[h..j-1] <= b[j] <= b[j+1..k]
    // Sort b[h..j-1] and b[j+1..k]
    QS(b, h, j-1);
    QS(b, j+1, k);
}

QuickSort

Quicksort was developed by Sir Tony Hoare (knighted by the Queen of England for his contributions to education and CS), now 81 years old. He developed Quicksort in 1958 but could not explain it to his colleague, so he gave up on it. Later, he saw a draft of the new language Algol 58 (which became Algol 60). It had recursive procedures, the first time in a procedural programming language. "Ah!" he said. "I know how to write it better now." Fifteen minutes later, his colleague also understood it.

Worst case quicksort: pivot always smallest value

partitioning at depth 0: x0 | >= x0          (j = 0)
partitioning at depth 1: x0 | x1 | >= x1
partitioning at depth 2: x0 | x1 | x2 | >= x2, and so on: n levels deep.

/** Sort b[h..k]. */
public static void QS(int[] b, int h, int k) {
    if (b[h..k] has < 2 elements) return;
    int j= partition(b, h, k);
    QS(b, h, j-1);
    QS(b, j+1, k);
}

Best case quicksort: pivot always middle value

depth 0: 1 segment of size ~n to partition:    <= x0 | x0 | >= x0
depth 1: 2 segments of size ~n/2 to partition: <= x1 | x1 | >= x1 | x0 | <= x2 | x2 | >= x2
depth 2: 4 segments of size ~n/4 to partition, and so on.

Max depth: about log n. Time to partition on each level: ~n. Total time: O(n log n).

Average time for Quicksort is also n log n, but that is a difficult calculation.

QuickSort procedure

/** Sort b[h..k]. */
public static void QS(int[] b, int h, int k) {
    if (b[h..k] has < 2 elements) return;
    int j= partition(b, h, k);
    // We know b[h..j-1] <= b[j] <= b[j+1..k]
    // Sort b[h..j-1] and b[j+1..k]
    QS(b, h, j-1);
    QS(b, j+1, k);
}

Worst-case time: quadratic. Average-case time: O(n log n).
Worst-case space: O(n), since the depth of recursion can be n. Can rewrite it to have space O(log n).
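With the base case made concrete and the partition sketch from above, a compilable version might read:

/** Sort b[h..k] using quicksort. Average case O(n log n);
 *  worst case O(n^2). Uses the partition method sketched earlier. */
public static void QS(int[] b, int h, int k) {
    if (k+1-h < 2) return;  // base case: fewer than 2 elements
    int j= partition(b, h, k);
    // b[h..j-1] <= b[j] <= b[j+1..k]
    QS(b, h, j-1);
    QS(b, j+1, k);
}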

Partition algorithm: key issue

How to choose a pivot? The ideal pivot is the median, since it splits the array in half. But computing the median of an unsorted array is O(n) and quite complicated.

Popular heuristics (a sketch of the third one follows):
- Use the first array value (not good)
- Use the middle array value
- Use the median of the first, middle, and last values. GOOD!
- Choose a random element
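As an illustration of the median-of-first-middle-last heuristic, here is a sketch; the helper name medianOfThreeToFront is hypothetical, not from the slides. Call it on b[h..k] just before partition so the median sample becomes the pivot b[h]:

/** Move the median of b[h], b[m], b[k] (m the middle index) into b[h],
 *  so that the subsequent partition uses it as the pivot. */
public static void medianOfThreeToFront(int[] b, int h, int k) {
    int m= (h+k)/2;
    // Order the three sampled values so that b[h] <= b[m] <= b[k]
    if (b[m] < b[h]) { int t= b[m]; b[m]= b[h]; b[h]= t; }
    if (b[k] < b[h]) { int t= b[k]; b[k]= b[h]; b[h]= t; }
    if (b[k] < b[m]) { int t= b[k]; b[k]= b[m]; b[m]= t; }
    // The median is now in b[m]; swap it into b[h] to act as the pivot
    int t= b[h]; b[h]= b[m]; b[m]= t;
}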

Quicksort with logarithmic space

The problem: if the pivot value is always the smallest (or always the largest), the depth of recursion is the size of the array to sort. Eliminate this problem by doing some of the work iteratively and some recursively. We may show you this later. Not today!

QuickSort with logarithmic space

/** Sort b[h..k]. */
public static void QS(int[] b, int h, int k) {
    int h1= h; int k1= k;
    // invariant: b[h..k] is sorted if b[h1..k1] is sorted
    while (b[h1..k1] has more than 1 element) {
        Reduce the size of b[h1..k1], keeping inv true
    }
}

QuickSort with logarithmic space

/** Sort b[h..k]. */
public static void QS(int[] b, int h, int k) {
    int h1= h; int k1= k;
    // invariant: b[h..k] is sorted if b[h1..k1] is sorted
    while (b[h1..k1] has more than 1 element) {
        int j= partition(b, h1, k1);
        // b[h1..j-1] <= b[j] <= b[j+1..k1]
        if (b[h1..j-1] smaller than b[j+1..k1]) {
            QS(b, h1, j-1); h1= j+1;
        } else {
            QS(b, j+1, k1); k1= j-1;
        }
    }
}

Only the smaller segment is sorted recursively. If b[h1..k1] has size n, the smaller segment has size < n/2. Therefore, the depth of recursion is at most log n.
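Making the English conditions concrete (segment sizes computed from the indices, partition as sketched earlier) gives a compilable sketch; this version would replace the plain recursive QS above:

/** Sort b[h..k] with recursion depth at most log n: recurse only on
 *  the smaller segment and continue iteratively on the larger one. */
public static void QS(int[] b, int h, int k) {
    int h1= h; int k1= k;
    // inv: b[h..k] is sorted if b[h1..k1] is sorted
    while (k1 - h1 >= 1) {      // b[h1..k1] has more than 1 element
        int j= partition(b, h1, k1);
        // b[h1..j-1] <= b[j] <= b[j+1..k1]
        if (j - h1 < k1 - j) {  // left segment is the smaller one
            QS(b, h1, j-1); h1= j+1;
        } else {
            QS(b, j+1, k1); k1= j-1;
        }
    }
}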

Binary search: find position h of v = 5

pre: the array is sorted.
Loop invariant: b[0..h] <= v and b[t..] > v.

Trace for b = 1 4 5 6 8 10 11 12 (b.length = 8):

h = -1, t = 8:  e = 3, b[3] = 6 > 5, so t = 3
h = -1, t = 3:  e = 1, b[1] = 4 <= 5, so h = 1
h = 1,  t = 3:  e = 2, b[2] = 5 <= 5, so h = 2
h = 2,  t = 3:  h = t-1, so the loop stops

post: b[0..h] <= v and b[h+1..] > v, with b[h] = 5.