Introduction to Computer Science: Recursive Array Programming and Recursive Sorting Algorithms (Unit 16)

Recursive Array Programming

–Recursive function definitions assume that the function already works for a smaller value.
–With arrays, "a smaller value" means a shorter array, i.e., a subarray: contiguous elements from the original array.
–We'll define a recursive function over an array by using the same function over a subarray, plus a base case.
–Subscripts will mark the lower and upper bounds of the subarrays.

Subarrays

(Figure: myArray with indices [0] through [9], showing the successively smaller subarrays 1 through 9, 2 through 9, 3 through 9, …, 9 through 9; the smallest subarray is the base case.)

Example: Recursively find the sum of array elements A[lo] to A[hi]

Assume sum( ) properly returns the sum of the elements for a smaller array of doubles. Then we could write:

double sum(double[] A, int lo, int hi) {
    return A[lo] + sum(A, lo + 1, hi);
}

But we're not done; what's the base case?

Base case is when the subarray is empty, hi is less than lo

double sum(double[] A, int lo, int hi) {
    if (hi < lo)
        return 0.0;
    else
        return A[lo] + sum(A, lo + 1, hi);
}

Yes, we could have defined this using (hi == lo) as the base case…
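For example, a caller sums the whole array by passing the first and last indices. This is a minimal usage sketch; the values are made up for illustration and are not from the slides:

double[] values = {3.5, 1.25, 4.0, 2.25};            // hypothetical test data
double total = sum(values, 0, values.length - 1);    // sum of the whole array
System.out.println(total);                            // prints 11.0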

Recursive Sorting Algorithms

We can use this same idea of recursive functions over subarrays to rewrite our sorting algorithms. Let's see how this works for selection sort, insertion sort, and then some new sorting algorithms.

Selection Sort (REVIEW)

(Figure: the starting order of an example array.) Search through the array, find the largest value, and exchange it with the first array value. (Figure: the array after the exchange.) Then search through the rest of the array, find the second-largest value, and exchange it with the second array value. (Figure: the array after that exchange.)

Selection Sort Pseudocode (REVIEW)

for every "first" component in the array
    find the largest component in the array;
    exchange it with the "first" component

The Selection Sort Java Code (REVIEW)

void select(int[] data) {
    // Uses selection sort to order an array of integers.
    int first, current, largest, temp;
    for (first = 0; first < data.length - 1; first++) {
        largest = first;
        for (current = first + 1; current < data.length; current++) {
            if (data[current] > data[largest])
                largest = current;
        }
        // Postcondition: largest is index of largest item
        // from first..end of array
        if (largest != first) {           // We have to make a swap
            temp = data[largest];
            data[largest] = data[first];  // Make the swap
            data[first] = temp;
        }
    }
} // select

Recursive Selection Sort

Let's say we want to sort an array A from index "lo" to index "hi", largest to smallest. We place the largest element in A[lo], then recursively sort the rest of the array from A[lo + 1] to A[hi]. The base case is the one-element subarray, when lo equals hi.

The Recursive Selection Sort Java Code

void selectionSort(int[] data, int lo, int hi) {
    // data[0]…data[lo-1] contain the largest values in data,
    // in descending order
    if (lo < hi) {
        swap(data, lo, maxLocation(data, lo, hi));
        selectionSort(data, lo + 1, hi);
    }
}

int maxLocation(int[] data, int lo, int hi) {
    // Returns the index of the largest value in data[lo]…data[hi]
    if (lo == hi)
        return lo;
    int locationOfMax = maxLocation(data, lo + 1, hi);
    if (data[lo] > data[locationOfMax])
        return lo;
    else
        return locationOfMax;
}

Code for swap( )

The above version of selectionSort( ) is much less efficient than the iterative version; we show it just as an example of recursive array programming.

void swap(int[] data, int first, int second) {
    int temp;
    temp = data[first];
    data[first] = data[second];
    data[second] = temp;
}

What Does the Outside World See?

We can use overloading, and provide a one-argument version of selectionSort( ) for outside use. No one needs to know whether it was implemented using recursion or iteration:

void selectionSort(int[] data) {
    selectionSort(data, 0, data.length - 1);
}

The internal version should be private.
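A minimal usage sketch (the input values are made up for illustration, and the methods are assumed to live in the same class):

int[] data = {3, 9, 4, 1, 7};                          // hypothetical input
selectionSort(data);                                    // callers never see lo and hi
System.out.println(java.util.Arrays.toString(data));    // expected (largest to smallest): [9, 7, 4, 3, 1]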

Insertion Sort

(Figure: the starting order of an example array.) Move through the array, keeping the left side ordered; when we find the 35, we have to slide the 18 over to make room. (Figure: the array after the 18 slid over.) Continue moving through the array, always keeping the left side ordered, and sliding values over as necessary to do so.

Continue the Insertion Process

The left side of the array is always sorted, but may require one or more components to be slid over to make room. (Figures: successive arrays in which 35, 22, and 18 slide over to make room for each newly inserted value.)

Continue the Insertion Process

(Figures: further insertion steps, showing values such as 35, 22, 18, and 10 sliding over to make room, and one step where nothing slides over.)

The Insertion Sort Java Code (Review)

void insert(int[] data) {
    // Uses insertion sort to order an array of integers.
    int newest, current, newItem;
    boolean seeking;
    for (newest = 1; newest < data.length; newest++) {
        seeking = true;
        current = newest;
        newItem = data[newest];
        while (seeking) {   // seeking newItem's new position on the left
            if (data[current - 1] < newItem) {
                data[current] = data[current - 1];   // slide value right
                current--;
                seeking = (current > 0);
            } else
                seeking = false;
        } // while
        // Postcondition: newItem belongs in data[current]
        data[current] = newItem;
    } // newest for
} // insert
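A short usage sketch with a made-up five-element array (the slide's original figures are not reproduced here), showing the descending order this version produces:

int[] data = {18, 35, 22, 10, 40};    // hypothetical starting order
insert(data);
// pass 1 (newItem = 35): 18 slides right            -> 35 18 22 10 40
// pass 2 (newItem = 22): 18 slides right            -> 35 22 18 10 40
// pass 3 (newItem = 10): nothing slides             -> 35 22 18 10 40
// pass 4 (newItem = 40): 10, 18, 22, 35 all slide   -> 40 35 22 18 10
System.out.println(java.util.Arrays.toString(data));  // [40, 35, 22, 18, 10]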

How Do We Do Insertion Sort Recursively?

How can the ability to sort an array of length n-1 be used to sort an array of length n? Answer: sort the array of length n-1, then insert the nth element in the proper place. That is: to sort subarray A[0]…A[hi], sort A[0]…A[hi-1], then insert A[hi] into that smaller subarray. insertInOrder(A, hi, x) will be used to insert x into subarray A[0]…A[hi-1].

Recursive Insertion Sort

void insertionSort(int[] data, int hi) {
    // Sort data[0]…data[hi]
    if (hi > 0) {
        insertionSort(data, hi - 1);
        insertInOrder(data, hi, data[hi]);
    }
}

What About insertInOrder( )?

We'll define it recursively. To insert x into subarray A[0]…A[hi-1]:
–If x ≤ A[hi-1], then put x into A[hi]
–If x > A[hi-1], then move A[hi-1] into A[hi] and insert x into subarray A[0]…A[hi-2]

Recursive insertInOrder( )

void insertInOrder(int[] data, int hi, int x) {
    // Insert x into data[0]…data[hi-1], filling
    // in data[hi] in the process.
    // data[0]…data[hi-1] are sorted.
    if ((hi == 0) || (data[hi - 1] >= x))
        data[hi] = x;
    else {
        data[hi] = data[hi - 1];
        insertInOrder(data, hi - 1, x);
    }
}

What Does the Outside World See (again)?

We can use overloading, and provide a one-argument version of insertionSort( ) for outside use. No one needs to know whether it was implemented using recursion or iteration:

public void insertionSort(int[] data) {
    insertionSort(data, data.length - 1);
}

The internal version should be private.

More Recursive Sorting: Quicksort

Quicksort is an O(n²) algorithm in the worst case, but its running time is usually proportional to n log₂ n; it is the method of choice for many sorting jobs.

We'll first look at the intuition behind the algorithm: take an array of letters, say V O N I C A E R, and arrange it so that all the small values are on the left half and all the big values are on the right half. (Figure: the rearranged array.)

Quicksort Intuition

Then, do it again on each half, and again. (Figures: the array after the second and third rounds of partitioning.) The divide-and-conquer strategy will, in general, take log₂ n steps to move the A into position. The intuition is that, doing this for n elements, the whole algorithm will take O(n log₂ n) steps.

What Quicksort is really doing

1. Partition array a into smaller elements and larger ones: smaller ones in a[0]…a[m-1] (not necessarily in order), larger ones in positions a[m+1]…a[length-1] (not necessarily in order), and the "middle" element, the pivot, in a[m]. (Figure: an example array partitioned with m = 4, labeled "smaller than pivot", "pivot", and "larger than pivot".)

quicksort( ), next 2 steps

2. Recursively sort a[0]…a[m-1]. (Figure: a after this step.)
3. Recursively sort a[m+1]…a[length-1]. (Figure: a after this step; the pivot stays in a[m].)

quicksort( )

private void quicksort(double[] a, int lo, int hi) {
    int m;
    if (hi > lo + 1) {   // there are at least 3 elements,
                         // so sort recursively
        m = partition(a, lo, hi);
        quicksort(a, lo, m - 1);
        quicksort(a, m + 1, hi);
    }
    else   // the base case…
        …
}

The base case…

We have the base case in this recursion when the subarray a[lo]…a[hi] contains zero elements (lo == hi+1), one element (lo == hi), or two elements (lo == hi-1).
–With no elements, we ignore it
–With one element, it's already sorted
–With two elements, we just swap them if they are out of order:

// 0, 1, or 2 elements, so sort directly
if (hi == lo + 1 && a[lo] > a[hi])
    swap(a, lo, hi);

quicksort( )

private void quicksort(double[] a, int lo, int hi) {
    int m;
    if (hi > lo + 1) {   // there are at least 3 elements,
                         // so sort recursively
        m = partition(a, lo, hi);
        quicksort(a, lo, m - 1);
        quicksort(a, m + 1, hi);
    }
    else   // 0, 1, or 2 elements, so sort directly
        if (hi == lo + 1 && a[lo] > a[hi])
            swap(a, lo, hi);
}

The outside world’s version

As usual, we overload quicksort( ) and provide a public version that "hides" the last two arguments:

public void quicksort(double[] a) {
    quicksort(a, 0, a.length - 1);
}
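A minimal usage sketch (illustrative values, assuming the quicksort and partition methods on these slides are in scope):

double[] a = {5.0, 2.5, 8.0, 1.0, 9.5, 3.0};           // hypothetical input
quicksort(a);                                            // callers see only the one-argument version
System.out.println(java.util.Arrays.toString(a));        // expected: [1.0, 2.5, 3.0, 5.0, 8.0, 9.5]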

Now, all we need is partition( )

int partition(double[] a, int lo, int hi) {
    // Choose a middle element among a[lo]…a[hi],
    // and move the other elements so that a[lo]…a[m-1]
    // are all less than a[m] and a[m+1]…a[hi] are
    // all greater than a[m]
    //
    // m is returned to the caller
    …
}

Are you feeling lucky?

Now, this would work great if we knew the median value in the array segment and could choose it as the pivot (i.e., knew which small values go left and which large values go right). But we can't know the median value without actually sorting the array! So instead, we somehow pick a pivot and hope that it is near the median. If the pivot is the worst choice (each time), the algorithm becomes O(n²). If we are roughly dividing the subarrays in half each time, we get an O(n log₂ n) algorithm.
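In recurrence form (a standard analysis, added here as a worked example; c·n stands for the cost of one partitioning pass over n elements):

Roughly even splits:   T(n) = 2·T(n/2) + c·n                          which gives  T(n) = O(n log₂ n)
Worst-case splits:     T(n) = T(n-1) + c·n = c·(n + (n-1) + … + 2)     which gives  T(n) = O(n²)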

How to pick the median value

There are many techniques for doing this. In practice, one good way of choosing the pivot is to take the median of three elements, specifically a[lo+1], a[(lo+hi)/2], and a[hi]. We'll choose their median using the method medianLocation( ).

medianLocation( )

int medianLocation(double[] a, int i, int j, int k) {
    if (a[i] <= a[j])
        if (a[j] <= a[k])
            return j;
        else if (a[i] <= a[k])
            return k;
        else
            return i;
    else   // a[j] < a[i]
        if (a[i] <= a[k])
            return i;
        else if (a[j] <= a[k])
            return k;
        else
            return j;
}

The partitioning process

From the three candidate elements a[lo+1], a[(lo+hi)/2], and a[hi], we choose the median; in this example it happens to be a[hi]. (Figure: the array with the three candidates marked in red.) We swap the median with a[lo], and start the partitioning process on the rest. (Figure: the array after the swap, with the pivot in a[lo].)

The partitioning process

We shuffle around the elements from a[lo+1] to a[hi] so that all the elements less than the pivot (a[lo]) appear to the left of all the elements greater than the pivot. Until we are done, we have no way of knowing how many elements are less and how many elements are greater. (Figure: the array with the positions lo+1, m, (lo+hi)/2, and hi marked.)

The partitioning process

m is the largest subscript that contains a value less than the pivot. We have discovered (in our example) that m = 6. We then swap a[m] with a[lo], placing the pivot in its rightful position, a[m], and then continue to sort the left and right subarrays recursively. (Figures: the array before and after the swap, with lo, lo+1, (lo+hi)/2, m, and hi marked and the pivot labeled.)

How partition( ) works

We will use a 3-argument version:
    int partition(double[] a, int lo, int hi)
and a 4-argument version:
    int partition(double[] a, int lo, int hi, double pivot)

The 3-argument version moves the pivot element into a[lo], calls the 4-argument partition to shuffle the elements in the subarray a[lo+1]…a[hi], then swaps a[lo] into a[m] and returns m.

3-argument partition( )

int partition(double[] a, int lo, int hi) {
    // Choose a middle element among a[lo]…a[hi],
    // and move the other elements so that a[lo]…a[m-1]
    // are all less than a[m] and a[m+1]…a[hi] are
    // all greater than a[m]
    //
    // m is returned to the caller
    swap(a, lo, medianLocation(a, lo + 1, hi, (lo + hi) / 2));
    int m = partition(a, lo + 1, hi, a[lo]);
    swap(a, lo, m);
    return m;
}

How the 4-argument partition( ) works

The 4-argument version does the main work, recursively calling itself on subarrays. We of course assume partition( ) works on any smaller array…

If the first element of the subarray is less than or equal to the pivot, it's already in the right place; just call partition( ) recursively on the rest:

if (a[lo] <= pivot)   // a[lo] in correct half
    return partition(a, lo + 1, hi, pivot);

(This is the "current" lo, not the lo of the whole array.)

How the 4-argument partition( ) works

If a[lo] > pivot, then a[lo] belongs in the upper half of the subarray, and we swap it with a[hi]. We still don't know where the new a[lo] value should go, so we partition recursively on the subarray that includes a[lo] but not a[hi]:

if (a[lo] <= pivot)   // a[lo] in correct half
    return partition(a, lo + 1, hi, pivot);
else {                // a[lo] in wrong half
    swap(a, lo, hi);
    return partition(a, lo, hi - 1, pivot);
}

How the 4-argument partition( ) works

The base case is the one-element subarray, when lo == hi. Is that one element "small" or "large"?
–If it is small (less than the pivot), then it is at the middle point m (and we can swap it with the pivot)
–Otherwise, it is just above the middle point (and we want the pivot swapped with the element just below it)

4-argument partition( )

int partition(double[] a, int lo, int hi, double pivot) {
    if (hi == lo)
        if (a[lo] < pivot)
            return lo;
        else
            return lo - 1;
    else if (a[lo] <= pivot)   // a[lo] in correct half
        return partition(a, lo + 1, hi, pivot);
    else {                     // a[lo] in wrong half
        swap(a, lo, hi);
        return partition(a, lo, hi - 1, pivot);
    }
}

Example of partition

(Figure: the starting array, with lo and hi marked.) Choose the median from among a[lo+1], a[(lo+hi)/2], and a[hi]. (Figure: the three candidates marked.) Swap that median with a[lo], making it the pivot. (Figure: the array after the swap, with the pivot in a[lo].)

Example of partition

Now, partition the subarray (not counting the pivot); this subarray is the new a[lo]…a[hi] seen by the 4-argument partition. (Figure: the subarray with lo and hi marked.) a[lo] > pivot, so we swap it with a[hi] and continue with the partition. (Figure: the array after the swap.)

Example of partition

a[lo] is now less than the pivot, so we leave it and continue with the partition. (Figure.) Now a[lo] is greater than the pivot, so we swap it with a[hi] and continue with the partition. (Figure.)

Example of partition

a[lo] is again greater than the pivot, so we swap it with a[hi] and continue with the partition. (Figure.) Then a[lo] is less than the pivot, so lo (i.e., index 2) is returned by the 4-argument partition( ); the 3-argument partition then swaps the pivot and that middle element. (Figure: the array with the pivot in its final position.) Now we're ready to recursively quicksort the left and right subarrays.

quicksort( )

private void quicksort(double[] a, int lo, int hi) {
    int m;
    if (hi > lo + 1) {   // there are at least 3 elements,
                         // so sort recursively
        m = partition(a, lo, hi);
        quicksort(a, lo, m - 1);
        quicksort(a, m + 1, hi);
    }
    else   // 0, 1, or 2 elements, so sort directly
        if (hi == lo + 1 && a[lo] > a[hi])
            swap(a, lo, hi);
}

Performance Comparison

Quicksort:
–Best case: O(n log₂ n)
–Worst case: O(n²), when the pivot is always the second-largest or second-smallest element (since medianLocation won't let us choose the smallest or largest)
–Average case over all possible arrangements of n array elements: O(n log₂ n)

Selection Sort and Insertion Sort:
–Average case: O(n²)
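To make the gap concrete (simple arithmetic, not a measured benchmark):
–n = 1,000:      n² = 1,000,000,  while n log₂ n ≈ 1,000 × 10 = 10,000   (about 100 times less work)
–n = 1,000,000:  n² = 10¹²,       while n log₂ n ≈ 1,000,000 × 20 = 2 × 10⁷   (about 50,000 times less work)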