Lecture 8. Recursion, Sorting, Divide and Conquer
CpSc 212: Algorithms and Data Structures
Brian C. Dean, School of Computing, Clemson University, Fall 2012

Warm-Up: Amortized Analysis Review
Would you be happy if I told you “CpSc 212 will take you 10 hours of work per week, amortized”? Which of the following scenarios would this rule out (using our notion of “amortized”)?
- 10 hours per week over the entire semester.
- 0 hours per week for 9 weeks, then 100 hours in last week.
- 100 hours in first week, then 0 hours henceforth.
- 1, 3, 5, 7, 9, 11, 13, 15, 17, 19 hours per week, respectively.

Iteration Versus Recursion
Problem: find the max element in A[0…N-1].
Iterative viewpoint on the solution:
  m = A[0];
  for (i = 1; i < N; i++) m = max(m, A[i]);
Recursive viewpoint on the solution:
  int get_max(int *A, int start, int end) {
    if (start == end) return A[start];
    return max(A[start], get_max(A, start+1, end));
  }
Both run in O(N) time. What reasons would we have to choose one over the other?

Incremental Construction
Build the solution by adding input elements one by one, updating the solution as we go.
Example: Insertion sort
- O(N²), although much faster if the array is already nearly sorted.
- Sorts “in place”.
- A “stable” sorting algorithm: leaves equal elements in the same order.
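
For concreteness, here is a minimal C++ sketch of the insertion sort just described; the vector-of-int signature and the function name are illustrative choices, not code from the lecture:

  #include <cstddef>
  #include <vector>
  using std::vector;

  // Insertion sort: repeatedly insert A[i] into the already-sorted prefix A[0..i-1].
  // In-place and stable; O(N^2) in general, but close to O(N) on nearly sorted input.
  void insertion_sort(vector<int>& A) {
      for (size_t i = 1; i < A.size(); i++) {
          int key = A[i];
          size_t j = i;
          // Shift strictly larger elements right; using ">" (not ">=") leaves
          // equal elements in their original order, which is what "stable" means.
          while (j > 0 && A[j - 1] > key) {
              A[j] = A[j - 1];
              j--;
          }
          A[j] = key;
      }
  }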

Incremental Construction
Recursive outlook: deal with the first element, then recursively solve the rest of the problem.
Example: Insertion sort
- To sort A[], insert A[0] into sort(rest of A) (still O(N²), in-place, stable).
Example: Selection sort
- sort(A[]) = min(A) followed by sort(rest of A) (also O(N²), in-place, stable).
This approach often maps naturally to linked lists, giving very simple implementations… (see the list-based sketch below)
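
To show how the recursive outlook maps onto linked lists, here is a hypothetical C++ sketch of recursive insertion sort on a bare-bones singly linked list; the Node type and function names are made up for illustration:

  struct Node { int val; Node* next; };   // bare-bones singly linked list node

  // Insert node n into the already-sorted list 'sorted', keeping it sorted.
  // Using "<=" puts n before later equal keys, preserving stability.
  Node* insert_sorted(Node* sorted, Node* n) {
      if (sorted == nullptr || n->val <= sorted->val) {
          n->next = sorted;
          return n;
      }
      sorted->next = insert_sorted(sorted->next, n);
      return sorted;
  }

  // "Deal with the first element, then recursively solve the rest":
  // sort everything after the head, then insert the head into that sorted rest.
  Node* insertion_sort_list(Node* head) {
      if (head == nullptr) return nullptr;
      Node* rest = insertion_sort_list(head->next);
      return insert_sorted(rest, head);
  }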

Brief Aside: Lisp / Scheme
Very simple language, inherently recursive. Everything is a list!
- (car L) : first element of list L
- (cdr L) : rest of list L
Example: find the sum of the elements in a list:
  (define (sum-of-list L)
    (if (null? L)
        0
        (+ (car L) (sum-of-list (cdr L)))))
Good language to use for training your mind to think recursively…

Divide and Conquer: Merge Sort
1. Recursively sort 1st and 2nd halves.
2. Merge two halves into one sorted list.
Merging is easy to do in Θ(N) time…
- Iterative approach:
- Recursive approach?
Θ(N log N) total runtime. Why? Is this better than insertion sort?
Stable? In-place?
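
A minimal C++ sketch of the merge-sort recursion described above, under the assumption of a std::vector<int> input; the half-open [lo, hi) index convention is an illustrative choice:

  #include <cstddef>
  #include <vector>
  using std::vector;

  // Merge the sorted halves A[lo..mid) and A[mid..hi) into one sorted range.
  // Θ(N) time and Θ(N) extra space; taking from the left half on ties ("<=") keeps the sort stable.
  void merge(vector<int>& A, size_t lo, size_t mid, size_t hi) {
      vector<int> tmp;
      tmp.reserve(hi - lo);
      size_t i = lo, j = mid;
      while (i < mid && j < hi) tmp.push_back(A[i] <= A[j] ? A[i++] : A[j++]);
      while (i < mid) tmp.push_back(A[i++]);
      while (j < hi)  tmp.push_back(A[j++]);
      for (size_t k = 0; k < tmp.size(); k++) A[lo + k] = tmp[k];
  }

  // Recursively sort the 1st and 2nd halves, then merge them: Θ(N log N) overall.
  void merge_sort(vector<int>& A, size_t lo, size_t hi) {
      if (hi - lo <= 1) return;            // 0 or 1 elements: already sorted
      size_t mid = lo + (hi - lo) / 2;
      merge_sort(A, lo, mid);
      merge_sort(A, mid, hi);
      merge(A, lo, mid, hi);
  }

Calling merge_sort(A, 0, A.size()) sorts the whole vector; the Θ(N log N) bound comes from the roughly log N levels of recursion, each doing Θ(N) total merging work.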

More Than One Way to Divide…
What if we want to run merge-sort on a linked list? To split our problem into two half-sized sub-lists, it might be easier to “un-interleave” our linked list…
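
To make the “un-interleave” idea concrete, here is a hypothetical C++ sketch that splits a singly linked list into two sub-lists of alternating elements; the Node type is the same minimal one used in the earlier list sketch:

  struct Node { int val; Node* next; };    // same minimal node type as before

  // Split 'head' into two sub-lists by alternating elements:
  // nodes 0, 2, 4, ... go to 'a'; nodes 1, 3, 5, ... go to 'b'.
  // This halves the list in one pass without knowing its length in advance.
  void uninterleave(Node* head, Node*& a, Node*& b) {
      a = b = nullptr;
      Node** tail[2] = { &a, &b };          // where to append the next node of each sub-list
      int which = 0;
      while (head != nullptr) {
          Node* next = head->next;
          head->next = nullptr;
          *tail[which] = head;              // append to the current sub-list
          tail[which] = &head->next;
          which = 1 - which;                // alternate between the two lists
          head = next;
      }
  }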

Divide and Conquer: Quicksort
1. Partition array using “pivot” element.
2. Recursively sort 1st and 2nd “halves”.
Partitioning is easy to do in Θ(N) time…
Running time:
- Θ(N²) worst case.
- O(N log N) in practice.
- O(N log N) with high probability if we choose the pivot randomly.
Stable? In-place?
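
For reference, a minimal C++ sketch of the partition-then-recurse structure, using a Lomuto-style partition with the last element as pivot; this particular partition scheme is an illustrative choice, not necessarily the one shown in lecture:

  #include <cstddef>
  #include <utility>   // std::swap
  #include <vector>
  using std::vector;

  // Lomuto partition with pivot = A[hi-1]: afterwards everything left of the
  // returned index is < pivot, everything right of it is >= pivot.  Θ(N) time.
  size_t partition(vector<int>& A, size_t lo, size_t hi) {
      int pivot = A[hi - 1];
      size_t i = lo;
      for (size_t j = lo; j + 1 < hi; j++)
          if (A[j] < pivot) std::swap(A[i++], A[j]);
      std::swap(A[i], A[hi - 1]);
      return i;                             // final resting place of the pivot
  }

  // Partition, then recursively sort the two "halves" on either side of the pivot.
  void quicksort(vector<int>& A, size_t lo, size_t hi) {
      if (hi - lo <= 1) return;
      size_t p = partition(A, lo, hi);
      quicksort(A, lo, p);                  // elements before the pivot
      quicksort(A, p + 1, hi);              // elements after the pivot
  }

Calling quicksort(A, 0, A.size()) sorts the whole vector; with this fixed last-element pivot, an already-sorted input exhibits the Θ(N²) worst case noted above.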

Quicksort Variants
- Simple quicksort. Choose the pivot using a simple deterministic rule; e.g., first element, last element, median(A[1], A[n], A[n/2]). Θ(n log n) time if “lucky”, but Θ(n²) worst case.
- Deterministic quicksort. Pivot on the median (we’ll see shortly how to find the median in linear time). Θ(n log n) time, but not the best in practice.
- Randomized quicksort. Choose the pivot uniformly at random. Θ(n log n) time with high probability, and fast in practice (competitive with merge sort).
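
A small sketch of the randomized variant, layered on the partition() routine from the previous sketch; the mt19937 generator and the swap-into-last-slot trick are illustrative choices:

  #include <cstddef>
  #include <random>
  #include <utility>
  #include <vector>
  using std::vector;

  size_t partition(vector<int>& A, size_t lo, size_t hi);   // the partition() from the sketch above

  // Randomized quicksort: swap a uniformly random element into the pivot slot,
  // then partition and recurse as before.  Θ(n log n) time with high probability.
  void randomized_quicksort(vector<int>& A, size_t lo, size_t hi) {
      if (hi - lo <= 1) return;
      static std::mt19937 rng(std::random_device{}());
      std::uniform_int_distribution<size_t> pick(lo, hi - 1);
      std::swap(A[pick(rng)], A[hi - 1]);   // random pivot moves to where partition() expects it
      size_t p = partition(A, lo, hi);
      randomized_quicksort(A, lo, p);
      randomized_quicksort(A, p + 1, hi);
  }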

Further Thoughts on Sorting
- Any sorting algorithm can be made stable at the expense of in-place operation (so we can implement quicksort to be stable but not in-place, or in-place but not stable).
- Memory issues: rather than sort large records, sort pointers to records.
- Some advanced sorting algorithms only move elements of data O(n) total times.
- How will caching affect the performance of our various sorting algorithms?
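
To illustrate the “sort pointers to records” point, a hypothetical C++ snippet that sorts pointers by key so that each swap moves a single pointer rather than a large record (Record and its fields are made-up names):

  #include <algorithm>
  #include <string>
  #include <vector>
  using std::vector;

  struct Record {
      int key;
      std::string payload;   // imagine this field being large and expensive to copy
  };

  // Sort pointers to the records rather than the records themselves:
  // each comparison follows a pointer, but each swap moves only one pointer.
  void sort_records_by_key(vector<Record*>& recs) {
      std::sort(recs.begin(), recs.end(),
                [](const Record* a, const Record* b) { return a->key < b->key; });
  }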

An Obvious Question
Is stable in-place sorting possible in O(n log n) time in the comparison-based model?
(* = can be transformed into a stable, out-of-place algorithm)

  Algorithm                 Runtime                    Stable   In-Place?
  Bubble Sort               O(n²)                      Yes      Yes
  Insertion Sort            O(n²)                      Yes      Yes
  Merge Sort                Θ(n log n)                 Yes      No
  Randomized Quicksort      Θ(n log n) w/ high prob.   No*      Yes*
  Deterministic Quicksort   Θ(n log n)                 No*      Yes*
  Heap Sort                 Θ(n log n)                 No       Yes

The Ideal Sorting Algorithm…
- …would be stable and in-place.
- …would require only O(n) moves (memory writes).
- …would be simple and deterministic.
- …would run in O(n log n) time. (There is an Ω(n log n) worst-case lower bound on any “comparison-based” sorting algorithm.)
- …would run in closer to O(n) time for “nearly sorted” inputs (like insertion sort).
We currently only know how to achieve limited combinations of the above properties…