Lecture 8: Recursion, Sorting, Divide and Conquer
CpSc 212: Algorithms and Data Structures
Brian C. Dean
School of Computing, Clemson University
Fall 2012
Warm-Up: Amortized Analysis Review
Would you be happy if I told you “CpSc 212 will take you 10 hours of work per week, amortized”?
Which of the following scenarios would this rule out (using our notion of “amortized”)?
(a) 10 hours per week over the entire semester.
(b) 0 hours per week for 9 weeks, then 100 hours in the last week.
(c) 100 hours in the first week, then 0 hours henceforth.
(d) 1, 3, 5, 7, 9, 11, 13, 15, 17, 19 hours per week, respectively.
Iteration Versus Recursion
Problem: find the maximum element in A[0…N-1].

Iterative viewpoint on the solution:

m = A[0];
for (i = 1; i < N; i++)
    m = max(m, A[i]);

Recursive viewpoint on the solution:

int get_max(int *A, int start, int end) {
    if (start == end) return A[start];
    return max(A[start], get_max(A, start+1, end));
}

Both take O(N) time. What reasons would we have to choose one over the other?
Incremental Construction
Build the solution by adding input elements one by one, updating the solution as we go.
Example: insertion sort.
O(N²), although much faster if the array is already nearly sorted.
Sorts “in-place”.
A “stable” sorting algorithm: leaves equal elements in their original order.
Incremental Construction
Recursive outlook: deal with the first element, then recursively solve the rest of the problem.
Example: insertion sort. To sort A[], insert A[0] into sort(rest of A). (Still O(N²), in-place, stable.)
Example: selection sort. sort(A[]) = min(A) followed by sort(rest of A). (Also O(N²) and in-place; stable in the linked-list formulation, though the common array version that swaps the minimum into place is not.)
This approach often maps naturally to linked lists, giving very simple implementations…
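The incremental approach above can be made concrete with a short C sketch of insertion sort (the function name `insertion_sort` is mine, not from the slides):

```c
#include <assert.h>

/* Insertion sort: grow a sorted prefix A[0..i-1] by inserting A[i] into it.
   In-place and stable; O(N^2) worst case, close to O(N) on nearly-sorted input. */
void insertion_sort(int *A, int N) {
    for (int i = 1; i < N; i++) {
        int key = A[i];
        int j = i - 1;
        while (j >= 0 && A[j] > key) {  /* strict > ensures equal keys keep order */
            A[j + 1] = A[j];            /* shift larger elements right */
            j--;
        }
        A[j + 1] = key;
    }
}
```

Note the strict comparison `A[j] > key`: using `>=` instead would move equal elements past one another and break stability.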
Brief Aside: Lisp / Scheme
A very simple language, inherently recursive. Everything is a list!
(car L) : first element of list L
(cdr L) : rest of list L
Example: find the sum of the elements in a list:

(define (sum-of-list L)
  (if (null? L)
      0
      (+ (car L) (sum-of-list (cdr L)))))

A good language for training your mind to think recursively…
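The same car/cdr recursion translates directly to a C linked list; here is a minimal sketch (the `node` struct and field names are my own illustration, not from the slides):

```c
#include <stddef.h>
#include <assert.h>

/* A cons-cell-style list: 'car' holds the value, 'cdr' points to the rest. */
typedef struct node {
    int car;
    struct node *cdr;
} node;

/* Mirrors (sum-of-list L): an empty list sums to 0; otherwise
   add the car to the recursively computed sum of the cdr. */
int sum_of_list(const node *L) {
    if (L == NULL) return 0;              /* base case */
    return L->car + sum_of_list(L->cdr);  /* recursive case */
}
```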
Divide and Conquer: Merge Sort
Recursively sort the 1st and 2nd halves, then merge the two halves into one sorted list.
Merging is easy to do in Θ(N) time…
Θ(N log N) total runtime. Why? Is this better than insertion sort?
Stable? In-place?
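A minimal C sketch of the array version, using a scratch buffer for the Θ(N) merge step (names like `merge_sort` and `msort` are my own):

```c
#include <stdlib.h>
#include <string.h>
#include <assert.h>

/* Merge sorted runs A[lo..mid) and A[mid..hi) into tmp, then copy back. */
static void merge(int *A, int lo, int mid, int hi, int *tmp) {
    int i = lo, j = mid, k = lo;
    while (i < mid && j < hi)
        tmp[k++] = (A[i] <= A[j]) ? A[i++] : A[j++];  /* <= keeps the sort stable */
    while (i < mid) tmp[k++] = A[i++];
    while (j < hi)  tmp[k++] = A[j++];
    memcpy(A + lo, tmp + lo, (size_t)(hi - lo) * sizeof(int));
}

/* Recursively sort A[lo..hi): sort both halves, then merge them. */
static void msort(int *A, int lo, int hi, int *tmp) {
    if (hi - lo < 2) return;          /* 0 or 1 elements: already sorted */
    int mid = lo + (hi - lo) / 2;
    msort(A, lo, mid, tmp);
    msort(A, mid, hi, tmp);
    merge(A, lo, mid, hi, tmp);
}

void merge_sort(int *A, int N) {
    int *tmp = malloc((size_t)N * sizeof(int));
    if (!tmp) return;                 /* allocation failure: leave input untouched */
    msort(A, 0, N, tmp);
    free(tmp);
}
```

The scratch buffer is exactly why merge sort is stable but not in-place: it needs Θ(N) extra memory.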
More Than One Way to Divide…
What if we want to run merge-sort on a linked list? To split our problem into two half-sized sub-lists, it might be easier to “un-interleave” our linked list…
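The “un-interleave” split can be sketched in C as follows — alternate nodes go to two output lists, so no length count or middle pointer is needed (the `lnode` struct and function name are my own illustration):

```c
#include <stddef.h>
#include <assert.h>

typedef struct lnode {
    int val;
    struct lnode *next;
} lnode;

/* Split a list into two half-sized lists by un-interleaving:
   nodes at even positions go to *a, nodes at odd positions to *b. */
void uninterleave(lnode *head, lnode **a, lnode **b) {
    *a = *b = NULL;
    lnode **tails[2] = { a, b };   /* tail pointers of the two output lists */
    int which = 0;
    while (head) {
        lnode *next = head->next;
        head->next = NULL;
        *tails[which] = head;       /* append node to the current output list */
        tails[which] = &head->next;
        which ^= 1;                 /* alternate between the two lists */
        head = next;
    }
}
```

One caveat worth noting: un-interleaving reorders equal elements across the two sublists, so a linked-list merge sort built on this split is not automatically stable the way the halves-based split is.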
Divide and Conquer: Quicksort
Partition the array using a “pivot” element, then recursively sort the 1st and 2nd “halves”.
Partitioning is easy to do in Θ(N) time…
Running time: Θ(N²) worst case; O(N log N) in practice; O(N log N) with high probability if we choose the pivot randomly.
Stable? In-place?
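A minimal in-place C sketch, using a Lomuto-style partition around the last element (one common choice; the slides do not fix a particular partition scheme):

```c
#include <assert.h>

/* Partition A[lo..hi] around pivot A[hi]; return the pivot's final index.
   Afterwards, everything left of that index is < pivot, everything right is >= pivot. */
static int partition(int *A, int lo, int hi) {
    int pivot = A[hi];
    int i = lo;                       /* boundary of the "< pivot" region */
    for (int j = lo; j < hi; j++) {
        if (A[j] < pivot) {
            int t = A[i]; A[i] = A[j]; A[j] = t;
            i++;
        }
    }
    int t = A[i]; A[i] = A[hi]; A[hi] = t;  /* move pivot into place */
    return i;
}

/* Sort A[lo..hi] (inclusive bounds). In-place; Theta(N^2) worst case. */
void quicksort(int *A, int lo, int hi) {
    if (lo >= hi) return;
    int p = partition(A, lo, hi);
    quicksort(A, lo, p - 1);
    quicksort(A, p + 1, hi);
}
```

The swaps inside the partition are what make quicksort in-place, and also what destroy stability: equal elements can be swapped past one another.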
Quicksort Variants
Simple quicksort. Choose the pivot using a simple deterministic rule; e.g., the first element, the last element, or median(A[1], A[n], A[n/2]). Θ(n log n) time if “lucky”, but Θ(n²) worst case.
Deterministic quicksort. Pivot on the median (we’ll see shortly how to find the median in linear time). Θ(n log n) time, but not the best in practice.
Randomized quicksort. Choose the pivot uniformly at random. Θ(n log n) time with high probability, and fast in practice (competitive with merge sort).
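Randomized quicksort differs from the simple variant by one step: swap a uniformly random element into the pivot slot before partitioning, so no fixed input is a guaranteed worst case. A self-contained sketch (again Lomuto-style, with `rand()` for simplicity despite its known quality caveats):

```c
#include <stdlib.h>
#include <assert.h>

static int partition(int *A, int lo, int hi) {  /* pivots on A[hi] */
    int pivot = A[hi], i = lo;
    for (int j = lo; j < hi; j++) {
        if (A[j] < pivot) {
            int t = A[i]; A[i] = A[j]; A[j] = t;
            i++;
        }
    }
    int t = A[i]; A[i] = A[hi]; A[hi] = t;
    return i;
}

/* Swap a uniformly random element of A[lo..hi] into the pivot slot, then partition. */
static int random_partition(int *A, int lo, int hi) {
    int r = lo + rand() % (hi - lo + 1);
    int t = A[r]; A[r] = A[hi]; A[hi] = t;
    return partition(A, lo, hi);
}

void randomized_quicksort(int *A, int lo, int hi) {
    if (lo >= hi) return;
    int p = random_partition(A, lo, hi);
    randomized_quicksort(A, lo, p - 1);
    randomized_quicksort(A, p + 1, hi);
}
```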
Further Thoughts on Sorting
Any sorting algorithm can be made stable at the expense of in-place operation (so we can implement quicksort to be stable but not in-place, or in-place but not stable).
Memory issues:
Rather than sort large records, sort pointers to records.
Some advanced sorting algorithms move elements of data only O(n) times in total.
How will caching affect the performance of our various sorting algorithms?
An Obvious Question
Is stable, in-place sorting possible in O(n log n) time in the comparison-based model?

Algorithm               | Runtime                  | Stable? | In-Place?
Bubble Sort             | O(n²)                    | Yes     | Yes
Insertion Sort          | O(n²)                    | Yes     | Yes
Merge Sort              | Θ(n log n)               | Yes     | No
Randomized Quicksort    | Θ(n log n) w/ high prob. | No*     | Yes*
Deterministic Quicksort | Θ(n log n)               | No*     | Yes*
Heap Sort               | Θ(n log n)               | No      | Yes

* = can be transformed into a stable, out-of-place algorithm
The Ideal Sorting Algorithm…
…would be stable and in-place.
…would require only O(n) moves (memory writes).
…would be simple and deterministic.
…would run in O(n log n) time. (There is an Ω(n log n) worst-case lower bound for any “comparison-based” sorting algorithm.)
…would run in closer to O(n) time on “nearly sorted” inputs (like insertion sort).
We currently only know how to achieve limited combinations of the above properties…