Algorithms: Design and Analysis, Semester 2
3. Divide and Conquer
Objective: look at several divide-and-conquer examples (merge sort, quicksort)
1. Divide-and-Conquer
• Divide the problem into two or more smaller instances
• Solve the smaller instances recursively (conquer)
• Obtain the solution to the original (larger) problem by combining these solutions
Divide-and-Conquer Technique
(diagram) To solve a problem of size n, a recursive algorithm divides it into smaller parts (often 2), solves subproblem 1 of size n/2 and subproblem 2 of size n/2, then combines the two solutions into a solution to the original problem.
2. A Faster Sort: Merge Sort
Initial call: MergeSort(A, 1, n)

MERGESORT(A, left, right)
  if left < right          // if left ≥ right, do nothing
    mid := floor((left + right) / 2)
    MergeSort(A, left, mid)
    MergeSort(A, mid+1, right)
    Merge(A, left, mid, right)
  return
A faster sort: MergeSort
(diagram) The input A[1..n] is split into A[1..mid] and A[mid+1..n]; MERGESORT sorts each half; MERGE combines the sorted halves into the output.
Tracing MergeSort()
(diagram: trace of the recursive calls and merges)
Merging two sorted arrays
(example: merging 20 13 7 2 and 12 11 9 1)
Time = one pass through each array = O(n) to merge a total of n elements (linear time).
Analysis of Merge Sort
Statement                          Effort
MergeSort(A, left, right)          T(n)
  if (left < right) {              O(1)
    mid = floor((left+right)/2);   O(1)
    MergeSort(A, left, mid);       T(n/2)
    MergeSort(A, mid+1, right);    T(n/2)
    Merge(A, left, mid, right);    O(n)
  }
Merge() is O(n), as shown on the previous slides.
merge() Code
merge(A, left, mid, right) merges two adjacent sorted subranges of an array A:
left == the index of the first element of the first range
mid == the index of the last element of the first range
right == the index of the last element of the second range
void merge(int[] A, int left, int mid, int right) {
    int[] temp = new int[right - left + 1];
    int aIdx = left;       // index into the first range
    int bIdx = mid + 1;    // index into the second range
    for (int i = 0; i < temp.length; i++) {
        if (aIdx > mid)
            temp[i] = A[bIdx++];          // 1st range exhausted: copy 2nd range
        else if (bIdx > right)
            temp[i] = A[aIdx++];          // 2nd range exhausted: copy 1st range
        else if (A[aIdx] <= A[bIdx])
            temp[i] = A[aIdx++];
        else
            temp[i] = A[bIdx++];
    }
    // copy back into A
    for (int j = 0; j < temp.length; j++)
        A[left + j] = temp[j];
}
The problem with mergesort: double memory (the temp array).
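Putting the MERGESORT pseudocode and merge() together gives a complete runnable sketch (the class name and the 0-based indexing are illustrative choices, not from the slides):

```java
import java.util.Arrays;

// Minimal merge sort sketch: recursive splitting plus the merge() above.
public class MergeSortDemo {

    static void mergeSort(int[] A, int left, int right) {
        if (left < right) {                  // 2 or more elements
            int mid = (left + right) / 2;    // floor((left+right)/2)
            mergeSort(A, left, mid);
            mergeSort(A, mid + 1, right);
            merge(A, left, mid, right);
        }
    }

    static void merge(int[] A, int left, int mid, int right) {
        int[] temp = new int[right - left + 1];
        int aIdx = left, bIdx = mid + 1;
        for (int i = 0; i < temp.length; i++) {
            if (aIdx > mid)               temp[i] = A[bIdx++];
            else if (bIdx > right)        temp[i] = A[aIdx++];
            else if (A[aIdx] <= A[bIdx])  temp[i] = A[aIdx++];
            else                          temp[i] = A[bIdx++];
        }
        for (int j = 0; j < temp.length; j++) A[left + j] = temp[j];
    }

    public static void main(String[] args) {
        int[] A = {20, 13, 7, 2, 12, 11, 9, 1};
        mergeSort(A, 0, A.length - 1);       // 0-based indices here
        System.out.println(Arrays.toString(A));  // [1, 2, 7, 9, 11, 12, 13, 20]
    }
}
```

Note that the recursion here is 0-based, while the slide's initial call MergeSort(A, 1, n) is 1-based; either convention works as long as it is used consistently.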
MergeSort Running Time
Recursive T() equation:
  T(1) = O(1)
  T(n) = 2T(n/2) + O(n), for n > 1
Convert to algebra:
  T(1) = a
  T(n) = 2T(n/2) + cn
T(n) = 2T(n/2) + cn
     = 2(2T(n/2^2) + cn/2) + cn
     = 2^2 T(n/2^2) + cn(2/2) + cn
     = 2^2 T(n/2^2) + cn(2/2 + 1)
     = 2^2 (2T(n/2^3) + cn/2^2) + cn(2/2 + 1)
     = 2^3 T(n/2^3) + cn(2^2/2^2) + cn(2/2 + 1)
     = 2^3 T(n/2^3) + cn(2^2/2^2 + 2/2 + 1)
     ...
     = 2^k T(n/2^k) + cn(2^(k-1)/2^(k-1) + 2^(k-2)/2^(k-2) + ... + 2^2/2^2 + 2/2 + 1)
So we have
  T(n) = 2^k T(n/2^k) + cn(2^(k-1)/2^(k-1) + ... + 2^2/2^2 + 2/2 + 1)   // k terms, each equal to 1
For k = log2 n we have n = 2^k, so the T() argument becomes 1:
  T(n) = 2^k T(1) + cn·k
       = na + cn(log2 n)
       = O(n) + O(n log2 n)
       = O(n log2 n)
Merge Sort vs Selection Sort
• O(n log n) grows more slowly than O(n^2).
• In other words, merge sort is asymptotically faster (runs faster) than selection sort in the worst case.
• In practice, merge sort beats selection sort for n > 30 or so.
3. Quicksort
Proposed by Tony Hoare in 1962.
Voted one of the top 10 algorithms of the 20th century in science and engineering.
Sorts “in place” -- rearranges elements using only the array, as in insertion sort (unlike merge sort, which uses extra storage).
Very practical (after some code tuning).
Divide and conquer
Quicksort an n-element array:
1. Divide: Partition the array into two subarrays around a pivot x such that elements in the lower subarray ≤ x ≤ elements in the upper subarray.
2. Conquer: Recursively sort the two subarrays.
3. Combine: Nothing to do.
Key: implementing a linear-time partitioning function.
Pseudocode
quicksort(int[] A, int left, int right)
  if (left < right)                  // if the array has 2 or more items
    pivot = partition(A, left, right)
    quicksort(A, left, pivot-1)      // recursively sort elements smaller than the pivot
    quicksort(A, pivot+1, right)     // recursively sort elements bigger than the pivot
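The pseudocode above translates to Java roughly as follows; this sketch uses the Lomuto-style PARTITION from section 3.1 (pivot = first element), and the class name and test array are illustrative:

```java
import java.util.Arrays;

// Quicksort sketch: recursive calls around a linear-time partition.
public class QuickSortDemo {

    // Partition A[p..q] around pivot x = A[p]; returns the pivot's final index.
    static int partition(int[] A, int p, int q) {
        int x = A[p];
        int i = p;
        for (int j = p + 1; j <= q; j++) {
            if (A[j] <= x) {
                i++;                                 // grow the "<= pivot" region
                int t = A[i]; A[i] = A[j]; A[j] = t;
            }
        }
        int t = A[p]; A[p] = A[i]; A[i] = t;         // put the pivot between the regions
        return i;
    }

    static void quicksort(int[] A, int left, int right) {
        if (left < right) {                          // 2 or more items
            int pivot = partition(A, left, right);
            quicksort(A, left, pivot - 1);
            quicksort(A, pivot + 1, right);
        }
    }

    public static void main(String[] args) {
        int[] A = {6, 10, 13, 5, 8, 3, 2, 11};
        quicksort(A, 0, A.length - 1);
        System.out.println(Arrays.toString(A));  // [2, 3, 5, 6, 8, 10, 11, 13]
    }
}
```

Unlike merge sort, no temp array is needed: all rearranging happens inside A itself.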
Quicksort Diagram
(figure: the array partitioned around the pivot, with each side sorted recursively)
3.1. Partitioning Function

PARTITION(A, p, q)           // A[p..q]
  x ← A[p]                   // pivot = A[p]
  i ← p                      // index
  for j ← p + 1 to q
    if A[j] ≤ x then
      i ← i + 1              // move the i boundary
      exchange A[i] ↔ A[j]   // switch big and small
  exchange A[p] ↔ A[i]
  return i                   // return index of pivot

Running time = O(n) for n elements.
Example of partitioning
(figures omitted; the captions below describe the steps on the example array)
1. Scan right until you find something less than the pivot.
2. Swap 10 and 5.
3. Resume scanning right until you find something less than the pivot.
4. Swap 13 and 3.
5. Swap 10 and 2.
6. j runs to the end.
7. Swap the pivot and 2, so the pivot ends up in the middle.
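The trace above can be replayed in code. This sketch assumes the example array is {6, 10, 13, 5, 8, 3, 2, 11} with pivot 6 (inferred from the swaps in the figures, so treat it as an assumption) and prints each swap as PARTITION performs it:

```java
import java.util.Arrays;

// Replay the partitioning example, printing the array after each swap.
public class PartitionTrace {

    static int partition(int[] A, int p, int q) {
        int x = A[p];                         // pivot = first element
        int i = p;
        for (int j = p + 1; j <= q; j++) {
            if (A[j] <= x) {
                i++;
                System.out.println("swap " + A[i] + " and " + A[j]);
                int t = A[i]; A[i] = A[j]; A[j] = t;
                System.out.println("  " + Arrays.toString(A));
            }
        }
        int t = A[p]; A[p] = A[i]; A[i] = t;  // put the pivot in the middle
        System.out.println("swap pivot: " + Arrays.toString(A));
        return i;
    }

    public static void main(String[] args) {
        int[] A = {6, 10, 13, 5, 8, 3, 2, 11};   // assumed example array
        int pi = partition(A, 0, A.length - 1);
        // prints: swap 10 and 5, swap 13 and 3, swap 10 and 2, then the pivot swap
        System.out.println("pivot index = " + pi);  // 3: left side <= 6, right side >= 6
    }
}
```

The printed swaps (10↔5, 13↔3, 10↔2, then pivot↔2) match the captions above, and the pivot lands at index 3 with everything smaller on its left and everything larger on its right.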
3.2. Analysis of Quicksort
The analysis is quite tricky.
Assume all the input elements are distinct:
• no duplicate values makes this code faster!
• there are better partitioning algorithms for when duplicate input elements exist (e.g. Hoare's original code)
Let T(n) = worst-case running time on an array of n elements.
Worst-case of quicksort
QUICKSORT runs very slowly when its input array is already sorted (or reverse sorted).
• almost sorted data is quite common in the real world
This happens because the partition pivots on the min (or max) element, which means that one side of the partition has no elements (the other side has n-1 elements). Therefore:
T(n) = T(0) + T(n-1) + O(n)
     = O(1) + T(n-1) + O(n)
     = T(n-1) + O(n)
     = O(n^2)
Quicksort isn't Quick?
In the worst case, quicksort isn't any quicker than selection sort. So why bother with quicksort? Its average-case running time is very good, as we'll see.
Best-case Analysis
If we’re lucky, PARTITION splits the array evenly:
T(n) = 2T(n/2) + O(n) = O(n log n)   (same as merge sort)
Good and Bad
Suppose we alternate good, bad, good, bad, … partitions:
  G(n) = 2B(n/2) + O(n)    // good
  B(n) = G(n – 1) + O(n)   // bad
Solving:
  G(n) = 2( G(n/2 – 1) + O(n/2) ) + O(n)
       = 2G(n/2 – 1) + O(n)
       = O(n log n)        // Good!
How can we make sure we choose good partitions?
Randomized Quicksort
IDEA: Partition around a random element.
• Running time is then independent of the input order.
• No assumptions need to be made about the input distribution.
• No specific input leads to the worst-case behavior.
• The worst case is determined only by the output of a random-number generator.
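One common way to realize this idea is to swap a randomly chosen element into the pivot position before partitioning as usual. A sketch, assuming the Lomuto-style PARTITION from section 3.1 (the class name is illustrative):

```java
import java.util.Arrays;
import java.util.Random;

// Randomized quicksort sketch: pick a random pivot, swap it to the
// front, then partition exactly as before.
public class RandomizedQuickSort {
    static final Random RNG = new Random();

    static int randomizedPartition(int[] A, int p, int q) {
        int r = p + RNG.nextInt(q - p + 1);    // uniform index in [p, q]
        int t = A[p]; A[p] = A[r]; A[r] = t;   // move the random pivot to the front
        return partition(A, p, q);
    }

    static int partition(int[] A, int p, int q) {
        int x = A[p], i = p;
        for (int j = p + 1; j <= q; j++) {
            if (A[j] <= x) {
                i++;
                int t = A[i]; A[i] = A[j]; A[j] = t;
            }
        }
        int t = A[p]; A[p] = A[i]; A[i] = t;
        return i;
    }

    static void quicksort(int[] A, int left, int right) {
        if (left < right) {
            int pivot = randomizedPartition(A, left, right);
            quicksort(A, left, pivot - 1);
            quicksort(A, pivot + 1, right);
        }
    }

    public static void main(String[] args) {
        int[] A = {5, 4, 3, 2, 1};   // reverse-sorted: worst case for a fixed first-element pivot
        quicksort(A, 0, A.length - 1);
        System.out.println(Arrays.toString(A));  // [1, 2, 3, 4, 5]
    }
}
```

On already-sorted or reverse-sorted input, the fixed-pivot version degrades to O(n^2); the random pivot makes such inputs no worse than any other.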
3.3. Quicksort in Practice
• Quicksort is a great general-purpose sorting algorithm, especially with a randomized pivot.
• Quicksort can benefit substantially from code tuning, and can be over twice as fast as merge sort.
• Quicksort behaves well even with caching and virtual memory.
4. Timing Comparisons
Running time estimates:
• A home PC executes 10^8 compares/second.
• A supercomputer executes 10^12 compares/second.
Lesson 1. Good algorithms are better than supercomputers.
Lesson 2. Great algorithms are better than good ones.
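To make the lessons concrete, here is a back-of-envelope calculation using the compare rates above; the input size n = 10^9 is an illustrative assumption, not from the slides:

```java
// Back-of-envelope: time = number of compares / compares per second.
public class TimingEstimate {
    public static void main(String[] args) {
        double n = 1e9;                       // illustrative input size
        double pc = 1e8;                      // home PC: compares/second
        double supercomp = 1e12;              // supercomputer: compares/second

        double quadratic = n * n;                               // ~ selection sort
        double linearithmic = n * (Math.log(n) / Math.log(2));  // ~ merge/quick sort

        System.out.printf("n^2 on home PC:       %.0f days%n", quadratic / pc / 86400);
        System.out.printf("n^2 on supercomputer: %.1f days%n", quadratic / supercomp / 86400);
        System.out.printf("n log n on home PC:   %.0f seconds%n", linearithmic / pc);
    }
}
```

Under these assumptions, even the supercomputer needs over a week to run the quadratic algorithm, while the home PC finishes the n log n algorithm in about five minutes: a great algorithm on cheap hardware beats a bad algorithm on a supercomputer.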