Chapter 5 Divide & conquer


1 Chapter 5 Divide & conquer
Master method
Mergesort and Quicksort
Binary tree traversals
Closest pair and convex hull revisited

2 Diagram
[Figure 5.1, p. 170: a problem of size n is divided into subproblem 1 and subproblem 2, each of size n/2; the solutions to the two subproblems are combined into a solution to the original problem.]

3 Technique and Complexity
Divide the problem into two or more pieces.
Solve those pieces as sub-problems recursively.
Combine the results to find a solution to the original problem.
Time (example): T(n) = D(n) + 2T(n/2) + C(n), where
D(n) is the time it takes to divide the problem into two parts, and
C(n) is the time it takes to combine the solutions from those parts.
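For instance (anticipating the mergesort analysis later in this chapter), dividing into halves and merging both take linear time, so
\[
D(n) \in \Theta(n),\qquad C(n) \in \Theta(n) \;\Longrightarrow\; T(n) = 2\,T(n/2) + \Theta(n) \in \Theta(n\log n).
\]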

4 The Master Method
T(n) = aT(n/b) + f(n), where f(n) ∈ Θ(nᵏ) [and of course T(1) is O(1)]
a < bᵏ: T(n) ∈ Θ(nᵏ) (the terms are decreasing)
a = bᵏ: T(n) ∈ Θ(nᵏ log n) (the terms are all equal)
a > bᵏ: T(n) ∈ Θ(n^(log_b a)) (the terms are increasing)
Examples:
T(n) = 9T(n/3) + 1 ∈ Θ(n²)
T(n) = T(2n/3) + 1 ∈ Θ(log n)
T(n) = T(n/2) + n ∈ Θ(n)
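As a worked check of the first example (identifying a, b, and k is the only step added here):
\[
T(n) = 9\,T(n/3) + 1:\quad a = 9,\; b = 3,\; k = 0,\; a > b^{k} \;\Longrightarrow\; T(n) \in \Theta\!\bigl(n^{\log_3 9}\bigr) = \Theta(n^{2}).
\]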

5 Proof of Master Method: T(n) = aT(n/b) + nᵏ
Top level: nᵏ [once] = nᵏ
Next level: (n/b)ᵏ [a times] = nᵏ(a/bᵏ)
⋮
ith level: (n/bⁱ)ᵏ [aⁱ times] = nᵏ(a/bᵏ)ⁱ
Last level: O(1) [a^(log_b n) times] = n^(log_b a) [let a = bᶜ]
There are log_b n levels in all. The total is a geometric series whose highest term dominates the sum, or else a logarithmic number of equal terms.
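Summing the per-level costs gives, up to constant factors, the geometric series
\[
T(n) \;=\; n^{k}\sum_{i=0}^{\log_b n}\Bigl(\tfrac{a}{b^{k}}\Bigr)^{i},
\]
which is Θ(nᵏ) when a < bᵏ (decreasing terms), Θ(nᵏ log n) when a = bᵏ (a logarithmic number of equal terms), and Θ(n^(log_b a)) when a > bᵏ (the last term dominates).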

6 Mergesort algorithm (outline)
Split the array A in two and copy the halves into the arrays B and C.
Sort the arrays B and C recursively.
Merge the sorted arrays B and C back into array A as follows:
Repeat the following until no elements remain in B or C:
Compare the first elements remaining in B and C.
Move the smaller of the two into A (how do we break ties?)
Once one of the arrays is exhausted, move the remaining elements from the other array into A.
Not 'in-place'. (A sketch in code follows below.)
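A minimal Python sketch of this outline (function and variable names are illustrative, not from the text; ties go to B, which keeps the sort stable):

```python
def mergesort(A):
    """Sort list A by splitting it into B and C, recursing, and merging.
    Not 'in-place': it uses the auxiliary arrays B and C, as in the outline."""
    n = len(A)
    if n <= 1:
        return                          # zero or one element: already sorted
    q = n // 2
    B = A[:q]                           # copy the first half
    C = A[q:]                           # copy the second half
    mergesort(B)                        # sort the halves recursively
    mergesort(C)
    i = j = k = 0                       # merge B and C back into A
    while i < len(B) and j < len(C):
        if B[i] <= C[j]:                # ties go to B, keeping the sort stable
            A[k] = B[i]; i += 1
        else:
            A[k] = C[j]; j += 1
        k += 1
    A[k:] = B[i:] if i < len(B) else C[j:]   # copy whatever remains

data = [5, 2, 4, 6, 1, 3, 2, 6]
mergesort(data)
print(data)                             # [1, 2, 2, 3, 4, 5, 6, 6]
```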

7 Example
[Mergesort example on the array 5 2 4 6 1 3 2 6.]
(Presenter note: the diagram in the book is probably better.)

8 Code
Sort(A[0..n−1])                       ► sort an array of size n > 0
    if n > 1 then                     ► there are two or more elements
        let q ← n/2                   ► find the 'middle' of the array
        let B[0..⌊q⌋−1] ← A[0..⌊q⌋−1] ► copy the first half of the array
        let C[0..⌈q⌉−1] ← A[⌊q⌋..n−1] ► copy the second half of the array
        Sort(B[0..⌊q⌋−1])             ► recursively sort the first half
        Sort(C[0..⌈q⌉−1])             ► recursively sort the second half
        Merge(B, C, A)                ► combine them back into A

9 Diagram
[Diagram: basis and induction step for Sort on the array A: divide, sort the halves (conquer), combine.]

10 Correctness
Show: Sort(A[0..n−1]) orders the elements in A[0..n−1] for n > 0.
Basis: If n = 1, then A[0..0] has one element, and is hence sorted.
Induction: If n > 1, then q = n/2 satisfies 0 < ⌊q⌋ ≤ ⌈q⌉ < n, so both halves are nonempty and strictly smaller than the original array. If n is even, then ⌊q⌋ = ⌈q⌉ = q, and hence B and C are of equal size. Otherwise, if n is odd, then ⌊q⌋ + 1 = ⌈q⌉, and B is one element smaller than C.
By the induction hypothesis, the two recursive calls correctly order B[0..⌊q⌋−1] and C[0..⌈q⌉−1].
It remains to show that the merge process works correctly. This is true because the element moved back into A at each step is smaller than or equal to all the remaining elements that have not yet been moved; this uses the fact that B and C are already sorted.
Verify that all the indices are correct.

11 Complexity
Solve the recurrence relation:
T(n) = O(1) for n = 1
T(n) = 2T(n/2) + O(n) otherwise
This is O(n log n) by the recursion tree (log n levels, with O(n) total work per level) or by the master method.
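Applying the master method from slide 4 with a = 2, b = 2, k = 1:
\[
a = b^{k} \;\Longrightarrow\; T(n) \in \Theta\bigl(n^{k}\log n\bigr) = \Theta(n\log n).
\]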

12 Quicksort
Given an array of keyed records, sort the records in order of their keys.
Outline:
Pick a pivot x (try to choose a pivot near the median).
Split the array around the pivot, so that keys < x come before it and keys ≥ x come after it (an 'in-place' algorithm).
Sort the two sides recursively (combining is trivial).

13 Pivoting
Select a pivot point (or use the linear-time median algorithm).
Partition the list so that all the elements in positions before the pivot are smaller than the pivot, and those after the pivot are larger than or equal to it (see the next slide).
Exchange the pivot with the last element in the left (<) half. The pivot is now in its final position.
Sort the positions below the pivot and above the pivot.

14 Partitioning
Move all keys < pivot to the left of it; all keys ≥ pivot end up to the right of it.
Note: this uses A[i] as the pivot key.
SPLIT(A, i, j)                  ► returns pivot point k within A[i..j]
    k ← i                       ► pivot point chosen initially left
    for index ← i + 1 to j      ► scan all remaining keys
        if A[index] < A[i] then ► A[index] belongs below the pivot
            k ← k + 1           ► move pivot point right
            A[k] ↔ A[index]     ► swap with leftmost key ≥ pivot
    A[i] ↔ A[k]                 ► swap pivot with rightmost key < pivot
    return k                    ► the pivot's final position
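A Python sketch of SPLIT together with the quicksort that uses it (a hedged illustration, not the book's code; it follows the pseudocode above, with A[i] as the pivot):

```python
def split(A, i, j):
    """Partition A[i..j] (inclusive) around the pivot A[i].
    Returns the pivot's final position k: keys < pivot end up in A[i..k-1],
    keys >= pivot in A[k+1..j]."""
    k = i                                     # pivot point, initially at the left
    for index in range(i + 1, j + 1):         # scan all remaining keys
        if A[index] < A[i]:                   # A[index] belongs below the pivot
            k += 1                            # move pivot point right
            A[k], A[index] = A[index], A[k]   # swap with leftmost key >= pivot
    A[i], A[k] = A[k], A[i]                   # swap pivot with rightmost key < pivot
    return k

def quicksort(A, i=0, j=None):
    """Sort A[i..j] in place by partitioning and recursing on both sides."""
    if j is None:
        j = len(A) - 1
    if i < j:                                 # at least two elements
        k = split(A, i, j)
        quicksort(A, i, k - 1)                # keys below the pivot
        quicksort(A, k + 1, j)                # keys at or above the pivot

data = [3, 8, 2, 5, 1, 4, 7, 6]
quicksort(data)
print(data)                                   # [1, 2, 3, 4, 5, 6, 7, 8]
```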

15 Diagrammatic proof
Everything is trivial except for showing that SPLIT partitions correctly.
Basis: i = j (just check that it returns i).
Induction: maintain the loop invariant at the top of each iteration of the for loop (careful of empty regions): the pivot x sits at position i, A[i+1..k] holds keys < x, A[k+1..index−1] holds keys ≥ x, and A[index..j] is not yet examined.
The loop starts with k = i and index = i + 1, and ends with index = j + 1.

16 Review of binary tree traversals
Trick: trace around the outside of the tree counterclockwise, starting at the root.
Preorder: visit a node when it is first encountered (going down the left side).
Inorder: visit a node when going under it.
Postorder: visit a node when it is last encountered (going up the right side).
Computing height: h(T) = max{h(T_L), h(T_R)} + 1 if T ≠ ∅, and h(∅) = −1 otherwise. Efficiency: Θ(n). (A sketch in code follows below.)
(Presenter note: redraw the example correctly so the nodes are in the correct horizontal order; explain the non-recursive algorithm for binary tree traversal.)
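A short Python sketch of the three traversals and the height recursion (the Node class is an assumed representation, not from the slides); each function touches every node once, hence Θ(n):

```python
class Node:
    def __init__(self, key, left=None, right=None):
        self.key = key
        self.left = left
        self.right = right

def preorder(t):                 # visit a node before its subtrees
    if t is not None:
        print(t.key, end=" ")
        preorder(t.left)
        preorder(t.right)

def inorder(t):                  # visit a node between its subtrees
    if t is not None:
        inorder(t.left)
        print(t.key, end=" ")
        inorder(t.right)

def postorder(t):                # visit a node after its subtrees
    if t is not None:
        postorder(t.left)
        postorder(t.right)
        print(t.key, end=" ")

def height(t):                   # h(empty tree) = -1
    if t is None:
        return -1
    return max(height(t.left), height(t.right)) + 1

t = Node(2, Node(1), Node(3))
inorder(t); print()              # 1 2 3
print(height(t))                 # 1
```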

17 Extremal elements in a binary search tree
TREE-MINIMUM(x)
    while left[x] ≠ nil
        do x ← left[x]
    return x

TREE-MAXIMUM(x)
    while right[x] ≠ nil
        do x ← right[x]
    return x

18 Successor (predecessor is just the reverse)
TREE-SUCCESSOR(x)                     ► returns nil if x is maximal
    if right[x] ≠ nil then            ► x has a right child
        return TREE-MINIMUM(right[x]) ► go to the right; otherwise …
    repeat                            ► go up until you go right
        y ← x                         ► save value of x
        x ← parent[x]                 ► go up
    until x = nil or left[x] = y      ► reached the root or the successor
    return x                          ► the successor (nil if none)
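A Python sketch of TREE-MINIMUM and TREE-SUCCESSOR, translated directly from the pseudocode (the Node class with a parent pointer is an assumed representation):

```python
class Node:
    def __init__(self, key, parent=None):
        self.key = key
        self.left = None
        self.right = None
        self.parent = parent

def tree_minimum(x):
    while x.left is not None:          # follow left pointers down
        x = x.left
    return x

def tree_successor(x):                 # returns None if x is maximal
    if x.right is not None:            # x has a right child:
        return tree_minimum(x.right)   # successor is the minimum on the right
    while True:                        # otherwise go up until you go right
        y = x                          # save value of x
        x = x.parent                   # go up
        if x is None or x.left is y:   # reached the root or the successor
            return x
```

The in-order walk on the next slide is then: start at tree_minimum(root) and repeatedly apply tree_successor until it returns None.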

19 In-order traversal of a binary search tree
Start at the minimal element (in the left 'corner').
Repeatedly take the successor (until no longer possible).
What is the result of this procedure in a binary search tree? The nodes come out in order, from smallest to largest.
Is there a more efficient procedure?

20 Threads
Goal: compute successors and predecessors without comparisons in O(h) time.
Strategy: the left and right pointers are partial injective functions. Using threads, complete them into total injective functions l and r:
Let l(x) = left[x] if it exists; otherwise, repeatedly take its "right" parent.
Let r(x) = right[x] if it exists; otherwise, repeatedly take its "left" parent.
Now the successor of x is just r∙l⁻¹(x), and its predecessor is l∙r⁻¹(x).

21 Closest pair problem revisited
Problem: Given a set of n points in the plane, find the closest pair.
Sort the points by their horizontal coordinates.
Divide them into two subsets Pₗ and Pᵣ by a vertical line drawn through the median x = m, so that half the points lie to the left of or on the line and half lie to the right of or on the line.
(What if they all lie on that median line?)

22 Closest pair continued
Find recursively the closest pairs for the left and right subsets, with distances dₗ and dᵣ. Set d = min{dₗ, dᵣ}.
We can limit our attention to the points in the symmetric vertical strip S of width 2d around the dividing line as candidates for a closer pair. (The points are stored and processed in increasing order of their y coordinates.)
Scan the points in the strip S from the lowest up. For every point p = (x, y) in the strip, inspect the points in the strip that may be closer to p than d. There can be no more than 5 such points following p on the strip list (a sort of pigeonhole principle)!
Complexity: T(n) = 2T(n/2) + O(n), which is O(n log n). (A sketch in code follows below.)
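A simplified Python sketch of the whole algorithm (illustrative only: it re-sorts the strip by y inside each call, which costs O(n log² n) overall; keeping the points presorted by y, as the slide suggests, restores T(n) = 2T(n/2) + O(n)):

```python
import math

def closest_pair(points):
    """Divide-and-conquer closest-pair distance for a list of (x, y) tuples."""
    pts = sorted(points)                 # sort by x-coordinate once
    return _closest(pts)

def _closest(pts):
    n = len(pts)
    if n <= 3:                           # small case: brute force
        return min((math.dist(pts[i], pts[j])
                    for i in range(n) for j in range(i + 1, n)),
                   default=math.inf)
    mid = n // 2
    m = pts[mid][0]                      # x = m, the dividing vertical line
    d = min(_closest(pts[:mid]), _closest(pts[mid:]))
    # candidate points within distance d of the line, scanned from the lowest up
    strip = sorted((p for p in pts if abs(p[0] - m) < d), key=lambda p: p[1])
    for i, p in enumerate(strip):
        for q in strip[i + 1:]:          # only a constant number of strip points
            if q[1] - p[1] >= d:         # above p can matter, so this loop is O(1)
                break
            d = min(d, math.dist(p, q))
    return d

print(closest_pair([(0, 0), (3, 4), (1, 1), (7, 7), (2, 0.5)]))   # 1.118...
```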

23 Convex hull problem revisited
Problem: Find the smallest convex set that includes a given set of points.
Sort the points lexicographically by their (x, y) coordinate values.
Identify the (lower) leftmost point p₁ and the (upper) rightmost point pₙ.
Find the point pₘₐₓ that is farthest away from the line p₁pₙ.
Show that there are no points to the left of both lines p₁pₘₐₓ and pₘₐₓpₙ.

24 Convex hull continued
Concatenate the upper hull of the points to the left of the line p₁pₘₐₓ with the upper hull of the points to the left of the line pₘₐₓpₙ.
Compute the lower hull in a similar manner.
Combine the upper and lower hulls to get the convex hull of the whole set.
Actually linear time on average! (A sketch in code follows below.)
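A Python sketch of this quickhull scheme (function names and the orientation convention are assumptions; the cross product decides which side of a directed line a point lies on):

```python
def cross(o, a, b):
    """Twice the signed area of triangle (o, a, b):
    positive iff b lies strictly to the left of the directed line o -> a."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def hull_chain(a, b, pts):
    """Hull vertices strictly between a and b, using only the points that lie
    to the left of the directed line a -> b, ordered from a towards b."""
    left = [p for p in pts if cross(a, b, p) > 0]
    if not left:
        return []
    pmax = max(left, key=lambda p: cross(a, b, p))   # farthest from the line a-b
    # no point lies to the left of both a -> pmax and pmax -> b,
    # so it suffices to recurse on those two outer regions
    return hull_chain(a, pmax, left) + [pmax] + hull_chain(pmax, b, left)

def quickhull(points):
    """Convex hull (clockwise: upper chain, then lower chain).
    Assumes at least two distinct points."""
    pts = sorted(set(points))          # lexicographic sort by (x, y)
    p1, pn = pts[0], pts[-1]           # (lower) leftmost and (upper) rightmost
    upper = hull_chain(p1, pn, pts)    # points above the line p1 -> pn
    lower = hull_chain(pn, p1, pts)    # points below it
    return [p1] + upper + [pn] + lower

print(quickhull([(0, 0), (2, 1), (1, 3), (3, 0), (3, 3), (1, 1)]))
# [(0, 0), (1, 3), (3, 3), (3, 0)]
```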

