Divide-and-Conquer
Divide-and-Conquer
The best-known algorithm design strategy:
1. Divide an instance of the problem into two or more smaller instances.
2. Solve the smaller instances recursively.
3. Obtain a solution to the original (larger) instance by combining these solutions.
A. Levitin, "Introduction to the Design & Analysis of Algorithms," 3rd ed., Ch. 5. ©2012 Pearson Education, Inc., Upper Saddle River, NJ. All Rights Reserved.
Divide-and-Conquer: Famous Example
a problem of size n
→ subproblem 1 of size n/2 and subproblem 2 of size n/2
→ a solution to subproblem 1 and a solution to subproblem 2
→ a solution to the original problem
General Concept of Divide & Conquer
Given a function to compute on n inputs, the divide-and-conquer strategy consists of:
- splitting the input into k distinct subsets, 1 < k < n, yielding k subproblems;
- solving these subproblems;
- combining the sub-solutions into a solution of the whole problem.
If the subproblems are relatively large, divide-and-conquer is applied again; if the subproblems are small, they are solved without further splitting.
The Divide and Conquer Algorithm

Divide_Conquer(problem P) {
    if Small(P)
        return SimpleSolution(P);
    else {
        divide P into smaller instances P1, P2, …, Pk, k ≥ 2;
        apply Divide_Conquer to each of these subproblems;
        return Combine(Divide_Conquer(P1), Divide_Conquer(P2), …, Divide_Conquer(Pk));
    }
}
Three Steps of the Divide and Conquer Approach
1. Divide the problem into two or more smaller subproblems.
2. Conquer the subproblems by solving them recursively.
3. Combine the solutions to the subproblems into the solution for the original problem.
Divide and Conquer Runtime
(The Master Theorem is optional in our course.)
An instance of size n can be divided into b instances of size n/b, with a of them needing to be solved (a and b are constants; a ≥ 1 and b > 1). The runtime is then given by the general divide-and-conquer recurrence
T(n) = a·T(n/b) + f(n),
where f(n) is the time spent on dividing the problem and combining the solutions.
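For reference (optional, as noted above), the Master Theorem resolves this recurrence when f(n) ∈ Θ(n^d):

```latex
T(n) = a\,T(n/b) + f(n), \qquad f(n) \in \Theta(n^d),\ d \ge 0
\quad\Longrightarrow\quad
T(n) \in
\begin{cases}
\Theta(n^d) & \text{if } a < b^d,\\
\Theta(n^d \log n) & \text{if } a = b^d,\\
\Theta(n^{\log_b a}) & \text{if } a > b^d.
\end{cases}
```

For example, mergesort has a = b = 2 and d = 1, so a = b^d and T(n) ∈ Θ(n log n).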
Divide-and-Conquer Examples
- Sorting: mergesort and quicksort
- Counting inversions
- Closest-pair algorithm
Mergesort
- Split array A[0..n−1] into two about equal halves and make copies of each half in arrays B and C.
- Sort arrays B and C recursively.
- Merge sorted arrays B and C into array A as follows. Repeat the following until no elements remain in one of the arrays:
  - compare the first elements in the remaining unprocessed portions of the arrays;
  - copy the smaller of the two into A, while incrementing the index indicating the unprocessed portion of that array.
  Once all elements in one of the arrays are processed, copy the remaining unprocessed elements from the other array into A.
Pseudocode of Mergesort
Pseudocode of Merge
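The pseudocode figures for Mergesort and Merge are not reproduced above, so here is a short Python sketch that follows the slides' description (function and variable names are mine, not Levitin's):

```python
def merge(b, c, a):
    """Merge sorted arrays b and c into a, as described on the Mergesort slide."""
    i = j = k = 0
    while i < len(b) and j < len(c):
        # Compare the first elements of the unprocessed portions,
        # copy the smaller into a, and advance that array's index.
        if b[i] <= c[j]:
            a[k] = b[i]; i += 1
        else:
            a[k] = c[j]; j += 1
        k += 1
    # One array is exhausted: copy the remaining unprocessed elements.
    rest = b[i:] if i < len(b) else c[j:]
    a[k:k + len(rest)] = rest

def mergesort(a):
    """Sort list a by splitting it into halves, recursing, and merging."""
    if len(a) <= 1:
        return
    mid = len(a) // 2
    b, c = a[:mid], a[mid:]   # copies of the two halves
    mergesort(b)
    mergesort(c)
    merge(b, c, a)

data = [8, 3, 2, 9, 7, 1, 5, 4]
mergesort(data)
print(data)  # [1, 2, 3, 4, 5, 7, 8, 9]
```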
Mergesort Example
Go over this example in detail, then do another example of merging, something like: ( ) (3 4 6).
Analysis of Mergesort
- All cases have the same efficiency: Θ(n log n).
- The number of comparisons in the worst case is close to the theoretical minimum for comparison-based sorting: ⌈log₂ n!⌉ ≈ n log₂ n − 1.44n.
- Space requirement: Θ(n) (not in-place).
- Stable? Yes.
- Can be implemented without recursion (bottom-up).
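The Θ(n log n) figure can also be seen from the worst-case comparison recurrence (for n a power of 2):

```latex
C_{\mathrm{worst}}(n) = 2\,C_{\mathrm{worst}}(n/2) + n - 1, \qquad C_{\mathrm{worst}}(1) = 0,
\quad\Longrightarrow\quad
C_{\mathrm{worst}}(n) = n\log_2 n - n + 1 \in \Theta(n \log n).
```

As a spot check, C(4) = 4·2 − 4 + 1 = 5: two size-2 merges cost one comparison each, and merging two sorted pairs costs at most three.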
Quicksort
- Select a pivot (partitioning element): here, the first element.
- Rearrange the list so that all the elements in the first s positions are smaller than or equal to the pivot and all the elements in the remaining n−s positions are larger than or equal to the pivot (see next slide for an algorithm).
- Exchange the pivot with the last element in the first (i.e., ≤) subarray; the pivot is now in its final position.
- Sort the two subarrays recursively.
[Figure: array partitioned as  A[i] ≤ p | p | A[i] ≥ p]
Hoare's Partitioning Algorithm
Two-directional scan; full details in [L].
Hoare’s Partitioning Algorithm
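The partitioning pseudocode figure is not reproduced above. Here is a Python sketch of a Hoare-style two-directional scan with the first element as pivot, plus the quicksort driver; it is an illustration of the idea, not Levitin's exact pseudocode:

```python
def hoare_partition(a, l, r):
    """Partition a[l..r] around pivot a[l] with a two-directional scan.

    Returns the pivot's final index s, with a[l..s-1] <= a[s] <= a[s+1..r].
    """
    p = a[l]
    i, j = l, r + 1
    while True:
        i += 1
        while i < r and a[i] < p:   # scan right for an element >= pivot
            i += 1
        j -= 1
        while a[j] > p:             # scan left for an element <= pivot
            j -= 1
        if i >= j:                  # scans have crossed: stop swapping
            break
        a[i], a[j] = a[j], a[i]
    a[l], a[j] = a[j], a[l]         # place the pivot in its final position
    return j

def quicksort(a, l=0, r=None):
    """Sort a[l..r] in place by partitioning and recursing on both sides."""
    if r is None:
        r = len(a) - 1
    if l < r:
        s = hoare_partition(a, l, r)
        quicksort(a, l, s - 1)
        quicksort(a, s + 1, r)

nums = [5, 3, 1, 9, 8, 2, 4, 7]
quicksort(nums)
print(nums)  # [1, 2, 3, 4, 5, 7, 8, 9]
```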
Quicksort Example
Quicksort Example
Analysis of Quicksort
- Best case: split in the middle: Θ(n log n).
- Worst case: sorted array! Θ(n²).
- Average case: random arrays: Θ(n log n).
Improvements:
- better pivot selection: median-of-three partitioning;
- switch to insertion sort on small subfiles;
- elimination of recursion.
Together these give a 20-25% improvement.
- Stable? No.
[K] 5.3 Counting Inversions
Counting Inversions
A music site tries to match your song preferences with others': you rank n songs, and the site consults its database to find people with similar tastes (amazon.com, launch.com, restaurants, movies, …).
Similarity metric: the number of inversions between two rankings.
- My rank: 1, 2, …, n.
- Your rank: a1, a2, …, an.
- Songs i and j are inverted if i < j but ai > aj.
Example (songs A-E):
  Me:  1 2 3 4 5
  You: 1 3 4 2 5
  Inversions: 3-2, 4-2.
Brute force: check all Θ(n²) pairs i and j.
Note: there can be a quadratic number of inversions, so an asymptotically faster algorithm must compute the total number without even looking at each inversion individually.
Applications
- Voting theory.
- Collaborative filtering.
- Measuring the "sortedness" of an array.
- Sensitivity analysis of Google's ranking function.
- Rank aggregation for meta-searching on the Web.
- Nonparametric statistics (e.g., Kendall's tau distance).
- etc.
Counting Inversions: Divide-and-Conquer
1 5 4 8 10 2 6 9 12 11 3 7
Counting Inversions: Divide-and-Conquer
Divide: separate the list into two pieces. Divide: O(1).
1 5 4 8 10 2 | 6 9 12 11 3 7
Counting Inversions: Divide-and-Conquer
Divide: separate the list into two pieces. Divide: O(1).
Conquer: recursively count inversions in each half. Conquer: 2T(n/2).
1 5 4 8 10 2 | 6 9 12 11 3 7
5 blue-blue inversions: 5-4, 5-2, 4-2, 8-2, 10-2
8 green-green inversions: 6-3, 9-3, 9-7, 12-3, 12-7, 12-11, 11-3, 11-7
Counting Inversions: Divide-and-Conquer
Divide: separate the list into two pieces. Divide: O(1).
Conquer: recursively count inversions in each half. Conquer: 2T(n/2).
Combine: count inversions where ai and aj are in different halves, and return the sum of the three quantities.
1 5 4 8 10 2 | 6 9 12 11 3 7
9 blue-green inversions: 5-3, 4-3, 8-6, 8-3, 8-7, 10-6, 10-9, 10-3, 10-7
Total = 5 + 8 + 9 = 22.
Counting Inversions: Combine
Combine: count blue-green inversions, assuming each half is sorted.
- Count inversions where ai and aj are in different halves. Count: O(n).
- Merge the two sorted halves into a sorted whole (to maintain the sorted invariant). Merge: O(n).
Example: merging 3 7 10 14 18 19 with 2 11 16 17 23 25:
element 2 is passed by 6 remaining left elements, 11 by 3, 16 by 2, and 17 by 2, giving 13 blue-green inversions.
Merged result: 2 3 7 10 11 14 16 17 18 19 23 25.
Counting Inversions: Implementation
Pre-condition [Merge-and-Count]: A and B are sorted.
Post-condition [Sort-and-Count]: L is sorted.

Sort-and-Count(L) {
    if list L has one element
        return 0 and the list L
    Divide the list into two halves A and B
    (rA, A) ← Sort-and-Count(A)
    (rB, B) ← Sort-and-Count(B)
    (rC, L) ← Merge-and-Count(A, B)
    return r = rA + rB + rC and the sorted list L
}
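A Python sketch of Sort-and-Count and Merge-and-Count (names follow the pseudocode above; the implementation details are mine):

```python
def merge_and_count(a, b):
    """Merge sorted lists a and b; whenever an element of b is taken,
    every element still waiting in a forms one blue-green inversion with it."""
    i = j = inv = 0
    merged = []
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:
            merged.append(a[i]); i += 1
        else:
            merged.append(b[j]); j += 1
            inv += len(a) - i        # all remaining a-elements are inverted with b[j]
    merged += a[i:] + b[j:]          # copy whichever half is left over
    return inv, merged

def sort_and_count(lst):
    """Return (number of inversions in lst, sorted copy of lst)."""
    if len(lst) <= 1:
        return 0, lst
    mid = len(lst) // 2
    ra, a = sort_and_count(lst[:mid])
    rb, b = sort_and_count(lst[mid:])
    rc, merged = merge_and_count(a, b)
    return ra + rb + rc, merged

print(sort_and_count([1, 5, 4, 8, 10, 2, 6, 9, 12, 11, 3, 7])[0])  # 22
```

On the running example from the slides this reports 22 inversions (5 blue-blue + 8 green-green + 9 blue-green).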
Closest Pair of Points
The divide-and-conquer idea dates back to Julius Caesar ("divide and conquer" is Caesar's other famous quote, alongside "I came, I saw, I conquered"). His favorite war tactic was to divide an opposing army into two halves and then assault one half with his entire force.
Closest Pair of Points
Closest pair. Given n points in the plane, find a pair with the smallest Euclidean distance between them.
Fundamental geometric primitive:
- graphics, computer vision, geographic information systems, molecular modeling, air traffic control;
- special case of nearest neighbor, Euclidean MST, Voronoi (the fast closest-pair algorithm inspired fast algorithms for these problems).
Brute force. Check all pairs of points p and q with Θ(n²) comparisons.
1-D version. O(n log n) is easy if the points are on a line.
Assumption. No two points have the same x coordinate (only to make the presentation cleaner).
Closest Pair of Points: Algorithm
Divide: draw a vertical line L so that roughly n/2 points lie on each side.
Closest Pair of Points: Algorithm
Divide: draw a vertical line L so that roughly n/2 points lie on each side.
Conquer: find the closest pair on each side recursively (distances 21 and 12 in the figure).
Closest Pair of Points: Algorithm
Divide: draw a vertical line L so that roughly n/2 points lie on each side.
Conquer: find the closest pair on each side recursively (distances 21 and 12 in the figure).
Combine: find the closest pair with one point on each side (distance 8 in the figure); this step seems like Θ(n²) at first. Return the best of the 3 solutions.
Closest Pair of Points
Find the closest pair with one point on each side, assuming that distance < δ = min(12, 21).
Closest Pair of Points
Find the closest pair with one point on each side, assuming that distance < δ = min(12, 21).
Observation: we only need to consider points within δ of line L.
Closest Pair of Points
Find the closest pair with one point on each side, assuming that distance < δ = min(12, 21).
Observation: we only need to consider points within δ of line L.
Sort the points in the 2δ-strip by their y coordinate.
Closest Pair of Points
Find the closest pair with one point on each side, assuming that distance < δ = min(12, 21).
Observation: we only need to consider points within δ of line L.
Sort the points in the 2δ-strip by their y coordinate.
Only check distances to those within 7 positions in the sorted list!
Closest Pair of Points
Def. Let si be the point in the 2δ-strip with the ith smallest y-coordinate.
Claim. If |i − j| ≥ 8, then the distance between si and sj is at least δ.
Pf. Divide the strip into ½δ-by-½δ boxes.
- No two points lie in the same ½δ-by-½δ box.
- Two points at least 2 rows apart have distance at least 2(½δ) = δ. ▪
(A more careful argument reduces the neighbors to check to 6.)
Note: there are no points on the median line; the assumption that no two points have the same x coordinate ensures that the median line can be drawn in this way.
The separation can be done in O(n) using a linear-time median algorithm, giving the recurrence T(n) = 2T(n/2) + O(n), which solves to O(n log n).
Finding the Closest Pair of Points
Problem: Given n ordered pairs (x1, y1), (x2, y2), …, (xn, yn), find the distance between the two points in the set that are closest together.
Closest-Points Problem
Brute-Force Algorithm: iterate through all possible pairs of points, calculating the distance between each pair; any time you see a distance shorter than the shortest distance seen so far, update the shortest distance.
Since computing the distance between two points takes O(1) time, and there are n(n−1)/2 = Θ(n²) distinct pairs of points, the running time of this algorithm is Θ(n²).
Can we do better?
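The brute-force idea above can be sketched in a few lines of Python (math.dist computes Euclidean distance; the function name is mine):

```python
from itertools import combinations
from math import dist, inf

def closest_pair_brute(points):
    """Check all n(n-1)/2 distinct pairs: Theta(n^2) time."""
    best = inf
    for p, q in combinations(points, 2):
        d = dist(p, q)
        if d < best:          # shorter than the shortest seen: update
            best = d
    return best

print(closest_pair_brute([(0, 0), (3, 4), (1, 1)]))  # sqrt(2) ~ 1.4142
```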
Closest-Points Problem
Here's the idea:
1. Split the set of n points into 2 halves by a vertical line. Do this by sorting all the points by their x-coordinate and then picking the middle point and drawing a vertical line just to the right of it.
2. Recursively solve the problem on both sets of points.
3. Return the smaller of the two values.
What's the problem with this idea?
Closest-Points Problem
The problem is that the actual shortest distance between any 2 of the original points MIGHT BE between a point in the 1st set and a point in the 2nd set, in which case (as in the figure) we would get a shortest distance of 3 instead of 1.
We must adapt our approach:
In step 3, we save the smaller of the two values (called δ); then we have to check whether there are points from different sides that are closer than δ apart.
Do we need to search through all possible pairs of points from the 2 different sides? NO: we need only consider points that are within δ of our dividing line.
Closest-Points Problem
However, one could construct a case where ALL the points on each side are within δ of the vertical line. That case is as bad as our original idea, where we'd have to compare every pair of points from the two different groups.
But wait! Is it really necessary to compare each point on one side with every point on the other side?
Closest-Points Problem
Consider the rectangle around the dividing line constructed from eight δ/2 × δ/2 squares. Note that the diagonal of each square is δ/√2, which is less than δ.
Since each square lies on a single side of the dividing line, at most one point lies in each box: if 2 points were within a single box, the distance between those 2 points would be less than δ.
Therefore, there are at MOST 7 other points that could possibly be at distance less than δ from a given point and have a greater y coordinate than that point. (We assume our point is on the bottom row of this grid; we draw the grid that way.)
Closest-Points Problem
Now, how do we know which 7 points to compare a given point with? The idea: as you process the points recursively, SORT them based on the y-coordinate; then, for a given point within the strip, you only need to compare it with the next 7 points.
ClosestPair(ptsByX, ptsByY, n) {
    if (n = 1) return ∞
    if (n = 2) return distance(ptsByX[0], ptsByX[1])

    // Divide into two subproblems
    mid ← ⌈n/2⌉ − 1
    copy ptsByX[0 .. mid] into new array XL, in x order
    copy ptsByX[mid+1 .. n−1] into new array XR, in x order
    copy ptsByY into arrays YL and YR, in y order, s.t. XL and YL refer to the same points, as do XR and YR

    // Conquer
    distL ← ClosestPair(XL, YL, ⌈n/2⌉)
    distR ← ClosestPair(XR, YR, ⌊n/2⌋)

    // Combine
    midPoint ← ptsByX[mid]
    lrDist ← min(distL, distR)
    construct array yStrip, in increasing y order, of all points p in ptsByY s.t. |p.x − midPoint.x| < lrDist

    // Check yStrip
    minDist ← lrDist
    for (j ← 0; j ≤ yStrip.length − 2; j++) {
        k ← j + 1
        while (k ≤ yStrip.length − 1 and yStrip[k].y − yStrip[j].y < lrDist) {
            d ← distance(yStrip[j], yStrip[k])
            minDist ← min(minDist, d)
            k++
        }
    }
    return minDist
}
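Putting the pieces together, here is a self-contained Python sketch of the divide-and-conquer closest-pair algorithm. It assumes distinct points (matching the slides' assumption that no two points coincide on x); all names are mine:

```python
from math import dist, inf

def closest_pair(points):
    """Return the smallest pairwise distance among distinct 2-D points."""
    pts_by_x = sorted(points)                      # sorted by x once, up front
    pts_by_y = sorted(points, key=lambda p: p[1])  # sorted by y once, up front
    return _closest(pts_by_x, pts_by_y)

def _closest(px, py):
    n = len(px)
    if n <= 3:                                     # base case: brute force
        return min((dist(p, q) for i, p in enumerate(px) for q in px[i + 1:]),
                   default=inf)
    mid = n // 2
    midpoint = px[mid]
    xl, xr = px[:mid], px[mid:]
    left_set = set(xl)                             # requires distinct points
    yl = [p for p in py if p in left_set]          # keep y order on each side
    yr = [p for p in py if p not in left_set]
    d = min(_closest(xl, yl), _closest(xr, yr))    # conquer both halves
    # Combine: scan the 2d-wide strip in y order.
    strip = [p for p in py if abs(p[0] - midpoint[0]) < d]
    best = d
    for j, p in enumerate(strip):
        for q in strip[j + 1:j + 8]:               # at most 7 neighbors matter
            if q[1] - p[1] >= best:                # too far apart in y: stop
                break
            best = min(best, dist(p, q))
    return best

pts = [(0, 0), (3, 4), (1, 1), (10, 10), (10.5, 10), (-3, 2), (7, 1)]
print(closest_pair(pts))  # 0.5
```

The two up-front sorts cost O(n log n), and each level of recursion does O(n) work, so the whole algorithm runs in O(n log n).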
Reading
- [L] Chapter 5; Section 5.3 is NOT required.
- All about Convex-Hull is optional.
- [K] Section 5.3.
- All about Trees is not required <yet>.
- All about Graphs is not required <yet>.
- Repeated topics in: [K] 5.4 and [C] Chapter 7.
Have a lot of fun solving Assignment 5
- Questions 1, 2, and 3 are about lecture 9.
- Questions 4 and 5 are about lecture 10.
- Question 6 is a better partitioning idea [3 zones instead of 2].
- Questions 7 and 8 are about lecture 11.