1
Theory of Algorithms: Divide and Conquer
James Gain and Edwin Blake, {jgain | edwin}@cs.uct.ac.za
Department of Computer Science, University of Cape Town
August - October 2004
2
Objectives
- To introduce the divide-and-conquer mindset
- To show a variety of divide-and-conquer solutions:
  - Merge Sort
  - Quick Sort
  - Multiplication of Large Integers
  - Strassen’s Matrix Multiplication
  - Closest Pair and Convex Hull by Divide-and-Conquer
- To discuss the strengths and weaknesses of a divide-and-conquer strategy
3
Divide-and-Conquer
- The best known algorithm design strategy:
  1. Divide an instance of the problem into two or more smaller instances
  2. Solve the smaller instances recursively
  3. Obtain the solution to the original (larger) instance by combining these solutions
- The Master Theorem applies: recurrences are of the form T(n) = aT(n/b) + f(n), with a ≥ 1, b ≥ 2
- Silly example (addition): a0 + … + a(n-1) = (a0 + … + a(n/2 - 1)) + (a(n/2) + … + a(n-1))
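A minimal sketch of the "silly example" above, showing the three-step template (divide, solve recursively, combine); the function name dc_sum is mine, not from the slides.

```python
# Divide-and-conquer sum of a list: split in half, recurse, combine.
def dc_sum(a):
    n = len(a)
    if n == 0:
        return 0
    if n == 1:               # base case: a single element is its own sum
        return a[0]
    mid = n // 2
    left = dc_sum(a[:mid])   # divide + solve the left half
    right = dc_sum(a[mid:])  # divide + solve the right half
    return left + right      # combine: T(n) = 2T(n/2) + Theta(1)

print(dc_sum([3, 1, 4, 1, 5, 9]))  # 23
```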
4
Divide-and-Conquer Illustrated
[Figure: a problem of size n is divided into subproblem 1 and subproblem 2, each of size n/2; the solutions to the two subproblems are combined into a solution to the original problem.]
5
Mergesort
- Algorithm:
  1. Split A[1..n] in half and copy each half into arrays B[1..⌊n/2⌋] and C[1..⌈n/2⌉]
  2. Recursively MergeSort arrays B and C
  3. Merge the sorted arrays B and C into array A
- Merging: REPEAT until no elements remain in one of B or C
  1. Compare the first elements in the remaining portions of B and C
  2. Copy the smaller one into A, incrementing the index of the corresponding array
  3. Once all elements in one of B or C have been processed, copy the remaining unprocessed elements from the other array into A
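As a sketch of the steps above (not the lecturers' code; merge_sort and merge are my own names):

```python
def merge_sort(a):
    """Sort list a in place: split, recurse on the halves, then merge."""
    n = len(a)
    if n <= 1:
        return
    b, c = a[:n // 2], a[n // 2:]    # step 1: split and copy the halves
    merge_sort(b)                    # step 2: recursively sort each half
    merge_sort(c)
    merge(b, c, a)                   # step 3: merge back into a

def merge(b, c, a):
    """Merge sorted lists b and c into a (len(a) == len(b) + len(c))."""
    i = j = k = 0
    while i < len(b) and j < len(c):
        if b[i] <= c[j]:             # compare the first remaining elements
            a[k] = b[i]; i += 1
        else:
            a[k] = c[j]; j += 1
        k += 1
    rest = b[i:] if i < len(b) else c[j:]
    a[k:] = rest                     # copy the unprocessed tail

nums = [7, 2, 1, 6, 4]
merge_sort(nums)
print(nums)  # [1, 2, 4, 6, 7]
```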
6
Mergesort Example
[Figure: trace of Mergesort on 7 2 1 6 4. The list is recursively split down to single elements, which are then merged back pairwise in sorted order, producing the sorted list 1 2 4 6 7.]
7
Efficiency of Mergesort
- Recurrence: C(n) = 2C(n/2) + C_merge(n) for n > 1, C(1) = 0, where C_merge(n) = n - 1 in the worst case
- All cases have the same efficiency: Θ(n log n)
- The number of comparisons is close to the theoretical minimum for comparison-based sorting: log₂ n! ≈ n lg n - 1.44n
- Space requirement: Θ(n) (NOT in-place)
- Can be implemented without recursion (bottom-up)
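The Θ(n log n) claim can be cross-checked against the Master Theorem form quoted on slide 3; this worked step is my addition, using the standard case where f(n) ∈ Θ(n^d) and d = log_b a.

```latex
% Mergesort recurrence: a = 2, b = 2, f(n) = n - 1 \in \Theta(n^d) with d = 1.
C(n) = 2\,C(n/2) + (n - 1), \qquad \log_b a = \log_2 2 = 1 = d
\;\Rightarrow\; C(n) \in \Theta(n^d \log n) = \Theta(n \log n).
```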
8
Quicksort
- Select a pivot (partitioning element)
- Rearrange the list into two sublists:
  - All elements positioned before the pivot are ≤ the pivot
  - Those positioned after the pivot are > the pivot
  - Requires a pivoting (partition) algorithm
- Exchange the pivot with the last element in the first sublist: the pivot is now in its final position
- QuickSort the two sublists
[Figure: array partitioned into p | A[i] ≤ p | A[i] > p]
9
The Partition Algorithm
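The partition pseudocode is an image on the original slide. The sketch below is my reconstruction, consistent with the i/j trace on the next slide (two indices scanning toward each other, Hoare-style, with the first element as pivot); names and details are mine, not the original.

```python
def partition(a, lo, hi):
    """Hoare-style partition of a[lo..hi] around pivot p = a[lo].
    Returns the final index of the pivot."""
    p = a[lo]
    i, j = lo, hi + 1
    while True:
        i += 1
        while i <= hi and a[i] < p:   # scan right for an element >= p
            i += 1
        j -= 1
        while a[j] > p:               # scan left for an element <= p
            j -= 1
        if i >= j:                    # indices crossed: stop scanning
            break
        a[i], a[j] = a[j], a[i]       # swap the out-of-place pair
    a[lo], a[j] = a[j], a[lo]         # put the pivot into its final position
    return j

def quicksort(a, lo=0, hi=None):
    if hi is None:
        hi = len(a) - 1
    if lo < hi:
        s = partition(a, lo, hi)      # pivot lands at index s
        quicksort(a, lo, s - 1)       # QuickSort the two sublists
        quicksort(a, s + 1, hi)

nums = [5, 3, 1, 9, 8, 2, 7]
quicksort(nums)
print(nums)  # [1, 2, 3, 5, 7, 8, 9]
```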
10
Quicksort Example
[Figure: trace of Quicksort on 5 3 1 9 8 2 7. Partitioning around pivot 5 yields 2 3 1 5 8 9 7 with 5 in its final position; the recursive calls Quicksort(2 3 1) and Quicksort(8 9 7) partition around 2 and 8, leading to further calls on (1), (3), (7) and (9) and the sorted list 1 2 3 5 7 8 9.]
11
Worst Case Efficiency of Quicksort
- In the worst case all splits are completely skewed
- For instance, an already sorted list!
- One subarray is empty, the other is reduced by only one element:
  - Make n + 1 comparisons
  - Exchange the pivot with itself
  - Quicksort left = Ø, right = A[1..n-1]
- C_worst(n) = (n+1) + n + … + 3 = (n+1)(n+2)/2 - 3 ∈ Θ(n²)
12
General Efficiency of Quicksort
- Efficiency cases:
  - Best (split in the middle): Θ(n log n)
  - Worst (sorted array!): Θ(n²)
  - Average (random arrays): Θ(n log n)
- Improvements (in combination 20-25% faster), as sketched below:
  - Better pivot selection: median-of-three partitioning avoids the worst case on sorted files
  - Switch to Insertion sort on small subfiles
  - Elimination of recursion
- Considered the method of choice for internal sorting of large files (n ≥ 10000)
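A hedged sketch of the first two improvements (median-of-three pivot selection and an insertion-sort cutoff), reusing the partition() sketch given earlier; the cutoff value and helper names are illustrative, not from the course.

```python
CUTOFF = 10  # illustrative threshold; the best value is implementation-dependent

def median_of_three(a, lo, hi):
    """Place the median of a[lo], a[mid], a[hi] at a[lo] to serve as pivot."""
    mid = (lo + hi) // 2
    if a[mid] < a[lo]:
        a[mid], a[lo] = a[lo], a[mid]
    if a[hi] < a[lo]:
        a[hi], a[lo] = a[lo], a[hi]
    if a[hi] < a[mid]:
        a[hi], a[mid] = a[mid], a[hi]
    a[lo], a[mid] = a[mid], a[lo]      # median now sits in the pivot slot

def insertion_sort(a, lo, hi):
    for i in range(lo + 1, hi + 1):
        v, j = a[i], i - 1
        while j >= lo and a[j] > v:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = v

def quicksort_improved(a, lo=0, hi=None):
    if hi is None:
        hi = len(a) - 1
    if hi - lo < CUTOFF:               # small subfile: hand off to insertion sort
        insertion_sort(a, lo, hi)
        return
    median_of_three(a, lo, hi)         # better pivot: avoids the sorted-input worst case
    s = partition(a, lo, hi)           # partition() from the earlier sketch
    quicksort_improved(a, lo, s - 1)
    quicksort_improved(a, s + 1, hi)
```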
13
QuickHull Algorithm
- Inspired by Quicksort
  1. Sort the points by increasing x-coordinate values
  2. Identify the leftmost and rightmost extreme points P1 and P2 (both part of the hull)
  3. Compute the upper hull:
     - Find the point Pmax that is farthest away from the line P1P2
     - Quickhull the points to the left of line P1Pmax
     - Quickhull the points to the left of line PmaxP2
  4. Similarly compute the lower hull
[Figure: points P1, Pmax and P2, with the line L through P1 and P2.]
14
Finding the Furthest Point
- Given three points in the plane p1 = (x1, y1), p2 = (x2, y2), p3 = (x3, y3)
- Area of triangle p1p2p3 = ½|D|, where D is the determinant of the 3×3 matrix with rows (x1 y1 1), (x2 y2 1), (x3 y3 1):
  D = x1y2 + x3y1 + x2y3 - x3y2 - x2y1 - x1y3
- Properties of D:
  - D is positive iff p3 is to the left of the directed line p1p2
  - |D| correlates with the distance of p3 from the line p1p2
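A compact sketch of QuickHull built on the determinant D above. The function names and the use of Python tuples for points are my own choices, and collinear points are handled only loosely.

```python
def signed_area_d(p1, p2, p3):
    """D = x1*y2 + x3*y1 + x2*y3 - x3*y2 - x2*y1 - x1*y3.
    Positive iff p3 lies to the left of the directed line p1->p2;
    its magnitude grows with the distance of p3 from that line."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return x1*y2 + x3*y1 + x2*y3 - x3*y2 - x2*y1 - x1*y3

def hull_side(p1, p2, points):
    """Hull vertices among the points strictly to the left of p1->p2."""
    left = [p for p in points if signed_area_d(p1, p2, p) > 0]
    if not left:
        return []                     # no points outside: p1p2 is a hull edge
    pmax = max(left, key=lambda p: signed_area_d(p1, p2, p))  # farthest point
    return hull_side(p1, pmax, left) + [pmax] + hull_side(pmax, p2, left)

def quickhull(points):
    pts = sorted(set(points))         # sort by x; p1 leftmost, p2 rightmost
    if len(pts) < 3:
        return pts
    p1, p2 = pts[0], pts[-1]
    upper = hull_side(p1, p2, pts)    # upper hull: points left of p1->p2
    lower = hull_side(p2, p1, pts)    # lower hull: points left of p2->p1
    return [p1] + upper + [p2] + lower

print(quickhull([(0, 0), (3, 1), (1, 1), (2, 4), (4, 0), (2, 1)]))
# [(0, 0), (2, 4), (4, 0)]
```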
15
Efficiency of Quickhull
- Finding the point farthest away from line P1P2 is linear in the number of points
- This gives the same efficiency as Quicksort:
  - Worst case: Θ(n²)
  - Average case: Θ(n log n)
  - If an initial sort is required, it can be accomplished in Θ(n log n): no increase in asymptotic efficiency class
- Alternative divide-and-conquer convex hull algorithms: Graham’s scan and DCHull
  - Also Θ(n log n), but with lower coefficients
16
Strassen’s Matrix Multiplication
- Strassen observed [1969] that the product of two 2×2 matrices can be computed as follows:
  [C00 C01; C10 C11] = [A00 A01; A10 A11] * [B00 B01; B10 B11]
  where C00 = M1 + M4 - M5 + M7, C01 = M3 + M5, C10 = M2 + M4, C11 = M1 + M3 - M2 + M6
- Operation count:
  - Each of M1, …, M7 requires 1 multiplication and 1 or 2 additions/subtractions
  - Total = 7 multiplications and 18 additions/subtractions
  - Compared with brute force, which requires 8 multiplications and 4 additions/subtractions
- For larger matrices, the n/2 × n/2 submatrices are computed recursively by the same method
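The slide does not list the seven products. For reference, a sketch of the standard Strassen formulas for the 2×2 case (my addition; the scalars here stand in for the n/2 × n/2 blocks of the recursive algorithm):

```python
def strassen_2x2(A, B):
    """Multiply 2x2 matrices A and B with 7 multiplications (Strassen, 1969).
    In the recursive algorithm the entries would be n/2 x n/2 submatrices
    combined with matrix addition and subtraction."""
    (a00, a01), (a10, a11) = A
    (b00, b01), (b10, b11) = B
    m1 = (a00 + a11) * (b00 + b11)
    m2 = (a10 + a11) * b00
    m3 = a00 * (b01 - b11)
    m4 = a11 * (b10 - b00)
    m5 = (a00 + a01) * b11
    m6 = (a10 - a00) * (b00 + b01)
    m7 = (a01 - a11) * (b10 + b11)
    return ((m1 + m4 - m5 + m7, m3 + m5),
            (m2 + m4,           m1 + m3 - m2 + m6))

print(strassen_2x2(((1, 2), (3, 4)), ((5, 6), (7, 8))))  # ((19, 22), (43, 50))
```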
17
Efficiency of Strassen’s Algorithm
- If n is not a power of 2, the matrices can be padded with zeros
- Number of multiplications: M(n) = 7M(n/2) for n > 1, M(1) = 1
  - Set n = 2^k and apply backward substitution:
    M(2^k) = 7M(2^(k-1)) = 7[7M(2^(k-2))] = 7^2 M(2^(k-2)) = … = 7^k
  - So M(n) = 7^(log₂ n) = n^(log₂ 7) ≈ n^2.807
- Number of additions: A(n) ∈ Θ(n^(log₂ 7))
- Other algorithms come closer to the lower limit of n² multiplications, but are even more complicated
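Not on the slide, but as a cross-check with the Master Theorem form from slide 3, the full recurrence counting all arithmetic operations lands in the same class (my addition):

```latex
% Strassen: T(n) = 7\,T(n/2) + \Theta(n^2), so a = 7, b = 2, f(n) \in \Theta(n^d) with d = 2.
% Since \log_b a = \log_2 7 \approx 2.807 > d, the recursive part dominates:
T(n) \in \Theta\!\left(n^{\log_2 7}\right) \approx \Theta\!\left(n^{2.807}\right).
```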
18
Strengths and Weaknesses of Divide-and-Conquer
- Strengths:
  - Generally improves on brute force by one base efficiency class
  - Easy to analyse using the Master Theorem
- Weaknesses:
  - Often requires recursion, which introduces overheads
  - Can be inapplicable, or inferior to simpler algorithmic solutions