1 Chapter 5 Divide and Conquer Slides by Kevin Wayne. Copyright © 2005 Pearson-Addison Wesley. All rights reserved.


2 Divide-and-Conquer
Divide-and-conquer.
- Break up a problem into several parts.
- Solve each part recursively.
- Combine solutions to sub-problems into an overall solution.
Most common usage.
- Break up a problem of size n into two equal parts of size ½n.
- Solve the two parts recursively.
- Combine the two solutions into an overall solution in linear time.
Consequence.
- Brute force: n².
- Divide-and-conquer: n log n.

5.1 Mergesort

4 Sorting
Sorting. Given n elements, rearrange them in ascending order.
Obvious sorting applications: list files in a directory, organize an MP3 library, list names in a phone book, display Google PageRank results.
Problems that become easier once sorted: find the median, find the closest pair, binary search in a database, identify statistical outliers, find duplicates in a mailing list.
Non-obvious sorting applications: data compression, computer graphics, interval scheduling, computational biology, minimum spanning tree, supply chain management, simulate a system of particles, book recommendations on Amazon, load balancing on a parallel computer, ...

5 Mergesort
Mergesort. [John von Neumann, 1945]
- Divide array into two halves. [O(1)]
- Recursively sort each half. [2T(n/2)]
- Merge the two halves to make a sorted whole. [O(n)]
Example: A L G O R I T H M S → divide → A L G O R | I T H M S → sort → A G L O R | H I M S T → merge → A G H I L M O R S T.

6 Merging
Merging. Combine two pre-sorted lists into a sorted whole.
How to merge efficiently?
- Linear number of comparisons.
- Use a temporary array.
Challenge for the bored. In-place merge, using only a constant amount of extra storage. [Kronrod, 1969]
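
To make the divide, sort, and merge steps concrete, here is a short Python sketch of mergesort with the linear-time merge described above. It is an illustrative implementation, not the textbook's code; the names merge and merge_sort are ours.

def merge(a, b):
    # Merge two pre-sorted lists into one sorted list using a temporary array.
    merged = []
    i = j = 0
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:
            merged.append(a[i]); i += 1
        else:
            merged.append(b[j]); j += 1
    merged.extend(a[i:])   # at most one of these two is non-empty
    merged.extend(b[j:])
    return merged

def merge_sort(xs):
    # Divide the array, recursively sort each half, and merge.
    if len(xs) <= 1:
        return list(xs)
    mid = len(xs) // 2
    return merge(merge_sort(xs[:mid]), merge_sort(xs[mid:]))

For the slide's example, merge_sort(list("ALGORITHMS")) returns the letters A G H I L M O R S T.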

7 A Useful Recurrence Relation
Def. T(n) = number of comparisons to mergesort an input of size n.
Mergesort recurrence.
    T(n) = 0 if n = 1
    T(n) ≤ T(⌈n/2⌉) + T(⌊n/2⌋) + n otherwise
Solution. T(n) = O(n log₂ n).
Assorted proofs. We describe several ways to prove this recurrence. Initially we assume n is a power of 2 and replace ≤ with =.
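
As a quick sanity check (ours, not part of the slides), the recurrence with T(1) = 0 and equality can be evaluated directly and compared against n log₂ n for powers of 2:

import math

def T(n):
    # Mergesort comparison recurrence, assuming n is a power of 2 and T(1) = 0.
    return 0 if n == 1 else 2 * T(n // 2) + n

for k in range(1, 11):
    n = 2 ** k
    assert T(n) == n * int(math.log2(n))   # T(n) = n log2 n exactly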

8 Direct Proof by Recursion Tree
The recursion tree has depth log₂ n. Level i contains 2^i subproblems of size n/2^i, each contributing n/2^i comparisons, so the work at level i is 2^i · (n/2^i) = n. Summing n over all log₂ n levels gives T(n) = n log₂ n.

9 Proof by Telescoping
Claim. If T(n) satisfies this recurrence, then T(n) = n log₂ n. [assumes n is a power of 2]
Pf. For n > 1, divide the recurrence through by n and telescope (derivation reconstructed below).
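
The telescoping derivation, reconstructed here in plain text (the slide shows it as a displayed equation):

    T(n)/n = T(n/2)/(n/2) + 1
           = T(n/4)/(n/4) + 1 + 1
           = ...
           = T(1)/1 + 1 + ... + 1   (log₂ n ones)
           = log₂ n

so T(n) = n log₂ n. ▪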

10 Proof by Induction
Claim. If T(n) satisfies this recurrence, then T(n) = n log₂ n. [assumes n is a power of 2]
Pf. (by induction on n)
- Base case: n = 1.
- Inductive hypothesis: T(n) = n log₂ n.
- Goal: show that T(2n) = 2n log₂(2n). (The inductive step is reconstructed below.)
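
The inductive step itself, reconstructed from the recurrence and the hypothesis:

    T(2n) = 2 T(n) + 2n
          = 2n log₂ n + 2n
          = 2n (log₂(2n) − 1) + 2n
          = 2n log₂(2n). ▪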

11 Analysis of Mergesort Recurrence
Claim. If T(n) satisfies the following recurrence, then T(n) ≤ n ⌈lg n⌉.
    T(n) = 0 if n = 1
    T(n) ≤ T(⌈n/2⌉) + T(⌊n/2⌋) + n otherwise
Pf. (by induction on n)
- Base case: n = 1.
- Define n₁ = ⌊n/2⌋, n₂ = ⌈n/2⌉.
- Induction step: assume true for 1, 2, ..., n−1.

5.3 Counting Inversions

13 Counting Inversions
Music site tries to match your song preferences with others.
- You rank n songs.
- Music site consults its database to find people with similar tastes.
Similarity metric: number of inversions between two rankings.
- My rank: 1, 2, …, n.
- Your rank: a_1, a_2, …, a_n.
- Songs i and j are inverted if i < j but a_i > a_j.
Brute force: check all Θ(n²) pairs i and j.
Example. Songs A B C D E; my rank: 1 2 3 4 5; your rank: 1 3 4 2 5. Inversions: 3-2, 4-2. Only the song you ranked #2 is out of place, yet one out-of-place element can produce more than one inversion.

14 Applications
Applications.
- Voting theory.
- Collaborative filtering.
- Measuring the "sortedness" of an array.
- Sensitivity analysis of Google's ranking function.
- Rank aggregation for meta-searching on the Web.
- Nonparametric statistics (e.g., Kendall's tau distance).

15 Counting Inversions: Divide-and-Conquer
Divide-and-conquer.

16 Counting Inversions: Divide-and-Conquer
Divide-and-conquer.
- Divide: separate the list into two pieces. [Divide: O(1)]

17 Counting Inversions: Divide-and-Conquer
Divide-and-conquer.
- Divide: separate the list into two pieces. [Divide: O(1)]
- Conquer: recursively count inversions in each half. [Conquer: 2T(n/2)]
Example (list 1 5 4 8 10 2 | 6 9 12 11 3 7): 5 blue-blue inversions (5-4, 5-2, 4-2, 8-2, 10-2) and 8 green-green inversions (6-3, 9-3, 9-7, 12-3, 12-7, 12-11, 11-3, 11-7).

18 Counting Inversions: Divide-and-Conquer
Divide-and-conquer.
- Divide: separate the list into two pieces. [Divide: O(1)]
- Conquer: recursively count inversions in each half. [Conquer: 2T(n/2)]
- Combine: count inversions where a_i and a_j are in different halves, and return the sum of the three quantities. [Combine: ???]
Example (continued): 5 blue-blue inversions, 8 green-green inversions, and 9 blue-green inversions (5-3, 4-3, 8-6, 8-3, 8-7, 10-6, 10-9, 10-3, 10-7). Total = 5 + 8 + 9 = 22.

19 Counting Inversions: Combine
Combine: count blue-green inversions.
- Assume each half is sorted.
- Count inversions where a_i and a_j are in different halves. [Count: O(n)]
- Merge the two sorted halves into a sorted whole. [Merge: O(n), to maintain the sorted invariant]
(The slide's accompanying example shows 13 blue-green inversions.)

20 Merge-and-Count

def merge_and_count(A, B):
    # Merge sorted lists A and B; count pairs (a, b) with a in A, b in B, and a > b.
    a_ind = b_ind = 0
    count = 0
    C = []
    while a_ind < len(A) and b_ind < len(B):
        # Add the smaller of A[a_ind], B[b_ind] to C.
        if B[b_ind] < A[a_ind]:
            C.append(B[b_ind])
            count += len(A) - a_ind   # B[b_ind] is inverted with every element left in A
            b_ind += 1
        else:
            C.append(A[a_ind])
            a_ind += 1
    # Now one of the lists is empty; append the remaining elements of the other.
    C.extend(A[a_ind:])
    C.extend(B[b_ind:])
    return count, C

21 Counting Inversions: Implementation
Pre-condition. [Merge-and-Count] A and B are sorted.
Post-condition. [Sort-and-Count] L is sorted.

Sort-and-Count(L) {
    if list L has one element
        return 0 and the list L
    Divide the list into two halves A and B
    (r_A, A) ← Sort-and-Count(A)
    (r_B, B) ← Sort-and-Count(B)
    (r_C, L) ← Merge-and-Count(A, B)
    return r = r_A + r_B + r_C and the sorted list L
}
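
A direct Python rendering of Sort-and-Count on top of the merge_and_count above (our sketch of the slide's pseudocode, not verbatim course code):

def sort_and_count(L):
    # Returns (number of inversions in L, sorted copy of L).
    if len(L) <= 1:
        return 0, list(L)
    mid = len(L) // 2
    r_A, A = sort_and_count(L[:mid])
    r_B, B = sort_and_count(L[mid:])
    r_C, L_sorted = merge_and_count(A, B)
    return r_A + r_B + r_C, L_sorted

For the running example, sort_and_count([1, 5, 4, 8, 10, 2, 6, 9, 12, 11, 3, 7]) returns 22 inversions, matching the total computed above.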

5.4 Closest Pair of Points

23 Closest Pair of Points
Closest pair. Given n points in the plane, find a pair with the smallest Euclidean distance between them.
Fundamental geometric primitive.
- Graphics, computer vision, geographic information systems, molecular modeling, air traffic control.
- Special case of nearest neighbor, Euclidean MST, Voronoi. [fast closest pair inspired fast algorithms for these problems]
Brute force. Check all pairs of points p and q with Θ(n²) comparisons.
1-D version. O(n log n) easy if the points are on a line.
Assumption. No two points have the same x-coordinate. [to make the presentation cleaner]

24 Closest Pair of Points: First Attempt
Divide. Sub-divide the region into 4 quadrants.

25 Closest Pair of Points: First Attempt
Divide. Sub-divide the region into 4 quadrants.
Obstacle. Impossible to ensure n/4 points in each piece.

26 Closest Pair of Points
Algorithm.
- Divide: draw a vertical line L so that roughly ½n points are on each side.

27 Closest Pair of Points
Algorithm.
- Divide: draw a vertical line L so that roughly ½n points are on each side.
- Conquer: find the closest pair in each side recursively.

28 Closest Pair of Points
Algorithm.
- Divide: draw a vertical line L so that roughly ½n points are on each side.
- Conquer: find the closest pair in each side recursively.
- Combine: find the closest pair with one point in each side. [seems like Θ(n²)]
- Return the best of the 3 solutions.

29 Closest Pair of Points
Find the closest pair with one point in each side, assuming that distance < δ.
[In the figure, δ = min(12, 21) = 12, the smaller of the two closest-pair distances found recursively on each side.]

30 Closest Pair of Points
Find the closest pair with one point in each side, assuming that distance < δ.
- Observation: only need to consider points within δ of line L.

31 Closest Pair of Points
Find the closest pair with one point in each side, assuming that distance < δ.
- Observation: only need to consider points within δ of line L.
- Sort the points in the 2δ-strip by their y-coordinate.

32 Closest Pair of Points
Find the closest pair with one point in each side, assuming that distance < δ.
- Observation: only need to consider points within δ of line L.
- Sort the points in the 2δ-strip by their y-coordinate.
- Only check distances of those within 11 positions in the sorted list!

33 Closest Pair of Points
Def. Let s_i be the point in the 2δ-strip with the i-th smallest y-coordinate.
Claim. If |i − j| ≥ 12, then the distance between s_i and s_j is at least δ.
Pf.
- No two points lie in the same ½δ-by-½δ box.
- Two points at least 2 rows apart have distance ≥ 2(½δ) = δ. ▪
Fact. Still true if we replace 12 with 7.

34 Closest Pair Algorithm

Closest-Pair(p_1, …, p_n):
    if number of points <= 3:
        find the closest pair manually
    Compute separation line L such that half the points are on one side
        and half on the other side.                                       [O(n log n)]
    δ_1 = Closest-Pair(left half)                                         [2T(n/2)]
    δ_2 = Closest-Pair(right half)
    δ = min(δ_1, δ_2)
    Collect S = all points within δ of separation line L                  [O(n)]
    Sort S by y-coordinate.                                               [O(n log n)]
    Scan points of S in y-order and compare the distance between each     [O(n)]
        point and its next 11 neighbors. If any of these distances
        is less than δ, update δ.
    return δ

35 Closest Pair of Points: Analysis
Running time. T(n) ≤ 2T(n/2) + O(n log n)  ⇒  T(n) = O(n log² n).
Q. Can we achieve O(n log n)?
A. Yes. Don't sort the points in the strip from scratch each time.
- Option 1: Sort the list once and carefully craft the sub-problems (book solution).
- Option 2: Don't sort the list, but create a ClosestPair/MergeSort hybrid that sorts the points bottom-up.

36 Achieving O(n log n)

# Let P_x, P_y be the same set of points sorted by x- and y-coordinates,
# respectively. Pre-sort the points using a helper function.
Closest-Pair(P_x, P_y):
    if number of points <= 3:
        find the closest pair manually
    Compute separation line L by picking the middle element of P_x        [O(1)]
    Q_x, Q_y = sorted sets for the left half                              [O(n)]
    R_x, R_y = sorted sets for the right half
    δ_1 = Closest-Pair(Q_x, Q_y)                                          [2T(n/2)]
    δ_2 = Closest-Pair(R_x, R_y)
    δ = min(δ_1, δ_2)
    Collect S = all points in P_y within δ of separation line L           [O(n)]
    # S is already sorted by y-coordinate.
    Scan points of S and compare the distance between each point and
        its next 11 neighbors. If any of these distances is less
        than δ, update δ.
    return δ
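
For reference, here is a runnable Python sketch of this pre-sorted O(n log n) scheme, written under the slide's assumption of distinct x-coordinates. It is illustrative only; the names closest_pair, _closest, _brute_force, and _dist are ours, and it returns the closest distance together with the pair achieving it.

import math

def closest_pair(points):
    # points: list of (x, y) tuples with distinct x-coordinates.
    px = sorted(points)                          # sorted by x
    py = sorted(points, key=lambda p: p[1])      # sorted by y
    return _closest(px, py)

def _dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def _brute_force(pts):
    best, pair = float("inf"), None
    for i in range(len(pts)):
        for j in range(i + 1, len(pts)):
            d = _dist(pts[i], pts[j])
            if d < best:
                best, pair = d, (pts[i], pts[j])
    return best, pair

def _closest(px, py):
    n = len(px)
    if n <= 3:
        return _brute_force(px)
    mid = n // 2
    x_mid = px[mid][0]                           # separation line L: x = x_mid
    qx, rx = px[:mid], px[mid:]
    qy = [p for p in py if p[0] < x_mid]         # left half, still sorted by y
    ry = [p for p in py if p[0] >= x_mid]        # right half, still sorted by y
    d1, pair1 = _closest(qx, qy)
    d2, pair2 = _closest(rx, ry)
    delta, best_pair = (d1, pair1) if d1 <= d2 else (d2, pair2)
    strip = [p for p in py if abs(p[0] - x_mid) < delta]   # 2-delta strip, sorted by y
    for i in range(len(strip)):
        for j in range(i + 1, min(i + 12, len(strip))):    # next 11 neighbors suffice
            d = _dist(strip[i], strip[j])
            if d < delta:
                delta, best_pair = d, (strip[i], strip[j])
    return delta, best_pair

For example, closest_pair([(0, 0), (3, 4), (1, 1), (5, 2)]) returns approximately (1.414, ((0, 0), (1, 1))).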

37 Board Work: Useful Recurrence Theorems

5.5 Integer Multiplication

39 Integer Arithmetic
Add. Given two n-digit integers a and b, compute a + b.
- O(n) bit operations.
Multiply. Given two n-digit integers a and b, compute a × b.
- Brute force solution: Θ(n²) bit operations.

40 Divide-and-Conquer Multiplication: Warmup
To multiply two n-digit integers: [assumes n is a power of 2]
- Multiply four ½n-digit integers.
- Add two ½n-digit integers, and shift to obtain the result.
    x = 2^(n/2) · x_1 + x_0
    y = 2^(n/2) · y_1 + y_0
    xy = 2^n · x_1·y_1 + 2^(n/2) · (x_1·y_0 + x_0·y_1) + x_0·y_0
    T(n) = 4T(n/2) + Θ(n)  ⇒  T(n) = Θ(n²)

41 Karatsuba Multiplication
To multiply two n-digit integers:
- Add two ½n-digit integers.
- Multiply three ½n-digit integers.
- Add, subtract, and shift ½n-digit integers to obtain the result.
    xy = 2^n · x_1·y_1 + 2^(n/2) · (x_1·y_0 + x_0·y_1) + x_0·y_0
       = 2^n · A + 2^(n/2) · (C − A − B) + B,
    where A = x_1·y_1, B = x_0·y_0, C = (x_1 + x_0)(y_1 + y_0).
Theorem. [Karatsuba-Ofman, 1962] Can multiply two n-digit integers in O(n^1.585) bit operations.
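
A compact Python sketch of this three-multiplication scheme for non-negative integers, splitting on bits rather than decimal digits. It is illustrative only (Python's built-in * is already fast), and the name karatsuba and the small-number cutoff are our choices.

def karatsuba(x, y):
    # Multiply non-negative integers x and y with three recursive multiplications.
    if x < 16 or y < 16:
        return x * y                               # small base case: multiply directly
    half = max(x.bit_length(), y.bit_length()) // 2
    x1, x0 = x >> half, x & ((1 << half) - 1)      # x = 2^half * x1 + x0
    y1, y0 = y >> half, y & ((1 << half) - 1)      # y = 2^half * y1 + y0
    A = karatsuba(x1, y1)
    B = karatsuba(x0, y0)
    C = karatsuba(x1 + x0, y1 + y0)                # C - A - B = x1*y0 + x0*y1
    return (A << (2 * half)) + ((C - A - B) << half) + B

For instance, karatsuba(100, 200) returns 20000.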

42 Karatsuba: Recursion Tree
The recursion tree has 3^k subproblems of size n/2^k at depth k, so the work at depth k is 3^k · (n/2^k): it grows geometrically by a factor of 3/2 per level, from n at the root down to the 3^(lg n) leaves.
By the theorem on the board (5.4 in the book), this fits T(n) = qT(n/2) + cn with q > 2. Therefore, T(n) ∈ O(n^(lg 3)) ≈ O(n^1.585).

Matrix Multiplication

44 Matrix Multiplication
Matrix multiplication. Given two n-by-n matrices A and B, compute C = AB.
Brute force. Θ(n³) arithmetic operations.
Fundamental question. Can we improve upon brute force?

45 Matrix Multiplication: Warmup
Divide-and-conquer.
- Divide: partition A and B into ½n-by-½n blocks.
- Conquer: multiply 8 pairs of ½n-by-½n blocks recursively.
- Combine: add the appropriate products using 4 matrix additions.
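
The block equations behind this warmup, reconstructed here since the slide's displayed formulas do not survive in the transcript; this is the standard 2-by-2 block product:

    C_11 = A_11·B_11 + A_12·B_21        C_12 = A_11·B_12 + A_12·B_22
    C_21 = A_21·B_11 + A_22·B_21        C_22 = A_21·B_12 + A_22·B_22

    T(n) = 8T(n/2) + Θ(n²)  ⇒  T(n) = Θ(n³)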

46 Matrix Multiplication: Key Idea
Key idea. Multiply 2-by-2 block matrices with only 7 multiplications.
- 7 multiplications.
- 18 = 10 + 8 additions (or subtractions).
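
The seven products and their recombination: these are the standard Strassen identities, reproduced here because the slide's own formulas are not in the transcript.

    P_1 = A_11 · (B_12 − B_22)
    P_2 = (A_11 + A_12) · B_22
    P_3 = (A_21 + A_22) · B_11
    P_4 = A_22 · (B_21 − B_11)
    P_5 = (A_11 + A_22) · (B_11 + B_22)
    P_6 = (A_12 − A_22) · (B_21 + B_22)
    P_7 = (A_11 − A_21) · (B_11 + B_12)

    C_11 = P_5 + P_4 − P_2 + P_6
    C_12 = P_1 + P_2
    C_21 = P_3 + P_4
    C_22 = P_5 + P_1 − P_3 − P_7

Forming the products takes 10 matrix additions/subtractions and combining them takes 8 more, which accounts for the 18 noted above.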

47 Fast Matrix Multiplication
Fast matrix multiplication. [Strassen, 1969]
- Divide: partition A and B into ½n-by-½n blocks.
- Compute: 14 ½n-by-½n matrices via 10 matrix additions.
- Conquer: multiply 7 pairs of ½n-by-½n matrices recursively.
- Combine: the 7 products into 4 terms using 8 matrix additions.
Analysis.
- Assume n is a power of 2.
- T(n) = # arithmetic operations.
- T(n) = 7T(n/2) + Θ(n²)  ⇒  T(n) = Θ(n^(log₂ 7)) = O(n^2.81).

48 Fast Matrix Multiplication in Practice
Implementation issues.
- Sparsity.
- Caching effects.
- Numerical stability.
- Odd matrix dimensions.
- Crossover to the classical algorithm around n = 128.
Common misperception: "Strassen is only a theoretical curiosity."
- Advanced Computation Group at Apple Computer reports an 8x speedup on the G4 Velocity Engine when n ≈ 2,500.
- The range of instances where it's useful is a subject of controversy.
Remark. Can "Strassenize" Ax = b, determinant, eigenvalues, and other matrix ops.

49 Fast Matrix Multiplication in Theory
Q. Multiply two 2-by-2 matrices with only 7 scalar multiplications? A. Yes! [Strassen, 1969]
Q. Multiply two 2-by-2 matrices with only 6 scalar multiplications? A. Impossible. [Hopcroft and Kerr, 1971]
Q. Two 3-by-3 matrices with only 21 scalar multiplications? A. Also impossible.
Q. Two 70-by-70 matrices with only 143,640 scalar multiplications? A. Yes! [Pan, 1980]
Decimal wars.
- December, 1979: O(n^2.521813).
- January, 1980: O(n^2.521801).

50 Fast Matrix Multiplication in Theory
Best known. O(n^2.376). [Coppersmith-Winograd, 1987]
Conjecture. O(n^(2+ε)) for any ε > 0.
Caveat. Theoretical improvements to Strassen are progressively less practical.