Introduction to Algorithms 6.046J/18.401J
LECTURE 3: Divide and Conquer
Binary search, powering a number, Fibonacci numbers, matrix multiplication, Strassen's algorithm, VLSI tree layout
Prof. Erik D. Demaine
September 14, 2005
Copyright © Erik D. Demaine and Charles E. Leiserson

The divide-and-conquer design paradigm 1. Divide the problem (instance) into subproblems. 2. Conquer the subproblems by solving them recursively. 3. Combine subproblem solutions.

Merge sort 1. Divide: Trivial. 2. Conquer: Recursively sort 2 subarrays. 3. Combine: Linear-time merge. Recurrence: T(n) = 2 T(n/2) + Θ(n) (2 subproblems, each of size n/2, plus Θ(n) work for dividing and combining).
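
To make the three steps concrete, here is a minimal merge-sort sketch in Python (my own illustration, not code from the slides; the function names merge_sort and merge are arbitrary):

    def merge(left, right):
        # Combine: linear-time merge of two sorted lists.
        out, i, j = [], 0, 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                out.append(left[i])
                i += 1
            else:
                out.append(right[j])
                j += 1
        return out + left[i:] + right[j:]

    def merge_sort(a):
        # Divide: trivial (split at the midpoint).
        if len(a) <= 1:
            return a
        mid = len(a) // 2
        # Conquer: recursively sort the two halves; Combine: merge them.
        return merge(merge_sort(a[:mid]), merge_sort(a[mid:]))

For example, merge_sort([5, 2, 9, 4, 7]) returns [2, 4, 5, 7, 9].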

Master theorem (reprise)
T(n) = a T(n/b) + f(n)
CASE 1: f(n) = O(n^(log_b a – ε)), constant ε > 0 ⇒ T(n) = Θ(n^(log_b a)).
CASE 2: f(n) = Θ(n^(log_b a) lg^k n), constant k ≥ 0 ⇒ T(n) = Θ(n^(log_b a) lg^(k+1) n).
CASE 3: f(n) = Ω(n^(log_b a + ε)), constant ε > 0, and regularity condition ⇒ T(n) = Θ(f(n)).
Merge sort: a = 2, b = 2 ⇒ n^(log_b a) = n^(log_2 2) = n ⇒ CASE 2 (k = 0) ⇒ T(n) = Θ(n lg n).

Binary search Find an element in a sorted array: 1. Divide: Check middle element. 2. Conquer: Recursively search 1 subarray. 3. Combine: Trivial. Example: Find 9 by repeatedly comparing 9 with the middle element and recursing into the half that could contain it.
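
A recursive Python sketch of this search (my own, not from the slides; binary_search, lo, and hi are names I chose). It returns an index of x in the sorted list a, or -1 if x is absent:

    def binary_search(a, x, lo=0, hi=None):
        if hi is None:
            hi = len(a) - 1
        if lo > hi:
            return -1                                  # empty range: not found
        mid = (lo + hi) // 2                           # Divide: check middle element
        if a[mid] == x:
            return mid
        if a[mid] < x:
            return binary_search(a, x, mid + 1, hi)    # Conquer: search right half
        return binary_search(a, x, lo, mid - 1)        # Conquer: search left half

For instance, binary_search([3, 5, 7, 8, 9, 12, 15], 9) returns 4, following one branch of the recursion per level.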

Recurrence for binary search: T(n) = 1 T(n/2) + Θ(1) (1 subproblem of size n/2, plus Θ(1) work for dividing and combining). Here n^(log_b a) = n^(log_2 1) = n^0 = 1 ⇒ CASE 2 (k = 0) ⇒ T(n) = Θ(lg n).

Powering a number Problem: Compute a^n, where n ∈ N. Naive algorithm: Θ(n). Divide-and-conquer algorithm: a^n = a^(n/2) ⋅ a^(n/2) if n is even; a^n = a^((n–1)/2) ⋅ a^((n–1)/2) ⋅ a if n is odd. T(n) = T(n/2) + Θ(1) ⇒ T(n) = Θ(lg n).
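
A direct Python rendering of this recursion (a sketch of the idea, not code from the lecture):

    def power(a, n):
        # Computes a**n for integer n >= 0 using Theta(lg n) multiplications.
        if n == 0:
            return 1
        half = power(a, n // 2)        # single recursive call on floor(n/2)
        if n % 2 == 0:
            return half * half         # n even: a^n = (a^(n/2))^2
        return half * half * a         # n odd:  a^n = (a^((n-1)/2))^2 * a

For example, power(3, 13) evaluates to 1594323 = 3**13.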

Fibonacci numbers Recursive definition: F_n = 0 if n = 0; F_n = 1 if n = 1; F_n = F_(n–1) + F_(n–2) if n ≥ 2. Naive recursive algorithm: Ω(φ^n) (exponential time), where φ = (1 + √5)/2 is the golden ratio.

Computing Fibonacci numbers Bottom-up: Compute F_0, F_1, F_2, …, F_n in order, forming each number by summing the two previous. Running time: Θ(n). Naive recursive squaring: compute F_n as φ^n/√5 rounded to the nearest integer; recursive squaring gives Θ(lg n) time, but this method is unreliable, since floating-point arithmetic is prone to round-off errors.
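
A bottom-up sketch in Python (my own; it uses exact integer arithmetic, so the round-off issue above does not arise):

    def fib_bottom_up(n):
        # Theta(n) time: build F_0, F_1, ..., F_n, keeping only the last two values.
        a, b = 0, 1            # a = F_0, b = F_1
        for _ in range(n):
            a, b = b, a + b    # slide the window forward by one index
        return a               # a now holds F_n

For example, fib_bottom_up(10) returns 55.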

Recursive squaring
Theorem: [ F_(n+1)  F_n ; F_n  F_(n–1) ] = [ 1 1 ; 1 0 ]^n.
Algorithm: compute the matrix power [ 1 1 ; 1 0 ]^n by recursive squaring. Time = Θ(lg n).
Proof of theorem (induction on n).
Base (n = 1): [ F_2  F_1 ; F_1  F_0 ] = [ 1 1 ; 1 0 ].
Inductive step (n ≥ 2): [ F_(n+1)  F_n ; F_n  F_(n–1) ] = [ F_n  F_(n–1) ; F_(n–1)  F_(n–2) ] ⋅ [ 1 1 ; 1 0 ] = [ 1 1 ; 1 0 ]^(n–1) ⋅ [ 1 1 ; 1 0 ] = [ 1 1 ; 1 0 ]^n.
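
A Python sketch of this recursive-squaring method (my own; the helper names mat_mult and mat_pow are assumptions, and only 2x2 integer matrices are handled):

    def mat_mult(X, Y):
        # Product of two 2x2 matrices.
        return [[X[0][0]*Y[0][0] + X[0][1]*Y[1][0], X[0][0]*Y[0][1] + X[0][1]*Y[1][1]],
                [X[1][0]*Y[0][0] + X[1][1]*Y[1][0], X[1][0]*Y[0][1] + X[1][1]*Y[1][1]]]

    def mat_pow(M, n):
        # M^n by recursive squaring: Theta(lg n) matrix multiplications.
        if n == 1:
            return M
        half = mat_pow(M, n // 2)
        sq = mat_mult(half, half)
        return sq if n % 2 == 0 else mat_mult(sq, M)

    def fib(n):
        # By the theorem, F_n is the off-diagonal entry of [[1,1],[1,0]]^n.
        if n == 0:
            return 0
        return mat_pow([[1, 1], [1, 0]], n)[0][1]

Since every step is integer arithmetic, this avoids the floating-point pitfalls of the φ^n/√5 formula.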

Matrix multiplication Input: n×n matrices A = [a_ij] and B = [b_ij]. Output: C = [c_ij] = A ⋅ B, where c_ij = Σ_(k=1..n) a_ik ⋅ b_kj.

Standard algorithm
for i ← 1 to n
    do for j ← 1 to n
        do c_ij ← 0
           for k ← 1 to n
               do c_ij ← c_ij + a_ik ⋅ b_kj
Running time = Θ(n^3).
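
The same triple loop written as runnable Python, for reference (a sketch only; A and B are assumed to be n-by-n lists of lists):

    def mat_mult_standard(A, B):
        n = len(A)
        C = [[0] * n for _ in range(n)]
        for i in range(n):
            for j in range(n):
                for k in range(n):
                    C[i][j] += A[i][k] * B[k][j]   # c_ij += a_ik * b_kj
        return C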

Divide-and-conquer algorithm IDEA: View an n×n matrix as a 2×2 matrix of (n/2)×(n/2) submatrices, writing A = [ a b ; c d ], B = [ e f ; g h ], and C = A ⋅ B = [ r s ; t u ]. Then r = ae + bg, s = af + bh, t = ce + dg, u = cf + dh: 8 recursive mults of (n/2)×(n/2) submatrices and 4 adds of (n/2)×(n/2) submatrices.
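
A Python sketch of this blocked recursion (my own illustration; it assumes n is a power of 2 and uses hypothetical helpers split, add, and combine):

    def split(M):
        # Partition an n x n matrix into its four (n/2) x (n/2) quadrants.
        h = len(M) // 2
        return ([row[:h] for row in M[:h]], [row[h:] for row in M[:h]],
                [row[:h] for row in M[h:]], [row[h:] for row in M[h:]])

    def add(X, Y):
        # Entrywise sum of two equal-size matrices.
        return [[x + y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

    def combine(r, s, t, u):
        # Reassemble C from its four quadrants.
        return [rr + sr for rr, sr in zip(r, s)] + [tr + ur for tr, ur in zip(t, u)]

    def dc_mat_mult(A, B):
        n = len(A)
        if n == 1:
            return [[A[0][0] * B[0][0]]]
        a, b, c, d = split(A)
        e, f, g, h = split(B)
        r = add(dc_mat_mult(a, e), dc_mat_mult(b, g))   # r = ae + bg
        s = add(dc_mat_mult(a, f), dc_mat_mult(b, h))   # s = af + bh
        t = add(dc_mat_mult(c, e), dc_mat_mult(d, g))   # t = ce + dg
        u = add(dc_mat_mult(c, f), dc_mat_mult(d, h))   # u = cf + dh
        return combine(r, s, t, u)

The eight recursive calls and four matrix additions correspond exactly to the recurrence analyzed on the next slide.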

Analysis of D&C algorithm: T(n) = 8 T(n/2) + Θ(n^2) (8 submatrix products of size n/2, plus Θ(n^2) work for adding submatrices). n^(log_b a) = n^(log_2 8) = n^3 ⇒ CASE 1 ⇒ T(n) = Θ(n^3). No better than the ordinary algorithm.

Strassen's idea Multiply 2×2 matrices with only 7 recursive mults.
P1 = a ⋅ (f – h)
P2 = (a + b) ⋅ h
P3 = (c + d) ⋅ e
P4 = d ⋅ (g – e)
P5 = (a + d) ⋅ (e + h)
P6 = (b – d) ⋅ (g + h)
P7 = (a – c) ⋅ (e + f)
r = P5 + P4 – P2 + P6
s = P1 + P2
t = P3 + P4
u = P5 + P1 – P3 – P7
7 mults, 18 adds/subs. Note: No reliance on commutativity of mult!
Check for r: r = P5 + P4 – P2 + P6 = (a + d)(e + h) + d(g – e) – (a + b)h + (b – d)(g + h) = ae + ah + de + dh + dg – de – ah – bh + bg + bh – dg – dh = ae + bg.

Strassen's algorithm 1. Divide: Partition A and B into (n/2)×(n/2) submatrices. Form terms to be multiplied using + and –. 2. Conquer: Perform 7 multiplications of (n/2)×(n/2) submatrices recursively. 3. Combine: Form C using + and – on (n/2)×(n/2) submatrices. T(n) = 7 T(n/2) + Θ(n^2).
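
A Python sketch of Strassen's recursion (my own; it reuses the hypothetical split, add, and combine helpers from the block-multiplication sketch above, adds a matching sub, and assumes n is a power of 2):

    def sub(X, Y):
        # Entrywise difference of two equal-size matrices.
        return [[x - y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

    def strassen(A, B):
        n = len(A)
        if n == 1:
            return [[A[0][0] * B[0][0]]]
        a, b, c, d = split(A)              # quadrants of A
        e, f, g, h = split(B)              # quadrants of B
        P1 = strassen(a, sub(f, h))        # 7 recursive multiplications
        P2 = strassen(add(a, b), h)
        P3 = strassen(add(c, d), e)
        P4 = strassen(d, sub(g, e))
        P5 = strassen(add(a, d), add(e, h))
        P6 = strassen(sub(b, d), add(g, h))
        P7 = strassen(sub(a, c), add(e, f))
        r = add(sub(add(P5, P4), P2), P6)      # r = P5 + P4 - P2 + P6
        s = add(P1, P2)                        # s = P1 + P2
        t = add(P3, P4)                        # t = P3 + P4
        u = sub(sub(add(P5, P1), P3), P7)      # u = P5 + P1 - P3 - P7
        return combine(r, s, t, u)

Counting the add/sub calls gives the 18 additions and subtractions noted above; the 7 recursive products drive the T(n) = 7 T(n/2) + Θ(n^2) recurrence analyzed next.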

Analysis of Strassen: T(n) = 7 T(n/2) + Θ(n^2). n^(log_b a) = n^(log_2 7) ≈ n^2.81 ⇒ CASE 1 ⇒ T(n) = Θ(n^(lg 7)). The number 2.81 may not seem much smaller than 3, but because the difference is in the exponent, the impact on running time is significant. In fact, Strassen's algorithm beats the ordinary algorithm on today's machines for n ≥ 32 or so. Best to date (of theoretical interest only): Θ(n^2.376).

VLSI layout Problem: Embed a complete binary tree with n leaves in a grid using minimal area. A straightforward embedding has height H(n) = H(n/2) + Θ(1) = Θ(lg n) and width W(n) = 2 W(n/2) + Θ(1) = Θ(n), so Area = Θ(n lg n).

H-tree embedding: lay the tree out recursively in an H-shaped pattern, so that each side of the layout consists of two L(n/4) sub-layouts separated by a Θ(1)-width strip for the root. Side length: L(n) = 2 L(n/4) + Θ(1). Since n^(log_4 2) = √n dominates the Θ(1) term, CASE 1 gives L(n) = Θ(√n), so Area = Θ(√n)^2 = Θ(n).

Conclusion Divide and conquer is just one of several powerful techniques for algorithm design. Divide-and-conquer algorithms can be analyzed using recurrences and the master method (so practice this math). The divide-and-conquer strategy often leads to efficient algorithms.