6.006: Introduction to Algorithms
Lecture 8, Prof. Silvio Micali
CLRS: Chapter 4. © 2001 by Charles E. Leiserson
Menu
Sorting! Insertion sort, merge sort, recurrences, the master theorem.
The problem of sorting
Input: an array A[1…n] of numbers.
Output: a permutation B[1…n] of A such that B[1] ≤ B[2] ≤ … ≤ B[n].
E.g. A = [7, 2, 5, 5, 9.6] → B = [2, 5, 5, 7, 9.6]
How can we do it efficiently?
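The input/output specification above is easy to state as a checkable predicate. A minimal sketch in Python (not part of the lecture; the name is_sorted_permutation is chosen here for illustration):

```python
from collections import Counter

def is_sorted_permutation(A, B):
    """Check that B is a permutation of A and that B is non-decreasing."""
    same_multiset = Counter(A) == Counter(B)                           # B is a permutation of A
    non_decreasing = all(B[i] <= B[i + 1] for i in range(len(B) - 1))  # B[1] <= B[2] <= ...
    return same_multiset and non_decreasing

print(is_sorted_permutation([7, 2, 5, 5, 9.6], [2, 5, 5, 7, 9.6]))  # True
```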
Insertion sort
INSERTION-SORT (A, n)    ⊳ A[1 .. n]
  for j ← 2 to n
      insert key A[j] into the (already sorted) sub-array A[1 .. j−1]
      by pairwise key-swaps down to its right position
(Illustration of iteration j: A[1 .. j−1] is already sorted, and the key A[j] is swapped down to its new location.)
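As a concrete sketch of the pseudocode above, here is an in-place insertion sort in Python that performs the insertion by pairwise swaps, as the slide describes (a straightforward translation, not the lecture's own code):

```python
def insertion_sort(a):
    """Sort the list a in place: swap each key a[j] leftward to its right position."""
    for j in range(1, len(a)):               # pseudocode's j = 2 .. n, 0-indexed here
        i = j
        while i > 0 and a[i - 1] > a[i]:
            a[i - 1], a[i] = a[i], a[i - 1]  # pairwise key-swap
            i -= 1
    return a

print(insertion_sort([8, 2, 4, 9, 3, 6]))    # [2, 3, 4, 6, 8, 9]
```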
Example of insertion sort
8 2 4 9 3 6
2 8 4 9 3 6   (1 swap to insert 2)
2 4 8 9 3 6   (1 swap to insert 4)
2 4 8 9 3 6   (0 swaps to insert 9)
2 3 4 8 9 6   (3 swaps to insert 3)
2 3 4 6 8 9   (2 swaps to insert 6)  done

Running time? O(n²), e.g. when the input is A = [n, n − 1, n − 2, …, 2, 1].
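Where the O(n²) comes from can be checked by counting the pairwise swaps directly: on the reversed input, inserting the j-th key takes j − 1 swaps, for a total of n(n − 1)/2. A small sketch (assuming the swap-based variant shown earlier):

```python
def count_swaps(a):
    """Run swap-based insertion sort on a copy of a and count the swaps."""
    a, swaps = list(a), 0
    for j in range(1, len(a)):
        i = j
        while i > 0 and a[i - 1] > a[i]:
            a[i - 1], a[i] = a[i], a[i - 1]
            swaps += 1
            i -= 1
    return swaps

n = 100
print(count_swaps(range(n, 0, -1)), n * (n - 1) // 2)   # both print 4950
```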
Meet merge sort
MERGE-SORT A[1 .. n]   (divide and conquer)
1. If n = 1, done (nothing to sort).
2. Otherwise, recursively sort A[1 .. ⌈n/2⌉] and A[⌈n/2⌉+1 .. n].
3. "Merge" the two sorted sub-arrays.
Key subroutine: MERGE
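A minimal runnable sketch of this divide-and-conquer structure (not the lecture's code): split at ⌈n/2⌉, recursively sort the halves, and merge. Here the merge step is delegated to Python's standard heapq.merge; a hand-written MERGE matching the next slides follows later.

```python
from heapq import merge          # merges already-sorted iterables in linear time

def merge_sort(a):
    """Return a sorted copy of a: split, recursively sort the halves, then merge."""
    n = len(a)
    if n <= 1:
        return list(a)                        # nothing to sort
    mid = (n + 1) // 2                        # ceil(n/2), as in the pseudocode
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    return list(merge(left, right))           # "merge" the two sorted sub-arrays

print(merge_sort([8, 2, 4, 9, 3, 6]))         # [2, 3, 4, 6, 8, 9]
```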
Merging two sorted arrays
Example: merge the sorted arrays [2, 7, 13, 20] and [1, 9, 11, 12].
Repeatedly compare the smallest remaining element of each array and move the smaller of the two to the output array:
output 1, 2, 7, 9, 11, 12, 13, 20.
Time = Θ(n) to merge a total of n elements (linear time).
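The two-finger procedure traced above translates directly into code. A minimal sketch (the function and variable names are mine): repeatedly compare the smallest remaining element of each array and append the smaller one to the output; once one array is exhausted, copy over the rest of the other.

```python
def merge(xs, ys):
    """Merge two sorted lists into one sorted list in Theta(n) time."""
    out, i, j = [], 0, 0
    while i < len(xs) and j < len(ys):
        if xs[i] <= ys[j]:
            out.append(xs[i]); i += 1         # smaller remaining element is in xs
        else:
            out.append(ys[j]); j += 1         # smaller remaining element is in ys
    out.extend(xs[i:])                        # at most one of these is non-empty
    out.extend(ys[j:])
    return out

print(merge([2, 7, 13, 20], [1, 9, 11, 12]))  # [1, 2, 7, 9, 11, 12, 13, 20]
```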
Analyzing merge sort
MERGE-SORT A[1 .. n]                                        T(n)
1. If n = 1, done.                                          Θ(1)
2. Recursively sort A[1 .. ⌈n/2⌉] and A[⌈n/2⌉+1 .. n].      2T(n/2)
3. "Merge" the two sorted lists.                            Θ(n)

T(n) = Θ(1) if n = 1;  T(n) = 2T(n/2) + Θ(n) if n > 1.
Recurrence solving
Solve T(n) = 2T(n/2) + cn, where c > 0 is constant.
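Before drawing the recursion tree, the same answer can be read off by repeated substitution; a short derivation (assuming, for simplicity, that n is a power of 2):

```latex
T(n) = cn + 2\,T(n/2)
     = cn + 2\bigl(c\,\tfrac{n}{2} + 2\,T(\tfrac{n}{4})\bigr)
     = 2cn + 4\,T(\tfrac{n}{4})
     = \dots
     = k\,cn + 2^{k}\,T\!\bigl(n/2^{k}\bigr).
```

Taking k = lg n gives T(n) = cn lg n + n·T(1) = Θ(n lg n), matching the recursion tree on the next slides.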
Recursion tree
Solve T(n) = 2T(n/2) + cn, where c > 0 is constant.
The root costs cn; it has two children, each costing cn/2; they have four children, each costing cn/4; and so on, down to leaves of cost Θ(1).
Height h = lg n, and #leaves = n.
Each of the lg n levels of internal nodes sums to cn, and the leaf level sums to Θ(n).
Total = Θ(n lg n).
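A quick numeric sanity check of the Θ(n lg n) total (my own sketch, with c = 1 and T(1) = 1): compute T(n) exactly from the recurrence and compare it with n lg n; the ratio is exactly 1 + 1/lg n, which tends to 1.

```python
from functools import lru_cache
from math import log2

@lru_cache(maxsize=None)
def T(n):
    """Exact solution of T(n) = 2*T(n/2) + n with T(1) = 1, for n a power of 2."""
    return 1 if n == 1 else 2 * T(n // 2) + n

for k in (4, 8, 16, 20):
    n = 2 ** k
    print(n, T(n) / (n * log2(n)))   # 1.25, 1.125, 1.0625, 1.05
```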
The master method
"One theorem for all recurrences" (sort of)
It applies to recurrences of the form T(n) = a T(n/b) + f(n), where a ≥ 1 (the number of subproblems), b > 1 (each subproblem has size n/b), and f(n) > 0 (the time to split into subproblems and combine the results).
E.g. merge sort: a = 2, b = 2, f(n) = O(n).
E.g. binary search: a = 1, b = 2, f(n) = O(1).
Basic idea: compare f(n) with n^{log_b a}.
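For driving functions of the simple polynomial form f(n) = Θ(n^d), the comparison with n^{log_b a} is mechanical. A small illustrative helper (my own sketch; it covers only this restricted form and the three cases stated on the next slides, not the full theorem with its regularity condition):

```python
from math import isclose, log

def master(a, b, d):
    """Asymptotics of T(n) = a*T(n/b) + Theta(n**d) by the master theorem."""
    e = log(a, b)                          # the critical exponent log_b(a)
    if isclose(d, e):
        return f"Theta(n^{e:g} * lg n)"    # case 2 (k = 0): similar growth rates
    if d < e:
        return f"Theta(n^{e:g})"           # case 1: leaf level dominates
    return f"Theta(n^{d:g})"               # case 3: root dominates

print(master(2, 2, 1))   # merge sort            -> Theta(n^1 * lg n)
print(master(2, 2, 0))   # T(n) = 2T(n/2) + 1    -> Theta(n^1)
print(master(4, 2, 3))   # T(n) = 4T(n/2) + n^3  -> Theta(n^3)
```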
Idea of master theorem
T(n) = a T(n/b) + f(n).
Recursion tree: the root costs f(n); its a children each cost f(n/b); the a² grandchildren each cost f(n/b²); and so on. The height is h = log_b n, so #leaves = a^h = a^{log_b n} = n^{log_b a}, and the leaf level costs n^{log_b a} · T(1).
CASE 1: The weight increases geometrically from the root to the leaves, so the leaves hold a constant fraction of the total weight: T(n) = Θ(n^{log_b a}).
CASE 2: The weight is approximately the same on each level, so the total weight ≈ #levels × the leaves' weight: T(n) = Θ(n^{log_b a} · log_b n).
CASE 3: The weight decreases geometrically from the root to the leaves, so the root holds a constant fraction of the total weight: T(n) = Θ(f(n)).
Three common cases
Compare f(n) with n^{log_b a}:
CASE 1: f(n) = Θ(n^{log_b a − ε}) for some constant ε > 0, i.e., f(n) grows polynomially slower than n^{log_b a} (by an n^ε factor).
Cost of level i = a^i f(n/b^i) = Θ(n^{log_b a − ε} · (b^ε)^i), so the cost increases geometrically as we go deeper in the tree; hence the leaf level dominates.
Solution: T(n) = Θ(n^{log_b a}).
Three common cases (cont.)
Compare f(n) with n^{log_b a}:
CASE 2: f(n) = Θ(n^{log_b a} · log_b^k n) for some constant k ≥ 0, i.e., f(n) and n^{log_b a} grow at similar rates.
Cost of level i = a^i f(n/b^i) = Θ(n^{log_b a} · log_b^k(n/b^i)), so all levels have about the same cost.
Solution: T(n) = Θ(n^{log_b a} · log_b^{k+1} n).
Three common cases (cont.)
Compare f(n) with n^{log_b a}:
CASE 3: f(n) = Θ(n^{log_b a + ε}) for some constant ε > 0, i.e., f(n) grows polynomially faster than n^{log_b a} (by an n^ε factor).
Cost of level i = a^i f(n/b^i) = Θ(n^{log_b a + ε} · (b^{−ε})^i), so the cost decreases geometrically as we go deeper in the tree; hence the root dominates.
Solution: T(n) = Θ(f(n)).
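Putting the three cases together: summing the per-level costs of the recursion tree gives, for n a power of b (a standard calculation, not taken verbatim from the slides),

```latex
T(n) \;=\; \Theta\!\bigl(n^{\log_b a}\bigr) \;+\; \sum_{i=0}^{\log_b n - 1} a^{i}\, f\!\bigl(n/b^{i}\bigr),
```

and the sum is a geometric series dominated by its last term in CASE 1 (giving Θ(n^{log_b a})), adds up to Θ(n^{log_b a} · log_b^{k+1} n) in CASE 2, and is dominated by its first term f(n) in CASE 3.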
Example 1
T(n) = 2T(n/2) + 1. Please don't solve it from scratch! Use the master theorem: compute a and b, compute n^{log_b a} and f(n), then compare.
a = 2, b = 2, so n^{log_b a} = n; f(n) = 1.
CASE 1: f(n) = O(n^{1−ε}) for ε = 1, so T(n) = Θ(n).
Example 2
T(n) = 2T(n/2) + n.
a = 2, b = 2, so n^{log_b a} = n; f(n) = n.
CASE 2: f(n) = Θ(n lg^0 n), that is, k = 0, so T(n) = Θ(n lg n).
Example 3
T(n) = 4T(n/2) + n³.
a = 4, b = 2, so n^{log_b a} = n²; f(n) = n³.
CASE 3: f(n) = Ω(n^{2+ε}) for ε = 1, so T(n) = Θ(n³).
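A quick numeric check of the three examples (a sketch of my own): evaluate each recurrence exactly at powers of 2 and divide by the predicted bound; the ratios level off at constants.

```python
from functools import lru_cache
from math import log2

def solve(a, b, f):
    """Exact T(n) = a*T(n/b) + f(n) with T(1) = 1, for n a power of b."""
    @lru_cache(maxsize=None)
    def T(n):
        return 1 if n == 1 else a * T(n // b) + f(n)
    return T

ex1 = solve(2, 2, lambda n: 1)        # predicted Theta(n)
ex2 = solve(2, 2, lambda n: n)        # predicted Theta(n lg n)
ex3 = solve(4, 2, lambda n: n ** 3)   # predicted Theta(n^3)
for k in (10, 14, 18):
    n = 2 ** k
    print(ex1(n) / n, ex2(n) / (n * log2(n)), ex3(n) / n ** 3)  # ratios approach 2, 1, 2
```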
Let’s go master the rest of the day!