CS 3343: Analysis of Algorithms
Lecture 9: Review for midterm 1; Analysis of quick sort

Exam (midterm 1)
Closed-book exam. One cheat sheet allowed (limited to a single page of letter-size paper, double-sided).
Tuesday, Feb 24, 10:00 – 11:25pm
A basic calculator (no graphing) is allowed.

Materials covered
Up to Lecture 8 (Feb 6)
Comparing functions: O, Θ, Ω
- Definition, limit method, L'Hopital's rule, Stirling's formula
Analyzing iterative algorithms
- Know how to count the number of basic operations and express the running time as the sum of a series
- Know how to compute the sum of a series (geometric, arithmetic, or other frequently seen series)
Analyzing recursive algorithms
- Define the recurrence
- Solve the recurrence using the recursion tree / iteration method
- Solve the recurrence using the master method
- Prove using the substitution method

Asymptotic notations
O: ≤    o: <    Ω: ≥    ω: >    Θ: =    (in terms of growth rate)

Mathematical definitions
O(g(n)) = {f(n): there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c·g(n) for all n > n0}
Ω(g(n)) = {f(n): there exist positive constants c and n0 such that 0 ≤ c·g(n) ≤ f(n) for all n > n0}
Θ(g(n)) = {f(n): there exist positive constants c1, c2, and n0 such that 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n > n0}

Big-Oh
Claim: f(n) = 3n² + 10n + 5 ∈ O(n²)
Proof by definition:
  f(n) = 3n² + 10n + 5
       ≤ 3n² + 10n² + 5,   for n > 1
       ≤ 3n² + 10n² + 5n², for n > 1
       = 18n²,             for n > 1
If we let c = 18 and n0 = 1, we have f(n) ≤ c·n² for all n > n0. Therefore, by definition, f(n) = O(n²).

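The constants chosen in the proof can also be sanity-checked numerically; a minimal sketch (checking a finite range is evidence for the choice of witnesses, not a proof):

```python
# Check the witnesses c = 18, n0 = 1 from the proof above over a finite range.
def f(n):
    return 3 * n**2 + 10 * n + 5

c, n0 = 18, 1
assert all(f(n) <= c * n**2 for n in range(n0 + 1, 10000))
print("f(n) <= 18 n^2 holds for all tested n > 1")
```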
Use limits to compare orders of growth
lim_{n→∞} f(n)/g(n) =
  0              ⇒ f(n) ∈ o(g(n)) ⊆ O(g(n))
  c > 0 (constant) ⇒ f(n) ∈ Θ(g(n)), hence also O(g(n)) and Ω(g(n))
  ∞              ⇒ f(n) ∈ ω(g(n)) ⊆ Ω(g(n))
L'Hopital's rule: lim_{n→∞} f(n)/g(n) = lim_{n→∞} f'(n)/g'(n), on the condition that both lim f(n) and lim g(n) are ∞, or both are 0.
Stirling's formula: n! ≈ √(2πn) (n/e)ⁿ

Useful rules for logarithms
For all a > 0, b > 0, c > 0, the following rules hold:
log_b a = log_c a / log_c b = lg a / lg b
- So: log_10 n = log_2 n / log_2 10
log_b aⁿ = n log_b a
- So: log 3ⁿ = n log 3 = Θ(n)
b^(log_b a) = a
- So: 2^(log_2 n) = n
log(ab) = log a + log b
- So: log(3n) = log 3 + log n = Θ(log n)
log(a/b) = log a − log b
- So: log(n/2) = log n − log 2 = Θ(log n)
log_b a = 1 / log_a b
log_b 1 = 0

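These identities can be spot-checked with Python's math module (a sketch; the concrete values are arbitrary):

```python
import math

n = 100.0
# Change of base: log_10 n = log_2 n / log_2 10
assert math.isclose(math.log10(n), math.log2(n) / math.log2(10))
# log_b(a^n) = n log_b a
assert math.isclose(math.log(3**7), 7 * math.log(3))
# b^(log_b a) = a:  2^(log_2 n) = n
assert math.isclose(2 ** math.log2(n), n)
# log(ab) = log a + log b, and log(a/b) = log a - log b
assert math.isclose(math.log(3 * n), math.log(3) + math.log(n))
assert math.isclose(math.log(n / 2), math.log(n) - math.log(2))
# log_b a = 1 / log_a b
assert math.isclose(math.log(8, 2), 1 / math.log(2, 8))
```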
Useful rules for exponentials
For all a > 0, b > 0, c > 0, the following rules hold:
a⁰ = 1   (0⁰ = ? Answer: undefined)
a¹ = a
a⁻¹ = 1/a
(a^m)ⁿ = a^{mn}
(a^m)ⁿ = (aⁿ)^m
- So: (3ⁿ)² = 3^{2n} = (3²)ⁿ = 9ⁿ
a^m · aⁿ = a^{m+n}
- So: n² · n³ = n⁵
- 2ⁿ · 2² = 2^{n+2} = 4 · 2ⁿ = Θ(2ⁿ)

More advanced dominance ranking

Sum of arithmetic series
If a1, a2, …, an is an arithmetic series, then
  a1 + a2 + … + an = n(a1 + an)/2

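A quick numerical check of the closed form (the example series is arbitrary):

```python
# Arithmetic series 3, 7, 11, ..., 47 with common difference d = 4
a = list(range(3, 50, 4))
n = len(a)
# Closed form: sum = n * (first + last) / 2
assert sum(a) == n * (a[0] + a[-1]) // 2
```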
Sum of geometric series
  Σ_{i=0..n} rⁱ = (1 − r^{n+1}) / (1 − r) = Θ(1)    if r < 1
  Σ_{i=0..n} rⁱ = (r^{n+1} − 1) / (r − 1) = Θ(rⁿ)   if r > 1
  Σ_{i=0..n} rⁱ = n + 1 = Θ(n)                      if r = 1

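The closed forms above can be verified by summing directly (a sketch with arbitrary ratios):

```python
def geom_sum(r, n):
    """Sum r^0 + r^1 + ... + r^n directly."""
    return sum(r**i for i in range(n + 1))

# r != 1: closed form (r^(n+1) - 1) / (r - 1)
for r in (0.5, 2.0, 3.0):
    assert abs(geom_sum(r, 10) - (r**11 - 1) / (r - 1)) < 1e-9
# r = 1: the sum is simply n + 1
assert geom_sum(1, 10) == 11
# |r| < 1: partial sums approach 1 / (1 - r), the Theta(1) bound
assert abs(geom_sum(0.5, 100) - 1 / (1 - 0.5)) < 1e-9
```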
Sum manipulation rules
  Σ (c·aᵢ) = c Σ aᵢ
  Σ (aᵢ + bᵢ) = Σ aᵢ + Σ bᵢ
  Σ_{i=p..q} aᵢ = Σ_{i=p..r} aᵢ + Σ_{i=r+1..q} aᵢ

Analyzing non-recursive algorithms
- Decide the parameter (input size)
- Identify the most-executed line (basic operation)
- Is worst-case = average-case?
- T(n) = Σᵢ tᵢ
- T(n) = Θ(f(n))

Analysis of insertion sort
Statement                              cost    times
InsertionSort(A, n) {
  for j = 2 to n {                     c1      n
    key = A[j]                         c2      n−1
    i = j − 1;                         c3      n−1
    while (i > 0) and (A[i] > key) {   c4      S
      A[i+1] = A[i]                    c5      S−(n−1)
      i = i − 1                        c6      S−(n−1)
    }                                          0
    A[i+1] = key                       c7      n−1
  }                                            0
}

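The counts in the table can be observed by instrumenting the algorithm. A runnable 0-based sketch that counts executions of the key comparison (the c4 line); with 0-based indexing the best case does n−1 comparisons and the worst case n(n−1)/2, matching the Θ(n) and Θ(n²) analyses on the following slides:

```python
def insertion_sort(A):
    """Sort A in place; return the number of key comparisons (the c4 test)."""
    comparisons = 0
    for j in range(1, len(A)):          # j = 2..n in the 1-based pseudocode
        key = A[j]
        i = j - 1
        while i >= 0:
            comparisons += 1            # one evaluation of A[i] > key
            if A[i] <= key:
                break                   # inner loop stops: A[i] <= key
            A[i + 1] = A[i]
            i -= 1
        A[i + 1] = key
    return comparisons

assert insertion_sort(list(range(10))) == 9           # best case: n - 1
assert insertion_sort(list(range(9, -1, -1))) == 45   # worst case: n(n-1)/2
```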
Best case
Array already sorted. The inner loop stops when A[i] ≤ key, or i = 0.
S = Σ_{j=1..n} tⱼ, with tⱼ = 1 for all j, so S = n and T(n) = Θ(n).

Worst case
Array originally in reverse sorted order. The inner loop stops when A[i] ≤ key.
S = Σ_{j=1..n} tⱼ, with tⱼ = j, so
S = Σ_{j=1..n} j = 1 + 2 + 3 + … + n = n(n+1)/2, and T(n) = Θ(n²).

Average case
Array in random order. The inner loop stops when A[i] ≤ key.
S = Σ_{j=1..n} tⱼ, with tⱼ = j/2 on average, so
S = Σ_{j=1..n} j/2 = ½ Σ_{j=1..n} j = n(n+1)/4, and T(n) = Θ(n²).

Analyzing recursive algorithms
- Define the recurrence relation
- Solve the recurrence relation
  - Recursion tree (iteration) method
  - Substitution method
  - Master method

Analyzing merge sort
MERGE-SORT A[1..n]                                     T(n)
1. If n = 1, done.                                     Θ(1)
2. Recursively sort A[1..⌈n/2⌉] and A[⌈n/2⌉+1..n].     2T(n/2)
3. "Merge" the 2 sorted lists.                         f(n)
T(n) = 2T(n/2) + Θ(n)

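The three steps above can be sketched as runnable 0-based Python (the linear-time merge is written out explicitly, since it is the f(n) = Θ(n) term of the recurrence):

```python
def merge_sort(A):
    """Return a sorted copy of A; running time T(n) = 2T(n/2) + Theta(n)."""
    if len(A) <= 1:                      # step 1: n = 1, done
        return list(A)
    mid = len(A) // 2                    # step 2: recursively sort both halves
    left, right = merge_sort(A[:mid]), merge_sort(A[mid:])
    # Step 3: merge the 2 sorted lists in linear time.
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]

assert merge_sort([6, 10, 5, 8, 13, 3, 2, 11]) == [2, 3, 5, 6, 8, 10, 11, 13]
```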
Recursive insertion sort
RecursiveInsertionSort(A[1..n])
1. if (n == 1) do nothing;
2. RecursiveInsertionSort(A[1..n−1]);
3. Find the index i in A such that A[i] <= A[n] < A[i+1];
4. Insert A[n] after A[i];

Binary search
BinarySearch(A[1..N], value) {
  if (N == 0) return -1;               // not found
  mid = (1+N)/2;
  if (A[mid] == value) return mid;     // found
  else if (A[mid] > value)
    return BinarySearch(A[1..mid-1], value);
  else
    return BinarySearch(A[mid+1..N], value);
}

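An iterative 0-based sketch of the same search; working with lo/hi indices instead of subarrays sidesteps the index bookkeeping that the slide's subarray-passing version would need (its recursive calls would otherwise return indices relative to the subarray):

```python
def binary_search(A, value):
    """Return an index of value in the sorted list A, or -1 if absent."""
    lo, hi = 0, len(A) - 1
    while lo <= hi:                 # non-empty search range
        mid = (lo + hi) // 2
        if A[mid] == value:
            return mid              # found
        elif A[mid] > value:
            hi = mid - 1            # search the left half
        else:
            lo = mid + 1            # search the right half
    return -1                       # not found

A = [2, 3, 5, 6, 8, 10, 11, 13]
assert binary_search(A, 8) == 4
assert binary_search(A, 7) == -1
```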
Recursion tree
Solve T(n) = 2T(n/2) + n.
The root costs n; its two children cost n/2 each; the four grandchildren cost n/4 each; and so on down to Θ(1) leaves.
Every level sums to n. Height h = log n, and #leaves = n, contributing Θ(n).
Total: Θ(n log n).

Substitution method
Recurrence: T(n) = 2T(n/2) + n.
Guess: T(n) = O(n log n) (e.g., by the recursion tree method).
To prove it, we have to show T(n) ≤ c·n log n for some c > 0 and for all n > n0.
Proof by induction: assume it is true for T(n/2); prove that it is also true for T(n). This means:
  Fact: T(n) = 2T(n/2) + n
  Assumption: T(n/2) ≤ c(n/2) log(n/2)
  Need to prove: T(n) ≤ c·n log n

Proof
To prove T(n) = O(n log n), we need to show that T(n) ≤ c·n log n for some positive c and all sufficiently large n. Let's assume this inequality is true for T(n/2), which means
  T(n/2) ≤ c(n/2) log(n/2)
Substituting T(n/2) in the recurrence by the r.h.s. of the above inequality, we have
  T(n) = 2T(n/2) + n
       ≤ 2 · c(n/2) log(n/2) + n
       = cn (log n − 1) + n
       = cn log n − (cn − n)
       ≤ cn log n   for c ≥ 1 and all n ≥ 0.
Therefore, by definition, T(n) = O(n log n).

Master theorem
T(n) = a·T(n/b) + f(n).  Key: compare f(n) with n^{log_b a}.
CASE 1: f(n) = O(n^{log_b a − ε}) for some ε > 0  ⇒  T(n) = Θ(n^{log_b a}).
CASE 2: f(n) = Θ(n^{log_b a})  ⇒  T(n) = Θ(n^{log_b a} log n).
CASE 3: f(n) = Ω(n^{log_b a + ε}) for some ε > 0, and a·f(n/b) ≤ c·f(n) for some c < 1 (the regularity condition)  ⇒  T(n) = Θ(f(n)).
Optional: extended case 2.

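As a worked instance, applying the theorem to merge sort's recurrence (already solved by the recursion tree above) lands in Case 2:

```latex
% Worked example: T(n) = 2T(n/2) + n  (merge sort)
\begin{align*}
  a = 2,\; b = 2 &\;\Rightarrow\; n^{\log_b a} = n^{\log_2 2} = n \\
  f(n) = n = \Theta(n^{\log_b a}) &\;\Rightarrow\; \text{Case 2} \\
  &\;\Rightarrow\; T(n) = \Theta(n^{\log_b a}\log n) = \Theta(n \log n)
\end{align*}
```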
Analysis of Quick Sort

Quick sort
- Another divide-and-conquer sorting algorithm, like merge sort
- Anyone remember the basic idea? The worst-case and average-case running times?
- We'll learn some new algorithm analysis tricks

Quick sort
Quicksort an n-element array:
1. Divide: Partition the array into two subarrays around a pivot x such that elements in the lower subarray ≤ x ≤ elements in the upper subarray.
2. Conquer: Recursively sort the two subarrays.
3. Combine: Trivial.
Key: a linear-time partitioning subroutine.

Partition
All the action takes place in the partition() function
- Rearranges the subarray in place
- End result: two subarrays, with all values in the first subarray ≤ all values in the second
- Returns the index of the "pivot" element separating the two subarrays

Pseudocode for quicksort
QUICKSORT(A, p, r)
  if p < r then
    q = PARTITION(A, p, r)
    QUICKSORT(A, p, q−1)
    QUICKSORT(A, q+1, r)
Initial call: QUICKSORT(A, 1, n)

Idea of partition
If we are allowed to use a second array, it would be easy: copy elements ≤ pivot to the front of the new array, elements > pivot to the back, then place the pivot between them.
  6 10 5 8 13 3 2 11  →  2 5 3 6 11 13 8 10   (pivot = 6)

Another idea
Keep two iterators: one from the head, one from the tail.
  6 10 5 8 13 3 2 11
  6 2 5 3 13 8 10 11
  3 2 5 6 13 8 10 11

In-place partition
  6 10 5 8 13 3 2 11   (partition in place around the pivot 6)

Partition in words
Partition(A, p, r):
- Select an element to act as the "pivot" (which?)
- Grow two regions, A[p..i] and A[j..r], such that all elements in A[p..i] <= pivot and all elements in A[j..r] >= pivot
- Increment i until A[i] > pivot
- Decrement j until A[j] < pivot
- Swap A[i] and A[j]
- Repeat until i >= j
- Swap A[j] and A[p]
- Return j
Note: different from the book's partition(), which uses two iterators that both move forward.

Partition code
Partition(A, p, r)
  x = A[p];                // pivot is the first element
  i = p;
  j = r + 1;
  while (TRUE) {
    repeat i++; until A[i] > x or i >= j;
    repeat j--; until A[j] < x or j < i;
    if (i < j) Swap(A[i], A[j]);
    else break;
  }
  Swap(A[p], A[j]);
  return j;
What is the running time of partition()? partition() runs in Θ(n) time.

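The pseudocode above can be sketched as runnable 0-based Python; the i >= j / j < i guards become bounds checks, and (as in the slides) distinct elements are assumed:

```python
def partition(A, p, r):
    """Partition A[p..r] around the pivot x = A[p] (the slides' scheme).

    Afterwards A[p..q-1] <= A[q] <= A[q+1..r]; returns q.
    Assumes distinct elements, as in the slides."""
    x = A[p]
    i, j = p, r + 1
    while True:
        i += 1
        while i <= r and A[i] <= x:     # repeat i++ until A[i] > x
            i += 1
        j -= 1
        while A[j] > x:                 # repeat j-- until A[j] < x
            j -= 1
        if i < j:
            A[i], A[j] = A[j], A[i]
        else:
            break
    A[p], A[j] = A[j], A[p]             # final swap: pivot into place
    return j

def quicksort(A, p=0, r=None):
    if r is None:
        r = len(A) - 1
    if p < r:
        q = partition(A, p, r)
        quicksort(A, p, q - 1)
        quicksort(A, q + 1, r)

A = [6, 10, 5, 8, 13, 3, 2, 11]
assert partition(A, 0, len(A) - 1) == 3     # pivot 6 ends at index 3
A = [6, 10, 5, 8, 13, 3, 2, 11]
quicksort(A)
assert A == [2, 3, 5, 6, 8, 10, 11, 13]
```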
Partition example (x = 6, p = 1, r = 8)
  6 10 5 8 13 3 2 11   (scan: i stops at 10, j stops at 2)
  6 2 5 8 13 3 10 11   (swap; then i stops at 8, j stops at 3)
  6 2 5 3 13 8 10 11   (swap; then i and j cross)
  3 2 5 6 13 8 10 11   (final swap of the pivot A[p] with A[j]; return q = 4)

Quick sort example
  6 10 5 8 11 3 2 13
  3 2 5 6 11 8 10 13   (partition around 6)
  2 3 5 6 10 8 11 13   (sort the left subarray; partition the right around 11)
  2 3 5 6 8 10 11 13   (sorted)

Analysis of quicksort
- Assume all input elements are distinct.
- In practice, there are better partitioning algorithms for when duplicate input elements may exist.
- Let T(n) = worst-case running time on an array of n elements.

Worst case of quicksort
- Input sorted or reverse sorted.
- Partition around the min or max element.
- One side of the partition always has no elements.
  T(n) = T(0) + T(n−1) + Θ(n) = T(n−1) + Θ(n) = Θ(n²)   (arithmetic series)

Worst-case recursion tree
T(n) = T(0) + T(n−1) + n
Expanding: the root costs n, with children T(0) and T(n−1); T(n−1) in turn has children T(0) and T(n−2); and so on. The tree is a path of costs n, (n−1), (n−2), …, with a Θ(1) leaf hanging off each node.
height = n
T(n) = Θ(n) + Θ(n²) = Θ(n²)
(The Θ(n) leaves contribute Θ(n); the path contributes the arithmetic series n + (n−1) + … + 1 = Θ(n²).)

Best-case analysis (for intuition only!)
If we're lucky, PARTITION splits the array evenly:
  T(n) = 2T(n/2) + Θ(n) = Θ(n log n)   (same as merge sort)
What if the split is always 1/10 : 9/10?
  T(n) = T(n/10) + T(9n/10) + Θ(n)
What is the solution to this recurrence?

Analysis of the "almost-best" case
T(n) = T(n/10) + T(9n/10) + n. In the recursion tree, every level sums to at most n. The shallowest leaf is at depth log_10 n, the deepest at depth log_{10/9} n, and there are O(n) leaves, so
  n log_10 n ≤ T(n) ≤ n log_{10/9} n + Θ(n)
  T(n) = Θ(n log n)

Quicksort runtimes
- Best-case runtime: T_best(n) ∈ Θ(n log n)
- Worst-case runtime: T_worst(n) ∈ Θ(n²)
Worse than merge sort? Why is it called quicksort, then?
- Its average runtime: T_avg(n) ∈ Θ(n log n)
- Even better, the expected runtime of randomized quicksort is Θ(n log n).

Randomized quicksort
Randomly choose an element as the pivot
- Every time we need to do a partition, throw a die to decide which element to use as the pivot
- Each element has probability 1/n of being selected
Rand-Partition(A, p, r)
  d = random();                      // a random number between 0 and 1
  index = p + floor((r-p+1) * d);    // p <= index <= r
  swap(A[p], A[index]);
  Partition(A, p, r);                // now do partition using A[p] as pivot

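A runnable 0-based sketch combining the random pivot choice with the earlier partition scheme; random.randint stands in for the random()/floor arithmetic, and distinct elements are assumed as before:

```python
import random

def partition(A, p, r):
    """Partition A[p..r] around the pivot A[p]; return the pivot's final index."""
    x, i, j = A[p], p, r + 1
    while True:
        i += 1
        while i <= r and A[i] <= x:
            i += 1
        j -= 1
        while A[j] > x:
            j -= 1
        if i < j:
            A[i], A[j] = A[j], A[i]
        else:
            break
    A[p], A[j] = A[j], A[p]
    return j

def rand_quicksort(A, p=0, r=None):
    if r is None:
        r = len(A) - 1
    if p < r:
        k = random.randint(p, r)     # each index in p..r is equally likely
        A[p], A[k] = A[k], A[p]      # move the random pivot to the front
        q = partition(A, p, r)
        rand_quicksort(A, p, q - 1)
        rand_quicksort(A, q + 1, r)

A = [6, 10, 5, 8, 13, 3, 2, 11]
rand_quicksort(A)
assert A == [2, 3, 5, 6, 8, 10, 11, 13]
```

The random choice makes the expected Θ(n log n) runtime hold for every input, rather than on average over inputs.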
Running time of randomized quicksort
The expected running time is an average over all cases:
  T(n) = T(0) + T(n−1) + dn    if the split is 0 : n−1,
         T(1) + T(n−2) + dn    if the split is 1 : n−2,
         …
         T(n−1) + T(0) + dn    if the split is n−1 : 0.
Taking the expectation (each split is equally likely):
  T(n) = (1/n) Σ_{k=0..n−1} (T(k) + T(n−1−k) + dn) = (2/n) Σ_{k=0..n−1} T(k) + dn

Solving recurrences
1. Recursion tree (iteration) method — good for guessing an answer
2. Substitution method — generic method; rigid, but may be hard
3. Master method — easy to learn, but useful in limited cases only; some tricks may help in other cases

Substitution method
The most general method to solve a recurrence (prove O and Ω separately):
1. Guess the form of the solution (e.g., using recursion trees, or expansion)
2. Verify by induction (the inductive step)

Expected running time of quicksort
Recurrence: T(n) = (2/n) Σ_{k=0..n−1} T(k) + dn
Guess: T(n) = O(n log n)
We need to show that T(n) ≤ c·n log n for some c and sufficiently large n.
(We use T(n) instead of E[T(n)] for convenience.)

Proof
Fact: Σ_{k=1..n−1} k log k ≤ ½ n² log n − ⅛ n²
Need to show: T(n) ≤ c·n log n
Assume: T(k) ≤ c·k log k for 0 ≤ k ≤ n−1
Proof:
  T(n) = (2/n) Σ_{k=0..n−1} T(k) + dn
       ≤ (2/n) Σ_{k=1..n−1} c·k log k + dn
       ≤ (2c/n) (½ n² log n − ⅛ n²) + dn
       = c·n log n − (c/4)·n + dn
       ≤ c·n log n,   if c ≥ 4d.
Therefore, by definition, T(n) = O(n log n), using the fact stated above.

Tightly bounding the key summation
What are we doing here? Split the summation for a tighter bound:
  Σ_{k=1..n−1} k lg k = Σ_{k=1..⌈n/2⌉−1} k lg k + Σ_{k=⌈n/2⌉..n−1} k lg k
The lg k in the second term is bounded by lg n; move the lg n outside the summation:
  ≤ Σ_{k=1..⌈n/2⌉−1} k lg k + lg n · Σ_{k=⌈n/2⌉..n−1} k
The lg k in the first term is bounded by lg(n/2) = lg n − 1; move (lg n − 1) outside the summation:
  ≤ (lg n − 1) Σ_{k=1..⌈n/2⌉−1} k + lg n · Σ_{k=⌈n/2⌉..n−1} k
Distribute the (lg n − 1); the summations overlap in range, so combine them:
  = lg n · Σ_{k=1..n−1} k − Σ_{k=1..⌈n/2⌉−1} k
Apply the Gaussian series Σ_{k=1..m} k = m(m+1)/2, rearrange the first term, and place an upper bound on the second:
  ≤ ½ n(n−1) lg n − ½ (n/2 − 1)(n/2)
Multiply it all out (for n ≥ 2, the leftover −½ n lg n + n/4 term is ≤ 0):
  ≤ ½ n² lg n − ⅛ n²
This is the fact used in the proof above: Σ_{k=1..n−1} k lg k ≤ ½ n² lg n − ⅛ n².
