CS 3343: Analysis of Algorithms

CS 3343: Analysis of Algorithms
Lecture 6 & 7: Master theorem and substitution method
1/2/2019

Analyzing recursive algorithms
- Defining a recurrence
- Solving the recurrence

Solving recurrences
- Recursion-tree / iteration method: good for guessing an answer
- Substitution method: generic and rigorous, but may be hard to apply
- Master method: easy to learn, but useful only in limited cases; some tricks may help in other cases

The master method
The master method applies to recurrences of the form T(n) = a T(n/b) + f(n), where a ≥ 1, b > 1, and f is asymptotically positive.
- Divide the problem into a subproblems, each of size n/b.
- Conquer the subproblems by solving them recursively.
- Combine the subproblem solutions.
Divide + combine takes f(n) time.

Master theorem
T(n) = a T(n/b) + f(n). Key: compare f(n) with n^(log_b a).
- CASE 1: f(n) = O(n^(log_b a − ε)) ⇒ T(n) = Θ(n^(log_b a)).
- CASE 2: f(n) = Θ(n^(log_b a)) ⇒ T(n) = Θ(n^(log_b a) log n).
- CASE 3: f(n) = Ω(n^(log_b a + ε)) and a f(n/b) ≤ c f(n) (the regularity condition) ⇒ T(n) = Θ(f(n)).

Case 1
f(n) = O(n^(log_b a − ε)) for some constant ε > 0. Alternatively: n^(log_b a) / f(n) = Ω(n^ε).
Intuition: f(n) grows polynomially slower than n^(log_b a); i.e., n^(log_b a) dominates f(n) by an n^ε factor for some ε > 0.
Solution: T(n) = Θ(n^(log_b a)).
Example: T(n) = 4T(n/2) + n. Here b = 2, a = 4, f(n) = n, and log_2 4 = 2. f(n) = n = O(n^(2−ε)), or n²/n = n¹ = Ω(n), for ε = 1 ⇒ T(n) = Θ(n²).
Non-example: T(n) = 2T(n/2) + n/log n. Here b = 2, a = 2, f(n) = n/log n, and log_2 2 = 1. f(n) = n/log n ≠ O(n^(1−ε)), since n¹/f(n) = log n ≠ Ω(n^ε) for any ε > 0 ⇒ CASE 1 does not apply.
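The Case 1 example T(n) = 4T(n/2) + n can be sanity-checked numerically. A minimal sketch, assuming the base case T(1) = 1 (the asymptotic answer does not depend on this choice):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # Recurrence T(n) = 4*T(n/2) + n, base case T(1) = 1 (an assumption)
    if n <= 1:
        return 1
    return 4 * T(n // 2) + n

# For n a power of two this recurrence solves exactly to 2n^2 - n,
# consistent with the Case 1 answer Theta(n^2).
for n in (2**k for k in range(1, 12)):
    assert T(n) == 2 * n * n - n
```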

Case 2
f(n) = Θ(n^(log_b a)).
Intuition: f(n) and n^(log_b a) have the same asymptotic order.
Solution: T(n) = Θ(n^(log_b a) log n).
Examples:
- T(n) = T(n/2) + 1: log_b a = 0
- T(n) = 2T(n/2) + n: log_b a = 1
- T(n) = 4T(n/2) + n²: log_b a = 2
- T(n) = 8T(n/2) + n³: log_b a = 3

Case 3
f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0. Alternatively: f(n) / n^(log_b a) = Ω(n^ε).
Intuition: f(n) grows polynomially faster than n^(log_b a); i.e., f(n) dominates n^(log_b a) by an n^ε factor for some ε > 0.
Solution: T(n) = Θ(f(n)).
Example: T(n) = T(n/2) + n. Here b = 2, a = 1, f(n) = n, and n^(log_2 1) = n⁰ = 1. f(n) = n = Ω(n^(0+ε)), or n/1 = n = Ω(n), for ε = 1 ⇒ T(n) = Θ(n).
Non-example: T(n) = T(n/2) + log n. Here b = 2, a = 1, f(n) = log n, and n^(log_2 1) = n⁰ = 1. f(n) = log n ≠ Ω(n^(0+ε)), since f(n)/n^(log_2 1) = log n ≠ Ω(n^ε) for any ε > 0 ⇒ CASE 3 does not apply.
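The Case 3 example T(n) = T(n/2) + n has an exact closed form on powers of two that matches Θ(n). A small check, assuming the base case T(1) = 1:

```python
def T(n):
    # Recurrence T(n) = T(n/2) + n, base case T(1) = 1 (an assumption)
    return 1 if n <= 1 else T(n // 2) + n

# On powers of two, T(n) = n + n/2 + n/4 + ... + 1 = 2n - 1 = Theta(n),
# as Case 3 predicts: f(n) = n dominates n^(log_2 1) = 1.
for n in (2**k for k in range(1, 20)):
    assert T(n) == 2 * n - 1
```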

Regularity condition
a f(n/b) ≤ c f(n) for some c < 1 and all sufficiently large n.
This is needed for the master method to be mathematically correct: it rules out non-converging functions, such as ones built from sine or cosine.
For most f(n) you'll see (e.g., polynomials, logarithms, exponentials), you can safely ignore this condition, because it is implied by the condition f(n) = Ω(n^(log_b a + ε)).

Examples
T(n) = 4T(n/2) + n: a = 4, b = 2 ⇒ n^(log_b a) = n²; f(n) = n.
CASE 1: f(n) = O(n^(2−ε)) for ε = 1 ⇒ T(n) = Θ(n²).
T(n) = 4T(n/2) + n²: a = 4, b = 2 ⇒ n^(log_b a) = n²; f(n) = n².
CASE 2: f(n) = Θ(n²) ⇒ T(n) = Θ(n² log n).

Examples
T(n) = 4T(n/2) + n³: a = 4, b = 2 ⇒ n^(log_b a) = n²; f(n) = n³.
CASE 3: f(n) = Ω(n^(2+ε)) for ε = 1, and 4(n/2)³ ≤ cn³ (regularity condition) for c = 1/2 ⇒ T(n) = Θ(n³).
T(n) = 4T(n/2) + n²/log n: a = 4, b = 2 ⇒ n^(log_b a) = n²; f(n) = n²/log n.
The master method does not apply: f(n) is asymptotically smaller than n², but not polynomially smaller, since for every constant ε > 0 we have n^ε = ω(log n).

Examples
T(n) = 4T(n/2) + n^2.5: a = 4, b = 2 ⇒ n^(log_b a) = n²; f(n) = n^2.5.
CASE 3: f(n) = Ω(n^(2+ε)) for ε = 0.5, and 4(n/2)^2.5 ≤ cn^2.5 (regularity condition) for c = 0.75 ⇒ T(n) = Θ(n^2.5).
T(n) = 4T(n/2) + n² log n: a = 4, b = 2 ⇒ n^(log_b a) = n²; f(n) = n² log n.
The master method does not apply: f(n) is asymptotically larger than n², but not polynomially larger, since for every constant ε > 0 we have n^ε = ω(log n).

How do I know which case to use? Do I need to try all three cases one by one?

Compare f(n) with n^(log_b a):
- f(n) ∈ o(n^(log_b a)): possibly CASE 1; check whether n^(log_b a) / f(n) ∈ Ω(n^ε).
- f(n) ∈ Θ(n^(log_b a)): CASE 2.
- f(n) ∈ ω(n^(log_b a)): possibly CASE 3; check whether f(n) / n^(log_b a) ∈ Ω(n^ε).
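For the common special case f(n) = n^k, the decision procedure above collapses to comparing the exponent k with log_b a, and the Ω(n^ε) checks and the regularity condition hold automatically whenever the exponents differ. A sketch of that special case (the function name is illustrative, not from the slides):

```python
import math

def master_case(a, b, k):
    """Classify T(n) = a*T(n/b) + n**k by comparing k with log_b(a).

    Only handles pure polynomial driving functions f(n) = n^k.
    Returns (case number, Theta bound as a string).
    """
    crit = math.log(a, b)  # critical exponent log_b(a)
    if k < crit:           # f polynomially smaller: case 1
        return (1, f"Theta(n^{crit:g})")
    if k == crit:          # same order: case 2
        return (2, f"Theta(n^{crit:g} log n)")
    return (3, f"Theta(n^{k:g})")  # f polynomially larger: case 3

print(master_case(4, 2, 1))  # T(n) = 4T(n/2) + n   -> case 1
print(master_case(4, 2, 2))  # T(n) = 4T(n/2) + n^2 -> case 2
print(master_case(4, 2, 3))  # T(n) = 4T(n/2) + n^3 -> case 3
```

Note that `math.log(a, b)` is computed in floating point, so the exact `k == crit` comparison is only reliable when log_b a comes out exact (as it does for the examples here).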

Examples
- log_b a = 2, f(n) = n: n = o(n²) ⇒ check case 1.
- log_b a = 2, f(n) = n²: n² = Θ(n²) ⇒ case 2.
- log_b a = 1.3, f(n) = n: n = o(n^1.3) ⇒ check case 1.
- log_b a = 0.5, f(n) = n: n = ω(n^0.5) ⇒ check case 3.
- log_b a = 0, f(n) = n log n: n log n = ω(n⁰) ⇒ check case 3.
- log_b a = 1, f(n) = n log n: n log n = ω(n) ⇒ check case 3.

More examples

Some tricks
- Changing variables
- Obtaining upper and lower bounds: make a guess based on the bounds, then prove it using the substitution method

Changing variables
T(n) = 2T(n−1) + 1.
Let n = log m, i.e., m = 2^n ⇒ T(log m) = 2T(log(m/2)) + 1.
Let S(m) = T(log m) = T(n) ⇒ S(m) = 2S(m/2) + 1 ⇒ S(m) = Θ(m) ⇒ T(n) = S(m) = Θ(m) = Θ(2^n).
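The conclusion T(n) = Θ(2^n) can be confirmed directly: with an assumed base case T(0) = 0, this recurrence has the exact closed form 2^n − 1.

```python
def T(n):
    # Recurrence T(n) = 2*T(n-1) + 1, base case T(0) = 0 (an assumption)
    return 0 if n == 0 else 2 * T(n - 1) + 1

# Exact closed form T(n) = 2^n - 1, consistent with Theta(2^n)
for n in range(20):
    assert T(n) == 2**n - 1
```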

Changing variables
T(n) = T(√n) + 1.
Let n = 2^m ⇒ √n = 2^(m/2). We then have T(2^m) = T(2^(m/2)) + 1.
Let S(m) = T(2^m) = T(n) ⇒ S(m) = S(m/2) + 1 ⇒ S(m) = Θ(log m) = Θ(log log n) ⇒ T(n) = Θ(log log n).

Changing variables
T(n) = 2T(n−2) + 1.
Let n = log m, i.e., m = 2^n ⇒ T(log m) = 2T(log(m/4)) + 1.
Let S(m) = T(log m) = T(n) ⇒ S(m) = 2S(m/4) + 1 ⇒ S(m) = Θ(m^(1/2)) ⇒ T(n) = S(m) = Θ((2^n)^(1/2)) = Θ((√2)^n) ≈ Θ(1.4^n).
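This answer, too, checks out exactly: with assumed base cases T(0) = T(1) = 0, even arguments n = 2k give T(n) = 2^k − 1 = (√2)^n − 1.

```python
def T(n):
    # Recurrence T(n) = 2*T(n-2) + 1, base cases T(0) = T(1) = 0 (assumptions)
    return 0 if n < 2 else 2 * T(n - 2) + 1

# For even n = 2k: T(n) = 2^k - 1 = (sqrt(2))^n - 1, i.e. Theta((sqrt 2)^n)
for k in range(1, 20):
    assert T(2 * k) == 2**k - 1
```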

Obtaining bounds
Solve the Fibonacci-style recurrence T(n) = T(n−1) + T(n−2) + 1:
T(n) ≥ 2T(n−2) + 1 [1]
T(n) ≤ 2T(n−1) + 1 [2]
Solving [1], we obtain T(n) = Ω((√2)^n) ≈ Ω(1.4^n).
Solving [2], we obtain T(n) = O(2^n).
Actually, T(n) ≈ Θ(1.62^n); the base is the golden ratio φ = (1 + √5)/2.
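The sandwich can be observed numerically. A sketch, assuming base cases T(0) = T(1) = 1: the growth ratio T(n+1)/T(n) settles at the golden ratio φ ≈ 1.618, strictly between the derived bounds √2 ≈ 1.41 and 2.

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # Recurrence T(n) = T(n-1) + T(n-2) + 1,
    # base cases T(0) = T(1) = 1 (assumptions)
    if n < 2:
        return 1
    return T(n - 1) + T(n - 2) + 1

# The growth ratio approaches phi = (1 + sqrt(5))/2 ~ 1.618,
# between the lower bound sqrt(2) and the upper bound 2.
ratio = T(40) / T(39)
assert 1.4 < ratio < 2
assert abs(ratio - (1 + 5**0.5) / 2) < 1e-6
```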

Obtaining bounds
T(n) = T(n/2) + log n.
T(n) ∈ Ω(log n).
T(n) ∈ O(T(n/2) + n^ε); solving T(n) = T(n/2) + n^ε, we obtain T(n) = O(n^ε), for any ε > 0.
So T(n) ∈ O(n^ε) for any ε > 0: T(n) is unlikely to be polynomial.
Actually, T(n) = Θ(log² n) by the extended case 2.

Extended Case 2
CASE 2: f(n) = Θ(n^(log_b a)) ⇒ T(n) = Θ(n^(log_b a) log n).
Extended CASE 2 (k ≥ 0): f(n) = Θ(n^(log_b a) log^k n) ⇒ T(n) = Θ(n^(log_b a) log^(k+1) n).
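The extended-case-2 answer T(n) = T(n/2) + log n = Θ(log² n) from the previous slide can be verified exactly on powers of two, assuming a base case T(1) = 0:

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # Recurrence T(n) = T(n/2) + log2(n), base case T(1) = 0 (an assumption)
    if n <= 1:
        return 0.0
    return T(n // 2) + math.log2(n)

# For n = 2^j: T(n) = j + (j-1) + ... + 1 = j(j+1)/2, i.e. Theta(log^2 n),
# matching extended Case 2 with a = 1, b = 2, k = 1.
for j in range(1, 20):
    assert T(2**j) == j * (j + 1) / 2
```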

Solving recurrences (recap)
- Recursion-tree / iteration method: good for guessing an answer, but the guess needs to be verified
- Substitution method: generic and rigorous, but may be hard to apply
- Master method: easy to learn, but useful only in limited cases; some tricks may help in other cases

Substitution method
The most general method to solve a recurrence (prove O and Ω separately):
- Guess the form of the solution (e.g., by the recursion-tree / iteration method).
- Verify by induction (inductive step).
- Solve for the constants c and n₀ (base case of the induction).

Proof by substitution
Recurrence: T(n) = 2T(n/2) + n.
Guess: T(n) = O(n log n) (e.g., by the recursion-tree method).
To prove it, we have to show T(n) ≤ c n log n for some c > 0 and all n > n₀.
Proof by induction: assume the bound holds for T(n/2), and prove it also holds for T(n). This means:
Given: T(n) = 2T(n/2) + n
Assume: T(n/2) ≤ (cn/2) log(n/2)
Need to prove: T(n) ≤ c n log n

Proof
Given: T(n) = 2T(n/2) + n
Assume: T(n/2) ≤ (cn/2) log(n/2)
Need to prove: T(n) ≤ c n log n
Proof: Substituting T(n/2) ≤ (cn/2) log(n/2) into the recurrence, we get
T(n) = 2T(n/2) + n
≤ cn log(n/2) + n
= cn log n − cn + n
= cn log n − (c − 1)n
≤ cn log n for all n > 0 (if c ≥ 1).
Therefore, by definition, T(n) = O(n log n).

Proof by substitution
Recurrence: T(n) = 2T(n/2) + n.
Guess: T(n) = Ω(n log n).
To prove it, we have to show T(n) ≥ c n log n for some c > 0 and all n > n₀.
Proof by induction: assume the bound holds for T(n/2), and prove it also holds for T(n). This means:
Given: T(n) = 2T(n/2) + n
Assume: T(n/2) ≥ (cn/2) log(n/2)
Need to prove: T(n) ≥ c n log n

Proof
Given: T(n) = 2T(n/2) + n
Assume: T(n/2) ≥ (cn/2) log(n/2)
Need to prove: T(n) ≥ c n log n
Proof: Substituting T(n/2) ≥ (cn/2) log(n/2) into the recurrence, we get
T(n) = 2T(n/2) + n
≥ cn log(n/2) + n
= cn log n − cn + n
= cn log n + (1 − c)n
≥ cn log n for all n > 0 (if c ≤ 1).
Therefore, by definition, T(n) = Ω(n log n).
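Both substitution bounds can be sanity-checked numerically. A sketch, assuming a base case T(1) = 1: on powers of two the recurrence solves exactly to n log₂ n + n, which is sandwiched between n log₂ n and 2 n log₂ n for n ≥ 2.

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # Recurrence T(n) = 2*T(n/2) + n, base case T(1) = 1 (an assumption)
    if n <= 1:
        return 1
    return 2 * T(n // 2) + n

# On powers of two T(n) = n*log2(n) + n exactly, so for n >= 2:
#   n*log2(n) <= T(n) <= 2*n*log2(n),
# witnessing both T(n) = Omega(n log n) and T(n) = O(n log n).
for k in range(1, 16):
    n = 2**k
    assert T(n) == n * k + n
    assert n * math.log2(n) <= T(n) <= 2 * n * math.log2(n)
```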

More substitution method examples (1)
Prove that T(n) = 3T(n/3) + n = O(n log n).
Need to prove that T(n) ≤ c n log n for some c and sufficiently large n.
Assume the bound holds for T(n/3), i.e., T(n/3) ≤ (cn/3) log(n/3).

T(n) = 3T(n/3) + n
≤ 3 · (cn/3) log(n/3) + n
= cn log n − cn log 3 + n
= cn log n − (cn log 3 − n)
≤ cn log n (if cn log 3 − n ≥ 0).
cn log 3 − n ≥ 0 ⇒ c log 3 − 1 ≥ 0 (for n > 0) ⇒ c ≥ 1/log 3 = log_3 2.
Therefore, T(n) = 3T(n/3) + n ≤ cn log n for c = log_3 2 and n > 0. By definition, T(n) = O(n log n).

More substitution method examples (2)
Prove that T(n) = T(n/3) + T(2n/3) + n = O(n log n).
Need to prove that T(n) ≤ c n log n for some c and sufficiently large n.
Assume the bound holds for T(n/3) and T(2n/3), i.e.,
T(n/3) ≤ (cn/3) log(n/3)
T(2n/3) ≤ (2cn/3) log(2n/3)

T(n) = T(n/3) + T(2n/3) + n
≤ (cn/3) log(n/3) + (2cn/3) log(2n/3) + n
= cn log n + n − cn(log 3 − 2/3)
= cn log n + n(1 − c log 3 + 2c/3)
≤ cn log n, for all n > 0 (if 1 − c log 3 + 2c/3 ≤ 0).
c log 3 − 2c/3 ≥ 1 ⇒ c ≥ 1/(log 3 − 2/3) > 0.
Therefore, T(n) = T(n/3) + T(2n/3) + n ≤ cn log n for c = 1/(log 3 − 2/3) and n > 0. By definition, T(n) = O(n log n).

More substitution method examples (3)
Prove that T(n) = 3T(n/4) + n² = O(n²).
Need to prove that T(n) ≤ c n² for some c and sufficiently large n.
Assume the bound holds for T(n/4), i.e., T(n/4) ≤ c(n/4)² = cn²/16.

T(n) = 3T(n/4) + n²
≤ 3cn²/16 + n²
= (3c/16 + 1) n²
≤ cn², provided 3c/16 + 1 ≤ c, i.e., c ≥ 16/13.
Therefore, T(n) = 3T(n/4) + n² ≤ cn² for c = 16/13 and all n. By definition, T(n) = O(n²).
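The constant 16/13 from this proof can itself be checked numerically. A sketch, assuming a base case T(1) = 1 (so the base case T(1) = 1 ≤ 16/13 holds):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # Recurrence T(n) = 3*T(n/4) + n^2, base case T(1) = 1 (an assumption)
    if n <= 1:
        return 1
    return 3 * T(n // 4) + n * n

# The substitution proof's bound T(n) <= (16/13) * n^2 holds on powers of 4,
# and the ratio T(n)/n^2 in fact converges to 16/13 from below.
for k in range(1, 12):
    n = 4**k
    assert T(n) <= (16 / 13) * n * n
```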

Avoiding pitfalls
Guess: T(n) = 2T(n/2) + n = O(n).
Need to prove that T(n) ≤ cn.
Assume T(n/2) ≤ cn/2.
Then T(n) ≤ 2 · (cn/2) + n = cn + n = O(n).
What's wrong? We need to prove T(n) ≤ cn, not T(n) ≤ cn + n; the extra +n term cannot be absorbed, so the induction fails.

Subtleties
Prove that T(n) = T(⌊n/2⌋) + T(⌈n/2⌉) + 1 = O(n).
Need to prove that T(n) ≤ cn.
Assume the bound holds for T(⌊n/2⌋) and T(⌈n/2⌉):
T(n) ≤ c⌊n/2⌋ + c⌈n/2⌉ + 1 = cn + 1.
Is this a correct proof? No! We have to prove T(n) ≤ cn, not cn + 1.
However, we can strengthen the hypothesis and prove T(n) ≤ cn − 1 instead, which does go through and still implies T(n) = O(n).

Making a good guess
T(n) = 2T(n/2 + 17) + n.
As n approaches infinity, n/2 + 17 is not too different from n/2, so we can guess T(n) = Θ(n log n).
Proof sketch (lower bound): assume T(n/2 + 17) ≥ c(n/2 + 17) log(n/2 + 17). Then
T(n) = n + 2T(n/2 + 17)
≥ n + 2c(n/2 + 17) log(n/2 + 17)
= n + cn log(n/2 + 17) + 34c log(n/2 + 17)
≥ cn log(n/2 + 17) + 34c log(n/2 + 17) ...
Alternatively, we can guess T(n) = Θ((n − 17) log(n − 17)), trying to get rid of the +17. Details skipped.