DAST Tirgul 2
Methods to Solve Recurrences
Bounding by another function
Iterative method
Guessing + induction
Recursion tree
Master theorem
Iterative Method

Appropriate when:
- Each iteration has only one recursive call, or
- All recursive calls in each iteration have the same input size, like algorithms that split the input: n, (n/2 + n/2), (n/4 + n/4 + n/4 + n/4), …

Steps (assume the input of an iteration of the algorithm has size n):
1. Count f(n): how many operations are done in each iteration (as a function of n), besides the recursive call. Give a simple Θ bound.
2. Determine g(n): the size of the input for the next iteration (as a function of n).
3. Write down: T(n) = T(g(n)) + Θ(f(n))
Guessing + Induction

Appropriate when you think you know what the running time should be.
Caution! Induction is a strong tool, but it can easily be misused.

Steps:
1. Guess the running time.
2. Determine a basis and the induction assumption.
3. Show the induction step.
Example – Insertion Sort
Idea:
- Always keep A[1 … i] sorted.
- Add A[i+1] into its sorted place in A[1 … i+1], i.e. shift every A[j] with A[j] > A[i+1] one position forward in A.

Non-recursive implementation:

InsertionSort(A[1 … n], n)
  for i = 2 … n
    insert A[i] into A[1..i-1]   (this is actually another loop…)

[Figure: array 3 5 6 8 4 2 7 1; A[1..i] is already sorted and A[i+1] is inserted into its place.]
Example – Insertion Sort
Idea:
- Always keep A[1 … i] sorted.
- Add A[i+1] into its sorted place in A[1 … i+1], i.e. shift every A[j] with A[j] > A[i+1] one position forward in A.

Non-recursive implementation:

InsertionSort(A[1 … n], n)
  for i = 2 … n
    for j = i … 2
      if (A[j] < A[j-1]) swap(A[j], A[j-1])

The inner loop implements "insert A[i] into A[1..i-1]" and takes Θ(n) operations in the worst case.

[Figure: array 3 5 6 8 4 2 7 1; A[1..i] is already sorted and A[i+1] is inserted into its place.]
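A minimal, runnable Python sketch of the same non-recursive insertion sort (illustrative names; the pseudocode above is 1-based, the sketch below is 0-based and stops the inner loop early once the element is in place):

def insertion_sort(a):
    # sorts the list a in place
    for i in range(1, len(a)):
        j = i
        # shift a[i] left while it is smaller than its left neighbor
        while j >= 1 and a[j] < a[j - 1]:
            a[j], a[j - 1] = a[j - 1], a[j]
            j -= 1
    return a

# example usage
print(insertion_sort([3, 5, 6, 8, 4, 2, 7, 1]))   # [1, 2, 3, 4, 5, 6, 7, 8]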
Example – Insertion Sort
Recursive version (same algorithm):

RecInsSort(A[1…n], n, cur)
  if (cur = 1) then return
  RecInsSort(A, n, cur-1)   (recursive call)
  insert A[cur] into A[1..cur-1]

Is insertion sort (asymptotically) better than bubble sort? Let's check! (yay…)
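A hedged Python sketch of the recursive version (names are my own; 0-based indexing, recursing on the prefix length cur):

def rec_ins_sort(a, cur=None):
    # sorts a[0..cur-1] in place; cur defaults to the full length
    if cur is None:
        cur = len(a)
    if cur <= 1:
        return a
    rec_ins_sort(a, cur - 1)              # recursive call: sort the first cur-1 elements
    j = cur - 1
    while j >= 1 and a[j] < a[j - 1]:     # insert a[cur-1] into its place in the sorted prefix
        a[j], a[j - 1] = a[j - 1], a[j]
        j -= 1
    return a

print(rec_ins_sort([3, 5, 6, 8, 4, 2, 7, 1]))   # [1, 2, 3, 4, 5, 6, 7, 8]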
Example – Insertion Sort
RecInsSort(A[1…n], n, cur)
  if (cur = 1) then return
  RecInsSort(A, n, cur-1)
  insert A[cur] into A[1..cur-1]

Iterative method steps:
1. Count f(n), how many operations are done in each iteration: Θ(cur).
2. Determine g(n), the size of the input for the next iteration: cur - 1.
3. Write down T(n) = T(g(n)) + Θ(f(n)): T(n) = T(n-1) + Θ(n), with T(1) = Θ(1).
Example – Insertion Sort
T(n) = T(n-1) + Θ(n), T(1) = Θ(1)

For the O(·) bound:
T(n) ≤ T(n-1) + cn ≤ T(n-2) + c(n-1) + cn ≤ …
     ≤ T(1) + 2c + 3c + … + cn ≤ d + c·Σ_{k=2..n} k
     = d + c(n(n+1)/2 - 1) = d + cn²/2 + cn/2 - c = O(n²)

Showing Ω(·) is almost the same, so we get a tight Θ(n²) bound.
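A quick numeric sanity check of the closed form (a sketch; the constants c and d below are arbitrary choices, not values from the analysis):

def T(n, c=1, d=1):
    # unroll T(n) = T(n-1) + c*n with T(1) = d, iteratively to avoid recursion limits
    total = d
    for k in range(2, n + 1):
        total += c * k
    return total

for n in [10, 100, 1000]:
    print(n, T(n), T(n) / n**2)   # the ratio approaches c/2 = 0.5, i.e. T(n) = Θ(n²)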
Example – Insertion Sort
RecInsSort(A[1…n], n, cur)
  if (cur = 1) then return
  RecInsSort(A, n, cur-1)
  insert A[cur] into A[1..cur-1]

Guessing + induction steps:
1. Guess the running time: Θ(n²) (since it can be implemented non-recursively with 2 simple loops).
2. Determine a basis and the induction assumption*: T(n) ≤ cn²; basis T(1) ≤ c·1², which is Θ(1). Notice we need c to be greater than or equal to 1.

* Induction on what? Make sure you know, and always write it down in your proofs.
Example – Insertion Sort
3. Show the induction step:
T(n+1) ≤ T(n) + d(n+1) ≤ cn² + d(n+1)
The first inequality is our analytical observation about the algorithm (the recurrence), not the inductive assumption; the second inequality is where the inductive assumption T(n) ≤ cn² is used.
If we choose c greater than or equal to d, then
T(n+1) ≤ cn² + d(n+1) ≤ c(n² + n + 1) ≤ c(n² + 2n + 1) = c(n+1)²
Bubble Sort vs. Insertion Sort
Recursion Tree Method

Appropriate for algorithms where the input is split between several recursive calls. This usually creates a fractional decrease in the iteration input size, like:
- n, (n/2 + n/2), (n/4 + n/4 + n/4 + n/4), …
- n, (n/3 + 2n/3), (n/9 + 2n/9 + 2n/9 + 4n/9), …
Notice that this splitting process can be viewed as a tree. The method is sort of a mix between the iterative and inductive methods.

Steps (assume the input of an iteration of the algorithm has size n):
1. Count f(n): how many operations are done in each iteration (as a function of n), besides the recursive calls. Give a simple Θ bound.
2. Determine g_i(n): the size of the input for each of the recursive calls (as a function of n).
3. Write down: T(n) = Σ_i T(g_i(n)) + Θ(f(n))
4. Give an (educated) guess of the running time (intuition in the next slides; see also the sketch below).
5. Prove your guess (e.g. by induction).
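A small Python sketch of the per-level bookkeeping for the recurrence used in the upcoming example, T(n) = T(n/3) + T(2n/3) + c·n (the constant c and the cutoff where recursion stops are illustrative assumptions):

from collections import defaultdict

def level_costs(n, c=1.0, cutoff=1.0):
    # expand the recursion tree of T(n) = T(n/3) + T(2n/3) + c*n level by level
    # and record the total non-recursive work done at each level
    costs = defaultdict(float)
    frontier = [n]                  # subproblem sizes at the current level
    level = 0
    while frontier:
        next_frontier = []
        for m in frontier:
            costs[level] += c * m
            if m > cutoff:          # recurse until subproblems are trivially small
                next_frontier += [m / 3, 2 * m / 3]
        frontier = next_frontier
        level += 1
    return costs

for lvl, cost in sorted(level_costs(729).items())[:6]:
    print(lvl, round(cost, 1))      # each full level costs c*n; lower levels cost at most that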
Recursion Tree Method

Example (arbitrary) recursive algorithm:

Algo1(A[1..n], lo, hi)
  diff = hi - lo
  if diff = 0: do 6 actions and return
  if diff = 1: do 7 actions and return
  if diff = 2: do 10 actions and return
  else: do 2·diff actions*
  mid ← lo + int(diff/3)
  Algo1(A, lo, mid)
  Algo1(A, mid+1, hi)

* i.e. the non-recursive work per call is O(n): there exist c, n0 such that for all n > n0 it is at most c·n.
** We neglected the fact that we need to use floor and ceiling. You will prove why this is fine in the exercise (targil).
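A Python sketch that counts Algo1's actions directly, so the guessed Θ(n·log n) behavior can be eyeballed (the base-case costs follow the slide; reading "2·diff" as linear work and splitting at lo + diff//3 are my assumptions):

import math

def algo1_cost(lo, hi):
    # total number of "actions" Algo1 does on A[lo..hi]
    diff = hi - lo
    if diff == 0: return 6
    if diff == 1: return 7
    if diff == 2: return 10
    mid = lo + diff // 3
    return 2 * diff + algo1_cost(lo, mid) + algo1_cost(mid + 1, hi)

for n in [100, 1000, 10000]:
    cost = algo1_cost(1, n)
    print(n, cost, cost / (n * math.log2(n)))   # the ratio stays roughly constant, consistent with Θ(n log n)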
Recursion Tree

[Figure: the recursion tree for T(n) = T(n/3) + T(2n/3) + cn, of height h. The root does cn work, and the total #actions in each (full) level again sums to cn. Sum each row!]
Recursion Tree Method

What is the height h of the tree?

Notice that not all root-to-leaf paths have the same length. In the worst case (the right-most path), the size of the input at each iteration is:
n, (2/3)n, (2/3)²n, (2/3)³n, … , (2/3)^h·n ≤ 1   (for some h*)
so n ≤ (3/2)^h, i.e. h ≥ log_{3/2}(n).

* Notice that h is truly the height of the tree, since h counted our steps until the input size reached 1.

Since h = log_{3/2}(n) and the total work in each level is O(n), we can now guess an n·log_{3/2}(n) bound. Actually, we will prove an n·log₂(n) bound; it is easy to show that these are Θ of each other, since they differ only by a multiplicative constant (review the log rules!).
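A tiny Python check of the height estimate (the cutoff of 1 is the same assumption as above):

import math

def rightmost_depth(n, cutoff=1.0):
    # length of the right-most path: keep taking the 2n/3 branch until the size drops to <= cutoff
    h = 0
    while n > cutoff:
        n = 2 * n / 3
        h += 1
    return h

n = 10**6
print(rightmost_depth(n), math.log(n, 3 / 2))   # the simulated height is the ceiling of log_{3/2}(n)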
Formal proof by induction
Basis: T(3) = 10 ≤ d · 3 · log(3)   (this requires d to be large enough; important!)
Assumption: for any k < n: T(k) ≤ d · k · log(k)   (we keep the possibility of fixing d later)
Inductive Step: for k = n, prove that T(n) ≤ d · n · log(n).
Inductive Step (*)
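A reconstruction of the standard inductive step for T(n) ≤ T(n/3) + T(2n/3) + cn, proving T(n) ≤ d·n·log n (a sketch with log = log₂; the exact constants on the original slide may differ):

\begin{align*}
T(n) &\le T(n/3) + T(2n/3) + cn \\
     &\le d\,\tfrac{n}{3}\log\tfrac{n}{3} + d\,\tfrac{2n}{3}\log\tfrac{2n}{3} + cn
        && \text{(inductive assumption on } n/3, 2n/3 < n\text{)}\\
     &= d\,n\log n - d\,n\left(\tfrac{1}{3}\log 3 + \tfrac{2}{3}\log\tfrac{3}{2}\right) + cn \\
     &= d\,n\log n - d\,n\left(\log 3 - \tfrac{2}{3}\right) + cn \\
     &\le d\,n\log n
        && \text{provided } d \ge \frac{c}{\log 3 - 2/3}. \quad (*)
\end{align*}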
Inductive Step

(*) Choosing d: pick d large enough for the inductive step (and the basis) to go through.
Similarly, one can prove that T(n) = Ω(n·log n) for a suitable constant. Note: this is a different c!
Advanced induction

This is an advanced example of how induction can be used to show traits of algorithms. Again, it is advanced (so don't panic!).

Consider the following algorithm:

Algo2(String S composed of characters in [0…9])
  while (S ≠ 00…0)   (i.e. while S is not all zeros)
    insert a '0' into S between the first and second chars (thus increasing S's length by 1)
    decrease S's integer value by 1 (that is, S ← S - 1)

For example, if S = '11' then we will get: 11; 101, 100; 1000, 0999; 00999, 00998; …

Prove that the algorithm stops! Looks frightening, but it's not. Review the example to easily understand how it works.
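A tiny Python simulation of Algo2, handy for tracing examples (the function name and the return value are my own additions; S is assumed to be a non-empty decimal string):

def algo2(s):
    # run Algo2 on the decimal string s until it is all zeros; return the number of iterations
    steps = 0
    while set(s) != {'0'}:
        s = s[0] + '0' + s[1:]               # insert '0' after the first character
        s = str(int(s) - 1).zfill(len(s))    # S <- S - 1, keeping the length (leading zeros)
        steps += 1
    return steps

print(algo2('11'))   # terminates after roughly a thousand iterations for this input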
Advanced induction

Set:
a = S[1]
b = the numerical value of S[2 … end]

Intuition:
Notice that with each step, either a numerically decreases or b (the value!) numerically decreases.
Notice that adding zeros, though changing S, does not change b's value.
Since S = a · 00…0 · b (as a string), we will claim that S, as a string, decreases lexicographically.
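A short Python sketch that tracks the pair (a, b) during a run of Algo2, to see the intuition in action (trace format is my own):

def trace_pair(s, max_steps=6):
    # print (a, b) = (first digit, numeric value of the rest) before and after each step
    print((int(s[0]), int(s[1:])), s)
    for _ in range(max_steps):
        if set(s) == {'0'}:
            break
        s = s[0] + '0' + s[1:]
        s = str(int(s) - 1).zfill(len(s))
        print((int(s[0]), int(s[1:])), s)

trace_pair('11')
# (1, 1) 11 ; (1, 0) 100 ; (0, 999) 0999 ; (0, 998) 00998 ; ...
# in every step the pair (a, b) decreases in lexicographic order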
Advanced induction

Set:
a = S[1]
b = the numerical value of S[2 … end]

Proof: by double induction. If our claim is P(m, n), then the steps are:
1. Prove P(0, 0) is true.
2. Prove that for all m ≥ 0, P(m, 0) ⇒ P(m+1, 0).
3. Prove that for all m, n ≥ 0, P(m, n) ⇒ P(m, n+1).
Advanced induction - Proof
Claim P(a, b): for any S = a · 00…0 · b, one iteration of the algorithm will decrease S lexicographically, or the algorithm stops.

Base, P(0, 0): a = 0 and b = 0, so S = 00…0 and the algorithm stops.

Let a be in [0..9]. For any such a, assume P(a, 0) is true.
Claim 1: P(a+1, 0) is also true.
Proof 1: since b = 0, inserting a 0 does not alter a or b, but subtracting 1 from S requires b to 'borrow' 1 from a, thus decreasing a's value and therefore S's lexicographical value.

Let a be in [0..9] and b ≥ 0. For any such a, b, assume P(a, b) is true.
Claim 2: P(a, b+1) is also true.
Proof 2: if the suffix value were 0, Claim 1 applies; otherwise, inserting a 0 again does not change the value of S, but subtracting 1 decreases the suffix's value (from b+1 to b) and therefore S's lexicographical value.

Since ℕ×ℕ under lexicographic order is well-ordered (i.e. it has no infinite decreasing sequence), and since (a, b) is in ℕ×ℕ and decreases with every iteration, the algorithm will eventually stop.