1
Recursion
Ali
2
Recursion in Real Life
3
What is recursion? Recursion is a technique that solves a problem by solving smaller problems of the same type. Equivalently, it is an already active method (or subprogram) being invoked by itself directly, or being invoked by another method (or subprogram) indirectly.
4
Basic rules of recursion
1. Base cases: always have at least one case that can be solved without using recursion.
2. Make progress: any recursive call must make progress toward a base case.
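As a minimal illustration of the two rules (this example is not from the slides; the class and method names are made up), a recursive sum of 1..n has a base case solved directly and a recursive call that makes progress toward it:

```java
class SumDemo {
    static int sum(int n) {
        if (n == 0) return 0;   // rule 1: base case, solved without recursion
        return n + sum(n - 1);  // rule 2: n - 1 makes progress toward the base case
    }
    public static void main(String[] args) {
        System.out.println(sum(5)); // 0 + 1 + 2 + 3 + 4 + 5 = 15
    }
}
```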
5
Four Criteria For Recursion
A recursive function calls itself; this action is what makes the solution recursive.
Each recursive call solves an identical, but smaller, problem: a recursive function solves a problem by solving another problem that is identical in nature but smaller in size.
A test for the base case enables the recursive calls to stop: there must be a case of the problem (known as the base case or stopping case) that is handled differently from the other cases, without a recursive call. In the base case, the recursive calls stop and the problem is solved directly.
Eventually, one of the smaller problems must be the base case: the manner in which the size of the problem diminishes ensures that the base case is eventually reached.
6
Recursive function
A recursive function is a function that calls itself, or a function that is part of a cycle in the sequence of function calls (f1 → f2 → … → fn → f1).
7
Example
A simple example of a recursively defined function is the factorial function:
n! = 1 · 2 · 3 · 4 ··· (n−2) · (n−1) · n
i.e., the product of the first n positive integers (by convention, the empty product is 1, so that 0! = 1).
8
Types
Direct recursion: a procedure invokes itself.
procedure Alpha;
begin
  Alpha
end;
Indirect (or mutual) recursion: procedures invoke each other.
procedure Alpha;
begin
  Beta
end;
procedure Beta;
begin
  Alpha
end;
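The two forms can be sketched in Java as well (an illustration, not from the slides; the Pascal procedures above never terminate, so these versions add base cases so they actually stop):

```java
class RecursionTypes {
    // Direct recursion: alpha invokes itself.
    static int alpha(int n) {
        if (n == 0) return 0;          // base case
        return 1 + alpha(n - 1);       // direct recursive call
    }
    // Indirect (mutual) recursion: isEven calls isOdd, which calls isEven.
    static boolean isEven(int n) {
        if (n == 0) return true;       // base case
        return isOdd(n - 1);
    }
    static boolean isOdd(int n) {
        if (n == 0) return false;      // base case
        return isEven(n - 1);
    }
    public static void main(String[] args) {
        System.out.println(alpha(4));   // 4
        System.out.println(isEven(10)); // true
    }
}
```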
9
Why use recursion? Recursion is a method-based technique for implementing iteration. Recursive programs are often more succinct, elegant, and easier to understand than their iterative counterparts. Recursion is also a technique that is useful for defining relationships, and for designing algorithms that implement those relationships.
10
Basis of recursion
If the sub-problems are similar to the original, we may be able to employ recursion. Two requirements:
(1) the sub-problems must be simpler than the original problem;
(2) after a finite number of subdivisions, a sub-problem must be encountered that can be solved outright.
11
Different views
Recursive definition: n! = n · (n−1)! (non-math examples are common too).
Recursive procedure: a procedure that calls itself.
Recursive data structure: a data structure that contains a pointer to an instance of itself:
public class ListNode {
    Object nodeItem;
    ListNode next, previous;
    …
}
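Operations on a recursive data structure are naturally recursive themselves. As a sketch (not from the slides; a simplified singly linked ListNode and a made-up length method), the length of a list mirrors the structure: an empty list has length 0, otherwise it is 1 plus the length of the tail:

```java
class ListDemo {
    static class ListNode {
        Object nodeItem;
        ListNode next;   // the recursive part: a reference to another ListNode
        ListNode(Object item, ListNode next) { this.nodeItem = item; this.next = next; }
    }
    static int length(ListNode node) {
        if (node == null) return 0;    // base case: empty list
        return 1 + length(node.next);  // smaller problem: the tail of the list
    }
    public static void main(String[] args) {
        ListNode list = new ListNode("a", new ListNode("b", new ListNode("c", null)));
        System.out.println(length(list)); // 3
    }
}
```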
12
Recursive algorithm
A recursive algorithm solves a problem by possibly using the result of applying itself to a simpler problem (e.g., a recursively drawn picture).
13
Properties of recursive algorithm
A recursive algorithm solves the large problem by using its solution to a simpler sub-problem (a divide-and-conquer approach). Eventually the sub-problem is simple enough that it can be solved without applying the algorithm recursively; this is called the base case.
14
Asymptotic Efficiency of Recurrences
Goal: find asymptotic bounds for recursive equations. Methods:
Substitution method
Domain transformation (changing variables)
Recursion-tree method
Master method (master theorem), which provides bounds for T(n) = aT(n/b) + f(n), where a ≥ 1 (the number of subproblems), b > 1 (n/b is the size of each subproblem), and f(n) is a given function.
15
Example: the factorial function
The factorial function multiplies together all numbers from 1 to n, denoted n!:
n! = n · (n−1) · (n−2) · … · 2 · 1
Recursively:
n! = n · (n−1)!  if n > 0   (general case: uses the solution to a simpler sub-problem)
n! = 1           if n = 0   (base case: the solution is given directly)
16
Example of factorial(4)
public int factorial(int n) {
    if (n == 0)
        return 1;                      // base case
    else
        return n * factorial(n - 1);   // general case
}
Trace of factorial(4):
factorial(4): n = 4, returns 4 * factorial(3)
factorial(3): n = 3, returns 3 * factorial(2)
factorial(2): n = 2, returns 2 * factorial(1)
factorial(1): n = 1, returns 1 * factorial(0)
factorial(0): n = 0, returns 1
17
Recurrences
MERGE-SORT, with details:
T(n) = Θ(1)                          if n = 1
T(n) = T(⌈n/2⌉) + T(⌊n/2⌋) + Θ(n)    if n > 1
Ignoring the details, T(n) = 2T(n/2) + Θ(n):
T(n) = Θ(1)              if n = 1
T(n) = 2T(n/2) + Θ(n)    if n > 1
18
Example recurrences
T(n) = T(n−1) + n → Θ(n²): a recursive algorithm that loops through the input to eliminate one item.
T(n) = T(n/2) + c → Θ(lg n): a recursive algorithm that halves the input in one step.
T(n) = T(n/2) + n → Θ(n): a recursive algorithm that halves the input but must examine every item in the input.
T(n) = 2T(n/2) + c → Θ(n): a recursive algorithm that splits the input into two halves and does a constant amount of other work.
19
Methods for Solving Recurrences
Iteration method Substitution method Recursion tree method Master method
20
Iteration method
Convert the recurrence into a summation and try to bound it using known series:
Iterate the recurrence until the initial condition is reached.
Use back-substitution to express the recurrence in terms of n and the initial (boundary) condition.
21
Iteration method example
T(n) = c + T(n/2)
     = c + c + T(n/4)          (since T(n/2) = c + T(n/4))
     = c + c + c + T(n/8)      (since T(n/4) = c + T(n/8))
Assume n = 2^k. After k steps:
T(n) = c + c + … + c + T(1)    (k terms of c)
     = c lg n + T(1)
     = Θ(lg n)
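The unrolling above can be checked numerically (a sketch, not from the slides; class and method names are made up): evaluating the recurrence T(n) = c + T(n/2) directly for a power of two should give exactly c·lg n + T(1).

```java
class HalvingCost {
    // Evaluates T(n) = c + T(n/2) with boundary value T(1) = t1.
    static int cost(int n, int c, int t1) {
        if (n == 1) return t1;
        return c + cost(n / 2, c, t1);
    }
    public static void main(String[] args) {
        // n = 2^10 = 1024, c = 1, T(1) = 0: expect c * lg n + T(1) = 10
        System.out.println(cost(1024, 1, 0)); // 10
    }
}
```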
22
Iteration method example
T(n) = n + 2T(n/2)               (assume n = 2^k; T(n/2) = n/2 + 2T(n/4))
     = n + 2(n/2 + 2T(n/4))
     = n + n + 4T(n/4)
     = n + n + 4(n/4 + 2T(n/8))
     = n + n + n + 8T(n/8)
     …
     = i·n + 2^i · T(n/2^i)
     = k·n + 2^k · T(1)
     = n lg n + n·T(1) = Θ(n lg n)
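This closed form can also be checked directly (an illustration, not from the slides; names are made up): evaluating T(n) = n + 2T(n/2) for a power of two should match n·lg n + n·T(1) exactly.

```java
class MergeCost {
    // Evaluates T(n) = n + 2T(n/2) with boundary value T(1) = t1.
    static long cost(long n, long t1) {
        if (n == 1) return t1;
        return n + 2 * cost(n / 2, t1);
    }
    public static void main(String[] args) {
        long n = 1L << 10;              // n = 1024, so k = lg n = 10
        // closed form: n * lg n + n * T(1) = 1024 * 10 + 1024 * 1 = 11264
        System.out.println(cost(n, 1)); // 11264
    }
}
```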
23
The Substitution Method
Guess a solution: T(n) = O(g(n)).
Induction goal: apply the definition of the asymptotic notation, T(n) ≤ d·g(n), for some d > 0 and n ≥ n0.
Induction hypothesis: T(k) ≤ d·g(k) for all k < n.
Prove the induction goal: use the induction hypothesis to find values of the constants d and n0 for which the induction goal holds.
24
Example: T(n) = c + T(n/2). Guess: T(n) = O(lg n).
Induction goal: T(n) ≤ d lg n, for some d and n ≥ n0.
Induction hypothesis: T(n/2) ≤ d lg(n/2).
Proof of induction goal:
T(n) = T(n/2) + c ≤ d lg(n/2) + c = d lg n − d + c ≤ d lg n
whenever −d + c ≤ 0, i.e., d ≥ c.
25
The Substitution Method
Two steps:
1. Guess the form of the solution, by experience, creativity, or heuristics:
If the recurrence is similar to one you have seen before: T(n) = 2T(n/2 + 17) + n is similar to T(n) = 2T(n/2) + n, so guess O(n lg n).
Prove loose upper and lower bounds on the recurrence and then reduce the range of uncertainty: for T(n) = 2T(n/2) + n, prove the lower bound T(n) = Ω(n) and the upper bound T(n) = O(n²), then guess that the tight bound is T(n) = O(n lg n).
Or use a recursion tree.
2. Use mathematical induction to find the constants and show that the solution works.
26
Solve T(n) = 2T(n/2) + n
Guess the solution: T(n) = O(n lg n), i.e., T(n) ≤ c·n lg n for some c.
Prove the solution by induction: suppose this bound holds for n/2, i.e., T(n/2) ≤ c(n/2) lg(n/2). Then
T(n) ≤ 2(c(n/2) lg(n/2)) + n
     = cn lg(n/2) + n
     = cn lg n − cn lg 2 + n
     = cn lg n − cn + n
     ≤ cn lg n   (as long as c ≥ 1)
27
Recursion tree
Idea: convert the recurrence into a tree. Each node represents the cost of a single sub-problem; sum the costs within each level to get the level cost, then sum all the level costs to get the total cost. The method is particularly suitable for divide-and-conquer recurrences. It is best used to generate a good guess, tolerating some "sloppiness"; if the recursion tree is drawn carefully and the costs computed exactly, it can serve as a direct proof.
28
Example of a tree: W(n) = 2W(n/2) + n²
29
Recursion tree for T(n) = 3T(n/4) + Θ(n²)
(Figure: the recursion tree for T(n) = 3T(n/4) + cn², built in stages (a)–(d). The root costs cn²; each level has 3 times as many nodes as the level above, each costing 1/16 as much, so level i costs (3/16)^i cn². The tree has log₄ n levels and 3^(log₄ n) = n^(log₄ 3) leaves, each costing T(1). Total: O(n²).)
30
Solution to T(n) = 3T(n/4) + Θ(n²)
The height is log₄ n, and the number of leaf nodes is 3^(log₄ n) = n^(log₄ 3). Leaf-node cost: T(1). Total cost:
T(n) = cn² + (3/16)cn² + (3/16)²cn² + … + (3/16)^(log₄ n − 1) cn² + Θ(n^(log₄ 3))
     = (1 + 3/16 + (3/16)² + … + (3/16)^(log₄ n − 1)) cn² + Θ(n^(log₄ 3))
     < (1 + 3/16 + (3/16)² + … + (3/16)^m + …) cn² + Θ(n^(log₄ 3))
     = (1 / (1 − 3/16)) cn² + Θ(n^(log₄ 3))
     = (16/13) cn² + Θ(n^(log₄ 3))
     = O(n²)
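To make the geometric-series bound concrete, here is a small numeric check (an illustration, not from the slides; the class and method names, and the choice T(1) = 1, are assumptions): for n a power of 4, the recurrence evaluated directly stays below (16/13)n² + n^(log₄ 3).

```java
class TreeBound {
    // Evaluates T(n) = 3T(n/4) + n^2 with T(1) = 1 (assumed boundary value).
    static double t(long n) {
        if (n == 1) return 1;
        return 3 * t(n / 4) + (double) n * n;
    }
    // Checks T(n) <= (16/13) n^2 + n^(log_4 3), the recursion-tree bound.
    static boolean withinBound(long n) {
        double bound = 16.0 / 13.0 * n * n + Math.pow(n, Math.log(3) / Math.log(4));
        return t(n) <= bound;
    }
    public static void main(String[] args) {
        for (long n = 1; n <= 1 << 20; n *= 4)
            System.out.println(n + ": " + withinBound(n)); // true for each n
    }
}
```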
31
Proof of the above guess: T(n) = 3T(n/4) + Θ(n²) = O(n²)
Show T(n) ≤ dn² for some d:
T(n) ≤ 3·d(n/4)² + cn² = (3/16)dn² + cn² ≤ dn², as long as d ≥ (16/13)c.
32
Master method
A "cookbook" for solving recurrences of the form T(n) = aT(n/b) + f(n), where a ≥ 1, b > 1, and f(n) > 0.
Idea: compare f(n) with n^(log_b a). Either f(n) is asymptotically smaller or larger than n^(log_b a) by a polynomial factor n^ε, or f(n) is asymptotically equal to n^(log_b a).
33
Master theorem cases, for T(n) = aT(n/b) + f(n) with a ≥ 1, b > 1, f(n) > 0:
Case 1: if f(n) = O(n^(log_b a − ε)) for some ε > 0, then T(n) = Θ(n^(log_b a)).
Case 2: if f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) lg n).
Case 3: if f(n) = Ω(n^(log_b a + ε)) for some ε > 0, and if a·f(n/b) ≤ c·f(n) for some c < 1 and all sufficiently large n (the regularity condition), then T(n) = Θ(f(n)).
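For the common special case f(n) = Θ(n^d), the three cases reduce to comparing d with log_b a (the regularity condition holds automatically for polynomial f). A sketch of that comparison (a hypothetical helper, not from the slides; note the exact double equality is a simplification that works when log_b a is representable exactly):

```java
class MasterMethod {
    // Applies the master theorem to T(n) = a*T(n/b) + Theta(n^d).
    static String solve(double a, double b, double d) {
        double e = Math.log(a) / Math.log(b);            // log_b(a)
        if (d < e)  return "Theta(n^" + e + ")";         // case 1: n^(log_b a) dominates
        if (d == e) return "Theta(n^" + e + " * lg n)";  // case 2: same growth
        return "Theta(n^" + d + ")";                     // case 3: f(n) dominates
    }
    public static void main(String[] args) {
        System.out.println(solve(2, 2, 1)); // merge sort: case 2
        System.out.println(solve(3, 4, 2)); // case 3: Theta(n^2)
        System.out.println(solve(8, 2, 1)); // case 1: Theta(n^(log_2 8)) = Theta(n^3)
    }
}
```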
34
Why n^(log_b a)?
Assume n = b^k, so k = log_b n. At the end of iteration i = k:
Case 1: if f(n) is dominated by n^(log_b a): T(n) = Θ(n^(log_b a)).
Case 2: if f(n) = Θ(n^(log_b a)): T(n) = Θ(n^(log_b a) lg n).
Case 3: if f(n) dominates n^(log_b a): T(n) = Θ(f(n)).
35
Example 1
(a) T(n) = 2T(n/2) + n: a = 2, b = 2, log₂2 = 1. Compare n^(log₂2) = n with f(n) = n: f(n) = Θ(n), so Case 2 applies and T(n) = Θ(n lg n).
(b) T(n) = 2T(n/2) + n²: a = 2, b = 2, log₂2 = 1. Compare n with f(n) = n²: f(n) = Ω(n^(1+ε)), so Case 3 applies. Verify the regularity condition: a·f(n/b) = 2n²/4 = n²/2 ≤ c·n² with c = 1/2 < 1. Hence T(n) = Θ(n²).
36
Example 2
(a) Compare n^(log_b a) = n with f(n) = n^(1/2): f(n) = O(n^(1−ε)), so Case 1 applies and T(n) = Θ(n).
(b) T(n) = 3T(n/4) + n lg n: a = 3, b = 4, log₄3 ≈ 0.793. Compare n^0.793 with f(n) = n lg n: f(n) = Ω(n^(log₄3 + ε)), so Case 3 applies. Check the regularity condition: 3(n/4) lg(n/4) ≤ (3/4) n lg n = c·f(n) with c = 3/4 < 1. Hence T(n) = Θ(n lg n).
37
Example 3
T(n) = 2T(n/2) + n lg n: a = 2, b = 2, log₂2 = 1. Compare n with f(n) = n lg n. It seems Case 3 should apply, but f(n) must be polynomially larger than n^(log_b a), i.e., larger by a factor of n^ε; here it is larger only by a factor of lg n, so the master method does not apply to this recurrence.
38
Disadvantages
The main disadvantage of programming recursively is that, while it makes it easier to write simple and elegant programs, it also makes it easier to write inefficient ones. When we use recursion to solve problems, we are often concerned primarily with correctness and not with efficiency; consequently, our simple, elegant recursive algorithms may be inherently inefficient.