Recursion Ali.


Recursion in Real Life

What is recursion? Recursion is a technique that solves a problem by solving smaller problems of the same type; equivalently, it is an already active method (or subprogram) being invoked by itself directly, or being invoked indirectly through another method (or subprogram).

Basics 1. Base cases: Always have at least one case that can be solved without using recursion. 2. Make progress: Any recursive call must make progress toward a base case.

Four Criteria for Recursion 1. A recursive function calls itself. This action is what makes the solution recursive. 2. Each recursive call solves an identical, but smaller, problem. A recursive function solves a problem by solving another problem that is identical in nature but smaller in size. 3. A test for the base case enables the recursive calls to stop. There must be a case of the problem (known as the base case or stopping case) that is handled differently from the other cases, without a recursive call; in the base case, the recursive calls stop and the problem is solved directly. 4. Eventually, one of the smaller problems must be the base case. The manner in which the size of the problem diminishes ensures that the base case is eventually reached.

Recursive function A recursive function is a function that calls itself, or a function that is part of a cycle in the sequence of function calls (f1 calls f2, ..., fn calls f1).

Example A simple example of a recursively defined function is the factorial function: n! = 1·2·3·4···(n-2)·(n-1)·n, i.e., the product of the first n positive integers (by convention, the product of nothing is 1, so that 0! = 1).

Types Direct recursion: procedure Alpha; begin Alpha end; Indirect (or mutual) recursion: procedure Alpha; begin Beta end; procedure Beta; begin Alpha end;
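The two kinds of recursion above can be sketched in Java. This is an illustrative sketch, not from the slides: countDown, isEven, and isOdd are hypothetical helper names.

```java
// Illustrative sketch of direct and mutual (indirect) recursion.
public class RecursionKinds {
    // Direct recursion: countDown calls itself.
    static void countDown(int n) {
        if (n == 0) return;          // base case
        countDown(n - 1);            // direct recursive call
    }

    // Mutual recursion: isEven and isOdd call each other.
    static boolean isEven(int n) {
        if (n == 0) return true;     // base case
        return isOdd(n - 1);         // indirect recursion via isOdd
    }

    static boolean isOdd(int n) {
        if (n == 0) return false;    // base case
        return isEven(n - 1);
    }

    public static void main(String[] args) {
        countDown(3);
        System.out.println(isEven(10)); // true
        System.out.println(isOdd(7));   // true
    }
}
```

Note that each pair of calls still makes progress toward a base case, as required by the criteria above.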

Why use Recursion? Recursion is a method-based technique for implementing iteration Recursive programs are often more succinct, elegant, and easier to understand than their iterative counterparts. Recursion is also a technique that is useful for defining relationships, and for designing algorithms that implement those relationships.

Basis of Recursion If the sub-problems are similar to the original, then we may be able to employ recursion. Two requirements: (1) the sub-problems must be simpler than the original problem; (2) after a finite number of subdivisions, a sub-problem must be encountered that can be solved outright.

Different views Recursive definition: n! = n * (n-1)! (non-math examples are common too). Recursive procedure: a procedure that calls itself. Recursive data structure: a data structure that contains a pointer to an instance of itself: public class ListNode { Object nodeItem; ListNode next, previous; … }

Recursive algorithm A recursive algorithm solves a problem by (possibly) using the result of applying itself to a simpler problem.

Properties of a recursive algorithm A recursive algorithm solves a large problem by using its solution to a simpler sub-problem (the divide-and-conquer approach). Eventually the sub-problem is simple enough that it can be solved without applying the algorithm recursively; this is called the 'base case'.

Asymptotic Efficiency of Recurrences Goal: find the asymptotic bounds of recursive equations. Methods: substitution method, domain transformation (changing variables), recursion-tree method, and the master method (master theorem). The master theorem provides bounds for T(n) = aT(n/b) + f(n), where a ≥ 1 (the number of subproblems), b > 1 (n/b is the size of each subproblem), and f(n) is a given function.

Example: the factorial function The factorial function multiplies together all numbers from 1 to n, denoted n!: n! = n*(n-1)*(n-2)*…*2*1. General case: n! = n*(n-1)! if n > 0 (uses the solution to a simpler sub-problem). Base case: n! = 1 if n == 0 (the solution is given directly).

Example of factorial(4):
public int factorial(int n) {
    if (n == 0) return 1;              // base case
    else return n * factorial(n - 1);  // general case: smaller sub-problem
}
Trace of factorial(4): factorial(4) with n=4 returns 4*factorial(3); n=3 returns 3*factorial(2); n=2 returns 2*factorial(1); n=1 returns 1*factorial(0); n=0 returns 1.

Recurrences For MERGE-SORT, with details: T(n) = Θ(1) if n = 1, and T(n) = T(⌈n/2⌉) + T(⌊n/2⌋) + Θ(n) if n > 1. Ignoring the details (floors and ceilings): T(n) = Θ(1) if n = 1, and T(n) = 2T(n/2) + Θ(n) if n > 1.

Example Recurrences T(n) = T(n-1) + n → Θ(n²): a recursive algorithm that loops through the input to eliminate one item. T(n) = T(n/2) + c → Θ(lg n): a recursive algorithm that halves the input in one step. T(n) = T(n/2) + n → Θ(n): a recursive algorithm that halves the input but must examine every item in the input. T(n) = 2T(n/2) + 1 → Θ(n): a recursive algorithm that splits the input into two halves and does a constant amount of other work.
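A concrete instance of the second recurrence above, T(n) = T(n/2) + c, is binary search: each call does constant work and recurses on one half of the range. This is a hedged sketch (the class and method names are illustrative, not from the slides):

```java
// Illustrative: binary search satisfies T(n) = T(n/2) + c, hence Theta(lg n):
// it halves the input with only constant extra work per call.
public class BinarySearch {
    static int search(int[] a, int key, int lo, int hi) {
        if (lo > hi) return -1;                 // base case: empty range
        int mid = lo + (hi - lo) / 2;           // constant work c
        if (a[mid] == key) return mid;
        if (a[mid] < key)
            return search(a, key, mid + 1, hi); // one subproblem of size n/2
        return search(a, key, lo, mid - 1);
    }

    public static void main(String[] args) {
        int[] a = {1, 3, 5, 7, 9, 11};
        System.out.println(search(a, 7, 0, a.length - 1)); // 3
        System.out.println(search(a, 4, 0, a.length - 1)); // -1
    }
}
```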

Methods for Solving Recurrences Iteration method Substitution method Recursion tree method Master method

Iteration method Convert the recurrence into a summation and try to bound it using known series Iterate the recurrence until the initial condition is reached. Use back-substitution to express the recurrence in terms of n and the initial (boundary) condition.

Iteration method example T(n) = c + T(n/2) = c + c + T(n/4) = c + c + c + T(n/8) = … (using T(n/2) = c + T(n/4) and T(n/4) = c + T(n/8)). Assume n = 2^k; after k iterations, T(n) = c + c + … + c + T(1) = c·lg n + T(1) = Θ(lg n).

Iteration method example T(n) = n + 2T(n/2) = n + 2(n/2 + 2T(n/4)) = n + n + 4T(n/4) = n + n + 4(n/4 + 2T(n/8)) = n + n + n + 8T(n/8) = … = in + 2^i·T(n/2^i). Assume n = 2^k; at i = k, T(n) = kn + 2^k·T(1) = n·lg n + n·T(1) = Θ(n·lg n).
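The closed form above can be checked numerically. This is an illustrative sketch (the boundary condition T(1) = 1 is an assumption chosen for the check; the class and method names are hypothetical):

```java
// Illustrative check of the iteration-method result: with T(1) = 1,
// T(n) = n + 2T(n/2) unrolls to exactly n*lg(n) + n for n a power of 2.
public class RecurrenceCheck {
    static long t(long n) {
        if (n == 1) return 1;        // assumed boundary condition T(1) = 1
        return n + 2 * t(n / 2);     // the recurrence itself
    }

    public static void main(String[] args) {
        for (int k = 0; k <= 10; k++) {
            long n = 1L << k;                 // n = 2^k
            long closedForm = n * k + n;      // n*lg(n) + n*T(1)
            System.out.println(n + ": " + t(n) + " vs " + closedForm);
        }
    }
}
```

For example, T(8) = 8 + 2·T(4) = 8 + 2·12 = 32 = 8·lg 8 + 8, matching the derivation.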

The Substitution Method Guess a solution T(n) = O(g(n)). Induction goal (apply the definition of the asymptotic notation): T(n) ≤ d·g(n), for some d > 0 and n ≥ n0. Induction hypothesis: T(k) ≤ d·g(k) for all k < n. Prove the induction goal: use the induction hypothesis to find some values of the constants d and n0 for which the induction goal holds.

Example T(n) = c + T(n/2). Guess: T(n) = O(lg n). Induction goal: T(n) ≤ d·lg n, for some d and n ≥ n0. Induction hypothesis: T(n/2) ≤ d·lg(n/2). Proof of induction goal: T(n) = T(n/2) + c ≤ d·lg(n/2) + c = d·lg n − d + c ≤ d·lg n if −d + c ≤ 0, i.e., d ≥ c.

The Substitution Method Two steps: 1. Guess the form of the solution, by experience and creativity, or by some heuristics: if a recurrence is similar to one you have seen before (e.g., T(n) = 2T(n/2 + 17) + n is similar to T(n) = 2T(n/2) + n, so guess O(n·lg n)); prove loose upper and lower bounds on the recurrence and then reduce the range of uncertainty (e.g., for T(n) = 2T(n/2) + n, prove the lower bound T(n) = Ω(n) and the upper bound T(n) = O(n²), then guess the tight bound T(n) = O(n·lg n)); or use a recursion tree. 2. Use mathematical induction to find the constants and show that the solution works.

Solve T(n)=2T(n/2)+n Guess the solution: T(n)=O(nlg n), i.e., T(n) cnlg n for some c. Prove the solution by induction: Suppose this bound holds for n/2, i.e., T(n/2) cn/2 lg (n/2). T(n)  2(cn/2 lg (n/2))+n  cn lg (n/2))+n = cn lg n - cn lg 2 +n = cn lg n - cn +n  cn lg n (as long as c1)

Recursion Tree Idea: convert the recurrence into a tree in which each node represents the cost of a single sub-problem. Sum the costs within each level to get the level cost, then sum all the level costs to get the total cost. Particularly suitable for divide-and-conquer recurrences. Best used to generate a good guess, tolerating "sloppiness"; if the recursion tree is drawn carefully and the cost computed exactly, it can serve as a direct proof.

Recursion tree example W(n) = 2W(n/2) + n²

Recursion Tree for T(n) = 3T(n/4) + Θ(n²) The root costs cn²; each of the 3 nodes at depth 1 costs c(n/4)², for a level cost of (3/16)cn²; the 9 nodes at depth 2 cost c(n/16)² each, for a level cost of (3/16)²cn²; and so on. The tree has height log₄n, and its 3^(log₄n) = n^(log₄3) leaves each cost T(1), contributing Θ(n^(log₄3)). Total: O(n²).

Solution to T(n)=3T(n/4)+(n2) The height is log 4n, #leaf nodes = 3log 4n= n log 43. Leaf node cost: T(1). Total cost T(n)=cn2+(3/16) cn2+(3/16)2 cn2+  +(3/16)log 4n-1 cn2+ (n log 43) =(1+3/16+(3/16)2+  +(3/16)log 4n-1) cn2 + (n log 43) <(1+3/16+(3/16)2+  +(3/16)m+ ) cn2 + (n log 43) =(1/(1-3/16)) cn2 + (n log 43) =16/13cn2 + (n log 43) =O(n2).

Proof of the above guess T(n) = 3T(n/4) + Θ(n²) = O(n²). Show T(n) ≤ dn² for some d: T(n) ≤ 3(d(n/4)²) + cn² = (3/16)dn² + cn² ≤ dn², as long as d ≥ (16/13)c.

Master method "Cookbook" for solving recurrences of the form T(n) = aT(n/b) + f(n), where a ≥ 1, b > 1, and f(n) > 0. Idea: compare f(n) with n^(log_b a). Either f(n) is asymptotically smaller or larger than n^(log_b a) by a polynomial factor n^ε, or f(n) is asymptotically equal to n^(log_b a).

Master method For T(n) = aT(n/b) + f(n), where a ≥ 1, b > 1, and f(n) > 0: Case 1: if f(n) = O(n^(log_b a − ε)) for some ε > 0, then T(n) = Θ(n^(log_b a)). Case 2: if f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a)·lg n). Case 3: if f(n) = Ω(n^(log_b a + ε)) for some ε > 0, and if a·f(n/b) ≤ c·f(n) for some c < 1 and all sufficiently large n (the regularity condition), then T(n) = Θ(f(n)).
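For the common special case f(n) = n^k (a polynomial), the three cases reduce to comparing k with log_b(a), and the regularity condition in Case 3 holds automatically since a·(n/b)^k = (a/b^k)·n^k with a/b^k < 1. A hedged sketch under that assumption (the class and method names are illustrative; the exact floating-point comparison is for demonstration only):

```java
// Illustrative sketch: for T(n) = a*T(n/b) + n^k with polynomial f(n) = n^k,
// the master theorem reduces to comparing k against log_b(a).
public class MasterTheorem {
    static String solve(double a, double b, double k) {
        double critical = Math.log(a) / Math.log(b);  // log_b(a)
        if (k < critical)
            return "Theta(n^" + critical + ")";       // case 1: f dominated
        if (k == critical)                            // exact compare: demo only
            return "Theta(n^" + k + " * lg n)";       // case 2: f comparable
        return "Theta(n^" + k + ")";                  // case 3: f dominates
    }

    public static void main(String[] args) {
        System.out.println(solve(2, 2, 1)); // merge sort: case 2
        System.out.println(solve(2, 2, 2)); // case 3
        System.out.println(solve(1, 2, 0)); // binary search: case 2
    }
}
```

Note the sketch cannot handle non-polynomial f(n) such as n·lg n, which is exactly the gap Example 3 below illustrates.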

Why n^(log_b a)? Assume n = b^k, so k = log_b n; at the end of iteration i = k the subproblems reach size 1, and there are a^k = a^(log_b n) = n^(log_b a) of them. Case 1: if f(n) is dominated by n^(log_b a), then T(n) = Θ(n^(log_b a)). Case 2: if f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a)·lg n). Case 3: if f(n) dominates n^(log_b a), then T(n) = Θ(f(n)).

Example 1 (a) T(n) = 2T(n/2) + n: a = 2, b = 2, log₂2 = 1. Compare n^(log₂2) = n with f(n) = n: f(n) = Θ(n), Case 2, so T(n) = Θ(n·lg n). (b) T(n) = 2T(n/2) + n²: a = 2, b = 2, log₂2 = 1. Compare n with f(n) = n²: f(n) = Ω(n^(1+ε)), Case 3; verify the regularity condition: a·f(n/b) ≤ c·f(n) becomes 2n²/4 ≤ cn², and c = 1/2 is a solution (c < 1), so T(n) = Θ(n²).

Example 2 (a) T(n) = 3T(n/4) + n^(1/2): a = 3, b = 4, log₄3 ≈ 0.793. Compare n^0.793 with f(n) = n^(1/2): f(n) = O(n^(log₄3 − ε)), Case 1, so T(n) = Θ(n^(log₄3)). (b) T(n) = 3T(n/4) + n·lg n: compare n^0.793 with f(n) = n·lg n: f(n) = Ω(n^(log₄3 + ε)), Case 3; check the regularity condition: 3(n/4)·lg(n/4) ≤ (3/4)n·lg n = c·f(n) with c = 3/4, so T(n) = Θ(n·lg n).

Example 3 T(n) = 2T(n/2) + n·lg n: a = 2, b = 2, log₂2 = 1. Compare n with f(n) = n·lg n: it seems like Case 3 should apply, but f(n) must be polynomially larger, i.e., larger by a factor of n^ε for some ε > 0. Here it is only larger by a factor of lg n, so the master theorem does not apply.

Disadvantages The main disadvantage of programming recursively is that, while it makes it easier to write simple and elegant programs, it also makes it easier to write inefficient ones. When we use recursion to solve problems, we are often interested exclusively in correctness, and not at all in efficiency. Consequently, our simple, elegant recursive algorithms may be inherently inefficient.