Recursion.  Understand how the Fibonacci series is generated  Recursive Algorithms  Write simple recursive algorithms  Analyze simple recursive algorithms.

Slides:



Advertisements
Similar presentations
Towers of Hanoi Move n (4) disks from pole A to pole C such that a disk is never put on a smaller disk A BC ABC.
Advertisements

Introduction to Algorithms Quicksort
Practice Quiz Question
Quick Sort, Shell Sort, Counting Sort, Radix Sort AND Bucket Sort
CS4413 Divide-and-Conquer
HST 952 Computing for Biomedical Scientists Lecture 9.
Stephen P. Carl - CS 2421 Recursive Sorting Algorithms Reading: Chapter 5.
ISOM MIS 215 Module 7 – Sorting. ISOM Where are we? 2 Intro to Java, Course Java lang. basics Arrays Introduction NewbieProgrammersDevelopersProfessionalsDesigners.
Recursion CSC 220: Data Structure Winter Introduction A programming technique in which a function calls itself. One of the most effective techniques.
Data Structures Data Structures Topic #13. Today’s Agenda Sorting Algorithms: Recursive –mergesort –quicksort As we learn about each sorting algorithm,
Ver. 1.0 Session 5 Data Structures and Algorithms Objectives In this session, you will learn to: Sort data by using quick sort Sort data by using merge.
Circular Arrays Neat trick: use a circular array to insert and remove items from a queue in constant time The idea of a circular array is that the end.
Chapter 2 Recursion: The Mirrors. © 2005 Pearson Addison-Wesley. All rights reserved2-2 Recursive Solutions Recursion is an extremely powerful problem-solving.
© 2006 Pearson Addison-Wesley. All rights reserved6-1 More on Recursion.
CMPT 225 Recursion-part 3. Recursive Searching Linear Search Binary Search Find an element in an array, return its position (index) if found, or -1 if.
Recursion. Recursive Solutions Recursion breaks a problem into smaller identical problems – mirror images so to speak. By continuing to do this, eventually.
© 2006 Pearson Addison-Wesley. All rights reserved3-1 Chapter 3 Recursion: The Mirrors.
Algorithm Analysis: Big O Notation
COMPSCI 105 S Principles of Computer Science Recursion 3.
Chapter 2 Recursion: The Mirrors. © 2005 Pearson Addison-Wesley. All rights reserved2-2 Recursive Solutions Recursion is an extremely powerful problem-
© 2006 Pearson Addison-Wesley. All rights reserved10 A-1 Chapter 10 Algorithm Efficiency and Sorting.
Recursion.  Identify recursive algorithms  Write simple recursive algorithms  Understand recursive function calling  With reference to the call stack.
Sorting (Part II: Divide and Conquer) CSE 373 Data Structures Lecture 14.
CHAPTER 7: SORTING & SEARCHING Introduction to Computer Science Using Ruby (c) Ophir Frieder at al 2012.
(c) , University of Washington
Chapter 14: Recursion Starting Out with C++ Early Objects
Recursion, Complexity, and Searching and Sorting By Andrew Zeng.
Chapter 2 Recursion: The Mirrors CS Data Structures Mehmet H Gunes Modified from authors’ slides.
Chapter 13 Recursion. Topics Simple Recursion Recursion with a Return Value Recursion with Two Base Cases Binary Search Revisited Animation Using Recursion.
Chapter 3 Recursion: The Mirrors. © 2004 Pearson Addison-Wesley. All rights reserved 3-2 Recursive Solutions Recursion –An extremely powerful problem-solving.
1 Time Analysis Analyzing an algorithm = estimating the resources it requires. Time How long will it take to execute? Impossible to find exact value Depends.
Data Structures and Abstractions with Java, 4e Frank Carrano
Recursion, Complexity, and Sorting By Andrew Zeng.
Chapter 12 Recursion, Complexity, and Searching and Sorting
Computer Science 101 Fast Searching and Sorting. Improving Efficiency We got a better best case by tweaking the selection sort and the bubble sort We.
10/14/ Algorithms1 Algorithms - Ch2 - Sorting.
The Selection Problem. 2 Median and Order Statistics In this section, we will study algorithms for finding the i th smallest element in a set of n elements.
CS 61B Data Structures and Programming Methodology July 28, 2008 David Sun.
CSC 211 Data Structures Lecture 13
 O(1) – constant time  The time is independent of n  O(log n) – logarithmic time  Usually the log is to the base 2  O(n) – linear time  O(n*logn)
CSC 221: Recursion. Recursion: Definition Function that solves a problem by relying on itself to compute the correct solution for a smaller version of.
Data Structures R e c u r s i o n. Recursive Thinking Recursion is a problem-solving approach that can be used to generate simple solutions to certain.
Recursion Review: A recursive function calls itself, but with a smaller problem – at least one of the parameters must decrease. The function uses the results.
1 Sorting Algorithms Sections 7.1 to Comparison-Based Sorting Input – 2,3,1,15,11,23,1 Output – 1,1,2,3,11,15,23 Class ‘Animals’ – Sort Objects.
Priority Queues and Heaps. October 2004John Edgar2  A queue should implement at least the first two of these operations:  insert – insert item at the.
Java Methods Big-O Analysis of Algorithms Object-Oriented Programming
Searching & Sorting Programming 2. Searching Searching is the process of determining if a target item is present in a list of items, and locating it A.
CMPT 225 Recursion. Objectives Understand how the Fibonacci series is generated Recursive Algorithms  Write simple recursive algorithms  Analyze simple.
Review 1 Selection Sort Selection Sort Algorithm Time Complexity Best case Average case Worst case Examples.
Computer Science 101 Fast Algorithms. What Is Really Fast? n O(log 2 n) O(n) O(n 2 )O(2 n )
1 Searching and Sorting Searching algorithms with simple arrays Sorting algorithms with simple arrays –Selection Sort –Insertion Sort –Bubble Sort –Quick.
Chapter 9 Sorting. The efficiency of data handling can often be increased if the data are sorted according to some criteria of order. The first step is.
1. Searching The basic characteristics of any searching algorithm is that searching should be efficient, it should have less number of computations involved.
1 Recursive algorithms Recursive solution: solve a smaller version of the problem and combine the smaller solutions. Example: to find the largest element.
PREVIOUS SORTING ALGORITHMS  BUBBLE SORT –Time Complexity: O(n 2 ) For each item, make (n –1) comparisons Gives: Comparisons = (n –1) + (n – 2)
Chapter 9 Recursion © 2006 Pearson Education Inc., Upper Saddle River, NJ. All rights reserved.
Sorting & Searching Geletaw S (MSC, MCITP). Objectives At the end of this session the students should be able to: – Design and implement the following.
Data Structures and Algorithms Instructor: Tesfaye Guta [M.Sc.] Haramaya University.
Maitrayee Mukerji. Factorial For any positive integer n, its factorial is n! is: n! = 1 * 2 * 3 * 4* ….* (n-1) * n 0! = 1 1 ! = 1 2! = 1 * 2 = 2 5! =
CS 116 Object Oriented Programming II Lecture 13 Acknowledgement: Contains materials provided by George Koutsogiannakis and Matt Bauer.
Priority Queues and Heaps. John Edgar  Define the ADT priority queue  Define the partially ordered property  Define a heap  Implement a heap using.
Towers of Hanoi Move n (4) disks from pole A to pole C
Hassan Khosravi / Geoffrey Tien
Review: recursion Tree traversals
Teach A level Computing: Algorithms and Data Structures
Recursion "To understand recursion, one must first understand recursion." -Stephen Hawking.
Algorithm design and Analysis
Quicksort analysis Bubble sort
Recursive Algorithms 1 Building a Ruler: drawRuler()
Presentation transcript:

Recursion

- Understand how the Fibonacci series is generated
- Recursive algorithms
  - Write simple recursive algorithms
  - Analyze simple recursive algorithms
- Understand the drawbacks of recursion
- Name other recursive algorithms and data structures

- What happens if you put a pair of rabbits in a field?
  - More rabbits!
- Assume that rabbits take one month to reach maturity, and that
  - Each pair of rabbits produces another pair of rabbits one month after mating.

- How many pairs of rabbits are there after 5 months?
  - Month 1: start – 1
  - Month 2: the rabbits are now mature and can mate – 1
  - Month 3: the first pair give birth to two babies – 2
  - Month 4: the first pair give birth to 2 babies, the pair born in month 3 are now mature – 3
  - Month 5: the 3 pairs from month 4, and 2 new pairs – 5

AAfter 5 months there are 5 pairs of rabbits ii.e. the number of pairs at 4 months (3) plus the number of pairs at 3 months (2) WWhy? WWhile there are 3 pairs of bunnies in month 4 only 2 of them are able to mate tthe ones alive in month 3 TThis series of numbers is called the Fibonacci series 5 month: pairs:

TThe n th number in the Fibonacci series, fib(n), is: 00 if n = 0, and 1 if n = 1 ffib(n – 1) + fib(n – 2) for any n > 1 ee.g. what is fib(23) EEasy if we only knew fib(22) and fib(21) TThe answer is fib(22) + fib(21) WWhat happens if we actually write a function to calculate Fibonacci numbers like this? John Edgar6

- Let's write a function just like the formula
  - fib(n) = 0 if n = 0, 1 if n = 1,
  - otherwise fib(n) = fib(n – 1) + fib(n – 2)

C++

int fib(int n){
    if(n == 0 || n == 1){
        return n;
    } else {
        return fib(n-1) + fib(n-2);
    }
}

The function calls itself

- The Fibonacci function is recursive
  - A recursive function calls itself
  - Each call to a recursive method results in a separate call to the method, with its own input
- Recursive functions are just like other functions
  - The invocation is pushed onto the call stack
  - And removed from the call stack when the end of the method or a return statement is reached
  - Execution returns to the previous method call

C++

int fib(int n){
    if(n == 0 || n == 1)
        return n;
    else
        return fib(n-1) + fib(n-2);
}

The tree of recursive calls made by fib(5):

fib(5)
├── fib(4)
│   ├── fib(3)
│   │   ├── fib(2)
│   │   │   ├── fib(1)
│   │   │   └── fib(0)
│   │   └── fib(1)
│   └── fib(2)
│       ├── fib(1)
│       └── fib(0)
└── fib(3)
    ├── fib(2)
    │   ├── fib(1)
    │   └── fib(0)
    └── fib(1)

- When a function is called it is pushed onto the call stack
  - This applies to each invocation of that function
- When a recursive call is made, execution switches to that method call
  - The call stack records the line number of the previous method where the call was made from
- Once a method call's execution finishes, execution returns to the previous invocation


- Recursive functions do not use loops to repeat instructions
  - But use recursive calls, in if statements
- Recursive functions consist of two or more cases; there must be at least one
  - Base case, and one
  - Recursive case

- The base case is a smaller problem with a simpler solution
  - This problem's solution must not be recursive
    - Otherwise the function may never terminate
- There can be more than one base case

- The recursive case is the same problem with smaller input
  - The recursive case must include a recursive function call
- There can be more than one recursive case

- Define the problem in terms of a smaller problem of the same type
  - The recursive part
  - e.g. return fib(n-1) + fib(n-2);
- And the base case where the solution can be easily calculated
  - This solution should not be recursive
  - e.g. if (n == 0 || n == 1) return n;

- How can the problem be defined in terms of smaller problems of the same type?
- By how much does each recursive call reduce the problem size?
  - By 1, by half, …?
- What are the base cases that can be solved without recursion?
- Will a base case be reached as the problem size is reduced?
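
To illustrate these questions, here is a minimal sketch (my example, not from the original slides) of a recursive sum over an array: the problem shrinks by one element per call, and the base case is an empty range.

// Hypothetical illustration: recursive sum of the first n elements of arr.
// Smaller problem: the sum of the first n-1 elements; base case: n == 0.
int recSum(int *arr, int n){
    if (n == 0){
        return 0;                               // base case: an empty array sums to 0
    } else {
        return recSum(arr, n - 1) + arr[n - 1]; // recursive case: shrink the problem by 1
    }
}

Each call reduces n by 1, so the base case n == 0 is always reached for any n >= 0.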


- Linear Search
- Binary Search
  - Assume sorted array

C++

int linSearch(int *arr, int n, int x){
    for (int i = 0; i < n; i++){
        if (x == arr[i]){
            return i;
        }
    } //for
    return -1; //target not found
}

The algorithm searches the array one element at a time using a for loop

- Base cases
  - Target is found at first position in array
  - The end of the array is reached
- Recursive case
  - Target not found at first position
    - Search again, discarding the first element of the array

C++

int recLinSearch(int *arr, int n, int i, int x); // declared first so linSearch can call it

int linSearch(int *arr, int n, int x){
    return recLinSearch(arr, n, 0, x);
}

int recLinSearch(int *arr, int n, int i, int x){
    if (i >= n){
        return -1;
    } else if (x == arr[i]){
        return i;
    } else {
        return recLinSearch(arr, n, i + 1, x);
    }
}

- Of course, if it's a sorted array we wouldn't do linear search

- Each sub-problem searches a subarray
  - Differs only in the upper and lower array indices that define the subarray
- Each sub-problem is smaller than the last one
  - In the case of binary search, half the size
- There are two base cases
  - When the target item is found, and
  - When the problem space consists of one item
    - Make sure that this last item is checked

C++

// initial call: binSearch(arr, 0, n - 1, x)
int binSearch(int *arr, int lower, int upper, int x){
    int mid = (lower + upper) / 2;
    if (lower > upper){
        return -1;                 // base case: empty range
    } else if (arr[mid] == x){
        return mid;                // second base case: target found
    } else if (arr[mid] < x){
        return binSearch(arr, mid + 1, upper, x);
    } else {                       // arr[mid] > target
        return binSearch(arr, lower, mid - 1, x);
    }
}


- Merge Sort
- Quicksort


- What's the easiest list to sort?
  - A list of 1 number

- Let's say I have 2 sorted lists of numbers
  - How can I merge them into 1 sorted list?

(Diagram: List 1 and List 2 are merged, element by element, into a single sorted output list.)

- If I have a list of n numbers, how should I sort them?
- I know two things
  - How to sort a list of 1 number
  - How to merge 2 sorted lists of numbers into 1 sorted list
- Smells like recursion

mergeSort(array)
    if (array is length 1)        // base case, one element
        return the array
    else
        arr1 = mergeSort(first half of array)
        arr2 = mergeSort(second half of array)
        return merge(arr1, arr2)
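
The slides give only pseudocode; as a rough C++ sketch of the same idea (my code, not part of the original deck), using std::vector and a helper merge that combines two sorted vectors:

#include <cstddef>
#include <vector>

// Merge two sorted vectors into one sorted vector.
std::vector<int> merge(const std::vector<int>& a, const std::vector<int>& b){
    std::vector<int> out;
    std::size_t i = 0, j = 0;
    while (i < a.size() && j < b.size()){
        if (a[i] <= b[j]) out.push_back(a[i++]);   // take the smaller front element
        else              out.push_back(b[j++]);
    }
    while (i < a.size()) out.push_back(a[i++]);    // copy any leftovers from list 1
    while (j < b.size()) out.push_back(b[j++]);    // copy any leftovers from list 2
    return out;
}

std::vector<int> mergeSort(const std::vector<int>& arr){
    if (arr.size() <= 1) return arr;               // base case: zero or one element
    std::size_t mid = arr.size() / 2;
    std::vector<int> left(arr.begin(), arr.begin() + mid);
    std::vector<int> right(arr.begin() + mid, arr.end());
    return merge(mergeSort(left), mergeSort(right));  // recursive case: sort halves, then merge
}

This version copies subarrays for clarity; an in-place variant that merges within one array and a scratch buffer is more typical in practice.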

- What is the time complexity of a merge?

(Diagram: List 1 and List 2 being merged into an output list.)

- How many recursive steps are there?
- How large are the merges at each recursive step?
  - Merge takes O(n) time for n elements

(Diagram: the array is split into halves, quarters, and eighths, then merged back into sorted quarters, sorted halves, and finally the sorted entire array.)

- How many recursive steps are there?
  - O(log n) steps: split array in half each time
- How large are the merges at each recursive step?
  - In total, merge n elements each step
- Time complexity is O(n log n)
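
As an aside (not from the slides), this count can be written as the standard divide-and-conquer recurrence, which solves to O(n log n):

T(n) = 2\,T(n/2) + c\,n, \qquad T(1) = c
\;\Rightarrow\; T(n) = c\,n\log_2 n + c\,n = O(n \log n)

The same recurrence describes quicksort's best case, discussed later.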

- Mergesort
  - Best case: O(n log₂ n)
  - Average case: O(n log₂ n)
  - Worst case: O(n log₂ n)

- Quicksort is a more efficient sorting algorithm than either selection or insertion sort
  - It sorts an array by repeatedly partitioning it
- We will go over the basic idea of quicksort and an example of it
  - See text / on-line resources for details

- Partitioning is the process of dividing an array into sections (partitions), based on some criteria
  - "Big" and "small" values
  - Negative and positive numbers
  - Names that begin with a–m, names that begin with n–z
  - Darker and lighter pixels
- Quicksort uses repeated partitioning to sort an array

Worked example: partition this array into small and big values using a partitioning algorithm.

- We will partition the array around the last value (18); we'll call this value the pivot
  - smalls < 18, bigs > 18
- Use two indices, one at each end of the array; call them low and high
- arr[low] (31) is greater than the pivot and should be on the right, so we need to swap it with something
  - arr[high] (11) is less than the pivot, so swap it with arr[low]
- Repeat this process until high and low are the same
- We'd like the pivot value to be in the centre of the array, so we swap it with the first item greater than it
- The result: the smalls, then the pivot, then the bigs

- Use the same algorithm to partition this array into small and big values

(Diagram: the subarray partitioned into smalls, pivot, and bigs.)

- Or this one:

(Diagram: another subarray partitioned into smalls, pivot, and bigs.)

- The quicksort algorithm works by repeatedly partitioning an array
- Each time a subarray is partitioned there is
  - A sequence of small values,
  - A sequence of big values, and
  - A pivot value which is in the correct position
- Partition the small values, and the big values
  - Repeat the process until each subarray being partitioned consists of just one element
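
The slides describe the idea but defer the code to the text; the following is only a rough C++ sketch of that scheme (last element as pivot, two indices working inward), not the deck's official implementation:

#include <algorithm>   // std::swap

// Partition arr[lower..upper] around the last element (the pivot).
// Returns the final index of the pivot.
int partition(int *arr, int lower, int upper){
    int pivot = arr[upper];
    int low = lower;
    int high = upper - 1;
    while (low <= high){
        while (low <= high && arr[low] <= pivot) low++;    // find a big value on the left
        while (low <= high && arr[high] >= pivot) high--;  // find a small value on the right
        if (low < high) std::swap(arr[low], arr[high]);    // put them on the correct sides
    }
    std::swap(arr[low], arr[upper]);   // move the pivot between the smalls and the bigs
    return low;
}

void quicksort(int *arr, int lower, int upper){
    if (lower < upper){                        // subarrays of size 0 or 1 are already sorted
        int p = partition(arr, lower, upper);
        quicksort(arr, lower, p - 1);          // sort the smalls
        quicksort(arr, p + 1, upper);          // sort the bigs
    }
}

Usage: quicksort(arr, 0, n - 1);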

- How long does quicksort take to run?
  - Let's consider the best and the worst case
  - These differ because the partitioning algorithm may not always do a good job
- Let's look at the best case first
  - Each time a subarray is partitioned the pivot is the exact midpoint of the slice (or as close as it can get)
    - So it is divided in half
  - What is the running time?

(Diagram: best case — the first partition splits the array into equal-sized smalls and bigs around the pivot; the second partition splits each of those halves around its own pivot; after the third partition every remaining subarray has size 1.)

- Each subarray is divided exactly in half in each set of partitions
  - Each time a series of subarrays are partitioned, around n comparisons are made
  - The process ends once all the subarrays left to be partitioned are of size 1
- How many times does n have to be divided in half before the result is 1?
  - log₂(n) times
- Quicksort performs around n * log₂(n) operations in the best case

(Diagram: a worst-case example in which each partition leaves almost everything in a single subarray; it takes a first, second, third, fourth, fifth, sixth, and seventh(!) partition to finish.)

- Every partition step results in just one partition on one side of the pivot
  - The array has to be partitioned n times, not log₂(n) times
- So in the worst case quicksort performs around n² operations
- The worst case usually occurs when the array is nearly sorted (in either direction)
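
Again as an aside, the corresponding worst-case recurrence, where each partition peels off only the pivot, gives the quadratic bound:

T(n) = T(n-1) + c\,n, \qquad T(1) = c
\;\Rightarrow\; T(n) = c\sum_{i=1}^{n} i = \frac{c\,n(n+1)}{2} = O(n^2)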

- With a large array we would have to be very, very unlucky to get the worst case
  - Unless there was some reason for the array to already be partially sorted
    - In which case, first randomize the position of the array elements!
- The average case is much more like the best case than the worst case


- Recursive algorithms have more overhead than similar iterative algorithms
  - Because of the repeated method calls
  - This may cause a stack overflow when the call stack gets full
- It is often useful to derive a solution using recursion and implement it iteratively
  - Sometimes this can be quite challenging!

- Some recursive algorithms are inherently inefficient
  - e.g. the recursive Fibonacci algorithm, which repeats the same calculation again and again
    - Look at the number of times fib(2) is called
- Such algorithms should be implemented iteratively
  - Even if the solution was determined using recursion
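
For instance, a minimal iterative sketch of Fibonacci (my example, not the deck's) avoids the repeated work by computing each value exactly once:

// Iterative Fibonacci: O(n) time, no repeated subproblems.
int fibIter(int n){
    if (n == 0 || n == 1) return n;
    int prev = 0, curr = 1;            // fib(0), fib(1)
    for (int i = 2; i <= n; i++){
        int next = prev + curr;        // fib(i) = fib(i-1) + fib(i-2)
        prev = curr;
        curr = next;
    }
    return curr;
}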

- It is useful to trace through the sequence of recursive calls
  - This can be done using a recursion tree
- Recursion trees can be used to determine the running time of algorithms
  - Annotate the tree to indicate how much work is performed at each level of the tree
  - And then determine how many levels of the tree there are


- Recursion is similar to induction
- Recursion solves a problem by
  - Specifying a solution for the base case and
  - Using a recursive case to derive solutions of any size from solutions to smaller problems
- Induction proves a property by
  - Proving it is true for a base case and
  - Proving that it is true for some number, n, if it is true for all numbers less than n

- Prove, using induction, that the algorithm returns the values
  - fact(0) = 0! = 1
  - fact(n) = n! = n * (n – 1) * … * 1 if n > 0

C++

int fact(int n){
    if (n == 0){
        return 1;
    } else {
        return n * fact(n - 1);
    }
}

- Basis: Show that the property is true for n = 0, i.e. that fact(0) returns 1
  - This is true by definition, as fact(0) is the base case of the algorithm and returns 1
- Now establish that the property being true for an arbitrary k implies that it is also true for k + 1
- Inductive hypothesis: Assume that the property is true for n = k, that is, assume that
  - fact(k) = k * (k – 1) * (k – 2) * … * 2 * 1

IInductive conclusion: Show that the property is true for n = k + 1, i.e., that fact(k + 1) returns ((k + 1) * k * (k – 1) * (k – 2) * … * 2 * 1 BBy definition of the function: fact(k + 1) returns ((k + 1) * fact(k) – the recursive case AAnd by the inductive hypothesis: fact(k) returns kk * (k – 1) * (k – 2) * … * 2 * 1 TTherefore fact(k + 1) must return ((k + 1) * k * (k – 1) * (k – 2) * … * 2 * 1 WWhich completes the inductive proof John Edgar75

- Recursive sum
- Towers of Hanoi – see text
- Eight Queens problem – see text
- Sorting
  - Mergesort
  - Quicksort
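
For reference, a minimal sketch of the Towers of Hanoi recursion mentioned above (details are in the text; the pole names here are just labels of my choosing):

#include <iostream>

// Move n disks from pole 'from' to pole 'to', using 'spare' as the spare pole.
void hanoi(int n, char from, char to, char spare){
    if (n == 0) return;                         // base case: nothing to move
    hanoi(n - 1, from, spare, to);              // move the top n-1 disks out of the way
    std::cout << "Move disk " << n << " from " << from << " to " << to << "\n";
    hanoi(n - 1, spare, to, from);              // move them back on top of disk n
}

// e.g. hanoi(4, 'A', 'C', 'B');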

- Linked lists are recursive data structures
  - They are defined in terms of themselves
- There are recursive solutions to many list methods
  - List traversal can be performed recursively
  - Recursion allows elegant solutions of problems that are hard to implement iteratively
    - Such as printing a list backwards
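
As an illustration of that last point, a sketch of printing a singly linked list backwards, assuming a simple Node struct (not the text's actual list class):

#include <iostream>

struct Node {
    int data;
    Node* next;
};

// Print the list in reverse: recurse to the end first, then print on the way back.
void printBackwards(const Node* head){
    if (head == nullptr) return;       // base case: empty list
    printBackwards(head->next);        // print the rest of the list first
    std::cout << head->data << " ";    // then this node
}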


- Recursion as a problem-solving tool
  - Identify the base case, where the solution is simple
  - Formulate other cases in terms of smaller case(s)
- Recursion is not always a good implementation strategy
  - It may solve the same problem many times
  - Function call overhead
- Recursion and induction
  - Induction proves properties in a form similar to how recursion solves problems

- Carrano Ch. 2, 5