Algorithms: Recurrences

Merge Sort: Example Show MergeSort() running on the array A = {10, 5, 7, 6, 1, 4, 8, 3, 2, 9};
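As a concrete reference, here is a minimal Python sketch of merge sort (illustrative only, not the course's MERGE-SORT pseudocode); running it on this array reproduces the example:

def merge_sort(a):
    """Sort a list by recursively splitting it in half and merging the halves."""
    if len(a) <= 1:                      # base case: a list of 0 or 1 elements is sorted
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])           # conquer the left half
    right = merge_sort(a[mid:])          # conquer the right half
    return merge(left, right)            # combine in linear time

def merge(left, right):
    """Merge two sorted lists into one sorted list."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])
    out.extend(right[j:])
    return out

print(merge_sort([10, 5, 7, 6, 1, 4, 8, 3, 2, 9]))
# [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]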

MERGE-SORT Running Time
Divide: compute q as the average of p and r: D(n) = Θ(1)
Conquer: recursively solve 2 subproblems, each of size n/2 ⇒ 2T(n/2)
Combine: MERGE on an n-element subarray takes Θ(n) time ⇒ C(n) = Θ(n)
T(n) = Θ(1) if n = 1
T(n) = 2T(n/2) + Θ(n) if n > 1

Quicksort Algorithm
Given an array of n elements (e.g., integers):
If the array contains only one element, return.
Else pick one element to use as the pivot.
Partition the elements into two sub-arrays: elements less than or equal to the pivot, and elements greater than the pivot.
Quicksort the two sub-arrays.
Return the results.
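A compact Python sketch of this scheme (illustrative only; it uses the first element as the pivot, as in the example below, and builds new lists rather than partitioning in place):

def quicksort(a):
    """Sort a list using the quicksort scheme described above."""
    if len(a) <= 1:                              # one (or zero) elements: already sorted
        return a
    pivot = a[0]                                 # pick the first element as pivot
    smaller = [x for x in a[1:] if x <= pivot]   # elements <= pivot
    larger = [x for x in a[1:] if x > pivot]     # elements > pivot
    return quicksort(smaller) + [pivot] + quicksort(larger)

print(quicksort([40, 20, 10, 80, 60, 50, 7, 30, 100]))
# [7, 10, 20, 30, 40, 50, 60, 80, 100]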

Example We are given an array of n integers to sort: 40 20 10 80 60 50 7 30 100

Pick Pivot Element There are a number of ways to pick the pivot element. In this example, we will use the first element in the array: 40 20 10 80 60 50 7 30 100

Partition Result: 7 20 10 30 40 | 50 60 80 100 (indices [0]..[8]; the elements before the bar are <= data[pivot], the elements after it are > data[pivot])

Recursion: Quicksort the Sub-arrays: recursively quicksort {7, 20, 10, 30} (the elements <= data[pivot]) and {50, 60, 80, 100} (the elements > data[pivot])

Quicksort
For example, given the list 80 38 95 84 66 10 79 44 26 87 96 12 43 81 3, we can select the middle entry, 44, and sort the remaining entries into two groups, those less than 44 and those greater than 44:
38 10 26 12 43 3 44 80 95 84 66 79 87 96 81
Notice that 44 is now in the location it would occupy if the list were sorted. Proceed by applying the algorithm to the first six and the last eight entries.

Quicksort example: we call quicksort(array, 0, 25) on a 26-element array beginning 77 49 35 61 48 73 95 89 37 57 99 32 94 28 55 51 88 97 62 ...

Recursive Algorithms ?

Fibonacci numbers
Leonardo Pisano. Born: 1170 in (probably) Pisa (now in Italy). Died: 1250 in (possibly) Pisa (now in Italy).
1, 1, 2, 3, 5, 8, 13, 21, 34, 55, ...
“A certain man put a pair of rabbits in a place surrounded on all sides by a wall. How many pairs of rabbits can be produced from that pair in a year if it is supposed that every month each pair begets a new pair which from the second month on becomes productive?”
F(n) = F(n-1) + F(n-2)
F(1) = 0, F(2) = 1
F(3) = 1, F(4) = 2, F(5) = 3, and so on
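The recurrence translates directly into code; a naive recursive sketch in Python, using the slide's boundary conditions F(1) = 0 and F(2) = 1:

def fib(n):
    """Naive recursive Fibonacci, mirroring F(n) = F(n-1) + F(n-2)."""
    if n == 1:
        return 0                      # boundary condition: F(1) = 0
    if n == 2:
        return 1                      # boundary condition: F(2) = 1
    return fib(n - 1) + fib(n - 2)    # recurrence on smaller inputs

print([fib(n) for n in range(1, 8)])  # [0, 1, 1, 2, 3, 5, 8]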

Recurrences
Def.: Recurrence = an equation or inequality that describes a function in terms of its value on smaller inputs, and one or more base cases.
E.g.: Fibonacci numbers:
Recurrence: F(n) = F(n-1) + F(n-2)
Boundary conditions: F(1) = 0, F(2) = 1
Compute: F(3) = 1, F(4) = 2, F(5) = 3, and so on

Recursive Algorithms
A recursive algorithm is an algorithm which solves a problem by repeatedly reducing it to a smaller version of itself, until it reaches a form for which a solution exists.
Compared with iterative algorithms: more memory, more computation, but a simpler, more natural way of thinking about the problem.

Recurrences
The expression
T(n) = Θ(1) if n = 1
T(n) = 2T(n/2) + Θ(n) if n > 1
is a recurrence.
Recurrence: an equation that describes a function in terms of its value on smaller functions

Recurrence Examples

Recursive Example
N! (N factorial) is defined for non-negative values as:
N! = 1 if N = 0
N! = N * (N-1) * (N-2) * (N-3) * … * 1 if N > 0
For example: 5! = 5*4*3*2*1 = 120, 3! = 3*2*1 = 6
The definition of N! can be restated as: N! = N * (N-1)!, where 0! = 1.
This is the same N!, but it is defined by reducing the problem to a smaller version of the original (hence, recursively).
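The recursive restatement maps one-to-one onto code; a minimal Python sketch:

def factorial(n):
    """N! defined recursively: N! = N * (N-1)!, with 0! = 1."""
    if n == 0:
        return 1                      # base case: 0! = 1
    return n * factorial(n - 1)       # reduce to a smaller instance of the problem

print(factorial(5))  # 120
print(factorial(3))  # 6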

Recurrent Algorithms
BINARY-SEARCH: for an ordered array A, finds whether x is in the array A[lo…hi]
Alg.: BINARY-SEARCH(A, lo, hi, x)
  if (lo > hi)
    return FALSE
  mid ← (lo+hi)/2
  if x = A[mid]
    return TRUE
  if (x < A[mid])
    BINARY-SEARCH(A, lo, mid-1, x)
  if (x > A[mid])
    BINARY-SEARCH(A, mid+1, hi, x)
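A runnable Python version of the same algorithm (a sketch; it uses 0-based indexing, unlike the 1-based arrays in the examples below):

def binary_search(a, lo, hi, x):
    """Return True if x occurs in the sorted slice a[lo..hi] (inclusive)."""
    if lo > hi:
        return False                                  # empty range: not found
    mid = (lo + hi) // 2
    if a[mid] == x:
        return True
    if x < a[mid]:
        return binary_search(a, lo, mid - 1, x)       # search left half
    return binary_search(a, mid + 1, hi, x)           # search right half

a = [1, 2, 3, 4, 5, 7, 9, 11]
print(binary_search(a, 0, len(a) - 1, 7))   # True
print(binary_search(a, 0, len(a) - 1, 6))   # False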

Example
A[8] = {1, 2, 3, 4, 5, 7, 9, 11}, lo = 1, hi = 8, x = 7
mid = 4, A[4] = 4 < 7 ⇒ lo = 5, hi = 8
mid = 6, A[6] = 7 = x ⇒ Found!

Another Example
A[8] = {1, 2, 3, 4, 5, 7, 9, 11}, lo = 1, hi = 8, x = 6
mid = 4, A[4] = 4 < 6 ⇒ lo = 5, hi = 8
mid = 6, A[6] = 7 > 6 ⇒ lo = 5, hi = 5
mid = 5, A[5] = 5 < 6 ⇒ lo = 6, hi = 5
lo > hi ⇒ NOT FOUND!

Analysis of BINARY-SEARCH
Alg.: BINARY-SEARCH(A, lo, hi, x)
  if (lo > hi)                        constant time: c1
    return FALSE
  mid ← (lo+hi)/2                     constant time: c2
  if x = A[mid]                       constant time: c3
    return TRUE
  if (x < A[mid])
    BINARY-SEARCH(A, lo, mid-1, x)    same problem of size n/2
  if (x > A[mid])
    BINARY-SEARCH(A, mid+1, hi, x)    same problem of size n/2
T(n) – running time for an array of size n
T(n) = c + T(n/2)
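Unrolling this recurrence shows where the logarithm comes from (a standard derivation; c is the constant cost per call):

$$T(n) = c + T\!\left(\tfrac{n}{2}\right) = 2c + T\!\left(\tfrac{n}{4}\right) = \cdots = kc + T\!\left(\tfrac{n}{2^{k}}\right), \qquad \tfrac{n}{2^{k}} = 1 \Rightarrow k = \lg n \Rightarrow T(n) = c\lg n + T(1) = \Theta(\lg n).$$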

Recurrences
An equation or inequality that describes a function in terms of its value on smaller inputs, e.g. T(n) = T(n-1) + n.
Recurrences arise when an algorithm contains recursive calls to itself.
What is the actual running time of the algorithm? We need to solve the recurrence:
find an explicit formula for the expression, or bound the recurrence by an expression that involves n.

Example Recurrences
T(n) = T(n-1) + n ⇒ Θ(n²): recursive algorithm that loops through the input to eliminate one item
T(n) = T(n/2) + c ⇒ Θ(lgn): recursive algorithm that halves the input in one step
T(n) = T(n/2) + n ⇒ Θ(n): recursive algorithm that halves the input but must examine every item in the input
T(n) = 2T(n/2) + 1 ⇒ Θ(n): recursive algorithm that splits the input into 2 halves and does a constant amount of other work

Methods for Solving Recurrences
Master method
Iteration method
Substitution method
Recursion tree method

Master’s method
“Cookbook” for solving recurrences of the form T(n) = aT(n/b) + f(n), where a ≥ 1, b > 1, and f(n) > 0.
Idea: compare f(n) with n^(log_b a):
either f(n) is asymptotically smaller or larger than n^(log_b a) by a polynomial factor n^ε,
or f(n) is asymptotically equal to n^(log_b a).

Master’s method
“Cookbook” for solving recurrences of the form T(n) = aT(n/b) + f(n), where a ≥ 1, b > 1, and f(n) > 0.
Case 1: if f(n) = O(n^(log_b a − ε)) for some ε > 0, then: T(n) = Θ(n^(log_b a))
Case 2: if f(n) = Θ(n^(log_b a)), then: T(n) = Θ(n^(log_b a) lgn)
Case 3: if f(n) = Ω(n^(log_b a + ε)) for some ε > 0, and if af(n/b) ≤ cf(n) for some c < 1 and all sufficiently large n (the regularity condition), then: T(n) = Θ(f(n))
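For the common special case f(n) = Θ(n^d), the case test reduces to comparing d with log_b a. A small helper sketch (hypothetical code, not from the slides; it covers only polynomial f, for which the regularity condition holds automatically):

import math

def master_theorem_poly(a, b, d):
    """Classify T(n) = a*T(n/b) + Theta(n^d) by the master theorem.
    Handles only polynomial driving functions f(n) = n^d."""
    crit = math.log(a, b)                  # critical exponent log_b a
    if math.isclose(d, crit):
        return f"Case 2: T(n) = Theta(n^{d} lg n)"
    if d < crit:
        return f"Case 1: T(n) = Theta(n^{round(crit, 3)})"
    return f"Case 3: T(n) = Theta(n^{d})"

print(master_theorem_poly(9, 3, 1))   # Case 1: T(n) = Theta(n^2.0)
print(master_theorem_poly(2, 2, 1))   # Case 2: T(n) = Theta(n^1 lg n)
print(master_theorem_poly(2, 2, 2))   # Case 3: T(n) = Theta(n^2)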

The Master Theorem
If T(n) = aT(n/b) + f(n), then the three cases above summarize the possible run times:
T(n) = Θ(n^(log_b a)) when f(n) is polynomially smaller than n^(log_b a), T(n) = Θ(n^(log_b a) lgn) when they grow at the same rate, and T(n) = Θ(f(n)) when f(n) is polynomially larger and satisfies the regularity condition.

Using The Master Method
T(n) = 9T(n/3) + n
a = 9, b = 3, f(n) = n, n^(log_b a) = n^(log_3 9) = Θ(n²)
Since f(n) = O(n^(log_3 9 − ε)), where ε = 1, case 1 applies:
Thus the solution is T(n) = Θ(n²)

Examples
T(n) = 2T(n/2) + n
a = 2, b = 2, log_2 2 = 1
Compare n^(log_2 2) with f(n) = n ⇒ f(n) = Θ(n) ⇒ Case 2 ⇒ T(n) = Θ(nlgn)

Examples
T(n) = 2T(n/2) + n²
a = 2, b = 2, log_2 2 = 1
Compare n with f(n) = n² ⇒ f(n) = Ω(n^(1+ε)), Case 3 ⇒ verify the regularity condition:
a f(n/b) ≤ c f(n) ⇒ 2n²/4 ≤ cn² ⇒ c = ½ is a solution (c < 1) ⇒ T(n) = Θ(n²)

Examples (cont.)
T(n) = 2T(n/2) + √n
a = 2, b = 2, log_2 2 = 1
Compare n with f(n) = n^(1/2) ⇒ f(n) = O(n^(1−ε)), Case 1 ⇒ T(n) = Θ(n)

Examples
T(n) = 3T(n/4) + nlgn
a = 3, b = 4, log_4 3 ≈ 0.793
Compare n^0.793 with f(n) = nlgn ⇒ f(n) = Ω(n^(log_4 3 + ε)), Case 3
Check the regularity condition: 3(n/4)lg(n/4) ≤ (3/4)nlgn = c f(n), with c = 3/4
T(n) = Θ(nlgn)

Examples
T(n) = 2T(n/2) + nlgn
a = 2, b = 2, log_2 2 = 1
Compare n with f(n) = nlgn: it seems like case 3 should apply, but f(n) must be polynomially larger, i.e. larger by a factor of n^ε. In this case it is only larger by a factor of lgn, so the master method does not apply.

The Iteration Method
Convert the recurrence into a summation and try to bound it using known series:
iterate the recurrence until the initial condition is reached;
use back-substitution to express the recurrence in terms of n and the initial (boundary) condition.

Iteration Method – Example
Assume: n = 2^k, and note T(n/2) = n/2 + 2T(n/4).
T(n) = n + 2T(n/2)
     = n + 2(n/2 + 2T(n/4))
     = n + n + 4T(n/4)
     = n + n + 4(n/4 + 2T(n/8))
     = n + n + n + 8T(n/8)
     …
     = in + 2^i T(n/2^i)
     = kn + 2^k T(1)
     = nlgn + nT(1) = Θ(nlgn)
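A quick numerical sanity check of this closed form (an illustrative sketch, assuming T(1) = 1 and n a power of two):

import math

def T(n):
    """Evaluate T(n) = n + 2*T(n/2) with T(1) = 1, for n a power of 2."""
    if n == 1:
        return 1
    return n + 2 * T(n // 2)

for k in (4, 8, 16):
    n = 2 ** k
    print(n, T(n), int(n * math.log2(n) + n))  # the two values agree: T(n) = n lg n + n T(1)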

Solving a recurrence relation using the iteration method
T(n) = T(7n/8) + 2n
     = T(49n/64) + 2(7n/8) + 2n
     = T(343n/512) + 2(49n/64) + 2(7n/8) + 2n
     …
     = T((7/8)^k n) + 2n((7/8)^(k-1) + … + (7/8) + 1)
     = T(1) + 2n((7/8)^(k-1) + … + 1), when (7/8)^k n = 1
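The parenthesized sum is a geometric series bounded by a constant, which completes the computation:

$$\sum_{i=0}^{k-1}\left(\frac{7}{8}\right)^{i} < \sum_{i=0}^{\infty}\left(\frac{7}{8}\right)^{i} = \frac{1}{1-7/8} = 8, \qquad \text{so} \qquad T(n) \le T(1) + 2n \cdot 8 = \Theta(n).$$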

The recursion-tree method
Convert the recurrence into a tree:
each node represents the cost incurred at the various levels of recursion;
sum up the costs of all levels.
Used to “guess” a solution for the recurrence.

Example 1
W(n) = 2W(n/2) + n²
Subproblem size at level i is: n/2^i
Subproblem size hits 1 when 1 = n/2^i ⇒ i = lgn
Cost of the problem at level i = (n/2^i)²
No. of nodes at level i = 2^i
Total cost: W(n) = O(n²)
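Summing nodes times per-node cost over all levels gives the stated bound (W(1) is the base-case cost):

$$W(n) = \sum_{i=0}^{\lg n - 1} 2^{i}\left(\frac{n}{2^{i}}\right)^{2} + 2^{\lg n}\,W(1) = n^{2}\sum_{i=0}^{\lg n - 1}\left(\frac{1}{2}\right)^{i} + n\,W(1) \le 2n^{2} + O(n) = O(n^{2}).$$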

Example 2
E.g.: T(n) = 3T(n/4) + cn²
Subproblem size at level i is: n/4^i
Subproblem size hits 1 when 1 = n/4^i ⇒ i = log_4 n
Cost of a node at level i = c(n/4^i)²
Number of nodes at level i = 3^i ⇒ last level has 3^(log_4 n) = n^(log_4 3) nodes
Total cost: T(n) = O(n²)
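The per-level costs again form a decreasing geometric series, so the root level dominates:

$$T(n) = \sum_{i=0}^{\log_4 n - 1} 3^{i}\,c\left(\frac{n}{4^{i}}\right)^{2} + \Theta\!\left(n^{\log_4 3}\right) = cn^{2}\sum_{i=0}^{\log_4 n - 1}\left(\frac{3}{16}\right)^{i} + \Theta\!\left(n^{\log_4 3}\right) \le \frac{16}{13}\,cn^{2} + o(n^{2}) = O(n^{2}).$$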

Example 3 (simpler proof)
W(n) = W(n/3) + W(2n/3) + n
The longest path from the root to a leaf is: n → (2/3)n → (2/3)²n → … → 1
Subproblem size hits 1 when 1 = (2/3)^i n ⇒ i = log_{3/2} n
Cost of the problem at level i = n
Total cost: W(n) = O(nlgn)
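Combining the two observations (at most log_{3/2} n levels, each costing at most n):

$$W(n) \le n \cdot \log_{3/2} n = \frac{n \lg n}{\lg(3/2)} = O(n \lg n).$$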

Example 3 - Substitution
W(n) = W(n/3) + W(2n/3) + O(n)
Guess: W(n) = O(nlgn)
Induction goal: W(n) ≤ dnlgn, for some d and n ≥ n0
Induction hypothesis: W(k) ≤ dklgk for any k < n (in particular for k = n/3 and k = 2n/3)
Proof of induction goal: try it out as an exercise!!
W(n) = O(nlgn)

The substitution method Guess a solution Use induction to prove that the solution works

Substitution method
Guess a solution: T(n) = O(g(n))
Induction goal: apply the definition of the asymptotic notation, T(n) ≤ c g(n), for some c > 0 and n ≥ n0
Induction hypothesis: T(k) ≤ c g(k) for all k < n (strong induction)
Prove the induction goal: use the induction hypothesis to find some values of the constants c and n0 for which the induction goal holds

Example: Binary Search
T(n) = d + T(n/2)
Guess: T(n) = O(lgn)
Induction goal: T(n) ≤ c lgn, for some c and n ≥ n0
Induction hypothesis: T(n/2) ≤ c lg(n/2)
Proof of induction goal:
T(n) = T(n/2) + d ≤ c lg(n/2) + d = c lgn – c + d ≤ c lgn if: –c + d ≤ 0, i.e. c ≥ d
Base case?

Example 2
T(n) = T(n-1) + n
Guess: T(n) = O(n²)
Induction goal: T(n) ≤ cn², for some c and n ≥ n0
Induction hypothesis: T(n-1) ≤ c(n-1)²
Proof of induction goal:
T(n) = T(n-1) + n ≤ c(n-1)² + n = cn² – (2cn – c – n) ≤ cn² if: 2cn – c – n ≥ 0 ⇒ c ≥ n/(2n-1) ⇒ c ≥ 1/(2 – 1/n)
For n ≥ 1, 2 – 1/n ≥ 1 ⇒ any c ≥ 1 will work

Example 3
T(n) = 2T(n/2) + n
Guess: T(n) = O(nlgn)
Induction goal: T(n) ≤ cnlgn, for some c and n ≥ n0
Induction hypothesis: T(n/2) ≤ c(n/2)lg(n/2)
Proof of induction goal:
T(n) = 2T(n/2) + n ≤ 2c(n/2)lg(n/2) + n = cnlgn – cn + n ≤ cnlgn if: –cn + n ≤ 0 ⇒ c ≥ 1
Base case?

Changing variables
T(n) = 2T(√n) + lgn
Rename: m = lgn ⇒ n = 2^m
T(2^m) = 2T(2^(m/2)) + m
Rename: S(m) = T(2^m)
S(m) = 2S(m/2) + m ⇒ S(m) = O(mlgm) (demonstrated before)
T(n) = T(2^m) = S(m) = O(mlgm) = O(lgn lglgn)
Idea: transform the recurrence to one that you have seen before

Example 2 - Substitution
T(n) = 3T(n/4) + cn²
Guess: T(n) = O(n²)
Induction goal: T(n) ≤ dn², for some d and n ≥ n0
Induction hypothesis: T(n/4) ≤ d(n/4)²
Proof of induction goal:
T(n) = 3T(n/4) + cn² ≤ 3d(n/4)² + cn² = (3/16)dn² + cn² ≤ dn² if: d ≥ (16/13)c
Therefore: T(n) = O(n²)