COSC 3101Winter, 2004 COSC 3101 Design and Analysis of Algorithms Lecture 2. Relevant Mathematics: –The time complexity of an algorithm –Adding made easy –Recurrence relations

COSC 3101Winter, 2004 Question from last class: recurrence relations. T(1) = k for some constant k; T(n) = 4 T(n/2) + k′n + k″ for some constants k′ and k″. MULT(X,Y): If |X| = |Y| = 1 then RETURN XY. Break X into a;b and Y into c;d. RETURN MULT(a,c)·2^n + (MULT(a,d) + MULT(b,c))·2^(n/2) + MULT(b,d). Question from last class: how does MULT know the length of X and Y (i.e. n)? Answer: the data type of X, Y, a, b, c, d must include a length field.
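As a concrete illustration of the recursion above, here is a minimal Python sketch (not the course's code) of MULT with its four half-size recursive calls; it assumes the bit-length n is a power of two and is passed along explicitly, playing the role of the length field.

def mult(x, y, n):
    # Multiply two n-bit non-negative integers with 4 recursive half-size multiplications.
    if n == 1:
        return x * y                            # base case: single-bit product
    half = n // 2
    a, b = x >> half, x & ((1 << half) - 1)     # X = a;b (high and low halves)
    c, d = y >> half, y & ((1 << half) - 1)     # Y = c;d
    return ((mult(a, c, half) << n)
            + ((mult(a, d, half) + mult(b, c, half)) << half)
            + mult(b, d, half))                 # ac·2^n + (ad + bc)·2^(n/2) + bd

assert mult(13, 11, 4) == 143                   # sanity check on 4-bit inputs

The four recursive calls per frame are exactly what gives the recurrence T(1) = 1, T(n) = 4 T(n/2) + k′n + k″ (the k′n term counts the n-bit additions and shifts).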

COSC 3101Winter, 2004 The Time Complexity of an Algorithm Specifies how the running time depends on the size of the input.

COSC 3101Winter, 2004 Purpose?

COSC 3101Winter, 2004 Purpose To estimate how long a program will run. To estimate the largest input that can reasonably be given to the program. To compare the efficiency of different algorithms. To help focus on the parts of code that are executed the largest number of times. To choose an algorithm for an application.

COSC 3101Winter, 2004 Time Complexity Is a Function Specifies how the running time depends on the size of the input. A function mapping “size” of input → “time” T(n) executed.

COSC 3101Winter, 2004 Definition of Time?

COSC 3101Winter, 2004 Definition of Time # of seconds (machine dependent). # lines of code executed. # of times a specific operation is performed (e.g., addition). Which?

COSC 3101Winter, 2004 Definition of Time # of seconds (machine dependent). # lines of code executed. # of times a specific operation is performed (e.g., addition). These are all reasonable definitions of time, because they are within a constant factor of each other.

COSC 3101Winter, 2004 Size of Input Instance? Suppose the input to the algorithm is the integer 83920.

COSC 3101Winter, 2004 Size of Input Instance. Which measures of “size” are reasonable? Size of paper: n = 2 in² (intuitive). # of bits: n = 17 bits (formal). # of digits: n = 5 digits (reasonable, because # of bits ≈ 3.32 · # of digits). Value: n = 83920 (unreasonable, because # of bits = log₂(Value), i.e. Value = 2^(# of bits)).

COSC 3101Winter, 2004 Two Example Algorithms Sum N array entries: A(1) + A(2) + A(3) + … Factor value N: Input N=5917 & Output N=97*61. Algorithm: N/2, N/3, N/4, …. ? Time?

COSC 3101Winter, 2004 Two Example Algorithms Sum N array entries: A(1) + A(2) + A(3) + … Factor value N: Input N=5917 & Output N=97*61. Algorithm N/2, N/3, N/4, …. ? Time = N

COSC 3101Winter, 2004 Two Example Algorithms Sum N array entries: A(1) + A(2) + A(3) + … Factor value N: Input N=5917 & Output N=97*61. Algorithm N/2, N/3, N/4, …. ? Is this reasonable? Time = N

COSC 3101Winter, 2004 Two Example Algorithms Sum N array entries: A(1) + A(2) + A(3) + … Factor value N: Input N=5917 & Output N=97*61. Algorithm N/2, N/3, N/4, …. ? No! Time = N One is considered fast and the other slow!

COSC 3101Winter, 2004 Two Example Algorithms Sum N array entries: A(1) + A(2) + A(3) + … Factor value N: Input N=5917 & Output N=97*61. Algorithm N/2, N/3, N/4, …. ? Size of Input Instance?

COSC 3101Winter, 2004 Two Example Algorithms Sum N array entries: A(1) + A(2) + A(3) + … Factor value N: Input N=5917 & Output N=97*61. Algorithm N/2, N/3, N/4, …. ? size n = N size n = log N

COSC 3101Winter, 2004 Two Example Algorithms Sum N array entries: A(1) + A(2) + A(3) + … Factor value N: Input N=5917 & Output N=97*61. Algorithm N/2, N/3, N/4, …. ? Time as function of input size? size n = N size n = log N

COSC 3101Winter, 2004 Two Example Algorithms. Sum N array entries: A(1) + A(2) + A(3) + … Factor value N: Input N=5917 & Output N=97*61. Algorithm N/2, N/3, N/4, …. Time as a function of input size: summing takes Time = N = n (size n = N); factoring takes Time = N = 2^n (size n = log N).

COSC 3101Winter, 2004 Two Example Algorithms. Sum N array entries: A(1) + A(2) + A(3) + … Factor value N: Input N=5917 & Output N=97*61. Algorithm N/2, N/3, N/4, …. Linear vs Exponential Time! Time = N = n for summing (size n = N); Time = N = 2^n for factoring (size n = log N).
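To see the contrast concretely, here is a small sketch (my own illustration, not from the slides) that counts basic operations for the two algorithms: summing is linear in the input size n = N, while trial division N/2, N/3, N/4, … costs up to about N = 2^n steps, where n = log₂ N is the number of bits.

def sum_entries(a):
    ops, total = 0, 0
    for x in a:              # one addition per entry: ~N operations
        total += x
        ops += 1
    return total, ops

def factor(N):
    ops = 0
    for d in range(2, N):    # trial division N/2, N/3, N/4, ... as on the slide
        ops += 1
        if N % d == 0:
            return d, N // d, ops
    return N, 1, ops

print(sum_entries(list(range(5917)))[1])   # ~5917 operations: linear in n = N
print(factor(5917))                        # (61, 97, 60); up to N = 2^n steps in the worst case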

COSC 3101Winter, 2004 Size of Input Instance? 14,23,25,30,31,52,62,79,88,98

COSC 3101Winter, 2004 Size of Input Instance # of elements: n = 10 elements 14,23,25,30,31,52,62,79,88,98 10

COSC 3101Winter, 2004 Size of Input Instance # of elements: n = 10 elements 14,23,25,30,31,52,62,79,88,98 10 Is this reasonable?

COSC 3101Winter, 2004 Size of Input Instance # of elements: n = 10 elements ~ Reasonable 14,23,25,30,31,52,62,79,88,98 10 If each element has size c # of bits = c * # of elements

COSC 3101Winter, 2004 Size of Input Instance # of elements: n = 10 elements Reasonable 6, 3, 8, 1, 2, 10, 4, 9, 7, 5 10 If each element is in [1..n] each element has size log n # of bits = n log n ≈ n

COSC 3101Winter, 2004 Time Complexity Is a Function Specifies how the running time depends on the size of the input. A function mapping: –“size” of input → “time” T(n) executed. Or more precisely: –# of bits n needed to represent the input → # of operations T(n) executed.

COSC 3101Winter, 2004 Which Input of size n? There are 2^n inputs of size n. Which do we consider for the time T(n)?

COSC 3101Winter, 2004 Which Input of size n? Worst Case: provides an upper bound for all possible inputs; easiest to compute. Average Case: but for what distribution? Typical Input: but what is typical?

COSC 3101Winter, 2004 Time Complexity of an Algorithm The time complexity of an algorithm is the largest time required on any input of size n. O(n²): Prove that for every input of size n, the algorithm takes no more than c·n² time. Ω(n²): Find one input of size n for which the algorithm takes at least this much time. θ(n²): Do both.

COSC 3101Winter, 2004 What is the height of the tallest person in the class? Bigger than this? Need to look at only one person. Smaller than this? Need to look at every person.

COSC 3101Winter, 2004 Time Complexity of a Problem The time complexity of a problem is the time complexity of the fastest algorithm that solves the problem. O(n²): Provide an algorithm that solves the problem in no more than this time. Ω(n²): Prove that no algorithm can solve it faster. θ(n²): Do both.

COSC 3101Winter, 2004 Adding: The Classic Techniques Evaluating ∑ i=1..n f(i).

COSC 3101Winter, 2004 Arithmetic Sum ∑ i=1..n i = 1 + 2 + 3 + … + n = ?

COSC 3101Winter, 2004 1 + 2 + 3 + … + n−1 + n = S; n + n−1 + n−2 + … + 2 + 1 = S. Adding column by column: (n+1) + (n+1) + (n+1) + … + (n+1) + (n+1) = 2S, so n(n+1) = 2S.

COSC 3101Winter, 2004 Let’s restate this argument using a geometric representation Algebraic argument

COSC 3101Winter, 2004 Geometric representation: 1 + 2 + 3 + … + n = number of white dots = number of yellow dots. Together the white and yellow dots fill an n × (n+1) grid, so there are n(n+1) dots in the grid and 2S = n(n+1). Note: S = n(n+1)/2 = θ(# of terms · last term).

COSC 3101Winter, 2004 Arithmetic Sum ∑ i=1..n i = 1 + 2 + 3 + … + n = θ(# of terms · last term). True whenever terms increase slowly.
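A quick numeric check of this rule (my own illustration): the exact value n(n+1)/2 stays within a constant factor (about ½) of (# of terms)·(last term) = n·n.

for n in (10, 100, 1000):
    s = sum(range(1, n + 1))
    assert s == n * (n + 1) // 2      # exact closed form
    print(n, s / (n * n))             # ratio tends to 1/2, so the sum is Θ(n · n)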

COSC 3101Winter, 2004 Geometric Sum ∑ i=0..n 2^i = 1 + 2 + 4 + 8 + … + 2^n = ?

COSC 3101Winter, 2004 Geometric Sum

COSC 3101Winter, 2004 Geometric Sum ∑ i=0..n 2^i = 1 + 2 + 4 + 8 + … + 2^n = 2 · last term − 1 = 2^(n+1) − 1.

COSC 3101Winter, 2004 Geometric Sum ∑ i=0..n r^i = r^0 + r^1 + r^2 + … + r^n = ?

COSC 3101Winter, 2004 Geometric Sum S = 1 + r + r² + … + r^n; rS = r + r² + … + r^n + r^(n+1). Subtracting, rS − S = r^(n+1) − 1, so S = (r^(n+1) − 1) / (r − 1).

COSC 3101Winter, 2004 Geometric Sum ∑ i=0..n r^i = (r^(n+1) − 1) / (r − 1). When r > 1 this is θ(r^n), i.e. θ(biggest term).

COSC 3101Winter, 2004 Geometric Increasing ∑ i=0..n r^i = r^0 + r^1 + r^2 + … + r^n = θ(biggest term). True whenever terms increase quickly.

COSC 3101Winter, 2004 Geometric Sum ∑ i=0..n r^i = (1 − r^(n+1)) / (1 − r). When r < 1 this is θ(1), i.e. θ(biggest term).

COSC 3101Winter, 2004 Bounded Tail ∑ i=0..n r^i = r^0 + r^1 + r^2 + … + r^n = θ(1). True whenever terms decrease quickly.

COSC 3101Winter, 2004 Harmonic Sum ∑ i=1..n 1/i = 1/1 + 1/2 + 1/3 + 1/4 + 1/5 + … + 1/n = ?

COSC 3101Winter, 2004 f(i) = 1 ∑ i=1..n f(i) = n n Sum of Shrinking Function

COSC 3101Winter, 2004 Sum of Shrinking Function f(i) = ? ∑ i=1..n f(i) = θ(n^(1/2))

COSC 3101Winter, 2004 Sum of Shrinking Function f(i) = 1/2^i: ∑ i=0..n f(i) ≈ 2

COSC 3101Winter, 2004 f(i) = 1/i ∑ i=1..n f(i) = ? n Sum of Shrinking Function

COSC 3101Winter, 2004 Harmonic Sum NB: Error in Jeff’s notes, p.30, bottom:

COSC 3101Winter, 2004 Harmonic Sum ∑ i=1..n 1/i = 1/1 + 1/2 + 1/3 + 1/4 + 1/5 + … + 1/n = θ(log(n))

COSC 3101Winter, 2004 Approximating Sum by Integrating The area under the curve approximates the sum ∑ i=1..n f(i) ≈ ∫ x=1..n f(x) dx
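A short illustration of this idea (the example functions and names are my own): comparing ∑ i=1..n f(i) with the antiderivative difference F(n) − F(1).

import math

def sum_vs_integral(f, F, n):
    s = sum(f(i) for i in range(1, n + 1))
    return s, F(n) - F(1)          # the integral from 1 to n approximates the sum

print(sum_vs_integral(lambda x: 1 / x, math.log, 1000))             # harmonic: ≈ ln n
print(sum_vs_integral(lambda x: x * x, lambda x: x ** 3 / 3, 1000)) # ≈ n³/3 = Θ(n · f(n))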

COSC 3101Winter, 2004 Approximating Sums by Integrating: Arithmetic Sums

COSC 3101Winter, 2004 Approximating Sums by Integrating: Geometric Sums

COSC 3101Winter, 2004 Harmonic Sum Approximating Sum by Integrating

COSC 3101Winter, 2004 Approximating Sum by Integrating Problem: Integrating may be hard too.

COSC 3101Winter, 2004 Adding Made Easy We will now classify (most) functions f(i) into four classes: –Geometric Like –Arithmetic Like –Harmonic –Bounded Tail For each class, we will give an easy rule for approximating its sum θ( ∑ i=1..n f(i) )

COSC 3101Winter, 2004 Classifying Animals Vertebrates Birds Mammals Reptiles Fish Dogs Giraffe

COSC 3101Winter, 2004 Classifying Functions: polynomial means n^θ(1), exponential means 2^θ(n). Example: f(n) = 8 · 2^n / n^100 = θ(2^n / n^100): the 2^n factor is significant, the n^100 factor is less significant, and the constant 8 is irrelevant (and 2^(0.5n) << 2^n). Since f(n) is in 2^θ(n), ∑ i=1..n f(i) = θ(f(n)) = θ(2^n / n^100).

COSC 3101Winter, 2004 Adding Made Easy Four Classes of Functions

COSC 3101Winter, 2004 Adding Made Easy

COSC 3101Winter, 2004 Geometric Like: f(n) ∈ 2^Ω(n) ⇒ ∑ i=1..n f(i) = θ(f(n)). If the terms f(i) grow sufficiently quickly, then the sum will be dominated by the largest term. Classic example: ∑ i=1..n 2^i = 2^(n+1) − 1 ≈ 2·f(n).

COSC 3101Winter, 2004 Geometric Like: f(n) ∈ 2^Ω(n) ⇒ ∑ i=1..n f(i) = θ(f(n)). For which functions f(i) is this true? How fast and how slowly can f grow?

COSC 3101Winter, 2004 Geometric Like: when f(n) = r^n = 2^θ(n), ∑ i=1..n r^i = (r^(n+1) − 1)/(r − 1) = θ(r^n) = θ(last term) = θ(f(n)) for r > 1. Lower Extreme: ∑ i=1..n (1.0001)^i ≈ 10,000 · (1.0001)^n = 10,000 · f(n); because the constant is sooo big, the statement is just barely true. Upper Extreme: ∑ i=1..n (1000)^i ≈ 1.001 · (1000)^n = 1.001 · f(n). Even bigger? There is no upper extreme: ∑ i=1..n 2^(2^i) ≈ 2^(2^n) = 1 · f(n). And all functions in between.

COSC 3101Winter, 2004 Geometric Like: f(n) ∈ 2^Ω(n) ⇒ ∑ i=1..n f(i) = θ(f(n)). Example: f(n) = 8 · 2^n / n^100. The easy direction: ∑ i=1..n f(i) ≥ f(n). The hard direction: ∑ i=1..n f(i) ≤ c · f(n). Key step: f(i+1)/f(i) = [8 · 2^(i+1) / (i+1)^100] / [8 · 2^i / i^100] = 2 / (1 + 1/i)^100 ≥ 1/0.51 for sufficiently large i, so f(i) ≤ 0.51 · f(i+1); the terms shrink geometrically as i decreases, so the sum is within a constant factor of the last term.

COSC 3101Winter, 2004

COSC 3101Winter, 2004 Geometric Like: f(n) ∈ 2^Ω(n) ⇒ ∑ i=1..n f(i) = θ(f(n)). In general, if f(n) = c · b^(a·n) · n^d · log^e(n) with c > 0, a > 0, b > 1, d ∈ (−∞, ∞), e ∈ (−∞, ∞), then f(n) ∈ 2^Ω(n) and ∑ i=1..n f(i) = θ(f(n)).

COSC 3101Winter, 2004 f(n)  2 Ω(n)  ∑ i=1..n f(i) = θ(f(n)) Geometric Like: Examples:

COSC 3101Winter, 2004 Geometric Like: f(n) ∈ 2^Ω(n) ⇒ ∑ i=1..n f(i) = θ(f(n)). Do all functions in 2^Ω(n) have this property? Maybe not: functions that oscillate with exponentially increasing amplitude do not. But functions expressed with +, −, ×, ÷, exp, log do not oscillate continually; they are well behaved for sufficiently large n, and these do have the property.

COSC 3101Winter, 2004 Adding Made Easy Done

COSC 3101Winter, 2004 Adding Made Easy

COSC 3101Winter, 2004 Arithmetic Like: f(n) = n^(θ(1)−1) ⇒ ∑ i=1..n f(i) = θ(n·f(n)). If the terms f(i) are increasing or decreasing relatively slowly, then the sum is roughly the number of terms, n, times the final value.

COSC 3101Winter, 2004 Arithmetic Like: f(n) = n^(θ(1)−1) ⇒ ∑ i=1..n f(i) = θ(n·f(n)). Simple example: ∑ i=1..n 1 = n · 1.

COSC 3101Winter, 2004 Arithmetic Like: is the statement true for ∑ i=1..n i = 1 + 2 + 3 + … + n? Half the terms are roughly the same (within a multiplicative constant): the last n/2 terms each lie between n/2 and n, so the sum is roughly the number of terms, n, times this value: ∑ i=1..n i = θ(n · n). Likewise for ∑ i=1..n i² = 1² + 2² + 3² + … + n²: the last n/2 terms each lie between (n/2)² = ¼n² and n², so ∑ i=1..n i² = θ(n · n²).

COSC 3101Winter, 2004 Arithmetic Like: consider an increasing f(n) = n^θ(1). Then (area of small square) n/2 · f(n/2) ≤ ∑ i=1..n f(i) ≈ area under the curve ≤ n · f(n) (area of big square). Is n/2 · f(n/2) = θ(n · f(n))?

COSC 3101Winter, 2004 Arithmetic Like: the key property is f(n/2) = θ(f(n)). For f(n) = n²: f(n/2) = (n/2)² = ¼n² = θ(n²) = θ(f(n)), so ∑ i=1..n i² = θ(n · f(n)) = θ(n · n²). For f(n) = n^r: f(n/2) = (n/2)^r = (1/2^r) · n^r = θ(n^r) = θ(f(n)), so ∑ i=1..n i^r = 1^r + 2^r + 3^r + … + n^r = θ(n · n^r). For f(n) = 2^n the key property fails: f(n/2) = 2^(n/2) ≠ θ(2^n) = θ(f(n)), and indeed ∑ i=1..n 2^i = 2 + 4 + … + 2^n ≠ θ(n · 2^n).

COSC 3101Winter, 2004 Arithmetic Like: f(n) = n^(θ(1)−1) ⇒ ∑ i=1..n f(i) = θ(n·f(n)), for the upper extreme and all functions in between.

COSC 3101Winter, 2004 Adding Made Easy Half done

COSC 3101Winter, 2004 f(i) = 1 ∑ i=1..n f(i) = n n Sum of Shrinking Function

COSC 3101Winter, 2004 Sum of Shrinking Function f(i) = 1/i^(1/2): ∑ i=1..n f(i) = θ(n^(1/2))

COSC 3101Winter, 2004 f(i) = 1/i ∑ i=1..n f(i) = log n n Sum of Shrinking Function

COSC 3101Winter, 2004 Sum of Shrinking Function f(i) = 1/2^i: ∑ i=0..n f(i) ≈ 2

COSC 3101Winter, 2004 Arithmetic Like: f(n) = n^(θ(1)−1) ⇒ ∑ i=1..n f(i) = θ(n·f(n)). If most of the terms f(i) have roughly the same value, then the sum is roughly the number of terms, n, times this value. Does the statement hold for functions f(i) that shrink? In particular, does it hold for the Harmonic Sum ∑ i=1..n 1/i = 1/1 + 1/2 + 1/3 + 1/4 + 1/5 + … + 1/n? If it did, we would have ∑ i=1..n f(i) = θ(n · f(n)) = θ(n · 1/n) = θ(1); but ∑ i=1..n 1/i = θ(log n) ≠ θ(1). No, the statement does not hold!

COSC 3101Winter, 2004 Adding Made Easy not included

COSC 3101Winter, 2004 Arithmetic Like: does the statement hold for the almost Harmonic Sum ∑ i=1..n 1/i^(1−ε), which shrinks slightly slower than harmonic? Here ∑ i=1..n f(i) = θ(n^ε), and θ(n · f(n)) = θ(n · 1/n^(1−ε)) = θ(n^ε). These agree, so the statement does hold!

COSC 3101Winter, 2004 Arithmetic Like: Lower Extreme: ∑ i=1..n 1/i^(1−ε) = θ(n^ε) = θ(n · f(n)). Intermediate Case: ∑ i=1..n 1 = n · 1 = n · f(n). Upper Extreme: ∑ i=1..n i^1000 ≈ (1/1001) · n^1001 = (1/1001) · n · f(n). And all functions in between. Conclusion: if f(n) = c · b^(a·n) · n^d · log^e(n) with c > 0, a = 0 or b = 1, d > −1, e ∈ (−∞, ∞), then f(n) = n^(−1+θ(1)) and ∑ i=1..n f(i) = θ(n · f(n)). (For +, −, ×, ÷, exp, log functions f(n).)

COSC 3101Winter, 2004 Adding Made Easy Done

COSC 3101Winter, 2004 Adding Made Easy Harmonic

COSC 3101Winter, 2004 Harmonic Sum ∑ i=1..n 1/i = 1/1 + 1/2 + 1/3 + 1/4 + 1/5 + … + 1/n = θ(log(n))

COSC 3101Winter, 2004 Adding Made Easy

COSC 3101Winter, 2004 Bounded Tail: f(n) ≤ n^(−1−Ω(1)) ⇒ ∑ i=1..n f(i) = θ(1). If the terms f(i) decrease towards zero sufficiently quickly, then the sum will be a constant. The classic example: ∑ i=0..n 1/2^i = 1 + 1/2 + 1/4 + 1/8 + … = 2. Upper Extreme: ∑ i=1..n 1/i^(1+ε) = θ(1). No Lower Extreme: even ∑ i=1..n 1/2^(2^i) = θ(1). And all functions in between.

COSC 3101Winter, 2004 Bounded Tail Conclusion

COSC 3101Winter, 2004 Adding Made Easy Done

COSC 3101Winter, 2004 Adding Made Easy Missing Functions: 1/(n · log n), log n / n

COSC 3101Winter, 2004 Adding Made Easy Geometric Like: If f(n) ∈ 2^Ω(n), then ∑ i=1..n f(i) = θ(f(n)). Arithmetic Like: If f(n) = n^(θ(1)−1), then ∑ i=1..n f(i) = θ(n · f(n)). Harmonic: If f(n) = 1/n, then ∑ i=1..n f(i) = log_e n + θ(1). Bounded Tail: If f(n) ≤ n^(−1−Ω(1)), then ∑ i=1..n f(i) = θ(1). (For +, −, ×, ÷, exp, log functions f(n).) This may seem confusing, but it is really not. It should help you compute most sums easily.
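Here is a hedged numeric illustration of the four rules; the example functions below are my own choices, one per class, and the printed values are only meant to look roughly constant.

import math
def S(f, n):                                   # S(f, n) = sum_{i=1..n} f(i)
    return sum(f(i) for i in range(1, n + 1))

geometric  = lambda i: 2.0 ** i / i            # f(n) in 2^Ω(n): sum = θ(f(n))
arithmetic = lambda i: i ** 1.5                # f(n) = n^(θ(1)-1): sum = θ(n·f(n))
harmonic   = lambda i: 1.0 / i                 # sum = ln n + θ(1)
bounded    = lambda i: 1.0 / i ** 2            # f(n) ≤ n^(-1-Ω(1)): sum = θ(1)

print(S(geometric, 50) / geometric(50))                 # ≈ 2: a constant times the last term
print(S(arithmetic, 2000) / (2000 * arithmetic(2000)))  # ≈ 0.4: a constant times n·f(n)
print(S(harmonic, 2000) - math.log(2000))               # ≈ 0.577: the θ(1) additive term
print(S(bounded, 2000))                                 # ≈ 1.64: a constant (π²/6)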

COSC 3101Winter, 2004 Recurrence Relations T(1) = 1 T(n) = a T(n/b) + f(n)

COSC 3101Winter, 2004 Recurrence Relations ⇐ Time of Recursive Program. Recurrence relations arise from the timing of recursive programs. Let T(n) be the # of “Hi”s on an input of “size” n. procedure Eg(int n): if (n ≤ 1) then put “Hi” else { loop i=1..f(n): put “Hi”; loop i=1..a: Eg(n/b) }.

COSC 3101Winter, 2004 Given size 1, the program outputs T(1) = 1 “Hi”. Given size n, the stack frame itself outputs f(n) “Hi”s; recursing on one instance of size n/b generates T(n/b) “Hi”s, and recursing a times generates a·T(n/b) “Hi”s. For a total of T(1) = 1, T(n) = a·T(n/b) + f(n) “Hi”s.
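A runnable Python transcription of procedure Eg (my own sketch, counting the “Hi”s rather than printing them), checked against the recurrence it induces; a = 4, b = 2 and f(n) = n are illustrative choices.

def eg(n, a=4, b=2, f=lambda n: n):
    if n <= 1:
        return 1                       # base case: one "Hi"
    hi = f(n)                          # this stack frame outputs f(n) "Hi"s
    for _ in range(a):                 # a recursive calls on instances of size n/b
        hi += eg(n // b, a, b, f)
    return hi

def T(n, a=4, b=2, f=lambda n: n):     # T(1) = 1, T(n) = a·T(n/b) + f(n)
    return 1 if n <= 1 else a * T(n // b, a, b, f) + f(n)

for n in (1, 2, 8, 64):
    assert eg(n) == T(n)               # the program's "Hi" count satisfies the recurrence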

COSC 3101Winter, 2004 Solving Technique 1: Guess and Verify. Recurrence Relation: T(1) = 1 & T(n) = 4T(n/2) + n. Guess: G(n) = 2n² − n. Verify: for n = 1, LHS T(1) = 2(1)² − 1 = 1 = RHS. For n > 1, LHS T(n) = 2n² − n; RHS 4T(n/2) + n = 4[2(n/2)² − (n/2)] + n = 2n² − n. They match.
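A quick sanity check of the guess (illustrative only; it tests the closed form on sample powers of two, it is not the induction proof):

def T(n):
    return 1 if n == 1 else 4 * T(n // 2) + n

for n in (1, 2, 4, 8, 16, 1024):
    assert T(n) == 2 * n * n - n       # G(n) = 2n² − n matches the recurrence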

COSC 3101Winter, 2004 Solving Technique 2: Guess Form and Calculate Coefficients. Recurrence Relation: T(1) = 1 & T(n) = 4T(n/2) + n. Guess: G(n) = an² + bn + c. Verify: LHS T(1) = a + b + c must equal 1, and LHS T(n) = an² + bn + c must equal RHS 4T(n/2) + n = 4[a(n/2)² + b(n/2) + c] + n = an² + (2b+1)n + 4c. Matching coefficients: c = 4c gives c = 0; b = 2b + 1 gives b = −1; a = a leaves a free; then a + b + c = 1 gives a = 2. Hence G(n) = 2n² − n.

COSC 3101Winter, 2004 Solving Technique 3: Approximate Form and Calculate Exponent. Recurrence Relation: T(1) = 1 & T(n) = aT(n/b) + f(n). Guess which is bigger, aT(n/b) or f(n). If aT(n/b) << f(n), simplify to T(n) ≈ f(n); in this case the answer is easy: T(n) = θ(f(n)). If aT(n/b) >> f(n), simplify to T(n) ≈ aT(n/b); in this case the answer is harder. For T(n) = aT(n/b), guess G(n) = c·n^α and verify: LHS T(n) = c·n^α; RHS aT(n/b) = a[c(n/b)^α] = c·a·b^(−α)·n^α. These match when 1 = a·b^(−α), i.e. b^α = a, i.e. α·log b = log a, i.e. α = log a / log b, so T(n) = c·n^(log a / log b). For example, T(n) = 4T(n/2) gives T(n) = c·n^(log 4 / log 2) = c·n². Summary: if f(n) is bigger then T(n) = θ(f(n)); if aT(n/b) is bigger then T(n) = θ(n^(log a / log b)). And if aT(n/b) ≈ f(n), what is T(n) then?
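An illustrative check (my own) that T(n) = a·T(n/b) with T(1) = 1 grows like n^(log a / log b); with a = 4, b = 2 the exponent is log 4 / log 2 = 2.

import math

def T(n, a=4, b=2):
    return 1 if n <= 1 else a * T(n // b, a, b)

alpha = math.log(4) / math.log(2)
for n in (4, 64, 1024):
    print(n, T(n), T(n) / n ** alpha)  # the ratio stays at 1.0, so T(n) = Θ(n^2)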

COSC 3101Winter, 2004 Technique 4: Decorate The Tree. T(n) = a·T(n/b) + f(n), T(1) = 1. [Recursion tree: the root does f(n) work and has a children, each the root of a T(n/b) subtree; the leaves are base cases with T(1) = 1.]


COSC 3101Winter, 2004 Evaluating T(n) = aT(n/b) + f(n) level by level: at level i the instance size is n/b^i, each stack frame does f(n/b^i) work, and there are a^i stack frames. The base case is reached at level h where the instance size n/b^h = 1, i.e. b^h = n, i.e. h·log b = log n, so h = log n / log b. The number of base-case stack frames is a^h = a^(log n / log b) = n^(log a / log b).

COSC 3101Winter, 2004 Evaluating: T(n) = aT(n/b) + f(n).
Level 0: instance size n, work in stack frame f(n), 1 stack frame, work in level 1 · f(n).
Level 1: instance size n/b, work in stack frame f(n/b), a stack frames, work in level a · f(n/b).
Level 2: instance size n/b², work in stack frame f(n/b²), a² stack frames, work in level a² · f(n/b²).
Level i: instance size n/b^i, work in stack frame f(n/b^i), a^i stack frames, work in level a^i · f(n/b^i).
Level h = log n / log b: instance size 1, work in stack frame T(1), n^(log a / log b) stack frames, work in level n^(log a / log b) · T(1).
Total Work: T(n) = ∑ i=0..h a^i · f(n/b^i).
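A small sketch (my own) that evaluates this level-by-level sum and checks it against the recurrence, assuming n is a power of b:

def T(n, a, b, f):
    return 1 if n <= 1 else a * T(n // b, a, b, f) + f(n)

def tree_sum(n, a, b, f):
    h, m = 0, n
    while m > 1:                       # h = log n / log b levels above the base cases
        m //= b
        h += 1
    return sum(a ** i * f(n // b ** i) for i in range(h)) + a ** h * 1  # leaves cost T(1) = 1

a, b, f = 2, 2, lambda n: n
for n in (8, 64, 1024):
    assert T(n, a, b, f) == tree_sum(n, a, b, f)   # the two evaluations agree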

COSC 3101Winter, 2004 Evaluating: T(n) = aT(n/b) + f(n) = ∑ i=0..h a^i · f(n/b^i). If this is a geometric sum, then ∑ i=0..n x^i = θ(max(first term, last term)).

COSC 3101Winter, 2004 Evaluating: T(n) = aT(n/b) + f(n). The level sums 1·f(n), a·f(n/b), a²·f(n/b²), …, n^(log a / log b)·T(1) typically form a geometric sequence, so the total is dominated either by the top level or by the base cases.

COSC 3101Winter, 2004 Evaluating: T₂(n) = 4 T₂(n/2 + n^(1/2)) + n³. This is sufficiently close to T(n) = 4T(n/2) + n³ = θ(n³) that T₂(n) = θ(n³) as well.

COSC 3101Winter, 2004 Evaluating: T(n) = aT(n−b) + f(n).
Level 0: instance size n, work in stack frame f(n), 1 stack frame, work in level 1 · f(n).
Level 1: instance size n−b, work in stack frame f(n−b), a stack frames, work in level a · f(n−b).
Level 2: instance size n−2b, work in stack frame f(n−2b), a² stack frames, work in level a² · f(n−2b).
Level i: instance size n−ib, work in stack frame f(n−ib), a^i stack frames, work in level a^i · f(n−ib).
Level h: instance size n−hb = 0 (|base case| = 0), so h = n/b; a^h stack frames, work in level a^(n/b) · T(0).
The number of base cases a^(n/b) is exponential, so the sum is likely dominated by the base cases.

COSC 3101Winter, 2004 Evaluating: T(n) = 1·T(n−b) + f(n). With a = 1 there is one stack frame per level, of sizes n, n−b, n−2b, …, 0, each doing f(instance size) work. Total Work: T(n) = ∑ i=0..h f(b·i) = θ(f(n)) or θ(n · f(n)), depending on whether f is geometric like or arithmetic like.
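A brief illustration (the example functions are my own) of the two outcomes for T(n) = 1·T(n−b) + f(n):

def T(n, b, f):
    return 0 if n <= 0 else T(n - b, b, f) + f(n)

arithmetic_like = lambda n: n * n      # sum ≈ n³/3 = θ(n · f(n))
geometric_like  = lambda n: 2 ** n     # sum ≈ 2^(n+1) = θ(f(n))
n = 30
print(T(n, 1, arithmetic_like) / (n * arithmetic_like(n)))   # ≈ 1/3
print(T(n, 1, geometric_like) / geometric_like(n))           # ≈ 2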

COSC 3101Winter, 2004 End