Relevant Mathematics Lecture 2 Classifying Functions The Time Complexity of an Algorithm Adding Made Easy Recurrence Relations.


Relevant Mathematics Lecture 2 Classifying Functions The Time Complexity of an Algorithm Adding Made Easy Recurrence Relations

Start With Some Math: Input Size vs. Time; Classifying Functions f(i); Recurrence Relations T(n) = a·T(n/b) + f(n); Adding Made Easy ∑_{i=1..n} f(i); Time Complexity t(n) = θ(n^2).

Assumed Knowledge: Read the sections on Existential and Universal Quantifiers, Logarithms and Exponentials. E.g. ∀g ∃b Loves(b,g) vs. ∃b ∀g Loves(b,g).

Classifying Functions Giving an idea of how fast a function grows without going into too much detail.

Which are more alike?

Mammals

Which are more alike?

Dogs

Classifying Animals Vertebrates BirdsMammals Reptiles Fish Dogs Giraffe

Classifying Functions: how fast T(n) grows with n (approximate values, with a size analogy).

  n:        10       100      1,000     10,000
  log n     ≈3       ≈7       ≈10       ≈13         amoeba
  n^1/2     ≈3       10       ≈32       100         bird
  n         10       100      1,000     10,000      human
  n log n   ≈30      ≈700     ≈10,000   ≈130,000    my father
  n^2       100      10,000   10^6      10^8        elephant
  n^3       1,000    10^6     10^9      10^12       dinosaur
  2^n       ≈10^3    ≈10^30   ≈10^301   ≈10^3010    the universe

Note: The universe contains approximately ... particles.

Which are more alike? n^1000, n^2, 2^n

Polynomials: n^1000 and n^2 (not 2^n).

Which are more alike? 1000n^2, 3n^2, 2n^3

Quadratic: 1000n^2 and 3n^2 (not 2n^3).

Classifying Functions? Functions

Classifying Functions. Functions: Constant (5), Logarithmic (5 log n), Poly-Logarithmic ((log n)^5), Polynomial (n^5), Exponential (2^{5n}), Exp (2^{n^5}), Double Exp (2^{2^{5n}}).

Classifying Functions? Polynomial

Classifying Functions. Polynomial: Linear (5n), Quadratic (5n^2), Cubic (5n^3), ? (5n^4), ...

Logarithmic: log_10 n = # of digits to write n; log_2 n = # of bits to write n = 3.32·log_10 n; log(n^1000) = 1000·log(n). These differ only by a multiplicative constant.

Poly-Logarithmic: (log n)^5 = log^5 n.

Logarithmic << Polynomial: for sufficiently large n, log^1000 n << n^0.001.

Linear << Quadratic: for sufficiently large n, n << n^2.

Polynomial << Exponential: for sufficiently large n, n^1000 << 2^n.

Ordering Functions. Functions: Constant (5) << Logarithmic (5 log n) << Poly-Logarithmic ((log n)^5) << Polynomial (n^5) << Exponential (2^{5n}) << Exp (2^{n^5}) << Double Exp (2^{2^{5n}}).

Which Functions are Constant? 5 — yes; 1,000,000,000,000 — yes; 8 + sin(n) — ?

Which Functions are “Constant”? 8 + sin(n) lies in between 7 and 9. The running time of such an algorithm is a “constant”: it does not depend significantly on the size of the input.

Which Functions are Quadratic? n^2 ... ?

Which Functions are Quadratic? n^2, and any constant times n^2 (e.g. 0.001·n^2, 1000·n^2).

Which Functions are Quadratic? n^2, 0.001·n^2, 1000·n^2, and also 5n^2 + 3n + 2·log n ...

Which Functions are Quadratic? 5n^2 + 3n + 2·log n lies in between.

Which Functions are Quadratic? Ignore low-order terms, ignore multiplicative constants, ignore "small" values of n. Write θ(n^2).

Which Functions are Polynomial? n^5 ... ?

Which Functions are Polynomial? n^c: n raised to some constant power.

Which Functions are Polynomial? n^c, and also 5n^2 + 8n + 2·log n, 5n^2·log n, 5n^2.5 ...

Which Functions are Polynomial? 5n^2 + 8n + 2·log n, 5n^2·log n, 5n^2.5 lie in between.

Which Functions are Polynomials? Ignore low-order terms, ignore the constant in the power, ignore "small" values of n. Write n^θ(1).

Which Functions are Exponential? 2^n ... ?

Which Functions are Exponential? 2^{cn}: 2 raised to the power of n times some constant.

Which Functions are Exponential? 2^n, 8^n, 2^n/n^100, 2^n·n^100 — too small? too big?

Which Functions are Exponential? 8^n = 2^{3n}; 2^n/n^100 > 2^{0.5n}; 2^n·n^100 < 2^{2n}. They lie in between.

Why 2^n/n^100 > 2^{0.5n}: since 2^{0.5n} > n^100 for large n, 2^n = 2^{0.5n}·2^{0.5n} > n^100·2^{0.5n}, hence 2^n/n^100 > 2^{0.5n}.

Which Functions are Exponential? Ignore low-order terms, ignore the base, ignore "small" values of n, ignore polynomial factors. Write 2^θ(n). Examples: 2^n, 8^n, 2^n/n^100, 2^n·n^100.

Classifying Functions. Functions: Constant = θ(1), Logarithmic = θ(log n), Poly-Logarithmic = (log n)^θ(1), Polynomial = n^θ(1), Exponential = 2^θ(n), Exp = 2^{n^θ(1)}, Double Exp = 2^{2^θ(n)}.

Notations: Theta, f(n) = θ(g(n)) means f(n) ≈ c·g(n); BigOh, f(n) = O(g(n)) means f(n) ≤ c·g(n); Omega, f(n) = Ω(g(n)) means f(n) ≥ c·g(n); Little Oh, f(n) = o(g(n)) means f(n) << c·g(n); Little Omega, f(n) = ω(g(n)) means f(n) >> c·g(n).

Definition of Theta f(n) = θ(g(n))

Definition of Theta. f(n) = θ(g(n)): f(n) is sandwiched between c_1·g(n) and c_2·g(n).

Definition of Theta. f(n) = θ(g(n)): f(n) is sandwiched between c_1·g(n) and c_2·g(n), for some sufficiently small c_1 and some sufficiently large c_2 (e.g. 1000).

Definition of Theta For all sufficiently large n f(n) = θ(g(n))

Definition of Theta For all sufficiently large n For some definition of “sufficiently large” f(n) = θ(g(n))

Definition of Theta: 3n^2 + 7n + 8 = θ(n^2)

Is c_1·n^2 ≤ 3n^2 + 7n + 8 ≤ c_2·n^2 for some c_1, c_2?

Try c_1 = 3 and c_2 = 4: is 3·n^2 ≤ 3n^2 + 7n + 8 ≤ 4·n^2?

At n = 1: 3·1^2 ≤ 3·1^2 + 7·1 + 8 ≤ 4·1^2, i.e. 3 ≤ 18 ≤ 4 — False.

At n = 7: 3·7^2 ≤ 3·7^2 + 7·7 + 8 ≤ 4·7^2, i.e. 147 ≤ 204 ≤ 196 — False.

At n = 8: 3·8^2 ≤ 3·8^2 + 7·8 + 8 ≤ 4·8^2, i.e. 192 ≤ 256 ≤ 256 — True.

At n = 9: 3·9^2 ≤ 3·9^2 + 7·9 + 8 ≤ 4·9^2 — True.

At n = 10: 3·10^2 ≤ 3·10^2 + 7·10 + 8 ≤ 4·10^2 — True.

For which n does 3·n^2 ≤ 3n^2 + 7n + 8 ≤ 4·n^2 hold?

It holds for all n ≥ 8.

3·n^2 ≤ 3n^2 + 7n + 8 ≤ 4·n^2 is True for n ≥ 8.

Why: 7n + 8 ≤ 1·n^2 whenever 7 + 8/n ≤ 1·n, i.e. whenever n ≥ 8.

Hence 3n^2 + 7n + 8 = θ(n^2): 3·n^2 ≤ 3n^2 + 7n + 8 ≤ 4·n^2 for all n ≥ 8.
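As an aside (not part of the slides), a short Python sketch can check this sandwich numerically; the constants 3 and 4 and the threshold n ≥ 8 are the ones chosen above.

```python
# Check numerically that 3*n^2 <= 3n^2 + 7n + 8 <= 4*n^2 holds exactly when n >= 8.
def f(n):
    return 3 * n * n + 7 * n + 8

for n in range(1, 1001):
    lower_ok = 3 * n * n <= f(n)      # always true
    upper_ok = f(n) <= 4 * n * n      # true once 7n + 8 <= n^2, i.e. n >= 8
    assert lower_ok
    assert upper_ok == (n >= 8)
print("sandwich holds for every n in 8..1000")
```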

Definition of Theta: is n^2 = θ(n^3)?

Is c_1·n^3 ≤ n^2 ≤ c_2·n^3 for some c_1, c_2?

With c_1 = 0: 0·n^3 ≤ n^2 ≤ c_2·n^3 is true, but ...

... the constants c_1 and c_2 must be positive! So n^2 = θ(n^3) is False.

Definition of Theta: is 3n^2 + 7n + 8 = θ(n)?

Is c_1·n ≤ 3n^2 + 7n + 8 ≤ c_2·n for some c_1, c_2?

Try c_1 = 3 and c_2 = 100: is 3·n ≤ 3n^2 + 7n + 8 ≤ 100·n?

For which n?

At n = 100: 3·100 ≤ 3·100^2 + 7·100 + 8 ≤ 100·100, i.e. 300 ≤ 30,708 ≤ 10,000 — False.

Try a bigger c_2 = 10,000: is 3·n ≤ 3n^2 + 7n + 8 ≤ 10,000·n?

At n = 10,000: 3·10,000 ≤ 3·10,000^2 + 7·10,000 + 8 ≤ 10,000·10,000 — False (the middle is about 3·10^8 > 10^8).

What is the reverse statement?

Understand Quantifiers!!! (Figure: boys Sam, Bob, John, Fred and girls Mary, Beth, Marilyn Monroe, Ann, with two different "loves" pairings.) Compare the statements ∃b Loves(b, MM) and ∀b Loves(b, MM) with their negations ¬[∃b Loves(b, MM)] and ¬[∀b Loves(b, MM)].

Definition of Theta: 3n^2 + 7n + 8 ≠ θ(n). The reverse statement: for every c_2 (and every n_0), there is some n ≥ n_0 with 3n^2 + 7n + 8 > c_2·n.

Is 3n^2 + 7n + 8 > 100·n for some n?

At n = 100: 3·100^2 + 7·100 + 8 > 100·100 — True.

Is 3n^2 + 7n + 8 > 10,000·n for some n?

At n = 10,000: 3·10,000^2 + 7·10,000 + 8 > 10,000·10,000 — True.

Is 3n^2 + 7n + 8 > c_2·n for some n, for an arbitrary c_2?

At n = c_2: 3·c_2^2 + 7·c_2 + 8 > c_2·c_2 — True.

And with a threshold n_0: is there an n ≥ n_0 with 3n^2 + 7n + 8 > c_2·n?

Yes, at n = max(c_2, n_0): 3·n^2 + 7·n + 8 > c_2·n — True. So 3n^2 + 7n + 8 ≠ θ(n).

Definition of Theta: f(n) = g(n)^θ(1) means ∃ c_1, c_2, n_0 such that ∀ n ≥ n_0, g(n)^{c_1} ≤ f(n) ≤ g(n)^{c_2}. Claim: 3n^2 + 7n + 8 = n^θ(1).

Is n^{c_1} ≤ 3n^2 + 7n + 8 ≤ n^{c_2} for some c_1, c_2?

Try c_1 = 2 and c_2 = 3: is n^2 ≤ 3n^2 + 7n + 8 ≤ n^3?

For which n? n ≥ ?

For n ≥ 5.

n^2 ≤ 3n^2 + 7n + 8 ≤ n^3 is True for n ≥ 5, so 3n^2 + 7n + 8 = n^θ(1).

Order of Quantifiers f(n) = θ(g(n)) ?

Understand Quantifiers!!! "There is one girl who is loved by every boy" is not the same as "every boy loves some girl" — in the second, it could be a separate girl for each boy. (Figure: Sam–Mary, Bob–Beth, John–Marilyn Monroe, Fred–Ann.)

Order of Quantifiers: No! In f(n) = θ(g(n)) it cannot be a different c_1 and c_2 for each n; the same constants must work for all sufficiently large n.

Other Notations: Theta, f(n) = θ(g(n)) means f(n) ≈ c·g(n); BigOh, f(n) = O(g(n)) means f(n) ≤ c·g(n); Omega, f(n) = Ω(g(n)) means f(n) ≥ c·g(n); Little Oh, f(n) = o(g(n)) means f(n) << c·g(n); Little Omega, f(n) = ω(g(n)) means f(n) >> c·g(n).

BigOh Notation: n^2 = O(n^3), but n^3 ≠ O(n^2).

BigOh Notation is odd: writing O(n^2) = O(n^3) is fine, but O(n^3) = O(n^2) is not, so the "=" is not symmetric. The standard notation is f(n) = O(g(n)); writing f(n) ≤ O(g(n)) stresses that one function dominates another; writing f(n) ∈ O(g(n)) stresses that the function is a member of a class.

The Time Complexity of an Algorithm Specifies how the running time depends on the size of the input.

Purpose?

Purpose To estimate how long a program will run. To estimate the largest input that can reasonably be given to the program. To compare the efficiency of different algorithms. To help focus on the parts of code that are executed the largest number of times. To choose an algorithm for an application.

Time Complexity Is a Function: it specifies how the running time depends on the size of the input — a function mapping the "size" of the input to the "time" T(n) executed.

Definition of Time?

Definition of Time # of seconds (machine dependent). # lines of code executed. # of times a specific operation is performed (e.g., addition). Which?

Definition of Time # of seconds (machine dependent). # lines of code executed. # of times a specific operation is performed (e.g., addition). These are all reasonable definitions of time, because they are within a constant factor of each other.
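As an aside (not from the slides), a small Python sketch can make the point concrete: counting a specific operation, counting lines executed, and measuring seconds all grow together, differing only by constant factors. The bookkeeping here is my own illustration.

```python
import time

def sum_array(a):
    """Sum the array while counting additions and (roughly) lines executed."""
    total, additions, lines = 0, 0, 0
    for x in a:
        total += x          # one addition
        additions += 1
        lines += 2          # roughly: the loop line plus the body line
    return total, additions, lines

for size in (10_000, 100_000, 1_000_000):
    a = list(range(size))
    start = time.perf_counter()
    _, additions, lines = sum_array(a)
    seconds = time.perf_counter() - start
    print(size, additions, lines, round(seconds, 4))   # all three measures scale linearly
```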

Size of Input Instance? 83920

Size of Input Instance (for the integer 83920): size of paper — n = 2 in^2; # of bits — n = 17 bits; # of digits — n = 5 digits; value — n = 83,920. Which are reasonable?

Size of paper (n = 2 in^2): intuitive.

# of bits (n = 17 bits): formal.

# of digits (n = 5 digits): also reasonable, since # of bits = 3.32 × # of digits.

Value (n = 83,920): unreasonable, since # of bits = log_2(Value), i.e. Value = 2^{# of bits}.

Two Example Algorithms. Sum N array entries: A(1) + A(2) + A(3) + ... Factor the value N: input N = 5917, output N = 97·61, by trying N/2, N/3, N/4, .... Time?

Both take Time = N.

Is it reasonable to describe both as "Time = N"?

No! One is considered fast and the other slow!

What is the size of the input instance in each case?

Summing: size n = N (the number of entries). Factoring: size n = log N (the number of bits of N).

What is the time as a function of the input size n?

Summing: Time = N = n. Factoring: Time = N = 2^n.

Linear vs. Exponential Time!
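To make the contrast concrete, here is a rough Python sketch (my own, not from the slides) of the two algorithms. Summing does about N additions, which is linear in its input size n = N; trial division does up to about N divisions, which is exponential (≈ 2^n) in the input size n = log N bits.

```python
def sum_entries(a):
    """Sum N array entries A(1) + A(2) + ...: about N additions."""
    total = 0
    for x in a:
        total += x
    return total

def factor(N):
    """Factor the value N by trying N/2, N/3, N/4, ...: up to about N steps."""
    for d in range(2, N):
        if N % d == 0:
            return d, N // d
    return N, 1   # N is prime (or too small to split)

print(sum_entries(list(range(1, 11))))   # input size n = N = 10 entries
print(factor(5917))                      # -> (61, 97); input size n = log2(5917), about 13 bits
```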

Size of Input Instance? 14,23,25,30,31,52,62,79,88,98

Size of Input Instance (for the list 14,23,25,30,31,52,62,79,88,98): # of elements — n = 10 elements.

Is this reasonable?

Roughly reasonable: if each element has size c, then # of bits = c × # of elements.

Reasonable: if each element is in [1..n], each element has size log n, so # of bits = n·log n ≈ n.

Time Complexity Is a Function: it specifies how the running time depends on the size of the input — a function mapping the number of bits n needed to represent the input to the number of operations T(n) executed.

Which Input of size n? There are 2 n inputs of size n. Which do we consider for the time T(n)?

Which Input of size n? Typical Input Average Case Worst Case

Which input of size n? Typical input — but what is typical? Average case — for what distribution? Worst case — bounds the time for all inputs, and is the easiest to compute.

Analogy: what is the height of the tallest person in the class? To show the answer is bigger than some value, you need to look at only one person; to show it is smaller than some value, you need to look at every person.

Time Complexity of an Algorithm. The time complexity of an algorithm is the largest time required on any input of size n. O(n^2): prove that for every input of size n, the algorithm takes no more than c·n^2 time. Ω(n^2): find one input of size n for which the algorithm takes at least this much time. θ(n^2): do both.

Time Complexity of a Problem. The time complexity of a problem is the time complexity of the fastest algorithm that solves the problem. O(n^2): provide an algorithm that solves the problem in no more than this time. Ω(n^2): prove that no algorithm can solve it faster. θ(n^2): do both.

Adding: The Classic Techniques. Evaluating ∑_{i=1..n} f(i).

Gauss: the arithmetic sum ∑_{i=1..n} i = 1 + 2 + 3 + ... + n = ?

    1   +   2   +   3   + ... + (n-1) +   n   = S
    n   + (n-1) + (n-2) + ... +   2   +   1   = S
  (n+1) + (n+1) + (n+1) + ... + (n+1) + (n+1) = 2S
  n·(n+1) = 2S, so S = n(n+1)/2.

Let’s restate this argument using a geometric representation Algebraic argument

1 + 2 + 3 + ... + n = number of white dots (a staircase of dots).

1 + 2 + 3 + ... + n = number of white dots = number of yellow dots (the same staircase flipped over).

Together the white and yellow dots fill an n by (n+1) grid: there are n(n+1) dots in the grid, so 1 + 2 + ... + n = n(n+1)/2.

Note: 1 + 2 + ... + n = n(n+1)/2 = θ(# of terms · last term).

Gauss: the arithmetic sum ∑_{i=1..n} i = 1 + 2 + 3 + ... + n = θ(# of terms · last term). This pattern is true whenever the terms increase slowly.
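A quick numerical aside (mine, not from the slides) checking that the arithmetic sum matches both the exact formula n(n+1)/2 and the pattern θ(# of terms · last term):

```python
# The arithmetic sum 1 + 2 + ... + n equals n(n+1)/2 and is Theta(n * n).
for n in (10, 100, 1000, 10000):
    s = sum(range(1, n + 1))
    assert s == n * (n + 1) // 2
    print(n, s, s / (n * n))   # the ratio to (# of terms * last term) settles at 1/2
```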

Geometric sum: ∑_{i=0..n} 2^i = 1 + 2 + 4 + ... + 2^n = ?

Geometric sum: ∑_{i=0..n} 2^i = 1 + 2 + 4 + ... + 2^n = 2·(last term) − 1.

Geometric sum with ratio r: ∑_{i=0..n} r^i = r^0 + r^1 + r^2 + ... + r^n = ?

Derivation: S = 1 + r + r^2 + ... + r^n, so r·S = r + r^2 + ... + r^{n+1}; subtracting, r·S − S = r^{n+1} − 1, hence S = (r^{n+1} − 1)/(r − 1).

∑_{i=0..n} r^i = (r^{n+1} − 1)/(r − 1). What happens when r > 1?

When r > 1, ∑_{i=0..n} r^i = (r^{n+1} − 1)/(r − 1) = θ(r^n): the biggest term.

Geometric increasing: ∑_{i=0..n} r^i = r^0 + r^1 + r^2 + ... + r^n = θ(biggest term). This is true whenever the terms increase quickly.
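Another small aside (mine): for r > 1 the geometric sum matches the closed form (r^{n+1} − 1)/(r − 1) and stays within a constant factor of its biggest (last) term.

```python
# For r > 1, sum_{i=0..n} r^i = (r^(n+1) - 1)/(r - 1) = Theta(r^n).
n = 50
for r in (1.5, 2.0, 10.0):
    s = sum(r**i for i in range(n + 1))
    closed = (r**(n + 1) - 1) / (r - 1)
    print(r, s / closed, s / r**n)   # first ratio ~1; second ratio ~r/(r-1), a constant
```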

Geometric sum: and what happens when r < 1? ∑_{i=0..n} r^i ≤ 1/(1 − r).

When r < 1, ∑_{i=0..n} r^i ≤ 1/(1 − r) = θ(1): the biggest term is the first, and it dominates.

Bounded tail: ∑_{i=0..n} r^i = r^0 + r^1 + r^2 + ... + r^n = θ(1). This is true whenever the terms decrease quickly.

Harmonic sum: ∑_{i=1..n} 1/i = 1/1 + 1/2 + 1/3 + 1/4 + 1/5 + ... + 1/n = ?

Sum of a shrinking function: f(i) = 1 gives ∑_{i=1..n} f(i) = n.

Sum of a shrinking function: f(i) = 1/i^{1/2} gives ∑_{i=1..n} f(i) ≈ n^{1/2} (times a constant).

Sum of a shrinking function: f(i) = 1/2^i gives ∑_{i=1..n} f(i) ≤ 2.

Sum of a shrinking function: f(i) = 1/i gives ∑_{i=1..n} f(i) = ?

Harmonic Sum

Harmonic sum: ∑_{i=1..n} 1/i = 1/1 + 1/2 + 1/3 + 1/4 + 1/5 + ... + 1/n = θ(log n).
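A numerical aside (mine): the harmonic sum tracks the natural logarithm, confirming θ(log n).

```python
import math

# The harmonic sum 1 + 1/2 + ... + 1/n is ln(n) + Theta(1).
for n in (10, 1000, 100000):
    h = sum(1.0 / i for i in range(1, n + 1))
    print(n, round(h, 4), round(h - math.log(n), 4))   # the difference settles near 0.577...
```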

Approximating a Sum by Integrating: the area under the curve approximates the sum, ∑_{i=1..n} f(i) ≈ ∫_{x=1..n} f(x) dx.

Arithmetic sums: d/dx [x^{c+1}/(c+1)] = x^c, so ∫_{x=1..n} x^c dx ≈ n^{c+1}/(c+1), and hence ∑_{i=1..n} i^c ≈ n^{c+1}/(c+1) = θ(n · n^c) = θ(# of terms · last term). True whenever the terms increase slowly.

Geometric sums: d/dx [e^{ln(b)·x}/ln(b)] = e^{ln(b)·x} = b^x, so ∫_{x=1..n} b^x dx ≈ b^n/ln(b), and hence ∑_{i=1..n} b^i = θ(b^n) = θ(last term). True whenever the terms increase quickly.

Harmonic sum, approximated by integrating: ∑_{i=1..n} 1/i ≈ ∫_{x=1..n} 1/x dx = ln(n).

Problem: Integrating may be hard too.

Outline: I) What is the sum, e.g. ∑_{i=1..n} f(i). II) Math proofs of sums. III) Patterns of sums, e.g. ∑_{i=1..n} f(i) = θ(n · f(n)). IV) Intuition for why this is the sum. V) The ability to know the answer fast. I flash it up; if you follow, great. If not, don't panic.

Adding Made Easy We will now classify (most) functions f(i) into four classes: –Geometric Like –Arithmetic Like –Harmonic –Bounded Tail

Adding Made Easy. We will now classify (most) functions f(i) into four classes, and for each class we will give an easy rule for approximating its sum θ(∑_{i=1..n} f(i)).

Classifying Animals: Vertebrates — Birds, Mammals, Reptiles, Fish; Mammals — Dogs, Giraffe. Mammal ⇒ complex social networks. Dog ⇒ man's best friend.

Classifying Functions: Poly. vs. Exp. Example: f(n) = 8·2^n/n^100 + 2^{0.5n} + n^3. Since 2^{0.5n} << 2^n/n^100, the first term is significant, the second less significant, and n^3 is irrelevant, so f(n) = θ(2^n/n^100) ∈ 2^θ(n). For such f, ∑_{i=1..n} f(i) = θ(f(n)) = θ(2^n/n^100).

Adding Made Easy Four Classes of Functions

Adding Made Easy

Geometric Like: f(i) ∈ 2^{Ω(n)} ⇒ ∑_{i=1..n} f(i) = θ(f(n)). If the terms f(i) grow sufficiently quickly, then the sum is dominated by the largest term. Silly example: a sum whose terms grow quickly and end at 1,000,000,000 is ≈ 1,000,000,000.

Geometric Like: if the terms grow sufficiently quickly, the sum is dominated by the largest term. Classic example: ?

Classic example: ∑_{i=1..n} 2^i = 2^{n+1} − 1 ≈ 2·f(n).

For which functions f(i) is this true? How fast and how slowly can f grow?

Geometric Like: recall that for f(n) = r^n ∈ 2^θ(n) with r > 1, ∑_{i=1..n} r^i = (r^{n+1} − 1)/(r − 1) = θ(r^n) = θ(last term) = θ(f(n)). Lower extreme: ∑_{i=1..n} (1.0001)^i ≈ 10,000·(1.0001)^n = 10,000·f(n). Upper extreme: ∑_{i=1..n} (1000)^i ≈ 1.001·(1000)^n = 1.001·f(n).

For the lower extreme ∑_{i=1..n} (1.0001)^i ≈ 10,000·f(n): because the constant is sooo big, the statement is just barely true.

Can the terms grow even faster than the upper extreme ∑_{i=1..n} (1000)^i ≈ 1.001·f(n)?

Yes: ∑_{i=1..n} 2^{2^i} ≈ 2^{2^n} = 1·f(n). There is no upper extreme.

And all functions in between have the property as well.

Geometric Like, for f(n) = 8·2^n/n^100 + n^3: since ∑_{i=1..n} 2^i = 2·2^n − 2, the sum ∑_{i=1..n} f(i) = ∑_{i=1..n} (8·2^i/i^100 + i^3) is again dominated by its largest term, giving θ(2^n/n^100) = θ(f(n)).

Why, for f(n) = 8·2^n/n^100? We want ∑_{i=1..n} f(i) ≤ c·f(n). Easy step: it suffices that f(i) ≤ 0.51^{n−i}·f(n). Harder step: f(i) ≤ 0.51·f(i+1), because f(i+1)/f(i) = [8·2^{i+1}/(i+1)^100] / [8·2^i/i^100] = 2/(1 + 1/i)^100 ≥ 1/0.51 for sufficiently large i.

Chaining: f(i) ≤ 0.51·f(i+1) ≤ 0.51^2·f(i+2) ≤ 0.51^3·f(i+3) ≤ ... ≤ 0.51^{n−i}·f(n). E.g. for i = n, f(n) ≤ 1·f(n); for i = 0, f(0) ≤ 0.51^n·f(n).

Hence ∑_{i=1..n} f(i) ≤ ∑_{i=1..n} 0.51^{n−i}·f(n) = [∑_{i=1..n} 0.51^{n−i}]·f(n) = θ(1)·f(n).
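Here is a rough numerical sketch (my own, not from the slides) of the claim for f(i) = 8·2^i/i^100: the whole sum stays within a constant factor of the last term f(n). Each term f(i)/f(n) is computed in log-space to avoid overflow.

```python
import math

# For f(i) = 8 * 2^i / i^100, check that (sum of all terms)/(last term) stays bounded.
def sum_over_last(n):
    total = 0.0
    for i in range(1, n + 1):
        # f(i)/f(n) = 2^(i-n) * (n/i)^100, evaluated via exp/log
        total += math.exp((i - n) * math.log(2) + 100 * (math.log(n) - math.log(i)))
    return total

for n in (1000, 2000, 4000, 8000):
    print(n, round(sum_over_last(n), 3))
# The ratio settles near a constant (about 2).  For much smaller n the early terms of the
# sum still dominate, which is why the hidden Theta constant can be very large
# (the "sooo big" remark above).
```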

Geometric Like, summary of the example: f(n) = 8·2^n/n^100 + 2^{0.5n} + n^3 = θ(2^n/n^100) ∈ 2^θ(n), so ∑_{i=1..n} f(i) = θ(f(n)) = θ(2^n/n^100). The term 8·2^n/n^100 is significant, 2^{0.5n} is less significant (2^{0.5n} << 2^n), and n^3 is irrelevant.

Geometric Like, conclusion: if f(n) = c·b^{a·n}·n^d·log^e(n) with c > 0, a > 0, b > 1, d ∈ (−∞, ∞), and e ∈ (−∞, ∞), then f(n) ∈ 2^{Ω(n)} and ∑_{i=1..n} f(i) = θ(f(n)).

f(i)  2 Ω(n)  ∑ i=1..n f(i) = θ(f(n)) Geometric Like:

Does every function in 2^{Ω(n)} have the property f(i) ∈ 2^{Ω(n)} ⇒ ∑_{i=1..n} f(i) = θ(f(n))? Maybe not.

Functions that oscillate continually do not have the property. f(i)  2 Ω(n)  ∑ i=1..n f(i) = θ(f(n)) Geometric Like:

Functions expressed with +, −, ×, ÷, exp, and log do not oscillate continually; they are well behaved for sufficiently large n. These do have the property.

Adding Made Easy Done

Adding Made Easy

Arithmetic Like: f(i) = n^{θ(1)−1} ⇒ ∑_{i=1..n} f(i) = θ(n·f(n)). If most of the terms f(i) have roughly the same value, then the sum is roughly the number of terms, n, times this value. Silly example: 1,001 + 1,002 + 1,003 + 1,004 + 1,005 ≈ 5 · 1,000.

If most of the terms f(i) have roughly the same value, then the sum is roughly the number of terms, n, times this value. Another silly example: ∑ i=1..n 1 = n · 1 f(i) = n θ(1)-1  ∑ i=1..n f(i) = θ(n·f(n)) Arithmetic Like:

Is the statement true for ∑_{i=1..n} i = 1 + 2 + 3 + ... + n?

The terms are not roughly the same.

But half of the terms (those with i ≥ n/2) are roughly the same: each is at least n/2, so ∑_{i=1..n} i ≥ (n/2)·(n/2) = n^2/4.

And the sum is roughly the number of terms times the last term: ∑_{i=1..n} i = θ(n · n).

Is the statement true for ∑_{i=1..n} i^2 = 1 + 4 + 9 + ... + n^2? Again the terms are not roughly the same.

Again half of the terms are roughly the same: each of the last n/2 terms is at least (n/2)^2 = n^2/4, so ∑_{i=1..n} i^2 ≥ (n/2)·(n^2/4).

Hence ∑_{i=1..n} i^2 = θ(n · n^2).

Arithmetic Like, the picture: ∑_{i=1..n} f(i) ≈ the area under the curve f.

A small square fits under the curve, and the curve fits inside a big square: area of small square ≤ ∑_{i=1..n} f(i) ≈ area under curve ≤ area of big square.

That is, (n/2)·f(n/2) = area of small square ≤ ∑_{i=1..n} f(i) ≤ area of big square = n·f(n).

So the sum is θ(n·f(n)) as long as the lower bound (n/2)·f(n/2) is also θ(n·f(n)).

When is (n/2)·f(n/2) = θ(n·f(n))?

Arithmetic Like, f(i) = i^2: for ∑_{i=1..n} i^2, the key property to check is f(n/2) = θ(f(n)).

Here f(n/2) = (n/2)^2 = n^2/4 = θ(n^2) = θ(f(n)), so ∑_{i=1..n} i^2 = θ(n·f(n)) = θ(n·n^2).

What about a general power f(i) = i^r, i.e. ∑_{i=1..n} f(i) = ?

For ∑_{i=1..n} i^r = 1^r + 2^r + 3^r + ... + n^r, the key property is again f(n/2) = θ(f(n)).

Here f(n/2) = (n/2)^r = n^r/2^r = θ(n^r) = θ(f(n)), so ∑_{i=1..n} i^r = θ(n·f(n)) = θ(n·n^r).

What about f(i) = 2^i, i.e. ∑_{i=1..n} 2^i = 2 + 4 + 8 + ... + 2^n? The key property to check is f(n/2) = θ(f(n)).

But f(n/2) = 2^{n/2} is not θ(2^n) = θ(f(n)); the rule does not apply, and indeed ∑_{i=1..n} 2^i = θ(2^n), not θ(n·2^n).

Arithmetic Like, extremes: middle extreme ∑_{i=1..n} 1 = n·1 = n·f(n); upper extreme ∑_{i=1..n} i^1000 ≈ (1/1001)·n^1001 = (1/1001)·n·f(n); and all functions in between.
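A numerical aside (mine): for f(i) = i^r with r > −1 the sum really is a constant fraction of n·f(n); the ratio tends to 1/(r+1).

```python
# For f(i) = i^r with r > -1, sum_{i=1..n} i^r = Theta(n * f(n)).
def ratio(r, n):
    return sum(i**r for i in range(1, n + 1)) / (n * n**r)

for r in (0, 1, 2, 0.5, -0.5):
    print(r, round(ratio(r, 10_000), 4))   # about 1/(r+1): 1, 0.5, 0.333, 0.667, 2
```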

Adding Made Easy Half done

Sum of a shrinking function: f(i) = 1 gives ∑_{i=1..n} f(i) = n.

Sum of a shrinking function: f(i) = 1/i^{1/2} gives ∑_{i=1..n} f(i) ≈ n^{1/2} (times a constant).

Sum of a shrinking function: f(i) = 1/i gives ∑_{i=1..n} f(i) = θ(log n).

Sum of a shrinking function: f(i) = 1/2^i gives ∑_{i=1..n} f(i) ≤ 2.

Arithmetic Like: does the statement ∑_{i=1..n} f(i) = θ(n·f(n)) also hold for functions f(i) that shrink?

Does it hold for the harmonic sum ∑_{i=1..n} 1/i = 1/1 + 1/2 + 1/3 + 1/4 + 1/5 + ... + 1/n?

The left side is ∑_{i=1..n} 1/i = θ(log n).

The right side is θ(n·f(n)) = θ(n·(1/n)) = θ(1).

θ(log n) ≠ θ(1): no, the statement does not hold for the harmonic sum.

Adding Made Easy not included

Does the statement hold for an "almost harmonic" sum ∑_{i=1..n} 1/i^{1−ε}, which shrinks slightly more slowly than the harmonic sum (ε a small positive constant)?

The left side is ∑_{i=1..n} 1/i^{1−ε} = θ(n^ε).

The right side is θ(n·f(n)) = θ(n·(1/n^{1−ε})) = θ(n^ε).

θ(n^ε) = θ(n^ε): yes, here the statement does hold.
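A numerical aside (mine, with ε = 0.1 chosen purely for illustration): the harmonic sum grows like log n, far from θ(n·f(n)) = θ(1), while ∑ 1/i^{1−ε} grows like n^ε, which is θ(n·f(n)).

```python
import math

def shrinking_sum(eps, n):
    """sum_{i=1..n} 1 / i^(1 - eps)."""
    return sum(i**(eps - 1.0) for i in range(1, n + 1))

n = 1_000_000
print(round(shrinking_sum(0.0, n), 2), round(math.log(n), 2))   # harmonic: ~ log n, not Theta(n*f(n)) = Theta(1)
print(round(shrinking_sum(0.1, n), 2), round(n**0.1, 2))        # almost harmonic: Theta(n^0.1) = Theta(n*f(n))
```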

Arithmetic Like, extremes: lower extreme ∑_{i=1..n} 1/i^{1−ε} = θ(n^ε) = θ(n·f(n)); middle extreme ∑_{i=1..n} 1 = n·1 = n·f(n); and all functions in between.

Also the upper extreme ∑_{i=1..n} i^1000 ≈ (1/1001)·n^1001 = (1/1001)·n·f(n).

Arithmetic Like, conclusion: if f(n) = c·b^{a·n}·n^d·log^e(n) with c > 0, a = 0 (or b = 1), d > −1, and e ∈ (−∞, ∞), then f(n) = n^{−1+θ(1)} and ∑_{i=1..n} f(i) = θ(n·f(n)). (For +, −, ×, ÷, exp, log functions f(n).)

Adding Made Easy Done

Adding Made Easy Harmonic

Harmonic Sum: ∑_{i=1..n} 1/i = 1/1 + 1/2 + 1/3 + 1/4 + 1/5 + ... + 1/n = θ(log n).

Adding Made Easy

Bounded Tail: f(n) ∈ n^{−1−Ω(1)} ⇒ ∑_{i=1..n} f(i) = θ(1). If the terms f(i) decrease towards zero sufficiently quickly, then the sum is a constant. The classic example: ∑_{i=0..n} 1/2^i = 1 + 1/2 + 1/4 + 1/8 + ... ≤ 2.

If f(i) decays even faster, the tail of the sum is even more bounded: ∑_{i=1..n} 1/2^{2^i} = θ(1).

Upper extreme: ∑_{i=1..n} 1/i^{1+ε} = θ(1) (for any fixed ε > 0).

There is no lower extreme — e.g. ∑_{i=1..n} 1/2^{2^i} = θ(1) — and all functions in between also have the property.

Bounded Tail, conclusion: if f(n) = c·n^d·log^e(n) with c > 0, a = 0 (or b = 1), d < −1, and e ∈ (−∞, ∞), then f(n) ∈ n^{−1−Ω(1)} and ∑_{i=1..n} f(i) = θ(1).

Adding Made Easy Done

Adding Made Easy: missing functions, e.g. 1/(n·log n) and log n / n, fall between the classes.

Adding Made Easy, summary (for +, −, ×, ÷, exp, log functions f(n)): Geometric Like — if f(n) ∈ 2^{Ω(n)}, then ∑_{i=1..n} f(i) = θ(f(n)). Arithmetic Like — if f(n) = n^{θ(1)−1}, then ∑_{i=1..n} f(i) = θ(n·f(n)). Harmonic — if f(n) = 1/n, then ∑_{i=1..n} f(i) = log_e n + θ(1). Bounded Tail — if f(n) ∈ n^{−1−Ω(1)}, then ∑_{i=1..n} f(i) = θ(1). This may seem confusing, but it is really not; it should help you compute most sums easily.

Recurrence Relations T(1) = 1 T(n) = a T(n/b) + f(n)

Recurrence Relations = Time of a Recursive Program. Recurrence relations arise from the timing of recursive programs. Let T(n) be the number of "Hi"s printed on an input of "size" n by:

  procedure Eg(int n)
    if (n ≤ 1) then
      put "Hi"
    else
      loop i = 1..f(n): put "Hi"
      loop i = 1..a: Eg(n/b)

Given size 1, the program outputs T(1) = 1 "Hi".

Given size n, the stack frame itself outputs f(n) "Hi"s.

Each recursive call on an instance of size n/b generates T(n/b) "Hi"s.

Recursing a times generates a·T(n/b) "Hi"s.

For a total of T(1) = 1 and T(n) = a·T(n/b) + f(n) "Hi"s.
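A runnable sketch (my own translation of the pseudocode above into Python): it counts the "Hi"s directly and checks that the count satisfies T(1) = 1, T(n) = a·T(n/b) + f(n). The choices a = 2, b = 2, f(n) = n are illustrative, not fixed by the slides.

```python
def eg_count(n, a=2, b=2, f=lambda m: m):
    """Number of 'Hi's printed by procedure Eg on an input of size n."""
    if n <= 1:
        return 1                      # base case: one 'Hi'
    hi = f(n)                         # the loop i = 1..f(n) prints f(n) 'Hi's
    for _ in range(a):                # a recursive calls on instances of size n/b
        hi += eg_count(n // b, a, b, f)
    return hi

def T(n, a=2, b=2, f=lambda m: m):
    """The recurrence T(1) = 1, T(n) = a*T(n/b) + f(n)."""
    return 1 if n <= 1 else a * T(n // b, a, b, f) + f(n)

for n in (1, 4, 16, 64, 256):
    assert eg_count(n) == T(n)
    print(n, eg_count(n))             # with a = b = 2 and f(n) = n this grows like n*log(n)
```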

Solving Technique 1: Guess and Verify. Recurrence relation: T(1) = 1 and T(n) = 4T(n/2) + n. Guess: G(n) = 2n^2 − n. Verify: left-hand side T(1) = 2·1^2 − 1 = 1 ✓; left-hand side T(n) = 2n^2 − n, right-hand side 4T(n/2) + n = 4·[2·(n/2)^2 − (n/2)] + n = 2n^2 − 2n + n = 2n^2 − n ✓.

Solving Technique 2: Guess the Form and Calculate the Coefficients. Recurrence relation: T(1) = 1 and T(n) = 4T(n/2) + n. Guess: G(n) = a·n^2 + b·n + c. Verify: left-hand side T(1) = a + b + c and T(n) = a·n^2 + b·n + c; right-hand side 4T(n/2) + n = 4·[a·(n/2)^2 + b·(n/2) + c] + n = a·n^2 + (2b + 1)·n + 4c.

Matching the constant terms: c = 4c, so c = 0.

Matching the linear terms: b = 2b + 1, so b = −1.

Matching the quadratic terms: a = a, which gives no constraint.

Matching the base case: a + b + c = 1, i.e. a − 1 + 0 = 1, so a = 2. Hence G(n) = 2n^2 − n.
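A quick check (mine): for powers of two, unrolling the recurrence really does give 2n^2 − n.

```python
def T(n):
    """T(1) = 1, T(n) = 4*T(n/2) + n, defined here on powers of 2."""
    return 1 if n == 1 else 4 * T(n // 2) + n

for k in range(11):
    n = 2**k
    assert T(n) == 2 * n * n - n
print("G(n) = 2n^2 - n matches T(n) for n = 1, 2, 4, ..., 1024")
```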

Solving Technique 3: Approximate the Form and Calculate the Exponent. Recurrence relation: T(1) = 1 and T(n) = a·T(n/b) + f(n). First guess: which of a·T(n/b) and f(n) is bigger?

If a·T(n/b) << f(n), simplify to T(n) ≈ f(n). In this case the answer is easy: T(n) = θ(f(n)).

If a·T(n/b) >> f(n), simplify to T(n) ≈ a·T(n/b). In this case the answer is harder.

For T(n) = a·T(n/b), guess G(n) = c·n^α. Verify: left-hand side c·n^α, right-hand side a·[c·(n/b)^α] = c·a·b^{−α}·n^α, so we need 1 = a·b^{−α}, i.e. b^α = a, i.e. α·log b = log a, i.e. α = log a / log b. Hence T(n) = c·n^{log a / log b}.

Example: T(n) = 4T(n/2) gives α = log 4 / log 2 = 2, so T(n) = c·n^2.

Summary: if f(n) is bigger, then T(n) = θ(f(n)); if a·T(n/b) is bigger, then T(n) = θ(n^{log a / log b}). And if a·T(n/b) ≈ f(n), what is T(n) then?

Technique 4: Decorate the Tree. For T(n) = a·T(n/b) + f(n): the root of the recursion tree does f(n) work and has a children, each of which is a recursion tree for T(n/b); the base case is T(1) = 1.

Expanding one more level: each child does f(n/b) work and itself has a children, each a recursion tree for T(n/b^2); and so on down the tree.

Evaluating T(n) = a·T(n/b) + f(n) level by level: at level i the instances have size n/b^i, so the bottom level h is the one where n/b^h = 1 (the base case), i.e. b^h = n, i.e. h = log n / log b.

Each stack frame at level i does f(n/b^i) work, and there are a^i stack frames at level i; at the bottom level there are a^h = a^{log n / log b} = n^{log a / log b} base cases, each doing T(1) work.

  Level              Instance size   Work in stack frame   # stack frames            Work in level
  0                  n               f(n)                  1                         1 · f(n)
  1                  n/b             f(n/b)                a                         a · f(n/b)
  2                  n/b^2           f(n/b^2)              a^2                       a^2 · f(n/b^2)
  ...
  i                  n/b^i           f(n/b^i)              a^i                       a^i · f(n/b^i)
  ...
  h = log n / log b  n/b^h = 1       T(1)                  a^h = n^{log a / log b}   n^{log a / log b} · T(1)

Total work: T(n) = ∑_{i=0..h} a^i · f(n/b^i).

Evaluating T(n) = a·T(n/b) + f(n) = ∑_{i=0..h} a^i · f(n/b^i): if this is a geometric sum, then (recall) ∑_{i=0..n} x^i = θ(max(first term, last term)).
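A rough sketch (mine) that evaluates the level-by-level sum directly for a = 4, b = 2 and two illustrative choices of f, showing which end of the sum dominates:

```python
def level_work(n, a, b, f):
    """The per-level work a^i * f(n/b^i) for i = 0..h, assuming n is a power of b."""
    work, i = [], 0
    while n >= 1:
        work.append(a**i * f(n))
        if n == 1:
            break
        n //= b
        i += 1
    return work

n, a, b = 2**20, 4, 2
top = level_work(n, a, b, lambda m: m**3)     # f(n) = n^3: c = 3 > log a / log b = 2
bottom = level_work(n, a, b, lambda m: m)     # f(n) = n:   c = 1 < 2
print(top[0] / sum(top))        # ~0.5: the top level alone is a constant fraction, so T(n) = Theta(n^3)
print(bottom[-1] / sum(bottom)) # ~0.5: the base-case level is a constant fraction, so T(n) = Theta(n^2)
```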

In the level-by-level table above, the total is therefore dominated either by the top level (work 1·f(n)) or by the base cases (work n^{log a / log b}·T(1)).

Evaluating T(n) = a·T(n/b) + f(n): is the sum geometric? Simplify by letting f(n) = n^c·log^k n. Then T(n) = ∑_{i=0..h} a^i·f(n/b^i) = ∑_{i=0..h} a^i·(n/b^i)^c·log^k(n/b^i) = n^c·∑_{i=0..h} 2^{[log a − c·log b]·i}·log^k(n/b^i), since a^i = 2^{[log a]·i} and (1/b^i)^c = 2^{[−c·log b]·i}. Cases:
–c > log a / log b (log a − c·log b < 0): the terms decrease geometrically, so the sum is θ(first term) = θ(f(n)).
–c = log a / log b and k > −1: an arithmetic-like sum, θ(any term · # of terms) = θ(f(n)·log n).
–c = log a / log b and k = −1: here log^k(n/b^i) = 1/(log n − i·log b), which read backwards is ≈ 1/i, a harmonic sum, giving θ(n^c·loglog n).
–c = log a / log b and k < −1: a bounded tail, giving θ(n^{log a / log b}).
–c < log a / log b (log a − c·log b > 0): the terms increase geometrically, so the sum is θ(last term) = θ(n^{log a / log b}).

Evaluating T(n) = 4T(n/2) + n^3/log^5 n.

Time for the top level: f(n) = n^3/log^5 n.

Time for the base cases: θ(n^{log a / log b}) = θ(n^{log 4 / log 2}) = θ(n^2).

Which dominates? c = 3 > 2 = log a / log b, so the top level dominates.

Hence T(n) = θ(top) = θ(f(n)) = θ(n^3/log^5 n).

Evaluating T(n) = 4T(n/2) + 2^n: the top level f(n) = 2^n is even bigger compared with the base cases θ(n^2), so the time is even more dominated by the top level of the recursion: T(n) = θ(top) = θ(f(n)) = θ(2^n).

Evaluating T(n) = 4T(n/2) + n·log^5 n: top level f(n) = n·log^5 n, base cases θ(n^{log 4 / log 2}) = θ(n^2); c = 1 < 2 = log a / log b, so T(n) = θ(base cases) = θ(n^{log a / log b}) = θ(n^2).

Evaluating T(n) = 4T(n/2) + log^5 n: c = 0 < 2 = log a / log b, so T(n) = θ(base cases) = θ(n^2).

Evaluating T(n) = 4T(n/2) + n^2: c = 2 = log a / log b and k = 0 > −1, so T(n) = θ(f(n)·log n) = θ(n^2·log n).

Evaluating T(n) = 4T(n/2) + n^2·log^5 n: c = 2 = log a / log b and k = 5 > −1, so T(n) = θ(f(n)·log n) = θ(n^2·log^6 n).

Evaluating T(n) = 4T(n/2) + n^2/log^5 n: c = 2 = log a / log b and k = −5 < −1, so T(n) = θ(base cases) = θ(n^{log a / log b}) = θ(n^2).

Evaluating T(n) = 4T(n/2) + n^2/log n: c = 2 = log a / log b and k = −1, so T(n) = θ(n^c·loglog n) = θ(n^2·loglog n). (We did not do this case before.)
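The case analysis above can be packaged as a small helper. This is my own sketch (the function name and the string output are mine, not a standard library routine) for f(n) = n^c·log^k n:

```python
import math

def solve(a, b, c, k=0):
    """Theta-class of T(n) = a*T(n/b) + n^c * log^k(n), following the cases above."""
    crit = math.log(a, b)                          # log a / log b
    if math.isclose(c, crit):
        if k > -1:
            return f"Theta(n^{c} log^{k + 1} n)"   # an extra log factor
        if k == -1:
            return f"Theta(n^{c} loglog n)"
        return f"Theta(n^{crit:g})"                # k < -1: base cases dominate
    if c > crit:
        return f"Theta(n^{c} log^{k} n)"           # top level dominates: Theta(f(n))
    return f"Theta(n^{crit:g})"                    # c < crit: base cases dominate

print(solve(4, 2, 3, -5))   # T(n) = 4T(n/2) + n^3/log^5 n -> Theta(n^3 log^-5 n)
print(solve(4, 2, 1, 5))    # T(n) = 4T(n/2) + n log^5 n   -> Theta(n^2)
print(solve(4, 2, 2))       # T(n) = 4T(n/2) + n^2         -> Theta(n^2 log^1 n)
print(solve(4, 2, 2, -1))   # T(n) = 4T(n/2) + n^2/log n   -> Theta(n^2 loglog n)
```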

Evaluating T_2(n) = 4·T_2(n/2 + n^{1/2}) + n^3: this is sufficiently close to T(n) = 4T(n/2) + n^3 = θ(n^3), so T_2(n) = θ(n^3) as well.

Evaluating T(n) = a·T(n − b) + f(n): at level i the instance size is n − i·b and there are a^i stack frames, each doing f(n − i·b) work; the base case (size 0) is reached at level h = n/b, so there are a^{n/b} base cases — exponentially many. Such recurrences are likely dominated by the base cases.

Evaluating T(n) = 1·T(n − b) + f(n): there is one stack frame per level, h = n/b levels, and the total work is T(n) = ∑_{i=0..h} f(b·i), which (by Adding Made Easy) is θ(f(n)) or θ(n·f(n)).

End Restart Relevant Mathematics Iterative Algorithms