Mathematics Review and Asymptotic Notation

CS 2133: Data Structures

Arithmetic Series Review
1 + 2 + 3 + ... + n = ?
S_n = a + (a+d) + (a+2d) + (a+3d) + ... + (a+(n-1)d)
Reversed: S_n = (a+(n-1)d) + (a+(n-2)d) + (a+(n-3)d) + ... + a
Adding the two lines term by term:
2S_n = (2a+(n-1)d) + (2a+(n-1)d) + ... + (2a+(n-1)d)    [n identical terms]
S_n = (n/2)[2a + (n-1)d]
Consequently 1 + 2 + 3 + ... + n = n(n+1)/2.
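
A quick Python sanity check of the closed form (added to this transcript, not in the original slides). The last two parameter sets happen to mirror the practice problems on the next slide.

def arith_sum(a, d, n):
    # S_n = (n/2) * (2a + (n-1)d), the closed form derived above
    return n * (2 * a + (n - 1) * d) / 2

for a, d, n in [(1, 1, 100), (-3, 6, 50), (1, 0.5, 49)]:
    brute = sum(a + k * d for k in range(n))
    assert abs(brute - arith_sum(a, d, n)) < 1e-9

print(arith_sum(1, 1, 100))   # 5050.0, i.e. 100*101/2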

Problems
Find the sum of each of the following:
1 + 3 + 5 + ... + 121 = ?
The first 50 terms of -3 + 3 + 9 + 15 + ... = ?
1 + 3/2 + 2 + 5/2 + ... + 25 = ?

Geometric Series Review
1 + 2 + 4 + 8 + ... + 2^n
1 + 1/2 + 1/4 + ... + 2^-n
Theorem:
S_n = a + ar + ar^2 + ... + ar^n
rS_n = ar + ar^2 + ... + ar^n + ar^(n+1)
S_n - rS_n = a - ar^(n+1), so S_n = a(1 - r^(n+1)) / (1 - r) for r ≠ 1.
What about the case where -1 < r < 1? (Then r^(n+1) → 0 as n grows, so the series converges to a / (1 - r).)
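
A similar Python check (added, not from the deck) of the closed form and of the |r| < 1 limit.

def geom_sum(a, r, n):
    # S_n = a + ar + ... + ar^n = a(1 - r^(n+1)) / (1 - r), for r != 1
    return a * (1 - r ** (n + 1)) / (1 - r)

brute = sum(1 * 2 ** k for k in range(11))      # 1 + 2 + ... + 2^10
assert brute == geom_sum(1, 2, 10) == 2 ** 11 - 1

# For |r| < 1 the partial sums approach a / (1 - r):
print(geom_sum(3, 3/4, 200), 3 / (1 - 3/4))     # both ~12.0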

Geometric Problems
What is the sum of:
3 + 9/4 + 27/16 + ...
1/2 - 1/4 + 1/8 - 1/16 + ...

Harmonic Series
H_n = 1 + 1/2 + 1/3 + ... + 1/n
H_n = ln n + γ + o(1), where γ ≈ 0.5772 is Euler's constant.
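
A short numeric illustration (added) that H_n - ln n approaches Euler's constant γ ≈ 0.57721.

import math

def harmonic(n):
    # H_n = 1 + 1/2 + ... + 1/n by brute force
    return sum(1 / k for k in range(1, n + 1))

for n in (10, 1000, 100000):
    print(n, harmonic(n) - math.log(n))   # -> 0.6263..., 0.5777..., 0.57722...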

Just an Interesting Question
What is the optimal base to use in the representation of numbers ≤ n?
Example: with base x we need about log_x n digit slots, and each slot can hold one of x values. We minimize the total cost x · log_x n.
Writing x · log_x n = x · ln n / ln x and minimizing over x, the optimum is at x = e ≈ 2.718, so base 3 is the best integer base.
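
Assuming the cost model above (slots times values per slot), a quick numeric check (added): the cost function is minimized at x = e, and base 3 beats both base 2 and base 4.

import math

def cost(x, n=10**6):
    # digit slots needed (log_x n) times values per slot (x)
    return x * math.log(n) / math.log(x)

for x in (2, 3, math.e, 4, 10):
    print(x, round(cost(x), 2))
# e gives the minimum; among integers, base 3 is cheapest.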

Logarithm Review
ln = log_e is called the natural logarithm.
lg = log_2 is called the binary logarithm.
How many bits are required to represent the number n in binary? Answer: floor(lg n) + 1, for n ≥ 1.
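
A one-line Python check (added) that floor(lg n) + 1 matches Python's built-in int.bit_length.

import math

for n in (1, 2, 255, 256, 10**9):
    assert math.floor(math.log2(n)) + 1 == n.bit_length()
    print(n, n.bit_length())   # 1, 2, 8, 9, 30 bits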

Logarithm Rules
The logarithm to the base b of x, denoted log_b x, is defined as the number y such that b^y = x.
log_b(x1 · x2) = log_b x1 + log_b x2
log_b(x1 / x2) = log_b x1 - log_b x2
log_b x^c = c · log_b x
log_b x > 0 if x > 1
log_b x = 0 if x = 1
log_b x < 0 if 0 < x < 1

Additional Rules
For all real a > 0, b > 0, c > 0, and n:
log_b a = log_c a / log_c b
log_b (1/a) = -log_b a
log_b a = 1 / log_a b
a^(log_b n) = n^(log_b a)
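
A numeric spot check (added) of the change-of-base, reciprocal, inversion, and exponent-swap identities.

import math

a, b, c, n = 7.0, 2.0, 10.0, 5.0
log_b = lambda x, base: math.log(x) / math.log(base)

assert math.isclose(log_b(a, b), math.log(a, c) / math.log(b, c))   # change of base
assert math.isclose(log_b(1 / a, b), -log_b(a, b))                  # reciprocal
assert math.isclose(log_b(a, b), 1 / log_b(b, a))                   # log_b a = 1 / log_a b
assert math.isclose(a ** log_b(n, b), n ** log_b(a, b))             # a^(log_b n) = n^(log_b a)
print("all identities hold")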

Asymptotic Performance
In this course, we care most about the asymptotic performance of an algorithm: how does the algorithm behave as the problem size gets very large?
Running time
Memory requirements
Coming up: the asymptotic performance of two search algorithms, and a formal introduction to asymptotic notation.

Input Size
Time and space complexity are generally functions of the input size.
How we characterize input size depends on the problem:
Sorting: number of input items
Multiplication: total number of bits
Graph algorithms: number of nodes and edges
Etc.

Running Time
Number of primitive steps that are executed.
Except for the time to execute a function call, most statements require roughly the same amount of time:
y = m * x + b
c = 5 / 9 * (t - 32)
z = f(x) + g(y)
We can be more exact if need be.

Analysis
Worst case: provides an upper bound on running time; an absolute guarantee.
Average case: provides the expected running time. Very useful, but treat with care: what is "average"?
Random (equally likely) inputs
Real-life inputs

An Example: Insertion Sort

InsertionSort(A, n) {
    for i = 2 to n {
        key = A[i]
        j = i - 1
        while (j > 0) and (A[j] > key) {
            A[j+1] = A[j]
            j = j - 1
        }
        A[j+1] = key
    }
}
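
The pseudocode above uses 1-based arrays; here is a minimal runnable Python translation (added for reference, not part of the deck) using 0-based indexes.

def insertion_sort(A):
    # Sort A in place; mirrors the 1-based pseudocode with 0-based indexes.
    for i in range(1, len(A)):
        key = A[i]
        j = i - 1
        while j >= 0 and A[j] > key:
            A[j + 1] = A[j]   # shift larger elements one slot right
            j -= 1
        A[j + 1] = key
    return A

print(insertion_sort([5, 2, 4, 6, 1, 3]))   # [1, 2, 3, 4, 5, 6]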

Insertion Sort
What is the precondition for this loop?

InsertionSort(A, n) {
    for i = 2 to n {
        key = A[i]
        j = i - 1
        while (j > 0) and (A[j] > key) {
            A[j+1] = A[j]
            j = j - 1
        }
        A[j+1] = key
    }
}

Insertion Sort

InsertionSort(A, n) {
    for i = 2 to n {
        key = A[i]
        j = i - 1
        while (j > 0) and (A[j] > key) {
            A[j+1] = A[j]
            j = j - 1
        }
        A[j+1] = key
    }
}

How many times will this loop execute?

Insertion Sort: Statement Effort

InsertionSort(A, n) {                         Effort
    for i = 2 to n {                          c1*n
        key = A[i]                            c2*(n-1)
        j = i - 1                             c3*(n-1)
        while (j > 0) and (A[j] > key) {      c4*T
            A[j+1] = A[j]                     c5*(T-(n-1))
            j = j - 1                         c6*(T-(n-1))
        }                                     0
        A[j+1] = key                          c7*(n-1)
    }                                         0
}
T = t2 + t3 + ... + tn, where ti is the number of while-expression evaluations for the ith for-loop iteration.

Analyzing Insertion Sort
T(n) = c1*n + c2(n-1) + c3(n-1) + c4*T + c5(T - (n-1)) + c6(T - (n-1)) + c7(n-1)
     = c8*T + c9*n + c10
What can T be?
Best case: the inner loop body never executes, so ti = 1 and T(n) is a linear function of n.
Worst case: the inner loop body executes for all previous elements, so ti = i and T(n) is a quadratic function of n, since
T = 2 + 3 + ... + n = n(n+1)/2 - 1.
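
To see the two cases concretely, here is an instrumented Python version (added) that counts while-condition evaluations, the slide's T, for sorted and reversed inputs.

def count_T(A):
    # Count while-condition evaluations: T = t2 + ... + tn in the slide's notation.
    T = 0
    for i in range(1, len(A)):
        key, j = A[i], i - 1
        while True:
            T += 1                            # one evaluation of the while condition
            if not (j >= 0 and A[j] > key):
                break
            A[j + 1] = A[j]
            j -= 1
        A[j + 1] = key
    return T

n = 100
print(count_T(list(range(n))))           # sorted input:   99   = n - 1       (linear)
print(count_T(list(range(n, 0, -1))))    # reversed input: 5049 = n(n+1)/2 - 1 (quadratic)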

Analysis Simplifications
Ignore actual and abstract statement costs.
Order of growth is the interesting measure: the highest-order term is what counts.
Remember, we are doing asymptotic analysis: as the input size grows larger, it is the high-order term that dominates.

Upper Bound Notation
We say InsertionSort's run time is O(n^2); properly, we should say its run time is in O(n^2).
Read O as "Big-O" (you'll also hear it called "order").
In general, a function f(n) is O(g(n)) if there exist positive constants c and n0 such that f(n) ≤ c·g(n) for all n ≥ n0.
Formally: O(g(n)) = { f(n) : there exist positive constants c and n0 such that f(n) ≤ c·g(n) for all n ≥ n0 }.

Big O Example
Show, using the definition, that 5n + 4 ∈ O(n), where g(n) = n.
First we must find a c and an n0, then show that f(n) ≤ c·g(n) for every n ≥ n0.
Clearly 5n + 4 ≤ 6n whenever n ≥ 4, so c = 6 and n0 = 4 satisfy the requirements.
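
For experimenting with definition-style witnesses, a small helper (added). Note that sampling a finite range can refute a claimed (c, n0) pair but never prove one.

def check_O(f, g, c, n0, n_max=10**5):
    # Spot-check f(n) <= c*g(n) for sampled n >= n0 (evidence, not a proof).
    return all(f(n) <= c * g(n) for n in range(n0, n_max))

print(check_O(lambda n: 5*n + 4, lambda n: n, c=6, n0=4))    # True
print(check_O(lambda n: n*n,     lambda n: n, c=100, n0=1))  # False: n^2 is not O(n)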

Insertion Sort Is O(n^2)
Proof: Suppose the runtime is an^2 + bn + c. If any of a, b, c is less than 0, replace that constant with its absolute value. Then
an^2 + bn + c ≤ (a + b + c)n^2 + (a + b + c)n + (a + b + c) ≤ 3(a + b + c)n^2 for n ≥ 1.
Let c′ = 3(a + b + c) and let n0 = 1.
Question: Is InsertionSort O(n^3)? Is InsertionSort O(n)? (Yes to the first, since Big-O is only an upper bound; no to the second, since the worst case grows quadratically.)

Big O Fact
A polynomial of degree k is O(n^k).
Proof: Suppose f(n) = b_k·n^k + b_(k-1)·n^(k-1) + ... + b_1·n + b_0. Let a_i = |b_i|. Then
f(n) ≤ a_k·n^k + a_(k-1)·n^(k-1) + ... + a_1·n + a_0 ≤ (a_k + a_(k-1) + ... + a_0)·n^k for n ≥ 1,
so f(n) ∈ O(n^k) with c = a_k + ... + a_0 and n0 = 1.

Lower Bound Notation
We say InsertionSort's run time is Ω(n).
In general, a function f(n) is Ω(g(n)) if there exist positive constants c and n0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0.
Proof: Suppose the run time is an + b. Assume a and b are positive (what if b is negative?). Then a·n ≤ an + b for all n ≥ 1, so c = a and n0 = 1 work.

Asymptotic Tight Bound
A function f(n) is Θ(g(n)) if there exist positive constants c1, c2, and n0 such that c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0.
Theorem: f(n) is Θ(g(n)) iff f(n) is both O(g(n)) and Ω(g(n)).
Proof: someday.

 Notation (g) is the set of all functions f such that there exist positive constants c1, c2, and n0 such that 0  c1g(n)  f(n)  c2 g(n) for every n > nc c2 g(n) f(n) c1g(n)

Growth Rate Theorems
1. n^α ∈ O(n^β) iff α ≤ β (with α, β > 0), and n^α ∈ o(n^β) iff α < β.
2. log_b n ∈ o(n^α) for any b and any α > 0.
3. n^α ∈ o(c^n) for any α > 0 and c > 1.
4. log_a n ∈ O(log_b n) for any a and b.
5. c^n ∈ O(d^n) iff c ≤ d, and c^n ∈ o(d^n) iff c < d.
6. Any constant function f(n) = c is in O(1).

Big O Relationships
1. o(f) ⊆ O(f)
2. If f ∈ o(g) then O(f) ⊆ o(g).
3. If f ∈ O(g) then f(n) + g(n) ∈ O(g).
4. If f ∈ O(f′) and g ∈ O(g′) then f(n)·g(n) ∈ O(f′(n)·g′(n)).

Theorem: log(n!) ∈ Θ(n log n)
Case 1: n log n ∈ O(log(n!)).
log(n!) = log(n · (n-1) · (n-2) ··· 3 · 2 · 1)
        = log(n · (n-1) ··· (n/2) · (n/2 - 1) ··· 2 · 1)
        ≥ log((n/2) · (n/2) ··· (n/2) · 1 · 1 ··· 1)    [the first n/2 factors are each ≥ n/2, the rest each ≥ 1]
        = log((n/2)^(n/2)) = (n/2) log(n/2), which is Ω(n log n).
Case 2: log(n!) ∈ O(n log n).
log(n!) = log n + log(n-1) + log(n-2) + ... + log 2 + log 1
        ≤ log n + log n + log n + ... + log n = n log n.
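
A numeric illustration (added) using math.lgamma, where lgamma(n + 1) = ln(n!): the ratio log(n!) / (n log n) slowly approaches 1, consistent with Θ(n log n).

import math

for n in (10, 100, 10**4, 10**6):
    log_fact = math.lgamma(n + 1)           # ln(n!)
    print(n, log_fact / (n * math.log(n)))  # -> 0.65..., 0.79..., 0.90..., 0.92...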

The Little o
Theorem: If log(f) ∈ o(log(g)) and g(n) → ∞ as n → ∞, then f ∈ o(g).
Note the theorem does not hold with big O in place of little o, since log(n^2) = 2 log n ∈ O(log n) but n^2 ∉ O(n).
Application: Show that 2^n ∈ o(n^n). Taking logs, log_2(2^n) = n·log_2 2 = n and log_2(n^n) = n·log_2 n. Since n ∈ o(n log_2 n), the theorem implies 2^n ∈ o(n^n).
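
A quick look (added) at the ratio 2^n / n^n, which heads to 0 just as the theorem predicts.

for n in (2, 5, 10, 20):
    print(n, 2**n / n**n)
# 2 -> 1.0, 5 -> 0.01024, 10 -> 1.024e-07, 20 -> 1e-20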

Theorem: L'Hôpital's Rule
If lim f(n) = lim g(n) = ∞ as n → ∞ and the derivatives exist, then lim f(n)/g(n) = lim f′(n)/g′(n), provided the latter limit exists.
This is useful for comparing growth rates, since f ∈ o(g) exactly when lim f(n)/g(n) = 0.

Practical Complexity
(Four chart slides comparing the growth of common complexity functions at practical input sizes; the figures are not reproduced in this transcript.)

Other Asymptotic Notations
A function f(n) is o(g(n)) if for every positive constant c there exists n0 > 0 such that f(n) < c·g(n) for all n ≥ n0.
A function f(n) is ω(g(n)) if for every positive constant c there exists n0 > 0 such that c·g(n) < f(n) for all n ≥ n0.
Intuitively:
o() is like <
O() is like ≤
ω() is like >
Ω() is like ≥
Θ() is like =

Comparing Functions
Definition: The function f is said to dominate g if f(n)/g(n) increases without bound as n increases without bound; i.e., for any c > 0 there exists n0 > 0 such that f(n) > c·g(n) for every n > n0.

Little o Complexity
o(g) is the set of all functions that are dominated by g, i.e., the set of all f such that for every c > 0 there exists n_c > 0 such that f(n) < c·g(n) for every n > n_c.

Up Next
Solving recurrences:
The substitution method
The Master Theorem