Lecture 3: Analysis of Algorithms, Part II



Plan for today
- Finish Big Oh: more motivation and examples; do some limit calculations.
- Little Oh and Theta notation.
- Start recurrences:
  - Ex. 1, Ch. 1 in the textbook
  - Tower of Hanoi
  - Binary search recurrence

An Example of Proof by Induction for a Recursive Program (Ex. 1, Ch. 1 of the textbook)

Function:

    int g(int n)
        if n <= 1 return n
        else return 5*g(n-1) - 6*g(n-2)

Prove by induction that g(n) = 3^n - 2^n. (Check on the recurrence: the characteristic equation x^2 - 5x + 6 = 0 has roots 3 and 2, so 5·3^(n-1) - 6·3^(n-2) = 3^n and 5·2^(n-1) - 6·2^(n-2) = 2^n.)
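Before writing the induction proof, it is easy to sanity-check the claim by running the recursion and comparing it with the closed form. A minimal Python sketch (illustrative, not from the slides):

```python
def g(n):
    # Recurrence from the exercise; with 5*g(n-1) - 6*g(n-2) the
    # closed form 3^n - 2^n holds (roots 3 and 2 of x^2 - 5x + 6 = 0).
    if n <= 1:
        return n
    return 5 * g(n - 1) - 6 * g(n - 2)

# Base cases and several inductive steps match the closed form.
for n in range(12):
    assert g(n) == 3**n - 2**n
```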

More about Big Oh

If lim a(n)/b(n) exists and is a finite constant, then a(n) <= c·b(n) for all sufficiently large n, i.e. a(n) = O(b(n)). (The converse need not hold: a(n) = O(b(n)) does not require the limit to exist.)

This lets us estimate even when we do not know how to compute limits. Example: the Harmonic numbers

    H(n) = 1 + 1/2 + 1/3 + 1/4 + ... + 1/n < 1 + 1 + ... + 1 = n,

hence H(n) = O(n).

Complexity Classes

Big Oh lets us group algorithms with asymptotically comparable running times.

Most important in practice (tractable problems):
- Logarithmic and poly-logarithmic: log n, log^k n
- Polynomial: n, n^2, n^3, ..., n^k, ...

Intractable problems:
- Exponential: 2^n, 3^n, 2^(n^2), ...
- Higher (super-exponential): 2^(2^n), 2^(n^n), ...

Most Efficient: Logarithmic and Poly-logarithmic

Functions that are O(log n) or polynomial in log n. Examples:
- The base of the logarithm is irrelevant inside Big Oh: log_b n = O(log n) for any base b, since log_b n = (log n)/(log b).
- Harmonic series: 1 + 1/2 + 1/3 + ... + 1/n = O(log n).

The most efficient data structures have search time O(log n) or O(log^2 n).

Upper Bound for the Harmonic Series

H_n = 1 + 1/2 + 1/3 + ... + 1/n < 1 + 1 + ... + 1 = n, so H_n = O(n), but this estimate is not tight enough. To prove H_n = O(log n) we need more advanced calculus: comparing the sum with the area under the curve 1/x (a Riemann sum) gives H_n <= 1 + integral from 1 to n of dx/x = 1 + ln n.
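Numerically, H_n does track ln n closely, as the integral comparison predicts. A short Python check (illustrative, not from the slides):

```python
import math

def harmonic(n):
    """H_n = 1 + 1/2 + ... + 1/n."""
    return sum(1.0 / k for k in range(1, n + 1))

# The comparison with the integral of 1/x sandwiches H_n
# between ln(n) and ln(n) + 1 for n >= 2.
for n in (10, 1_000, 100_000):
    assert math.log(n) < harmonic(n) < math.log(n) + 1
```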

Examples

Binary search is the main paradigm: search with O(log n) time per operation.
- Binary search trees: worst-case search time is linear, but average search time is O(log n)
- B-trees (used in databases) and B+ trees
- AVL trees
- Red-black trees
- Splay trees, finger trees

Dynamic convex hull algorithms: O(log^2 n) time per update operation.
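For reference, the basic binary-search loop on a sorted array: each iteration halves the remaining range, so it makes O(log n) comparisons. A standard sketch, not taken from the lecture:

```python
def binary_search(a, target):
    """Return an index of target in sorted list a, or -1 if absent.

    Each step halves the search range, hence O(log n) comparisons.
    """
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if a[mid] == target:
            return mid
        elif a[mid] < target:
            lo = mid + 1      # target can only be in the right half
        else:
            hi = mid - 1      # target can only be in the left half
    return -1
```

For example, `binary_search([1, 3, 5, 7, 9], 7)` returns 3.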

Polynomial Time

Most efficient: linear time O(n)
- Depth-first and breadth-first search
- Connectivity, cycles in graphs

Good: O(n log n)
- Sorting
- Convex hulls (in computational geometry)

Quadratic O(n^2)
- Shortest paths in graphs
- Minimum spanning tree

Cubic O(n^3)
- All-pairs shortest paths
- Matrix multiplication (naïve)

O(n^5)? O(n^6)?
- Robot motion planning problems with few (5, 6, ...) degrees of freedom
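As a concrete cubic-time example, naïve matrix multiplication runs three nested loops over n, performing Theta(n^3) scalar multiplications for n x n matrices. A minimal sketch:

```python
def mat_mul(A, B):
    """Multiply two n x n matrices the naive way: Theta(n^3) work."""
    n = len(A)
    C = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            for k in range(n):          # inner loop: n multiply-adds
                C[i][j] += A[i][k] * B[k][j]
    return C
```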

Exponential Time
- Towers of Hanoi: O(2^n)
- Travelling Salesman
- Satisfiability of boolean formulas and circuits

Super-exponential Time
- Many robot motion planning problems with many (n) degrees of freedom: O(2^(2^n))
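The Towers of Hanoi bound comes from the recurrence T(n) = 2T(n-1) + 1, which solves to exactly 2^n - 1 moves. A small Python sketch that records the moves and confirms the count:

```python
def hanoi(n, src, aux, dst, moves):
    """Move n disks from peg src to peg dst using aux; record each move."""
    if n == 0:
        return
    hanoi(n - 1, src, dst, aux, moves)   # move n-1 disks out of the way
    moves.append((src, dst))             # move the largest disk
    hanoi(n - 1, aux, src, dst, moves)   # stack n-1 disks back on top

moves = []
hanoi(5, 'A', 'B', 'C', moves)
assert len(moves) == 2**5 - 1  # exactly 31 moves
```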

Relative Growth

[Figure: relative growth of logarithmic, polynomial, and exponential functions]

Largest Instance Solvable (assuming 10^6 operations per second)

    Complexity | In one second | In one day      | In one year
    n          | 1,000,000     | 86,400,000,000  | 31,536,000,000,000
    n log n    | 62,746        | 2,755,147,514   | 798,160,978,500
    n^2        | 1,000         | 293,938         | 5,615,692
    n^3        | 100           | 4,421           | 31,593
    2^n        | 19            | 36              | 44
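Entries like these can be reproduced with exponential search plus binary search over n, assuming the budget of 10^6 operations per second that the first row implies (illustrative code, not from the lecture; exact integer answers can differ by one from rounded cube/log values):

```python
def largest_n(cost, budget):
    """Largest n >= 1 with cost(n) <= budget (cost must be nondecreasing)."""
    if cost(1) > budget:
        return 0
    hi = 1
    while cost(2 * hi) <= budget:    # exponential search for an upper bracket
        hi *= 2
    lo, hi = hi, 2 * hi              # invariant: cost(lo) <= budget < cost(hi)
    while lo + 1 < hi:
        mid = (lo + hi) // 2
        if cost(mid) <= budget:
            lo = mid
        else:
            hi = mid
    return lo

SECOND = 10**6                       # assumed: 10^6 operations per second
DAY = 86_400 * SECOND

assert largest_n(lambda n: n, SECOND) == 1_000_000
assert largest_n(lambda n: n**2, DAY) == 293_938
assert largest_n(lambda n: 2**n, SECOND) == 19
```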

Big Omega and Theta

Big Oh and Little Oh give upper bounds on functions. Big Omega reverses the roles in Big Oh:
- f(n) = Omega(g(n)) iff g(n) = O(f(n)), i.e. lim g(n)/f(n) is bounded by a constant (when the limit exists).

Big Oh gives upper bounds; Omega gives lower bounds. Theta means both: f(n) = Theta(g(n)) iff f(n) = O(g(n)) and f(n) = Omega(g(n)).

Examples
- f(n) = n^2 + 2n - 10 and g(n) = 3n^2 are both Theta(n^2), and Theta of each other. Why? lim f(n)/g(n) = 1/3, a nonzero constant.
- f(n) = log_2 n and g(n) = log_3 n are Theta of each other (the logarithm base does not matter).
- f(n) = 2^n and g(n) = 3^n are NOT Theta of each other: f(n) = o(g(n)), i.e. f grows much more slowly, because lim 2^n/3^n = lim (2/3)^n = 0.
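These limit claims are easy to check numerically (a quick sketch, not from the slides):

```python
def f(n):
    return n**2 + 2 * n - 10

def g(n):
    return 3 * n**2

# Theta pair: the ratio tends to the nonzero constant 1/3.
assert abs(f(10**6) / g(10**6) - 1 / 3) < 1e-5

# Not a Theta pair: 2^n / 3^n = (2/3)^n tends to 0.
assert (2 / 3) ** 1000 < 1e-100
```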