Lecture 3, COMPSCI.220.S1.T (2004) - Running Time: Estimation Rules


Running Time: Estimation Rules

Running time is proportional to the most significant term in T(n). Once the problem size becomes large, the most significant term is the one that has the largest power of n; this term increases faster than the other terms, which reduce in significance.

Running Time: Estimation Rules

Running time T(n) = 0.25n² + 0.5n ≈ 0.25n²:

n       T(n)       0.25n²     0.5n    0.5n as % of T(n)
10      30         25         5       16.7
100     2,550      2,500      50      2.0
500     62,750     62,500     250     0.4
1,000   250,500    250,000    500     0.2
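As a quick sketch (not part of the original slides), the shrinking share of the lower-order term can be checked directly in Python:

```python
# Sketch: how the lower-order term 0.5n loses significance as n grows,
# for the running time T(n) = 0.25n^2 + 0.5n from the table above.

def T(n):
    return 0.25 * n**2 + 0.5 * n

for n in (10, 100, 500, 1000):
    linear = 0.5 * n
    share = 100 * linear / T(n)   # percentage of T(n) contributed by 0.5n
    print(f"n={n:5d}  T(n)={T(n):10.0f}  0.5n share={share:5.2f}%")
```

The printed shares fall from roughly 17% at n = 10 to 0.2% at n = 1000, which is why only the most significant term matters for large n.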

Running Time: Estimation Rules

Constants of proportionality depend on the compiler, language, computer, etc.
- It is useful to ignore the constants when analysing algorithms.
Constants of proportionality can be reduced by using faster hardware or by minimising the time spent in the "inner loop".
- But this does not affect the behaviour of the algorithm for a large problem!

Algorithm Versus Implementation

Analysis of time complexity takes no account of the constant of proportionality c. Analysis involves the relative change of running time as the problem size grows.

Elementary Operations

Basic arithmetic operations (+, −, ×, ÷, %), relational operators (==, !=, >, >=, <, <=), Boolean operations (AND, OR, NOT), branch operations, ...

Input for problem domains (the meaning of n):
- Sorting: n items
- Graph / path: n vertices / edges
- Image processing: n pixels
- Text processing: string length n

"Big-Oh" Tool O(...) for Analysing Algorithms

Typical curves of time complexity: T(n) ∝ log n, T(n) ∝ n, T(n) ∝ n log n, T(n) ∝ n^k, T(n) ∝ 2^n

Relative growth: G(n) = f(n) / f(5)

Complexity     Function f(n)   n=5   n=25   n=125    n=625
Constant       1               1     1      1        1
Logarithm      log n           1     2      3        4
Linear         n               1     5      25       125
"n log n"      n log n         1     10     75       500
Quadratic      n²              1     25     625      15,625
Cubic          n³              1     125    15,625   1,953,125
Exponential    2^n             1     ~10⁶   ~10³⁶    ~10¹⁸⁷
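A small sketch (not from the original slides) reproduces the polynomial rows of this table; logarithms are taken base 5, which is what makes the entries whole numbers, and the exponential row is omitted because its entries are only order-of-magnitude estimates:

```python
import math

# Sketch: relative growth G(n) = f(n) / f(5) for the complexity
# functions in the table, evaluated at n = 5, 25, 125, 625.

functions = {
    "log n":   lambda n: math.log(n, 5),
    "n":       lambda n: n,
    "n log n": lambda n: n * math.log(n, 5),
    "n^2":     lambda n: n**2,
    "n^3":     lambda n: n**3,
}

for name, f in functions.items():
    growth = [round(f(n) / f(5)) for n in (5, 25, 125, 625)]
    print(f"{name:8s} {growth}")
```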

"Big-Oh" O(...): Linear Complexity

Linear complexity: time T(n) ∝ n. T(n) = O(n) means the running time does not grow faster than a linear function of the problem size n.

Logarithmic "Big-Oh" Complexity

Logarithmic complexity: time T(n) ∝ log n. T(n) = O(log n) means the running time does not grow faster than a logarithmic function of the problem size n.

g(n) is O(f(n)), or g(n) = O(f(n))

The function g(n) is "Big-Oh" of f(n) if, starting from some n > n₀, there always exists a function c·f(n) that grows faster than the function g(n).

"Big-Oh" O(...): Formal Definition

For those who are not afraid of maths: let f(n) and g(n) be positive-valued functions defined on the positive integers n. The function g is defined as O(f), and is said to be of the order of f(n), iff (read: if and only if) there are a real constant c > 0 and an integer n₀ such that g(n) ≤ c·f(n) for all n > n₀.
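The definition can be spot-checked numerically. The sketch below (not from the original slides) checks a candidate witness pair (c, n₀) for g(n) = 3n² + 5n + 1 and f(n) = n²; the witness values c = 9, n₀ = 1 are assumptions chosen for illustration:

```python
# Sketch: numerically checking the Big-Oh witness (c, n0) from the
# formal definition, for an assumed example pair g and f.

def is_bounded(g, f, c, n0, n_max=10_000):
    """Check g(n) <= c*f(n) for all n0 < n <= n_max.

    Note this is a finite spot-check, not a proof -- the definition
    quantifies over ALL n > n0."""
    return all(g(n) <= c * f(n) for n in range(n0 + 1, n_max + 1))

g = lambda n: 3 * n**2 + 5 * n + 1
f = lambda n: n**2

print(is_bounded(g, f, c=9, n0=1))   # True: a valid witness
print(is_bounded(g, f, c=1, n0=1))   # False: c = 1 is too small, 3n^2 alone exceeds n^2
```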

"Big-Oh" O(...): Informal Meaning

For those who are afraid of maths: g is O(f) means that the algorithm with time complexity g takes, for large n, at most as much time (within a constant factor) as the algorithm with time complexity f. Note that the formal definition of "Big-Oh" differs slightly from the one used in Stage I calculus.

"Big-Oh" O(...): Informal Meaning

g is O(f) means that the order of time complexity of the function g is asymptotically less than or equal to the order of time complexity of the function f.
- Asymptotic behaviour: only for large values of n.
- Two functions are of the same order when each is "Big-Oh" of the other: f = O(g) AND g = O(f).
- This property is called "Big-Theta": g = Θ(f).

O(...) Comparisons: Two Crucial Ideas

The exact running-time function is not important, since it can be multiplied by any arbitrary positive constant. Two functions are compared only asymptotically, for large n, and not near the origin.
- If the constants involved are very large, then the asymptotic behaviour is of no practical interest!

"Big-Oh" Examples - 1

A linear function g(n) = an + b, a > 0, is O(n): g(n) ≤ (a + |b|)·n for all n ≥ 1. Do not write O(2n) or O(an + b), as this still means O(n)!

O(n) running times: T(n) = 3n + 1; T(n) = n; T(n) = 10⁻⁸·n; T(n) = 10⁶·n + 1. Remember that "Big-Oh" describes an asymptotic behaviour for large problem sizes.
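The bound g(n) ≤ (a + |b|)·n can be verified for sample coefficients. A minimal sketch (coefficients chosen for illustration, not from the slides):

```python
# Sketch: spot-checking that a linear function a*n + b is bounded by
# (a + |b|) * n for all n >= 1, as claimed above.

def linear_bound_holds(a, b, n_max=1000):
    """Check a*n + b <= (a + abs(b)) * n for n = 1 .. n_max."""
    c = a + abs(b)
    return all(a * n + b <= c * n for n in range(1, n_max + 1))

print(linear_bound_holds(3, 1))      # T(n) = 3n + 1
print(linear_bound_holds(1e6, 1))    # T(n) = 10^6 * n + 1
print(linear_bound_holds(2, -7))     # a negative constant term also works
```

The check succeeds for every n ≥ 1 because an + b ≤ (a + |b|)·n reduces to b ≤ |b|·n.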

"Big-Oh" Examples - 2

A polynomial P_k(n) = a_k·n^k + a_{k−1}·n^{k−1} + ... + a₁·n + a₀, with a_k > 0, is O(n^k). Do not write O(P_k(n)), as this still means O(n^k)!

O(n^k) running times:
- T(n) = 3n² + 5n + 1 is O(n²). Is it also O(n³)? (Yes: "Big-Oh" gives only an upper bound.)
- T(n) = 10⁻⁸·n³ + 10⁸·n² + 30 is O(n³)
- T(n) = 10⁻⁸·n⁸ + 1000n + 1 is O(n⁸)

"Big-Oh" Examples - 3

An exponential g(n) = 2^(n+k) is O(2^n): 2^(n+k) = 2^k · 2^n for all n. An exponential g(n) = m^(n+k) is O(l^n) for l ≥ m > 1: m^(n+k) ≤ l^(n+k) = l^k · l^n for all n, k.

Remember that a "brute-force" search for the best combination of n interdependent binary decisions, by exhausting all 2^n possible combinations, has exponential time complexity; try to find more efficient ways to solve your problem if n ≥ 20...30.
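A brute-force search over n binary decisions can be sketched as follows; the `score` function here is a hypothetical toy objective standing in for whatever a real problem would evaluate:

```python
from itertools import product

# Sketch: exhaustive search over all 2^n assignments of n binary
# decisions -- the exponential-time "brute force" warned about above.

def best_combination(n, score):
    """Try every one of the 2^n assignments of n binary decisions and
    return the one maximising score."""
    return max(product((0, 1), repeat=n), key=score)

score = lambda bits: sum(bits)   # toy objective: number of 1-decisions

for n in (10, 15, 20):
    best = best_combination(n, score)
    print(f"n={n}: searched {2**n} combinations")
```

Each extra decision doubles the search space, so going from n = 20 to n = 30 multiplies the work by roughly a thousand.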

"Big-Oh" Examples - 4

A logarithmic function g(n) = log_m n is of order log₂ n, because log_m n = (log_m 2) · log₂ n for all n and all m > 1. Do not write O(log_m n), as this still means O(log n)!

You will find later that the most efficient search for data in an ordered array has logarithmic complexity.
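The logarithmic-time search in an ordered array alluded to here is binary search, which later lectures presumably cover; a minimal sketch:

```python
# Sketch: binary search halves the remaining range on each step, so it
# performs O(log n) comparisons on a sorted array.

def binary_search(arr, target):
    """Return the index of target in sorted arr, or -1 if absent."""
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

data = list(range(0, 2000, 2))     # 1000 sorted even numbers
print(binary_search(data, 1234))   # found at index 617
print(binary_search(data, 1235))   # absent: -1
```

For the 1000-element array above, at most about log₂ 1000 ≈ 10 comparisons are needed, versus up to 1000 for a linear scan.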