Time Complexity of Algorithms


Time Complexity of Algorithms (Introduction to Algorithm Analysis, March 9, 2004)

If the running time T(n) is O(f(n)), then the function f measures the algorithm's time complexity.
- Polynomial algorithm: T(n) is O(n^k) for some constant k.
- Exponential algorithm: otherwise.
- Intractable problem: one for which no polynomial-time algorithm is known.
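To make the distinction concrete, here is a minimal Python sketch (not from the slides; the function names are illustrative): a pairwise duplicate check does O(n^2) work, which is polynomial, while enumerating every subset of the input does O(2^n) work, which is exponential.

    from itertools import combinations

    def has_duplicate(items):
        # Polynomial: at most n*(n-1)/2 comparisons, so O(n^2).
        for i in range(len(items)):
            for j in range(i + 1, len(items)):
                if items[i] == items[j]:
                    return True
        return False

    def all_subsets(items):
        # Exponential: yields all 2^n subsets of the input.
        for r in range(len(items) + 1):
            yield from combinations(items, r)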

Answer these questions

Running time T(n)       Complexity O(?)
n^2 + 100n + 1          ?
0.001n^3 + n^2 + 1      ?
23n                     ?
2^(3n)                  ?
2^(3+n)                 ?
2·3^n                   ?

Answer these questions

Running time T(n)       Complexity O(?)
0.0001n + 10000         ?
100000n + 10000         ?
0.0001n^2 + 10000n      ?
100000n^2 + 10000n      ?
30·log_20(23n)          ? (actually NOT that hard…)

Big-Omega

The function g(n) is Ω(f(n)) iff there exist a real constant c > 0 and a positive integer n0 such that g(n) ≥ c·f(n) for all n ≥ n0.
Big-Omega is the mirror image of Big-Oh: it generalises the concept of a lower bound (≥) in the same way that Big-Oh generalises the concept of an upper bound (≤).
If f(n) is O(g(n)), then g(n) is Ω(f(n)).
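The definition can be sanity-checked numerically over a finite range. The sketch below is illustrative, not from the slides, and a finite scan is only evidence for a claimed pair (c, n0), not a proof. It checks the claim that g(n) = n^2/2 is Ω(n), witnessed by c = 1 and n0 = 2.

    def check_big_omega(g, f, c, n0, n_max=10_000):
        # Spot-check g(n) >= c*f(n) for all n0 <= n <= n_max.
        return all(g(n) >= c * f(n) for n in range(n0, n_max + 1))

    # Claim: g(n) = n^2 / 2 is Omega(n), witnessed by c = 1, n0 = 2.
    print(check_big_omega(lambda n: n * n / 2, lambda n: n, c=1, n0=2))  # True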

Big-Theta

The function g(n) is Θ(f(n)) iff there exist two real constants c1 > 0 and c2 > 0 and a positive integer n0 such that c1·f(n) ≤ g(n) ≤ c2·f(n) for all n ≥ n0.
Whenever two functions f and g are of the same order, i.e. g(n) is Θ(f(n)), each is Big-Oh of the other: g(n) is O(f(n)) AND f(n) is O(g(n)).
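For example (a worked instance, not from the slides): g(n) = 3n^2 + 5n is Θ(n^2), because 3n^2 ≤ 3n^2 + 5n ≤ 4n^2 whenever 5n ≤ n^2, i.e. for all n ≥ 5; so c1 = 3, c2 = 4, n0 = 5 witness the definition. A quick numeric confirmation:

    def check_big_theta(g, f, c1, c2, n0, n_max=10_000):
        # Spot-check c1*f(n) <= g(n) <= c2*f(n) for all n0 <= n <= n_max.
        return all(c1 * f(n) <= g(n) <= c2 * f(n)
                   for n in range(n0, n_max + 1))

    print(check_big_theta(lambda n: 3 * n * n + 5 * n,
                          lambda n: n * n, c1=3, c2=4, n0=5))  # True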

Upper bounds of complexity

"Big-Oh" specifies an upper bound of complexity, so the following (and similar) relationships hold:
1 = O(log n) = O(n) = O(n log n) = …
log n = O(n) = O(n log n) = O(n^a) (a > 1) = …
n = O(n log n) = O(n^a) (a > 1) = O(2^n) = …
n log n = O(n^a) (a > 1) = O(n^k) (k > a) = …
n^k = O(n^a) (a > k) = O(2^n) = …

Time complexity growth

f(n)      Number of data items processed per:
          1 minute   1 day      1 year      1 century
n         10         14,400     5.26×10^6   5.26×10^8
n log n   -          3,997      883,895     6.72×10^7
n^1.5     -          1,275      65,128      1.40×10^6
n^2       -          379        7,252       72,522
n^3       -          112        807         3,746
2^n       -          20         29          35
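Entries like these can be computed mechanically: given an operation budget, find the largest n with f(n) ≤ budget. The sketch below is illustrative and not from the slides; the slide's per-row constant factors are not stated, so this reproduces the flavour of the table rather than every entry.

    def max_items(f, budget):
        # Largest n with f(n) <= budget, assuming f is increasing.
        if f(1) > budget:
            return 0
        lo, hi = 1, 2
        while f(hi) <= budget:
            hi *= 2
        while hi - lo > 1:            # invariant: f(lo) <= budget < f(hi)
            mid = (lo + hi) // 2
            if f(mid) <= budget:
                lo = mid
            else:
                hi = mid
        return lo

    # Assumed calibration: 10 operations/minute, as the n row implies.
    ops_per_day = 10 * 60 * 24
    print(max_items(lambda n: n, ops_per_day))       # 14400, matching the n row
    print(max_items(lambda n: 2 ** n, ops_per_day))  # 13; the slide's 2^n row
                                                     # evidently used different constants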

Beware exponential complexity

☺ If a linear, O(n), algorithm processes 10 items per minute, then it can process 14,400 items per day, 5,260,000 items per year, and 526,000,000 items per century.
☻ If an exponential, O(2^n), algorithm processes 10 items per minute, then it can process only 20 items per day and 35 items per century…

Big-Oh vs. Actual Running Time

Example 1: let algorithms A and B have running times TA(n) = 20n ms and TB(n) = 0.1·n·log2(n) ms.
In the "Big-Oh" sense, A is better than B…
But on which data volumes can A outperform B?
TA(n) < TB(n) if 20n < 0.1·n·log2(n), i.e. log2(n) > 200, that is, when n > 2^200 ≈ 10^60!
Thus in all practical cases B is better than A…
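A quick numeric check of this crossover (an illustrative sketch, not from the slides): 2^200 is far too large to search directly, so we simply evaluate both running times at n = 2^k for a few exponents k around 200.

    import math

    def T_A(n): return 20 * n                  # ms
    def T_B(n): return 0.1 * n * math.log2(n)  # ms

    for k in (10, 100, 199, 201):
        n = 2.0 ** k
        # A beats B only once log2(n) > 200, i.e. only for k > 200.
        print(k, "A faster" if T_A(n) < T_B(n) else "B faster")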

Big-Oh vs. Actual Running Time

Example 2: let algorithms A and B have running times TA(n) = 20n ms and TB(n) = 0.1n^2 ms.
In the "Big-Oh" sense, A is better than B…
But on which data volumes does A outperform B?
TA(n) < TB(n) if 20n < 0.1n^2, i.e. when n > 200.
Thus A is better than B in most practical cases, except for n < 200, when B becomes faster…
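Here the threshold is small enough to find by direct search (illustrative sketch):

    def T_A(n): return 20 * n         # ms
    def T_B(n): return 0.1 * n * n    # ms

    n = 1
    while T_A(n) >= T_B(n):           # stop at the first n where A wins
        n += 1
    print(n)                          # 201: A is faster for all n > 200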

"Big-Oh" Feature 1: Scaling

Constant factors are ignored; only the powers and functions of n matter: for all c > 0, c·f = O(f), where f ≡ f(n).
It is this ignoring of constant factors that motivates the notation!
Examples: 50n, 50000000n, and 0.0000005n are all O(n).

"Big-Oh" Feature 2: Transitivity

If h does not grow faster than g, and g does not grow faster than f, then h does not grow faster than f: h = O(g) AND g = O(f) ⇒ h = O(f).
In other words, if f grows at least as fast as g, and g grows at least as fast as h, then f grows at least as fast as h.
Example: h = O(g); g = O(n^2) ⇒ h = O(n^2).

Feature 3: The Rule of Sums

The sum grows as its fastest term: g1 = O(f1) AND g2 = O(f2) ⇒ g1 + g2 = O(max{f1, f2}).
In particular:
- if g = O(f) and h = O(f), then g + h = O(f)
- if g = O(f), then g + f = O(f)
Examples (see the numeric sketch below):
- if h = O(n) AND g = O(n^2), then g + h = O(n^2)
- if h = O(n log n) AND g = O(n log log n), then g + h = O(n log n)

The Rule of Sums (the original slide illustrates this rule with a figure, which is not reproduced in this transcript)
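An empirical illustration of the second example above (a sketch, not from the slides): the ratio (n log n + n log log n) / (n log n) stays bounded, so the sum is O(n log n).

    import math

    def g(n): return n * math.log2(n)               # the O(n log n) term
    def h(n): return n * math.log2(math.log2(n))    # the O(n log log n) term

    for n in (10**2, 10**4, 10**6, 10**8):
        print(n, round((g(n) + h(n)) / g(n), 3))    # bounded, tending to 1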

Feature 4: The Rule of Products

The upper bound for a product of functions is the product of their upper bounds: g1 = O(f1) AND g2 = O(f2) ⇒ g1·g2 = O(f1·f2).
In particular:
- if g = O(f) and h = O(f), then g·h = O(f^2)
- if g = O(f), then g·h = O(f·h)
Example: if h = O(n) AND g = O(n^2), then g·h = O(n^3).

Ascending order of complexity

1 ≺ log log n ≺ log n ≺ n ≺ n log n ≺ n^a (1 < a < 2) ≺ n^2 ≺ n^3 ≺ n^m (m > 3) ≺ 2^n ≺ …

Questions: Where is the place of n^2·log n? Where is the place of n^2.79?
Answers: … n^2 ≺ n^2·log n ≺ n^2.79 ≺ n^3 ≺ …
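One heuristic way to see this ordering (an illustrative sketch; sampling at a single n only suggests, and does not prove, the asymptotic order) is to evaluate each function at a large n and sort:

    import math

    funcs = {
        "1":         lambda n: 1,
        "log log n": lambda n: math.log2(math.log2(n)),
        "log n":     lambda n: math.log2(n),
        "n":         lambda n: n,
        "n log n":   lambda n: n * math.log2(n),
        "n^1.5":     lambda n: n ** 1.5,
        "n^2":       lambda n: n ** 2,
        "n^2 log n": lambda n: n ** 2 * math.log2(n),
        "n^2.79":    lambda n: n ** 2.79,
        "n^3":       lambda n: n ** 3,
    }

    n = 10 ** 6    # 2^n is omitted: it overflows any useful comparison scale
    print(" ≺ ".join(sorted(funcs, key=lambda name: funcs[name](n))))
    # 1 ≺ log log n ≺ log n ≺ n ≺ n log n ≺ n^1.5 ≺ n^2 ≺ n^2 log n ≺ n^2.79 ≺ n^3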

Answers to the questions

Running time T(n)       Complexity
n^2 + 100n + 1          O(n^2)
0.001n^3 + n^2 + 1      O(n^3)
23n                     O(n)
2^(3n)                  O(8^n), as 2^(3n) = (2^3)^n
2^(3+n)                 O(2^n), as 2^(3+n) = 2^3 · 2^n
2·3^n                   O(3^n)
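The two exponent identities used above are exact and easy to confirm in integer arithmetic (illustrative check):

    for n in range(1, 20):
        assert 2 ** (3 * n) == 8 ** n          # 2^(3n) = (2^3)^n
        assert 2 ** (3 + n) == 8 * 2 ** n      # 2^(3+n) = 2^3 * 2^n
    print("both identities hold for n = 1..19")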

Answers to the questions

Running time T(n)       Complexity
0.0001n + 10000         O(n)
100000n + 10000         O(n)
0.0001n^2 + 10000n      O(n^2)
100000n^2 + 10000n      O(n^2)
30·log_20(23n)          O(log n), as log_c(a·b) = log_c(a) + log_c(b) (actually NOT that hard…)
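Likewise, the logarithm identity behind the last row can be confirmed numerically (an illustrative sketch): 30·log_20(23n) differs from 30·log_20(n) only by the constant 30·log_20(23), so their ratio approaches 1 and the whole expression is O(log n).

    import math

    def log20(x): return math.log(x, 20)

    for n in (10, 10**3, 10**6, 10**9):
        lhs = 30 * log20(23 * n)
        rhs = 30 * (log20(23) + log20(n))       # log_c(a*b) = log_c(a) + log_c(b)
        print(n, math.isclose(lhs, rhs),        # identity holds
              round(lhs / (30 * log20(n)), 3))  # ratio tends to 1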