CS 2604 Data Structures and File Management


Big-O Analysis

[Instructor note: The formal definition deserves some commentary, but the main point is simply that big-O is about upper bounds. Make sure the students realize that the upper bound, even in the general big-O sense, isn't uniquely determined. This material is covered much more formally in the 4000-level algorithms course; UPC has indicated that the coverage here is probably somewhat over the top for 2604. In any case, you shouldn't spend too much time on this. What is vital is that the students learn the complexity results for the data structures covered throughout the course, and have some intuitive feel for their significance.]

Order of magnitude analysis requires a number of mathematical definitions and theorems. The most basic concept is commonly termed big-O.

Definition: Suppose that f(n) and g(n) are nonnegative functions of n. Then we say that f(n) is O(g(n)) provided that there are constants C > 0 and N > 0 such that f(n) ≤ C·g(n) for all n > N.

By the definition above, demonstrating that a function f is big-O of a function g requires that we find specific constants C and N for which the inequality holds (and show that the inequality does, in fact, hold). Big-O expresses an upper bound on the growth rate of a function, for sufficiently large values of n.

Computer Science Dept, Va Tech, January 2004. ©2000-2004 McQuain WD. Asymptotics.
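The definition above can be spot-checked numerically. The sketch below is not from the slides; the functions f, g and the constants C, N are illustrative choices, and the finite range check is only evidence, not a proof (the definition quantifies over all n > N).

```python
# Empirical check of the big-O definition: f(n) is O(g(n)) when some
# C > 0 and N > 0 make f(n) <= C*g(n) for all n > N.

def is_bounded(f, g, C, N, upto=10_000):
    """Finite spot-check of f(n) <= C*g(n) for all n in (N, upto].
    Not a proof -- the definition quantifies over ALL n > N."""
    return all(f(n) <= C * g(n) for n in range(N + 1, upto + 1))

f = lambda n: 3 * n + 10          # illustrative f(n) = 3n + 10
g = lambda n: n                   # illustrative g(n) = n

# With C = 4 and N = 10: 3n + 10 <= 4n whenever n >= 10.
print(is_bounded(f, g, C=4, N=10))   # True
```

Note that the choice of (C, N) is not unique: C = 13 with N = 1 works just as well, which is exactly the non-uniqueness mentioned in the note above.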

Big-O Example

[Instructor note: This is probably overkill, but it illustrates the difficulty (for the students) of applying the formal definition on the previous slide. That's the motivation for the theorems that follow.]

Take the function obtained in the algorithm analysis example earlier:

    T(n) = n²/2 + 5n/2 - 3

Intuitively, one should expect that this function grows similarly to n². To show that, we will shortly prove that:

    (5/2)n ≤ 5n²  (for n ≥ 1)      and that      -3 ≤ 2n²

Why? Because then we can argue by substitution (of non-equal quantities):

    T(n) = n²/2 + 5n/2 - 3
         ≤ n²/2 + 5n² - 3          (for n ≥ 1)
         ≤ n²/2 + 5n² + 2n²
         = (15/2)n²

Thus, applying the definition with C = 15/2 and N = 1, T(n) is O(n²).
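The algebra can be sanity-checked numerically, assuming the step-count function T(n) = n²/2 + 5n/2 - 3 from the earlier analysis example (the explicit formula is an assumption reconstructed from the constants C = 15/2 and N = 1 quoted in the argument):

```python
# Spot-check of the example's bound: with T(n) = n^2/2 + 5n/2 - 3 (assumed
# from the earlier analysis example), C = 15/2 and N = 1 should give
# T(n) <= C*n^2 for all n > N.

def T(n):
    return n * n / 2 + 5 * n / 2 - 3

C, N = 15 / 2, 1
print(all(T(n) <= C * n * n for n in range(N + 1, 100_001)))  # True
```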

Proving the Details

[Instructor note: This is just here to complete the preceding example; again, the point is that applying the formal definition requires a certain degree of creativity.]

Theorem: if 0 ≤ a ≤ b and 0 ≤ c ≤ d, then a·c ≤ b·d.

Claim: for n ≥ 1, (5/2)n ≤ 5n².
Proof: Obviously 5/2 < 5, so (5/2)n ≤ 5n. Also, if n ≥ 1 then n ≤ n², so by the Theorem above, 5n ≤ 5n². Hence, since ≤ is transitive, (5/2)n ≤ 5n².

Claim: -3 ≤ 2n².
Proof: For all n, n² ≥ 0, and so 2n² ≥ 0. Obviously -3 ≤ 0. So by transitivity, -3 ≤ 2n².

Big-O Theorems

[Instructor note: The theorems here are just the usual ones that any book at this level would be expected to provide. They obviate much of the need to apply the formal definition.]

For all the following theorems, assume that f(n) is a function of n and that K is an arbitrary constant.

Theorem 1: K is O(1).

Theorem 2: A polynomial is O(the term containing the highest power of n).

Theorem 3: K·f(n) is O(f(n)) [i.e., constant coefficients can be dropped].

Theorem 4: If f(n) is O(g(n)) and g(n) is O(h(n)), then f(n) is O(h(n)). [transitivity]

Big-O Theorems (continued)

[Instructor note: This one's relatively important because it codifies a scale that one must know to apply the following theorems. Note that the scale isn't complete.]

Theorem 5: Each of the following functions is strictly big-O of its successors (listed from smaller to larger):

- K [constant]
- log_b(n) [always log base 2 if no base is shown]
- n
- n·log_b(n)
- n²
- n to higher powers
- 2ⁿ
- 3ⁿ
- larger constants to the n-th power
- n! [n factorial]
- nⁿ
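Tabulating a few entries of this scale at modest input sizes makes the ordering concrete (the sample sizes below are arbitrary illustrations):

```python
# Evaluate several functions from the Theorem 5 scale at a few sizes to
# show how quickly the later entries dominate the earlier ones.
import math

funcs = [
    ("log n",   lambda n: math.log2(n)),
    ("n",       lambda n: float(n)),
    ("n log n", lambda n: n * math.log2(n)),
    ("n^2",     lambda n: float(n ** 2)),
    ("2^n",     lambda n: float(2 ** n)),
]

for name, f in funcs:
    print(f"{name:8} n=10: {f(10):>12.0f}   n=20: {f(20):>12.0f}")
```

Even at n = 20 the gap between log n (about 4) and 2ⁿ (over a million) is dramatic, which is the point of the scale.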

Big-O Theorems (continued)

Theorem 6: In general, f(n) is big-O of the dominant term of f(n), where "dominant" may usually be determined from Theorem 5.

Theorem 7: For any base b, log_b(n) is O(log(n)).

Big-Omega and Big-Theta

[Instructor note: Again, the formal definitions are important, but at this level the primary point is the intuitive meaning of lower bound and equivalence.]

In addition to big-O, we may seek a lower bound on the growth of a function:

Definition: Suppose that f(n) and g(n) are nonnegative functions of n. Then we say that f(n) is Ω(g(n)) provided that there are constants C > 0 and N > 0 such that f(n) ≥ C·g(n) for all n > N.

Big-Ω expresses a lower bound on the growth rate of a function, for sufficiently large values of n.

Finally, we may have two functions that grow at essentially the same rate:

Definition: Suppose that f(n) and g(n) are nonnegative functions of n. Then we say that f(n) is Θ(g(n)) provided that f(n) is O(g(n)) and also that f(n) is Ω(g(n)).
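In code terms, f is Θ(g) when two constants sandwich f between multiples of g for all large n. The sketch below is illustrative (the functions and constants are not from the slides), and again a finite range check is evidence rather than proof:

```python
# The Theta definition as a spot-check: find c1, c2, N with
# c1*g(n) <= f(n) <= c2*g(n) for all n > N.

def sandwiched(f, g, c1, c2, N, upto=10_000):
    """Finite spot-check of c1*g(n) <= f(n) <= c2*g(n) for N < n <= upto."""
    return all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(N + 1, upto + 1))

f = lambda n: 2 * n * n + 3 * n       # illustrative f(n) = 2n^2 + 3n
g = lambda n: n * n

# 2n^2 <= 2n^2 + 3n <= 5n^2 holds for all n >= 1.
print(sandwiched(f, g, c1=2, c2=5, N=1))   # True
```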

Order and Limits

[Instructor note: Most data structures books don't seem to present these theorems. I find them useful because it's much easier for most of our students to apply these (since they have a firmer grasp of limits than of algebra).]

The task of determining the order of a function is simplified considerably by the following result:

Theorem 8: f(n) is Θ(g(n)) if

    lim_{n→∞} f(n)/g(n) = c,  for some constant c with 0 < c < ∞.

Recall Theorem 7; we may easily prove it (and a bit more) by applying Theorem 8:

    lim_{n→∞} log_b(n)/log(n) = lim_{n→∞} [log(n)/log(b)]/log(n) = 1/log(b)

The last term is finite and positive, so log_b(n) is Θ(log(n)) by Theorem 8.

Corollary: if the limit above is 0, then f(n) is O(g(n)); and if the limit is ∞, then f(n) is Ω(g(n)).
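Theorem 8 can be applied numerically by evaluating the ratio f(n)/g(n) at increasingly large n and watching it settle. The base b = 10 below is an arbitrary illustration of the log_b(n) vs. log(n) case:

```python
# Estimate lim f(n)/g(n) for f = log_10(n), g = log_2(n).  By the change of
# base rule the ratio is exactly 1/log_2(10) for every n, a finite positive
# constant, so log_10(n) is Theta(log n) per Theorem 8.
import math

b = 10                                   # any base b > 1 works
ratios = [math.log(10 ** k, b) / math.log2(10 ** k) for k in (2, 4, 8)]
print(ratios)        # every entry equals 1/log2(10), about 0.301
```

For genuinely different growth rates the same experiment shows the limit heading to 0 or ∞, which by the Corollary gives O or Ω only, not Θ.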

More Theorems

Some of the big-O theorems may be strengthened now:

Theorem 9: If K > 0 is a constant, then K is Θ(1).

Theorem 10: A polynomial of degree k is Θ(n^k) (the highest power of n).

Proof: Suppose a polynomial of degree k, say f(n) = a_k·n^k + a_{k-1}·n^(k-1) + … + a_1·n + a_0, with a_k ≠ 0. Then we have:

    lim_{n→∞} f(n)/n^k = lim_{n→∞} (a_k + a_{k-1}/n + … + a_0/n^k) = a_k

Now a_k > 0, since we assume the function is nonnegative. So by Theorem 8, the polynomial is Θ(n^k). QED
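The limit in the proof of Theorem 10 is easy to observe numerically; the coefficients below are arbitrary illustrations, not from the slides:

```python
# Theorem 10 numerically: the ratio (polynomial)/n^k approaches the leading
# coefficient a_k, a finite positive constant, so the polynomial is
# Theta(n^k) by Theorem 8.

def p(n):
    return 4 * n**3 - 7 * n**2 + 2 * n + 50   # degree k = 3, a_k = 4

for n in (10, 1_000, 100_000):
    print(p(n) / n**3)    # tends to the leading coefficient, 4
```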

Big- Is an Equivalence Relation CS 2604 Data Structures and File Management Big- Is an Equivalence Relation Theorem 11: If f(n) is (g(n)) and g(n) is (h(n)) then f(n) is (h(n)). [transitivity] This follows from Theorem 4 and the observation that big-  is also transitive. Theorem 12: If f(n) is (g(n)) then g(n) is (f(n)). [symmetry] Theorem 13: If f(n) is (f(n)). [reflexivity] By Theorems 11–13,  is an equivalence relation on the set of nonnegative functions. The equivalence classes represent fundamentally different growth rates. Computer Science Dept Va Tech January 2004 ©2000-2004 McQuain WD Asymptotics

Applications to Algorithm Analysis

Ex 1: An algorithm (e.g., see slide 2.9) whose complexity function is a degree-2 polynomial is Θ(n²) by Theorem 10.

Ex 2: An algorithm (e.g., see slide 2.10) whose complexity function has dominant term n·log(n) is O(n·log(n)) by Theorem 5. Furthermore, the algorithm is also Ω(n·log(n)) by Theorem 8, since the limit of the ratio of the complexity function to n·log(n) is finite and positive.

For most common complexity functions, it's this easy to determine the big-O and/or big-Θ complexity using the given theorems.

Complexity of Linear Storage

[Instructor note: The most important point is that there is no "winner" when comparing contiguous to linked storage. Each has advantages in some situations and liabilities in others. The comparison here is a primary motivation for the development of alternatives like the skip list and the binary search tree, which combine most of the advantages of contiguous and linked storage.]

For a contiguous list of N elements, assuming each is equally likely to be the target of a search:
- average search cost is Θ(N) if the list is randomly ordered
- average search cost is Θ(log N) if the list is sorted
- average random insertion cost is Θ(N)
- insertion at the tail is Θ(1)

For a linked list of N elements, assuming each is equally likely to be the target of a search:
- average search cost is Θ(N), regardless of list ordering
- average random insertion cost is Θ(1), excluding search time
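The Θ(N) vs. Θ(log N) search costs above can be made concrete by counting probes. The sketch below (illustrative, not from the slides) compares a sequential scan with a binary search on a sorted contiguous list:

```python
# Probe-count sketch of the search costs quoted above: sequential search
# inspects on the order of N elements, while binary search on a sorted
# list inspects on the order of log2(N) elements.

def linear_search(xs, target):
    """Return (index, probes) using a sequential scan."""
    for i, x in enumerate(xs):
        if x == target:
            return i, i + 1
    return -1, len(xs)

def binary_search(xs, target):
    """Return (index, probes) on a sorted list; each loop pass is one probe."""
    lo, hi, probes = 0, len(xs), 0
    while lo < hi:
        mid = (lo + hi) // 2
        probes += 1
        if xs[mid] < target:
            lo = mid + 1
        elif xs[mid] > target:
            hi = mid
        else:
            return mid, probes
    return -1, probes

data = list(range(1024))                  # sorted contiguous list, N = 1024
print(linear_search(data, 1000)[1])       # on the order of N probes
print(binary_search(data, 1000)[1])       # on the order of log2(N) probes
```

A linked list cannot support the binary search above, because it has no Θ(1) access to the middle element; that is exactly why its average search cost stays Θ(N) even when sorted.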