
Week 2 - Wednesday

- What did we talk about last time?
  - Running time
  - Big Oh notation

- What's the running time to factor a large number N?
- How many edges are in a completely connected graph?
- If you have a completely connected graph, how many possible tours are there (paths that start at a given node, visit all other nodes, and return to the beginning)?
- How many different n-bit binary numbers are there?
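For reference, the counting questions have standard closed-form answers; the sketch below works them out. (The factoring question is the exception: no classical algorithm is known that factors N in time polynomial in the number of digits of N, which is what makes it interesting.)

```latex
% Edges in a completely connected graph on n nodes: each pair of nodes gets one edge.
\binom{n}{2} = \frac{n(n-1)}{2}

% Tours from a fixed starting node that visit all other nodes and return:
% the remaining n - 1 nodes can be visited in any order.
(n - 1)!

% Distinct n-bit binary numbers: each of the n bits is independently 0 or 1.
2^n
```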

- Here is a table of several different complexity measures, in ascending order, with their functions evaluated at n = 100 (logs are base 2)

  Description     Big Oh        f(100)
  Constant        O(1)          1
  Logarithmic     O(log n)      6.64
  Linear          O(n)          100
  Linearithmic    O(n log n)    664
  Quadratic       O(n^2)        10,000
  Cubic           O(n^3)        1,000,000
  Exponential     O(2^n)        1.27 x 10^30
  Factorial       O(n!)         9.33 x 10^157
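As a quick sanity check on those values, a short throwaway program (a sketch, not from the slides; the class name is made up) reproduces the f(100) column. BigInteger is used because 100! has 158 digits.

```java
import java.math.BigInteger;

public class GrowthAt100 {
    public static void main(String[] args) {
        int n = 100;
        double log2n = Math.log(n) / Math.log(2);    // log base 2 of 100, ~6.64

        System.out.println("Constant:     1");
        System.out.println("Logarithmic:  " + log2n);            // ~6.64
        System.out.println("Linear:       " + n);                // 100
        System.out.println("Linearithmic: " + (n * log2n));      // ~664
        System.out.println("Quadratic:    " + (long) n * n);     // 10,000
        System.out.println("Cubic:        " + (long) n * n * n); // 1,000,000
        System.out.println("Exponential:  " + Math.pow(2, n));   // ~1.27 x 10^30

        BigInteger factorial = BigInteger.ONE;                   // 100! exactly, ~9.33 x 10^157
        for (int i = 2; i <= n; i++) {
            factorial = factorial.multiply(BigInteger.valueOf(i));
        }
        System.out.println("Factorial:    " + factorial);
    }
}
```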

- Computers get faster, but not in unlimited ways
- If computers get 10 times faster, here is how much a problem from each class could grow and still be solvable

  Description     Big Oh        Increase in size
  Constant        O(1)          Unlimited
  Logarithmic     O(log n)      1000
  Linear          O(n)          10
  Linearithmic    O(n log n)    10
  Quadratic       O(n^2)        3-4
  Cubic           O(n^3)        2-3
  Exponential     O(2^n)        Hardly changes
  Factorial       O(n!)         Hardly changes
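Where do figures like 3-4 and 2-3 come from? If the old time budget handled a problem of size n and the new budget is 10 times bigger, solving for the new size n' gives the pattern below (a sketch of the algebra; the logarithmic row's 1000 is a rough figure, since the exact jump depends on n).

```latex
\begin{align*}
\text{Linear: }      & n' = 10n \\
\text{Quadratic: }   & (n')^2 = 10\,n^2 \;\Rightarrow\; n' = \sqrt{10}\,n \approx 3.16\,n \\
\text{Cubic: }       & (n')^3 = 10\,n^3 \;\Rightarrow\; n' = \sqrt[3]{10}\,n \approx 2.15\,n \\
\text{Logarithmic: } & \log_2 n' = 10\log_2 n \;\Rightarrow\; n' = n^{10} \\
\text{Exponential: } & 2^{n'} = 10\cdot 2^{n} \;\Rightarrow\; n' = n + \log_2 10 \approx n + 3.3
\end{align*}
```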

- There is nothing better than constant time
- Logarithmic time means that the problem can become much larger and only take a little longer
- Linear time means that the running time grows in proportion to the problem size
- Linearithmic time is just a little worse than linear
- Quadratic time means that expanding the problem size significantly could make it impractical
- Cubic time is about the reasonable maximum if we expect the problem to grow
- Exponential and factorial time mean that we cannot solve anything but the most trivial problem instances

- What is a logarithm?
- Definition:
  - If b^x = y, then log_b(y) = x (for positive values of b)
- Think of it as a de-exponentiator
- Examples:
  - log_10(1,000,000) = 6
  - log_3(81) = 4
  - log_2(512) = 9

- Add one to the floor of the logarithm of a number in a given base and you'll get the number of digits you need to represent that number in that base
- In other words, the log of a number is related to its length
- Even big numbers have small logs
- If there's no subscript, log_10 is assumed in the math world, but log_2 is assumed in CS
- Also common is ln, the natural log, which is log_e
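A quick check of that digit-count claim:

```latex
\lfloor \log_{10} 2005 \rfloor + 1 = 3 + 1 = 4 \text{ digits}, \qquad
\lfloor \log_{2} 255 \rfloor + 1 = 7 + 1 = 8 \text{ bits}
```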

- log_b(xy) = log_b(x) + log_b(y)
- log_b(x/y) = log_b(x) - log_b(y)
- log_b(x^y) = y log_b(x)
- Base conversion:
  - log_b(x) = log_a(x) / log_a(b)
- As a consequence, for any base b > 1:
  - log_2(n) = log_10(n)/c_1 = log_100(n)/c_2 = log_b(n)/c_3, where each c_i is a constant
  - So log_2(n) is O(log_10 n), O(log_100 n), and O(log_b n): all logarithms have the same growth rate
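The base-conversion rule is also how you compute logarithms in practice, since most standard libraries only provide the natural log (and sometimes log base 10). A minimal sketch, with a hypothetical helper name:

```java
public class Logs {
    // log base b of x, via the change-of-base rule: log_b(x) = ln(x) / ln(b)
    static double log(double base, double x) {
        return Math.log(x) / Math.log(base);
    }

    public static void main(String[] args) {
        System.out.println(log(2, 512));        // ~9.0
        System.out.println(log(3, 81));         // ~4.0 (up to floating-point rounding)
        System.out.println(log(10, 1_000_000)); // ~6.0 (rounding may show 5.999...)
    }
}
```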

- Let f(n) and g(n) be two functions over integers
- f(n) is O(g(n)) if and only if
  - f(n) ≤ c·g(n) for all n > N
  - for some positive real numbers c and N
- In other words, past some arbitrary point, with some arbitrary scaling factor, g(n) is always bigger
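As a concrete instance of the definition (the constants below are one workable choice, not the only one): showing that 3n + 5 is O(n).

```latex
% Pick c = 4 and N = 5. For all n > 5 we have 5 < n, so
3n + 5 \;<\; 3n + n \;=\; 4n \;=\; c \cdot n,
% which is exactly what the definition requires.
```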

- We've been sloppy so far, saying that something is O(n) when its running time is proportional to n
- Big Oh is actually an upper bound, meaning that something whose running time is proportional to n
  - is O(n)
  - but is also O(n^2)
  - and is also O(2^n)
- If the running time of something is actually proportional to n, we should say it's Θ(n)
- We often use Big Oh because it's easier to find an upper bound than to get a tight bound

- O establishes an upper bound
  - f(n) is O(g(n)) if there exist positive numbers c and N such that f(n) ≤ c·g(n) for all n ≥ N
- Ω establishes a lower bound
  - f(n) is Ω(g(n)) if there exist positive numbers c and N such that f(n) ≥ c·g(n) for all n ≥ N
- Θ establishes a tight bound
  - f(n) is Θ(g(n)) if there exist positive numbers c_1, c_2, and N such that c_1·g(n) ≤ f(n) ≤ c_2·g(n) for all n ≥ N
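A small sketch of the Θ definition in action (again, these constants are just one valid choice): showing that 3n^2 + 2n is Θ(n^2).

```latex
% Lower bound: 3n^2 <= 3n^2 + 2n for all n >= 0, so c_1 = 3 works.
% Upper bound: 2n <= n^2 once n >= 2, so 3n^2 + 2n <= 4n^2; take c_2 = 4 and N = 2.
3n^2 \;\le\; 3n^2 + 2n \;\le\; 4n^2 \quad \text{for all } n \ge 2
```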

- O and Ω have a one-to-many relationship with functions
  - 4n is O(n^2), but it is also O(n^3) and O(n^4 log n)
  - 6n log n is Ω(n log n), but it is also Ω(n)
- Θ is one-to-many as well, but it gives a much tighter bound
- Sometimes it is hard to find Θ
  - Upper bounding isn't too hard, but lower bounding is difficult for real problems

1. If f(n) is O(g(n)) and g(n) is O(h(n)), then f(n) is O(h(n))
2. If f(n) is O(h(n)) and g(n) is O(h(n)), then f(n) + g(n) is O(h(n))
3. a·n^k is O(n^k) for any constant a
4. n^k is O(n^(k+j)) for any positive j
5. If f(n) is c·g(n) for a constant c, then f(n) is O(g(n))
6. log_a(n) is O(log_b(n)) for integers a and b > 1
7. log_a(n) is O(n^k) for integer a > 1 and real k > 0
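These rules compose. For example, here is a sketch of how they bound a typical polynomial running time, f(n) = 4n^2 + 7n + 12:

```latex
% Rule 3:            4n^2 is O(n^2)
% Rules 3, 4, and 1: 7n is O(n), n is O(n^2), so 7n is O(n^2)
% Rules 3, 4, and 1: 12 = 12 n^0 is O(1), and 1 is O(n^2), so 12 is O(n^2)
% Rule 2 (twice):    the sum is O(n^2)
4n^2 + 7n + 12 \;\text{ is }\; O(n^2)
```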

- How much time does a binary search take at most?
- What about at least?
- What about on average, assuming that the value is in the list?
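For reference while thinking about those questions, here is a minimal iterative binary search sketch over a sorted int array (a standard textbook version, not code from the course). Each pass halves the remaining range, which is what drives the at-most and at-least answers.

```java
public class BinarySearch {
    // Returns the index of target in the sorted array a, or -1 if it is absent.
    static int binarySearch(int[] a, int target) {
        int low = 0;
        int high = a.length - 1;
        while (low <= high) {
            int mid = low + (high - low) / 2;  // written this way to avoid int overflow
            if (a[mid] == target) {
                return mid;                    // best case: found on the first probe, O(1)
            } else if (a[mid] < target) {
                low = mid + 1;                 // discard the lower half
            } else {
                high = mid - 1;                // discard the upper half
            }
        }
        return -1;                             // worst case: about log2(n) iterations, O(log n)
    }

    public static void main(String[] args) {
        int[] sorted = {2, 3, 5, 7, 11, 13, 17};
        System.out.println(binarySearch(sorted, 11)); // 4
        System.out.println(binarySearch(sorted, 4));  // -1
    }
}
```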

- Give a tight bound for n^(n log n)
- Give a tight bound for 2^(n + a) where a is a constant
- Give functions f_1 and f_2 such that f_1(n) and f_2(n) are O(g(n)) but f_1(n) is not O(f_2(n))
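One possible set of answers for the last two prompts (there are many valid choices):

```latex
% 2^{n+a} = 2^a \cdot 2^n, and 2^a is just a constant factor, so
2^{n+a} \text{ is } \Theta(2^n)

% For the last prompt, one choice: g(n) = n^2, f_1(n) = n^2, f_2(n) = n.
% Both f_1 and f_2 are O(n^2), but f_1 is not O(f_2):
% no constant c makes n^2 \le c \cdot n hold for all large n.
```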

- Implementing an array-backed list
- Read section 1.3

- Finish Assignment 1
  - Due Friday by 11:59pm
- Keep working on Project 1
  - Due Friday, September 18 by 11:59pm