Week 2 - Friday CS221.


Last time

What did we talk about last time?
- Running time
- Big Oh notation

Questions?

Project 1

Assignment 1

Back to complexity

What's the running time?

    int count = 0;
    for (int i = 0; i < n; i += 2)
        for (int j = 0; j < n; j += 3)
            count++;
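As a sanity check, the iteration count can be measured empirically. This sketch (class and method names are mine, not from the course) counts the increments; the total works out to ⌈n/2⌉ ∙ ⌈n/3⌉, about n^2/6, so the constant factor shrinks but the loop is still O(n^2):

```java
public class PairedLoopCount {
    // Counts how many times count++ runs in the nested loop above.
    static long countOps(int n) {
        long count = 0;
        for (int i = 0; i < n; i += 2)
            for (int j = 0; j < n; j += 3)
                count++;
        return count;
    }

    public static void main(String[] args) {
        // i takes ceil(n/2) values and j takes ceil(n/3) values,
        // so the total is about n^2 / 6 -- a constant factor, still O(n^2).
        for (int n : new int[]{6, 12, 600})
            System.out.println(n + " -> " + countOps(n));
    }
}
```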

What's the running time?

    int count = 0;
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++) {
            if (j == n - 1)
                i = n;
            count++;
        }
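This one looks quadratic but isn't: the if sets i to n during the very first pass of the inner loop, so the outer loop never starts a second pass. A quick empirical check (names here are mine) shows the count is exactly n, i.e. the loop is O(n):

```java
public class EarlyExitLoopCount {
    // Counts how many times count++ runs in the loop above.
    static long countOps(int n) {
        long count = 0;
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++) {
                if (j == n - 1) i = n;  // kills the outer loop after one inner pass
                count++;
            }
        return count;
    }

    public static void main(String[] args) {
        // The inner loop runs to completion once (n increments), then i == n
        // fails the outer test, so the total is n rather than n^2.
        for (int n : new int[]{1, 10, 1000})
            System.out.println(n + " -> " + countOps(n));
    }
}
```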

Mathematical issues

- What's the running time to factor a large number N?
- How many edges are in a completely connected graph?
- If you have a completely connected graph, how many possible tours are there (paths that start at a given node, visit all other nodes, and return to the beginning)?
- How many different n-bit binary numbers are there?

Hierarchy of complexities

Here is a table of several different complexity measures, in ascending order, with their functions evaluated at n = 100:

    Description    Big Oh       f(100)
    Constant       O(1)         1
    Logarithmic    O(log n)     6.64
    Linear         O(n)         100
    Linearithmic   O(n log n)   664.39
    Quadratic      O(n^2)       10,000
    Cubic          O(n^3)       1,000,000
    Exponential    O(2^n)       1.27 x 10^30
    Factorial      O(n!)        9.33 x 10^157

What is log?

The log operator is short for logarithm. Taking the logarithm means de-exponentiating something:

    log10(10^7) = 7
    log10(10^x) = x

What's the log of 1,000,000?

What's the running time?

    int count = 0;
    for (int i = 1; i <= n; i *= 2)
        count++;
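Because i doubles each pass, this loop runs ⌊log2 n⌋ + 1 times for n ≥ 1. Counting it empirically (a sketch; names are mine):

```java
public class DoublingLoopCount {
    // Counts how many times count++ runs in the doubling loop above.
    static int countOps(int n) {
        int count = 0;
        for (int i = 1; i <= n; i *= 2)
            count++;
        return count;
    }

    public static void main(String[] args) {
        // i visits 1, 2, 4, ..., up to the largest power of two <= n,
        // so the count is floor(log2 n) + 1 -- logarithmic in n.
        for (int n : new int[]{1, 1000, 2048})
            System.out.println(n + " -> " + countOps(n));
    }
}
```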

Logarithms

Formal definition: if b^x = y, then logb(y) = x (for positive values of b). Think of it as a de-exponentiator.

Examples:
- log10(1,000,000) =
- log3(81) =
- log2(512) =

Log base 2

In the normal world, when you see a log without a subscript, it means the logarithm base 10: "What power do you have to raise 10 to to get this number?" In computer science, a log without a subscript usually means the logarithm base 2: "What power do you have to raise 2 to to get this number?"

    log2(2^8) = 8
    log2(2^y) = y

What's the log of 2048? (Assuming log base 2)

Log math

- logb(x∙y) = logb(x) + logb(y)
- logb(x/y) = logb(x) - logb(y)
- logb(x^y) = y∙logb(x)
- Base conversion: logb(x) = loga(x)/loga(b)

As a consequence, for constants c1, c2, c3:

    log2(n) = log10(n)/c1 = log100(n)/c2 = logb(n)/c3 for b > 1

so log2(n) is O(log10 n), O(log100 n), and O(logb n) for any b > 1.
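The base-conversion rule is easy to exercise in code, since Java's Math.log is the natural log. A minimal sketch (the helper name log is my own):

```java
public class LogBase {
    // Change of base: log_b(x) = ln(x) / ln(b)
    static double log(double base, double x) {
        return Math.log(x) / Math.log(base);
    }

    public static void main(String[] args) {
        System.out.println(log(2, 1024));  // approximately 10
        // log2(n) / log10(n) is the constant log2(10) for every n,
        // which is why the base of a logarithm is irrelevant inside Big Oh.
        for (double n : new double[]{100, 10_000, 1_000_000})
            System.out.println(log(2, n) / log(10, n));
    }
}
```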

More on log

Take the floor of a number's logarithm in a base and add one, and you'll get the number of digits you need to represent that number in that base. In other words, the log of a number is related to its length. Even big numbers have small logs. If there's no subscript, log10 is assumed in the math world, but log2 is assumed in CS. Also common is ln, the natural log, which is loge.
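The digit-count relationship, floor(log10 n) + 1 for n ≥ 1, can be checked directly (a sketch; the helper name is mine):

```java
public class DigitCount {
    // Decimal digits needed to write n (for n >= 1): floor(log10 n) + 1
    static int digits(long n) {
        return (int) Math.floor(Math.log10(n)) + 1;
    }

    public static void main(String[] args) {
        // Even a huge number has a small log: a trillion needs only 13 digits.
        System.out.println(digits(7));                  // 1
        System.out.println(digits(1000));               // 4
        System.out.println(digits(1_000_000_000_000L)); // 13
    }
}
```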

Log is awesome

The logarithm of a number is related to the number of digits you need to write it. That means that the log of a very large number is pretty small. An algorithm that runs in log n time is very fast.

    Number               log10   log2 (approx.)
    1,000                3       10
    1,000,000            6       20
    1,000,000,000        9       30
    1,000,000,000,000    12      40

Big Oh, Big Omega, Big Theta

Formal definition of Big Oh

Let f(n) and g(n) be two functions over integers. f(n) is O(g(n)) if and only if

    f(n) ≤ c∙g(n) for all n > N

for some positive real numbers c and N. In other words, past some arbitrary point, with some arbitrary scaling factor, c∙g(n) is always at least as big.

Different kinds of bounds

We've been sloppy so far, saying that something is O(n) when its running time is proportional to n. Big Oh is actually an upper bound, meaning that something whose running time is proportional to n:
- Is O(n)
- But is also O(n^2)
- And is also O(2^n)

If the running time of something is actually proportional to n, we should say it's Θ(n). We often use Big Oh because it's easier to find an upper bound than to get a tight bound.

All three are useful measures

- O establishes an upper bound: f(n) is O(g(n)) if there exist positive numbers c and N such that f(n) ≤ c∙g(n) for all n ≥ N
- Ω establishes a lower bound: f(n) is Ω(g(n)) if there exist positive numbers c and N such that f(n) ≥ c∙g(n) for all n ≥ N
- Θ establishes a tight bound: f(n) is Θ(g(n)) if there exist positive numbers c1, c2, and N such that c1∙g(n) ≤ f(n) ≤ c2∙g(n) for all n ≥ N

Tight bounds

O and Ω have a one-to-many relationship with functions:
- 4n^2 + 3 is O(n^2), but it is also O(n^3) and O(n^4 log n)
- 6n log n is Ω(n log n), but it is also Ω(n)

Θ is one-to-many as well, but it gives a much tighter bound. Sometimes it is hard to find Θ: upper bounding isn't too hard, but lower bounding is difficult for many real problems.
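To make the definition concrete, here is a numeric check of witnesses for "4n^2 + 3 is O(n^2)" (a sketch; the class name and the particular constants c = 5, N = 2 are my choices): c∙g(n) - f(n) = n^2 - 3, which is negative at n = 1 but nonnegative for every n ≥ 2.

```java
public class BigOhWitness {
    // True when 4n^2 + 3 <= 5*n^2, i.e. the Big Oh inequality
    // f(n) <= c*g(n) with c = 5 and g(n) = n^2.
    static boolean bounded(long n) {
        return 4 * n * n + 3 <= 5 * n * n;
    }

    public static void main(String[] args) {
        // The inequality fails below N = 2 and holds everywhere above it.
        System.out.println(bounded(1));   // false: 7 > 5
        for (long n = 2; n <= 1_000_000; n++)
            if (!bounded(n)) throw new AssertionError("bound fails at n = " + n);
        System.out.println("4n^2 + 3 <= 5n^2 for all 2 <= n <= 1,000,000");
    }
}
```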

Facts

- If f(n) is O(g(n)) and g(n) is O(h(n)), then f(n) is O(h(n))
- If f(n) is O(h(n)) and g(n) is O(h(n)), then f(n) + g(n) is O(h(n))
- a∙n^k is O(n^k) for any constant a
- n^k is O(n^(k+j)) for any positive j
- If f(n) = c∙g(n), then f(n) is O(g(n))
- loga(n) is O(logb(n)) for integers a and b > 1
- loga(n) is O(n^k) for integer a > 1 and real k > 0

Binary search example

- Implement binary search
- How much time does a binary search take at most?
- What about at least?
- What about on average, assuming that the value is in the list?
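One possible implementation of the binary search the slide asks for (a sketch, not the course's official solution): in the worst case each probe halves the remaining range, giving at most ⌊log2 n⌋ + 1 comparisons, so the running time is O(log n); in the best case the first probe hits the target.

```java
public class BinarySearch {
    // Returns an index of target in the sorted array a, or -1 if absent.
    static int search(int[] a, int target) {
        int lo = 0, hi = a.length - 1;
        while (lo <= hi) {
            int mid = lo + (hi - lo) / 2;  // written this way to avoid int overflow
            if (a[mid] == target)     return mid;
            else if (a[mid] < target) lo = mid + 1;  // target is in the right half
            else                      hi = mid - 1;  // target is in the left half
        }
        return -1;
    }

    public static void main(String[] args) {
        int[] primes = {2, 3, 5, 7, 11, 13};
        System.out.println(search(primes, 7));   // 3
        System.out.println(search(primes, 4));   // -1
    }
}
```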

Complexity practice

- Give a tight bound for n^1.1 + n log n
- Give a tight bound for 2^(n+a) where a is a constant
- Give functions f1 and f2 such that f1(n) and f2(n) are O(g(n)) but f1(n) is not O(f2(n))

Upcoming

Next time…

- ADTs
- Implementing an array-backed list
- Read section 1.3

Reminders

- Read section 1.3
- Finish Assignment 1: due Friday by 11:59pm
- Start Assignment 2: due next Friday, September 15 by 11:59pm
- Keep working on Project 1: due Friday, September 22 by 11:59pm