Asymptotic Analysis CSE 331

Definition of Efficiency An algorithm is efficient if, when implemented, it runs quickly on real input instances.
– Implemented where? We want a platform-independent definition.
– What are real instances? We use worst-case inputs.
– Efficient in terms of what? The input size N (N = 2n² for the Stable Matching Problem).

Definition-II An algorithm is efficient if it is analytically better than brute-force search (which takes n! time for the Stable Matching Problem).
– How much better? By a factor of 2?

Definition-III The measure of efficiency should scale with input size: if N increases by a constant factor, so should the measure. This is polynomial running time: at most c·N^d steps, where c > 0 and d > 0 are absolute constants and a step is a “primitive computational step”.
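
A quick worked check of the scaling claim (added here, not from the original slides): if the running time is at most c·N^d and the input size grows from N to 2N, the bound becomes c·(2N)^d = 2^d·c·N^d. The running time therefore grows by at most the constant factor 2^d, independent of N, which is exactly the scaling behavior the definition asks for.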

Which one is better?

Now?

And now?

The actual run times: n!, 100n², and n² (asymptotic view).

Asymptotic Analysis (Travelling Salesman Problem)

Asymptotic Notation O is similar to ≤, Ω is similar to ≥, and Θ is similar to =.

g(n) is O(f(n)) if there are constants c > 0 and n₀ such that g(n) ≤ c·f(n) for all n ≥ n₀ (the slide shows a plot of g(n) lying below c·f(n) beyond n₀).

g(n) = pn² + qn + r Is g(n) in O(n³)? Yes. Is g(n) in O(n²)? Yes. Is g(n) in O(n)? No.
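
As a worked justification of the O(n²) answer (added here, not on the slide, and assuming the coefficients satisfy p, q, r ≥ 0): for all n ≥ 1, g(n) = pn² + qn + r ≤ pn² + qn² + rn² = (p + q + r)·n², so the definition of O(n²) holds with c = p + q + r and n₀ = 1. Since n² ≤ n³ for n ≥ 1, the same constants witness O(n³). O(n) fails because g(n)/n ≥ p·n grows without bound.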

g(n) is Ω(f(n)) if there are constants ε > 0 and n₁ such that g(n) ≥ ε·f(n) for all n ≥ n₁ (the slide shows a plot of g(n) lying above ε·f(n) beyond n₁).

g(n) = pn² + qn + r Is g(n) in Ω(n³)? No. Is g(n) in Ω(n²)? Yes. Is g(n) in Ω(n)? Yes.
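
A matching lower-bound calculation (again added here, assuming p > 0 and q, r ≥ 0): g(n) = pn² + qn + r ≥ p·n² for all n ≥ 0, so Ω(n²) holds with ε = p; since n² ≥ n for n ≥ 1, Ω(n) follows as well. Ω(n³) fails because g(n)/n³ → 0 as n → ∞, so no constant ε > 0 can work.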

g(n) is Θ(f(n)) if g(n) is both O(f(n)) and Ω(f(n)), i.e., ε·f(n) ≤ g(n) ≤ c·f(n) for all n ≥ n₀, for some constants c ≥ ε > 0 (the slide shows g(n) sandwiched between ε·f(n) and c·f(n) beyond n₀).

g(n) = pn² + qn + r Is g(n) in Θ(n³)? No. Is g(n) in Θ(n²)? Yes. Is g(n) in Θ(n)? No.

Properties of asymptotic notation (apply to O, Ω, and Θ)
– Transitivity: if g is O(h) and h is O(f), then g is O(f).
– Sum of functions (sequential code): if g is O(f) and h is O(f), then g + h is O(f).
– Multiplication of functions (nested code): if g is O(f₁) and h is O(f₂), then g·h is O(f₁·f₂).
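
Not from the slides, but a minimal Python sketch of how the sum and multiplication rules show up in code: sequential blocks add their costs, nested blocks multiply them. The function names and contents are made up for illustration.

```python
def sequential_example(items):
    """Two O(n) passes in sequence: O(n) + O(n) is still O(n) (sum rule)."""
    total = 0
    for x in items:        # first pass: O(n)
        total += x
    count = 0
    for x in items:        # second pass: O(n)
        if x > 0:
            count += 1
    return total, count

def nested_example(items):
    """An O(n) loop nested inside an O(n) loop: O(n) * O(n) = O(n^2) (multiplication rule)."""
    pairs = 0
    for x in items:            # outer loop: O(n) iterations
        for y in items:        # inner loop: O(n) work per outer iteration
            if x + y == 0:
                pairs += 1
    return pairs
```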

Calculus for Asymptotic Analysis (comparing f and g via the limit of f(n)/g(n) as n → ∞)
– If f(n)/g(n) → c for some constant c < ∞, then f(n) is O(g(n)).
– If f(n)/g(n) → c for some constant c > 0, or f(n)/g(n) → ∞, then f(n) is Ω(g(n)).
– If f(n)/g(n) → c for some constant 0 < c < ∞, then f(n) is Θ(g(n)).

Calculus for Asymptotic Analysis f(n) is O(n³), f(n) is Θ(n²), and f(n) is Ω(n).
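
As an illustration of how the limit test produces these three answers (using the earlier polynomial as a stand-in, which may not be the function on the original slide), take g(n) = pn² + qn + r with p > 0: g(n)/n³ → 0, so g(n) is O(n³); g(n)/n² → p with 0 < p < ∞, so g(n) is Θ(n²); g(n)/n → ∞, so g(n) is Ω(n).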

Homework 1.2 You have the basic tools of asymptotic analysis. Can you apply them to more complicated equations?

Asymptotic Analysis for Algorithms T(n) = the maximum number of steps taken by the algorithm over all inputs of size n (worst-case runtime).
If T(n) < u(n) for all n > n₀, then T(n) is in O(u(n))
– Need to show that the number of steps taken is less than u(n) for every input of size n > n₀.
If T(n) > L(n) for all n > n₀, then T(n) is in Ω(L(n))
– Need to show that, for every n > n₀, there exists at least one input of size n that takes more than L(n) steps.
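
A small, self-contained illustration (not from the slides) of arguing both directions for a concrete algorithm, linear search:

```python
def linear_search(arr, key):
    """Return the index of key in arr, or -1 if absent.

    Worst-case analysis of T(n) for n = len(arr):
    - Upper bound: the loop body runs at most n times for ANY input,
      so T(n) <= c*n for some constant c, i.e., T(n) is O(n).
    - Lower bound: if key is not in arr, the loop runs exactly n times,
      so for every n there is an input forcing at least n iterations,
      i.e., T(n) is Omega(n).
    Together, the worst-case running time is Theta(n).
    """
    for i, x in enumerate(arr):
        if x == key:
            return i
    return -1
```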

Asymptotic Analysis for Problems To prove that a problem is O(f(n)):
– Provide an algorithm that solves the problem with a T(n) that is O(f(n)).
– The problem is then upper bounded by f(n).
Example: the Gale-Shapley algorithm proves that the Stable Matching Problem is O(n²)
– The Stable Matching Problem has an upper bound of n².
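
For concreteness, a compact Python sketch of Gale-Shapley in the “men propose” form (the data layout and variable names are my own choices, not the slides'). The key point for the upper bound is that each man proposes to each woman at most once, so the while loop runs at most n² times.

```python
def gale_shapley(men_prefs, women_prefs):
    """men_prefs[m] and women_prefs[w] are preference lists over range(n),
    most preferred first. Returns wife[m] for each man m.

    The while loop executes at most n*n times: each iteration consumes one
    (man, woman) proposal, and each pair is proposed at most once -- this
    gives the O(n^2) upper bound for the Stable Matching Problem."""
    n = len(men_prefs)
    # rank[w][m] = position of man m in woman w's list (lower = preferred)
    rank = [{m: i for i, m in enumerate(women_prefs[w])} for w in range(n)]
    next_proposal = [0] * n          # next index in each man's list to try
    wife = [None] * n
    husband = [None] * n
    free_men = list(range(n))
    while free_men:                  # at most n^2 iterations in total
        m = free_men.pop()
        w = men_prefs[m][next_proposal[m]]
        next_proposal[m] += 1
        if husband[w] is None:                    # w is free: engage
            husband[w], wife[m] = m, w
        elif rank[w][m] < rank[w][husband[w]]:    # w prefers m: switch
            free_men.append(husband[w])
            wife[husband[w]] = None
            husband[w], wife[m] = m, w
        else:                                     # w rejects m
            free_men.append(m)
    return wife
```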

Asymptotic Analysis for Problems To prove that a problem is Ω(f(n)):
– Prove that no algorithm solving the problem can have a T(n) that grows more slowly than f(n), i.e., every correct algorithm has a T(n) that is Ω(f(n)).
– The problem is then lower bounded by f(n).
Proving a lower bound is much more difficult than proving an upper bound!
– No non-trivial lower bounds are known for cryptographic primitives.
– P vs. NP could be settled by proving an exponential lower bound for any NP-complete problem.
When the proven upper and lower bounds match, the bounds are tight and we get to use Θ
– In principle the true complexity of a problem gives matching upper and lower bounds, but we cannot always prove them.
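
A small concrete example of a problem lower bound (my example, not from the slides): finding the maximum of n unsorted numbers is Ω(n). Any algorithm that halts without examining some entry aᵢ can be fooled by an adversary who raises aᵢ above every value the algorithm did read, so every correct algorithm must examine all n entries, giving T(n) ≥ n.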

Asymptotic Questions?