Computational Complexity


Computational Complexity CSC221 – Data Structures September 5, 2007

How to Determine Computational Complexity of an Algorithm

Step 1: Estimate the number of steps in the most computationally intensive parts of the program:
- Loops
- Recursion

(Note: we're estimating worst-case complexity.)
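As a rough sketch of this step-counting idea (the counter variable here is illustrative, not part of the slides), a single loop performs about n basic operations while a doubly nested loop performs about n²:

```python
# Sketch: count basic operations for a single loop vs. a nested loop.
# The explicit "steps" counter is only there to make the count visible.

def single_loop_steps(n):
    steps = 0
    for i in range(n):        # runs n times
        steps += 1            # one basic operation per iteration
    return steps              # ~n  -> O(n)

def nested_loop_steps(n):
    steps = 0
    for i in range(n):        # outer loop: n times
        for j in range(n):    # inner loop: n times per outer pass
            steps += 1
    return steps              # ~n^2 -> O(n^2)

print(single_loop_steps(100))   # 100
print(nested_loop_steps(100))   # 10000
```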

How to Determine Computational Complexity of an Algorithm

Step 2: Compare this number of steps to one of the "standard" functions that we use to describe computational complexity:

  some constant c   constant time
  log N             logarithmic
  N                 linear
  N log N           N times logarithmic
  N^2               quadratic
  N^3               cubic
  2^N               exponential
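To get a feel for how differently these standard functions grow, a small table of their values at a few input sizes (the choice of sizes here is arbitrary) can be printed:

```python
import math

# Sketch: evaluate the "standard" complexity functions at a few input sizes
# to see how quickly they separate from one another.
funcs = [
    ("log N",   lambda n: math.log2(n)),
    ("N",       lambda n: n),
    ("N log N", lambda n: n * math.log2(n)),
    ("N^2",     lambda n: n ** 2),
    ("N^3",     lambda n: n ** 3),
    ("2^N",     lambda n: 2 ** n),
]

for n in (10, 20, 40):
    row = ", ".join(f"{name}={f(n):.0f}" for name, f in funcs)
    print(f"N={n}: {row}")
```

Even at N = 40, the exponential 2^N is already around 10^12 while N^2 is only 1600, which is why the classification matters.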

How to Determine Computational Complexity of an Algorithm

Step 3: Describe the complexity as precisely as possible, using O, θ, o, or Ω. If T(N) is an estimate of the number of steps needed to execute your function and f(N) is a "standard" function against which you're comparing T(N), then:

T(N) is O(f(N)) means, loosely, that T(N) grows at about the same rate as or more slowly than f(N). ("more slowly" is better)
T(N) is θ(f(N)) means, loosely, that T(N) grows at about the same rate as f(N).
T(N) is o(f(N)) means, loosely, that T(N) grows strictly more slowly than f(N).
T(N) is Ω(f(N)) means, loosely, that T(N) grows at about the same rate as or more quickly than f(N).

Examples

What can you say about T(N) = 2N – 3?

T(N) is θ(N)      true
T(N) is O(N)      true
T(N) is Ω(N)      true
T(N) is o(N^2)    true
T(N) is Ω(2)      true
T(N) is O(N^2)    true
T(N) is Ω(N^2)    false
T(N) is o(2)      false

What is the most precise thing you can say about T(N) = 2N – 3? T(N) is θ(N) is the most precise statement. In practice, we often say T(N) is O(N), even though T(N) is θ(N) is more precise.

What do these statements mean, graphically? Plotted together, 2N – 3 grows at about the same rate as N.

What do these statements mean, graphically? Plotted together, N^2 grows faster than N.

How do you prove your claims, using the definitions of O, θ, Ω, and o? T(N) = O(f(N)) if there exist positive constants c and n0 such that T(N) ≤ c·f(N) when N ≥ n0. Say that T(N) is 2N – 3 and f(N) is N. Then the definition holds true for c = 2 and n0 = 1, since 2N – 3 ≤ 2N for all N ≥ 1. (Note: there isn't just one c or n0 that makes the definition true; you only need to show one such pair.)
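The definition can be sanity-checked numerically. The helper below (a hypothetical name, not from the slides) tests T(N) ≤ c·f(N) over a finite range of N, which is evidence for a choice of c and n0 but of course not a proof:

```python
# Sketch: numerically check the Big-O definition T(N) <= c*f(N) for N >= n0.
# Checking a finite range is supporting evidence, not a proof.

def holds_big_o(T, f, c, n0, up_to=10_000):
    """Return True if T(N) <= c*f(N) for every N in [n0, up_to]."""
    return all(T(N) <= c * f(N) for N in range(n0, up_to + 1))

T = lambda N: 2 * N - 3   # the example from the slide
f = lambda N: N

print(holds_big_o(T, f, c=2, n0=1))  # True: c = 2, n0 = 1 works
print(holds_big_o(T, f, c=1, n0=1))  # False: c = 1 fails once N > 3
```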

Show that N^2 + 2N is O(N^2) using the definition. The definition holds true for c = 2 and n0 = 2:

N^2 + 2N = 2^2 + 2·2 = 8   for N = 2
2·N^2    = 2·2^2     = 8   for N = 2
N^2 + 2N = 3^2 + 2·3 = 15  for N = 3
2·N^2    = 2·3^2     = 18  for N = 3
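Tabulating both sides of the inequality over a few values of N (the range below is arbitrary) shows exactly where the bound takes hold:

```python
# Sketch: compare N^2 + 2N against the bound 2*N^2 for small N.
# The bound fails at N = 1 and holds from N = 2 onward, matching n0 = 2.
for N in range(1, 6):
    T = N**2 + 2*N
    bound = 2 * N**2
    print(f"N={N}: N^2+2N={T}, 2N^2={bound}, bound holds: {T <= bound}")
```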

You can see that N^2 + 2N grows at about the same rate as or more slowly than 2·N^2. Choose an n0 at or after the point where the functions cross. After that point, the values of T(N) are no larger than those of c·f(N).

You can also prove your claim by using limits. Determine the limit of T(N)/f(N) as N → ∞.

If the limit is...        then...
0                         T(N) is o(f(N))
finite                    T(N) is O(f(N))
nonzero and finite        T(N) is θ(f(N))
nonzero (possibly ∞)      T(N) is Ω(f(N))

Say T(N) = 2N – 3 and f(N) = N. The limit of (2N – 3)/N as N → ∞ is 2: nonzero and finite. Therefore, 2N – 3 = θ(N). It is also true that 2N – 3 = O(N), and that 2N – 3 = Ω(N).
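The limit test above can be approximated numerically by evaluating the ratio T(N)/f(N) at increasingly large N; this is a rough sketch of the idea, not a substitute for taking the limit analytically:

```python
# Sketch: numerically estimate lim N->inf T(N)/f(N) for T(N) = 2N - 3, f(N) = N.
T = lambda N: 2 * N - 3
f = lambda N: N

for N in (10, 100, 1000, 1_000_000):
    print(f"N={N}: T(N)/f(N) = {T(N) / f(N)}")
# The ratio approaches 2, a nonzero finite value, consistent with T(N) = θ(N).
```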