Algorithm Complexity & Big-O Notation: From the Basics
CompSci Club, 29 May 2014

History -- Number Theory
Edmund Landau (1877–1938), German mathematician; doctoral supervisor: Frobenius
Worked on Dirichlet series and number theory – over 250 papers, a simplified proof of the Prime Number Theorem, and contributions to algebraic number fields
Studied the *asymptotic behavior of functions*; O is for Order

History -- Application to CS
Big-O notation is used to study the performance and complexity of algorithms in computer science:
– Execution time T(n)
– Memory usage (hard drive, network use, etc.)
Performance – what are these quantities for a given input?
Complexity – how does execution time change as the amount of data grows?
Amortized analysis – bounding the total cost of a worst-case sequence of operations and expressing it with big-O notation to determine complexity

Definition & Notations, I
1. If there exist numbers N and c such that
   f(x) ≤ c·g(x) for all x > N,
then we can write f(x) ∈ O(g(x)).
– In the algorithm examples that follow, N is the problem size: input size, list size, etc.
– Different algorithms give different growth functions: O(N³), O(a^N), O(log N)
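
As a quick check of the definition (a worked example of my own, not from the original slides), take f(x) = 3x² + 10x and g(x) = x²:

$$ 3x^2 + 10x \;\le\; 3x^2 + x^2 \;=\; 4x^2 \quad \text{for all } x > 10, $$

so the definition is satisfied with c = 4 and N = 10, and therefore 3x² + 10x ∈ O(x²).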

Definitions & Notations, II
f(x) ∈ O(g(x)): f(x) ≤ c·g(x) (upper bound)
f(x) ∈ Θ(g(x)): c₁·g(x) ≤ f(x) ≤ c₂·g(x) (tight bound)
f(x) ∈ Ω(g(x)): f(x) ≥ c·g(x) (lower bound)
(each holding for some positive constant(s) and all sufficiently large x)
*Note: many texts write these with an 'equals' sign, e.g. f(x) = O(g(x)), though many mathematicians (myself included) find that notation inadequate: 'equals' suggests a relation that can be read in both directions, which is not true here.
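
A small example of my own to show how the three relate, using f(x) = x² + x:

$$ x^2 + x \in O(x^2), \qquad x^2 + x \in \Omega(x^2), \qquad \text{hence } x^2 + x \in \Theta(x^2), $$
$$ \text{while } x^2 + x \in O(x^3) \text{ as well, but } x^2 + x \notin \Theta(x^3). $$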

For Those of You in Calc Class…
2. If we know that
   lim (x→∞) f(x)/g(x) = 0,
…then f(x) = o(g(x)).
o However! This is actually little-oh notation, a stricter condition than Big-Oh notation.
o Equivalently: for every value of c > 0 there exists a number N such that f(x) < c·g(x) for all x > N.
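
For example (my own, not from the slides), x = o(x²), because

$$ \lim_{x \to \infty} \frac{x}{x^2} = \lim_{x \to \infty} \frac{1}{x} = 0. $$

It follows that x ∈ O(x²) as well, but not the other way around: x² is neither o(x) nor O(x).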

Example Problems
You may notice that, in the coming examples, constant factors don't end up mattering very much. The Department of Computer Science at the University of Wisconsin–Madison describes and proves this style of complexity evaluation.
1. Summing up the times of each statement:

public void testComplexity () {
    statement1;
    statement2;
    …
}
= O(1)

The total time is solely a function of the (fixed) number of statements, not of the input size.
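
A minimal runnable sketch of the same idea (the class and method names are my own, not from the slides): a fixed sequence of statements takes the same time no matter how large the input is.

public class ConstantTimeExample {
    // Three statements run regardless of the array's length, so the
    // method is O(1) even though the input can be arbitrarily large.
    static int firstPlusLast(int[] a) {
        int first = a[0];
        int last = a[a.length - 1];
        return first + last;
    }

    public static void main(String[] args) {
        System.out.println(firstPlusLast(new int[] {3, 1, 4, 1, 5})); // prints 8
    }
}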

Some More Complicated Examples
2. A single for-loop: the running time is proportional to the loop's upper bound = O(N)
3. Nested for-loops, each counter starting at 0: the running time is proportional to the product of the bounds = O(N*M), or O(N²) if N = M
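
A small counting sketch of my own (N and M are arbitrary values I chose) that makes these step counts concrete:

public class LoopCounting {
    public static void main(String[] args) {
        int N = 1000, M = 500;
        long steps = 0;

        // Single loop: the body runs N times -> O(N)
        for (int i = 0; i < N; i++) {
            steps++;
        }

        // Nested loops: the inner body runs N * M times -> O(N*M),
        // which is O(N^2) when N == M.
        for (int i = 0; i < N; i++) {
            for (int j = 0; j < M; j++) {
                steps++;
            }
        }

        System.out.println(steps); // N + N*M = 501000
    }
}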

Some More Complicated Examples

for (int k = 0; k < N; k++) {
    for (int j = k; j < N; j++) {
        statements;
    }
}

The inner loop runs N - k times, so the total number of iterations is
1 + 2 + … + N = N(N+1)/2 = (1/2)(N² + N) = O(N²)

Some Practice Problems
What is the worst-case complexity of each of the following code fragments?
Two loops in a row:

for (i = 0; i < N; i++) {
    sequence of statements
}
for (j = 0; j < M; j++) {
    sequence of statements
}

How would the complexity change if the second loop went to N instead of M? (See the counting sketch after the next slide.)

Some Practice Problems
A nested loop followed by a non-nested loop:

for (i = 0; i < N; i++) {
    for (j = 0; j < N; j++) {
        sequence of statements
    }
}
for (k = 0; k < N; k++) {
    sequence of statements
}

A nested loop in which the number of times the inner loop executes depends on the value of the outer loop index:

for (i = 0; i < N; i++) {
    for (j = N; j > i; j--) {
        sequence of statements
    }
}
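
If you want to check your answers, here is a counting harness of my own (not part of the original practice set); the comments give the resulting step counts and bounds.

public class PracticeCheck {
    public static void main(String[] args) {
        int N = 100, M = 50;

        // Two loops in a row: N + M steps -> O(N + M);
        // if the second loop also ran to N, this would be 2N steps -> O(N).
        long a = 0;
        for (int i = 0; i < N; i++) a++;
        for (int j = 0; j < M; j++) a++;
        System.out.println(a); // 150

        // Nested loop followed by a non-nested loop: N*N + N steps -> O(N^2)
        long b = 0;
        for (int i = 0; i < N; i++)
            for (int j = 0; j < N; j++) b++;
        for (int k = 0; k < N; k++) b++;
        System.out.println(b); // 10100

        // Inner bound depends on the outer index:
        // N + (N-1) + ... + 1 steps -> O(N^2)
        long c = 0;
        for (int i = 0; i < N; i++)
            for (int j = N; j > i; j--) c++;
        System.out.println(c); // 5050
    }
}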

When Does a Constant Matter?
One can study time functions with greater specificity when the differences in complexity are smaller:
T₁(N) = k·N;  T₂(N) = a·N^b, with b > 1
Even if k is much larger than a, T₁ becomes more efficient than T₂ once the input size N is large enough.
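
A small demonstration of my own, with arbitrary constants k = 100 and a = 0.5: the quadratic T₂ is cheaper for small N, but the linear T₁ wins for every N above k/a = 200.

public class CrossoverDemo {
    public static void main(String[] args) {
        double k = 100.0; // T1(N) = k * N     (linear, large constant)
        double a = 0.5;   // T2(N) = a * N^2   (quadratic, small constant)
        for (long N = 1; N <= 1024; N *= 2) {
            double t1 = k * N;
            double t2 = a * N * N;
            System.out.println("N=" + N + "  T1=" + t1 + "  T2=" + t2
                    + (t1 < t2 ? "  (T1 cheaper)" : "  (T2 cheaper)"));
        }
        // Crossover: k*N < a*N^2 exactly when N > k/a = 200.
    }
}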

Some Well-known Algorithms & Their Complexities
From Wikipedia:
– Constant time: looking up the size of an array = O(1)
– Logarithmic: the binary search algorithm = O(log N)
– Quadratic: bubble sort & insertion sort = O(N²)
…and these can all be verified by hand (binary search, list size, insertion sort, etc.)
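
As one such hand-verification, here is a minimal iterative binary search sketch of my own; it halves the remaining range on every step, which is where the O(log N) bound comes from.

public class BinarySearchDemo {
    // Returns the index of target in the sorted array a, or -1 if absent.
    // Each iteration halves the search range, so it takes O(log N) steps.
    static int binarySearch(int[] a, int target) {
        int lo = 0, hi = a.length - 1;
        while (lo <= hi) {
            int mid = lo + (hi - lo) / 2;
            if (a[mid] == target) return mid;
            if (a[mid] < target) lo = mid + 1;
            else hi = mid - 1;
        }
        return -1;
    }

    public static void main(String[] args) {
        int[] sorted = {1, 3, 5, 7, 9, 11};
        System.out.println(binarySearch(sorted, 7)); // prints 3
    }
}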

Sources Cited
– Wikipedia: complexities of well-known algorithms
– Department of Computer Science, University of Wisconsin–Madison: complexity analysis notes