
CSC 430: Basics of Algorithm Runtime Analysis

What is Runtime Analysis? Fundamentally, it is the process of counting the number of "steps" it takes for an algorithm to terminate.
"Step" = primitive computer operation:
– Arithmetic
– Read/write memory
– Make a comparison
– …
The number of steps needed depends on:
– The fundamental algorithmic structure
– The data structures you use
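As a tiny illustration of step counting (a sketch; exactly which operations count as one "step" is a modeling choice):

    def sum_list(L):
        total = 0             # 1 write
        for x in L:           # n iterations
            total = total + x # 1 read + 1 add + 1 write per iteration
        return total          # 1 step

    # Roughly 3n + 2 primitive steps on a list of length n. The exact
    # constant depends on the step model, which is one motivation for
    # the asymptotic analysis introduced below.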

Historical Reasons to Care about Runtime Analysis
"As soon as an Analytic Engine exists, it will necessarily guide the future course of the science. Whenever any result is sought by its aid, the question will arise - By what course of calculation can these results be arrived at by the machine in the shortest time?" - Charles Babbage (1864)
[Figure: Analytic Engine (schematic)]
The earliest computational problems were physical: what's the fastest way to reconfigure a train using two turntables?
When first invented, computers were rare (and quite slow); use-time had to be scheduled. This held true until about 1990.
(Aside: the Apple rounded-rectangle story.)

Why YOU Should Care
It's no longer about speed. A prediction: by 2020, all new personal computers will contain more than 32 microprocessors.
What is it about?
– Google
– Wikipedia
– Facebook
– YouTube
[Figure: Moore's Law (2008); image licensed under Creative Commons: Moore%27s_Law_-_2008.svg]

Other Types of Algorithm Analysis
– Algorithm space analysis
– Approximate-solution analysis
– Parallel algorithm analysis
– A PROBLEM'S run-time analysis
– Complexity class analysis

Runtime Motivates Design
The easiest algorithm to implement for any problem: brute force.
– Check every possible solution.
– Typically takes 2^N time or worse for inputs of size N.
– Unacceptable in practice.
Usually, through insight, we can do much better. What do we mean by "better"?

Worst-Case Analysis
Best-case running time:
– Rarely used; often superfluous.
Average-case running time:
– Obtain a bound on the running time of the algorithm on random input, as a function of input size N.
– Hard (or impossible) to accurately model real instances by random distributions.
– Acceptable in certain, easily-justified scenarios.
Worst-case running time:
– Obtain a bound on the largest possible running time.
– Generally captures efficiency in practice.
– A pessimistic view, but hard to find an effective alternative.

Polynomial Time
Desirable scaling property: when the input size increases (e.g., if it doubles), the algorithm should only slow down by some constant factor C.
Poly-time algorithms satisfy this scaling property.
Def. An algorithm runs in polynomial time if there exist constants c > 0 and d > 0 such that on every input of size N, its running time is bounded by c·N^d steps.
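A quick worked check of the scaling property (a sketch using the definition above): if T(N) = c·N^d, then T(2N) = c·(2N)^d = 2^d · c·N^d = 2^d · T(N), so doubling the input multiplies the running time by the constant factor C = 2^d, independent of N. By contrast, for T(N) = 2^N, doubling the input squares the running time.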

Defining Efficiency: Worst-Case Polynomial-Time
Def. An algorithm is efficient if its worst-case running time is polynomial or better.
Not always true, but works in practice:
– 6.02 × 10^23 · N^20 is technically poly-time.
– Breaking through the exponential barrier of brute force typically exposes some crucial structure of the problem.
– Most poly-time algorithms that people develop have low constants and low exponents (e.g., N, N^2, N^3).
Exceptions:
– Sometimes the problem size demands greater efficiency than poly-time.
– Some poly-time algorithms do have high constants and/or exponents, and are useless in practice.
– Some exponential-time (or worse) algorithms are widely used because the worst-case instances seem to be rare (e.g., the simplex method, Unix grep).

Ignorance is NOT Bliss
Lesson: know the running time of your algorithms!

2.2 Asymptotic Order of Growth

What is Asymptotic Analysis?
Instead of giving a precise runtime, we'll show that algorithms are bounded within a constant factor of common functions.
Three types of bounds:
– Upper bounds (never greater)
– Lower bounds (never less)
– Tight bounds (upper and lower are the same)

Growth Notation
Upper bounds (Big-Oh):
– T(n) ∈ O(f(n)) if there exist constants c > 0 and n₀ ≥ 0 such that for all n ≥ n₀ we have T(n) ≤ c·f(n).
Lower bounds (Omega):
– T(n) ∈ Ω(f(n)) if there exist constants c > 0 and n₀ ≥ 0 such that for all n ≥ n₀ we have T(n) ≥ c·f(n).
Tight bounds (Theta):
– T(n) ∈ Θ(f(n)) if T(n) is both O(f(n)) and Ω(f(n)).
Frequently abused notation: T(n) = O(f(n)).
– It is asymmetric: for f(n) = 5n^3 and g(n) = 3n^2, we get f(n) = O(n^3) = g(n), but f(n) ≠ g(n).
– Better notation: T(n) ∈ O(f(n)).
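A quick worked instance of the Big-Oh definition (the example values are mine): take T(n) = 3n^2 + 5n + 2. For all n ≥ 1 we have 5n ≤ 5n^2 and 2 ≤ 2n^2, so T(n) ≤ 10n^2; the constants c = 10 and n₀ = 1 witness T(n) ∈ O(n^2).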

Use-Cases
Big-Oh:
– "The algorithm's runtime is O(N lg N), a potential improvement over prior approaches, which were O(N^2)."
Omega:
– "Their algorithm's runtime is Ω(N^2), which is worse than our algorithm's proven O(N^1.5) runtime."
Theta:
– "Our original algorithm was Ω(N lg N) and O(N^2). With modifications, we were able to improve the runtime bounds to Θ(N^1.5)."

Properties
Transitivity:
– If f ∈ O(g) and g ∈ O(h), then f ∈ O(h).
– If f ∈ Ω(g) and g ∈ Ω(h), then f ∈ Ω(h).
– If f ∈ Θ(g) and g ∈ Θ(h), then f ∈ Θ(h).
Additivity:
– If f ∈ O(h) and g ∈ O(h), then f + g ∈ O(h).
– If f ∈ Ω(h) and g ∈ Ω(h), then f + g ∈ Ω(h).
– If f ∈ Θ(h) and g ∈ O(h), then f + g ∈ Θ(h).
– These generalize to any number of terms.
"Symmetry":
– If f ∈ Θ(h), then h ∈ Θ(f).
– If f ∈ O(h), then h ∈ Ω(f).
– If f ∈ Ω(h), then h ∈ O(f).
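A small worked instance of the third additivity rule (the example functions are mine): with f(n) = n^3 ∈ Θ(n^3) and g(n) = 5n ∈ O(n^3), the sum f(n) + g(n) = n^3 + 5n is in Θ(n^3): the n^3 term alone supplies the lower bound, and additivity of O supplies the upper bound.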

Asymptotic Bounds for Some Common Functions
Polynomials. a₀ + a₁n + … + a_d·n^d is Θ(n^d) if a_d > 0.
Polynomial time. Running time is O(n^d) for some constant d independent of the input size n.
Logarithms. O(log_a n) = O(log_b n) for any constants a, b > 0, so we can avoid specifying the base.
Logarithms. For every x > 0, log n = O(n^x): log grows slower than every polynomial.
Exponentials. For every r > 1 and every d > 0, n^d = O(r^n): every exponential grows faster than every polynomial.
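The base-independence of logarithms follows from the change-of-base identity log_a n = log_b n / log_b a: the two functions differ only by the constant factor 1 / log_b a. For instance, log_2 n = log_10 n / log_10 2 ≈ 3.32 · log_10 n.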

Proving Bounds
Use transitivity:
– If an algorithm is at least as fast as another, then it shares that algorithm's upper bound.
Use additivity:
– f(n) = n^3 + n^2 + n, so f(n) ∈ Θ(n^3).
Be direct:
– Start with the definition and provide the constants c and n₀ it requires.
Take the logarithm of both functions.
Use limits:
– If lim(n→∞) f(n)/g(n) exists and is finite, then f ∈ O(g); if the limit is a finite positive constant, then f ∈ Θ(g).
Keep in mind that sometimes functions can't be bounded. (But I won't assign you any like that!)
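A worked limit example (the specific functions are mine): for f(n) = n^3 + n^2 + n and g(n) = n^3, lim(n→∞) f(n)/g(n) = lim(n→∞) (1 + 1/n + 1/n^2) = 1, a finite positive constant, so f ∈ Θ(n^3), matching the additivity argument above.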

Practice

2.4 A Survey of Common Running Times

Linear Time: O(n)
Linear time. The running time is at most a constant factor times the size of the input.
Computing the maximum. Compute the maximum of n numbers a₁, …, aₙ:

    def find_max(L):
        # one pass; each element triggers at most one comparison
        best = L[0]
        for x in L:
            if x > best:
                best = x
        return best

Linear Time: O(n)
Claim. Merging two sorted lists of size n takes O(n) time.
Pf. After each comparison, the length of the output list increases by 1.

    # the merge algorithm combines two sorted lists
    # into one larger sorted list
    def merge(A, B):
        i = j = 0
        M = []
        while i < len(A) and j < len(B):
            if A[i] <= B[j]:
                M.append(A[i])
                i += 1
            else:
                M.append(B[j])
                j += 1
        if i == len(A):        # A exhausted: append the rest of B
            return M + B[j:]
        return M + A[i:]       # B exhausted: append the rest of A
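For example, with the version above, merge([1, 3, 5], [2, 4, 6]) returns [1, 2, 3, 4, 5, 6] after at most len(A) + len(B) - 1 comparisons, since each comparison appends one element and the loop stops once either list is exhausted.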

O(n log n) Time
O(n log n) time arises in divide-and-conquer algorithms.
Sorting. Mergesort and heapsort are sorting algorithms that perform O(n log n) comparisons.
Largest empty interval. Given n time-stamps x₁, …, xₙ on which copies of a file arrive at a server, what is the largest interval of time when no copies of the file arrive?
O(n log n) solution. Sort the time-stamps. Scan the sorted list in order, identifying the maximum gap between successive time-stamps.
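A minimal Python sketch of this solution (the function name is mine, not from the slides):

    def largest_gap(timestamps):
        # sorting dominates: O(n log n); the scan below is O(n)
        ts = sorted(timestamps)
        best = 0
        for i in range(1, len(ts)):
            best = max(best, ts[i] - ts[i - 1])
        return best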

Quadratic Time: O(n^2)
Quadratic time. Enumerate all pairs of elements.
Closest pair of points. Given a list of n points in the plane (x₁, y₁), …, (xₙ, yₙ), find the pair that is closest.

    def closest_pair(L):
        p1, p2 = L[0], L[1]
        # no need to compute the sqrt for the distance
        best = (p1[0] - p2[0])**2 + (p1[1] - p2[1])**2
        for i in range(len(L)):
            for j in range(i + 1, len(L)):
                p1, p2 = L[i], L[j]
                d = (p1[0] - p2[0])**2 + (p1[1] - p2[1])**2
                if d < best:
                    best = d
        return best

Remark. Θ(n^2) seems inevitable, but this is just an illusion (see Chapter 5).

Cubic Time: O(n^3)
Cubic time. Enumerate all triples of elements.
Set disjointness. Given n sets S₁, …, Sₙ, each of which is a subset of {1, 2, …, n}, is there some pair of them that is disjoint?

    def find_disjoint_sets(S):
        for i in range(len(S)):
            for j in range(i + 1, len(S)):
                for p in S[i]:
                    if p in S[j]:
                        break
                else:                  # no p in S[i] belongs to S[j]
                    return S[i], S[j]  # return the disjoint pair
        return None                    # all pairs of sets overlap
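A note on the O(n^3) count (my observation, not from the slides): it assumes the membership test p in S[j] costs O(1), which holds in expectation when each S[i] is a Python set (or a bit vector indexed by 1, …, n). If the collections were plain lists, each test would cost O(n) and the triple loop would be O(n^4).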

Polynomial Time: O(n^k)
Independent set of size k (where k is a constant). Given a graph, are there k nodes such that no two are joined by an edge?
O(n^k) solution. Enumerate all subsets of k nodes.
– Checking whether a subset S is an independent set takes O(k^2) time.
– The number of k-element subsets is C(n, k) = n(n-1)…(n-k+1)/k! ≤ n^k/k!.
– Total: O(k^2 · n^k / k!) = O(n^k).
This is poly-time for k = 17, but not practical.

    def independent_k(V, E, k):
        # subset_k and independent omitted for space
        for sub in subset_k(V, k):
            if independent(sub, E):
                return sub
        return None
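The slides omit subset_k and independent; here is a minimal sketch of what they could look like (these helper bodies are my assumption, written so that independent_k above actually runs; E is assumed to be a collection of 2-tuples of nodes):

    from itertools import combinations

    def subset_k(V, k):
        # all k-element subsets of the node list: C(n, k) of them
        return combinations(V, k)

    def independent(sub, E):
        # O(k^2) pair checks: no edge may join two chosen nodes
        for u, v in combinations(sub, 2):
            if (u, v) in E or (v, u) in E:
                return False
        return True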

Exponential Time
Independent set. Given a graph, what is the maximum size of an independent set?
O(2^n) solution. Enumerate all subsets.

    def max_independent(V, E):
        # try sizes 1, 2, ... until some size k has no independent set
        for k in range(1, len(V) + 1):
            if not independent_k(V, E, k):
                return k - 1
        return len(V)  # the whole vertex set is independent
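Why this is exponential (my arithmetic, not from the slides): summing over all sizes k, the enumeration examines Σ(k=0..n) C(n, k) = 2^n subsets in the worst case, and each independence check costs at most O(n^2), for O(n^2 · 2^n) total.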