Big-O: Analyzing Algorithms Asymptotically. CS 226, 5 February 2002. Noah Smith.


Comparing Algorithms
Should we use Program 1 or Program 2? Is Program 1 “fast”? “Fast enough”?
[Graph: running times of P1 and P2 plotted against input size.]

You and Igor: the empirical approach
Implement each candidate. Run it. Time it. That could be lots of work, and it's error-prone. Which inputs? What machine/OS?

Toward an analytic approach … Today is just math: How to solve “which algorithm” problems without machines, test data, or Igor!

The Big Picture
[Diagram: inputs of size n = 3, 4, and 8 flow into the Algorithm, producing outputs.]
How long does it take for the algorithm to finish?

Primitives
Primitive operations:
- x = 4 (assignment)
- ... x ... (arithmetic)
- if (x < y) ... (comparison)
- x[4] (index an array)
- *x (dereference, in C)
- x.foo() (calling a method)
Others:
- new/malloc (memory usage)

How many foos?
for (j = 1; j <= N; ++j) { foo(); }
Σ_{j=1}^{N} 1 = N

How many foos?
for (j = 1; j <= N; ++j) { for (k = 1; k <= M; ++k) { foo(); } }
Σ_{j=1}^{N} Σ_{k=1}^{M} 1 = NM

How many foos?
for (j = 1; j <= N; ++j) { for (k = 1; k <= j; ++k) { foo(); } }
Σ_{j=1}^{N} Σ_{k=1}^{j} 1 = Σ_{j=1}^{N} j = N(N + 1)/2

How many foos?
for (j = 0; j < N; ++j) { for (k = 0; k < j; ++k) { foo(); } }
With the 0-based, strict bounds the count is 0 + 1 + … + (N - 1) = N(N - 1)/2 calls; still quadratic.
for (j = 0; j < N; ++j) { for (k = 0; k < M; ++k) { foo(); } }
NM calls.

How many foos?
void foo(int N) { if (N <= 2) return; foo(N / 2); }
T(0) = T(1) = T(2) = 1
T(n) = 1 + T(n/2) if n > 2
T(n) = 1 + (1 + T(n/4)) = 2 + T(n/4) = 2 + (1 + T(n/8)) = 3 + T(n/8) = 3 + (1 + T(n/16)) = 4 + T(n/16) = … ≈ log₂ n

The trick
a·n^k + b·n^(k-1) + … + y·n + z   →   a·n^k   →   n^k
Keep only the dominant term, then drop its constant coefficient.

Big O
Definition: Let f and g be functions mapping N to R. We say that f(n) is O(g(n)) if there exist c ∈ R, c > 0, and n₀ ∈ N, n₀ ≥ 1, such that f(n) ≤ c·g(n) for all n ∈ N, n ≥ n₀.

Examples 1-4
[Graphs: f(n) vs. g(n) with crossover at n₀; h(n) vs. g(n); h(n) vs. k(n), compared in both directions; and 3n² vs. n².]

Some complexity classes …
Constant: O(1)
Logarithmic: O(log n)
Linear: O(n)
Quadratic: O(n²)
Cubic: O(n³)
Polynomial: O(n^p)
Exponential: O(a^n)

Don’t be confused …
We typically say “f(n) is O(g(n))” or “f(n) = O(g(n))”. But O(g(n)) is really a set of functions, so it might be clearer to say f(n) ∈ O(g(n)). But I don’t make the rules. Crystal clear: “f(n) is order g(n).”

Intuitively …
To say f(n) is O(g(n)) is to say that f(n) is “less than or equal to” g(n). We also have (G&T):
Θ(g(n)) (Big Theta): “equal to”
Ω(g(n)) (Big Omega): “greater than or equal to”
o(g(n)) (little o): “strictly less than”
ω(g(n)) (little omega): “strictly greater than”

Big-Omega and Big-Theta
Ω is just like O, except that f(n) ≥ c·g(n); f(n) is O(g(n)) ⇔ g(n) is Ω(f(n)).
Θ is both O and Ω (and the constants need not match): f(n) is O(g(n)) and f(n) is Ω(g(n)) ⇔ f(n) is Θ(g(n)).

little o
Definition: Let f and g be functions mapping N to R. We say that f(n) is o(g(n)) if for any c ∈ R, c > 0, there exists n₀ ∈ N, n₀ > 0, such that f(n) ≤ c·g(n) for all n ∈ N, n ≥ n₀. (Little omega, ω, is the same but with ≥.)

Multiple variables
for (j = 1; j <= N; ++j) for (k = 1; k <= N; ++k) for (l = 1; l <= M; ++l) foo();   → N²M calls
for (j = 1; j <= N; ++j) for (k = 1; k <= M; ++k) for (l = 1; l <= M; ++l) foo();   → NM² calls
Together: O(N²M + NM²)

Multiple primitives
for (j = 1; j <= N; ++j) {
  sum += A[j];
  for (k = 1; k <= M; ++k) {
    sum2 += B[j][k];
    C[j][k] = B[j][k] * A[j] + 1;
    for (l = 1; l <= k; ++l)
      B[j][k] -= B[j][l];
  }
}

Tradeoffs: an example
[Diagram: pairs (0, 0), (0, 1), (0, 2), (0, 3), …, (1, 0), (1, 1), (1, 2), (1, 3), …, (N, 0), (N, 1), (N, 2), …, (N, N); lots of options in between.]

Another example
I have a set of integers between 0 and 1,000,000. I need to store them, and I want O(1) lookup, insertion, and deletion. Constant time and constant space, right?
[Diagram: an array indexed 0 … 1,000,000.]

Big-O and Deceit
Beware huge coefficients. Beware key lower-order terms. Beware when n is “small”.

Does it matter? Let n = 1,000 and assume 1 ms per operation.
growth rate | time for n = 1,000 | max n in one day
n | 1 second | 86,400,000
n log₂ n | 10 seconds | 3,943,234
n² | 17 minutes | 9,295
n³ | 12 days | 442
n⁴ | 32 years | 96
n¹⁰ | ~10^19 years | 6
2^n | ~10^290 years | 26

Worst, best, and average
Gideon is a fast runner … up hills … down hills … on flat ground.
Gideon is the fastest swimmer … on the JHU team … in molasses … in our research lab … in a 5-yard race … on Tuesdays … in an average race.

What’s average?
Strictly speaking, the average (mean) is relative to some probability distribution: mean(X) = Σ_x Pr(x)·x. Unless you have some notion of a probability distribution over test cases, it’s hard to talk about average requirements.

Now you know …
How to analyze the run-time (or space requirements) of a piece of pseudo-code. Some new uses for Greek letters. Why the order of an algorithm matters. How to avoid some pitfalls.