CS 260 Runtime Analysis: Big O, Θ, and some simple sums. (See section 1.2 for motivation.) Notes, examples, and code adapted from Data Structures and Other Objects Using C++ by Main & Savitch.


Sums review

Some quick identities (these work just like integrals):
– Σ (a_i + b_i) = Σ a_i + Σ b_i
– Σ c·a_i = c · Σ a_i (for a constant c)
Not like the integral: the discrete closed forms differ, e.g. Σ_{i=1}^{n} i = n(n+1)/2, whereas ∫_0^n x dx = n²/2.

Closed forms for simple sums

A couple of easy ones you really should keep in your head:
– Σ_{i=1}^{n} c = cn (for a constant c)
– Σ_{i=1}^{n} i = 1 + 2 + … + n = n(n+1)/2
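As a quick sanity check, the closed form for 1 + 2 + … + n can be compared against a brute-force loop. A minimal sketch (the function names are mine, not from the text):

```cpp
#include <cassert>

// Brute-force sum: 1 + 2 + ... + n.
long sum_to(long n) {
    long total = 0;
    for (long i = 1; i <= n; ++i)
        total += i;
    return total;
}

// Closed form for the same sum.
long sum_closed(long n) {
    return n * (n + 1) / 2;
}
```

The two agree for every n, which is exactly what the closed form claims.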

Example (motivation), section 1.2

Problem: count the number of steps in the Eiffel Tower.
Setup: Jack and Jill are at the top of the tower with paper and a pencil.
Let n = the number of stairs to climb the Eiffel Tower.

Eiffel Tower, algorithm 1

1st attempt:
– Jack takes the paper and pencil and walks down the stairs
– Makes a mark for each step
– Returns and shows Jill
Runtime:
– # of steps Jack took: 2n (n down, n back up)
– # of marks: n
– So, T1(n) = 2n + n = 3n

Eiffel Tower, algorithm 2

2nd plan:
– Jill doesn't trust Jack, so she keeps the paper and pencil
– Jack descends the first step
– Marks the step with his hat
– Returns to Jill and tells her to make a mark
– Finds the hat and moves it down a step
– And so on, to the bottom

Eiffel Tower, algorithm 2 (cont.)

Analysis:
– # of marks = n
– # of steps = 2(1 + 2 + … + n) = 2 · n(n+1)/2 = n² + n
– So, T2(n) = (n² + n) + n = n² + 2n
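The step count for algorithm 2 can also be checked by direct simulation. A small sketch (the function name is mine), assuming trip k costs k steps down to the hat and k steps back up:

```cpp
#include <cassert>

// Total steps Jack walks in algorithm 2: trip k costs k steps down
// to the hat and k steps back up, so the total is 2(1 + 2 + ... + n),
// which the closed form says equals n^2 + n.
long jack_steps(long n) {
    long steps = 0;
    for (long k = 1; k <= n; ++k)
        steps += 2 * k;
    return steps;
}
```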

Eiffel Tower, algorithm 3

3rd try:
– Jack and Jill see their friend Steve on the ground
– Steve points to a sign giving the number of stairs
– Jill copies the number down
Runtime:
– # of steps Jack took: 0
– # of marks (digits copied): about log10(n)
– So, T3(n) = log10(n)

Comparison of algorithms

[Figure: cost of each algorithm, T(n), plotted against the number of inputs n (the number of stairs).]

Wrap-up

T1(n) is a line:
– if we double the # of stairs, the runtime doubles.
T2(n) is a parabola:
– if we double the # of stairs, the runtime roughly quadruples.
T3(n) is a logarithm:
– we'd have to multiply the number of stairs by a factor of 10 to increase T3 by 1 (roughly). A very nice function.

Some "what ifs"

Suppose 3 stairs can be climbed (or 3 marks made) per second. (There are really 2689 steps, so take n = 2689.)

          n         2n        3n
T1(n)     45 min    90 min    135 min
T2(n)     28 days   112 days  252 days
T3(n)     2 sec     ≈2 sec    ≈2 sec
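The entries in the table can be reproduced from the three cost functions. A sketch under the slide's assumptions (3 operations per second, n = 2689; the function names are mine):

```cpp
#include <cassert>
#include <cmath>

const double RATE = 3.0;  // stairs climbed / marks made per second

// Seconds needed by each algorithm for n stairs.
double t1_seconds(long n) { return 3.0 * n / RATE; }                    // T1(n) = 3n
double t2_seconds(long n) { return ((double)n * n + 2.0 * n) / RATE; }  // T2(n) = n^2 + 2n
double t3_seconds(long n) { return std::log10((double)n) / RATE; }      // T3(n) = log10(n)
```

Evaluating at n = 2689 gives roughly 45 minutes, 28 days, and a couple of seconds, matching the table.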

More observations

While the absolute times for a given n are a little sobering, what matters more when evaluating algorithms is how each function grows:
– T3 apparently doesn't grow, or grows very slowly
– T1 grows linearly
– T2 grows quadratically

Asymptotic behavior

When evaluating algorithms from a design point of view, we don't want to concern ourselves with low-level details:
– processor speed
– the presence of a floating-point coprocessor
– the phase of the moon
We are concerned simply with how a function grows as n grows arbitrarily large; that is, we are interested in its asymptotic behavior.

Asymptotic behavior (cont.)

As n gets large, the function is dominated more and more by its highest-order term, so we don't really need to consider the lower-order terms.
The coefficient of the leading term is not really of interest either: a line with a steep slope will eventually be overtaken by even the laziest of (concave-up) parabolas. That coefficient is just a matter of scale, irrelevant as n → ∞.

Big-O, Θ

A formal way to capture the ideas of the previous slide:
T(n) = O(f(n)) iff there exist constants k and n0 such that k·f(n) > T(n) for all n > n0.
In other words, T(n) is bounded above by a multiple of f(n): k·f(n) gets on top of T(n) at some point and stays there as n → ∞. So T(n) grows no faster than f(n).
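The definition suggests a mechanical spot check: pick candidate constants k and n0 and verify k·f(n) > T(n) over a range of n. This only samples finitely many n, so it is evidence rather than a proof, and the names below are my own:

```cpp
#include <cassert>

// Spot-check a Big-O witness pair (k, n0): is k*f(n) > T(n)
// for every integer n in (n0, n_max]? A finite check, not a proof.
bool witness_holds(double (*T)(double), double (*f)(double),
                   double k, double n0, double n_max) {
    for (double n = n0 + 1; n <= n_max; ++n)
        if (!(k * f(n) > T(n)))
            return false;
    return true;
}

// Example: T(n) = 3n (algorithm 1) against f(n) = n.
double T1(double n) { return 3 * n; }
double f_linear(double n) { return n; }
```

With k = 4 the bound holds everywhere past n0 = 0; with k = 2 it fails, since 2n never gets above 3n.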

Θ

Further, if f(n) = O(T(n)) as well, then T(n) also grows no slower than f(n).
We can then say that T(n) = Θ(f(n)); that is, T(n) can be bounded both above and below by multiples of f(n).

Setup

First we have to decide what it is we're counting: what might vary over the algorithm, and what actions take constant time.
Consider:
    for (i = 0; i < 5; ++i)
        ++cnt;
– i = 0 happens exactly once
– ++i happens 5 times
– i < 5 happens 6 times (5 passes plus the final failing test)
– ++cnt happens 5 times
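Those counts can be confirmed by instrumenting the loop. A sketch (the counter struct is my own invention) that tallies each kind of operation:

```cpp
#include <cassert>

// Instrumented version of: for (i = 0; i < 5; ++i) ++cnt;
// tallying each operation the analysis counts.
struct OpCounts { int inits, tests, incs, body; };

OpCounts run_loop() {
    OpCounts k = {0, 0, 0, 0};
    int cnt = 0;
    int i = 0;  ++k.inits;      // i = 0: happens once
    while (true) {
        ++k.tests;              // i < 5: once per pass, plus the failing test
        if (!(i < 5)) break;
        ++cnt;  ++k.body;       // ++cnt: once per pass
        ++i;    ++k.incs;       // ++i: once per pass
    }
    return k;
}
```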

Setup (cont.)

If i and cnt are ints, then assignment, addition, and comparison each take constant time (exactly 32 bits are examined). So i = 0, i < 5, and ++i each take some constant time (though probably different constants).
For the purposes of asymptotic analysis, we may ignore the overhead:
– the single i = 0
– the one extra i < 5
and consider just the cost of executing the loop a single time.

Setup (cont.)

We decide that ++cnt is a constant-time operation (an integer addition, then an integer assignment). So a single execution of the loop body is done in constant time. Let this cost be c.

Setup (cont.)

So the total cost of executing that loop is

    T(n) = Σ_{i=0}^{4} c = 5c,

a constant. Makes sense: the loop runs a set number of times, and each pass has a constant cost.

Setup (cont.)

T(n) = 5c, where c is constant, so we say T(n) = O(1).
From the definition of Big-O, let k = 6c:
    6c · 1 > 5c
This is true everywhere, for all n, so certainly for all n > 0. Easy.

Eg. 2

Consider this loop:
    for (i = 0; i < n; ++i)
        ++cnt;
Almost just like the last one:

    T(n) = Σ_{i=0}^{n-1} c = cn
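Counting the body executions directly confirms the linear cost (the function name is mine):

```cpp
#include <cassert>

// The body ++cnt runs exactly n times, so the loop's cost is c*n.
long loop_count(long n) {
    long cnt = 0;
    for (long i = 0; i < n; ++i)
        ++cnt;
    return cnt;
}
```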

Eg. 2 (cont.)

Now T(n) is linear in n: T(n) = O(n).
This means we can multiply n by some constant k to get bigger than T(n):
    k·n > cn,  let k = 2c:
    2cn > cn
This isn't true everywhere. We only need it to become true somewhere and stay true as n → ∞.

Eg. 2 (cont.)

Solve the inequality:
    2cn > cn
    cn > 0
    n > 0   (c is strictly positive, so we can divide by it)
2cn gets above T(n) at n = 0 and stays there as n gets large. So T(n) grows no faster than a line.

Eg. 3

Let's make the previous example a little more interesting: say T(n) = 2cn + 13c. Claim: T(n) = O(n).
So, find some k such that k·n > 2cn + 13c past some n0.
Let k = 3c:
    3cn > 2cn + 13c

Eg. 3 (cont.)

Find the values of n for which the inequality is true:
    3cn > 2cn + 13c
    cn > 13c
    n > 13
3cn gets above T(n) at n = 13 and stays there. T(n) grows no faster than a line.

Eg. 4

Nested loops:
    for (i = 0; i < n; ++i)
        for (j = 1; j <= n; ++j)
            ++cnt;
The runtime is given by

    T(n) = Σ_{i=0}^{n-1} Σ_{j=1}^{n} c = cn²
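The quadratic count can be verified directly (the function name is mine):

```cpp
#include <cassert>

// The inner body runs n times for each of the n outer passes,
// so ++cnt executes n*n times: cost c*n^2.
long nested_count(long n) {
    long cnt = 0;
    for (long i = 0; i < n; ++i)
        for (long j = 1; j <= n; ++j)
            ++cnt;
    return cnt;
}
```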

Eg. 4 (cont.)

Claim: T(n) = O(n²), i.e., there exists a constant k such that k·n² > cn². Let k = 2c:
    2cn² > cn²
Where is this true?
    cn² > 0
    n² > 0
    n > 0

Eg. 4 (cont.)

2cn² gets above cn² at n = 0 and stays there: T(n) is bounded above by a parabola, i.e., T(n) grows no faster than a parabola.

Eg. 5

Let's say T(n) = 2cn² + 2cn + 3c. Claim: T(n) = O(n²).
We just need to choose a k larger than the coefficient of the leading term. Let k = 3c:
    3cn² > 2cn² + 2cn + 3c
Where? Gather like terms and move everything to one side:
    cn² - 2cn - 3c > 0

Eg. 5 (cont.)

This one could be solved directly, but it is usually easier to find the roots of the parabola on the left (which are where our original two parabolas intersect). Dividing by c gives n² - 2n - 3 = (n - 3)(n + 1), so this happens at n = -1 and n = 3.
So plug something like n = 4 into the original inequality. Is it true? Then it's true for all n > 3 (since we know the parabolas don't intersect again).
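With c factored out (take c = 1), the crossover at n = 3 can be spot-checked numerically (a sketch with a name of my own choosing):

```cpp
#include <cassert>

// With c = 1: does k*n^2 = 3n^2 strictly exceed T(n) = 2n^2 + 2n + 3 yet?
// The two sides are equal at the root n = 3, and 3n^2 wins for all n > 3.
bool bound_holds(long n) {
    return 3 * n * n > 2 * n * n + 2 * n + 3;
}
```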

Eg. 5 (cont.)

3cn² gets above T(n) at n = 3 and stays there. T(n) can be bounded above by a parabola; it grows no faster than a parabola.