
CSE 326: Data Structures Asymptotic Analysis Hannah Tang and Brian Tjaden Summer Quarter 2002

Today’s Outline
– How’s the project going?
– Finish up stacks, queues, lists, and bears, oh my!
– Math review and runtime analysis
– Pretty pictures
– Asymptotic analysis

Analyzing Algorithms: Why Bother? From “Programming Pearls” by Jon Bentley, Communications of the ACM, Nov 1984

Analyzing Algorithms
Computer scientists analyze algorithms to precisely characterize an algorithm’s:
– Time complexity (running time)
– Space complexity (memory use)
This allows us to get a better sense of the various tradeoffs between several algorithms
– For instance, do we know how complex the 1984 algorithm is, compared to the 1945 algorithm?
A problem’s input size is indicated by a number n
– Sometimes have multiple inputs, e.g. m and n
The running time of an algorithm is a function of n
– e.g. n, 2^n, n log n, n (log n)^2 + 5n^3

Hannah Takes a Break
bool ArrayFind(int array[], int n, int key) {
    // Insert your algorithm here
}
What algorithm would you choose to implement this code snippet?

Hannah Takes a Break: Simplifying assumptions
– Ideal single-processor machine (serialized operations)
– “Standard” instruction set (load, add, store, etc.)
– All operations take 1 time unit (including, for our purposes, each Java or C++ statement)

HTaB: Analyzing Code
– Basic Java/C++ operations: constant time
– Consecutive statements: sum of times
– Conditionals: larger branch plus test
– Loops: sum of iterations
– Function calls: cost of function body
– Recursive functions: solve recurrence relation
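To make these rules concrete, here is a small illustrative snippet (hypothetical, not from the lecture; the function name sumEvens is made up), annotated with the cost of each construct:

int sumEvens(int array[], int n) {
    int sum = 0;                    // basic operation: constant time
    for (int i = 0; i < n; i++) {   // loop: sum the costs of its n iterations
        if (array[i] % 2 == 0) {    // conditional: test plus the larger branch
            sum += array[i];        // constant-time branch body
        }
    }
    return sum;                     // constant time
}

Adding these up gives a total of roughly a + b·n for some constants a and b, i.e. a running time that grows linearly in n.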

HTaB: Linear Search Analysis
bool ArrayFind(int array[], int n, int key) {
    for (int i = 0; i < n; i++) {
        // Found it!
        if (array[i] == key)
            return true;
    }
    return false;
}
Exact Runtime: Best Case: Worst Case:
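One plausible way to fill in these blanks, under the one-time-unit-per-statement assumption from the previous slide: in the best case the key sits at index 0 and only a constant number of operations run; in the worst case the key is absent and the loop executes all n times, for an exact runtime of roughly a + b·n (small constants a and b), i.e. linear in n.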

HTaB: Binary Search Analysis
bool ArrayFind(int array[], int s, int e, int key) {
    // Assumes array[s..e-1] is sorted
    // The subarray is empty
    if (e - s <= 0) return false;

    // Search this subarray
    int mid = s + (e - s) / 2;
    if (key == array[mid]) {
        return true;
    } else if (key < array[mid]) {
        return ArrayFind(array, s, mid, key);
    } else {
        return ArrayFind(array, mid + 1, e, key);
    }
}
Exact Runtime: Best case: Worst case:

Back to work: Solving Recurrence Relations
1. Determine the recurrence relation. What are the base case(s)?
2. “Expand” the original relation to find an equivalent general expression in terms of the number of expansions.
3. Find a closed-form expression by setting the number of expansions to a value which reduces the problem to a base case
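As a sketch of these three steps applied to the binary search above (assuming each call does a constant amount of work c outside its one recursive call):
1. Recurrence: T(n) = T(n/2) + c, with base case T(1) = c.
2. Expand: T(n) = T(n/4) + 2c = T(n/8) + 3c = … = T(n/2^k) + k·c after k expansions.
3. Close the form: set n/2^k = 1, i.e. k = log₂ n, giving T(n) = c·log₂ n + c, which is O(log n).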

Linear Search vs Binary Search
(Compare the two on: Exact Runtime, Best Case, Worst Case.)
So … which algorithm is best? What tradeoffs did you make?
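One way this comparison might come out (a sketch, assuming the cost model above and that the array is already sorted for binary search): linear search runs in time roughly proportional to n, with best case 1 comparison and worst case n comparisons; binary search runs in time roughly proportional to log₂ n, with best case 1 comparison and worst case about log₂ n + 1 halvings. Binary search is asymptotically faster, but it only applies if the data is kept sorted, and that is the tradeoff.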

Fast Computer vs. Slow Computer

Fast Computer vs. Smart Programmer (round 1)

Fast Computer vs. Smart Programmer (round 2)

Asymptotic Analysis
Asymptotic analysis looks at the order of the running time of the algorithm
– A valuable tool when the input gets “large”
– Ignores the effects of different machines or different implementations of the same algorithm
Intuitively, to find the asymptotic runtime, throw away the constants and low-order terms
– Linear search is T(n) = n ∈ O(n)
– Binary search is T(n) = 4 log₂ n + 1 ∈ O(log n)
Remember: the fastest algorithm has the slowest growing function for its runtime
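For example (a standard illustration, not from the slide): a runtime like T(n) = 3n^2 + 17n + 5 keeps only its fastest-growing term, so it is in O(n^2); the constants 3, 17, and 5 and the low-order term 17n all disappear.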

Order Notation: Intuition
Although not yet apparent, as n gets “sufficiently large”, f(n) will be “greater than or equal to” g(n)
f(n) = n^3 + 2n^2
g(n) = 100n^2 + 1000

Order Notation: Definition
O(f(n)) is a set of functions
g(n) ∈ O(f(n)) iff there exist c and n₀ such that g(n) ≤ c·f(n) for all n ≥ n₀
Example: 100n^2 + 1000 ≤ 5(n^3 + 2n^2) for all n ≥ 19, so g(n) ∈ O(f(n))
Sometimes, you’ll see the notation g(n) = O(f(n)). This is equivalent to g(n) ∈ O(f(n)). However, the notation O(f(n)) = g(n) is not correct
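A quick empirical sanity check of this definition (a hypothetical snippet, assuming the reconstructed example above with g(n) = 100n^2 + 1000, f(n) = n^3 + 2n^2, and witnesses c = 5, n₀ = 19):

#include <cstdio>

int main() {
    // Verify g(n) <= c * f(n) for c = 5 over a sample range n = 19 .. 1000.
    for (long long n = 19; n <= 1000; n++) {
        long long g = 100 * n * n + 1000;      // g(n) = 100n^2 + 1000
        long long f = n * n * n + 2 * n * n;   // f(n) = n^3 + 2n^2
        if (g > 5 * f) {
            std::printf("inequality fails at n = %lld\n", n);
            return 1;
        }
    }
    std::printf("g(n) <= 5*f(n) holds for 19 <= n <= 1000\n");
    return 0;
}

Of course, a finite check is only evidence, not a proof; the algebraic argument is what establishes the inequality for all n ≥ 19.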

Order Notation: Example
100n^2 + 1000 ≤ 5(n^3 + 2n^2) for all n ≥ 19
So g(n) ∈ O(f(n))

Oops: Set Notation
O(n^3) is a set containing functions such as n + 3n^2, 45697n^3 - 4n, and 100n^2 log n
“O(f(n)) is a set of functions”
So we say both 100n^2 log n = O(n^3) and 100n^2 log n ∈ O(n^3)

Set Notation
O(n^3) contains functions such as n + 3n^2, 45697n^3 - 4n, and 100n^2 log n; O(2^n) additionally contains functions such as 1.5^n and n^n
Set notation allows us to formalize our intuition: O(n^3) ⊆ O(2^n)

Big-O Common Names
– constant: O(1)
– logarithmic: O(log n)
– linear: O(n)
– log-linear: O(n log n)
– superlinear: O(n^(1+c)) (c is a constant, where 0 < c < 1)
– quadratic: O(n^2)
– polynomial: O(n^k) (k is a constant)
– exponential: O(c^n) (c is a constant > 1)

Meet the Family
O(f(n)) is the set of all functions asymptotically less than or equal to f(n)
– o(f(n)) is the set of all functions asymptotically strictly less than f(n)
Ω(f(n)) is the set of all functions asymptotically greater than or equal to f(n)
– ω(f(n)) is the set of all functions asymptotically strictly greater than f(n)
Θ(f(n)) is the set of all functions asymptotically equal to f(n)

Meet the Family Formally (don’t worry about dressing up)
g(n) ∈ O(f(n)) iff there exist c and n₀ such that g(n) ≤ c·f(n) for all n ≥ n₀
– g(n) ∈ o(f(n)) iff for every c there exists an n₀ such that g(n) < c·f(n) for all n ≥ n₀
g(n) ∈ Ω(f(n)) iff there exist c and n₀ such that g(n) ≥ c·f(n) for all n ≥ n₀
– g(n) ∈ ω(f(n)) iff for every c there exists an n₀ such that g(n) > c·f(n) for all n ≥ n₀
g(n) ∈ Θ(f(n)) iff g(n) ∈ O(f(n)) and g(n) ∈ Ω(f(n))
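A small worked instance of these definitions (a standard example, not from the slide): n^2/2 ∈ Θ(n^2), because n^2/2 ≤ 1·n^2 for all n ≥ 1 (so it is in O(n^2)) and n^2/2 ≥ (1/4)·n^2 for all n ≥ 1 (so it is in Ω(n^2)).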

Big-Omega et al. Intuitively
Asymptotic notation and the mathematical relation it corresponds to:
– O corresponds to ≤
– Ω corresponds to ≥
– Θ corresponds to =
– o corresponds to <
– ω corresponds to >

True or False?
– 10,000n ∈ Θ(n^2)
– n^2 ∈ Θ(n^2)
– n ∈ Θ(n^2)
– n log n ∈ O(2^n)
– n log n ∈ Ω(n)
– n ∈ o(n^4)

Another Kind of Analysis
Runtime may depend on the actual input, not just the length of the input
Analysis based on input type:
– Worst case: your worst enemy is choosing the input
– Average case: assume a probability distribution of inputs
– Best case: not too useful
– Amortized analysis: runtime over many runs, regardless of the underlying probability for inputs
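A concrete instance of the distinction (assuming the linear search above, with the key present at a uniformly random position): the best case is 1 comparison, the worst case is n comparisons, and the average case is about (n + 1)/2 comparisons — still linear, just with a smaller constant.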

HTaB: Pros and Cons of Asymptotic Analysis

To Do
– Start project 1: due Monday, July 1st at 10 PM sharp!
– Sign up for 326 mailing list(s): don’t forget to use the new web interfaces!
– Prepare for tomorrow’s quiz. Possible topics:
  – Math concepts from 321 (skim section 1.2 in Weiss)
  – Lists, stacks, queues, and the tradeoffs between various implementations
  – Whatever asymptotic analysis stuff we covered today
  – Possible middle names for Brian C. Tjaden, Hannah C. Tang, and Albert J. Wong
– Read chapter 2 (algorithm analysis), section 4.1 (introduction to trees), and the sections on priority queues and binary heaps