Big-Oh and Execution Time: A Review


Big-Oh and Execution Time: A Review (CS 261 – Data Structures)

Big-Oh: Purpose A machine-independent way to describe execution time. The purpose of a big-Oh characterization is to describe how execution time changes relative to changes in input size, in a way that is independent of issues such as machine speed or compilers.

Big-Oh: Algorithmic Analysis We want a method for determining the relative speeds of algorithms that doesn't depend upon: the hardware used (e.g., PC, Mac, etc.), the clock speed of your processor, the compiler you use, or even the language you write in.

Algorithmic Analysis Suppose that algorithm A processes n data elements in time T. Algorithmic analysis attempts to estimate how T is affected by changes in n. In other words, T is a function of n when we use A.

A Simple Example Suppose we have an algorithm that is O(n) (e.g., summing the elements of an array), and suppose summing 10,000 elements takes 32 ms. How long will it take to sum 20,000 elements? Since the time is linear in the size, when the size doubles the execution time doubles: about 64 ms.

Non-Linear Times Suppose instead the algorithm is O(n^2) (e.g., summing the elements of a 2-D array). If the size doubles, what happens to the execution time? It goes up by a factor of 4. Why 4? We need a general way to calculate this …

The Calculation The ratio of the big-Oh functions should equal the ratio of the execution times. We increased n by a factor of two, so for an O(n^2) algorithm: x / 32 ms = (2n)^2 / n^2 = 4. Solving for x gives 4 * 32 ms = 128 ms.

A More Complex Problem Acme Widgets uses a merge sort algorithm to sort their inventory of widgets. If it takes 66 milliseconds to sort 4096 widgets, then approximately how long will it take to sort 1,048,576 widgets? (Note: merge sort is O(n log n), 4096 is 2^12, and 1,048,576 is 2^20.)

A More Complex Problem (cont.) Setting up the formula: 66 ms / x = (4096 * log2 4096) / (1,048,576 * log2 1,048,576) = (2^12 * 12) / (2^20 * 20). Solve for x (remember that log2 2^y is just y): x = 66 ms * (2^20 * 20) / (2^12 * 12) = 66 * 256 * 20/12 = 28,160 ms, or about 28 seconds.

Determining Big Oh: Simple Loops For simple loops, ask yourself how many times the loop executes as a function of input size: here the iterations depend on a variable n, with constant-time operations within the loop.

double minimum(double data[], int n) {
    // Pre: data has at least one element.
    // Post: returns the smallest value in the collection.
    int i;
    double min = data[0];
    for (i = 1; i < n; i++)
        if (data[i] < min)
            min = data[i];
    return min;
}

The loop executes n - 1 times and does constant work per iteration, so minimum is O(n).

Determining Big Oh: Not-So-Simple Loops The iteration and termination criteria are not always simple: here the iterations depend on a function of n, with constant-time operations within the loop and the possibility of an early exit.

int isPrime(int n) {
    int i;
    if (n < 2)
        return 0;               // 0 and 1 are not prime.
    for (i = 2; i * i <= n; i++) {
        if (n % i == 0)         // If i is a factor,
            return 0;           // then n is not prime.
    }
    return 1;                   // Loop exited without finding a factor: n is prime.
}

The loop runs while i * i <= n, i.e., at most sqrt(n) times, so the worst case is O(sqrt(n)). But what happens if it exits early? Best case? Average case? Worst case?

Determining Big Oh: Nested Loops Nested loops (dependent or independent) multiply:

void insertionSort(double arr[], int n) {
    int i, j;
    double elem;
    for (i = 1; i < n; i++) {        // Outer loop runs n - 1 times.
        elem = arr[i];               // Move element arr[i] into place.
        for (j = i - 1; j >= 0 && elem < arr[j]; j--)
            arr[j + 1] = arr[j];     // Slide the old value up.
        arr[j + 1] = elem;
    }
}

Worst case (input in reverse order, e.g., 8 7 6 5 4 3 2 1): the inner loop body runs 1 + 2 + … + (n - 1) = (n^2 - n) / 2 times, so insertion sort is O(n^2).

Determining Big Oh: Recursion For recursion, ask yourself: how many times will the function be called, and how much time does it spend on each call? Multiply these together.

double exp(double a, int n) {
    if (n < 0)
        return 1 / exp(a, -n);
    if (n == 0)
        return 1;
    return a * exp(a, n - 1);
}

Here each call does constant work and there are about n calls, so exp is O(n). It is not always as simple as this example: it is often easier to think about the algorithm than about the code.

Determining Big Oh: Calling Functions When one function calls another, the big-Oh of the calling function must account for the big-Oh of each called function and how many times it is called:

void removeElement(double e, struct Vector *v) {
    int i = vectorSize(v);            // O(1)
    while (i-- > 0)                   // Up to n iterations.
        if (e == vectorGet(v, i)) {   // O(1) per probe.
            vectorRemove(v, i);       // O(n), but done at most once.
            return;
        }
}

Up to n constant-time probes plus at most one O(n) removal gives O(n) + O(n) = O(n) overall.

Determining Big Oh: Logarithmic I'm thinking of a number between 0 and 1024. How many guesses do you need to find my number, if each guess is answered "higher" or "lower"? Answer: approximately log2 1024 = log2 2^10 = 10. In algorithmic analysis, log n is the number of times you can split n in half (binary search, etc.).

Summation and the Dominant Component A method's running time is the sum of the time needed to execute its sequence of statements, loops, etc. For algorithmic analysis, the largest component dominates, and constant multipliers are ignored. Formally, a function f(n) dominates g(n) if there exists a constant n0 such that f(n) > g(n) for all n > n0. Example: analysis of a given method shows its execution time is 8n + 3n^2 + 23 (a single loop, a nested loop, and some constant-time statements). Don't write O(8n + 3n^2 + 23), or even O(n + n^2 + 1), but just O(n^2).

Computational Complexities Common growth functions, ordered from fastest-growing to slowest:

Function        Common Name
n!              Factorial
2^n (or c^n)    Exponential
n^d, d > 3      Polynomial
n^3             Cubic
n^2             Quadratic
n * sqrt(n)     n Root-n
n log n         n log n
n               Linear
sqrt(n)         Root-n
log n           Logarithmic
1               Constant

Benchmarking Algorithmic analysis is the first and best tool, but not the final word. What if two algorithms have the same complexity? Example: bubble sort and insertion sort are both O(n^2), so which one is the "faster" algorithm? Benchmarking: run both algorithms on the same machine with the same inputs. Benchmarks often reveal the constant multipliers and other "ignored" components. Still, different implementations of the same algorithm often exhibit different execution times, due to changes in the constant multiplier or other factors (such as adding an early exit to bubble sort).

Reading & Worksheets Worksheet: Lesson 2 – Using Big-Oh to Estimate Clock Time Reading: Introduction & Chapter 1 C/C++ Programmer’s Reference Worksheet Time