11.2 Complexity Analysis

Complexity Analysis

As true computer scientists, we need a way to compare the efficiency of algorithms. Should we just use a stopwatch to time our programs? No: different processors will cause the same program to run at different speeds. Instead, we will count the number of operations that must run.

Counting Operations

    public static double avg(int[] a, int n) {
        double sum = 0;
        for (int j = 0; j < n; j++)
            sum += a[j];
        return (sum / n);
    }

Here n is the number of elements. Each loop iteration performs 3 operations: the loop comparison (j < n), the loop increment (j++), and the one operation in the loop body (sum += a[j]), for 3n operations in total. Three more operations run once each: the assignments sum = 0 and j = 0, and the division sum / n. Total operations: 3n + 3.

Grouping Complexities

So the runtime of our average function depends on the size of the array. Here are some possibilities:

Array size 1: runtime = 3(1) + 3 = 6
Array size 50: runtime = 3(50) + 3 = 153
Array size 1000: runtime = 3(1000) + 3 = 3003

Notice that multiplying the size of the array by a constant multiplies the runtime by approximately the same constant.
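The 3n + 3 count can be sketched as an instrumented version of avg. The avgOps helper below is illustrative, not part of the slides; it applies the slides' counting model rather than timing anything.

```java
public class OpCounter {
    // Mirrors the slide's counting model for avg(): 3 operations per loop
    // iteration (comparison, increment, body) plus 3 one-time operations.
    public static long avgOps(int n) {
        long ops = 2;                 // sum = 0 and j = 0
        for (int j = 0; j < n; j++) {
            ops += 3;                 // j < n, j++, and sum += a[j]
        }
        ops += 1;                     // the final division sum / n
        return ops;                   // = 3n + 3
    }
}
```

For the slide's sizes this reproduces 6, 153, and 3003.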

Grouping Complexities

In order to compare runtimes, it is convenient to have some grouping scheme. Think of runtimes as functions of the number of inputs. How do mathematicians group functions?

Functions

How might you "describe" these functions? What "group" do they belong to?

f(x) = 3x + 5 (linear)
f(x) = 4x² + x + 90 (quadratic)
f(x) = 10x³ – 30 (cubic)
f(x) = log(x + 30) (logarithmic)
f(x) = 2^(3x + 3) (exponential)

Big-O Notation

Instead of using terms like linear, quadratic, and cubic, computer scientists use big-O notation to discuss runtimes. A function with linear runtime is said to be of order n. The shorthand looks like this: O(n).

Functions

If the following are runtimes expressed as functions of the number of inputs (x), we would label them as follows:

f(x) = 3x + 5 is O(n)
f(x) = 4x² + x + 90 is O(n²)
f(x) = 10x³ – 30 is O(n³)
f(x) = log(x + 30) is O(log n)
f(x) = 2^(3x + 3) is O(2ⁿ)
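One way to see why constants and lower-order terms are ignored in these labels: divide a runtime function by its dominant term, and the ratio approaches a constant as the input grows. A small sketch, assuming the quadratic example reads f(x) = 4x² + x + 90:

```java
public class Dominance {
    // f(x) = 4x^2 + x + 90 divided by x^2 tends toward the constant 4 as x
    // grows, so for large inputs f behaves like its dominant term: O(n^2).
    public static double ratio(double x) {
        return (4 * x * x + x + 90) / (x * x);
    }
}
```

At x = 1,000,000 the ratio is already within 0.0001 of 4.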

O(n) Method

    public static double avg(int[][] a) {
        int sum = 0, count = 0;
        for (int j = 0; j < a.length; j++) {
            for (int k = 0; k < a[j].length; k++) {
                sum += a[j][k];
                count++;
            }
        }
        return ((double) sum / count);
    }

Let r be the number of rows and n the total number of elements, so the inner loop body runs n times overall. Counting as before: the assignments sum = 0 and count = 0, the cast and the division, and the loop initializations give 6 one-time operations; the outer loop's comparison and increment run r times (2r operations); the inner loop's comparison and increment run once per element (2n operations); and the loop-body operations contribute 3n more. Total: 5n + 2r + 6. Since r ≤ n, n appears to the first power in the dominant term, so the method is O(n).
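A quick usage check of the method above. The example array is mine, not from the slides; ragged rows are fine, and five elements summing to 15 average to 3.0.

```java
public class AvgDemo {
    // The slide's 2D average method, reproduced so the example is self-contained.
    public static double avg(int[][] a) {
        int sum = 0, count = 0;
        for (int j = 0; j < a.length; j++) {
            for (int k = 0; k < a[j].length; k++) {
                sum += a[j][k];
                count++;
            }
        }
        return ((double) sum / count);
    }

    public static void main(String[] args) {
        // 1 + 2 + 3 + 4 + 5 = 15 over 5 elements.
        System.out.println(avg(new int[][]{{1, 2}, {3, 4, 5}}));  // 3.0
    }
}
```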

O(n²) Method

    public static void bubble(int[] a, int n) {
        for (int i = 0; i < n-1; i++) {         // how many are sorted
            for (int j = 0; j < n-1-i; j++)     // unsorted part
                if (a[j+1] < a[j])              // compare neighbors
                    swap(a, j, j+1);
        }
    }

The inner loop runs n–1 iterations the first time, then n–2 iterations, then n–3, ..., down to 1 iteration, for a total of n(n–1)/2 iterations. Each iteration performs up to 6 operations: 2 loop operations, 1 comparison, and 3 operations for the swap, so the inner loop contributes 6[n(n–1)/2] = 3n(n–1) = 3n² – 3n operations. The outer loop adds 1 loop assignment plus 3 operations for each of its n–1 iterations: 1 + 3(n–1) = 3n – 2. Total: (3n – 2) + (3n² – 3n) = 3n² – 2 operations, which is O(n²).
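The swap helper is not shown on the slide. A minimal sketch that completes it and counts inner-loop iterations, confirming the n(n–1)/2 total (the comparisons counter is mine, added for illustration):

```java
public class Bubble {
    public static long comparisons = 0;

    // 3 operations, as counted on the slide.
    static void swap(int[] a, int i, int j) {
        int t = a[i];
        a[i] = a[j];
        a[j] = t;
    }

    public static void bubble(int[] a, int n) {
        for (int i = 0; i < n - 1; i++) {          // how many are sorted
            for (int j = 0; j < n - 1 - i; j++) {  // unsorted part
                comparisons++;
                if (a[j + 1] < a[j])               // compare neighbors
                    swap(a, j, j + 1);
            }
        }
    }
}
```

Sorting {5, 4, 3, 2, 1} runs the inner loop 4 + 3 + 2 + 1 = 5(4)/2 = 10 times, regardless of input order.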

The common Big-O values

O(1): constant
O(log n): logarithmic
O(n): linear
O(n log n): "n log n"
O(n²): quadratic
O(n³): cubic
O(2ⁿ): exponential
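The ordering of these classes can be checked numerically: at any fixed input size beyond a few elements, the costs line up from constant up to exponential. A sketch (the costs helper is illustrative; n = 16 is chosen so that log₂ n is exact):

```java
public class Growth {
    // The common growth functions evaluated at n, in the order listed above:
    // 1, log n, n, n log n, n^2, n^3, 2^n.
    public static double[] costs(double n) {
        double log2n = Math.log(n) / Math.log(2);
        return new double[]{1, log2n, n, n * log2n, n * n, n * n * n, Math.pow(2, n)};
    }
}
```

At n = 16 the values are 1, 4, 16, 64, 256, 4096, and 65536: each class strictly dominates the one before it.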

[Graph: number of operations versus number of inputs, with curves for O(1), O(log n), O(n), O(n log n), O(n²), and O(n³), from flattest to steepest.] Algorithms are categorized by how their runtime increases in relation to the number of inputs.