Running Time Performance Analysis

Presentation transcript:

Running Time Performance Analysis (Lecture 4)

Techniques until now:
- Experimental measurement
- Approximate running time using cost models: count executions of operations or lines of code, under some assumptions (only some operations count; cost of each operation = 1 time unit)
- Tilde notation: T(n) ~ (3/2)n²

Today:
- Asymptotic running time: Θ/O/Ω-notation
- Examples: insertionSort and binarySearch

Fast Computer vs. Slow Computer

How fast is T(n) = (3/2)n² + (3/2)n - 1?

Fast Computer vs. Smart Programmer

T1(n) = (3/2)n² + (3/2)n - 1
T2(n) = (3/2)n - 1

Fast Computer vs. Smart Programmer (rematch!)

T1(n) = (3/2)n² + (3/2)n - 1
T2(n) = (3/2)n - 1

A smart programmer with a better algorithm always beats a fast computer with a worse algorithm, for sufficiently large inputs.

At large enough input sizes, only the rate of growth of an algorithm's running time matters. That is why we dropped the lower-order terms in the approximate tilde notation: when T(n) = (3/2)n² + (3/2)n - 1 we write T(n) ~ (3/2)n² (for n = 1,000,000 the dropped terms contribute only about one millionth of the total). However, to arrive at (3/2)n² we first had to compute the complete polynomial (3/2)n² + (3/2)n - 1: it is not possible to determine the coefficient 3/2 without it.

Worst Case (recap from Lecture 3)

                                              cost   no of times
for (j = 1; j < A.length; j++) {               1     n
  //shift A[j] into the sorted A[0..j-1]
  i = j-1                                      1     n-1
  while i >= 0 and A[i] > A[i+1] {             1     2+…+n
    swap A[i], A[i+1]                          1     1+…+(n-1)
    i = i-1                                    1     1+…+(n-1)
}}
return A                                       1     1

In the worst case the array is in reverse sorted order.
T(n) = n + (n-1) + Σ_{j=2..n} j + 2·Σ_{j=1..n-1} j + 1
     = 2n + (n(n+1)/2 - 1) + 2·n(n-1)/2
     = (3/2)n² + (3/2)n - 1
     ~ (3/2)n²

Simpler approach

It turns out that even the coefficient of the highest-order term of the polynomial is not all that important for large enough inputs. This leads us to the asymptotic running time: T(n) = Θ(n²). We calculate the growth function directly. Even with such a simplification we can compare algorithms and discover the best ones. Sometimes constants matter in the real-world performance of algorithms, but in many cases we can ignore them. We can write the asymptotic running time of the best, worst, or average case.

Important Growth Functions

From better to worse:

Function f    Name
1             constant
log n         logarithmic
n             linear
n log n
n²            quadratic
n³            cubic
…
2ⁿ            exponential
…

Important Growth Functions

From better to worse:

Function f    Name
1             constant
log n         logarithmic
n             linear
n log n
n²            quadratic
n³            cubic
…
2ⁿ            exponential
…

The first 4 are practically fast (most commercial programs run in such Θ-time).
Anything less than exponential is theoretically fast (P vs NP).

Important Growth Functions

From better to worse:

Function f    Name          Problem size solved in minutes (today)
1             constant      any
log n         logarithmic   any
n             linear        billions
n log n                     hundreds of millions
n²            quadratic     tens of thousands
n³            cubic         thousands
…
2ⁿ            exponential   100
…

What programs have these running times?

With experience we can recognise patterns in code with known running times.

Θ(1)

Any fixed number of basic operations:
- assignments: int i = 5
- memory accesses: A[5]
- a fixed number of combinations of the above: swap A[i], A[j] (3 array accesses & 3 assignments)
- any other basic command

But not: array initialisation, int A[] = new int[n]  (that is Θ(n))
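
As a concrete Java illustration of the difference (swap and makeArray are illustrative names, not from the slides): swapping two elements touches a fixed number of cells regardless of a.length, while allocating a new array of length n zero-initialises n elements.

    // Θ(1): a fixed number of assignments and array accesses,
    // no matter how long the array is.
    static void swap(int[] a, int i, int j) {
        int tmp = a[i];
        a[i] = a[j];
        a[j] = tmp;
    }

    // But not Θ(1): new int[n] zero-initialises n elements, so this is Θ(n).
    static int[] makeArray(int n) {
        return new int[n];
    }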

Θ(n)

Pattern of code: loops where each iteration decreases the problem size by a constant amount.

for (j = 1; j < n; j++) { …<constant-cost operations>… }
Problem: iterate from 1 … n. Initial problem size: n; each iteration decreases the problem size by 1.

k = n; while (k > 0) { …<constant-cost operations>…; k = k - 100; }
Problem: iterate from n … 1. Each iteration decreases the problem size by 100.
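
A minimal Java sketch of the two loop patterns above (countUp and countDown are illustrative names): both do constant work per iteration and a number of iterations proportional to n, so both are Θ(n).

    // Θ(n): the loop runs n-1 times; each iteration does constant work.
    static long countUp(int n) {
        long ops = 0;
        for (int j = 1; j < n; j++) {
            ops++;                 // ...constant-cost operations...
        }
        return ops;
    }

    // Θ(n): the problem size shrinks by 100 per iteration,
    // so the loop runs about n/100 times, which is still linear in n.
    static long countDown(int n) {
        long ops = 0;
        for (int k = n; k > 0; k -= 100) {
            ops++;                 // ...constant-cost operations...
        }
        return ops;
    }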

Θ(log n)

Pattern of code: loops where each iteration divides the problem size by a constant.

j = n; while (j >= 1) { …<constant-cost operations>…; j = j/2 }
Problem: iterate from n down to 1. Initial problem size: n; each iteration divides the problem size by 2.
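
The same pattern as a small Java sketch (halvings is an illustrative name): j is halved on every pass, so the body runs about log₂ n times.

    // Θ(log n): j takes the values n, n/2, n/4, ..., 1,
    // so for n >= 1 the body executes floor(log2 n) + 1 times.
    static int halvings(int n) {
        int count = 0;
        for (int j = n; j >= 1; j /= 2) {
            count++;               // ...constant-cost operations...
        }
        return count;              // e.g. halvings(1000) == 10
    }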

Asymptotic Running Time Θ(f(n))

It has useful operations.

Simplify: replace multiplicative constants with 1
Θ(2) = Θ(1)
Θ(10n²) = Θ(n²)
Θ(log₃₀₀ n) = Θ(log₂ n)

Always use the simplest possible functions.

Asymptotic Running Time Θ(f(n))

It has useful operations.

Addition: keep only the highest-order term
Θ(1) + Θ(1) = Θ(1)
Θ(n) + Θ(n²) = Θ(n²)
Θ(n³) + Θ(n³) = Θ(n³)
Θ(n² log n) + Θ(n²) = Θ(n² log n)

Asymptotic Running Time Θ(f(n))

It has useful operations.

Multiplication: multiply the inner functions
Θ(n) × Θ(n²) = Θ(n³)
Θ(n) × Θ(log n) = Θ(n log n)
Θ(1) × Θ(log n) = Θ(log n)

Asymptotic Running Time Θ(f(n))

It has useful operations:

Simplify: replace multiplicative constants with 1
Θ(2) = Θ(1)
Θ(10n²) = Θ(n²)
Θ(log₂₀₀ n) = Θ(log₂ n)

Addition: keep only the highest-order term
Θ(1) + Θ(1) = Θ(1)
Θ(n) + Θ(n²) = Θ(n²)
Θ(n³) + Θ(n³) = Θ(n³)
Θ(n² log n) + Θ(n²) = Θ(n² log n)

Multiplication: multiply the inner functions
Θ(n) × Θ(n²) = Θ(n³)
Θ(n) × Θ(log n) = Θ(n log n)
Θ(1) × Θ(log n) = Θ(log n)

The above follow from these general theorems:
Θ(f(n)) + Θ(g(n)) = Θ(g(n))  if Θ(f(n)) ≤ Θ(g(n))
Θ(f(n)) × Θ(g(n)) = Θ(f(n) × g(n))
If f(n) = Θ(g(n)) and g(n) = Θ(h(n)) then f(n) = Θ(h(n))
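
A small Java sketch of how the rules combine (nestedWork is an illustrative name): the nested loops contribute Θ(n) × Θ(log n) and the second loop contributes Θ(n), so by the addition rule the total is Θ(n log n).

    // Θ(n) × Θ(log n) + Θ(n) = Θ(n log n)
    static long nestedWork(int n) {
        long ops = 0;
        for (int i = 0; i < n; i++) {          // Θ(n) iterations
            for (int j = n; j >= 1; j /= 2) {  // Θ(log n) iterations each
                ops++;                         // constant work
            }
        }
        for (int i = 0; i < n; i++) {          // a separate Θ(n) loop
            ops++;
        }
        return ops;
    }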

How to calculate asymptotic running times?

First decide which case you want to calculate:
- worst-case input (usually)
- best-case input
- average-case input

Then add up the cost of each operation multiplied by the number of times it is called, as we did with the precise/approximate running times, but use asymptotic notation and take advantage of the simple addition and multiplication operations.

InsertionSort – asymptotic worst-case analysis

                                            cost   no of times
for j = 1 to A.length {
  //shift A[j] into the sorted A[0..j-1]
  i = j-1
  while i >= 0 and A[i] > A[i+1] {
    swap A[i], A[i+1]
    i = i-1
}}
return A

InsertionSort – asymptotic worst-case analysis

                                            cost   no of times
for j = 1 to A.length {                     Θ(1)
  //shift A[j] into the sorted A[0..j-1]
  i = j-1                                   Θ(1)
  while i >= 0 and A[i] > A[i+1] {          Θ(1)
    swap A[i], A[i+1]                       Θ(1)
    i = i-1                                 Θ(1)
}}
return A                                    Θ(1)

InsertionSort – asymptotic worst-case analysis

                                            cost   no of times
for j = 1 to A.length {                     Θ(1)   Θ(n)
  //shift A[j] into the sorted A[0..j-1]
  i = j-1                                   Θ(1)   Θ(n)
  while i >= 0 and A[i] > A[i+1] {          Θ(1)
    swap A[i], A[i+1]                       Θ(1)
    i = i-1                                 Θ(1)
}}
return A                                    Θ(1)

InsertionSort – asymptotic worst-case analysis

                                            cost   no of times
for j = 1 to A.length {                     Θ(1)   Θ(n)
  //shift A[j] into the sorted A[0..j-1]
  i = j-1                                   Θ(1)   Θ(n)
  while i >= 0 and A[i] > A[i+1] {          Θ(1)   Θ(n)×Θ(n)
    swap A[i], A[i+1]                       Θ(1)
    i = i-1                                 Θ(1)
}}
return A                                    Θ(1)

InsertionSort – asymptotic worst-case analysis

                                            cost   no of times
for j = 1 to A.length {                     Θ(1)   Θ(n)
  //shift A[j] into the sorted A[0..j-1]
  i = j-1                                   Θ(1)   Θ(n)
  while i >= 0 and A[i] > A[i+1] {          Θ(1)   Θ(n)×Θ(n)
    swap A[i], A[i+1]                       Θ(1)   Θ(n²)
    i = i-1                                 Θ(1)   Θ(n²)
}}
return A                                    Θ(1)   Θ(1)

InsertionSort – asymptotic worst-case analysis

                                            cost   no of times
for j = 1 to A.length {                     Θ(1)   Θ(n)
  //shift A[j] into the sorted A[0..j-1]
  i = j-1                                   Θ(1)   Θ(n)
  while i >= 0 and A[i] > A[i+1] {          Θ(1)   Θ(n)×Θ(n)
    swap A[i], A[i+1]                       Θ(1)   Θ(n²)
    i = i-1                                 Θ(1)   Θ(n²)
}}
return A                                    Θ(1)   Θ(1)

T(n) = Θ(1)×Θ(n) + Θ(1)×Θ(n) + Θ(1)×Θ(n²) + Θ(1)×Θ(n²) + Θ(1)×Θ(n²) + Θ(1)×Θ(1)

InsertionSort – asymptotic worst-case analysis

                                            cost   no of times
for j = 1 to A.length {                     Θ(1)   Θ(n)
  //shift A[j] into the sorted A[0..j-1]
  i = j-1                                   Θ(1)   Θ(n)
  while i >= 0 and A[i] > A[i+1] {          Θ(1)   Θ(n)×Θ(n)
    swap A[i], A[i+1]                       Θ(1)   Θ(n²)
    i = i-1                                 Θ(1)   Θ(n²)
}}
return A                                    Θ(1)   Θ(1)

T(n) = Θ(1)×Θ(n) + Θ(1)×Θ(n) + Θ(1)×Θ(n²) + Θ(1)×Θ(n²) + Θ(1)×Θ(n²) + Θ(1)×Θ(1)
     = Θ(n) + Θ(n) + Θ(n²) + Θ(n²) + Θ(n²) + Θ(1)
     = Θ(n²)
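
For reference, a minimal runnable Java version of the pseudocode analysed above (a sketch, not the course's official code); it keeps the swap-based inner loop, so on a reverse-sorted input the inner loop runs about j times for each j and the worst case is the Θ(n²) just derived.

    // Insertion sort as analysed above: worst case Θ(n²).
    static int[] insertionSort(int[] a) {
        for (int j = 1; j < a.length; j++) {
            // shift a[j] into the already sorted a[0..j-1]
            int i = j - 1;
            while (i >= 0 && a[i] > a[i + 1]) {
                int tmp = a[i];        // swap a[i], a[i+1]
                a[i] = a[i + 1];
                a[i + 1] = tmp;
                i = i - 1;
            }
        }
        return a;
    }

For example, insertionSort(new int[]{3, 1, 2}) returns {1, 2, 3}; on an already sorted input the while condition fails immediately for every j, which is the Θ(n) best case.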

One more example: BinarySearch

Specification:
Input: array a[0..n-1], integer key
Input property: a is sorted
Output: integer pos
Output property: if key == a[i] then pos == i

BinarySearch – worst-case asymptotic running time

                                              cost   no of times
lo = 0, hi = a.length-1
while (lo <= hi) {
  int mid = lo + (hi - lo) / 2
  if (key < a[mid]) then hi = mid - 1
  else if (key > a[mid]) then lo = mid + 1
  else return mid
}
return -1

BinarySearch – worst-case asymptotic running time

                                              cost   no of times
lo = 0, hi = a.length-1                       Θ(1)   Θ(1)
while (lo <= hi) {                            Θ(1)   Θ(log n)
  int mid = lo + (hi - lo) / 2                Θ(1)   Θ(log n)
  if (key < a[mid]) then hi = mid - 1         Θ(1)   Θ(log n)
  else if (key > a[mid]) then lo = mid + 1    Θ(1)   Θ(log n)
  else return mid                             Θ(1)   Θ(log n)
}
return -1                                     Θ(1)   Θ(1)

T(n) = Θ(log n)
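
And a runnable Java version of the same loop, for completeness (again a sketch under the stated assumptions: a is sorted and -1 signals "not found"); the range [lo, hi] halves on every iteration, giving the Θ(log n) worst case derived above.

    // Iterative binary search over a sorted array: worst case Θ(log n).
    static int binarySearch(int[] a, int key) {
        int lo = 0, hi = a.length - 1;
        while (lo <= hi) {
            int mid = lo + (hi - lo) / 2;   // avoids overflow of (lo + hi)
            if      (key < a[mid]) hi = mid - 1;
            else if (key > a[mid]) lo = mid + 1;
            else return mid;                // key found at index mid
        }
        return -1;                          // key not in a
    }

For example, binarySearch(new int[]{1, 2, 3, 5, 8}, 5) returns 3, and searching for a missing key returns -1 after about log₂ n iterations.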

Examples (comparisons)

Θ(n log n) =?= Θ(n)

Examples (comparisons)

Θ(n log n) > Θ(n)
Θ(n² + 3n - 1) =?= Θ(n²)

Examples (comparisons)

Θ(n log n) > Θ(n)
Θ(n² + 3n - 1) = Θ(n²)

Examples (comparisons)

Θ(n log n) > Θ(n)
Θ(n² + 3n - 1) = Θ(n²)
Θ(1) =?= Θ(10)
Θ(5n) =?= Θ(n²)
Θ(n³ + log n) =?= Θ(100n³ + log n)

Write all of the above in order, writing = or < between them.

Examples (comparisons)

MyAlgorithm has an asymptotic worst-case running time T(N) = O(N² lg N). Does that mean:
T(N) = O(N³)?
T(N) = O(N²)?
T(N) = Ω(N)?
T(N) = Ω(N³)?
T(N) = Θ(N² lg N)?

Examples (comparisons)

MyAlgorithm has an asymptotic worst-case running time T(N) = O(N² lg N). Does that mean:
T(N) = O(N³)?       YES
T(N) = O(N²)?       NO
T(N) = Ω(N)?        NO
T(N) = Ω(N³)?       NO
T(N) = Θ(N² lg N)?  NO

Examples (comparisons)

MyAlgorithm has an asymptotic worst-case running time T(N) = Ω(N² lg N). Does that mean:
T(N) = O(N³)?
T(N) = O(N²)?
T(N) = Ω(N)?
T(N) = Ω(N³)?
T(N) = Θ(N² lg N)?

Examples (comparisons)

MyAlgorithm has an asymptotic worst-case running time T(N) = Ω(N² lg N). Does that mean:
T(N) = O(N³)?       NO
T(N) = O(N²)?       NO
T(N) = Ω(N)?        YES
T(N) = Ω(N³)?       NO
T(N) = Θ(N² lg N)?  NO

Examples (comparisons)

MyAlgorithm has an asymptotic worst-case running time T(N) = Θ(N² lg N). Does that mean:
T(N) = O(N³)?
T(N) = O(N²)?
T(N) = Ω(N)?
T(N) = Ω(N³)?

Examples (comparisons)

MyAlgorithm has an asymptotic worst-case running time T(N) = Θ(N² lg N). Does that mean:
T(N) = O(N³)?   YES
T(N) = O(N²)?   NO
T(N) = Ω(N)?    YES
T(N) = Ω(N³)?   NO

*Because the above means MyAlgorithm is both O(N² lg N) and Ω(N² lg N).