
1 Chapter 2 Program Performance – Part 2

2 Step Counts Instead of accounting for the time spent on chosen operations, the step-count method accounts for the time spent in all parts of the program/function. Program step: loosely defined as a syntactically or semantically meaningful segment of a program whose execution time is independent of the instance characteristics, e.g., return a + b*c/(a - b)*4; and x = y; each count as one step.

3 Use a global variable to count program steps. For the sum function, count = 2n + 3.
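As a sketch of the idea (the slides' own code is not reproduced here, so the names are assumed), a sum function can be instrumented with a global step counter; charging one step per assignment, per loop test, and per return gives the 2n + 3 total:

```cpp
// Global step counter, as on the slide.
long long count_steps = 0;

// Iterative sum instrumented with step counts (hypothetical names).
template <class T>
T step_sum(T a[], int n) {
    T s = 0;
    count_steps++;              // s = 0 is one step
    for (int i = 0; i < n; i++) {
        count_steps++;          // one step per successful loop test
        s += a[i];
        count_steps++;          // one step per addition
    }
    count_steps++;              // the final (failing) loop test
    count_steps++;              // the return statement
    return s;
}
```

Summing the charges: 1 + 2n + 1 + 1 = 2n + 3.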

4 Counting steps in a recursive function
t_Rsum(n) = 2, n = 0
t_Rsum(n) = 2 + t_Rsum(n-1), n > 0
          = 2 + 2 + t_Rsum(n-2), n > 1
          …
t_Rsum(n) = 2(n+1), n >= 0
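A matching sketch of the recursive version (names assumed; the slide's code is not shown): each invocation contributes exactly 2 counted steps, so n+1 invocations give t_Rsum(n) = 2(n+1):

```cpp
// Global step counter for the recursive sum (hypothetical name).
long long rcount = 0;

template <class T>
T rsum(T a[], int n) {
    rcount++;                   // step: the test n > 0
    if (n > 0) {
        rcount++;               // step: return a[n-1] + rsum(a, n-1)
        return a[n-1] + rsum(a, n-1);
    }
    rcount++;                   // step: return 0 (base case)
    return 0;
}
```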

5 Matrix Addition

6 Count steps in Matrix Addition: count = 2*rows*cols + 2*rows + 1
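The matrix-addition count can be checked with an instrumented sketch. The names and the counting convention (one step per loop test and per assignment) are assumptions, since the slide's code is not reproduced:

```cpp
long long mcount = 0;

// Matrix addition with step counting: rows*(1 + 2*cols + 1) + 1
// = 2*rows*cols + 2*rows + 1 steps.
template <class T>
void mat_add(T** a, T** b, T** c, int rows, int cols) {
    for (int i = 0; i < rows; i++) {
        mcount++;                        // successful outer loop test
        for (int j = 0; j < cols; j++) {
            mcount++;                    // successful inner loop test
            c[i][j] = a[i][j] + b[i][j];
            mcount++;                    // the assignment
        }
        mcount++;                        // failing inner loop test
    }
    mcount++;                            // failing outer loop test
}
```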

7 Using a Step Table Sum

8 Rsum

9 Matrix Addition

10 Matrix Transpose
template <class T>
void transpose(T** a, int rows)
{
  for (int i = 0; i < rows; i++)
    for (int j = i + 1; j < rows; j++)
      swap(a[i][j], a[j][i]);
}

11 Matrix Transpose

12 Inefficient way to compute the prefix sums for j = 0, 1, …, n-1. Note: the number of steps per execution (S/E) of sum() varies depending on its parameters.

13 Steps Per Execution
sum(a, n) requires 2n + 3 steps
sum(a, j+1) requires 2(j+1) + 3 = 2j + 5 steps
Assignment statement b[j] = sum(…) ==> 2j + 6 steps
Total: Σ_{j=0}^{n-1} (2j + 6) = n(n + 5)
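The n(n+5) total can be verified by instrumenting a sketch of the inefficient prefix-sum computation (names are assumed; the outer loop's own control steps are ignored, as in the table):

```cpp
long long pcount = 0;

// The 2n+3-step sum from slide 3, reused here.
template <class T>
T sum_steps(T a[], int n) {
    T s = 0; pcount++;               // initialization
    for (int i = 0; i < n; i++) {
        pcount++;                    // successful loop test
        s += a[i];
        pcount++;                    // addition
    }
    pcount++;                        // failing loop test
    pcount++;                        // return
    return s;
}

// Inefficient prefix sums: b[j] = a[0] + ... + a[j], recomputed from
// scratch each time. Iteration j costs 2(j+1)+3 steps for the call plus
// 1 for the assignment, i.e. 2j + 6, for a total of n(n+5).
template <class T>
void prefix_sums(T a[], T b[], int n) {
    for (int j = 0; j < n; j++) {
        b[j] = sum_steps(a, j + 1);
        pcount++;                    // the assignment itself
    }
}
```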

14 Prefix sums

15 Sequential Search - Best case

16 Sequential Search - Worst case

17 Average for successful searches X has equal probability of being any one element of a. Step count if X is a[j]

18 Average for successful searches
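The average-case claim can be illustrated with a small sketch (seq_search is a hypothetical name, since the slides' code is not shown): a successful search for a[j] costs j + 1 element comparisons, and averaging over all n equally likely positions gives (n + 1)/2:

```cpp
// Sequential search with a comparison counter. Returns the index of x,
// or -1 after n comparisons if x is absent.
int seq_search(const int a[], int n, int x, long long& comps) {
    for (int i = 0; i < n; i++) {
        comps++;                     // one element comparison
        if (a[i] == x) return i;
    }
    return -1;
}
```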

19 Insertion in a Sorted Array – Best Case

20 Insertion – Worst Case

21 Insertion - Average The step count for inserting into position j is 2n - 2j + 3. Averaging over the n+1 equally likely positions: (1/(n+1)) Σ_{j=0}^{n} (2n - 2j + 3) = n + 3.

22 Asymptotic Notation Objectives of performance evaluation: –Compare the time complexities of two programs that perform the same function –Predict the growth in run time as the instance characteristics change. The operation-count and step-count methods are not accurate enough for either objective –Op count: counts some operations and ignores others –Step count: the definition of a step is inexact

23 Asymptotic Notation If two programs: –Program A with complexity C1*n^2 + C2*n –Program B with complexity C3*n Program B is faster than program A for sufficiently large values of n. For small values of n, either could be faster, and it may not matter anyway. There is a break-even point for n beyond which B is always faster than A.
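The break-even point can be found numerically. The constants below (A costs n^2 + 2n, B costs 100n) are illustrative assumptions, not values from the slides:

```cpp
// Illustrative cost functions for programs A and B (assumed constants:
// C1 = 1, C2 = 2, C3 = 100).
long long cost_a(long long n) { return n * n + 2 * n; }
long long cost_b(long long n) { return 100 * n; }

// Smallest n beyond which B is always cheaper than A.
int break_even() {
    for (int n = 1; n < 1000000; n++)
        if (cost_b(n) < cost_a(n)) return n;
    return -1;                       // no break-even found in range
}
```

With these constants B first wins at n = 99 (100n < n^2 + 2n exactly when n > 98).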

24 Asymptotic Notation Describes behavior of space and time complexities of programs for LARGE instance characteristics –To establish a relative order among functions. –To compare their relative rate of growth Allows us to make meaningful, though inexact statements about the complexity of programs

25 Mathematical background T(n) denotes the time or space complexity of a program. Big-Oh: the growth rate of T(n) is <= that of f(n). T(n) = O(f(n)) iff constants c and n0 exist such that T(n) <= c*f(n) for all n >= n0. f is an upper bound function for T. Example: "Algorithm A is O(n^2)" means: for data sets big enough (n > n0), algorithm A executes fewer than c*n^2 steps (c a positive constant).

26 The Idea Example: –1000n is larger than n^2 for small values of n –n^2 grows at a faster rate, so n^2 will eventually be the larger function –Here T(n) = 1000n, f(n) = n^2, n0 = 1000, and c = 1: T(n) <= c*f(n) for all n >= n0 –Thus we say that 1000n = O(n^2) –Note that we can get a tighter upper bound

27 Example Suppose T(n) = 10n^2 + 4n + 2. For n >= 2, T(n) <= 10n^2 + 5n; for n >= 5, T(n) <= 11n^2. Therefore T(n) = O(n^2).

28 Big Oh Ratio Theorem T(n) = O(f(n)) iff lim_{n→∞} (T(n)/f(n)) < c for some finite constant c. f(n) dominates T(n).

29 Examples Suppose T(n) = 10n^2 + 4n + 2. Then T(n)/n^2 = 10 + 4/n + 2/n^2, so lim_{n→∞} (T(n)/n^2) = 10 and T(n) = O(n^2).

30 Common Orders of Magnitude
Function   Name
1          Constant
log n      Logarithmic
log^2 n    Log-squared
n          Linear
n log n    n log n
n^2        Quadratic
n^3        Cubic
2^n        Exponential
n!         Factorial

31 Loose Bounds Suppose T(n) = 10n^2 + 4n + 2. Then 10n^2 + 4n + 2 <= 11n^3 for n >= 1, so T(n) = O(n^3). The bound is valid but loose; we want the smallest upper bound.

32 Polynomials If T(n) = a_m n^m + … + a_1 n + a_0, then T(n) = O(n^m).

33 Omega Notation -- Lower Bound Omega: T(n) = Ω(g(n)) iff constants c and n0 exist such that T(n) >= c*g(n) for all n >= n0. Establishes a lower bound. e.g.: T(n) = C1*n^2 + C2*n; since C1*n^2 + C2*n >= C1*n^2 for all n >= 1, T(n) >= C1*n^2 for all n >= 1, so T(n) is Ω(n^2). Note: T(n) is also Ω(n) and Ω(1). Need to get the largest lower bound.

34 Omega Ratio Theorem T(n) = Ω(f(n)) iff lim_{n→∞} (f(n)/T(n)) <= c for some finite constant c.

35 Lower Bound of Polynomials If T(n) = a_m n^m + … + a_1 n + a_0 (with a_m > 0), then T(n) = Ω(n^m). Example: T(n) = n^4 + n^3 + n^2 + 1; T(n) is Ω(n^4).

36 Theta Notation Theta: when O and Ω meet, we indicate that with Θ notation. Definition: T(n) = Θ(h(n)) iff constants c1, c2 and n0 exist such that c1*h(n) <= T(n) <= c2*h(n) for all n >= n0. Equivalently, T(n) = Θ(h(n)) iff T(n) = O(h(n)) and T(n) = Ω(h(n)). e.g. T(n) = 3n + 8: since 3n <= 3n + 8 <= 11n for n >= 1, T(n) = Θ(n). T(n) = 20*log2(n) + 8 = Θ(log2(n)): 20*log2(n) <= 20*log2(n) + 8 <= 28*log2(n) for n >= 2.

37 Theta Notation cntd T(n) = 1000n: T(n) = O(n^2), but T(n) != Θ(n^2) because T(n) != Ω(n^2).

38 Theta of Polynomials If T(n) = a_m n^m + … + a_1 n + a_0 (with a_m > 0), then T(n) = Θ(n^m).

39 Little o Notation Little-Oh: the growth rate of T(n) is < that of p(n). T(n) = o(p(n)) iff T(n) = O(p(n)) and T(n) != Θ(p(n)). Example: T(n) = 1000n; T(n) = o(n^2).

40 Simplifying Rules If f(n) is O(g(n)) and g(n) is O(h(n)), then f(n) is O(h(n)). If f(n) is O(k*g(n)) for any constant k > 0, then f(n) is O(g(n)). If f1(n) = O(g1(n)) and f2(n) = O(g2(n)), then (a) (f1 + f2)(n) = max(O(g1(n)), O(g2(n))), (b) f1(n) * f2(n) = O(g1(n) * g2(n))

41 Some Points DO NOT include constants or low-order terms inside a Big-Oh. For example: –T(n) = O(2n^2) or –T(n) = O(n^2 + n) are the same as: –T(n) = O(n^2)

42 Examples Example 1: a = b; This assignment takes constant time, so it is Θ(1). Example 2:
sum = 0;
for (i = 0; i <= n; i++)
  sum += n;
Time complexity is Θ(n).

43 Examples CNTD
a = 0;
for (i = 1; i <= n; i++)
  for (j = 1; j <= n; j++)
    a++;
Time complexity is Θ(n^2).

44 Examples CNTD
a = 0;
for (i = 1; i <= n; i++)
  for (j = 1; j <= i; j++)
    a++;
The a++ statement executes 1 + 2 + … + n = n(n+1)/2 times. Time complexity is Θ(n^2).
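The n(n+1)/2 count is easy to confirm by running the loop and returning the counter:

```cpp
// The doubly nested loop of the slide above: the inner statement runs
// 1 + 2 + ... + n = n(n+1)/2 times, which is Theta(n^2).
long long triangular_count(int n) {
    long long a = 0;
    for (int i = 1; i <= n; i++)
        for (int j = 1; j <= i; j++)
            a++;
    return a;
}
```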

45 Examples CNTD
a = 0;                     // Θ(1)
for (i = 1; i <= n; i++)
  for (j = 1; j <= i; j++)
    a++;                   // Θ(n^2)
for (k = 1; k <= n; k++)   // Θ(n)
  A[k] = k-1;
Time complexity is Θ(n^2).

46 Examples CNTD Not all doubly nested loops execute n^2 times:
a = 0;
for (i = 1; i <= n; i++)
  for (j = 1; j <= n; j *= 2)
    a++;
The inner loop executes about log2(n) times, the outer loop n times. Time complexity is Θ(n log2(n)).
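Likewise for the n log n case: counting precisely, the inner loop runs floor(log2(n)) + 1 times per outer iteration, so the total count is n*(floor(log2(n)) + 1):

```cpp
// The doubling inner loop of the slide above; total increments are
// n * (floor(log2(n)) + 1), i.e. Theta(n log n).
long long nlogn_count(int n) {
    long long a = 0;
    for (int i = 1; i <= n; i++)
        for (int j = 1; j <= n; j *= 2)
            a++;
    return a;
}
```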

47 Useful asymptotic identities

48 Inference rules

49 First determine the asymptotic complexity of each statement and then add up

50 Asymptotic complexity of Rsum

51 Asymptotic complexity of Matrix Addition

52 Asymptotic complexity of Transpose

53 Asymptotic complexity of Inef

54 Asymptotic complexity of Sequential Search

55 Binary Search Worst-case complexity is Θ(log n)
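A sketch of binary search with an iteration counter (bsearch_count is an assumed name; the slide's code is not shown) illustrates the logarithmic worst case, since each pass halves the remaining range:

```cpp
// Binary search over a sorted array; the worst case is about
// log2(n) + 1 iterations.
int bsearch_count(const int a[], int n, int x, int& iters) {
    int lo = 0, hi = n - 1;
    iters = 0;
    while (lo <= hi) {
        iters++;
        int mid = lo + (hi - lo) / 2;    // avoids overflow of (lo + hi)
        if (a[mid] == x) return mid;
        if (a[mid] < x) lo = mid + 1;
        else hi = mid - 1;
    }
    return -1;                           // x not present
}
```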

56 Performance Measurement Chapter 2 Section 6

57 Run time on a pseudo machine

58 Conclusions The utility of a program with exponential complexity is limited to small n (typically n <= 40). Programs whose complexity is a high-degree polynomial are also of limited utility. Linear complexity is desirable in programming practice.

59 Performance Measurement Obtain the actual space and time requirements of a program. Choosing instance sizes. Developing the test data - should exhibit the best-, worst-, and average-case time complexity (using randomly generated data for the average case). Setting up the experiment - write a program that will measure the desired run times.
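The experiment setup, and the repeated-runs idea of the later slides, can be sketched with std::chrono; time_per_run is a hypothetical helper, since the book's own measurement code is not reproduced here:

```cpp
#include <chrono>

// Repeated-runs measurement: a single run may be shorter than the
// clock's resolution, so run the program many times and divide.
double time_per_run(void (*program)(), int repetitions) {
    auto start = std::chrono::steady_clock::now();
    for (int i = 0; i < repetitions; i++)
        program();                              // the code being measured
    auto stop = std::chrono::steady_clock::now();
    std::chrono::duration<double> elapsed = stop - start;
    return elapsed.count() / repetitions;       // average seconds per run
}
```

To remove the measurement loop's own cost (the overhead discussed on slides 64-66), time an empty run the same way and subtract it.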

60 Measuring the performance of Insertion Sort Program

61 Measuring the performance of Insertion Sort Program (continued)

62 Experimental results - Insertion Sort

63 Measuring with repeated runs

64 Do without overhead

65 Do without overhead (continued)

66 Overhead

67 End of Chapter 2