Program Performance 황승원 Fall 2010 CSE, POSTECH



Publishing Hwang’s Algorithm
Hwang’s took only 0.1 sec for DATASET1 on her PC while Dijkstra’s took 0.2 sec. TWO TIMES FASTER!!!! (maybe I deserve two T awards)
– Only for this set? (best, worst, average)
– Scalable to problem size?
– Only on her computer? (specialized for some operation?)

Program Performance
Program performance is the amount of computer memory and time needed to run a program. How is it determined?
– Analytically: performance analysis
– Experimentally: performance measurement

Criteria for Measurement
Space – the amount of memory the program occupies – usually measured in bytes or MB
Time – execution time – usually measured by counting the number of operations executed

Space Complexity
Space complexity is defined as the amount of memory a program needs to run to completion. Why is this of concern?
– We may not have sufficient memory on our computer.
– The space complexity may define an upper bound on the data that the program can handle. (e.g., 8,168,684,336 pages, Aug 2005)

Components of Program Space
Program space = instruction space + data space + stack space
The instruction space is dependent on several factors:
– the compiler that generated the machine code
– the compiler options that were set at compilation time
– the target computer

Components of Program Space
Data space
– very much dependent on the computer architecture and compiler
– The magnitude of the data that a program works with is another factor

Type    Size    Type          Size
char    1       float         4
short   2       double        8
int     2       long double   10
long    4       pointer       2

(Unit: bytes. See Figure 2.2)

Components of Program Space
Data space
– Choosing a “smaller” data type has an effect on the overall space usage of the program.
– Choosing the correct type is especially important when working with arrays.
– How much memory is allocated with these declarations?
double a[100]; // 800 bytes
int a[100]; // 200 bytes with the 2-byte int of Figure 2.2 – x4!!

Components of Program Space
Environment stack space
– Every time a function is called, the following data are saved on the stack:
1. the return address
2. the values of all local variables and value formal parameters
3. the binding of all reference and const reference parameters
– What is the impact of recursive function calls on the environment stack space?
e.g., fact(n) = n * fact(n-1), fact(1) = 1
fact(3) = fact(2) * 3 = (fact(1) * 2) * 3 = (1 * 2) * 3
At the deepest point the stack holds one frame each for fact(3), fact(2), fact(1).
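The recursion above can be sketched in code; a minimal illustration (function names are my own, not from the text) of why each recursive call adds a stack frame while an iterative version uses constant stack space:

```cpp
#include <cassert>

// Recursive factorial: each call pushes a new frame (return address,
// local variables, and the value parameter n) onto the environment
// stack, so computing fact(n) holds n frames at its deepest point.
long factRecursive(int n) {
    if (n <= 1) return 1;             // fact(1) = 1
    return n * factRecursive(n - 1);  // fact(n) = n * fact(n-1)
}

// Iterative version: a single frame, constant stack space.
long factIterative(int n) {
    long result = 1;
    for (int i = 2; i <= n; i++) result *= i;
    return result;
}
```

Both compute the same value; only their stack usage differs.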

Space Complexity Summary
How to make your programs more space efficient:
– Always choose the optimal (smallest necessary) data type.
– Choose non-recursive algorithms when appropriate.
READ: Chapter 2.2.2

Time Complexity
Time complexity is the amount of computer time a program needs to run. Why do we care about time complexity?
– Some programs require a real-time response.
– If there are many solutions to a problem, typically we’d like to choose the quickest.

Time Complexity
How do we measure?
– Count a particular operation (operation counts), e.g., comparisons in sorting
– Count the number of steps (step counts)
– Asymptotic complexity

Human Experiment: SORT!

Insertion sort (hand simulation)
[slide traces the sort by hand on a five-element array, positions [0]–[4], showing indices i and j and the value t at each step]
Best case? Worst case?

Running Example: Insertion Sort
for (int i = 1; i < n; i++) // n is the number of elements in the array
{ // insert a[i] into a[0:i-1]
  int t = a[i];
  int j;
  for (j = i - 1; j >= 0 && t < a[j]; j--)
    a[j + 1] = a[j];
  a[j + 1] = t;
}
[slide steps through the loop on a small example with n = 3]

Comparison Count for (int i = 1; i < n; i++) { // insert a[i] into a[0:i-1] int t = a[i]; int j; for (j = i - 1; j >= 0 && t < a[j]; j--) a[j + 1] = a[j]; a[j + 1] = t; }

Comparison Count
for (int i = 1; i < n; i++)
  for (j = i - 1; j >= 0 && t < a[j]; j--)
    a[j + 1] = a[j];
How many comparisons are made? The number of compares depends on the values in a[] (and t) as well as on n.

Comparison Count
Worst case count = maximum count
Best case count = minimum count
Average count

Worst Case Comparison Count
for (j = i - 1; j >= 0 && t < a[j]; j--)
  a[j + 1] = a[j];
a = [1, 2, 3, 4] and t = 0 ⇒ 4 compares
a = [1, 2, 3, …, i] and t = 0 ⇒ i compares

Worst Case Comparison Count
for (int i = 1; i < n; i++)
  for (j = i - 1; j >= 0 && t < a[j]; j--)
    a[j + 1] = a[j];
total compares = 1 + 2 + … + (n-1) = (n-1)n/2
READ Examples 2.7~2.18
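The (n-1)n/2 figure can be checked empirically with an instrumented variant of the loop; a sketch (the counter and function name are my additions, not from the text):

```cpp
#include <cassert>

// Insertion sort instrumented with a comparison counter.
// Returns the number of t < a[j] comparisons performed.
int insertionSortCompares(int a[], int n) {
    int compares = 0;
    for (int i = 1; i < n; i++) {  // insert a[i] into a[0:i-1]
        int t = a[i];
        int j;
        for (j = i - 1; j >= 0; j--) {
            compares++;            // one t < a[j] comparison
            if (!(t < a[j])) break;
            a[j + 1] = a[j];       // shift right
        }
        a[j + 1] = t;
    }
    return compares;
}
```

On a reversed array of n = 5 elements the counter reports 10 = (n-1)n/2; on an already-sorted array it reports only n-1 (the best case).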

Step Count
The operation-count method omits accounting for the time spent on all but the chosen operation.
The step-count method accounts for all the time spent in all parts of the program.
A program step is loosely defined as a syntactically or semantically meaningful segment of a program whose execution time is independent of the instance characteristics.
– 100 adds, 100 subtracts, and 1000 multiplies can be counted as one step.
– However, n adds cannot be counted as one step.
AN EXACT COUNT DOESN’T MEAN MUCH

Step Count

                                            steps/execution (s/e)
for (int i = 1; i < n; i++)                  1
{                                            0
  // insert a[i] into a[0:i-1]               0
  int t = a[i];                              1
  int j;                                     0
  for (j = i - 1; j >= 0 && t < a[j]; j--)   1
    a[j + 1] = a[j];                         1
  a[j + 1] = t;                              1
}                                            0

Step Count

                                            s/e   frequency
for (int i = 1; i < n; i++)                  1    n-1
{                                            0    0
  // insert a[i] into a[0:i-1]               0    0
  int t = a[i];                              1    n-1
  int j;                                     0    0
  for (j = i - 1; j >= 0 && t < a[j]; j--)   1    (n-1)n/2 + (n-1)
    a[j + 1] = a[j];                         1    (n-1)n/2
  a[j + 1] = t;                              1    n-1
}                                            0    n-1

(worst case: the inner for-header is tested once more than its body executes in each outer iteration)

Step Count
Total step count = (n-1) + (n-1) + [(n-1)n/2 + (n-1)] + (n-1)n/2 + (n-1)
                 = n² + 3n - 4
READ Examples 2.19~2.23
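A quick sanity check of the n² + 3n - 4 total: the loop below tallies steps under one plausible convention that reproduces the slide's total (the outer for-header charged once per iteration, the inner for-header once per test); a sketch, with the helper name my own:

```cpp
#include <cassert>

// Insertion sort charging 1 step each to the outer for-header,
// t = a[i], every inner for-header test, every shift, and a[j+1] = t
// (0 steps for everything else).
long insertionSortSteps(int a[], int n) {
    long steps = 0;
    for (int i = 1; i < n; i++) {
        steps++;                       // outer for-header
        int t = a[i]; steps++;         // int t = a[i];
        int j = i - 1;
        while (true) {
            steps++;                   // inner for-header test
            if (!(j >= 0 && t < a[j])) break;
            a[j + 1] = a[j]; steps++;  // shift
            j--;
        }
        a[j + 1] = t; steps++;         // a[j+1] = t;
    }
    return steps;
}
```

On a reversed (worst-case) array the tally matches n² + 3n - 4 exactly.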

Asymptotic Complexity
Neither of the two yields a very accurate measure:
– Operation counts focus on “key” operations and ignore all others.
– Step counts: the notion of a step is itself inexact.
Asymptotic complexity provides meaningful statements about the time and space complexities of a program.

Asymptotic Notation
Describes the behavior of the time or space complexity for large instance characteristics
Big Oh (O) notation provides an upper bound for the growth rate of the given function
Omega (Ω) notation provides a lower bound
Theta (Θ) notation is used when an algorithm can be bounded both from above and below by the same function

Big Oh
The time complexity T(n) of insertion sort lies in the complexity class O(n²): there exist constants c and n0 such that the time (number of operations) does not exceed c·n² on any input of size n (n ≥ n0). e.g., c = 3, n0 = 1
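The witnesses c = 3, n0 = 1 can be checked against the worst-case step count n² + 3n - 4 from the step-count slides; a small sketch (function name mine):

```cpp
#include <cassert>

// Big-Oh witness check: with c = 3 and n0 = 1, the worst-case step
// count n^2 + 3n - 4 never exceeds c * n^2 for any n >= n0.
bool withinBound(long n) {
    long steps = n * n + 3 * n - 4;
    return steps <= 3 * n * n;  // c = 3
}
```

Indeed 3n² - (n² + 3n - 4) = 2n² - 3n + 4 > 0 for all n ≥ 1, so the bound holds everywhere past n0.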

Omega
The worst-case time complexity T(n) of insertion sort lies in the complexity class Ω(n²): there exist constants c and n0 such that for every n ≥ n0, the time on the worst-case input of size n is at least c·n². e.g., c = 1, n0 = 2

Theta Examples
As seen above, the worst-case time of insertion sort lies in Ω(n²), and its upper bound lies in O(n²). Thus it lies in Θ(n²). How good is it?

Smart Lingo: Common Growth Rate Functions
1 (constant): growth is independent of the problem size n
log₂ n (logarithmic): growth increases slowly compared to the problem size (good search)
n (linear): directly proportional to the size of the problem
n log₂ n (n log n): typical of some divide-and-conquer approaches (good sort)
n² (quadratic): typical in nested loops
n³ (cubic): more nested loops
2ⁿ (exponential): growth is extremely rapid and possibly impractical
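The ranking of these functions can be verified numerically once n is moderately large; a sketch (function name mine) checking log₂ n < n < n·log₂ n < n² < n³ < 2ⁿ:

```cpp
#include <cassert>
#include <cmath>

// True when the common growth-rate functions are strictly ordered at n:
// log2(n) < n < n*log2(n) < n^2 < n^3 < 2^n. Holds once n is large
// enough (at n = 2, n equals n*log2(n), so the strict ordering fails).
bool growthOrdered(double n) {
    double lg = std::log2(n);
    return lg < n && n < n * lg && n * lg < n * n
           && n * n < n * n * n && n * n * n < std::pow(2.0, n);
}
```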

Beating Dummies: Practical Complexities
Overly complex programs may not be practical given the computing power of the system.
[slide shows a table of values of log n, n, n log n, n², n³, and 2ⁿ as n grows]

O(n) Sort? Sort as good as search? Really? What does Wikipedia say?

Example of LSD-Radix Sort
Input is an array of 15 integers. For decimal integers, the number of buckets is 10, for digits 0 to 9. The first pass distributes the keys into buckets by the least significant digit (LSD). [the resulting bucket contents appear on the slide]

Example of LSD-Radix Sort…contd
We collect these, keeping their relative order. Now we distribute by the next most significant digit, which is the highest digit in our example. When we collect them, they are in order. [the intermediate and final arrays appear on the slide]
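Since the slide's concrete 15-integer trace did not survive transcription, here is a generic sketch of the distribute-and-collect procedure described above (names are mine, not from the text):

```cpp
#include <cassert>
#include <vector>

// LSD radix sort for non-negative decimal integers: on each pass,
// distribute into 10 buckets by the current digit (least significant
// first), then collect the buckets in order. Each pass is stable, so
// the ordering established by earlier digits is preserved.
void radixSortLSD(std::vector<int>& a, int maxDigits) {
    long divisor = 1;
    for (int d = 0; d < maxDigits; d++) {
        std::vector<std::vector<int>> buckets(10);
        for (int x : a)
            buckets[(x / divisor) % 10].push_back(x);  // distribute
        a.clear();
        for (const auto& b : buckets)                  // collect in order
            a.insert(a.end(), b.begin(), b.end());
        divisor *= 10;
    }
}
```

Each of the maxDigits passes does O(n) work, so for keys with a bounded number of digits the sort runs in O(n), which is what the "O(n) Sort?" slide is hinting at.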

Sort in search engine? Top-20? Special lectures – next two weeks

Coming Up Next
READING: Ch 2-3
NEXT: Linear Lists – Array Representation (Ch 5)