Analysis of Algorithms: Methods and Examples
CSE 2320 – Algorithms and Data Structures
Alexandra Stefan
Based on presentations by Vassilis Athitsos and Bob Weems
University of Texas at Arlington
Updated 2/3/2016

Reading
CLRS: Chapter 3 (all) – strongly encouraged
Sedgewick: Chapter 2 (except for 2.5 – Basic recurrences)

Counting instructions
Count in detail the total number of instructions executed by each of the following pieces of code:

// Example A. Notice the ; at the end of the for loop.
for (i = 0; i < N; i++)
    ;

// Example B (source: Dr. Bob Weems)
for (i = 0; i < N; i++)
    for (j = 0; j < N; j++) {
        c[i][j] = 0;
        for (k = 0; k < N; k++)
            c[i][j] += a[i][k] * b[k][j];
    }
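For reference (these answers are not given in the exercise itself), a rough count under one common convention that charges one unit per initialization, test, increment, and assignment:

\[ T_A(N) = 1 + (N+1) + N = 2N + 2 = \Theta(N) \]
\[ T_B(N) = \Theta(N^3), \text{ since the innermost statement executes } N \cdot N \cdot N = N^3 \text{ times} \]

Other counting conventions change the constants, but not the order of growth.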

Counting instructions – continued

// Example C (source: Dr. Bob Weems). T(N) = …
for (i = 0; i < N; i++)
    for (j = 1; j < N; j = j+j)
        printf("A");

// Example D. T(N) = …
for (i = 0; i < N; i++)
    for (j = 1; j < N; j = j*2)
        printf("A");

// Example E. T(N) = …
for (i = 0; i < N; i++)
    for (j = N; j >= 1; j = j/2)
        printf("A");

// Example F. T(N) = …
for (i = 0; i < N; i++)
    for (j = 1; j < N; j = j+2)
        printf("A");
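For reference (the answers are left blank in the original exercise), counting how many times the inner printf executes gives

\[ T_C(N) = \Theta(N \lg N), \quad T_D(N) = \Theta(N \lg N), \quad T_E(N) = \Theta(N \lg N), \quad T_F(N) = \Theta(N^2), \]

since j = j+j and j = j*2 double j each pass (about lg N inner iterations), j = j/2 halves j starting from N (about lg N + 1 iterations), and j = j+2 takes about N/2 iterations.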

Estimate runtime
Problem: The total number of instructions executed by a program (or a piece of a program) is 10^12, and the computer it runs on executes 10^9 instructions per second. How long will it take to run this program? Give the answer in seconds. If it is very large, convert it to larger units (hours, days, years).
Summary:
– Total instructions: 10^12
– Speed: 10^9 instructions/second
Answer:
– Time = (total instructions) / speed = (10^12 instructions) / (10^9 instr/sec) = 10^3 seconds ≈ 17 minutes
Note that this computation is similar to computing the time it takes to travel a certain distance (e.g., 120 miles) given the speed (e.g., 60 miles/hour).

Estimate runtime
A slightly different way to formulate the same problem:
– The total number of instructions in a program (or a piece of a program) is 10^12, and
– the computer it runs on executes one instruction in one nanosecond (10^-9 seconds).
– How long will it take to run this program? Give the answer in seconds. If it is very large, convert it to larger units (hours, days, years).
Summary:
– 10^12 total instructions
– 10^-9 seconds per instruction
Answer:
– Time = (total instructions) × (seconds per instruction) = (10^12 instructions) × (10^-9 sec/instr) = 10^3 seconds ≈ 17 minutes
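As a quick check of the arithmetic, here is a small C program (the helper name estimate_seconds is illustrative, not from the slides):

#include <stdio.h>

/* Estimate runtime in seconds from a total instruction count and a
   machine speed given in instructions per second. */
double estimate_seconds(double total_instructions, double instr_per_sec)
{
    return total_instructions / instr_per_sec;
}

int main(void)
{
    /* 10^12 instructions on a machine executing 10^9 instructions/second */
    double secs = estimate_seconds(1e12, 1e9);
    printf("%.0f seconds (about %.1f minutes)\n", secs, secs / 60.0);
    return 0;
}

It prints 1000 seconds (about 16.7 minutes), matching the hand calculation above.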

Motivation for Big-Oh Notation
Scenario: a client requests an application for sorting his records. Which one of the 3 sorting algorithms that we discussed will you implement?
– Selection sort
– Insertion sort
– Merge sort

Comparing algorithms
Comparing linear, N lg N, and quadratic time. Quadratic time algorithms become impractical (too slow) much faster than linear and N lg N time algorithms. Of course, what we consider "impractical" depends on the application.
– Some applications are more tolerant of longer running times.

N                    N log N              N^2
10^6  (1 million)    about 20 million     10^12 (one trillion)
10^9  (1 billion)    about 30 billion     10^18 (one quintillion)
10^12 (1 trillion)   about 40 trillion    10^24 (one septillion)

Motivation for Big-Oh Notation
Given an algorithm, we want to find a function that describes the time performance of the algorithm.
Computing the number of instructions in detail is NOT desired:
– It is complicated, and the details are not relevant.
– The number of machine instructions and the runtime depend on factors other than the algorithm:
   Programming language
   Compiler optimizations
   Performance of the computer it runs on (CPU, memory)
There are some details that we would actually NOT want this function to include, because they can make the function unnecessarily complicated.
When comparing two algorithms we want to see which one is better for very large data (asymptotic behavior).
– It is not very important what happens for small-size data.
The Big-Oh notation describes the asymptotic behavior and greatly simplifies algorithmic analysis.

Asymptotic Notation
Goal: we want to be able to say things like:
– Selection sort will take time proportional to N^2: Θ(N^2)
– Insertion sort will take time at most proportional to N^2: O(N^2) (and proportional to N when the data is already sorted)
– Any sorting algorithm will take time at least proportional to N: Ω(N)
Math functions that are:
– Θ(N^2):
– O(N^2):
– Ω(N^2):


Big-Oh
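For reference, the standard formal definition (CLRS, Chapter 3):

\[ f(N) = O(g(N)) \iff \exists\, c > 0,\ N_0 > 0 \text{ such that } 0 \le f(N) \le c\, g(N) \text{ for all } N \ge N_0 \]

Intuitively, g is an upper bound on f up to a constant factor, for sufficiently large N.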

Asymptotic Bounds and Notation (CLRS Chapter 3)
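The corresponding lower and tight bounds (again standard CLRS definitions, included here for reference):

\[ f(N) = \Omega(g(N)) \iff \exists\, c > 0,\ N_0 > 0 \text{ such that } 0 \le c\, g(N) \le f(N) \text{ for all } N \ge N_0 \]
\[ f(N) = \Theta(g(N)) \iff \exists\, c_1, c_2 > 0,\ N_0 > 0 \text{ such that } c_1\, g(N) \le f(N) \le c_2\, g(N) \text{ for all } N \ge N_0 \]

Equivalently, f(N) = Θ(g(N)) if and only if f(N) = O(g(N)) and f(N) = Ω(g(N)).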

Abuse of notation
Instead of:
We may use:
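The abuse of notation most likely intended here is the standard one of writing set membership as equality:

\[ f(N) = O(g(N)) \quad \text{instead of} \quad f(N) \in O(g(N)) \]

since O(g(N)) is formally a set of functions.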

Theta vs Big-Oh
The Theta notation is stricter than the Big-Oh notation:
– We can say that N^2 = O(N^100).
– We cannot say that N^2 = Θ(N^100).
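A short justification of both claims:

\[ N^2 \le 1 \cdot N^{100} \text{ for all } N \ge 1, \text{ so } N^2 = O(N^{100}) \text{ with } c = 1,\ N_0 = 1, \]
\[ \text{but } N^2 \ge c \cdot N^{100} \text{ would require } c \le N^{-98} \to 0, \text{ so } N^2 \ne \Omega(N^{100}) \text{ and hence } N^2 \ne \Theta(N^{100}). \]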

Simplifying Big-Oh Notation
Suppose that we are given this running time: f(N) = 35N^2 + 41N + lg(N)
How can we express f(N) in Big-Oh notation?

Simplifying Big-Oh Notation
Suppose that we are given this running time: f(N) = 35N^2 + 41N + lg(N)
How can we express f(N) in Big-Oh notation?
Typically we say that f(N) = O(N^2). The following are also correct, but unnecessarily complicated, and thus less useful, and rarely used:
– f(N) = O(N^2) + O(N)
– f(N) = O(N^2) + O(N) + O(lg N) + O(1)
– f(N) = O(35N^2 + 41N + lg(N))

Simplifying Big-Oh Notation
Suppose that we are given this running time: f(N) = 35N^2 + 41N + lg(N)
We say that f(N) = O(N^2). Why is this mathematically correct?
– Why can we ignore the non-quadratic terms?
Ans 1: Using the Big-Oh definition: we can find an N_0 and c such that, for all N ≥ N_0: f(N) ≤ cN^2.
– If you don't believe this, do the calculations for practice. Use: c = 38, N_0 = 41 (or N_0 = 1536)
f(N) = 35N^2 + 41N + lg(N) ≤ 38N^2, for all N ≥ 41
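A minimal sketch of the bounding step, assuming the running time is f(N) = 35N^2 + 41N + lg(N) as above: bound each lower-order term by a multiple of N^2,

\[ 35N^2 + 41N + \lg N \;\le\; 35N^2 + 41N^2 + N^2 \;=\; 77N^2 \quad \text{for all } N \ge 1, \]

so c = 77 and N_0 = 1 also witness f(N) = O(N^2); the constants in the definition are not unique.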

Polynomial functions
If f(N) is a polynomial function, then it is Θ of its dominant (highest-order) term.
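Stated precisely:

\[ f(N) = a_k N^k + a_{k-1} N^{k-1} + \dots + a_1 N + a_0 \ \text{ with } a_k > 0 \quad \Longrightarrow \quad f(N) = \Theta(N^k) \]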

Properties of O, Ω, and Θ
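Commonly cited properties (standard facts, listed here for reference):

\[ f = \Theta(g) \iff f = O(g) \text{ and } f = \Omega(g) \]
\[ f = O(g) \iff g = \Omega(f) \qquad \text{(transpose symmetry)} \]
\[ f = \Theta(g) \iff g = \Theta(f) \qquad \text{(symmetry)} \]
\[ f = O(g) \text{ and } g = O(h) \implies f = O(h) \qquad \text{(transitivity; likewise for } \Omega \text{ and } \Theta) \]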

Using Limits
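A standard limit test, usable whenever the limit exists:

\[ \lim_{N \to \infty} \frac{f(N)}{g(N)} = c,\ 0 < c < \infty \ \Rightarrow\ f = \Theta(g); \qquad \lim_{N \to \infty} \frac{f(N)}{g(N)} = 0 \ \Rightarrow\ f = O(g); \qquad \lim_{N \to \infty} \frac{f(N)}{g(N)} = \infty \ \Rightarrow\ f = \Omega(g) \]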

Using Limits: An Example
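As a generic illustration (not necessarily the example from the original slides), limits show that N lg N grows more slowly than N^2:

\[ \lim_{N \to \infty} \frac{N \lg N}{N^2} = \lim_{N \to \infty} \frac{\lg N}{N} = 0, \]

so N lg N = O(N^2) but N lg N ≠ Θ(N^2). The last limit follows from L'Hôpital's rule: the ratio of derivatives is 1/(N ln 2), which goes to 0.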

Big-Oh Hierarchy
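The usual ordering of common growth rates, from slowest- to fastest-growing:

\[ 1 \ \prec\ \lg N \ \prec\ \sqrt{N} \ \prec\ N \ \prec\ N \lg N \ \prec\ N^2 \ \prec\ N^3 \ \prec\ 2^N \ \prec\ N! \]

where f ≺ g means f = O(g) but g ≠ O(f).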

Big-Oh Transitivity
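The transitivity property, with its one-line proof:

\[ f = O(g) \text{ and } g = O(h) \implies f = O(h), \]

since f(N) ≤ c_1 g(N) for N ≥ N_1 and g(N) ≤ c_2 h(N) for N ≥ N_2 give f(N) ≤ c_1 c_2 h(N) for N ≥ max(N_1, N_2), so c = c_1 c_2 and N_0 = max(N_1, N_2) satisfy the definition.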

Using Substitutions

Example Problem 1

Example Problem 2

Asymptotic notation for two parameters (CLRS)
f(N,M) is O(g(N,M)) if there exist constants c_0, N_0, and M_0 such that:
f(N,M) ≤ c_0 · g(N,M) for all pairs (N,M) such that either N ≥ N_0 or M ≥ M_0

Useful logarithm properties
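Standard identities that come up repeatedly in these analyses (lg denotes log base 2):

\[ \lg(ab) = \lg a + \lg b, \qquad \lg(a/b) = \lg a - \lg b, \qquad \lg(a^k) = k \lg a, \]
\[ \log_b a = \frac{\lg a}{\lg b} \ \text{(change of base)}, \qquad a^{\log_b c} = c^{\log_b a}, \qquad 2^{\lg N} = N \]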

Summary