Algorithm Efficiency CS 110: Data Structures and Algorithms First Semester, 2010-2011.

Learning Objectives
► To analyze the efficiency of algorithms in terms of counting operations and Big-Oh notation

Algorithm Efficiency
► An algorithm should not use any more of the computer's resources than necessary
► Two options for assessing efficiency:
► Benchmarking
► Analysis

Algorithm Efficiency
► Benchmarking
► Measure the execution time of an algorithm using System.currentTimeMillis() or similar methods
► Pitfalls
► Limited set of test inputs – may not be indicative of running time on other inputs
► Comparing two algorithms requires identical machine setups
► The algorithm must first be implemented
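
As a concrete illustration of the benchmarking option, here is a minimal Java sketch using System.currentTimeMillis() as mentioned above. The class name and the work method are hypothetical stand-ins for whatever algorithm is actually being measured.

```java
public class Benchmark {
    // Stand-in for the algorithm under test: sums the integers 1..n.
    static long work(int n) {
        long sum = 0;
        for (int i = 1; i <= n; i++) {
            sum += i;
        }
        return sum;
    }

    // Times one run of work(n) in milliseconds using System.currentTimeMillis().
    static long timeMillis(int n) {
        long start = System.currentTimeMillis();
        work(n);
        return System.currentTimeMillis() - start;
    }

    public static void main(String[] args) {
        // The measured time depends on this machine and this input,
        // which is exactly the pitfall the slide warns about.
        System.out.println("elapsed ms: " + timeMillis(1_000_000));
    }
}
```

Note that the result is only meaningful relative to other timings taken on the same machine with the same inputs.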

Algorithm Analysis
► Define primitive operations:
► Assigning a value to a variable
► Calling a method
► Performing an arithmetic operation
► Comparing two values
► Indexing into an array
► Following an object reference
► Returning from a method

Algorithm Analysis
► Count the total number of operations:

Algorithm arrayMax(A, n):
  Input: An array A storing n integers.
  Output: The maximum element in A.
  max ← A[0]
  for i ← 1 to (n - 1) do
    if max < A[i] then
      max ← A[i]
  return max
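
The pseudocode above translates directly into Java. This sketch assumes, per the input specification, that the array is non-empty; the class name is illustrative.

```java
public class ArrayMax {
    // Returns the maximum element of A.
    // Mirrors the pseudocode: max starts at A[0], then the loop
    // scans indices 1 to n-1, updating max whenever a larger element is seen.
    static int arrayMax(int[] A) {
        int max = A[0];
        for (int i = 1; i <= A.length - 1; i++) {
            if (max < A[i]) {
                max = A[i];
            }
        }
        return max;
    }

    public static void main(String[] args) {
        System.out.println(arrayMax(new int[]{3, 1, 4, 1, 5}));  // prints 5
    }
}
```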

Algorithm Analysis
► Some notes
► The for statement implies assignments, comparisons, subtractions, and increments
► The statement max ← A[i] will sometimes not be carried out

What to count?

max ← A[0]                   assignment, array access
for i ← 1 to (n - 1) do      assignment, comparison, subtraction, increment (2 operations)
  if max < A[i] then         comparison, array access
    max ← A[i]               assignment, array access
return max                   return

Algorithm Analysis
► Counting operations:

max ← A[0]                   2
for i ← 1 to (n - 1) do      1 + 2n + 2(n - 1)
  if max < A[i] then         2(n - 1)
    max ← A[i]               0 … 2(n - 1)
return max                   1

► Total, worst case (max ← A[i] executes in every iteration): 8n - 2
► Total, best case (max ← A[i] never executes): 6n
► The average case lies between these, depending on the input distribution

Algorithm Analysis
► Exercise: What if there are nested loops?

for i ← 1 to n do
  for j ← 1 to n do
    print i, j

► Note: the inner loop runs n times for each of the n iterations of the outer loop, so the print statement executes n² times
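
A small Java check (names are illustrative) confirms that the statement inside the nested loops executes n · n = n² times:

```java
public class NestedLoops {
    // Counts how many times the innermost statement runs for the
    // nested loops on the slide: for i = 1..n, for j = 1..n, print i, j.
    static int countInnerExecutions(int n) {
        int count = 0;
        for (int i = 1; i <= n; i++) {
            for (int j = 1; j <= n; j++) {
                count++;  // stands in for "print i, j"
            }
        }
        return count;
    }

    public static void main(String[] args) {
        System.out.println(countInnerExecutions(4));  // prints 16
    }
}
```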

Worst Case vs. Average Case
► Worst case: the maximum number of operations executed
► Average case: the average number of operations over a typical run
► Requires defining the range of inputs
► Requires knowing the probability distribution over that range of inputs

Algorithm Analysis
► Considerations for counting primitive steps
► Implicit operations in the execution path
► Worst case vs. average case vs. best case
► Arbitrariness of the measurement
► Compare algorithms by looking at growth rates
► Linear vs. polynomial vs. exponential
► Goal
► To simplify analysis by getting rid of irrelevant information

Asymptotic Behavior
► How important is the exact number of primitive operations?
► Example: arrayMax
► It's enough to say: "The running time of arrayMax grows proportionally to n"

Big-Oh Notation
► Big-Oh notation provides a way to compare two functions
► "f(n) is O(g(n))" means: f(n) is less than or equal to g(n) up to a constant factor for large values of n

Formal Definition of Big-Oh
► Let f(n) and g(n) be functions mapping input size n to running time
► f(n) is O(g(n)), read "f(n) is Big-Oh of g(n)", if there exist:
► a real constant c > 0
► an integer constant n₀ ≥ 1
► such that f(n) ≤ c·g(n) for all n ≥ n₀

Example
► f(n) = 2n + 5 and g(n) = n
► Consider the condition 2n + 5 ≤ n. Will this condition ever hold? No!
► Suppose we multiply n by a constant: 2n + 5 ≤ 3n. This condition holds for all values of n ≥ 5
► Thus, we can select c = 3 and n₀ = 5
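
The chosen witnesses c = 3 and n₀ = 5 can be spot-checked in Java; the class and method names here are illustrative.

```java
public class BigOhWitness {
    // Checks the Big-Oh condition f(n) <= c*g(n) at one value of n,
    // for f(n) = 2n + 5 and g(n) = n from the slide.
    static boolean holds(int c, int n) {
        return 2 * n + 5 <= c * n;
    }

    public static void main(String[] args) {
        System.out.println(holds(3, 5));     // prints true  (15 <= 15)
        System.out.println(holds(3, 4));     // prints false (13 <= 12 fails)
        System.out.println(holds(3, 1000));  // prints true
    }
}
```

A finite check like this cannot prove the bound for all n, but it usefully confirms that the condition first holds exactly at n₀ = 5.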


Big-Oh Notation
► 6n - 3 is O(n)
► c = 6
► 6n - 3 ≤ 6n for all n ≥ 1
► 3n² + 2n is O(n²)
► c = 4
► Is there an n₀ such that 3n² + 2n ≤ 4n² for all n ≥ n₀? Yes: n₀ = 2, since 2n ≤ n² whenever n ≥ 2
► 4n³ + 8n is O(n⁴)

Big-Oh Notation
► nᵃ ∈ O(nᵇ) whenever a ≤ b
► We need a c such that nᵃ ≤ c·nᵇ, which is equivalent to 1 ≤ c·nᵇ⁻ᵃ
► Since a ≤ b, the exponent d = b - a satisfies d ≥ 0
► If b = a then 1 ≤ c·nᵇ⁻ᵃ reduces to 1 ≤ c·n⁰, i.e. 1 ≤ c
► In general, 1 ≤ c·nᵈ (where d ≥ 0) holds for all n ≥ 1 and c ≥ 1, so we can choose c = 1 and n₀ = 1

Big-Oh Notation
► Usually, to prove that f(n) ∈ O(g(n)), we give a c that works and then solve for n₀
► However, this often involves factoring polynomials, which is hard for degrees greater than 2
► An alternative is to give an n₀ first and then solve for c

Big-Oh Notation
► 3n² + 26n + 34 ∈ O(n²)
► Let n₀ = 1, so n ≥ 1. Thus:
► n² ≥ n², n² ≥ n, n² ≥ 1
► 3n² ≥ 3n², 26n² ≥ 26n, 34n² ≥ 34
► Adding all terms:
► 3n² + 26n² + 34n² ≥ 3n² + 26n + 34
► 63n² ≥ 3n² + 26n + 34
► So c = 63
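
The bound from the proof can be spot-checked numerically. This Java sketch (names are illustrative) verifies 3n² + 26n + 34 ≤ 63n² for n = 1 to 1000:

```java
public class BigOhBound {
    // f(n) = 3n^2 + 26n + 34, and the bound c*g(n) = 63n^2 from the proof.
    static long f(long n) { return 3 * n * n + 26 * n + 34; }
    static long bound(long n) { return 63 * n * n; }

    public static void main(String[] args) {
        boolean holdsUpTo1000 = true;
        for (long n = 1; n <= 1000; n++) {
            if (f(n) > bound(n)) {
                holdsUpTo1000 = false;
            }
        }
        System.out.println(holdsUpTo1000);  // prints true
        // At n0 = 1 the inequality is tight: f(1) = 63 = 63 * 1^2.
        System.out.println(f(1) == bound(1));  // prints true
    }
}
```

The proof, not the loop, is what establishes the bound for all n; the check simply confirms no arithmetic slip in deriving c = 63.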

Big-Oh Notation
► n² ∉ O(n)
► Proof by contradiction
► Suppose ∃ c > 0 and n₀ ≥ 1 such that n² ≤ c·n for all n ≥ n₀
► n² ≤ c·n ⇒ n ≤ c (since n > 0, we can safely divide by n)
► This says every n ≥ n₀ satisfies n ≤ c, i.e. c is an upper bound on n, which is a contradiction since n grows without bound

Big-Oh Notation
► 3ⁿ ∉ O(2ⁿ)
► Suppose 3ⁿ ∈ O(2ⁿ); then 3ⁿ ≤ c·2ⁿ for some c > 0 and all n ≥ n₀ ≥ 1
► Note that log is a monotonically increasing function, so 0 < a ≤ b if and only if log(a) ≤ log(b)
► log 3ⁿ ≤ log (c·2ⁿ) ⇒ n log 3 ≤ log c + n log 2
► n (log 3 - log 2) ≤ log c

Big-Oh Notation
► 3ⁿ ∉ O(2ⁿ) (continued)
► n (log 3 - log 2) ≤ log c
► To make the inequality easier to read, let b = log 3 - log 2 (note that b > 0) and a = log c
► Thus nb ≤ a, i.e. n ≤ a/b for all n ≥ n₀, which is a contradiction since n cannot have an upper bound

Big-Oh Properties/Identities
► g(O(f(n))) = { g(h(n)) for all h(n) in O(f(n)) }
► c · O(f(n)) = O(f(n))
► O(O(f(n))) = O(f(n))
► O(f(n)) · O(g(n)) = O(f(n) · g(n))
► f(n) · O(g(n)) = O(f(n) · g(n))
► O(f(n)) + O(g(n)) = O(|f(n)| + |g(n)|)

Big-Oh Notation
► Big-Oh allows us to ignore constant factors and lower-order (less dominant) terms
► Rule: drop lower-order terms and constant factors
► 5n + 2 is O(n)
► 4n³ log n + 6n is O(n³ log n)
► This allows us to classify functions into categories

Function Categories
► The constant function: f(n) = 1
► The linear function: f(n) = n
► The quadratic function: f(n) = n²
► The cubic function: f(n) = n³
► The exponential function: f(n) = 2ⁿ
► The logarithm function: f(n) = log n
► The n log n function: f(n) = n log n

Comparing Function Categories
► Linear (n) is better than quadratic (n²), which is better than exponential (2ⁿ)
► Are there any function categories better than linear? Yes!
► Constant (1)
► Logarithmic (log n)
► "Better" means the resulting values are smaller (slower growth rates)

Functions by Increasing Growth Rate
► The constant function: f(n) = 1
► The logarithm function: f(n) = log n
► The linear function: f(n) = n
► The n log n function: f(n) = n log n
► The quadratic function: f(n) = n²
► The cubic function: f(n) = n³
► The exponential function: f(n) = 2ⁿ
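
Evaluating each category at a fixed input size, say n = 100, makes the ordering concrete. This Java sketch (names are illustrative) lists the values slowest-growing first and checks that they are strictly increasing:

```java
public class GrowthRates {
    // Evaluates each function category at n, in the slide's order:
    // constant, log, linear, n log n, quadratic, cubic, exponential.
    static double[] values(int n) {
        double log = Math.log(n) / Math.log(2);  // log base 2
        return new double[] {
            1,                     // constant
            log,                   // logarithmic
            n,                     // linear
            n * log,               // n log n
            (double) n * n,        // quadratic
            (double) n * n * n,    // cubic
            Math.pow(2, n)         // exponential
        };
    }

    public static void main(String[] args) {
        // At n = 100 the list is strictly increasing,
        // matching the slide's growth-rate ordering.
        double[] v = values(100);
        boolean increasing = true;
        for (int i = 1; i < v.length; i++) {
            if (v[i] <= v[i - 1]) {
                increasing = false;
            }
        }
        System.out.println(increasing);  // prints true
    }
}
```

Note the ordering is only guaranteed for large enough n; at very small n (e.g. n = 2, where log n = 1 = the constant function) some categories tie or swap.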

Big-Oh in this Course
► For this course, you will be expected to assess the running time of an algorithm and classify it under one of the categories, using Big-Oh notation
► You should be able to recognize, for instance, that most of the time (not always):
► Algorithms with single loops are O(n)
► Algorithms with doubly-nested loops are O(n²)

Big-Oh as an Upper Bound
► The statement "f(n) is O(g(n))" indicates that g(n) is an upper bound for f(n)
► This means it is also correct to make statements like:
► 3n + 5 is O(n²)
► 3n + 5 is O(2ⁿ)
► 3n + 5 is O(5n + log n - 2)
► But the statement "3n + 5 is O(n)" is the tightest statement one can make

Other Ways of Analysis
► Aside from Big-Oh, there are other ways of analyzing algorithm running time
► Big-Omega, Ω(g(n))
► Specifies an asymptotic lower bound rather than an upper bound
► Big-Theta, Θ(g(n))
► Specifies both asymptotic upper and lower bounds, i.e. f(n) ∈ Θ(g(n)) if f(n) ∈ O(g(n)) and f(n) ∈ Ω(g(n))

Big-Omega and Big-Theta
► 6n - 3 is Ω(n)
► c = 3
► 6n - 3 ≥ 3n for all n ≥ 1
► 3n² + 2n is Θ(n²)
► c = 3
► 3n² + 2n ≥ 3n² for all n ≥ 1
► Therefore it is Ω(n²); since it is also O(n²), as shown earlier, it is Θ(n²)
► 4n³ + 8n is not Θ(n⁴)