CS203 Lecture 7 John Hurley Cal State LA

2 Execution Time Suppose two algorithms perform the same task, such as search (linear search vs. binary search) or sorting (selection sort vs. insertion sort). Which one is better? One possible approach to answering this question is to implement both algorithms in Java and run the programs to measure their execution times. But there are two problems with this approach: First, many tasks run concurrently on a computer, so the execution time of a particular program depends on the system load. Second, the execution time depends on the specific input. Consider linear search and binary search, for example: if the element to be searched for happens to be the first in the list, linear search will find it more quickly than binary search.
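As a rough illustration (a minimal sketch, not from the slides; linearSearch, list, and key are assumed to be defined elsewhere), timing a search in Java might look like this, and the measured value will vary with system load and input:

long start = System.currentTimeMillis();
int index = linearSearch(list, key);                  // hypothetical helper, assumed defined
long elapsed = System.currentTimeMillis() - start;    // wall-clock time, affected by other processes
System.out.println("linear search: " + elapsed + " ms, found at index " + index);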

3 Growth Rate It is very difficult to compare algorithms by measuring their execution times. To overcome these problems, a theoretical approach was developed to analyze algorithms independently of computers and specific input. This approach approximates the effect of a change in the input size on the execution time. In this way, you can see how fast an algorithm’s execution time increases as the input size increases, so you can compare two algorithms by examining their growth rates.

4 Big O Notation Consider linear search. The linear search algorithm compares the key with the elements in the array sequentially until the key is found or the array is exhausted. If the key is not in the array, it requires n comparisons for an array of size n. If the key is in the array, it requires n/2 comparisons on average. The algorithm’s execution time is proportional to the size of the array: if you double the size of the array, you can expect the number of comparisons to double. The algorithm grows at a linear rate, so the growth rate has an order of magnitude of n. Computer scientists use the Big O notation as an abbreviation for “order of magnitude.” Using this notation, the complexity of the linear search algorithm is O(n), pronounced “order of n.”
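A minimal linear search sketch in Java (assuming an int array; not taken verbatim from the lecture) makes the comparison count visible:

public static int linearSearch(int[] list, int key) {
    for (int i = 0; i < list.length; i++) {    // at most n iterations for an array of size n
        if (list[i] == key) {
            return i;                          // found: i + 1 comparisons so far
        }
    }
    return -1;                                 // not found: n comparisons
}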

5 Best, Worst, and Average For the same input size, an algorithm’s execution time may vary depending on the input. An input that results in the shortest execution time is called the best-case input, and an input that results in the longest execution time is called the worst-case input. Best-case and worst-case inputs are not representative, but worst-case analysis is very useful: you can show that the algorithm will never be slower than the worst case. An average-case analysis attempts to determine the average execution time over all possible inputs of the same size. Average-case analysis is ideal but difficult to perform, because for many problems it is hard to determine the relative probabilities and distributions of the various input instances. Worst-case analysis is easier to obtain, so the analysis is generally conducted for the worst case.

6 Ignore Multiplicative Constants The linear search algorithm requires n comparisons in the worst-case and n/2 comparisons in the average-case. Using the Big O notation, both cases require O(n) time. The multiplicative constant (1/2) can be omitted. Algorithm analysis is focused on growth rate. The multiplicative constants have no impact on growth rates. The growth rate for n/2 or 100n is the same as n, i.e., O(n) = O(n/2) = O(100n).
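For example (a small sketch, not from the slides; n is assumed to be defined), both loops below are O(n) even though the second does twice as much work per element:

int k = 0;
for (int i = 1; i <= n; i++) {
    k = k + 5;                 // roughly c * n operations
}
for (int i = 1; i <= n; i++) {
    k = k + 5;
    k = k + 5;                 // roughly 2 * c * n operations, still O(n)
}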

7 Ignore Non-Dominating Terms Consider the algorithm for finding the maximum number in an array of n elements. If n is 2, it takes one comparison to find the maximum number. If n is 3, it takes two comparisons. In general, it takes n - 1 comparisons to find the maximum number in a list of n elements. Algorithm analysis is concerned with large input sizes; if the input size is small, there is little significance in estimating an algorithm’s efficiency. As n grows larger, the n part of the expression n - 1 dominates the complexity. The Big O notation allows you to ignore the non-dominating part (e.g., the -1 in n - 1) and highlight the important part (the n). So, the complexity of this algorithm is O(n).
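A sketch of this maximum-finding algorithm in Java (an assumed form, consistent with the n - 1 comparison count; the array is assumed non-empty):

public static int findMax(int[] list) {
    int max = list[0];                         // assumes list has at least one element
    for (int i = 1; i < list.length; i++) {    // n - 1 iterations
        if (list[i] > max) {                   // one comparison per iteration
            max = list[i];
        }
    }
    return max;
}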

8 Useful Mathematical Summations
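Two kinds of summations come up repeatedly in these analyses: the arithmetic series (the one used on slide 11 below) and the geometric series:

1 + 2 + 3 + ... + (n - 1) + n = n(n + 1)/2
a^0 + a^1 + a^2 + ... + a^(n-1) + a^n = (a^(n+1) - 1)/(a - 1), for a ≠ 1
2^0 + 2^1 + 2^2 + ... + 2^(n-1) + 2^n = 2^(n+1) - 1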

9 Repetition: Simple Loops

for (i = 1; i <= n; i++) {
    k = k + 5;                     // constant time, executed n times
}

Time complexity: T(n) = (a constant c) * n = cn = O(n). Ignore multiplicative constants (e.g., c).

10 Repetition: Nested Loops

for (i = 1; i <= n; i++) {         // outer loop executed n times
    for (j = 1; j <= n; j++) {     // inner loop executed n times
        k = k + i + j;             // constant time
    }
}

Time complexity: T(n) = (a constant c) * n * n = cn^2 = O(n^2). Ignore multiplicative constants (e.g., c).

11 Repetition: Nested Loops

for (i = 1; i <= n; i++) {         // outer loop executed n times
    for (j = 1; j <= i; j++) {     // inner loop executed i times
        k = k + i + j;             // constant time
    }
}

Time complexity: T(n) = c + 2c + 3c + 4c + ... + nc = cn(n + 1)/2 = (c/2)n^2 + (c/2)n = O(n^2). Ignore multiplicative constants and non-dominating terms.

12 Repetition: Nested Loops

for (i = 1; i <= n; i++) {         // outer loop executed n times
    for (j = 1; j <= 20; j++) {    // inner loop executed 20 times
        k = k + i + j;             // constant time
    }
}

Time complexity: T(n) = 20 * c * n = O(n). Ignore multiplicative constants (e.g., 20 * c).

13 Sequence

for (j = 1; j <= 10; j++) {        // executed 10 times
    k = k + 4;
}
for (i = 1; i <= n; i++) {         // outer loop executed n times
    for (j = 1; j <= 20; j++) {    // inner loop executed 20 times
        k = k + i + j;
    }
}

Time complexity: T(n) = c * 10 + 20 * c * n = O(n).

14 Selection

if (list.contains(e)) {            // test: O(n), where n is list.size()
    System.out.println(e);
} else {
    for (Object t : list) {        // loop executed n times in the worst case
        System.out.println(t);
    }
}

Time complexity: T(n) = test time + worst-case (if, else) = O(n) + O(n) = O(n).

15 Constant Time The Big O notation estimates the execution time of an algorithm in relation to the input size. If the time is not related to the input size, the algorithm is said to take constant time, written O(1). For example, a method that retrieves an element at a given index in an array takes constant time, because the retrieval time does not grow as the size of the array increases.
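For instance, a sketch of such a constant-time retrieval (getElement is a hypothetical helper, not from the slides):

public static int getElement(int[] array, int index) {
    return array[index];           // a single operation regardless of array.length: O(1)
}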