
LECTURE 9 CS203

Execution Time Suppose two algorithms perform the same task, such as searching (linear search vs. binary search) or sorting (selection sort vs. insertion sort). Which one is better? One approach is to implement the algorithms in Java, run the programs, and measure their execution times. But there are two problems with this approach: First, many tasks run concurrently on a computer, so the execution time of a particular program depends on the system load. Second, the execution time usually depends on the specific input. Consider linear search and binary search: if the key happens to be the first element in the list, linear search will find the element faster than binary search, which is unrepresentative of the general performance of the two algorithms.
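
As a rough illustration of the measurement approach and its pitfalls, here is a minimal Java timing sketch (not from the original slides; it uses the standard library's Arrays.binarySearch and a hand-written linear scan):

import java.util.Arrays;
import java.util.Random;

public class TimingDemo {
    public static void main(String[] args) {
        int[] data = new Random().ints(1_000_000, 0, 10_000_000).toArray();
        Arrays.sort(data);                    // binary search requires sorted input
        int key = data[data.length - 1];      // a key near the end: close to linear search's worst case

        // Time a simple linear scan.
        long start = System.nanoTime();
        int found = -1;
        for (int i = 0; i < data.length; i++) {
            if (data[i] == key) { found = i; break; }
        }
        long linearTime = System.nanoTime() - start;

        // Time the library binary search on the same key.
        start = System.nanoTime();
        int index = Arrays.binarySearch(data, key);
        long binaryTime = System.nanoTime() - start;

        // The measured times vary from run to run with system load and with
        // the choice of key -- exactly the two problems described above.
        System.out.println("linear: " + linearTime + " ns (index " + found + ")");
        System.out.println("binary: " + binaryTime + " ns (index " + index + ")");
    }
}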

Growth Rate In addition to the problems mentioned on the last slide, algorithm performance matters more and more as input size increases. We may not care that linear search is less efficient than binary search if we only have a few values to search. The standard approach to measuring algorithm efficiency approximates the effect of increasing the size of the input. In this way, you can see how quickly an algorithm’s execution time increases as the input size grows.

Math Ahead! The rest of this lecture uses a few math principles that you learned in high school but may have forgotten. Do not worry (too much) if your math background is shaky; we introduce the mathematical material at a gentle pace. When I started working on my MS here, I had not taken a math class in 25 years, and I managed to learn this material. You can too. On the other hand, if you want to study this material in more detail, you will not be disappointed. You just have to wait until you take CS312.

Summations Summation is the operation of adding a sequence of numbers; the result is their sum or total. Summation is designated with the Greek letter sigma (∑). The expression ∑_{i=1}^{100} i reads: start from i = 1, iterate through values of i, stop at 100, and find the sum. This summation means "the sum of all integers between 1 and 100 inclusive."

Useful Mathematical Summations The key identity used on the next slide is the closed form 1 + 2 + 3 + ... + n = n(n+1)/2.

The first summation on the previous slide is usually expressed in closed form as n(n+1)/2. For n = 100, the value of the summation is 5050, since 100(101)/2 = 10100/2 = 5050.
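
A quick sanity check of the closed form, as a minimal sketch (not from the original slides):

public class SumCheck {
    public static void main(String[] args) {
        int n = 100;

        // Sum 1 + 2 + ... + n by brute force.
        int loopSum = 0;
        for (int i = 1; i <= n; i++) {
            loopSum += i;
        }

        // Closed form n(n+1)/2.
        int closedForm = n * (n + 1) / 2;

        System.out.println(loopSum + " == " + closedForm);  // prints 5050 == 5050
    }
}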

Logarithms The logarithm of a number with respect to a base is the exponent to which the base must be raised to yield the number. Here is the notation: log_b(y) = x, where b is the base. The parentheses are usually left out in practice. Examples: log_2 8 = 3 and log_10 10,000 = 4. The word "logarithm" was coined (by an early-modern mathematician) from Greek roots and means roughly "number reasoning." Interestingly (to me, anyway), it is completely unrelated to its anagram "algorithm," which is derived from an Arabic version (prefixed with al-) of the name of the medieval Persian mathematician al-Khwarizmi.
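
Java's Math class provides only natural and base-10 logarithms, so log base 2 can be computed with the change-of-base rule; a minimal sketch (not from the original slides):

public class LogDemo {
    public static void main(String[] args) {
        // log_2(8) = ln(8) / ln(2) = 3
        double log2of8 = Math.log(8) / Math.log(2);

        // log_10(10,000) = 4, using the built-in base-10 logarithm
        double log10of10000 = Math.log10(10_000);

        System.out.println(log2of8);       // prints 3.0 (possibly with a tiny floating-point error)
        System.out.println(log10of10000);  // prints 4.0
    }
}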

Logarithms In other fields, log without a stated base is understood to refer to log_10 or log_e (the natural logarithm). In CS, log without further qualification is understood to refer to log_2, pronounced "log base 2" or "binary logarithm." The base is not important when comparing algorithms but, as you will see, the base is almost always 2 when we are calculating the complexity of programming algorithms.

Big O Notation Linear search compares the key with the elements in the array sequentially until the key is found or the array is exhausted. If the key is not in the array, it requires n comparisons for an array of size n. If the key is in the array, it requires, in the average case, n/2 comparisons. This algorithm’s execution time is proportional to the size of the array. If you double the size of the array, you can expect the number of comparisons to double in the average case as well as in the worst case. The algorithm grows at a linear rate; the growth rate has an order of magnitude of n. Computer scientists use the Big O notation as an abbreviation for “order of magnitude.” Using this notation, the complexity of the linear search algorithm is O(n), pronounced “order of n.”
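
To make the comparison counts concrete, here is a minimal sketch (not from the original slides) of linear search instrumented to count comparisons; in the worst case it performs n comparisons, and roughly n/2 when the key sits in the middle of the array:

public class CountingLinearSearch {
    static long comparisons = 0;            // number of key comparisons performed

    static int linearSearch(int[] a, int key) {
        for (int i = 0; i < a.length; i++) {
            comparisons++;                  // one comparison per element examined
            if (a[i] == key) {
                return i;
            }
        }
        return -1;                          // key not found: n comparisons were made
    }

    public static void main(String[] args) {
        int n = 1000;
        int[] a = new int[n];
        for (int i = 0; i < n; i++) {
            a[i] = i;
        }

        linearSearch(a, -1);                // worst case: key absent
        System.out.println("worst case: " + comparisons + " comparisons");   // prints 1000

        comparisons = 0;
        linearSearch(a, n / 2);             // key in the middle: about n/2 comparisons
        System.out.println("middle key: " + comparisons + " comparisons");   // prints 501
    }
}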

Best, Worst, and Average An input that results in the shortest execution time for a given input size is called the best-case input, and an input that results in the longest execution time is called the worst-case input. Best-case and worst-case inputs are not representative, but worst-case analysis is very useful: you can show that the algorithm will never be slower than the worst case. An average-case analysis attempts to determine the average time over all possible inputs of the same size. Average-case analysis is ideal but difficult to perform, because for many problems it is hard to determine the relative probabilities and distributions of the various input instances. Worst-case analysis is easier to obtain and is thus common, so analysis is generally conducted for the worst case.

Ignore Multiplicative Constants The linear search algorithm requires n comparisons in the worst case and n/2 comparisons in the average case. Using the Big O notation, both cases require O(n) time. The multiplicative constant (1/2) can be omitted: multiplicative constants have no impact on the order of magnitude of the growth rate. The growth rate of n/2 or 100n is the same as that of n, i.e., O(n) = O(n/2) = O(100n).

Ignore Non-Dominating Terms Consider the algorithm for finding the maximum number in an array of n elements. If n is 2, it takes one comparison to find the maximum; if n is 3, it takes two comparisons. In general, it takes n-1 comparisons to find the maximum number in a list of n elements. Algorithm analysis is concerned with large input sizes; if the input size is small, there is no need to estimate an algorithm’s efficiency. As n grows larger, the n part of the expression n-1 dominates the complexity. The Big O notation allows you to ignore the non-dominating part (e.g., the -1 in n-1) and highlight the important part (e.g., the n in n-1). So the complexity of this algorithm is O(n).
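
A minimal sketch (not from the original slides) of the find-maximum algorithm, which performs exactly n-1 comparisons:

public class FindMax {
    // Returns the largest value in a non-empty array using n-1 comparisons.
    static int max(int[] a) {
        int max = a[0];
        for (int i = 1; i < a.length; i++) {  // n-1 iterations
            if (a[i] > max) {                 // one comparison per iteration
                max = a[i];
            }
        }
        return max;
    }

    public static void main(String[] args) {
        int[] values = {3, 41, 5, 26, 9};
        System.out.println(max(values));      // prints 41
    }
}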

Repetition: Simple Loops

for (i = 1; i <= n; i++) {
  k = k + 5;
}

The loop body takes constant time and is executed n times. Time Complexity: T(n) = (a constant c) * n = cn = O(n). Ignore multiplicative constants (e.g., c).

Repetition: Nested Loops

for (i = 1; i <= n; i++) {
  for (j = 1; j <= n; j++) {
    k = k + i + j;
  }
}

The inner loop is executed n times for each of the n iterations of the outer loop. Time Complexity: T(n) = (a constant c) * n * n = cn^2 = O(n^2). Ignore multiplicative constants (e.g., c).

Repetition: Nested Loops

for (i = 1; i <= n; i++) {
  for (j = 1; j <= i; j++) {
    k = k + i + j;
  }
}

The inner loop is executed i times on the i-th iteration of the outer loop. Time Complexity: T(n) = c + 2c + 3c + 4c + ... + nc = cn(n+1)/2 = (c/2)n^2 + (c/2)n = O(n^2). Ignore multiplicative constants and non-dominating terms.

Repetition: Nested Loops

for (i = 1; i <= n; i++) {
  for (j = 1; j <= 20; j++) {
    k = k + i + j;
  }
}

The inner loop is executed 20 times for each of the n iterations of the outer loop. Time Complexity: T(n) = 20 * c * n = O(n). Ignore multiplicative constants (e.g., 20 * c).

Sequence

for (j = 1; j <= 10; j++) {
  k = k + 4;
}
for (i = 1; i <= n; i++) {
  for (j = 1; j <= 20; j++) {
    k = k + i + j;
  }
}

The first loop is executed 10 times; in the second, the inner loop is executed 20 times for each of the n iterations of the outer loop. Time Complexity: T(n) = c * 10 + 20 * c * n = O(n).

Selection

if (list.contains(e)) {
  System.out.println(e);
} else {
  for (Object t : list) {
    System.out.println(t);
  }
}

Let n be list.size(). The contains test takes O(n) time, and the loop in the else branch is executed n times. Time Complexity: T(n) = test time + worst case of (if, else) = O(n) + O(n) = O(n).

Constant Time The Big O notation estimates the execution time of an algorithm in relation to the input size. If the time is not related to the input size, the algorithm is said to take constant time, written O(1). For example, a method that retrieves an element at a given index in an array takes constant time, because the time does not grow as the size of the array increases.

Time Complexity for ArrayList and LinkedList (table of per-operation costs; see the sketch below)
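
The table on this slide is an image in the original. As an illustration of the kind of difference it summarizes, here is a minimal sketch (not from the original slides) comparing indexed access, which is O(1) for ArrayList and O(n) for LinkedList:

import java.util.ArrayList;
import java.util.LinkedList;
import java.util.List;

public class ListAccessDemo {
    // Sum the list by repeated calls to get(i).
    static long sumByIndex(List<Integer> list) {
        long sum = 0;
        for (int i = 0; i < list.size(); i++) {
            sum += list.get(i);   // O(1) for ArrayList, O(n) for LinkedList
        }
        return sum;
    }

    public static void main(String[] args) {
        int n = 20_000;
        List<Integer> arrayList = new ArrayList<>();
        List<Integer> linkedList = new LinkedList<>();
        for (int i = 0; i < n; i++) {
            arrayList.add(i);
            linkedList.add(i);
        }

        long start = System.nanoTime();
        sumByIndex(arrayList);
        System.out.println("ArrayList:  " + (System.nanoTime() - start) + " ns");

        start = System.nanoTime();
        sumByIndex(linkedList);   // n calls, each O(n): quadratic overall
        System.out.println("LinkedList: " + (System.nanoTime() - start) + " ns");
    }
}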

Recurrence Relations A recurrence relation is a rule by which a sequence is generated. For example, the sequence 5, 8, 11, 14, 17, 20, ... is described by the recurrence relation a_0 = 5, a_n = a_(n-1) + 3. Divide-and-conquer algorithms are often described in terms of recurrence relations.
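
A minimal sketch (not from the original slides) that generates the example sequence directly from its recurrence:

public class RecurrenceDemo {
    public static void main(String[] args) {
        int a = 5;                          // a_0 = 5
        for (int n = 0; n <= 5; n++) {
            System.out.print(a + " ");      // prints 5 8 11 14 17 20
            a = a + 3;                      // a_n = a_(n-1) + 3
        }
        System.out.println();
    }
}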

Analyzing Binary Search Binary search searches an array or list that is *sorted*. In each step, the algorithm compares the search key with the key of the middle element of the array. If the keys match, a matching element has been found and its index, or position, is returned. Otherwise, if the search key is less than the middle element's key, the algorithm repeats its action on the sub-array to the left of the middle element; if the search key is greater, it repeats on the sub-array to the right. If at any step the remaining sub-array to be searched is empty, the key cannot be found in the array and a special "not found" indication is returned.
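
A minimal iterative implementation of the algorithm just described, as a sketch (not from the original slides):

public class BinarySearchDemo {
    // Returns the index of key in the sorted array a, or -1 if it is absent.
    static int binarySearch(int[] a, int key) {
        int low = 0;
        int high = a.length - 1;
        while (low <= high) {                 // sub-array [low..high] is non-empty
            int mid = (low + high) / 2;       // middle element
            if (a[mid] == key) {
                return mid;                   // keys match: found
            } else if (key < a[mid]) {
                high = mid - 1;               // continue in the left half
            } else {
                low = mid + 1;                // continue in the right half
            }
        }
        return -1;                            // remaining sub-array is empty: not found
    }

    public static void main(String[] args) {
        int[] sorted = {2, 4, 7, 10, 11, 45, 50, 59, 60, 66, 69, 70, 79};
        System.out.println(binarySearch(sorted, 11));  // prints 4
        System.out.println(binarySearch(sorted, 12));  // prints -1
    }
}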

Logarithmic Time: Analyzing Binary Search Each iteration of binary search contains a fixed number of operations, denoted by c. Let T(n) denote the time complexity of a binary search on a list of n elements. Since we are studying the rate of growth of execution time, we can define T(1) to equal 1. Assume n is a power of 2; this makes the math simpler, and if it is not true, the difference is irrelevant to the order of complexity. Let k = log n; in other words, n = 2^k. Since binary search eliminates half of the input after each comparison, each step reduces a problem of size n to one of size n/2 at a cost of c per iteration, which gives the CS-style recurrence relation T(n) = T(n/2) + c.
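
The recurrence on the original slide is shown as an image; unrolling it under the stated assumptions (n = 2^k, T(1) = 1) gives the following derivation, reconstructed here rather than copied from the slide:

\begin{align*}
T(n) &= T\!\left(\tfrac{n}{2}\right) + c \\
     &= T\!\left(\tfrac{n}{2^2}\right) + c + c \\
     &\;\;\vdots \\
     &= T\!\left(\tfrac{n}{2^k}\right) + kc \\
     &= T(1) + c\log_2 n \\
     &= 1 + c\log_2 n = O(\log n)
\end{align*}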

Logarithmic Time Ignoring constants and smaller terms, the complexity of the binary search algorithm is O(log n). An algorithm with O(log n) time complexity is called a logarithmic algorithm. The base of the log is 2, but the base does not affect a logarithmic growth rate, so it can be omitted. The time to execute a logarithmic algorithm grows slowly as the problem size increases. If you square the input size, the time taken only doubles, since log(n^2) = 2 log n.