CSE Lecture 3 – Algorithms I

Algorithm Analysis
How can we demonstrate that one algorithm is superior to another without being misled by any of the following problems?
- Special cases: every algorithm has certain inputs that allow it to perform far better than would be expected.
- Small data sets: an algorithm may not display its inefficiencies if the data set on which it is run is too small.
- Hardware differences
- Software differences
11/11/2019 Algorithm Analysis

Performance Factors
There are many factors that influence the performance of an algorithm:
- Number of copies/comparisons made
- Number of statements executed
- Varying implementations on different machines
- Variations in the input set
Most often we will estimate the performance of an algorithm by analyzing the algorithm itself.

Measuring Efficiency
The efficiency of an algorithm is a measure of the amount of resources consumed in solving a problem of size n.
- The resource we are most interested in is time.
- The same techniques can be used to analyze the consumption of other resources, such as memory space.
It would seem that the most obvious way to measure the efficiency of an algorithm is to run it and measure how much processor time is needed.

Asymptotic Analysis
The time it takes to solve a problem is usually an increasing function of the size of the problem: the bigger the problem, the longer it will take to solve.
We need a formula that associates n, the problem size, with t, the time required to obtain a solution.
We can get this type of information using asymptotic analysis to develop expressions of the form:
t ≈ O(f(n))

Formal Definition
t ≈ O(f(n)) if there are constants c and n0 such that t ≤ c·f(n) when n ≥ n0.
Note that the constant is not part of the function used to describe the performance of the algorithm (i.e., there is no O(2n), only O(n)).
Big "O" gives the worst-case performance; the average performance of the algorithm might be better.
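As a concrete illustration of the definition, here is a minimal sketch; the running time t(n) = 3n + 5 is a made-up example, not from the slides. With the witnesses c = 4 and n0 = 5, t(n) ≤ c·n holds for every n ≥ n0, so t is O(n).

```java
// Sketch of the Big-O definition with a hypothetical running time
// t(n) = 3n + 5. With c = 4 and n0 = 5, t(n) <= c * n holds for
// all n >= n0, so t(n) is O(n).
public class BigODefinition {
    static long t(long n) { return 3 * n + 5; }

    // Check t(n) <= c * n for every n in [n0, limit]
    static boolean bounded(long c, long n0, long limit) {
        for (long n = n0; n <= limit; n++) {
            if (t(n) > c * n) {
                return false;
            }
        }
        return true;
    }

    public static void main(String[] args) {
        System.out.println(bounded(4, 5, 1_000_000)); // true: the bound holds
        System.out.println(bounded(3, 5, 1_000_000)); // false: c = 3 is too small
    }
}
```

Note that c = 3 fails because 3n + 5 > 3n for every n; the definition only requires that *some* pair of constants works.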

Big Oh
If the order of the algorithm were, for example, O(n³), then the relationship between t and n would be given by a formula of the form t ≤ M·n³.
Although we often do not know anything about the value of the constant M, we can still say that:
- If the size of the problem doubles, the total time needed to solve the problem will increase about eightfold.
- If the problem size triples, the overall running time will be about 27 times greater.
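The doubling claim can be checked empirically. This sketch counts the iterations of a triple nested loop, a stand-in for any O(n³) algorithm, and shows the eightfold increase directly:

```java
// Counting the basic operations of a triple nested loop, which runs
// exactly n * n * n times. Doubling n multiplies the count by 8,
// tripling it multiplies the count by 27 - exactly the Big Oh claim.
public class CubicGrowth {
    static long operations(int n) {
        long count = 0;
        for (int i = 0; i < n; i++) {
            for (int j = 0; j < n; j++) {
                for (int k = 0; k < n; k++) {
                    count++; // the innermost ("critical") operation
                }
            }
        }
        return count;
    }

    public static void main(String[] args) {
        System.out.println(operations(200) / operations(100)); // prints 8
        System.out.println(operations(300) / operations(100)); // prints 27
    }
}
```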

Performance Classification

f(n)      Classification
1         Constant: run time is fixed and does not depend on n. Most instructions are executed once, or only a few times, regardless of the amount of information being processed.
log n     Logarithmic: when n increases, so does run time, but much more slowly. When n doubles, log n increases by a constant, but does not double until n increases to n². Common in programs which solve large problems by transforming them into smaller problems.
n         Linear: run time varies directly with n. Typically, a small amount of processing is done on each element.
n log n   When n doubles, run time slightly more than doubles. Common in programs which break a problem down into smaller sub-problems, solve them independently, then combine the solutions.
n²        Quadratic: when n doubles, run time increases fourfold. Practical only for small problems; typically the program processes all pairs of the input (e.g., in a doubly nested loop).
n³        Cubic: when n doubles, run time increases eightfold.
2ⁿ        Exponential: when n doubles, run time squares. This is often the result of a natural, "brute force" solution.
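The "when n doubles" behavior in the table can be reproduced numerically. This sketch (names and the sample size n = 1024 are illustrative choices, not from the slides) prints the growth ratio f(2n)/f(n) for each class:

```java
import java.util.function.DoubleUnaryOperator;

// Growth ratio f(2n) / f(n) for each performance class at n = 1024:
// how much the run time grows when the problem size doubles.
public class GrowthRatios {
    static double log2(double x) { return Math.log(x) / Math.log(2); }

    static double ratio(DoubleUnaryOperator f, double n) {
        return f.applyAsDouble(2 * n) / f.applyAsDouble(n);
    }

    public static void main(String[] args) {
        double n = 1024;
        System.out.printf("log n  : %.2f%n", ratio(GrowthRatios::log2, n));   // 1.10
        System.out.printf("n      : %.2f%n", ratio(x -> x, n));               // 2.00
        System.out.printf("n log n: %.2f%n", ratio(x -> x * log2(x), n));     // 2.20
        System.out.printf("n^2    : %.2f%n", ratio(x -> x * x, n));           // 4.00
        System.out.printf("n^3    : %.2f%n", ratio(x -> x * x * x, n));       // 8.00
    }
}
```

At n = 1024 the logarithm grows only from 10 to 11 when n doubles, while the quadratic and cubic classes show the fourfold and eightfold increases from the table.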

How To Use It
For reasonably large problems we always want to select an algorithm of the lowest order possible.
If algorithm A is O(f(n)) and algorithm B is O(g(n)), then algorithm A is said to be of a lower order than B if f(n) < g(n) for all n greater than some constant k.
So given a choice between an algorithm that is O(n²) and an algorithm that is O(n³), we would go for the O(n²) algorithm.

Algorithm Efficiencies
[Table of algorithm efficiencies not captured in the transcript.]

The Critical Section
When analyzing an algorithm, we do not care about the behavior of each statement.
We focus our analysis on the part of the algorithm where the greatest amount of its time is spent.
A critical section has the following characteristics:
- It is an operation central to the functioning of the algorithm, and its behavior typifies the overall behavior of the algorithm.
- It is contained inside the most deeply nested loops of the algorithm and is executed as often as any other section of the algorithm.

The Critical Section
The critical section can be said to be at the "heart" of the algorithm.
We can characterize the overall efficiency of an algorithm by counting how many times this critical section is executed as a function of the problem size.
The critical section dominates the completion time: if an algorithm is divided into two parts, the first taking O(f(n)) followed by a second that takes O(g(n)), then the overall complexity of the algorithm is O(max[f(n), g(n)]).
The slowest and most time-consuming section determines the overall efficiency of the algorithm.
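The O(max[f(n), g(n)]) rule can be made concrete with a sketch (the two phases here are made-up stand-ins): an algorithm that first does a linear pass and then a doubly nested pass is O(max[n, n²]) = O(n²), because the quadratic phase contributes nearly all of the work.

```java
// Two-phase algorithm: an O(n) phase followed by an O(n^2) phase.
// The overall complexity is O(max[n, n^2]) = O(n^2); counting the
// operations shows the quadratic phase dominating the total.
public class DominantPhase {
    // Phase 1: a linear scan - performs n operations
    static long scanOps(int n) { return n; }

    // Phase 2: a doubly nested loop - performs n * n operations
    static long quadraticOps(int n) {
        long count = 0;
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++)
                count++;
        return count;
    }

    public static void main(String[] args) {
        int n = 1000;
        long total = scanOps(n) + quadraticOps(n);
        // Percentage of the work done by the quadratic phase at n = 1000
        System.out.println(quadraticOps(n) * 100 / total); // prints 99
    }
}
```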

Analogy
Imagine that you have the job of delivering a gift to someone living in another city thousands of miles away.
The job involves the following three steps:
- wrapping the gift
- driving 2,000 miles to the destination city
- delivering the package to the proper person
The critical section is obviously step 2. Changing step 2 to flying instead of driving would have a profound impact on the completion time of the task.

Sequential Search
public static int search( int array[], int target ) {
    // Assume the target is not in the array
    int position = -1;

    // Step through the array until the target is found
    // or the entire array has been searched.
    for ( int i = 0; i < array.length && position == -1; i = i + 1 ) {
        // Is the current element what we are looking for?
        if ( array[ i ] == target ) {
            position = i;
        }
    }

    // Return the element's position
    return position;
}
O(n) where n is the length of the array → Linear
11/11/2019 Algorithm Analysis

Binary Search
public static int search( int array[], int target ) {
    int start = 0;                  // The start of the search region
    int end = array.length - 1;     // The end of the search region
    int position = -1;              // Position of the target

    // While there's something to search and we have not found it
    while ( start <= end && position == -1 ) {
        int middle = ( start + end ) / 2;

        // Determine where the target should be
        if ( target < array[ middle ] ) {
            end = middle - 1;       // Smaller must be in left half
        } else if ( target > array[ middle ] ) {
            start = middle + 1;     // Larger must be in right half
        } else {
            position = middle;      // Found it!!
        }
    }

    return position;
}
O(log₂ n) where n is the length of the array → Logarithmic
11/11/2019 Algorithm Analysis
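One way to see the O(n) versus O(log₂ n) difference is to count loop iterations for a worst-case (absent) target in both searches. This harness is a sketch, not part of the original slides; it mirrors the two methods above but counts iterations instead of returning a position:

```java
// Counting loop iterations of sequential vs. binary search for a
// target that is not in a sorted array of 1024 elements.
public class SearchSteps {
    // Sequential search: an absent target forces n iterations
    static int linearSteps(int[] array, int target) {
        int steps = 0;
        for (int i = 0; i < array.length; i++) {
            steps++;
            if (array[i] == target) break;
        }
        return steps;
    }

    // Binary search: an absent target forces about log2(n) iterations
    static int binarySteps(int[] array, int target) {
        int steps = 0, start = 0, end = array.length - 1;
        while (start <= end) {
            steps++;
            int middle = (start + end) / 2;
            if (target < array[middle])      end = middle - 1;
            else if (target > array[middle]) start = middle + 1;
            else break;
        }
        return steps;
    }

    public static void main(String[] args) {
        int[] sorted = new int[1024];
        for (int i = 0; i < sorted.length; i++) sorted[i] = i;

        System.out.println(linearSteps(sorted, 9999)); // prints 1024
        System.out.println(binarySteps(sorted, 9999)); // prints 11
    }
}
```

For 1024 elements the sequential search examines all 1024 positions, while the binary search gives up after only 11 halvings.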

Selection Sort
public static void selectionSort( int array[] ) {
    // Find the smallest element and move it to the top
    // array.length - 1 times. The result is a sorted array
    for ( int i = 0; i < array.length - 1; i = i + 1 ) {
        for ( int j = i + 1; j < array.length; j = j + 1 ) {
            if ( array[ j ] < array[ i ] ) {
                // Elements out of order -- swap
                int tmp = array[ i ];
                array[ i ] = array[ j ];
                array[ j ] = tmp;
            }
        }
    }
}
O(n²) where n is the length of the array → Quadratic
11/11/2019 Algorithm Analysis

Bubble Sort
public static void bubbleSort( int array[] ) {
    int i = 0;       // How many elements are sorted - initially none
    boolean swap;    // Was a swap made during this pass?

    // Keep making passes through the array until it is sorted
    do {
        swap = false;

        // Swap adjacent elements that are out of order
        for ( int j = 0; j < array.length - i - 1; j++ ) {
            // If the two elements are out of order - swap them
            if ( array[ j ] > array[ j + 1 ] ) {
                int tmp = array[ j ];
                array[ j ] = array[ j + 1 ];
                array[ j + 1 ] = tmp;
                swap = true;    // Made a swap - array might not be sorted
            }
        }
        i = i + 1;   // One more element in the correct place
    } while ( swap );
}
O(n²) where n is the length of the array → Quadratic
11/11/2019 Algorithm Analysis

Insertion Sort
public static void insertionSort( int array[] ) {
    // i marks the beginning of the unsorted portion of the array
    for ( int i = 1; i < array.length; i = i + 1 ) {
        // Insert the first element in the unsorted portion of
        // the array into the sorted portion. Works from the
        // bottom of the sorted portion to the top
        for ( int j = i - 1; j >= 0 && array[ j + 1 ] < array[ j ]; j = j - 1 ) {
            // The new element has not reached its correct position
            // yet - swap it with its neighbor in the sorted portion
            int tmp = array[ j ];
            array[ j ] = array[ j + 1 ];
            array[ j + 1 ] = tmp;
        }
    }
}
O(n²) where n is the length of the array → Quadratic
11/11/2019 Algorithm Analysis
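A quick way to sanity-check any of the three sorts above is to run it on a small sample and compare the result against java.util.Arrays.sort. A minimal harness (the insertionSort body is copied here so the sketch is self-contained; selectionSort and bubbleSort can be dropped in the same way):

```java
import java.util.Arrays;

// Sanity-check harness for the sorting methods on the slides.
public class SortCheck {
    public static void insertionSort(int array[]) {
        for (int i = 1; i < array.length; i = i + 1) {
            // Swap the new element downward until it is in place
            for (int j = i - 1; j >= 0 && array[j + 1] < array[j]; j = j - 1) {
                int tmp = array[j];
                array[j] = array[j + 1];
                array[j + 1] = tmp;
            }
        }
    }

    public static void main(String[] args) {
        int[] sample   = { 5, 3, 8, 1, 9, 2, 7 };
        int[] expected = sample.clone();
        Arrays.sort(expected);                 // trusted reference result

        insertionSort(sample);
        System.out.println(Arrays.equals(sample, expected)); // prints true
    }
}
```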