CSS 342 Data Structures, Algorithms, and Discrete Mathematics I


CSS 342 Data Structures, Algorithms, and Discrete Mathematics I. Lecture 11, 2/18/2015. Carrano, Chapter 10.

Agenda: HW 4, algorithm efficiency, Big O notation.

Quality Metrics
- Performance: space, time, scalability
- Maintainability: readability, extensibility, debug-ability
- Correctness: cost of a bug for a boxed product vs. a service
- Testability
- Usability

Performance: Where in the lifecycle?
Performance should be considered in three places in the lifecycle:
- Design: algorithm analysis
- Code / data factoring
- Pre-release: under load (service)

Execution Time: Counting Operations
Traversal of linked nodes, for example: displaying the data in a linked chain of n nodes requires time proportional to n.
(Data Structures and Problem Solving with C++: Walls and Mirrors, Carrano and Henry, © 2013)
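As an illustration, a minimal C++ sketch of such a traversal (my own, not the textbook's code; the Node type is hypothetical):

    #include <iostream>

    // Hypothetical node type for a singly linked chain.
    struct Node {
        int item;
        Node* next;
    };

    // Displaying every item visits each of the n nodes exactly once,
    // so the work is proportional to n.
    void displayChain(const Node* head) {
        for (const Node* cur = head; cur != nullptr; cur = cur->next) {
            std::cout << cur->item << '\n';   // one output operation per node
        }
    }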

Counting operations

    for (i = 1; i <= n; i++)
        for (j = 1; j <= i; j++)
            for (k = 1; k <= 5; k++)
                Task;

If one execution of Task costs time t:
Loop on k: 5t
Loop on j: Σ_{j=1..i} 5t = 5ti
Loop on i: Σ_{i=1..n} 5ti = 5tn(n + 1)/2 = (5tn^2)/2 + (5tn)/2
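A quick way to sanity-check the closed form is to count the iterations directly; the following standalone sketch (not from the slides, with an assumed size n = 100) compares the count with 5n(n + 1)/2:

    #include <iostream>

    int main() {
        const int n = 100;                 // assumed problem size
        long long count = 0;
        for (int i = 1; i <= n; ++i)
            for (int j = 1; j <= i; ++j)
                for (int k = 1; k <= 5; ++k)
                    ++count;               // stands in for one execution of Task
        std::cout << "counted: " << count << '\n';              // 25250 for n = 100
        std::cout << "formula: " << 5LL * n * (n + 1) / 2 << '\n';
        return 0;
    }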

Analysis and Big O Notation
Algorithm A is order f(n), denoted O(f(n)), if constants k and n0 exist such that A requires no more than k * f(n) time units to solve a problem of size n for all n ≥ n0.
Big O is an upper bound, not a tight upper bound: algorithm A grows no faster than f(n).
The tight bound is Big Theta, Θ; however, Big O is often abused in CS to mean the tight upper bound.

An aside… with more detail
Computer scientists are lazy mathematicians… In the pure mathematical sense:
O(f(n)) is an upper bound, not a tight upper bound: algorithm A grows no faster than f(n).
Ω(f(n)) is a lower bound: algorithm A grows at least as fast as f(n).
Θ(f(n)) is the tight bound: algorithm A is Θ(f(n)) when it is both O(f(n)) and Ω(f(n)).
O(f(n)) is often abused in CS to mean the tight bound as well.

Proving an algorithm / code snippet is O(f(n))
1. Determine a formula, say g(n), for the number of important operations in the algorithm or snippet as a function of n.
2. Show there is a constant k such that k * f(n) > g(n) for all n greater than some n0.
3. Rejoice!

Complexity calculations
g(n) = (5tn^2)/2 + (5tn)/2
Find k and n0 such that k * n^2 > g(n) for all n > n0.
Take k = 3t and n0 = 5:
3tn^2 > 2.5tn^2 + 2.5tn, i.e. 3 > 2.5 + 2.5/n, which holds for all n > 5.
So g(n) is O(n^2).

What is the tightest Big-O complexity of this snippet?

    j = n;
    while (j >= 1) {
        Call.Task();
        j = j / 3;
    }

g(n) = number of calls to Call.Task()
g(n) = log3(n) + 1 (more precisely, floor(log3 n) + 1)
The snippet is O(log3 n).
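A small verification sketch (my own, with assumed sample sizes) that counts the iterations and compares them with floor(log3 n) + 1:

    #include <cmath>
    #include <iostream>

    int main() {
        for (int n : {1, 3, 10, 81, 1000}) {   // assumed sample sizes
            int calls = 0;
            for (int j = n; j >= 1; j /= 3)
                ++calls;                       // stands in for Call.Task()
            // a tiny epsilon guards against floating-point error in log3(n)
            int predicted = static_cast<int>(std::log(n) / std::log(3) + 1e-9) + 1;
            std::cout << "n=" << n << "  calls=" << calls
                      << "  floor(log3 n)+1=" << predicted << '\n';
        }
        return 0;
    }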

What is the tightest Big-O complexity of this algorithm?

    j = n;
    while (j >= 1) {
        for (i = 1; i <= j; i++)
            Call.Task();
        j = j / 2;
    }

g(n) = number of calls to Call.Task()

Computer Scientist of the Week: Dennis Ritchie
Co-creator of Unix with Ken Thompson (kernel, shell, utilities); goals: simple, communal.
Inventor of C. Turing Award in 1983.
http://www.bing.com/videos/search?q=dennis+ritchie+&FORM=HDRSC3#view=detail&mid=DB6306C75D1B677FD6AADB6306C75D1B677FD6AA

On the 1st pass of the while loop, j = n, so the for loop executes Call.Task() n times.
On the 2nd pass, j = n/2, so n/2 times.
On the 3rd pass, j = n/4, so n/4 times.
On the last pass, j = n/2^z = 1, where z = log2 n.
g(n) = n + n/2 + n/4 + n/8 + … + n/2^z
Geometric sum: a + ar + ar^2 + … + ar^m = a(r^(m+1) - 1)/(r - 1), where r ≠ 1.
With a = n, r = 1/2, and m = z:
g(n) = n + n(1/2) + n(1/2)^2 + … + n(1/2)^z
     = n((1/2)^(z+1) - 1)/((1/2) - 1)
     = 2n(1 - (1/2)^(z+1))
     = 2n - n/2^z
     = 2n - 1        (since 2^z = n)
Let k = 2 and n0 = 1: 2n ≥ 2n - 1 for all n ≥ 1, so g(n) is O(n).
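The result can also be checked empirically; this sketch (assumed, not from the slides) counts the calls for a few powers of two, where 2^z = n exactly:

    #include <iostream>

    int main() {
        for (int n : {1, 2, 8, 64, 1024}) {    // powers of two, so 2^z = n exactly
            long long calls = 0;
            for (int j = n; j >= 1; j /= 2)
                for (int i = 1; i <= j; ++i)
                    ++calls;                   // stands in for Call.Task()
            std::cout << "n=" << n << "  calls=" << calls
                      << "  2n-1=" << 2LL * n - 1 << '\n';
        }
        return 0;
    }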

Intuitive interpretation of growth-rate functions
- O(1), constant: independent of problem size.
- O(log2 n), logarithmic: increases slowly as size increases. Ex. binary search.
- O(n), linear: increases directly with size. Ex. searching an unsorted list or a malformed tree.
- O(n log2 n): increases more rapidly than a linear algorithm. Ex. merge sort.
- O(n^2), quadratic: Ex. two nested loops; bubble sort.
- O(n^3), cubic: Ex. three nested loops.
- O(2^n), exponential: increases too rapidly to be practical.

Order of growth of common functions

Analysis and Big O Notation FIGURE 10-3 A comparison of growth-rate functions: (a) in tabular form Data Structures and Problem Solving with C++: Walls and Mirrors, Carrano and Henry, © 2013

Analysis and Big O Notation Data Structures and Problem Solving with C++: Walls and Mirrors, Carrano and Henry, © 2013

Big O Simplifications
O(f(n)) + O(g(n)) = O(f(n) + g(n))
Focus on the dominant factor:
- Low-order terms can be ignored: O(n^3 + 4n) = O(n^3)
- The multiplicative constant in the high-order term can be ignored: O(3n^3 + 4) = O(n^3)
If f(n) = O(g(n)) and g(n) = O(h(n)), then f(n) = O(h(n)).

Worst-, Best-, and Average-Case Analysis
- Worst-case analysis: algorithm A requires no more than k * f(n) time units on any input of size n. The worst case might occur rarely, if at all, in practice.
- Best-case analysis: like the worst case, the best case might occur rarely.
- Average-case analysis: the time A requires averaged over all inputs of size n. Difficulties: determining the probability distribution of the problem size n and of the input data.

Efficiency of Sequential Search
(Figure: scanning an unsorted array of n elements from left to right until the target is hit.)
- Best case: hit on the first element, O(1).
- Worst case: hit on the last element, O(n).
- Average case: O(n/2) = O(n).
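For reference, a minimal sequential-search sketch (assumed; not the course's exact code):

    #include <vector>

    // Returns the index of x in a, or -1 if x is absent.
    // Best case: x is a[0], 1 comparison, O(1).
    // Worst case: x is last or missing, n comparisons, O(n).
    int sequentialSearch(const std::vector<int>& a, int x) {
        for (int i = 0; i < static_cast<int>(a.size()); ++i) {
            if (a[i] == x)
                return i;
        }
        return -1;   // not found
    }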

Complexity of Binary Search
(Figure: searching for 10 in the sorted array a = {1, 5, 10, 13, 14, 21, 29, 37, 43, 46, 52, 69, 75, 88, 91, 99}; low = 0, high = 15, mid = (0 + 15)/2 = 7; a[7] = 37 > 10, so new high = 7 - 1 = 6.)

    int binarySearch( const vector<int> &a, const int x ) {
        int low = 0;
        int high = a.size( ) - 1;
        int mid;
        while ( low <= high ) {
            mid = ( low + high ) / 2;
            if ( a[mid] < x )
                low = mid + 1;
            else if ( a[mid] > x )
                high = mid - 1;
            else
                return mid;
        }
        return NOT_FOUND;   // NOT_FOUND = -1
    }

(Figure: the same search hits the target 10 on the 4th step.)
Let K be the number of steps (halvings of the search range):
N = 2 → K = 1; N = 4 → K = 2; N = 8 → K = 3; N = 16 → K = 4; in general, N = 2^K → K steps.
When N is not an exact power of two, 2^(K-1) < N < 2^K, so
K - 1 < log2 N < K, i.e. K < 1 + log2 N < K + 1.
Therefore K = O(log2 N).

Big-O complexity: Lab 3
The + operator on two lists (the merge operator): what was your efficiency?
- Did you call insert on each element, one list after the other?
- Or write a true merge operator? (A sketch of one possibility follows.)
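One possibility is a single simultaneous pass over both lists; the sketch below is hypothetical and assumes the Lab 3 lists are kept sorted and behave like std::list<int> (the actual lab interface may differ):

    #include <list>

    // Hypothetical linear-time merge of two sorted lists: O(n + m).
    // Inserting each element one at a time into its sorted position
    // would instead be roughly quadratic in the worst case.
    std::list<int> mergeSorted(const std::list<int>& a, const std::list<int>& b) {
        std::list<int> result;
        auto ia = a.begin(), ib = b.begin();
        while (ia != a.end() && ib != b.end()) {
            if (*ia <= *ib)
                result.push_back(*ia++);
            else
                result.push_back(*ib++);
        }
        result.insert(result.end(), ia, a.end());   // append whatever remains
        result.insert(result.end(), ib, b.end());
        return result;
    }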

Minimum Element in an Array
Given an array of N items, find the smallest item. What order (in Big O) will this problem be bound to?
(Example array of N items: 3, 9, 1, 8, 5, 2, 7, 4, 6, -1, 10, -2)
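For reference, the obvious single pass, which shows the problem can be solved in O(N) (a minimal sketch, not graded code):

    #include <cassert>
    #include <vector>

    // One pass over the array: N - 1 comparisons, so O(N).
    int minimumElement(const std::vector<int>& a) {
        assert(!a.empty());
        int smallest = a[0];
        for (std::size_t i = 1; i < a.size(); ++i) {
            if (a[i] < smallest)
                smallest = a[i];
        }
        return smallest;
    }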

Closest Points in the Plane
Given N points in a plane (that is, an x-y coordinate system), find the pair of points that are closest together. What order (in Big O) will this problem be bound to?
Distance between two points: sqrt( (x2 - x1)^2 + (y2 - y1)^2 )
(Example points: (2,1), (1,4), (1,6), (4,5), (4,7), (5,4), (5,6), (6,3), (6,7), (7,1), (8,4))
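A brute-force sketch (my own, assuming nothing beyond the slide) compares every pair and therefore gives an O(N^2) bound; faster divide-and-conquer algorithms exist but are beyond this slide:

    #include <cmath>
    #include <utility>
    #include <vector>

    struct Point { double x, y; };

    // Examines all N*(N-1)/2 pairs: O(N^2) distance computations.
    // Returns the indices of the closest pair (assumes at least two points).
    std::pair<int, int> closestPair(const std::vector<Point>& pts) {
        std::pair<int, int> best{0, 1};
        double bestDist = std::hypot(pts[1].x - pts[0].x, pts[1].y - pts[0].y);
        for (int i = 0; i < static_cast<int>(pts.size()); ++i) {
            for (int j = i + 1; j < static_cast<int>(pts.size()); ++j) {
                double d = std::hypot(pts[j].x - pts[i].x, pts[j].y - pts[i].y);
                if (d < bestDist) {
                    bestDist = d;
                    best = {i, j};
                }
            }
        }
        return best;
    }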

Intractable, Unsolvable, and NP Problems
- Tractable problems: their worst-case time is proportional to a polynomial (class P). Even so, a high-degree polynomial can take a long time.
- Intractable problems: their worst-case time cannot be bounded by a polynomial.
- Unsolvable problems: they have no algorithm at all. Ex: Turing's halting problem.
- NP (nondeterministic polynomial) problems: no known solution with polynomial worst-case performance, but a proposed solution can be checked in polynomial time.
- NP-complete problems: a class of problems such that if any one of them can be solved in polynomial worst-case time, all of them can. Ex: the traveling salesperson problem.