CS 213: Data Structures and Algorithms


CS 213: Data Structures and Algorithms
Abhiram G. Ranade
Analysis of the time taken by an algorithm

We would like to write programs that:
- are correct
- are fast
- use less memory
- are easy to understand
- are easy to modify if needed

Today: fast.

Some terms

(Computational) Problem: something like "Find GCD", sorting, or matrix multiplication. Formally, a map from a set of acceptable input values to the desired output values.
- An acceptable input value is a (problem) instance.
  - Instances of "Find GCD": all pairs of positive integers.
  - Instances of sorting: sequences of numbers.
- "Size" of the instance:
  - Formal definition: the number of bits needed to specify the instance.
  - Informal: a convenient parameter or set of parameters from which the size can be calculated.
  - Size(Sorting): number of keys.
  - Size(Matrix multiplication): number of rows and columns of the matrices.

When do you call a program fast?

Naive answer: compile and run – if it finishes quickly, it is fast! But this is unsatisfactory:
- It might run fast merely because the computer on which it runs is better.
- Most programs run fast if you give a small instance as input.
- Even among instances of the same size, some may take more time than others.

Our (very crude) estimate of the time taken by a program

A non-crude estimate of the time taken by a program = the number of operations performed by the program. The number of operations is measured as a function of n, the size of the instance.

Our estimate: we only say something like
- Program X takes time proportional to n.
- Program Y takes time proportional to n².

If for the same n different instances take different times, we consider the largest of those: "Time taken by Program Y (for any instance) is at most proportional to n²."

Notation

"Time taken is O(n)" means: the time taken by the worst instance of size n is at most kn for some constant k; equivalently, the time taken on every instance of size n is at most kn. Similarly O(n²), O(n log n), ...
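As a worked illustration of this definition (the operation count 3n + 5 is invented for the example): if a program performs at most 3n + 5 operations on any instance of size n, then

\[
3n + 5 \;\le\; 3n + 5n \;=\; 8n \quad \text{for all } n \ge 1,
\]

so the time is O(n), with k = 8.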

Comparing algorithms

If one algorithm takes time proportional to n and the other to n², then the former is considered better: for large enough n, the time for the former will actually be smaller. Example: 100n < n² for n > 100. Computers are used to solve large problem instances, so we are more interested in large n.

Estimating time taken: Example 1

Vector addition.

for(int i=0; i<n; i++)
  c[i] = a[i] + b[i];

In each iteration some fixed number of operations is performed: +, =, [], <, ++. There are n iterations. Total time: proportional to n, or O(n).
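A small self-contained way to check this claim empirically (the timing harness and array sizes are my additions, not part of the slides): run the loop above for doubling values of n and observe that the measured time roughly doubles as well.

#include <chrono>
#include <cstddef>
#include <iostream>
#include <vector>

int main() {
    // Hypothetical experiment (sizes chosen arbitrarily): time the
    // vector-addition loop from the slide for doubling values of n.
    for (std::size_t n = 1000000; n <= 8000000; n *= 2) {
        std::vector<int> a(n, 1), b(n, 2), c(n);
        auto start = std::chrono::steady_clock::now();
        for (std::size_t i = 0; i < n; i++)   // the loop from the slide
            c[i] = a[i] + b[i];
        auto stop = std::chrono::steady_clock::now();
        std::chrono::duration<double, std::milli> ms = stop - start;
        // Printing c[0] keeps the optimizer from deleting the loop.
        std::cout << "n = " << n << ": " << ms.count()
                  << " ms (c[0] = " << c[0] << ")\n";
    }
}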

Example 2: Determine if an array A[0..n-1] contains a given number t

bool found = false;
for(int i=0; i<n; i++){
  if(A[i] == t) { found = true; break; }
}

- t is present in the array at position i ⇒ i+1 iterations executed.
- t is not present in the array ⇒ n iterations executed.

(Worst-case) Time = time needed for n iterations + time for statement 1 = cn + c' ≤ c''n, i.e., at most proportional to n, or O(n).

Example 3: multiplying n×n square matrices

for(int i=0; i<n; i++){
  for(int j=0; j<n; j++){
    c[i][j] = 0;
    for(int k=0; k<n; k++){
      c[i][j] += a[i][k] * b[k][j];
    }
  }
}

Time = count how many times each statement is executed: n, n², n³, n³ times. Each statement takes some fixed time to execute once, so Time = an + bn² + cn³ ≤ (a+b+c)n³, i.e., proportional to n³, or O(n³).

Example 4: what if there is a function call?

for(int i=0; i<m; i++){
  cin >> t;
  cout << present(A,n,t);   // find if t is present in A[0..n-1]
}

Time = m × (c + time for present(A,n,t)). We need to know the time for present(A,n,t), say O(n). So the total time = O(mn). Expressing time in terms of two parameters is OK.
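The slides do not show the body of present; a minimal sketch consistent with the call present(A,n,t) and the assumed O(n) bound (linear search, as in Example 2) could be:

// Hypothetical implementation of present(A,n,t): linear search,
// O(n) worst-case time, matching the bound assumed above.
bool present(const int A[], int n, int t) {
    for (int i = 0; i < n; i++)
        if (A[i] == t) return true;
    return false;
}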

Example 5: recursion

int factorial(int n){
  if(n==0) return 1;
  else return n * factorial(n-1);
}

Standard idea: "Let T(n) denote the time to compute factorial(n)." Note that T(n) defines an infinite family of variables, one for each n. From the code:
- T(0) = k, a fixed number independent of n.
- T(n) = fixed number + time to execute factorial(n-1)
       = c + T(n-1) = c + c + T(n-2) = ... = cn + T(0) = cn + k ≤ (c+k)n, i.e., O(n).

Example 6: Recursive GCD

int gcd(int m, int n){
  if(m % n == 0) return n;
  else return gcd(n, m%n);
}

Let T(n) = time for gcd with second argument n, no matter what the first argument is. From the code:

T(n) = c' + T(n'), where n' = m mod n < n
     = 2c' + T(n''), where n'' = n mod n'

How fast does the second argument shrink over two calls?

n = kn' + n''   ... definition of mod
  ≥ n' + n''    ... because n > n' forces k ≥ 1
  > 2n''        ... because n' > n''

So every two recursive calls, the second argument at least halves. If we resubstituted 2 log₂ n times in T(n), the second argument would fall below n / 2^(log₂ n) = 1; but gcd is never called with second argument < 1. So the number of recursive calls is < 2 log₂ n, and T(n) ≤ 2c' log₂ n plus a constant, i.e., O(log n).
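A small self-contained check of this bound (the call counter and the test values are my additions): count the recursive calls and compare against 2 log₂ n.

#include <cmath>
#include <iostream>

int calls = 0;  // counts recursive calls (instrumentation, my addition)

// gcd from the slide, unchanged except for the counter.
int gcd(int m, int n) {
    calls++;
    if (m % n == 0) return n;
    return gcd(n, m % n);
}

int main() {
    // Consecutive Fibonacci numbers are a classic slow case for gcd.
    int m = 1134903170, n = 701408733;  // F(45) and F(44)
    std::cout << "gcd = " << gcd(m, n)
              << ", recursive calls = " << calls
              << ", bound 2*log2(n) = " << 2 * std::log2(n) << "\n";
}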

Example 7: Analysis of mergesort

Read chapter 16 of the book.
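As a sketch of what the chapter does (a preview, not a substitute for reading it): mergesort sorts n keys by sorting two halves recursively and then merging, and merging takes time proportional to n. In the style of Examples 5 and 6, this gives the recurrence

\[
T(n) \le 2\,T(n/2) + cn, \qquad T(1) = k.
\]

Expanding the recurrence \(\log_2 n\) times, each level contributes at most \(cn\), so

\[
T(n) \le cn \log_2 n + kn = O(n \log n).
\]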

Remarks

- We only ask what function of n the time is (at most) proportional to: O(f(n)) = at most proportional to f(n).
- In iterative programs, to estimate the time taken we just look at the statement that is executed the most.
- Recursive algorithms are analysed by writing down and solving recurrences. A recurrence is a relationship between T(n) and T(smaller values), e.g. T(n) ≤ c + T(n-1).
- We are only getting an upper bound on the time; the actual time can be a lot smaller, which our analysis may sometimes fail to reveal.

Insertion sort

Basic idea: as you read, keep the values read so far in an array, in sorted order.
- At the beginning of the (i+1)th iteration: A[0..i-1] holds i keys in non-decreasing order.
- In the (i+1)th iteration: we read one key, then rearrange keys to find a place to insert the new key so that the new contents are also sorted.

Exercise: write the code for insertion sort with invariants; analyse its running time.

Solution

The code, along with the required invariants and additional comments, is also placed on the webpage; a sketch of one possible version appears after this list. It may be unclear how much of this you are expected to write; at the very least, the invariants should be written.
- We could have broken the inner loop into steps: first find the position j where x should be inserted, then move A[j..i-1] to A[j+1..i], then store A[j] = x.
- The code uses A[i..j] to mean the elements A[i], A[i+1], ..., A[j]. If i > j, then A[i..j] means the empty set.
- If we write A[i..j] > x, we mean every element is larger than x. If i > j, then A[i..j] is empty, and A[i..j] > x is trivially true.

Running time: if the elements are given in order, the time will be O(n). If the elements are in reverse order, the time will be O(n²). No other input takes longer, so overall O(n²).
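The posted code is the reference; this is a minimal sketch of one possible version, with the invariants from above as comments (the function name and the shifting style are my choices, not necessarily those of the posted code):

// One possible insertion sort, annotated with the invariants discussed above.
void insertionSort(int A[], int n) {
    for (int i = 1; i < n; i++) {
        // Invariant: A[0..i-1] holds i keys in non-decreasing order.
        int x = A[i];      // the newly "read" key
        int j = i - 1;
        // Shift larger elements right until the place for x is found.
        while (j >= 0 && A[j] > x) {
            // Invariant: A[j+2..i] > x (trivially true when the range is empty).
            A[j + 1] = A[j];
            j--;
        }
        A[j + 1] = x;
        // Invariant restored: A[0..i] is in non-decreasing order.
    }
}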