Basic Concepts in Algorithmic Analysis


Chapter 1 Basic Concepts in Algorithmic Analysis

Questions Given a problem: How do we find an efficient algorithm for its solution? How can we compare algorithms that solve the same problem? How should we judge the goodness of an algorithm?

Definition of algorithm An algorithm is a procedure consisting of a finite set of instructions which, given an input from some set of possible inputs, enables us to obtain an output if such an output exists, or else obtain nothing at all if there is no output for that particular input, through a systematic execution of the instructions.

Requirements for an algorithm An algorithm should halt on every input, which implies that each instruction requires a finite amount of time and each input has a finite length. The output for a legal input should be unique.

Linear search Algorithm 1.1 LINEARSEARCH
Input: An array A[1..n] of n elements and an element x
Output: j if x = A[j], 1 ≤ j ≤ n, and 0 otherwise
1. j ← 1
2. while (j < n) and (x ≠ A[j])
3.   j ← j + 1
4. end while
5. if x = A[j] then return j else return 0
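As a minimal sketch, the pseudocode above maps onto Python as follows (Python lists are 0-based, so the function returns the 1-based index used in the book, with 0 meaning "not found"):

```python
def linear_search(A, x):
    """Algorithm 1.1 LINEARSEARCH: return the 1-based index j with
    A[j] = x (in the book's indexing), or 0 if x is not in A."""
    n = len(A)
    if n == 0:
        return 0
    j = 1
    while j < n and x != A[j - 1]:   # scan until a match or the last slot
        j += 1
    return j if A[j - 1] == x else 0
```

The guard on the empty array is a Python-side addition; the pseudocode implicitly assumes n ≥ 1.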

Special situation What should we do if the elements in A are sorted? Example: A[1..14]=1, 4, 5, 7, 8, 9, 10, 12, 15, 22, 23, 27, 32, 35

Binary search Algorithm 1.2 BINARYSEARCH
Input: An array A[1..n] of n elements sorted in nondecreasing order and an element x
Output: j if x = A[j], 1 ≤ j ≤ n, and 0 otherwise
1. low ← 1, high ← n, j ← 0
2. while (low ≤ high) and (j = 0)
3.   mid ← ⌊(low + high)/2⌋
4.   if x = A[mid] then j ← mid
5.   else if x < A[mid] then high ← mid − 1
6.   else low ← mid + 1
7. end while
8. return j
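A direct Python rendering of the pseudocode (again returning the 1-based index, 0 if absent):

```python
def binary_search(A, x):
    """Algorithm 1.2 BINARYSEARCH on a sorted array A; returns the
    1-based index of x, or 0 if x does not occur in A."""
    low, high, j = 1, len(A), 0
    while low <= high and j == 0:
        mid = (low + high) // 2          # floor((low + high)/2)
        if x == A[mid - 1]:
            j = mid
        elif x < A[mid - 1]:
            high = mid - 1               # search the left half
        else:
            low = mid + 1                # search the right half
    return j
```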

Decision tree

Theorem 1.1 The number of comparisons performed by Algorithm BINARYSEARCH on a sorted array of size n is at most ⌊log n⌋ + 1.
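The bound can be checked empirically with an instrumented binary search; following the book's analysis, each loop iteration (one three-way comparison with A[mid]) counts as one comparison. The helper name and exhaustive small-n check are illustrative, not from the slides:

```python
import math

def binary_search_count(A, x):
    """Binary search returning (1-based index or 0, comparison count),
    counting one comparison per loop iteration as in the analysis."""
    low, high, j, comps = 1, len(A), 0, 0
    while low <= high and j == 0:
        comps += 1
        mid = (low + high) // 2
        if x == A[mid - 1]:
            j = mid
        elif x < A[mid - 1]:
            high = mid - 1
        else:
            low = mid + 1
    return j, comps

# Exhaustive check of the bound floor(log2 n) + 1 for small n:
for n in range(1, 200):
    A = list(range(n))
    bound = math.floor(math.log2(n)) + 1
    for x in range(-1, n + 1):           # present and absent keys alike
        assert binary_search_count(A, x)[1] <= bound
```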

Selection sort Algorithm 1.4 SELECTIONSORT
Input: An array A[1..n] of n elements
Output: A[1..n] sorted in nondecreasing order
1. for i ← 1 to n − 1
2.   k ← i
3.   for j ← i + 1 to n   {Find the ith smallest element}
4.     if A[j] < A[k] then k ← j
5.   end for
6.   if k ≠ i then interchange A[i] and A[k]
7. end for
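A Python sketch of Algorithm 1.4, sorting in place with 0-based indices:

```python
def selection_sort(A):
    """Algorithm 1.4 SELECTIONSORT: in each pass, find the i-th smallest
    element and swap it into position i."""
    n = len(A)
    for i in range(n - 1):
        k = i
        for j in range(i + 1, n):        # locate the minimum of A[i..n-1]
            if A[j] < A[k]:
                k = j
        if k != i:                       # a swap costs 3 assignments,
            A[i], A[k] = A[k], A[i]      # hence at most 3(n-1) in total
    return A
```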

Selection sort Observation: The number of element comparisons performed by Algorithm SELECTIONSORT is n(n-1)/2. The number of element assignments is between 0 and 3(n-1)

Insertion sort Algorithm 1.5 INSERTIONSORT
Input: An array A[1..n] of n elements
Output: A[1..n] sorted in nondecreasing order
1. for i ← 2 to n
2.   x ← A[i]
3.   j ← i − 1
4.   while (j > 0) and (A[j] > x)
5.     A[j+1] ← A[j]
6.     j ← j − 1
7.   end while
8.   A[j+1] ← x
9. end for
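The same algorithm as a Python sketch, 0-based:

```python
def insertion_sort(A):
    """Algorithm 1.5 INSERTIONSORT: insert A[i] into the already-sorted
    prefix A[0..i-1], shifting larger elements one slot to the right."""
    for i in range(1, len(A)):
        x = A[i]
        j = i - 1
        while j >= 0 and A[j] > x:       # shift elements greater than x
            A[j + 1] = A[j]
            j -= 1
        A[j + 1] = x
    return A
```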

Insertion sort Observation: The number of element comparisons performed by Algorithm INSERTIONSORT is between n − 1 and n(n−1)/2. The number of element assignments is equal to the number of element comparisons plus n − 1.

Bottom-up merge sorting

Merge Algorithm 1.3 MERGE
Input: An array A[1..m] of elements and three indices p, q and r, with 1 ≤ p ≤ q < r ≤ m, such that both of the subarrays A[p..q] and A[q+1..r] are sorted individually in nondecreasing order
Output: A[p..r] contains the result of merging the two subarrays A[p..q] and A[q+1..r]. A[p..r] is sorted in nondecreasing order

Merge
1. comment: B[p..r] is an auxiliary array
2. s ← p, t ← q + 1, k ← p
3. while s ≤ q and t ≤ r
4.   if A[s] ≤ A[t] then
5.     B[k] ← A[s]
6.     s ← s + 1
7.   else
8.     B[k] ← A[t]
9.     t ← t + 1
10.  end if
11.  k ← k + 1
12. end while
13. if s = q + 1 then B[k..r] ← A[t..r]
14. else B[k..r] ← A[s..q]
15. end if
16. A[p..r] ← B[p..r]
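A Python sketch of Algorithm 1.3, taking the book's 1-based, inclusive indices p, q, r and using a Python list as the auxiliary array B:

```python
def merge(A, p, q, r):
    """Algorithm 1.3 MERGE: merge the sorted subarrays A[p..q] and
    A[q+1..r] (1-based, inclusive) in place via an auxiliary buffer."""
    B = []
    s, t = p, q + 1
    while s <= q and t <= r:
        if A[s - 1] <= A[t - 1]:
            B.append(A[s - 1])
            s += 1
        else:
            B.append(A[t - 1])
            t += 1
    if s == q + 1:
        B.extend(A[t - 1:r])             # copy the rest of A[t..r]
    else:
        B.extend(A[s - 1:q])             # copy the rest of A[s..q]
    A[p - 1:r] = B                       # write B[p..r] back into A[p..r]
```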

Observation The number of element comparisons performed by Algorithm MERGE to merge two nonempty arrays of sizes n1 and n2, respectively, where n1 ≤ n2, into one sorted array of size n = n1 + n2 is between n1 and n − 1. In particular, if the two array sizes are ⌊n/2⌋ and ⌈n/2⌉, the number of comparisons needed is between ⌊n/2⌋ and n − 1. The number of element assignments performed by Algorithm MERGE to merge two arrays into one sorted array of size n is exactly 2n.

Bottom-up Merge Sorting Algorithm 1.6 BOTTOMUPSORT
Input: An array A[1..n] of n elements
Output: A[1..n] sorted in nondecreasing order
1. t ← 1
2. while t < n
3.   s ← t, t ← 2s, i ← 0
4.   while i + t ≤ n
5.     MERGE(A, i+1, i+s, i+t)
6.     i ← i + t
7.   end while
8.   if i + s < n then MERGE(A, i+1, i+s, n)
9. end while
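A self-contained Python sketch of Algorithm 1.6; a compact merge helper is included so the example runs on its own (its slice-based style differs from the pseudocode of Algorithm 1.3 but has the same effect):

```python
def merge(A, p, q, r):
    """Merge sorted A[p..q] and A[q+1..r] (1-based, inclusive) in place."""
    left, right = A[p - 1:q], A[q:r]
    s = t = 0
    for k in range(p - 1, r):
        if t == len(right) or (s < len(left) and left[s] <= right[t]):
            A[k] = left[s]
            s += 1
        else:
            A[k] = right[t]
            t += 1

def bottom_up_sort(A):
    """Algorithm 1.6 BOTTOMUPSORT: repeatedly merge adjacent runs of
    size s = 1, 2, 4, ... until the whole array is one sorted run."""
    n = len(A)
    t = 1
    while t < n:
        s, t, i = t, 2 * t, 0
        while i + t <= n:
            merge(A, i + 1, i + s, i + t)   # merge two full runs of size s
            i += t
        if i + s < n:
            merge(A, i + 1, i + s, n)       # trailing partial run
    return A
```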

Observation The total number of element comparisons performed by Algorithm BOTTOMUPSORT to sort an array of n elements, where n is a power of 2, is between (n log n)/2 and n log n − n + 1. The total number of element assignments done by the algorithm is exactly 2n log n.

Time complexity The running time of a program is determined by: 1) the input size; 2) the quality of the code; 3) the quality of the computer system; 4) the time complexity of the algorithm. We are mostly concerned with the behavior of the algorithm under investigation on large input instances, so we may talk about the rate of growth, or order of growth, of the running time.

Running times vs Input sizes

Running times vs input sizes The largest instance size solvable in a fixed time budget, on the present computer (N1, …, N6) and on faster machines:

complexity function | present computer | 100 times faster | 1000 times faster
n                   | N1               | 100 N1           | 1000 N1
n^2                 | N2               | 10 N2            | 31.6 N2
n^3                 | N3               | 4.64 N3          | 10 N3
n^5                 | N4               | 2.5 N4           | 3.98 N4
2^n                 | N5               | N5 + 6.64        | N5 + 9.97
3^n                 | N6               | N6 + 4.19        | N6 + 6.29

Elementary operation Definition: We denote by an "elementary operation" any computational step whose cost is always upper-bounded by a constant amount of time, regardless of the input data or the algorithm used. Examples: arithmetic operations (addition, subtraction, multiplication and division); comparisons and logical operations; assignments, including assignments of pointers when, say, traversing a list or a tree.

O-notation Definition: Let f(n) and g(n) be two functions from the set of natural numbers to the set of nonnegative real numbers. f(n) is said to be O(g(n)) if there exist a natural number n0 and a constant c > 0 such that for all n ≥ n0, f(n) ≤ c·g(n). Consequently, if lim n→∞ f(n)/g(n) exists, then lim n→∞ f(n)/g(n) ≠ ∞ implies f(n) = O(g(n)). Informally, this definition says that f grows no faster than some constant times g.

Ω-notation Definition: Let f(n) and g(n) be two functions from the set of natural numbers to the set of nonnegative real numbers. f(n) is said to be Ω(g(n)) if there exist a natural number n0 and a constant c > 0 such that for all n ≥ n0, f(n) ≥ c·g(n). Consequently, if lim n→∞ f(n)/g(n) exists, then lim n→∞ f(n)/g(n) ≠ 0 implies f(n) = Ω(g(n)). Informally, this definition says that f grows at least as fast as some constant times g. f(n) = Ω(g(n)) if and only if g(n) = O(f(n)).

Θ-notation Definition: Let f(n) and g(n) be two functions from the set of natural numbers to the set of nonnegative real numbers. f(n) is said to be Θ(g(n)) if there exist a natural number n0 and two positive constants c1 and c2 such that for all n ≥ n0, c1·g(n) ≤ f(n) ≤ c2·g(n). Consequently, if lim n→∞ f(n)/g(n) exists, then lim n→∞ f(n)/g(n) = c implies f(n) = Θ(g(n)), where c is a constant strictly greater than 0. f(n) = Θ(g(n)) if and only if f(n) = Ω(g(n)) and f(n) = O(g(n)).

o-notation Definition: Let f(n) and g(n) be two functions from the set of natural numbers to the set of nonnegative real numbers. f(n) is said to be o(g(n)) if for every constant c > 0 there exists a positive integer n0 such that f(n) < c·g(n) for all n ≥ n0. Consequently, if lim n→∞ f(n)/g(n) exists, then lim n→∞ f(n)/g(n) = 0 implies f(n) = o(g(n)). Informally, this definition says that f(n) becomes insignificant relative to g(n) as n approaches infinity. f(n) = o(g(n)) if and only if f(n) = O(g(n)) but g(n) ≠ O(f(n)).
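The limit tests above can be illustrated numerically; this is an informal sanity check, not a proof, and the particular f and g are chosen here as an example:

```python
# With f(n) = 5n + 3 and g(n) = n**2, f(n)/g(n) tends to 0 as n grows,
# so f(n) = o(g(n)) -- and hence f(n) = O(g(n)) but g(n) is not O(f(n)).
def f(n):
    return 5 * n + 3

def g(n):
    return n ** 2

ratios = [f(n) / g(n) for n in (10, 100, 1000, 10000)]
assert all(a > b for a, b in zip(ratios, ratios[1:]))   # strictly decreasing
assert ratios[-1] < 0.001                               # heading toward 0
```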

Space complexity The work space cannot exceed the running time of an algorithm, since writing into each memory cell requires at least a constant amount of time: S(n) = O(T(n)). In many problems there is a time-space tradeoff: the more space we allocate for the algorithm, the faster it runs, and vice versa. But this holds only within limits: increasing the amount of space does not always result in a noticeable speedup of the algorithm's running time. Conversely, decreasing the amount of work space required by an algorithm typically comes at the cost of a degradation in its speed.

Optimal algorithm In general, if we can prove that any algorithm that solves problem Π must take time Ω(f(n)), then we call any algorithm that solves problem Π in time O(f(n)) an optimal algorithm for problem Π.

How to estimate the running time of an algorithm 1. Counting the number of iterations. 2. Counting the frequency of basic operations. Definition: An elementary operation in an algorithm is called a basic operation if it is of highest frequency, to within a constant factor, among all elementary operations. 3. Using recurrence relations.

Basic operation
Algorithm 1.5 INSERTIONSORT
Input: An array A[1..n] of n elements
Output: A[1..n] sorted in nondecreasing order
1. for i ← 2 to n
2.   x ← A[i]
3.   j ← i − 1
4.   while (j > 0) and (A[j] > x)
5.     A[j+1] ← A[j]
6.     j ← j − 1
7.   end while
8.   A[j+1] ← x
9. end for

Algorithm 1.12 MODINSERTIONSORT
Input: An array A[1..n] of n elements
Output: A[1..n] sorted in nondecreasing order
1. for i ← 2 to n
2.   x ← A[i]
3.   k ← MODBINARYSEARCH(A[1..i−1], x)
4.   for j ← i − 1 downto k
5.     A[j+1] ← A[j]
6.   end for
7.   A[k] ← x
8. end for
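A Python sketch of Algorithm 1.12, where the standard library's bisect_right stands in for MODBINARYSEARCH (an assumption of this sketch: both return the position at which x should be inserted into the sorted prefix). Comparisons drop to O(log i) per element, but the number of element assignments, the basic operation here, is unchanged:

```python
import bisect

def mod_insertion_sort(A):
    """Algorithm 1.12 MODINSERTIONSORT sketch: find the insertion point
    by binary search (bisect_right plays the role of MODBINARYSEARCH),
    then shift A[k..i-1] one slot right and place x."""
    for i in range(1, len(A)):
        x = A[i]
        k = bisect.bisect_right(A, x, 0, i)   # insertion point in A[0..i-1]
        for j in range(i - 1, k - 1, -1):     # same shifts as Algorithm 1.5
            A[j + 1] = A[j]
        A[k] = x
    return A
```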

Worst case and average case analysis For many problems, the performance of an algorithm is a function not only of n but also of the form of the input elements; hence worst-case analysis, average-case analysis, and best-case analysis.

Amortized analysis In amortized analysis, we average out the time taken by the operation throughout the execution of the algorithm, and refer to this average as the amortized running time of that operation. Amortized analysis guarantees the average cost of the operation, and thus the algorithm, in the worst case. This is to be contrasted with the average time analysis in which the average is taken over all instances of the same size. Moreover, unlike the average case analysis, no assumptions about the probability distribution of the input are needed.
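A classic illustration of this idea, added here as a hedged example not taken from the slides, is appending into a dynamically doubling array: a single append can cost Θ(n) when the buffer is copied, yet the total work over n appends stays within 3n writes, so the amortized cost per append is O(1) in the worst case, with no probabilistic assumptions. The class name and the writes counter are inventions of this sketch:

```python
class DoublingArray:
    """Append-only array that doubles its capacity when full,
    counting every element write (including copies) in self.writes."""
    def __init__(self):
        self.capacity, self.size, self.writes = 1, 0, 0
        self.data = [None]

    def append(self, x):
        if self.size == self.capacity:       # full: copy into a 2x buffer
            self.capacity *= 2
            new = [None] * self.capacity
            for i in range(self.size):
                new[i] = self.data[i]
                self.writes += 1
            self.data = new
        self.data[self.size] = x
        self.size += 1
        self.writes += 1

arr = DoublingArray()
for k in range(1000):
    arr.append(k)
assert arr.writes <= 3 * 1000                # total work is O(n) overall
```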

Input size Sorting and searching: number of entries in the array or list Graph: the number of vertices or edges in the graph, or both Computational geometry: the number of points, vertices, edges, line segments, polygons, etc Matrix operation: the dimensions of the input matrices Number theory and cryptography: the number of bits These “heterogeneous” measures have brought about some inconsistencies when comparing the amount of time or space required by two algorithms

Input size
Algorithm 1.13 FIRST
Input: A positive integer n and an array A[1..n] with A[j] = j, 1 ≤ j ≤ n
Output: ∑ A[j], j = 1, …, n
1. sum ← 0
2. for j ← 1 to n
3.   sum ← sum + A[j]
4. end for
5. return sum

Algorithm 1.14 SECOND
Input: A positive integer n
Output: ∑ j, j = 1, …, n
1. sum ← 0
2. for j ← 1 to n
3.   sum ← sum + j
4. end for
5. return sum

Both algorithms run in time Θ(n). But…
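The two algorithms can be sketched side by side in Python; they compute the same value in Θ(n) steps, but the input to FIRST is n numbers while the input to SECOND is just the integer n, whose representation is only about log n bits, which is what makes comparing their "efficiency" by n alone misleading:

```python
def first(A):
    """Algorithm 1.13 FIRST: sum the array A[1..n]; the input consists
    of n numbers, so the running time is linear in the input size."""
    total = 0
    for a in A:
        total += a
    return total

def second(n):
    """Algorithm 1.14 SECOND: sum 1..n; the input is the single integer
    n, measured in bits, so Theta(n) steps is large relative to it."""
    total = 0
    for j in range(1, n + 1):
        total += j
    return total

n = 100
assert first(list(range(1, n + 1))) == second(n) == n * (n + 1) // 2
```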