Introduction to Computer Science 2 (SS 08): Asymptotic Complexity. ©2008 DEEDS Group.


1. Asymptotic Complexity
Introduction to Computer Science 2, DEEDS Group, TU Darmstadt
Prof. Neeraj Suri, Constantin Sarbu, Brahim Ayari, Dan Dobre, Abdelmajid Khelil

2. Remember: Sequential Search
Given: An array A of integers and a constant c.
Question: Is c in A?

boolean contains (int [] A, int c) {
   int n = A.length;
   boolean found = false;
   for (int i = 0; i < n; i++)
      if (A[i] == c)
         found = true;
   return (found);
}

Time complexity: counting operations per input:

Input A               c    n    Assignments    Comparisons    Array accesses    Increments
[1,4,2,7]
[2,7,6,1]
[2,1,8,4,19,7,16,3]
[4,4,4,4,4,4]

Memory complexity (in Java): int: 4 bytes, boolean: 1 byte.
Memory used: size(A) + size(c) + size(n) + size(i) + size(found) = 4n + 13 bytes
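The routine above runs as-is once the comma in the loop header is replaced by a semicolon. A minimal runnable sketch; the search values c are my own choices, since the table on the slide leaves them blank:

```java
public class SequentialSearch {

    // Sequential search from the slide, with the loop header fixed
    // (comma replaced by a semicolon).
    static boolean contains(int[] A, int c) {
        int n = A.length;
        boolean found = false;
        for (int i = 0; i < n; i++)
            if (A[i] == c)
                found = true;   // keeps scanning after a hit: always n iterations
        return found;
    }

    public static void main(String[] args) {
        System.out.println(contains(new int[]{1, 4, 2, 7}, 2));               // true
        System.out.println(contains(new int[]{2, 1, 8, 4, 19, 7, 16, 3}, 5)); // false
    }
}
```

Note that the loop does not break on a match, so the operation counts in the table depend only on n, not on where (or whether) c occurs.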

3. Why Asymptotic Complexity?
Time complexity
‣ gives a simple characterization of an algorithm's efficiency
‣ allows us to compare it to alternative algorithms
In the last lecture we determined the exact running time, but the extra precision is usually not worth the effort of computing it.
For large input sizes, constants and lower-order terms are dominated and can be dropped.
This means we study the asymptotic complexity of algorithms: we are interested in how the running time grows with the size of the input, in the limit.
Note: an asymptotically more efficient algorithm is not always the best choice for very small inputs ;)

4. Today: Efficiency Metrics - Complexity
Upper bounds: O (big O) notation
‣ Properties, proof of f ∈ O(g), sum and product rules
‣ Loops, conditional statements, conditions, procedures
‣ Examples: sequential search, selection sort
Lower bounds: Ω (Omega) notation
Tight bounds: Θ (Theta) notation

5. Asymptotic Time Complexity: Upper Bound
[Figure: plot of T(n) over n; for some constant c > 0 there exists n₀ such that c·g(n) ≥ f(n) for all n > n₀]

6. O-Notation (pronounced: "big-Oh")
Given f: ℕ → ℝ⁺ and g: ℕ → ℝ⁺.
Definition: O(g) = { f | ∃ n₀ ∈ ℕ, c ∈ ℝ, c > 0: ∀ n ≥ n₀: f(n) ≤ c·g(n) }
Intuitively: O(g) is the set of all functions f that grow at most as fast as g.
One says: "If f ∈ O(g), then g is an asymptotic upper bound for f."

7. Example
O(n⁴) = { …, n, n², n·log n, n³, n⁴, 3n⁴, c·n³, … }
‣ n³ ∈ O(n⁴)
‣ n·log n ∈ O(n⁴)
‣ n⁴ ∈ O(n⁴)
Generally: "slower growth ∈ O(faster growth)"

8. O-Notation
Often shortened to f = O(g) instead of f ∈ O(g)
‣ But: f = O(g) is not an equality in the usual sense; it can only be read from left to right!
Normally, for the analysis of algorithms:
‣ f: ℕ → ℕ and g: ℕ → ℕ,
‣ since the argument is the size of the input data and the value is the number of elementary operations
For average-case analysis the set ℝ⁺ is also used:
‣ f: ℕ → ℝ⁺ and g: ℕ → ℝ⁺

9. Example: O-Notation
T₁(n) = n + 3 ∈ O(n), because n + 3 ≤ 2n for all n ≥ 3
T₂(n) = 3n + 7 ∈ O(n)
T₃(n) = 1000n ∈ O(n)
T₄(n) = 695n² + n ∈ O(n²)
The functions considered are mostly monotonically increasing and ≥ 0.
Criterion for establishing f ∈ O(g): if f(n)/g(n) ≤ c for all n ≥ some n₀, then f ∈ O(g).
Example: lim_{n→∞} T₄(n)/n² = lim_{n→∞} (695n² + n)/n² = 695
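Numerically, the ratio T₄(n)/n² settles toward the constant 695 as n grows. A small sketch of the limit criterion, assuming the reconstruction T₄(n) = 695n² + n from the slide:

```java
public class RatioCheck {

    static double t4(double n) { return 695 * n * n + n; }   // f(n) = 695n^2 + n
    static double g(double n)  { return n * n; }             // g(n) = n^2

    public static void main(String[] args) {
        // The ratio f(n)/g(n) tends to 695, so any c > 695 works in the definition.
        for (double n : new double[]{10, 1000, 100000}) {
            System.out.println("n = " + n + ": f(n)/g(n) = " + t4(n) / g(n));
        }
    }
}
```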

10. Proving that f ∈ O(g)
The proof has two parts:
‣ Finding the closed form
‣ Solving the inequality f(n) ≤ c·g(n) from the definition
Illustration using an example:
‣ A is an algorithm that sorts a set of numbers in increasing order
‣ Assumption: A performs f(n) = 3·(1 + 2 + … + n) operations
‣ Proposition: A has the complexity O(n²)
‣ Closed form: f(n) = 3·(1 + 2 + … + n) = 3n(n+1)/2

11. Proving that f ∈ O(g)
Task: Find a value c for which 3n(n+1)/2 ≤ c·n² (for all n greater than some n₀).
Try c = 3:
3n(n+1)/2 ≤ 3n²
n² + n ≤ 2n²
n ≤ n²
1 ≤ n, which holds for all n ≥ 1. Q.E.D.
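The inequality can also be checked mechanically over a range of n; a sanity-check sketch of my own, not part of the slides:

```java
public class BoundCheck {

    // Does 3n(n+1)/2 <= 3n^2 hold for this n?
    static boolean holds(long n) {
        return 3 * n * (n + 1) / 2 <= 3 * n * n;
    }

    public static void main(String[] args) {
        boolean all = true;
        for (long n = 1; n <= 1_000_000; n++)
            all &= holds(n);
        System.out.println(all);   // true: the bound holds on the whole range
    }
}
```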

12. Consequences of the O-Notation
O-notation is a simplification:
‣ It eliminates constants: O(n) = O(n/2) = O(17n)
‣ It forms an upper bound, i.e.: from f(n) ∈ O(n log₂ n) it follows that f(n) ∈ O(n²)
‣ For O-notation the base of the logarithm is irrelevant, since log_a n = log_b n / log_b a, and 1/log_b a is just a constant factor

13. Properties of O-Notation
Inclusion relations of the O-notation:
O(1) ⊂ O(log n) ⊂ O(n) ⊂ O(n log n) ⊂ O(n²) ⊂ O(n³) ⊂ O(2ⁿ) ⊂ O(10ⁿ) ⊂ …
We try to make the bounds as tight as possible.

14. Pronunciation
O(1)          constant
O(log n)      logarithmic
O(n)          linear
O(n log n)    "n log n"
O(n²)         quadratic
O(n³)         cubic
O(nᵏ), k ≥ 2  polynomial
O(2ⁿ)         exponential

15. Calculating the Time Complexity
The time complexity of a program derives from the complexity of its parts.
The complexity of an elementary operation is O(1) (elementary operations: e.g. assignment, comparison, arithmetic operation, array access, …).
A fixed sequence of elementary operations (independent of the input size n) also has complexity O(1).

16. Sum and Product Rules
Given the time complexities of two program parts, T₁ ∈ O(f) and T₂ ∈ O(g):
Sum rule: for the execution of T₁ followed by T₂: T₁ + T₂ ∈ O(max(f, g))
Product rule: for the nested execution of T₁ and T₂: T₁ · T₂ ∈ O(f · g)

17. Loops in Series
Loops in series (n and m are the problem sizes):

for (int i = 0; i < n; i++) {
   operation;
}
for (int j = 0; j < m; j++) {
   operation;
}

Complexity: O(n + m) = O(max(n, m)) (sum rule)

18. Nested Loops
Nested loops (n is the problem size):
‣ When the number of inner-loop iterations does not depend on the problem size, e.g.:

for (int i = 0; i < n; i++)
   for (int j = 0; j < 17; j++)
      operation;

Complexity: O(17n) = O(n) (product rule)
‣ Otherwise:

for (int i = 0; i < n; i++)
   for (int j = 0; j < n; j++)
      operation;

Complexity: O(n · n) = O(n²) (product rule)
Example: reading the data of an n × n matrix is already expensive: O(n²)!
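Counting the inner-body executions directly illustrates both rules; a self-contained sketch with made-up sizes n and m:

```java
public class LoopCosts {

    // Two loops in series: the body runs n + m times (sum rule).
    static long inSeries(int n, int m) {
        long ops = 0;
        for (int i = 0; i < n; i++) ops++;
        for (int j = 0; j < m; j++) ops++;
        return ops;
    }

    // Nested loops over the same size: the body runs n * n times (product rule).
    static long nested(int n) {
        long ops = 0;
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++)
                ops++;
        return ops;
    }

    public static void main(String[] args) {
        System.out.println(inSeries(1000, 500)); // 1500
        System.out.println(nested(1000));        // 1000000
    }
}
```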

19. Conditional Statement
Conditional statement: if B then T₁ else T₂
‣ The cost of evaluating the "if" condition is constant, therefore negligible
‣ T(n) = T₁(n) or T(n) = T₂(n)
‣ Good (if decidable): assume the longer branch is taken, i.e., use the dominant branch
‣ An upper-bound estimate is also possible: T(n) ≤ T₁(n) + T₂(n) ∈ O(g₁(n) + g₂(n))

20. Condition Example
Loop with a condition (n is the problem size):

for (int i = 0; i < n; i++) {
   if (i == 0)
      block1;
   else
      block2;
}

block1 is executed only once => not relevant
‣ (provided that T(block1) is not much larger than n · T(block2))
block2 is dominant
Complexity: O(n · T(block2))
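Counting how often each branch runs makes the dominance argument concrete. A sketch; block1/block2 are simple counters standing in for the slide's blocks:

```java
public class BranchCost {

    // Returns {executions of block1, executions of block2} for the slide's loop.
    static long[] branchCounts(int n) {
        long block1 = 0, block2 = 0;
        for (int i = 0; i < n; i++) {
            if (i == 0) block1++;  // taken exactly once
            else        block2++;  // taken n - 1 times: dominant
        }
        return new long[]{block1, block2};
    }

    public static void main(String[] args) {
        long[] counts = branchCounts(1000);
        System.out.println(counts[0] + " " + counts[1]); // 1 999
    }
}
```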

21. Procedure Calls
Procedures are analyzed separately, and their execution times are inserted at each call site.
For recursive procedure calls, a recurrence relation for T(n) must be found.
Once again: find a closed form for the recurrence relation (an example follows shortly).

22. Analysis of Simple Algorithms
Iterative algorithms (today)
‣ Composed of smaller parts => sum rule
‣ Consider loops => product rule
Recursive algorithms (next lecture)
‣ Time factors:
   Breaking a problem into several smaller ones
   Solving the sub-problems
   Recursive calls of the method for solving the problem
   Combining the solutions of the sub-problems

23. Example 1: Sequential Search
The cost consists of a part a (outside the loop) and a part b (inside the loop) that is repeated n times:
T(n) = a + b·n, so T(n) ∈ O(n)

boolean contains (int [] A, int c) {
   int n = A.length;                  // a: outside the loop
   boolean found = false;
   for (int i = 0; i < n; i++)
      if (A[i] == c)                  // b: inside the loop
         found = true;
   return (found);
}

24. Example 2: Selection Sort
The outer loop is executed n times, with constant cost b per iteration; the inner loop is executed i times, bounded above by c·n.
Cost: n·(b + c·n) = b·n + c·n² => O(n²)

void SelectionSort (int [] A) {
   int MinPosition, temp, i, j;
   int n = A.length;                  // number of elements
   for (i = n-1; i > 0; i--) {
      MinPosition = i;                // b: outer loop body
      for (j = 0; j < i; j++)
         if (A[j] < A[MinPosition])   // c: inner loop body
            MinPosition = j;
      temp = A[i];
      A[i] = A[MinPosition];
      A[MinPosition] = temp;
   }
}
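Instrumenting the slide's code with a comparison counter confirms the quadratic cost: the inner comparison runs (n-1) + (n-2) + … + 1 = n(n-1)/2 times. This is my own sketch; note that, as written with `<` and the minimum moved to the back, the slide's version produces descending order:

```java
import java.util.Arrays;

public class SelectionSortDemo {

    // Selection sort as on the slide, returning the number of comparisons made.
    static long selectionSort(int[] A) {
        long comparisons = 0;
        int n = A.length;
        for (int i = n - 1; i > 0; i--) {
            int minPosition = i;
            for (int j = 0; j < i; j++) {
                comparisons++;
                if (A[j] < A[minPosition])
                    minPosition = j;
            }
            int temp = A[i];            // move the minimum of A[0..i] to index i
            A[i] = A[minPosition];
            A[minPosition] = temp;
        }
        return comparisons;
    }

    public static void main(String[] args) {
        int[] a = {5, 2, 9, 1, 7, 3};
        long c = selectionSort(a);
        System.out.println(Arrays.toString(a)); // [9, 7, 5, 3, 2, 1]
        System.out.println(c);                  // 6 * 5 / 2 = 15
    }
}
```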

25. Ω (Omega) Notation
Analogous to O(f) we have:
Ω(g) = { h | ∃ c > 0: ∃ n₀ > 0: ∀ n > n₀: h(n) ≥ c·g(n) }
Intuitively: Ω(g) is the set of all functions that grow at least as fast as g.
One says: "If f ∈ Ω(g), then g sets a lower bound for f."
Note: f ∈ O(g) ⇔ g ∈ Ω(f)

26. Example: Ω-Notation
[Figure: plot of T(n) over n; there exist c₂, n₀ > 0 such that f(n) ≥ c₂·g(n) for all n > n₀, i.e. f(n) = Ω(g(n)); "g(n) sets a lower bound for f(n)"]

27. Θ (Theta) Notation
With the sets O(g) and Ω(g) we can define:
Θ(g) = O(g) ∩ Ω(g)
Intuitively: Θ(g) is the set of functions that grow exactly as fast as g.
Meaning: if f ∈ O(g) and f ∈ Ω(g), then f ∈ Θ(g).
In this case one speaks of a tight (exact) bound.
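A short worked instance (my own example, not from the slides): for f(n) = 3n² + n, the constants c₁ = 3, c₂ = 4 and n₀ = 1 witness f ∈ Θ(n²):

```latex
% Sandwich f(n) = 3n^2 + n between multiples of g(n) = n^2:
3n^2 \;\le\; 3n^2 + n \;\le\; 4n^2 \qquad \text{for all } n \ge 1
% left inequality: n \ge 0; right inequality: n \le n^2, i.e. n \ge 1
```

The left inequality gives f ∈ Ω(n²), the right one f ∈ O(n²), hence f ∈ Θ(n²).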

28. Example: Θ-Notation
[Figure: plot of T(n) over n; there exist c₁, c₂, n₀ > 0 such that c₁·g(n) ≤ f(n) ≤ c₂·g(n) for all n > n₀, i.e. f(n) = Θ(g(n)); "g(n) sets an exact bound for f(n)"]

29. Non-Asymptotic Execution Time
Algorithms with a higher asymptotic complexity can be more efficient for smaller problem sizes:
‣ the asymptotic execution time only takes over for sufficiently large n
‣ the constants do make a difference for smaller input sets

Algorithm   T(n)            Good for n = ...
A1          … log₂ n        n > 2048
A2          1000 n          1024 ≤ n ≤ 2048
A3          100 n log₂ n    59 ≤ n ≤ 1024
A4          10 n²           10 ≤ n ≤ 58
A5          n³              n = 10
A6          2ⁿ              2 ≤ n ≤ 9
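The crossover between two rows of the table can be checked directly. This sketch compares A4's cost 10n² against A6's cost 2ⁿ around the boundary between n = 9 and n = 10:

```java
public class SmallInputs {

    static double quad(int n) { return 10.0 * n * n; }   // A4: 10 n^2
    static double expo(int n) { return Math.pow(2, n); } // A6: 2^n

    public static void main(String[] args) {
        for (int n = 8; n <= 11; n++) {
            String cheaper = expo(n) < quad(n) ? "2^n" : "10n^2";
            System.out.println("n = " + n + ": cheaper is " + cheaper);
        }
        // At n = 9 the exponential cost is still smaller (512 < 810);
        // from n = 10 on, 10n^2 wins (1000 < 1024).
    }
}
```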

30. Complexity and Recursion
So far we have seen only iterative algorithms. What about recursive algorithms?
Following week: refreshing recursion
Then: complexity analysis with recurrence relations