Complexity Analysis: Asymptotic Analysis (Nattee Niparnan). Recall: What is the measurement of an algorithm? How do we compare two algorithms? The definition of asymptotic notation.

Presentation transcript:

Nattee Niparnan

Recall: What is the measurement of an algorithm? How do we compare two algorithms? The definition of asymptotic notation.

Today's Topic

Interesting Topics of the Upper Bound. Rule of thumb! We neglect: lower-order terms in a sum, e.g., n^3 + n^2 = O(n^3); constant factors, e.g., 3n^3 = O(n^3). Remember that we write = instead of the (more correct) ∈.

Why Discard Constants? From the definition, we may pick any constant c. E.g., 3n = O(n), because when we let c ≥ 3 the condition 3n ≤ c·n is satisfied.

Why Discard Lower-Order Terms? Consider f(n) = n^3 + n^2 and g(n) = n^3. If f(n) = O(g(n)), then for some c and n_0 we need c·g(n) − f(n) > 0 for all n ≥ n_0. Any c > 1 will do.

Why Discard Lower-Order Terms? Try c = 1.1. Is 0.1n^3 − n^2 > 0? It is when 0.1n > 1, i.e., when n > 10. Working it out: c·g(n) − f(n) = 1.1n^3 − (n^3 + n^2) = 0.1n^3 − n^2, and 0.1n^3 − n^2 > 0 ⟺ 0.1n^3 > n^2 ⟺ 0.1n > 1 ⟺ n > 10.
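To see the constant-hunting concretely, here is a minimal C sketch (ours, not the slides') tabulating the slack c·g(n) − f(n) for c = 1.1; it turns positive exactly when n > 10.

#include <stdio.h>

int main(void) {
    /* f(n) = n^3 + n^2, g(n) = n^3, c = 1.1 */
    for (long n = 8; n <= 14; n++) {
        double f = (double)n * n * n + (double)n * n;
        double g = (double)n * n * n;
        double slack = 1.1 * g - f;            /* equals 0.1n^3 - n^2 */
        printf("n=%2ld  c*g(n)-f(n) = %6.1f  %s\n",
               n, slack, slack > 0 ? "f(n) <= c*g(n)" : "not yet");
    }
    return 0;
}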

Lower Order Only? In fact, it is only the dominant term that counts. Which one is the dominant term? The one that grows faster. Why? Because eventually the ratio between the terms decides: if g(n) grows faster than f(n), then g(n)/f(n) exceeds any constant, i.e., lim n→∞ g(n)/f(n) = ∞. Then g(n) is the dominant term and f(n) the non-dominant term.

What Dominates What? (The left side dominates.)
n^a vs. n^b (a > b)
n log n vs. n
n^2 log n vs. n log^2 n
c^n vs. n^c
log n vs. 1
n vs. log n
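The ordering can be checked empirically; a rough C sketch (the crossover points depend on constants, so small n can mislead):

#include <stdio.h>
#include <math.h>

int main(void) {
    /* Each ratio is dominant/dominated, so each column should keep growing. */
    for (double n = 4; n <= 256; n *= 4) {
        printf("n=%4.0f  (n log n)/n = %5.2f  (n^2 log n)/(n log^2 n) = %6.2f  2^n/n^2 = %10.3g\n",
               n,
               log2(n),               /* (n log n)/n */
               n / log2(n),           /* (n^2 log n)/(n log^2 n) */
               pow(2.0, n) / (n * n));
    }
    return 0;
}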

Putting It into Practice. What is the asymptotic class of:
1. 0.5n^3 + n^4 − 5(n − 3)(n − 5) + n^3 log_8 n + 25 + n^1.5
2. (n − 5)(n^2 + 3) + log(n^20)
3. 20n^5 + 58n^4 + 15n^3.2 · 3n^2

Putting It into Practice. What is the asymptotic class of:
1. 0.5n^3 + n^4 − 5(n − 3)(n − 5) + n^3 log_8 n + 25 + n^1.5 → O(n^4)
2. (n − 5)(n^2 + 3) + log(n^20) → O(n^3)
3. 20n^5 + 58n^4 + 15n^3.2 · 3n^2 = 20n^5 + 58n^4 + 45n^5.2 → O(n^5.2)

Asymptotic Notation from Program Flow: sequence, conditions, loops, recursive calls.

Sequence. Block A takes f(n) and Block B takes g(n); run one after the other, the total is f(n) + g(n) = O(max(f(n), g(n))).

Example. Block A: O(n), Block B: O(n^2). In sequence: O(n) + O(n^2) = O(n^2).

Example. Block A: Θ(n), Block B: O(n^2). In sequence: O(n^2).

Example. Block A: Θ(n), Block B: Θ(n^2). In sequence: Θ(n^2).

Example. Block A: O(n^2), Block B: Θ(n^2). In sequence: Θ(n^2).

Condition. Block A takes f(n), Block B takes g(n); only one branch executes, so the conditional is O(max(f(n), g(n))).
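As a minimal C sketch (the blocks and the name branch_work are illustrative, not from the slides): whichever branch runs, the statement is bounded by the costlier branch, O(max(n, n^2)) = O(n^2).

long branch_work(long n, int flag) {
    long sum = 0;
    if (flag) {                          /* Block A: f(n) = Θ(n)   */
        for (long i = 0; i < n; i++)
            sum += i;
    } else {                             /* Block B: g(n) = Θ(n^2) */
        for (long i = 0; i < n; i++)
            for (long j = 0; j < n; j++)
                sum += i * j;
    }
    return sum;                          /* worst case over both branches: O(n^2) */
}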

Loops.
for (i = 1; i <= n; i++) { P(i) }
Let P(i) take time t_i; the whole loop takes Σ_{i=1}^{n} t_i.

Example.
for (i = 1; i <= n; i++) { sum += i; }
sum += i is Θ(1), so the loop is Σ_{i=1}^{n} Θ(1) = Θ(n).

Why don't we use max(t_i)? Because the number of terms is not constant.
for (i = 1; i <= n; i++) { sum += i; }   → Θ(n)
for (i = 1; i <= C; i++) { sum += i; }   → Θ(1), but with a big constant (here C stands for some fixed bound).

Example.
for (j = 1; j <= n; j++) {
  for (i = 1; i <= n; i++) { sum += i; }
}
sum += i is Θ(1).

Example. The inner loop costs Σ_{i=1}^{n} Θ(1) = Θ(n), so the nest costs Σ_{j=1}^{n} Θ(n) = Θ(n^2).

Example: Another way. Count how many times sum += i runs in total: n · n = n^2 executions of a Θ(1) statement, i.e., Θ(n^2).

Example: While Loops.
while (n > 0) { n = n - 1; }
Θ(n)

Example: While Loops.
while (n > 0) { n = n - 10; }
Θ(n/10) = Θ(n)

Example: While Loops.
while (n > 0) { n = n / 2; }
Θ(log n)
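Counting iterations directly confirms the bound; a small C sketch (the counter is ours, for illustration):

#include <stdio.h>

int main(void) {
    for (long n = 10; n <= 1000000; n *= 10) {
        long m = n, iters = 0;
        while (m > 0) {          /* the halving loop from the slide */
            m = m / 2;
            iters++;
        }
        printf("n = %7ld  iterations = %2ld\n", n, iters);  /* floor(log2 n) + 1 */
    }
    return 0;
}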

Example: Euclid's GCD.
function gcd(a, b) {
  while (b > 0) {
    tmp = b
    b = a mod b
    a = tmp
  }
  return a
}

Example: Euclid's GCD. How many iterations? Each pass computes a mod and swaps the operands, until the divisor reaches zero.

Example: Euclid's GCD. (Figure: bars comparing a and b.)

Claim: if a > b, then a mod b < a / 2.

Case 1: b > a / 2. Then a mod b = a − b < a / 2.

Case 2: b ≤ a / 2. Then a mod b < b ≤ a / 2 (we could always fit another copy of b into a, so the remainder stays below b).

Example: Euclid's GCD.
function gcd(a, b) {
  while (b > 0) {
    tmp = b
    b = a mod b
    a = tmp
  }
  return a
}
By the claim, each iteration replaces b with a mod b < a / 2, so the operands shrink by at least half per round and the loop runs O(log n) times.
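A compilable C version of the pseudocode, instrumented with an iteration counter (the counter and the sample inputs are additions for illustration):

#include <stdio.h>

/* Euclid's GCD: the new b is a mod b, which is < a/2 when a > b,
   so the operands shrink geometrically and the loop is O(log n). */
unsigned long gcd(unsigned long a, unsigned long b, int *iters) {
    *iters = 0;
    while (b > 0) {
        unsigned long tmp = b;
        b = a % b;
        a = tmp;
        (*iters)++;
    }
    return a;
}

int main(void) {
    int iters;
    unsigned long g = gcd(1071, 462, &iters);
    printf("gcd(1071, 462) = %lu in %d iterations\n", g, iters);
    return 0;
}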

Theorem. If Σ a_i < 1, then T(n) = Σ T(a_i · n) + O(n) implies T(n) = O(n). Example: T(n) = T(0.7n) + T(0.2n) + T(0.01n) + 3n = O(n).

Recursion.
try(n) {
  if (n <= 0) return 0;
  for (j = 1; j <= n; j++) sum += j;
  try(n * 0.7)
  try(n * 0.2)
}

Recursion.
try(n) {
  if (n <= 0) return 0;                  // terminating case: Θ(1)
  for (j = 1; j <= n; j++) sum += j;     // Θ(n)
  try(n * 0.7)                           // recursion: T(0.7n)
  try(n * 0.2)                           // recursion: T(0.2n)
}
T(n) = T(0.7n) + T(0.2n) + O(n)
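The same recursion as compilable C, a minimal sketch: the function is renamed try_ (so the code also compiles as C++) and sum becomes a local that is returned, keeping the example self-contained.

#include <stdio.h>

/* T(n) = T(0.7n) + T(0.2n) + Θ(n)  =>  T(n) = O(n) */
long try_(long n) {
    if (n <= 0) return 0;            /* terminating case: Θ(1) */
    long sum = 0;
    for (long j = 1; j <= n; j++)    /* local work: Θ(n) */
        sum += j;
    sum += try_((long)(n * 0.7));    /* T(0.7n) */
    sum += try_((long)(n * 0.2));    /* T(0.2n) */
    return sum;
}

int main(void) {
    printf("try_(100) = %ld\n", try_(100));
    return 0;
}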

Guessing and Proof by Induction. T(n) = T(0.7n) + T(0.2n) + O(n). Guess: T(n) = O(n), i.e., T(n) ≤ cn. Proof. Basis: obvious. Inductive step: assume T(m) ≤ cm for every m < n. Then T(n) ≤ 0.7cn + 0.2cn + O(n) = 0.9cn + O(n), and the 0.1cn of slack absorbs the O(n) term (the dominating rule), so T(n) = O(n).
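Spelled out, with the O(n) work written as at most d·n for a constant d (d and the choice c ≥ 10d are our notation, not the slides'):

\begin{aligned}
T(n) &\le T(0.7n) + T(0.2n) + dn \\
     &\le 0.7\,cn + 0.2\,cn + dn \qquad \text{(inductive hypothesis)} \\
     &= 0.9\,cn + dn \\
     &\le cn \qquad \text{whenever } c \ge 10d,
\end{aligned}

so T(n) ≤ cn for all sufficiently large n, i.e., T(n) = O(n).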

Using a Recursion Tree. T(n) = 2T(n/2) + n. The root costs n; level 1 has two nodes costing n/2 each; level 2 has four nodes costing n/4 each; and so on down lg n levels. Every level sums to n. T(n) = O(n lg n).
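Summing the tree level by level makes this explicit:

T(n) = \sum_{i=0}^{\lg n} 2^i \cdot \frac{n}{2^i} = \sum_{i=0}^{\lg n} n = n(\lg n + 1) = O(n \lg n).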

Master Method. T(n) = aT(n/b) + f(n), with a ≥ 1 and b > 1. Let c = log_b a.
Case 1: f(n) = O(n^(c−ε)) for some ε > 0 → T(n) = Θ(n^c).
Case 2: f(n) = Θ(n^c) → T(n) = Θ(n^c log n).
Case 3: f(n) = Ω(n^(c+ε)) for some ε > 0, and a·f(n/b) ≤ d·f(n) for some d < 1 (regularity) → T(n) = Θ(f(n)).

Master Method: Example. T(n) = 9T(n/3) + n. a = 9, b = 3, c = log_3 9 = 2, n^c = n^2. f(n) = n = O(n^(2−ε)) (take ε = 1), so Case 1 gives T(n) = Θ(n^c) = Θ(n^2).

Master Method: Example. T(n) = T(n/3) + 1. a = 1, b = 3, c = log_3 1 = 0, n^c = 1. f(n) = 1 = Θ(n^c) = Θ(1), so Case 2 gives T(n) = Θ(n^c log n) = Θ(log n).

Master Method: Example. T(n) = 3T(n/4) + n log n. a = 3, b = 4, c = log_4 3 ≈ 0.793, so n^c < n. f(n) = n log n = Ω(n) = Ω(n^(c+ε)) with ε = 1 − c > 0. Regularity: a·f(n/b) = 3·(n/4) log(n/4) ≤ (3/4) n log n = d·f(n) with d = 3/4 < 1. So Case 3 gives T(n) = Θ(f(n)) = Θ(n log n).

Conclusion. Asymptotic bounds are, in fact, very simple. Use the rules of thumb: discard non-dominant terms, discard constants. For recursive algorithms: write down the recurrence relation, then solve it by the master method, by guessing plus proof by induction, or by a recursion tree.