Design and Analysis of Algorithms. Drs. Achmad Ridok, M.Kom; Fitra A. Bachtiar, S.T., M.Eng; Imam Cholissodin, S.Si., M.Kom; Aryo Pinandito, MT. Session 04.


Contents: Asymptotic Notation

• Think of n as the number of records we wish to sort with an algorithm that takes f(n) time to run. How long will it take to sort n records?
• What if n is big? We are interested in the range of the function as n gets large.
• Will f stay bounded? Will f grow linearly? Will f grow exponentially?
• Our goal is to find out just how fast f grows with respect to n.

Asymptotic Notation  Misalkan: T(n) = 5n 2 + 6n + 25 T(n) proporsional untuk ordo n 2 untuk data yang sangat besar. 4 Memperkirakan formula untuk run-time Indikasi kinerja algoritma (untuk jumlah data yang sangat besar)

Asymptotic Notation  Indikator efisiensi algoritma bedasar pada OoG pada basic operation suatu algoritma.  Penggunaan notasi sebagai pembanding urutan OoG:  O (big oh)  Ω (big omega)  Ө (big theta)  t(n) : algoritma running time (diindikasikan dengan basic operation count (C(n))  g(n) : simple function to compare the count 5

Classifying Functions by Their Asymptotic Growth Rates (1/2)
• Asymptotic growth rate, asymptotic order, or order of functions: a way of comparing and classifying functions that ignores constant factors and small inputs.
• O(g(n)), Big-Oh of g of n: the asymptotic upper bound.
• Ω(g(n)), Omega of g of n: the asymptotic lower bound.
• Θ(g(n)), Theta of g of n: the asymptotic tight bound.

Example  Example: f(n) = n 2 - 5n  The constant 13 doesn't change as n grows, so it is not crucial. The low order term, -5n, doesn't have much effect on f compared to the quadratic term, n 2. We will show that f(n) =  (n 2 ).  Q: What does it mean to say f(n) =  (g(n)) ?  A: Intuitively, it means that function f is the same order of magnitude as g.

Example (cont.)  Q: What does it mean to say f 1 (n) =  (1)?  A: f 1 (n) =  (1) means after a few n, f 1 is bounded above & below by a constant.  Q: What does it mean to say f 2 (n) =  (n log n)?  A: f 2 (n) =  (n log n) means that after a few n, f 2 is bounded above and below by a constant times n log n. In other words, f 2 is the same order of magnitude as n log n.  More generally, f(n) =  (g(n)) means that f(n) is a member of  (g(n)) where  (g(n)) is a set of functions of the same order of magnitude.

Big-Oh  The O symbol was introduced in 1927 to indicate relative growth of two functions based on asymptotic behavior of the functions now used to classify functions and families of functions

Upper Bound Notation
• We say Insertion Sort's run time is O(n^2). Properly, we should say its run time is in O(n^2). Read O as "Big-O" (you'll also hear it as "order").
• In general, a function f(n) is O(g(n)) if there exist positive constants c and n0 such that f(n) ≤ c·g(n) for all n ≥ n0.
• e.g. if f(n) = 1000n and g(n) = n^2, then with c = 1 and n0 = 1000 we have f(n) ≤ 1·g(n) for all n ≥ n0, and we say that f(n) = O(g(n)).
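The definition above can be spot-checked numerically on a finite range. A minimal sketch, not from the slides (the helper name `holds_big_o` is illustrative); a finite check is evidence, not a proof:

```python
def holds_big_o(f, g, c, n0, n_max=10_000):
    """Check f(n) <= c*g(n) for all n0 <= n <= n_max (finite evidence only)."""
    return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))

# Slide example: f(n) = 1000n, g(n) = n^2 with witnesses c = 1, n0 = 1000.
print(holds_big_o(lambda n: 1000 * n, lambda n: n * n, c=1, n0=1000))  # True
```

Note that the same pair fails with n0 = 1, which is why the definition only demands the inequality from some n0 onward.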

Asymptotic Upper Bound
[Figure: f(n) and c·g(n) plotted against n; beyond n0, c·g(n) stays above f(n).]
f(n) ≤ c·g(n) for all n ≥ n0. g(n) is called an asymptotic upper bound of f(n). We write f(n) = O(g(n)); it reads "f(n) is big-oh of g(n)."

Big-Oh, the Asymptotic Upper Bound
• This is the most popular notation for run time, since we're usually looking for worst-case time.
• If the running time of algorithm X is O(n^2), then for any input the running time of algorithm X is at most a quadratic function, for sufficiently large n.
• e.g. 2n^2 = O(n^3), from the definition, using c = 1 and n0 = 2. O(n^2) is tighter than O(n^3).

Example 1
[Figure: plots of f(n) and g(n).] For all n > 6, g(n) > 1·f(n). Thus the function f is in the big-O of g; that is, f(n) is in O(g(n)).

Example 2
[Figure: plots of f(n) and g(n).] There exists an n0 = 5 such that for all n > n0, f(n) < 1·g(n). Thus, f(n) is in O(g(n)).

Example 3
[Figure: plots of f(n), h(n), and 3.5·h(n).] There exist n0 = 5 and c = 3.5 such that for all n > n0, f(n) < c·h(n). Thus, f(n) is in O(h(n)).

Example of Asymptotic Upper Bound
f(n) = 3n^2 + 5, g(n) = n^2.
4g(n) = 4n^2 = 3n^2 + n^2 ≥ 3n^2 + 5 = f(n) for all n ≥ 3 (since n^2 ≥ 9 > 5).
Thus, f(n) = O(g(n)) with c = 4 and n0 = 3.

Exercise on O-notation
Show that 3n^2 + 2n + 5 = O(n^2):
10n^2 = 3n^2 + 2n^2 + 5n^2 ≥ 3n^2 + 2n + 5 for all n ≥ 1, so c = 10 and n0 = 1 work.
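The witness pair above can be verified numerically over a finite range; a quick sanity check (finite evidence only, not a proof):

```python
# Check the exercise's inequality 10n^2 >= 3n^2 + 2n + 5 for n = 1..100000,
# i.e. the Big-O witnesses c = 10, n0 = 1.
assert all(10 * n * n >= 3 * n * n + 2 * n + 5 for n in range(1, 100_001))
print("c = 10, n0 = 1 holds on the tested range")
```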

Assignment 1: O-notation
• f1(n) = 10n + 25n^2 → O(n^2)
• f2(n) = 20 n log n + 5n → O(n log n)
• f3(n) = 12 n log n + n^2 → O(n^2)
• f4(n) = n^(1/2) + 3 n log n → O(n log n)

Classification of Functions: Big O (1/2)
• A function f(n) is said to be of at most logarithmic growth if f(n) = O(log n).
• A function f(n) is said to be of at most quadratic growth if f(n) = O(n^2).
• A function f(n) is said to be of at most polynomial growth if f(n) = O(n^k), for some natural number k > 1.
• A function f(n) is said to be of at most exponential growth if there is a constant c > 1 such that f(n) = O(c^n).
• A function f(n) is said to be of at most factorial growth if f(n) = O(n!).

Classification of Functions: Big O (2/2)
• A function f(n) is said to have constant running time if the size of the input n has no effect on the running time of the algorithm (e.g., assignment of a value to a variable). The equation for this algorithm is f(n) = c.
• Other logarithmic classifications: f(n) = O(n log n) and f(n) = O(log log n).

Lower Bound Notation
• We say Insertion Sort's run time is Ω(n).
• In general, a function f(n) is Ω(g(n)) if there exist positive constants c and n0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0.
• Proof sketch: suppose the run time is a·n + b. Assume a and b are positive (what if b is negative?). Then a·n ≤ a·n + b for all n, so the run time is Ω(n).

Big  Asymptotic Lower Bound f(n)f(n) c g(n) f(n)  c g(n) for all n  n 0 g(n) is called an asymptotic lower bound of f(n). We write f(n)=  (g(n)) It reads f(n) is omega of g(n). n0n0

Example of Asymptotic Lower Bound
f(n) = n^2/2 - 7, g(n) = n^2, c·g(n) = n^2/4.
g(n)/4 = n^2/4 = n^2/2 - n^2/4 ≤ n^2/2 - 9 for all n ≥ 6 (since n^2/4 ≥ 9), and n^2/2 - 9 < n^2/2 - 7 = f(n).
Thus, f(n) = Ω(g(n)) with c = 1/4 and n0 = 6.

Example: Big Omega
Example: n^(1/2) = Ω(log n). Use the definition with c = 1 and n0 = 16. Let n ≥ 16: n^(1/2) ≥ 1·log n if and only if n ≥ (log n)^2, by squaring both sides, and this holds from n = 16 on. This is an example of polynomial vs. log growth.
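The claim can be spot-checked numerically; a small sketch, assuming log here means log base 2 (changing the base only rescales the constant c):

```python
import math

# Finite check of n^(1/2) >= log2(n) for all n >= 16 (witnesses c = 1, n0 = 16).
# Equality holds exactly at n = 16: sqrt(16) = log2(16) = 4.
assert all(math.sqrt(n) >= math.log2(n) for n in range(16, 1_000_000))
print("sqrt(n) >= log2(n) holds on the tested range")
```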

Big Theta Notation
• Definition: two functions f and g are said to be of equal growth, f = Θ(g), if and only if both f = O(g) and g = O(f).
• Definition: f(n) = Θ(g(n)) means there exist positive constants c1, c2, and n0 such that c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0.
• If f(n) = O(g(n)) and f(n) = Ω(g(n)), then f(n) = Θ(g(n)) (e.g. f(n) = n^2 and g(n) = 2n^2).
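The two-sided "sandwich" definition can also be checked numerically on a finite range. A minimal sketch (the helper name `holds_theta` is illustrative, not from the slides):

```python
def holds_theta(f, g, c1, c2, n0, n_max=10_000):
    """Check c1*g(n) <= f(n) <= c2*g(n) for n0 <= n <= n_max (finite evidence)."""
    return all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, n_max + 1))

# Slide example: f(n) = n^2 and g(n) = 2n^2 are of equal growth;
# c1 = 0.25 and c2 = 1 sandwich f between c1*g and c2*g from n0 = 1 on.
print(holds_theta(lambda n: n * n, lambda n: 2 * n * n, c1=0.25, c2=1, n0=1))  # True
```

By contrast, no constants can sandwich n between multiples of n^2, which is why n is not Θ(n^2).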

Theta, the Asymptotic Tight Bound
• Theta means that f is bounded above and below by g; Big-Theta implies the "best fit".
• f(n) does not have to be linear itself in order to be of linear growth; it just has to lie between two linear functions.

Asymptotically Tight Bound
[Figure: f(n) sandwiched between c1·g(n) and c2·g(n) for n ≥ n0.]
c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0, i.e. f(n) = O(g(n)) and f(n) = Ω(g(n)). g(n) is called an asymptotically tight bound of f(n). We write f(n) = Θ(g(n)); it reads "f(n) is theta of g(n)."

Other Asymptotic Notations
• A function f(n) is o(g(n)) if for every positive constant c there exists an n0 such that f(n) < c·g(n) for all n ≥ n0.
• A function f(n) is ω(g(n)) if for every positive constant c there exists an n0 such that c·g(n) < f(n) for all n ≥ n0.
• Intuitively: o() is like <, O() is like ≤, ω() is like >, Ω() is like ≥, Θ() is like =.

Examples
1. 2n^3 + 3n^2 + n = 2n^3 + 3n^2 + O(n) = 2n^3 + O(n^2 + n) = 2n^3 + O(n^2) = O(n^3) = O(n^4)
2. 2n^3 + 3n^2 + n = 2n^3 + 3n^2 + O(n) = 2n^3 + Θ(n^2 + n) = 2n^3 + Θ(n^2) = Θ(n^3)

Assignment 2: Examples (cont.)
3. Suppose a program P is O(n^3) and a program Q is O(3^n), and that currently both can solve problems of size 50 in 1 hour. If the programs are run on another system that executes exactly 729 times as fast as the original system, what size problems will they be able to solve?

Example (cont.)
For P: n^3 = 729 · 50^3, so n = 50 · 729^(1/3) = 50 · 9 = 450.
For Q: 3^n = 729 · 3^50, so n = log3(729 · 3^50) = log3(729) + log3(3^50) = 6 + 50 = 56.
Improvement: the problem size increased by a factor of 9 for the n^3 algorithm, but only by a small additive amount (+6) for the exponential algorithm.
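The arithmetic above can be reproduced in a few lines (values taken from the slides; `round` guards against floating-point error in the cube root and logarithm):

```python
import math

speedup, n_old = 729, 50  # 729x faster machine; both programs solve size 50 now

# P runs in n^3: new size satisfies n^3 = 729 * 50^3  =>  n = 50 * 729^(1/3)
n_p = n_old * round(speedup ** (1 / 3))
# Q runs in 3^n: new size satisfies 3^n = 729 * 3^50  =>  n = 50 + log3(729)
n_q = n_old + round(math.log(speedup, 3))
print(n_p, n_q)  # 450 56
```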

More Examples
(a) 0.5n^2 - 5n + 2 = Ω(n^2). Let c = 0.25 and n0 = 25: 0.5n^2 - 5n + 2 ≥ 0.25n^2 for all n ≥ 25.
(b) 0.5n^2 - 5n + 2 = O(n^2). Let c = 0.5 and n0 = 1: 0.5n^2 ≥ 0.5n^2 - 5n + 2 for all n ≥ 1.
(c) 0.5n^2 - 5n + 2 = Θ(n^2), from (a) and (b) above. Use n0 = 25, c1 = 0.25, c2 = 0.5 in the definition.
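The witness constants in (a) and (b) can be verified on a finite range; a quick sketch (finite evidence, not a proof):

```python
# f(n) = 0.5n^2 - 5n + 2, the function from examples (a)-(c)
f = lambda n: 0.5 * n * n - 5 * n + 2

# (a) Omega bound: f(n) >= 0.25 * n^2 for n >= 25  (c = 0.25, n0 = 25)
assert all(f(n) >= 0.25 * n * n for n in range(25, 100_000))
# (b) Big-O bound: f(n) <= 0.5 * n^2 for n >= 1    (c = 0.5, n0 = 1)
assert all(f(n) <= 0.5 * n * n for n in range(1, 100_000))
print("Theta witnesses c1 = 0.25, c2 = 0.5, n0 = 25 hold on the tested range")
```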

More Examples (d) 6  2 n + n 2 = O(2 n ). Let c = 7 and n 0 = 4. Note that 2 n = n 2 for n = 4. Not a tight upper bound, but it's true. (e) 10 n = O(n 4 ). There's nothing wrong with this, but usually we try to get the closest g(n). Better is to use O(n 2 ).

Practical Complexity
[Charts comparing the growth of common running-time functions, shown at increasing scales: t < 250, t < 500, t < 1000, and t < 5000.]
