CSI Growth of Functions


CSI 2101 - Growth of Functions
Problem resolution: Problem -> Algorithm -> Program
Dr-Zaguia-CSI2101-W08

CSI 2101 - Complexity

Having an algorithm for a given problem does not mean that the problem can be solved in practice: the procedure (algorithm) may be so inefficient that it cannot solve the problem within a useful period of time. So what is inefficient? What is the "complexity" of an algorithm? It is the number of steps the algorithm takes to transform the input data into the desired output. Each simple operation (+, -, *, /, =, if, etc.) and each memory access counts as one step. In general this depends on the input, so the complexity of an algorithm is a function of the size of the input (the size of the instance). We denote the complexity of algorithm A by CA(n), where n is the size of the input.

Different notions of complexity

An algorithm's running time can be measured in several ways: worst case, average case, and best case. In general, worst-case complexity is the notion we use to characterize the complexity of algorithms.

Complexity

Algorithm "Good Morning":

For I = 1 to n
    For J = I+1 to n
        ShakeHands(student(I), student(J))

Running time of "Good Morning": Time = (# of handshakes) x (time per handshake) + overhead. We want an expression for T(n), the running time of "Good Morning" on an input of size n. How many handshakes are there? Every pair of students shakes hands exactly once, so there are (n^2 - n)/2 handshakes and

T(n) = s(n^2 - n)/2 + t

where s is the time for one handshake and t is the time for getting organized. But do we always characterize the complexity of algorithms in such detail? What is the most important aspect that we care about?
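The double loop above can be sketched directly in code; a minimal sketch (the function name and student representation are illustrative, not from the slides) confirms that the handshake count matches (n^2 - n)/2:

```python
def handshakes(n):
    """Count the handshakes performed by the 'Good Morning' algorithm
    among n students: each pair (I, J) with I < J shakes hands once."""
    count = 0
    for i in range(1, n + 1):
        for j in range(i + 1, n + 1):
            count += 1  # stands in for ShakeHands(student(i), student(j))
    return count

# The count equals (n^2 - n) / 2 for every n, matching T(n) up to constants.
print(handshakes(5))   # 10
print(handshakes(10))  # 45
```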

Growth of Functions

Two algorithms A1 and A2 with complexities CA1(n) = 0.5 n^2 and CA2(n) = 5n. Which one is better? Which has the better complexity? Note that CA2(n) = 5n >= CA1(n) = 0.5 n^2 for n <= 10.

Growth of Functions

Two algorithms A1 and A2: CA1(n) = 0.5 n^2, CA2(n) = 5n. Which one is better? Which has the better complexity? The main question is how the complexity behaves asymptotically, i.e., as the problem size tends to infinity!
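A quick numeric sketch of the two cost functions makes the asymptotic point concrete: A1 is cheaper for small inputs, the curves cross at n = 10, and beyond that A2 wins by an ever-growing margin.

```python
def c_a1(n):
    """Cost of algorithm A1: 0.5 * n^2."""
    return 0.5 * n * n

def c_a2(n):
    """Cost of algorithm A2: 5 * n."""
    return 5 * n

# Below n = 10, A1 is cheaper; at n = 10 they tie; beyond, A2 is cheaper.
for n in (5, 10, 100, 1000):
    print(n, c_a1(n), c_a2(n))
```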

Growth of Functions

In general we only worry about growth rates because:
- Our main objective is to analyze the cost of algorithms asymptotically (reasonable in part because computers get faster every year).
- Obtaining the exact cost of an algorithm is often an obstacle in itself: some algorithms are quite complicated to analyze precisely.
- When analyzing an algorithm we are not that interested in the exact time it takes to run; often we only want to compare two algorithms for the same problem, and what makes one algorithm more desirable than another is its growth rate relative to the other algorithm's growth rate.

Growth of Functions

Algorithm analysis is concerned with:
- the type of function that describes the running time (we ignore constant factors, since different machines have different speeds per cycle);
- large values of n.

Growth of Functions

Time to run an algorithm of the given complexity, assuming 10^6 operations per second:

Size n:  10        20       30         40              50                60
n        .00001s   .00002s  .00003s    .00004s         .00005s           .00006s
n^2      .0001s    .0004s   .0009s     .0016s          .0025s            .0036s
n^3      .001s     .008s    .027s      .064s           .125s             .216s
n^5      .1s       3.2s     24.3s      1.7 min         5.2 min           13 min
2^n      .001s     1.0s     17.9 min   12.7 days       35.7 years        366 centuries
3^n      .059s     58 min   6.5 years  3855 centuries  2x10^8 centuries  1.3x10^13 centuries
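The table entries can be reproduced with a one-line conversion; a minimal sketch (the function name and the 10^6 ops/sec rate follow the slide's assumption):

```python
def runtime_seconds(ops, rate=10**6):
    """Seconds needed to execute `ops` basic operations at `rate` ops/second."""
    return ops / rate

# Reproduce a couple of table entries:
print(runtime_seconds(50**2))       # n^2 at n = 50: 0.0025 s
print(runtime_seconds(2**30) / 60)  # 2^n at n = 30: roughly 17.9 minutes
```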

Growth of Functions

[Figure: f(n) and c.g(n) plotted against n; beyond n0, f(n) stays below c.g(n).]

f(n) is O(g(n)). We say "f(n) is big O of g(n)". There exist two constants c and n0 such that f(n) <= c.g(n) for all n >= n0.

Growth of Functions

How to prove that 5x + 100 is O(x/2)? We need constants c and n0 such that for all x > n0, 5x + 100 <= c * x/2. Try c = 11 and n0 = 200: for all x > 200, 5x + 100 <= 11 * x/2. (If x > 200 then x/2 > 100. Thus 11 * x/2 = 5x + x/2 > 5x + 100.)
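A big-O witness (c, n0) can be spot-checked numerically over a finite range; a minimal sketch (the helper name and the sampling bound are illustrative, and a finite check is evidence, not a proof):

```python
def check_big_o_witness(f, g, c, n0, upto=10**4):
    """Spot-check that f(x) <= c * g(x) for all integers n0 < x <= upto.
    A finite check, not a proof, but it quickly catches bad witnesses."""
    return all(f(x) <= c * g(x) for x in range(n0 + 1, upto + 1))

# Witness c = 11, n0 = 200 for "5x + 100 is O(x/2)":
print(check_big_o_witness(lambda x: 5 * x + 100, lambda x: x / 2, 11, 200))
# c = 10 fails: 10 * x/2 = 5x is always below 5x + 100.
print(check_big_o_witness(lambda x: 5 * x + 100, lambda x: x / 2, 10, 200))
```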

Growth of Functions

x^2 + 2x + 1 is O(x^2): the witnesses C = 4, k = 1 work (x^2 + 2x + 1 <= 4x^2 for all x >= 1), and so do C = 3, k = 2 (here k plays the role of n0).

Growth of Functions

[Plot: x^2 vs. (x^2 + x) for x <= 20.]

Growth of Functions

[Plot: x^2 vs. (x^2 + x) over a larger range.] The two curves grow at the same rate: (x^2 + x) is O(x^2).

Growth of Functions

Very useful: if f(n) = a_k n^k + a_{k-1} n^{k-1} + ... + a_1 n + a_0, then f(n) is O(n^k), since

f(n) <= ( |a_k| + |a_{k-1}|/n + ... + |a_0|/n^k ) n^k <= ( |a_k| + |a_{k-1}| + ... + |a_0| ) n^k for every n >= 1.

Guidelines:
- In general, only the largest term in a sum matters: a_0 x^n + a_1 x^{n-1} + ... + a_{n-1} x + a_n is O(x^n).
- n dominates lg n: n^5 lg n is O(n^6).

Growth of Functions

List of common functions in increasing O() order:
1 (constant time), n (linear time), n lg n, n^2 (quadratic time), n^3, ..., 2^n (exponential time), n!

Growth of Functions

If we have f(n) is O(g(n)), then we may also say that g(n) is Omega(f(n)) ("g(n) is omega of f(n)"). Formally: g(n) is Omega(f(n)) if and only if there exist constants c and n0 such that g(n) >= c.f(n) for all n >= n0. A function g(n) is Theta(f(n)) ("g(n) is theta of f(n)") if g(n) is O(f(n)) and g(n) is Omega(f(n)); then f(n) and g(n) have the same growth rate. Intuitively:
- when we write f is O(g), it is like f <= g;
- when we write f is Omega(g), it is like f >= g;
- when we write f is Theta(g), it is like f = g.

Growth of Functions

[Figure: c1.g(n) and c2.g(n) sandwiching f(n) for n >= n0.]

f(n) is Theta(g(n)): there exist three constants c1, c2 and n0 such that c1.g(n) <= f(n) <= c2.g(n) for all n >= n0.
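The sandwich condition can be spot-checked the same way as a big-O witness; a minimal sketch (helper name and sampling bound are illustrative, and a finite check is not a proof), using the earlier example x^2 + 2x + 1:

```python
def check_theta_witness(f, g, c1, c2, n0, upto=10**4):
    """Spot-check that c1*g(n) <= f(n) <= c2*g(n) for all n0 <= n <= upto."""
    return all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, upto + 1))

# n^2 + 2n + 1 is Theta(n^2): witnesses c1 = 1, c2 = 4, n0 = 1.
print(check_theta_witness(lambda n: n * n + 2 * n + 1,
                          lambda n: n * n, 1, 4, 1))
```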

Growth of Functions

Use the limit lim_{n -> infinity} g(n)/f(n) to compare the order of growth of two functions:
- if the limit is 0, then g(n) is O(f(n)) (but g(n) is not Theta(f(n)));
- if the limit is c > 0, then g(n) is Theta(f(n));
- if the limit is infinity, then g(n) is Omega(f(n)).
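The limit test can be approximated by evaluating the ratio at a large n; a rough numerical sketch (the helper name and the choice of n = 10^8 are illustrative, and a single sample only suggests the limit):

```python
def limit_ratio(g, f, n=10**8):
    """Approximate lim_{n -> infinity} g(n)/f(n) by sampling at one large n."""
    return g(n) / f(n)

# g(n) = n, f(n) = n^2: ratio tends to 0, so n is O(n^2) but not Theta(n^2).
print(limit_ratio(lambda n: n, lambda n: n * n))
# g(n) = 3n^2 + n, f(n) = n^2: ratio tends to 3 > 0, so it is Theta(n^2).
print(limit_ratio(lambda n: 3 * n * n + n, lambda n: n * n))
```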

Estimating Functions

Estimate the sum of the first n positive integers: 1 + 2 + ... + n = n(n+1)/2 = n^2/2 + n/2, which is Theta(n^2).

What about f(n) = n! and log n!? Since n! = 1*2* ... *n <= n*n* ... *n = n^n, n! is O(n^n). Taking logarithms, log n! <= log n^n = n log n, thus log n! is O(n log n).
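Both estimates can be checked directly for concrete n; a minimal sketch (the choice n = 20 is arbitrary):

```python
import math

n = 20
# The sum of the first n positive integers matches n(n+1)/2, which is Theta(n^2).
assert sum(range(1, n + 1)) == n * (n + 1) // 2
# n! <= n^n, hence n! is O(n^n).
assert math.factorial(n) <= n**n
# Taking logs: log n! <= n log n, hence log n! is O(n log n).
assert math.log(math.factorial(n)) <= n * math.log(n)
print("all bounds hold for n =", n)  # all bounds hold for n = 20
```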

[Plot of the common growth functions. Note: log scale on the y axis.]