CSCI 3160 Design and Analysis of Algorithms Tutorial 1

Presentation transcript:

CSCI 3160 Design and Analysis of Algorithms Tutorial 1 Chengyu Lin

About me
Name: Chengyu Lin
Email: cylin@cse.cuhk.edu.hk
Office: SHB 117
Office hour: Friday 14:00 – 16:00
You can always send me an email to make an appointment.

Asymptotic Notations

Notation   Name         Meaning
O(g(n))    Big O        Asymptotic upper bound
Ω(g(n))    Big Omega    Asymptotic lower bound
Θ(g(n))    Big Theta    Asymptotic tight bound

Asymptotic Notations
The time (or space) complexity of an algorithm usually depends on the input size, where the input could be the vertices/edges of a graph, a sequence of integers, and so on.
Usually the running time of an algorithm grows with the input size.
Usually the running time is written as a function t(n), where n is the input size.

Asymptotic Notations
Suppose we want to analyze the running time of a program, e.g.:

for (int i = 1; i <= n; ++i) {
  for (int j = i; j <= n; ++j) {
    for (int r = 1; r <= n; ++r) {
      for (int s = r; s <= n; ++s) {
        puts("hello");
      }
    }
  }
}

This takes t(n) = n(n+1)/2 · n(n+1)/2 = n^2(n+1)^2/4 steps. Calculating the exact running time is tedious.
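To connect this with the next slide (a remark of ours, not on the original slide): expanding the exact count gives

n^2(n+1)^2/4 = (n^4 + 2n^3 + n^2)/4,

whose dominating term is n^4/4, so the exact count is indeed O(n^4).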

Asymptotic Notations
Asymptotic notation simplifies the calculation:

for (int i = 1; i <= n; ++i) {
  for (int j = i; j <= n; ++j) {
    for (int r = 1; r <= n; ++r) {
      for (int s = r; s <= n; ++s) {
        puts("hello");
      }
    }
  }
}

This takes t(n) = O(n^2) · O(n^2) = O(n^4) steps, and yet it still captures the behavior of t(n) very well when n is "large" enough.

Big O notation
We say that t(n) = O(g(n)) if there exists a constant c > 0 such that t(n) ≤ c · g(n) for every sufficiently large n.
Intuitively, if t(n) is the running time, this tells us that the running time cannot be worse than g(n) by more than a constant multiplicative factor.
Usually, g(n) looks simpler than t(n) (e.g. g(n) = n^2, n!, log n, …).
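As a worked illustration of how to find such a constant (ours, not from the original slides): to show 5n^2 + 3n + 7 = O(n^2), bound every term by the dominating one:

5n^2 + 3n + 7 ≤ 5n^2 + 3n^2 + 7n^2 = 15n^2 for every n ≥ 1,

so c = 15 witnesses the bound. Any larger c works too; the definition only asks for one.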

Big O notation
We say that t(n) = O(g(n)) if there exists a constant c > 0 such that t(n) ≤ c · g(n) for every sufficiently large n.
The inequality only needs to hold for large n.
e.g. n^2 = O(2^n): n^2 ≤ 2^n is not true when n = 3 (since 9 > 8), but it holds for every n ≥ 4, so it's okay.

Big O notation
We say that t(n) = O(g(n)) if there exists a constant c > 0 such that t(n) ≤ c · g(n) for every sufficiently large n.
We can multiply g(n) by a positive constant.
e.g. 4n^2 = O(n^2): 4n^2 ≤ n^2 is not true for any n ≥ 1, but 4n^2 ≤ 4 · n^2 holds (take c = 4), so it's okay.

Big O notation
We say that t(n) = O(g(n)) if there exists a constant c > 0 such that t(n) ≤ c · g(n) for every sufficiently large n.
g(n) can be any upper bound.
e.g. n^2 = O(n^2); it is also true to say n^2 = O(n^100) or n^2 = O(2^n).

Convention
Actually, O(g(n)) is a set:
O(g(n)) = { t | ∃c > 0 such that t(n) ≤ c · g(n) for every sufficiently large n }
It contains all the functions that are upper bounded by g(n).
When we say t(n) = O(g(n)), we actually mean t(n) ∈ O(g(n)).
When we say O(f(n)) = O(g(n)), we actually mean O(f(n)) ⊆ O(g(n)), i.e. if t(n) = O(f(n)) then t(n) = O(g(n)).
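For instance (our example): O(n^2) contains n, 5n^2 + 3 and n log n, and O(n) ⊆ O(n^2), so under this convention it is correct to write O(n) = O(n^2) but not O(n^2) = O(n).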

Comparing functions
Growth rates increase from left to right asymptotically:

O(1) < log* n < loglog n < log n < log^2 n < … < n^(1/3) < n^(1/2) < n < n^2 < n^3 < … |← polynomial →| < 2^n < 2^(n^2) < … |← exponential →| < 2^(2^n) (double exponential) < … < 2^(2^(…^2)) (tower)

i.e. O(1) < O(loglog n) < O(log n) < O(log^2 n) < O(n^(1/3)) < O(n) < O(n^2) < O(2^n) < O(n!) < O(n^n)
Note that log^2 n means (log n)^2.
Logarithms are a lot smaller than polynomials: e.g. 2^(k log n) = n^k.
Polynomials are a lot smaller than exponentials: e.g. log(2^n) = n.

Comparing functions
(Same growth chart as on the previous slide.)
When we compare two functions asymptotically, compare the dominating terms on both sides first. If they are of the same order, compare the next dominating terms, and so on.
e.g. O(2^n · n^2 · log n) < O(2.1^n · n^3 · log^2 n), since 2^n < 2.1^n and the rest contributes very little when n is large.
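To make the ordering concrete, here is a small self-contained C program (our illustration, not part of the tutorial) that tabulates a polylog, a polynomial, and an exponential side by side; 1.001^n starts far behind but overtakes n^3 at around n ≈ 31000:

#include <math.h>
#include <stdio.h>

/* Compile with: cc growth.c -lm */
int main(void) {
    const double ns[] = {10, 100, 1000, 10000, 100000};
    printf("%8s %12s %12s %14s\n", "n", "log^2 n", "n^3", "1.001^n");
    for (int i = 0; i < 5; ++i) {
        double n = ns[i];
        double polylog = pow(log2(n), 2); /* polylogarithmic            */
        double poly    = pow(n, 3);       /* polynomial                 */
        double expo    = pow(1.001, n);   /* exponential: tiny at first */
        printf("%8.0f %12.3g %12.3g %14.3g\n", n, polylog, poly, expo);
    }
    return 0;
}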

Examples
1 = O(log n)? Yes
n^3 = O(n^2)? No
log^1000 n = O(n)? Yes
n^3 = O(1.001^n)? Yes (an exponential with even a tiny base eventually beats any polynomial)
O((log n)^1999) = O(n^0.001)? Yes (see the comparison chart above)
O(loglog n) = O(log^3 n)? Yes
O(2^n · n^4 · log n) = O(2^n · n^4 · log^4 n)? Yes

Properties
Property 1: O(f(n)) + O(g(n)) = O(f(n) + g(n))
Property 2: O(max(f(n), g(n))) = O(f(n) + g(n))

for (int i = 1; i <= n; ++i) {
  for (int j = i; j <= n; ++j) {
    puts("CSCI3160");
  }
}
for (int i = 1; i <= n; ++i) {
  puts("CSCI3160");
}

t(n) = O(n^2) + O(n) = O(n^2 + n) (by property 1)
t(n) = O(n^2 + n) = O(n^2) (by property 2)

Properties
Property 3: O(f(n)) × O(g(n)) = O(f(n) × g(n))

for (int i = 1; i <= n; ++i) {
  for (int j = i; j <= n; ++j) {
    for (int k = j; k <= n; ++k) {
      puts("hello");
    }
  }
}

e.g. t(n) = O(n) × O(n) × O(n) = O(n^3) (by property 3)
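As an empirical sanity check (our own illustration, not part of the tutorial), the program below counts how often the loop body above runs; the exact count is n(n+1)(n+2)/6 ≈ n^3/6, so the ratio count/n^3 approaches 1/6, consistent with O(n^3) and in fact Θ(n^3):

#include <stdio.h>

int main(void) {
    /* Count how many times the loop body of the triple loop runs,
       and compare the count against n^3. */
    for (long n = 10; n <= 1000; n *= 10) {
        long long count = 0;
        for (long i = 1; i <= n; ++i)
            for (long j = i; j <= n; ++j)
                for (long k = j; k <= n; ++k)
                    ++count;              /* stands in for puts("hello") */
        double ratio = (double)count / ((double)n * n * n);
        printf("n=%5ld  count=%12lld  count/n^3=%.4f\n", n, count, ratio);
    }
    return 0;
}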

Big Omega notation
We say that t(n) = Ω(g(n)) if there exists a constant c > 0 such that t(n) ≥ c · g(n) for every sufficiently large n.
Intuitively, if t(n) is the running time, this tells us that the running time cannot be better than g(n) by more than a constant multiplicative factor.
e.g. every comparison sort algorithm runs in Ω(n log n) time in the worst case.
Again, g(n) usually looks simple.
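A worked illustration (ours, not from the original slides): to show n^2/2 - 3n = Ω(n^2), note that

n^2/2 - 3n ≥ n^2/4 if and only if n^2/4 ≥ 3n, i.e. n ≥ 12,

so c = 1/4 witnesses the bound for every n ≥ 12.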

Big Omega notation
We say that t(n) = Ω(g(n)) if there exists a constant c > 0 such that t(n) ≥ c · g(n) for every sufficiently large n.
The inequality only needs to hold for large n.
e.g. 2^n = Ω(n^2): 2^n ≥ n^2 is not true when n = 3 (since 8 < 9), but it holds for every n ≥ 4, so it's okay.

Big Omega notation
We say that t(n) = Ω(g(n)) if there exists a constant c > 0 such that t(n) ≥ c · g(n) for every sufficiently large n.
We can multiply g(n) by a positive constant.
e.g. 0.1n^2 = Ω(n^2): 0.1n^2 ≥ n^2 is not true for any n ≥ 1, but 0.1n^2 ≥ 0.1 · n^2 holds (take c = 0.1), so it's okay.

Big Omega notation
We say that t(n) = Ω(g(n)) if there exists a constant c > 0 such that t(n) ≥ c · g(n) for every sufficiently large n.
g(n) can be any lower bound.
e.g. n^2 = Ω(n^2); it is also true to say n^2 = Ω(n) or n^2 = Ω(log n).

Properties
Ω(f(n)) + Ω(g(n)) = Ω(f(n) + g(n))
Ω(max(f(n), g(n))) = Ω(f(n) + g(n))
Ω(f(n)) × Ω(g(n)) = Ω(f(n) × g(n))

Examples
1 = Ω(log n)? No; 1 = O(log n)? Yes
n^3 = Ω(n^2)? Yes; n^3 = O(n^2)? No
log^1000 n = Ω(n)? No; log^1000 n = O(n)? Yes
n^3 = Ω(1.001^n)? No; n^3 = O(1.001^n)? Yes
1.002^n = Ω(1.001^n · n^2)? Yes; 1.002^n = O(1.001^n · n^2)? No
If t(n) = O(g(n)), does it follow that t(n) ≠ Ω(g(n))? If t(n) = Ω(g(n)), does it follow that t(n) ≠ O(g(n))?
No! e.g. n^3 = O(n^3) and n^3 = Ω(n^3).

Big Theta notation
Big O gives an asymptotic upper bound on t(n); big Omega gives an asymptotic lower bound on t(n).
There could be a gap between the two bounds, i.e. the upper and lower bounds could be very loose, e.g. n^2 = O(2^n) and n^2 = Ω(1).
When the upper bound matches the lower bound, we say the bound is tight (asymptotically).

Big Theta notation
We say that t(n) = Θ(g(n)) if t(n) = O(g(n)) and t(n) = Ω(g(n)).
Combining the definitions of O(g(n)) and Ω(g(n)): there exist constants c1, c2 > 0 such that c1 · g(n) ≤ t(n) ≤ c2 · g(n) for every sufficiently large n.
Intuitively, if t(n) is the running time, this tells us that the running time grows at about the same rate as g(n).

Examples
t(n) = n^3 - 4n^2 + log n + 1
t(n) = O(n^3) and t(n) = Ω(n^3), so t(n) = Θ(n^3).
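Concretely (our arithmetic, not on the original slide), explicit constants witnessing both bounds:

n^3 - 4n^2 + log n + 1 ≤ n^3 + log n + 1 ≤ 2n^3 for every n ≥ 1, so c2 = 2 works;
n^3 - 4n^2 + log n + 1 ≥ n^3 - 4n^2 ≥ n^3/2 for every n ≥ 8, so c1 = 1/2 works.

Hence (1/2) · n^3 ≤ t(n) ≤ 2 · n^3 for all n ≥ 8.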

Properties
Although O and Ω are not mutually exclusive, we do have a duality: t(n) = O(g(n)) if and only if g(n) = Ω(t(n)).
For big Theta, we have t(n) = Θ(g(n)) if and only if g(n) = Θ(t(n)).
Θ(f(n)) + Θ(g(n)) = Θ(f(n) + g(n))
Θ(max(f(n), g(n))) = Θ(f(n) + g(n))
Θ(f(n)) × Θ(g(n)) = Θ(f(n) × g(n))
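The duality is a one-line check (ours): if t(n) ≤ c · g(n) for all large n with c > 0, then g(n) ≥ (1/c) · t(n) for the same n, and 1/c > 0; the converse direction is symmetric.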

Examples
Θ((log n)^1999) = Θ(n^0.001)? No
Θ(loglog n) = Θ(log^3 n)? No
Θ(2^n + n^4 + log n) = Θ(2^n + n^4 + log^4 n)? Yes
Θ(2^n + n^4 + log n) = Θ(max{2^n, n^4, log n}) = Θ(2^n), and similarly Θ(2^n + n^4 + log^4 n) = Θ(2^n).
Since t(n) = Θ(g(n)) if and only if g(n) = Θ(t(n)), we get Θ(2^n + n^4 + log n) = Θ(2^n) = Θ(2^n + n^4 + log^4 n).

Asymptotic Notations
O(g(n)) (big O), Ω(g(n)) (big Omega), Θ(g(n)) (big Theta)
In this course, we will use big O notation a lot, so it is important to get familiar with it.
There are two other asymptotic notations: o(g(n)) (small o) and ω(g(n)) (small omega).
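For reference (the standard definitions, not spelled out on the slide): t(n) = o(g(n)) means that for every constant c > 0, t(n) ≤ c · g(n) for all sufficiently large n, i.e. t grows strictly slower than g; ω(g(n)) is the symmetric strict lower-bound version.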

End
Questions?