CSC263 Tutorial 1 Big-O, Ω, ϴ Trevor Brown

Big-O and Big-Ω

Many students think Big-O is "worst case" and Big-Ω is "best case." They are WRONG! Both O(x) and Ω(x) describe running time for the worst possible input!

What do O(x) and Ω(x) mean?
- Algorithm is O(x): the algorithm takes at most c*x steps to run on the worst possible input.
- Algorithm is Ω(x): the algorithm takes at least c*x steps to run on the worst possible input.

How do we show an algorithm is O(x) or Ω(x)?
- To show O(x): show that for every input, the algorithm takes at most c*x steps.
- To show Ω(x): show that there is an input that makes the algorithm take at least c*x steps.
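To make the "worst possible input" point concrete, here is a small sketch (not from the tutorial; the toy routine and its inputs are made up for illustration). The routine finishes quickly on some inputs and slowly on others, but O and Ω statements are both about the slow ones:

```python
def steps_taken(A):
    """Toy algorithm: scan A, but stop early at the first 0.
    Returns the number of loop iterations performed."""
    count = 0
    for v in A:
        count += 1
        if v == 0:
            break
    return count

n = 8
fast_input = [0] * n   # finishes after 1 step
slow_input = [1] * n   # must scan all n elements

# Both O(n) and Omega(n) describe the WORST of these, not the best:
worst = max(steps_taken(fast_input), steps_taken(slow_input))
print(worst)  # 8
```

The existence of `fast_input` does not make the algorithm Ω(1) in the sense the slides use: the bounds are measured on the input that maximizes the running time.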

Analyzing algorithms using O, Ω, ϴ First, we analyze some easy algorithms. We can easily find upper and lower bounds on the running times of these algorithms by using basic arithmetic.

Example 1

for i = 1..100
    print *

Running time: 100*c, which is ϴ(1) (why?). The loop runs a constant number of times, independent of the input, so it is O(1) and Ω(1) => ϴ(1).
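A quick sanity check of this loop in Python (a sketch; the parameter x stands for an input size the loop never looks at):

```python
def example1(x):
    steps = 0
    for i in range(100):  # constant bound: does not depend on x
        steps += 1        # stands in for "print *"
    return steps

# The step count is identical for every input size, hence Theta(1):
print(example1(5), example1(5000))  # 100 100
```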

Example 2

for i = 1..x
    print *

Running time: x*c, so O(x) and Ω(x) => ϴ(x). In fact, this is also Ω(1) and O(x^2), but those are very weak statements.
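The same loop as a step counter (a sketch mirroring the pseudocode above):

```python
def example2(x):
    steps = 0
    for i in range(1, x + 1):
        steps += 1  # stands in for "print *"
    return steps

# The count grows linearly with x:
print(example2(10), example2(1000))  # 10 1000
```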

Example 3

for i = 1..x
    for j = 1..x
        print *

Each of the x iterations of the outer loop runs the inner loop x times, so the running time is x*x*c: O(x^2) and Ω(x^2) => ϴ(x^2).
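Counting steps of the nested loop confirms the x*x total (a sketch mirroring the pseudocode):

```python
def example3(x):
    steps = 0
    for i in range(1, x + 1):
        for j in range(1, x + 1):
            steps += 1  # stands in for "print *"
    return steps

print(example3(7))  # 49, i.e. 7**2
```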

Example 4a

for i = 1..x
    for j = 1..i
        print *

Big-O: i is always ≤ x, so j always iterates up to at most x, so this takes at most x*x steps, which is O(x^2).

Big-Ω: When i=1, the loop over j performs "print *" once. When i=2, the loop over j performs "print *" twice. [...] So, "print *" is performed 1+2+...+x times. Easy summation formula: 1+2+...+x = x(x+1)/2. And x(x+1)/2 = x^2/2 + x/2 ≥ (1/2) x^2, which is Ω(x^2).
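A step counter checks both the summation formula and the lower bound (a sketch mirroring the pseudocode):

```python
def example4(x):
    steps = 0
    for i in range(1, x + 1):
        for j in range(1, i + 1):
            steps += 1  # stands in for "print *"
    return steps

x = 20
# Matches 1+2+...+x = x(x+1)/2, and that is at least x^2/2:
print(example4(x), x * (x + 1) // 2)  # 210 210
```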

Example 4b

for i = 1..x
    for j = 1..i
        print *

Big-Ω, a second way:
- Useful trick: consider only the iterations of the first loop for which i ≥ x/2.
- In these iterations, the second loop iterates from j=1 to at least j=x/2.
- Therefore, the number of steps performed in these iterations alone is at least (x/2)*(x/2) = (1/4) x^2, which is Ω(x^2).
- The time to perform ALL iterations is even more than this, so of course the whole algorithm must take Ω(x^2) time.
- Therefore, this function is ϴ(x^2).

In other words, the running time is at least that of the smaller loop nest:

for i = x/2..x
    for j = 1..x/2
        print *
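The trick can be checked numerically (a sketch; the two counters mirror the full and restricted loop nests above):

```python
def full_count(x):
    """Steps of the full nest: i = 1..x, j = 1..i."""
    return sum(1 for i in range(1, x + 1) for j in range(1, i + 1))

def restricted_count(x):
    """Steps of the restricted nest: i = x/2..x, j = 1..x/2."""
    return sum(1 for i in range(x // 2, x + 1) for j in range(1, x // 2 + 1))

x = 40
# The restricted nest alone already does at least x^2/4 steps,
# and the full nest does at least as much:
print(restricted_count(x) >= (x * x) // 4)   # True
print(full_count(x) >= restricted_count(x))  # True
```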

Example 5

for i = 1..x
    for j = 1..i
        for k = 1..j
            print *

Big-O: i, j, k ≤ x, so we know the total number of steps is ≤ x*x*x = O(x^3).

Big-Ω:
- Consider only the iterations of the first loop from x/2 up to x.
- For these values of i, the second loop always iterates up to at least x/2.
- Consider only the iterations of the second loop from x/4 up to x/2.
- For these values of j, the third loop always iterates up to at least x/4.
- For these values of i, j and k, the function performs "print *" at least (x/2) * (x/4) * (x/4) = x^3/32 times, which is Ω(x^3).
- Therefore, this function is ϴ(x^3).

After all the restrictions, the loop nest we are lower-bounding by is:

for i = x/2..x
    for j = x/4..x/2
        for k = 1..x/4
            print *
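A step counter confirms that the true count sits between the two bounds (a sketch mirroring the pseudocode):

```python
def example5(x):
    steps = 0
    for i in range(1, x + 1):
        for j in range(1, i + 1):
            for k in range(1, j + 1):
                steps += 1  # stands in for "print *"
    return steps

x = 16
s = example5(x)
# x^3/32 <= steps <= x^3, consistent with Theta(x^3):
print(x ** 3 // 32 <= s <= x ** 3)  # True
```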

Analyzing algorithms using O, Ω, ϴ Now, we are going to analyze algorithms whose running times depend on their inputs. For some inputs, they terminate very quickly. For other inputs, they can be slow.

Example 1

LinearSearch(A[1..n], key)
    for i = 1..n
        if A[i] = key then return true
    return false

Might take 1 iteration... might take many.
Big-O: at most n iterations, constant work per iteration, so O(n).
Big-Ω: can you find a bad input that makes the algorithm take Ω(n) time?
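One way to experiment with the Ω question (a sketch; the iteration counter is added for illustration, and the "bad input" shown is just one candidate answer):

```python
def linear_search(A, key):
    """Returns (found, iterations_used) for the loop above."""
    for i, v in enumerate(A, start=1):
        if v == key:
            return True, i
    return False, len(A)

# A candidate bad input: the key does not occur in A at all,
# so the loop cannot return early and runs all n iterations.
n = 50
found, iters = linear_search(list(range(n)), -1)
print(found, iters)  # False 50
```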

Example 2

int A[n]
for i = 1..n
    binarySearch(A, i)

Remember: binarySearch is O(lg n), so this is O(n lg n).

How about Ω? Maybe it's Ω(n lg n), but it's hard to tell. It is not enough to know binary search is Ω(lg n), because the worst-case input might make only one invocation of binary search take c*lg n steps (and the rest might finish in 1 step). We would need to find a particular input that causes the algorithm to take a total of c(n lg n) steps. In fact, binarySearch(A, i) takes c*lg n steps when i is not in A, so this function takes c(n lg n) steps if A[1..n] doesn't contain any number in {1,2,...,n}.
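The bad input can be checked empirically (a sketch: a standard iterative binary search with a probe counter; the all-negative array is one array containing no number in {1,...,n}):

```python
import math

def binary_search_steps(A, key):
    """Standard iterative binary search; returns the number of probes."""
    lo, hi, probes = 0, len(A) - 1, 0
    while lo <= hi:
        probes += 1
        mid = (lo + hi) // 2
        if A[mid] == key:
            return probes
        if A[mid] < key:
            lo = mid + 1
        else:
            hi = mid - 1
    return probes

n = 1024
A = [-v for v in range(n, 0, -1)]  # sorted; contains no number in {1..n}
total = sum(binary_search_steps(A, i) for i in range(1, n + 1))
# Every one of the n searches misses, so each takes about lg n probes:
print(total >= n * math.log2(n) / 2)  # True
```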

Example 3 Your boss tells you that your group needs to develop a sorting algorithm for the company's web server that runs in O(n lg n) time. If you can't prove that it can sort any input of length n in O(n lg n) time, it's no good to him. He doesn't want to lose orders because of a slow server. Your colleague tells you he's been working on this piece of code, which he believes should fit the bill: he thinks it can sort any input of length n in O(n lg n) time, but he doesn't know how to prove it.

Example 3 continued

WeirdSort(A[1..n])
    last := n
    sorted := false
    while not sorted do
        sorted := true
        for j := 1 to last-1 do
            if A[j] > A[j+1] then
                swap A[j] and A[j+1]
                sorted := false
        last := last-1
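To experiment with the colleague's claim, here is a direct Python transcription of WeirdSort (a sketch; indices are shifted to 0-based, and a return is added for convenience). Try timing or step-counting it on different inputs before deciding whether the O(n lg n) claim can be proved:

```python
def weird_sort(A):
    """Direct transcription of the WeirdSort pseudocode above (0-based)."""
    last = len(A)
    is_sorted = False
    while not is_sorted:
        is_sorted = True
        for j in range(last - 1):       # j = 1 .. last-1 in the pseudocode
            if A[j] > A[j + 1]:
                A[j], A[j + 1] = A[j + 1], A[j]
                is_sorted = False
        last -= 1
    return A

print(weird_sort([5, 2, 9, 1, 5, 6]))  # [1, 2, 5, 5, 6, 9]
```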