LECTURE 23: LOVE THE BIG-OH
CSC 212 – Data Structures

Algorithm Analysis

Execution time with n inputs on a 4 GHz machine:

                n = 10      n = 50           n = 100         n = 1000        n = 10^6
  O(n log n)    9 ns        50 ns            175 ns          2500 ns         5 ms
  O(n^2)        25 ns       625 ns           2250 ns         … ns            4 min
  O(n^5)        25000 ns    72.5 ms          2.7 s           2.9 days        1 x 10^13 yrs
  O(2^n)        2500 ns     3.25 days        1 x 10^… yrs    1 x 10^… yrs    Too long!
  O(n!)         1 ms        1.4 x 10^… yrs   7 x 10^… yrs    Too long!       Too long!

Big-Oh Notation
- Only large data sets are considered in analysis
  - If a program only takes 2 minutes, who cares?
- Multipliers are ignored by this analysis
  - O(⅕n) = O(2n) = O(50000n) = O(n)
  - Need lots of 5 ms to reach 4 minutes
- Only the equation's most significant term is kept
  - O(⅛n^5 + 17n^2) = O(n^5 + n^2) = O(n^5)
  - What is an extra 17 min. after 3 x 10^… years?
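
A small Java demo (my own illustration, not from the slides) shows why the multiplier ⅛ and the 17n^2 term stop mattering as n grows: the exact operation count ⅛n^5 + 17n^2 quickly becomes a fixed fraction of the dominant n^5 term.

public class GrowthDemo {
  public static void main(String[] args) {
    for (long n = 10; n <= 1_000_000; n *= 100) {
      double exact = 0.125 * Math.pow(n, 5) + 17.0 * n * n;  // ⅛n^5 + 17n^2
      double top = Math.pow(n, 5);                           // dominant term only
      // The ratio converges to the constant ⅛, so only the n^5 term drives growth.
      System.out.printf("n = %,d  exact/top = %.6f%n", n, exact / top);
    }
  }
}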

Cage Match It Ain't
- Measures how many simple operations are executed
  - Assignments, method calls, arithmetic, comparisons, getting an array entry, following a reference, etc.
- Provides a simple, rough approximation of time
  - Excellent for narrowing the approach to be used
  - Actual algorithm implementations are not important
  - Precision not an issue: 17 min vs. age of the universe

Rules of Thumb
- Sequences of simple statements take O(1) time
- Loops from 1 to n take:
  - O(n) time when a constant amount is added each iteration
  - O(log n) time when the counter is multiplied by a constant each iteration
- When code runs sequentially, add the big-Oh times
  - Remember to drop multipliers & insignificant terms
- Complexities multiply when loops are nested
  - Slow having loops nested inside loops inside loops… (see the sketch below)
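
A minimal Java sketch (my own illustration, not from the slides) of how these rules show up in code; the method names are invented:

public class LoopRules {
  // Adds a constant to i each pass: n iterations, so O(n).
  static int linear(int n) {
    int count = 0;
    for (int i = 1; i <= n; i++) {
      count++;
    }
    return count;
  }

  // Multiplies i by a constant each pass: about log2(n) iterations, so O(log n).
  static int logarithmic(int n) {
    int count = 0;
    for (int i = 1; i <= n; i *= 2) {
      count++;
    }
    return count;
  }

  // One O(n) loop nested inside another: complexities multiply, so O(n^2).
  static int nested(int n) {
    int count = 0;
    for (int i = 0; i < n; i++) {
      for (int j = 0; j < n; j++) {
        count++;
      }
    }
    return count;
  }

  public static void main(String[] args) {
    int n = 1024;
    System.out.println(linear(n));       // 1024      (~ n)
    System.out.println(logarithmic(n));  // 11        (~ log2 n)
    System.out.println(nested(n));       // 1048576   (~ n^2)
  }
}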

It's About Time

Algorithm sneaky(int n)
  total = 0
  for i = 0 to n do
    for j = 0 to n do
      total += i * j
      return total
    end for
  end for

- sneaky would take O(1) time to execute
  - Looks like O(n) iterations are set up for each loop
  - But on the first pass, the method ends at the return
- Always executes the same number of operations
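
A direct Java translation (my own sketch, not part of the slides) makes the early exit easier to see:

static int sneaky(int n) {
  int total = 0;
  for (int i = 0; i <= n; i++) {
    for (int j = 0; j <= n; j++) {
      total += i * j;
      return total;  // reached on the very first inner iteration, so O(1)
    }
  }
  return total;      // only reached if n < 0 and the loops never run
}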

Big-Oh == Murphy's Law

Algorithm power(int a, int b ≥ 0)
  if a == 0 && b == 0 then
    return -1
  endif
  exp = 1
  repeat b times
    exp *= a
  end repeat
  return exp

- power takes O(n) time in most cases, where n is the exponent b
- Would only take O(1) if a & b are both 0
- Still called an O(n) algorithm, however, since big-Oh uses the worst case
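
In Java the same idea might look like the following (an illustrative sketch, assuming int inputs):

static int power(int a, int b) {
  if (a == 0 && b == 0) {
    return -1;                  // special case: finishes after O(1) work
  }
  int exp = 1;
  for (int i = 0; i < b; i++) { // b iterations in every other case, so O(b) = O(n)
    exp *= a;
  }
  return exp;
}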

How Big Is My Input?

Algorithm sum(int[][] a)
  total = 0
  for i = 0 to a.length do
    for j = 0 to a[i].length do
      total += a[i][j]
    end for
  end for
  return total

- Despite the nested loops, this runs in O(n) time
- The method's input is a doubly-subscripted array
- For this method, n is the number of entries in the entire array
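
A Java version (my own sketch) makes the counting argument concrete: each entry of the array is touched exactly once, so the work is proportional to the total number of entries, whatever the row lengths are.

static int sum(int[][] a) {
  int total = 0;
  for (int i = 0; i < a.length; i++) {        // visit every row once
    for (int j = 0; j < a[i].length; j++) {   // visit each entry of that row once
      total += a[i][j];                       // O(1) work per entry
    }
  }
  return total;                               // total work ~ number of entries = O(n)
}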

Handling Method Calls
- A method call is an O(1) operation, …
- … but then we also need to add the time spent running that method
- Big-Oh counts the operations executed in total
  - Remember: there is no such thing as a free lunch
  - Borrowing $5 to pay does not make your lunch free
  - Similarly, need to include all operations executed
  - Which method they run in DOES NOT MATTER

Methods Calling Methods

public static int sumOdds(int n) {
  int sum = 0;
  for (int i = 1; i <= n; i += 2) {
    sum += i;
  }
  return sum;
}

public static void oddSeries(int n) {
  for (int i = 1; i < n; i++) {
    System.out.println(i + " " + sumOdds(n));
  }
}

- oddSeries calls sumOdds n times
- Each call does O(n) work, so oddSeries takes O(n^2) total time!

Justifying an Answer
- It is important to explain your answer
  - Saying it is O(n) is not enough to make it O(n)
  - Methods using recursion are especially hard to analyze
- Derive the difficult answer using a simple process
  - May find that you can simplify the big-Oh computation
  - May find a smaller or larger big-Oh than you imagined
- Can be a proof, but need not be that formal
  - Explaining your answer is the critical part
  - Helps you be able to convince others

Big-Oh Notation

Algorithm factorial(int n)
  if n <= 1 then
    return 1
  else
    fact = factorial(n – 1)
    return n * fact
  endif

- Ignoring the cost of the recursive call, each call runs in O(1) time
- At most n – 1 calls are made, since n decreases by 1 each time
- The method's total complexity is O(n)
  - Runs O(n – 1) * O(1) = O(n – 1) = O(n) operations
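
The same analysis applies to a Java version (my own sketch, not from the slides):

static int factorial(int n) {
  if (n <= 1) {
    return 1;                   // base case: O(1), no further calls
  }
  int fact = factorial(n - 1);  // one recursive call on a smaller n
  return n * fact;              // O(1) work per call
}
// At most n - 1 recursive calls, each doing O(1) work, so O(n) in total.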

Big-Oh Notation

Algorithm fib(int n)
  if n <= 1 then
    return n
  else
    return fib(n-1) + fib(n-2)
  endif

- O(1) time for each of O(2^n) calls = O(2^n) complexity
  - Calls fib(1) & fib(0) when n = 2 (2 calls)
  - n = 3, total of 4 calls: 3 for fib(2) + 1 for fib(1)
  - n = 4, total of 8 calls: 5 for fib(3) + 3 for fib(2)
- Number of calls roughly doubles each time n is incremented = O(2^n)
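
Adding a call counter to a Java version (my own sketch; the counter is not in the original) shows the doubling directly:

static long calls = 0;              // counts every invocation of fib

static int fib(int n) {
  calls++;
  if (n <= 1) {
    return n;                       // base cases: fib(0) = 0, fib(1) = 1
  }
  return fib(n - 1) + fib(n - 2);   // two recursive calls per non-base invocation
}
// After fib(30), calls == 2,692,537; the count roughly doubles each time n
// grows by 1, which is the O(2^n) behavior described above.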

Before Next Lecture…
- Finish week #8 assignment
  - Due by 5 PM next Tuesday
- Start programming assignment #3
  - Messages are not always sent to everyone!
- Read section 5.1 in the book before class
  - Discuss our first ADT: the Stack
  - What is it? How is it used? Why do I now want Pez?
  - Strongly urge students to fill out the ADT Design for Stack