Presentation transcript:

Question of the Day
- Move one matchstick to produce a square

“Anything that can go wrong…”
- Big-Oh calculates an algorithm's worst-case complexity
- Worst-case analysis of algorithm performance
- Usually reasonably well correlated with execution time
- Not always right to consider only the worst case
- May be situations where the worst case is very rare
- Other cases can be solved similarly, but this is almost never done
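A quick sketch of why big-Oh reports the worst case (linearSearch is my own illustrative helper, not from the slides): the same method can finish after a single comparison or only after examining every element.

  // Hypothetical example: linear search may stop after one comparison
  // (best case) but may have to examine all n elements (worst case).
  // Big-Oh describes the worst case, so this method is O(n).
  public static int linearSearch(int[] data, int target) {
      for (int i = 0; i < data.length; i++) {
          if (data[i] == target) {
              return i;    // best case: target at index 0, one comparison
          }
      }
      return -1;           // worst case: all n elements examined
  }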

Algorithmic Analysis

Primitive Statements
- Basis of programming; each takes constant time: O(1)
- Fastest possible big-Oh notation
- A fixed sequence of primitive statements also runs in O(1) time
- But only if the input does not affect the sequence
- Ignore constant multipliers: O(5) = O(5 * 1) = O(1)
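A minimal sketch (the method name is mine, not from the slides) of a fixed sequence of primitive statements: no matter how large n is, the same statements run, so the method is O(1).

  // Hypothetical illustration: every statement is primitive and the sequence
  // never depends on the size of n, so total time is constant: O(1).
  public static int constantWork(int n) {
      int a = n + 5;    // arithmetic: O(1)
      int b = a * 2;    // arithmetic: O(1)
      int c = b - n;    // arithmetic: O(1)
      return c;         // return: O(1), so the whole method is O(1)
  }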

Simple Loops

  for (int i = 0; i < n; i++) { }   -or-   while (i < n) { i++; }

- Each loop executes n times
- Only primitive statements within the body of the loop
- Big-Oh complexity of a single loop iteration: O(1)
- Either loop runs O(n) iterations
- So loop has O(n) * O(1) = O(n) complexity total

More Complicated Loops

  for (int i = 0; i < n; i += 2) { }

- i takes the values 0, 2, 4, 6, ..., n
- In the above example, the loop executes n/2 iterations
- Each iteration takes O(1) time, so total complexity:
  O(n/2) * O(1) = O(n * ½ * 1) = O(n)
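As a small check (countByTwos is my own helper, not part of the slides), counting the iterations of the i += 2 loop shows it runs about n/2 times, which simplifies to O(n) once constants are dropped.

  // Hypothetical demo: count how many times the i += 2 loop body executes.
  public static int countByTwos(int n) {
      int iterations = 0;
      for (int i = 0; i < n; i += 2) {
          iterations++;         // executes roughly n / 2 times
      }
      return iterations;        // e.g. n = 10 returns 5; still O(n)
  }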

Really Complicated Loops

  for (int i = 1; i < n; i *= 2) { }

- i takes the values 1, 2, 4, 8, ..., n
- In the above code, the loop executes log n iterations
- Each iteration takes O(1) time, so total complexity:
  O(log n) * O(1) = O(log n * 1) = O(log n)
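A similar check for the doubling loop (countDoublings is my own helper, not from the slides): the counter shows roughly log₂ n iterations.

  // Hypothetical demo: count how many times the i *= 2 loop body executes.
  public static int countDoublings(int n) {
      int iterations = 0;
      for (int i = 1; i < n; i *= 2) {
          iterations++;         // executes roughly log2(n) times
      }
      return iterations;        // e.g. n = 1024 returns 10, so the loop is O(log n)
  }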

Nested Loops

  for (int i = 0; i < n; i++) {
    for (int j = 0; j < n; j++) { }
  }

- Program would execute the outer loop n times
- Inner loop runs n times during each iteration of the outer loop
- O(n) iterations doing O(n) work each iteration
- So loop has O(n) * O(n) = O(n²) complexity total
- Loop complexity multiplies when nested

Justifying an Answer
- Important to explain your answer
- Saying O(n) is not enough to make it O(n)
- Methods using recursion are especially hard to determine
- Derive difficult answers using a simple process

Proving Your Answer

Justifying an Answer
- Important to explain your answer
- Saying O(n) is not enough to make it O(n)
- Methods using recursion are especially hard to determine
- Derive difficult answers using a simple process
- May find that you can simplify the big-Oh computation
- May find a smaller or larger big-Oh than imagined
- Convincing others need not be very formal
- Explaining your answer in a clear way is critical, however

It's About Time

  Algorithm sneaky(int n)
    total = 0
    for i = 0 to n do
      for j = 0 to n do
        total += i * j
        return total
      end for
    end for

- sneaky would take _____ time to execute
- O(n) iterations for each loop in the method

It's About Time

  Algorithm sneaky(int n)
    total = 0
    for i = 0 to n do
      for j = 0 to n do
        total += i * j
        return total
      end for
    end for

- sneaky would take O(1) time to execute
- O(n) iterations for each loop in the method
- But on the first pass, the method ends after the return
- Always executes the same number of operations
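A Java translation of the sneaky pseudocode above (mine, assuming the return really sits inside the inner loop as written): the method returns on the very first inner-loop pass, so it always does the same constant amount of work.

  // Translation of the pseudocode (assumption: return is inside the inner loop).
  public static int sneaky(int n) {
      int total = 0;
      for (int i = 0; i <= n; i++) {
          for (int j = 0; j <= n; j++) {
              total += i * j;
              return total;     // first pass through the loops: O(1) overall
          }
      }
      return total;             // only reached if n < 0 and the loops never run
  }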

Big-Oh == Murphy's Law

  Algorithm power(int a, int b ≥ 0)
    if a == 0 && b == 0 then
      return -1
    endif
    exp = 1
    repeat b times
      exp *= a
    end repeat
    return exp

- power takes O(n) time in most cases
- Would only take O(1) if a & b are 0
- ____ algorithm overall

Big-Oh == Murphy's Law

  Algorithm power(int a, int b ≥ 0)
    if a == 0 && b == 0 then
      return -1
    endif
    exp = 1
    repeat b times
      exp *= a
    end repeat
    return exp

- power takes O(n) time in most cases
- Would only take O(1) if a & b are 0
- O(n) algorithm overall; big-Oh uses the worst case
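A Java translation of power (mine, treating b as the input size n that big-Oh measures): the loop body runs b times, so the method is O(n) even though the a == 0 && b == 0 case returns in O(1).

  // Translation of the pseudocode; assumes b >= 0 as the precondition states.
  public static int power(int a, int b) {
      if (a == 0 && b == 0) {
          return -1;            // special case handled in O(1)
      }
      int exp = 1;
      for (int i = 0; i < b; i++) {
          exp *= a;             // executes b times: the worst case big-Oh reports
      }
      return exp;               // O(n) overall, with n = b
  }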

How Big Am I?

  algorithm sum(int[][] a)
    total = 0
    for i = 0 to a.length do
      for j = 0 to a[i].length do
        total += a[i][j]
      end for
    end for
    return total

- Despite nested loops, this runs in O(n) time
- Input is a doubly-subscripted array for this method
- For this method, n is the number of entries in the array
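A Java translation of sum (mine): the body of the inner loop runs exactly once per array entry, so with n defined as the total number of entries the method is O(n).

  // Translation of the pseudocode: one addition per entry of the 2-D array.
  public static int sum(int[][] a) {
      int total = 0;
      for (int i = 0; i < a.length; i++) {
          for (int j = 0; j < a[i].length; j++) {
              total += a[i][j]; // executed once per entry
          }
      }
      return total;             // O(n), where n = total number of entries
  }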

Handling Method Calls
- A method call is an O(1) operation, ...
- ... but then must also add the time spent running the called method
- Big-Oh counts operations executed in total
- Remember: there is no such thing as a free lunch
- Borrowing $5 to pay does not make your lunch free
- Similarly, need to include all operations executed
- Which method they run in DOES NOT MATTER

Methods Calling Methods

  public static int sumOdds(int n) {
    int sum = 0;
    for (int i = 1; i <= n; i += 2) {
      sum += i;
    }
    return sum;
  }

  public static void oddSeries(int n) {
    for (int i = 1; i < n; i++) {
      System.out.println(i + " " + sumOdds(n));
    }
  }

- oddSeries calls sumOdds n times
- Each call does O(n) work, so oddSeries takes O(n²) total time!

Your Turn
- Get into your groups and complete the activity

How to Prepare for Midterm

  DO:
  - Make cheat sheets for the test
  - Review how parts of Java work
  - Add post-its to important pages

  DON'T:
  - Memorize
  - Drink a case of 40s before the test
  - Use post-its as clothing