Professor John Peterson

CS 280 Data Structures Professor John Peterson

Big O Notation We use a mathematical notation called “Big O” to talk about the performance of an algorithm. O(n^2) means that an algorithm runs in time proportional to n^2 (the square of the “size” of the input). Note that there is no constant factor: it is not 23n^2 or anything like that, because Big O compares the general growth of algorithms rather than exact times. Constant factors are hard to measure and understand!
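
For reference, the standard formal reading of this notation (not stated on the slide, but it is the usual definition): a running time f(n) is O(g(n)) if there are constants c > 0 and n0 such that f(n) <= c * g(n) for every n >= n0. That is why 23n^2 and n^2 are both just O(n^2): the 23 is absorbed into the constant c.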

Program Example Consider the following program: for (int i = 0; i < n; i++) { System.out.print(i); } This program takes O(n) time to execute. This assumes that all of the statements in the program execute in constant time. (Do they?)

Iteration This is a special case of a general idea: the complexity of code within a loop is calculated by multiplying the number of times the loop is executed by the complexity of the loop body. What is the complexity of this: for (int i = 0; i < n*n; i++) System.out.print(i);
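
To apply that multiplication rule to the loop above, here is a rough sketch that simply counts body executions (the class name and the sample value of n are made up for illustration):

class IterationCount {
    public static void main(String[] args) {
        int n = 100;                   // example input size
        long count = 0;                // how many times the body runs
        for (int i = 0; i < n * n; i++) {
            count++;                   // stands in for System.out.print(i)
        }
        System.out.println(count);     // prints 10000, i.e. n*n
    }
}

The loop executes n*n times and its body is O(1), so the whole fragment is O(n^2).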

Nesting How about this? for (int i = 0; i < n; i++) for (int j = 0; j < n; j++) for (int k = 0; k < n; k++) System.out.println(i+j+k);
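
A quick count of how often the innermost body runs (again, the counter and n = 10 are only for illustration):

class NestingCount {
    public static void main(String[] args) {
        int n = 10;                    // example input size
        long count = 0;
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++)
                for (int k = 0; k < n; k++)
                    count++;           // stands in for System.out.println(i+j+k)
        System.out.println(count);     // prints 1000, i.e. n*n*n
    }
}

Three nested loops of n iterations each give n * n * n executions of the body: O(n^3).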

Conditionals The if statement is trickier: for a worst-case bound you take the greater complexity of the two alternatives. Sometimes you can use extra information about the data to avoid always assuming the worst. While loops raise the same issue, since the number of iterations can depend on the data, so in general you have to assume the worst there too.
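
A small made-up illustration (the method and array here are hypothetical, not from the course): one branch is O(n), the other O(1), so a worst-case analysis charges the whole if as O(n).

class BranchCost {
    // Worst case O(n): the expensive branch may be taken.
    static int maxOrFirst(int[] a, boolean wantMax) {
        if (wantMax) {                       // O(n) branch: scans the whole array
            int max = a[0];
            for (int i = 1; i < a.length; i++)
                if (a[i] > max) max = a[i];
            return max;
        } else {                             // O(1) branch
            return a[0];
        }
    }

    public static void main(String[] args) {
        int[] a = {3, 9, 4};
        System.out.println(maxOrFirst(a, true));   // 9 (took the O(n) branch)
        System.out.println(maxOrFirst(a, false));  // 3 (took the O(1) branch)
    }
}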

If for (int i = 0; i < n; i++) { if (a[i] == 0) { for (int j = 0; j < n; j++) { System.out.print(i + " " + j); }}} What is the input to this one? What does the complexity really depend on?
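
One way to answer that question is to count the inner-loop work for two different inputs (the sample arrays are made up, and a counter replaces the printing):

class ZeroDependence {
    static long innerWork(int[] a) {
        int n = a.length;
        long count = 0;
        for (int i = 0; i < n; i++)
            if (a[i] == 0)
                for (int j = 0; j < n; j++)
                    count++;                 // stands in for the print
        return count;
    }

    public static void main(String[] args) {
        System.out.println(innerWork(new int[]{1, 2, 3, 4}));  // 0: no zeros, only the outer loop runs
        System.out.println(innerWork(new int[]{0, 0, 0, 0}));  // 16: all zeros, n*n inner steps
    }
}

The running time depends on how many entries of a are zero: with no zeros the fragment is O(n), and with all zeros it is O(n^2), which is the worst case.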

Best, Worst, and Average When we don’t know what will happen with an “if”, we can make a number of assumptions: the worst possible outcome (highest complexity), the best possible outcome (lowest complexity), or an average between the best and worst, which is often hard to describe.

A Simple Method boolean find(int[] a, int v) { for (int i = 0; i < a.length; i++) { if (a[i] == v) return true; } return false; } What is “n” in this case? What is the best case complexity? What is the worst case? Average?
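
To make those questions concrete, here is a sketch that instruments find with a comparison counter (the counter and the sample data are only for illustration):

class FindCost {
    static int comparisons;                  // instrumentation only

    static boolean find(int[] a, int v) {
        for (int i = 0; i < a.length; i++) {
            comparisons++;
            if (a[i] == v) return true;
        }
        return false;
    }

    public static void main(String[] args) {
        int[] a = {7, 3, 9, 5};
        comparisons = 0; find(a, 7);
        System.out.println(comparisons);     // 1: best case, the value is in the first slot
        comparisons = 0; find(a, 42);
        System.out.println(comparisons);     // 4: worst case, the value is not present
    }
}

Here n is a.length. The best case is O(1) (the value is found immediately), the worst case is O(n) (the value is last or absent), and if the value is equally likely to be anywhere the average is about n/2 comparisons, which is still O(n).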

Method Calls What do you do about a method call? You need to know how complex the method is GIVEN THE PARAMETER VALUES. This can be really tricky! The worst case is always an option if nothing is known about the parameters.
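
A made-up example of why the callee’s cost matters: containsAll below is a single loop, but each pass calls the O(n) find from the previous slide, so if both arrays have length about n the method is O(n * n) = O(n^2) in the worst case.

class CallCost {
    static boolean find(int[] a, int v) {    // O(a.length), as on the earlier slide
        for (int i = 0; i < a.length; i++)
            if (a[i] == v) return true;
        return false;
    }

    // One loop, but every pass does O(n) work inside find: O(n^2) overall.
    static boolean containsAll(int[] a, int[] b) {
        for (int i = 0; i < b.length; i++)
            if (!find(a, b[i])) return false;
        return true;
    }

    public static void main(String[] args) {
        System.out.println(containsAll(new int[]{1, 2, 3}, new int[]{3, 1}));  // true
        System.out.println(containsAll(new int[]{1, 2, 3}, new int[]{4}));     // false
    }
}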

Example: log(N) This is where things get hairy! How would you compute Log10(N) in a very approximate manner? What does “Log(N)” mean mathematically? How might we get this in a real piece of code? Why don’t we really need to know the base of the logarithm?
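
One very approximate way to get Log10(N) in code, offered as a sketch rather than the answer given in class: count how many times N can be divided by 10.

class ApproxLog {
    // Roughly how many decimal digits n has beyond the first, i.e. about log10(n).
    static int approxLog10(int n) {
        int log = 0;
        while (n >= 10) {        // stop once a single digit remains
            n = n / 10;          // integer division
            log++;
        }
        return log;
    }

    public static void main(String[] args) {
        System.out.println(approxLog10(1000));     // 3
        System.out.println(approxLog10(123456));   // 5
    }
}

Mathematically, log(N) is the exponent you raise the base to in order to get N. Changing the base only multiplies the result by a constant (for example log2(N) = log10(N) / log10(2)), and Big O ignores constant factors, which is why the base does not matter.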

Log Complexity int j = n; while (j > 0) { System.out.println(j); j = j / 2; /* Integer division! */ } What would this print for n = 10? 100?
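
Tracing this by hand (worth checking against the real code): for n = 10 it prints 10, 5, 2, 1; for n = 100 it prints 100, 50, 25, 12, 6, 3, 1. Doubling n adds only one more line of output, which is the O(log n) signature. A counting version of the same loop:

class HalvingCount {
    static int steps(int n) {
        int j = n, count = 0;
        while (j > 0) {
            count++;            // stands in for the println
            j = j / 2;          // integer division
        }
        return count;
    }

    public static void main(String[] args) {
        System.out.println(steps(10));     // 4
        System.out.println(steps(100));    // 7
        System.out.println(steps(1000));   // 10
    }
}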

Example public void r(int i) { if (i == 0) return; System.out.print(i); r(i-1); r(i-1); } r(n)

Example: 2^N public void r(int i) { if (i == 0) return; System.out.print(i); r(i-1); r(i-1); } r(n)
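
To see where the 2^N comes from: each call to r makes two recursive calls, so the number of values printed satisfies P(0) = 0 and P(i) = 1 + 2*P(i-1), which works out to 2^i - 1. A counting sketch (the counter replaces the print so the output stays small):

class DoublingCalls {
    static long count = 0;

    static void r(int i) {
        if (i == 0) return;
        count++;                // stands in for System.out.print(i)
        r(i - 1);
        r(i - 1);
    }

    public static void main(String[] args) {
        for (int n = 1; n <= 5; n++) {
            count = 0;
            r(n);
            System.out.println("r(" + n + ") prints " + count + " values");  // 1, 3, 7, 15, 31
        }
    }
}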

Dominating Terms The big idea behind Big-O notation is that when you add complexities, one term may dominate the other. That is, once n has reached some value, one complexity is ALWAYS bigger than another. Example: O(n^2) + O(n) = O(n^2 + n) = O(n^2)
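
As a concrete check: n^2 + n <= 2n^2 for every n >= 1, so the constant c = 2 in the formal definition already covers the extra n term, and O(n^2 + n) collapses to O(n^2).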

A Complexity Ladder O(1) O(log(n)) O(n) O(n log(n)) O(n^2) O(2^n) There are lots more complexities “between the cracks” but these are the ones that we’ll be seeing this term.
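
One way to get a feel for the ladder is to print each function for a few sizes (an illustrative sketch; powers of two are used so log2 comes out exactly, and n is kept small enough that 2^n fits in a long):

class GrowthLadder {
    public static void main(String[] args) {
        for (int n : new int[]{8, 16, 32}) {            // powers of two keep log2 exact
            int log = Integer.numberOfTrailingZeros(n); // log2(n) for a power of two
            System.out.println("n=" + n
                + "  log n=" + log
                + "  n log n=" + (n * log)
                + "  n^2=" + (n * n)
                + "  2^n=" + (1L << n));
        }
    }
}

Each rung grows strictly faster than the one below it once n is large enough; O(1) is left out of the printout since it is just a constant.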