Week 12 - Wednesday

What did we talk about last time?
- Hunters and prey
- Class variables
- Big Oh notation

We want to compare the running time of one program to another, so we want a mathematical description with the following characteristics:
- Worst case: we care mostly about how bad things could be
- Asymptotic: we focus on the behavior as the input size gets larger and larger

Enter Big Oh notation. Big Oh simplifies a complicated running time function into a simple statement about its worst-case growth rate:
- All constant coefficients are ignored
- All low-order terms are ignored
- For example, 3n + 3 is O(n)
Big Oh is a statement that a particular running time is no worse than some function, up to a constant factor.

- 147n^3 + 2n^2 + 5n is O(n^3)
- n + 2^n is O(2^n)
- 15n^2 + 6n + 7 log n is O(n^2)
- 659n + n log n is O(n log n)
Note: in CS, we use log base 2 unless stated otherwise.

How long does it take to do multiplication by hand, say 123 multiplied by another 3-digit number? Let's assume that the length of the numbers is n digits. The work is (n multiplications + n carries) x n digits, plus n additions over partial results of up to n + 1 digits each. Running time: O(n^2).
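The hand procedure translates directly into nested loops. Here is a minimal Java sketch (the class name LongMultiply and the digit-array representation are my own choices, not from the slides): multiplying two n-digit numbers stored least-significant-digit first takes O(n^2) digit operations.

```java
// Grade-school multiplication of two numbers stored as digit arrays,
// least significant digit first. The nested loops over the n digits of
// each number are what make the running time O(n^2).
public class LongMultiply {
    public static int[] multiply(int[] a, int[] b) {
        int[] result = new int[a.length + b.length]; // room for every carry
        for (int i = 0; i < a.length; i++) {
            int carry = 0;
            for (int j = 0; j < b.length; j++) {
                int t = result[i + j] + a[i] * b[j] + carry;
                result[i + j] = t % 10; // keep one digit
                carry = t / 10;         // carry the rest
            }
            result[i + b.length] += carry;
        }
        return result;
    }
}
```

For example, 123 x 45 is multiply({3, 2, 1}, {5, 4}), which yields the digits of 5535, least significant first.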

How do we find the largest element in an array?

    int largest = array[0];
    for( int i = 1; i < array.length; i++ )
        if( array[i] > largest )
            largest = array[i];
    System.out.println("Largest: " + largest);

Running time: O(n), if n is the length of the array. What if the array is sorted in ascending order? Then the largest element is the last one:

    System.out.println("Largest: " + array[array.length - 1]);

Running time: O(1).

Here is some code that sorts an array in ascending order. What is its running time?

    for( int i = 0; i < array.length; i++ )
        for( int j = 0; j < array.length - 1; j++ )
            if( array[j] > array[j + 1] ) {
                int temp = array[j];
                array[j] = array[j + 1];
                array[j + 1] = temp;
            }

Running time: O(n^2).

Here is a table of several different complexity classes, in ascending order, with each function evaluated at n = 100:

    Description    Big Oh       f(100)
    Constant       O(1)         1
    Logarithmic    O(log n)     6.64
    Linear         O(n)         100
    Linearithmic   O(n log n)   664
    Quadratic      O(n^2)       10,000
    Cubic          O(n^3)       1,000,000
    Exponential    O(2^n)       1.27 x 10^30
    Factorial      O(n!)        9.33 x 10^157
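Most of the f(100) column can be reproduced with a few lines of Java (a sketch; the log2 helper is my own, since Java's Math class has no base-2 logarithm, and 100! is omitted because it needs a loop or BigInteger):

```java
// Evaluate each growth function at n = 100 to reproduce the table's
// f(100) column. Math.log is the natural log, so log base 2 comes
// from the change-of-base formula.
public class GrowthTable {
    public static double log2(double n) {
        return Math.log(n) / Math.log(2);
    }

    public static void main(String[] args) {
        int n = 100;
        System.out.printf("log n   : %.2f%n", log2(n));        // about 6.64
        System.out.printf("n       : %d%n", n);                // 100
        System.out.printf("n log n : %.0f%n", n * log2(n));    // about 664
        System.out.printf("n^2     : %.0f%n", Math.pow(n, 2)); // 10,000
        System.out.printf("n^3     : %.0f%n", Math.pow(n, 3)); // 1,000,000
        System.out.printf("2^n     : %.2e%n", Math.pow(2, n)); // about 1.27e+30
    }
}
```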

Computers get faster, but not in unlimited ways. If computers get 10 times faster, here is how much a problem from each class could grow and still be solvable:

    Description    Big Oh       Increase in size
    Constant       O(1)         Unlimited
    Logarithmic    O(log n)     1000
    Linear         O(n)         10
    Linearithmic   O(n log n)   10
    Quadratic      O(n^2)       3-4
    Cubic          O(n^3)       2-3
    Exponential    O(2^n)       Hardly changes
    Factorial      O(n!)        Hardly changes

- There is nothing better than constant time
- Logarithmic time means that the problem can become much larger and only take a little longer
- Linear time means that time grows in proportion to the problem
- Linearithmic time is just a little worse than linear
- Quadratic time means that expanding the problem size significantly could make it impractical
- Cubic time is about the reasonable maximum if we expect the problem to grow
- Exponential and factorial time mean that we cannot solve anything but the most trivial problem instances

- Memory usage can be a problem
- If you run out of memory, your program can crash
- Memory usage can have serious performance consequences too

Remember, there are multiple levels of memory on a computer, and each next level is on the order of 500 times larger and 500 times slower:
- Cache: actually on the CPU; fast and expensive
- RAM: primary memory for a desktop computer; pretty fast and relatively expensive
- Hard drive: secondary memory for a desktop computer; slow and cheap

- If you can do a lot of number crunching without leaving cache, that will be very fast
- If you have to fetch data from RAM, that will slow things down
- If you have to read and write data to the hard drive (unavoidable with large pieces of data like digital movies), that will slow things down a lot

- Memory can be easier to estimate than running time
- Depending on your input, you will allocate a certain number of objects, arrays, and primitive data types
- It is possible to count the storage for each item allocated
- Remember that a reference to an object or an array costs an additional 4 bytes on top of the size of the object

Here are the sizes of various types in Java. Note that N refers to the number of elements in the array or String.

    Type               Bytes
    boolean            1
    char               2
    int                4
    double             8
    boolean[]          16 + N
    char[]             16 + 2N
    int[]              16 + 4N
    double[]           16 + 8N
    reference          4
    String             40 + 2N
    object             8 + size of members
    array of objects   16 + (4 + size of members)N
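As a rough sketch of this kind of counting (the class and method names are hypothetical, and the formulas follow the table's model; real JVMs, especially 64-bit ones, will differ):

```java
// Estimate array sizes using the table's model: 16 bytes of array
// overhead plus the element size times the number of elements N.
public class MemoryEstimate {
    public static int booleanArrayBytes(int n) { return 16 + n;     }
    public static int charArrayBytes(int n)    { return 16 + 2 * n; }
    public static int intArrayBytes(int n)     { return 16 + 4 * n; }
    public static int doubleArrayBytes(int n)  { return 16 + 8 * n; }
}
```

For example, an int array of 1,000 elements would be estimated at 16 + 4,000 = 4,016 bytes.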

Let's say that I give you a list of numbers and I ask you, "Is 37 on this list?" As a human, you have no problem answering this question, as long as the list is reasonably short. But what if the list is an array, and I want you to write a Java program to find some number?

Easy! We just look through every element in the array until we find it or run out. If we find it, we return the index; otherwise, we return -1.

    public static int find( int[] array, int number ) {
        for( int i = 0; i < array.length; i++ )
            if( array[i] == number )
                return i;
        return -1;
    }

Unfortunately for you, we know about Big Oh notation, so now we have a way to measure how long this algorithm takes. How long, if n is the length of the array? O(n) time, because in the worst case we have to look through every element in the array.

- Is there any way to go smaller than O(n)?
- What complexity classes even exist that are smaller than O(n)? O(1) and O(log n)
- Well, on average we only need to check half the numbers, but ½n is still O(n)
- Darn…

We can do better with more information. For example, if the list is sorted, then we can use that information somehow. How? We can play a high-low game.

Repeatedly divide the search space in half. Say we're looking for 37: check the middle element. If it's too high, keep only the lower half; if it's too low, keep only the upper half. Check the middle of what remains and repeat until you find 37.
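This high-low strategy on a sorted array is binary search. A sketch in Java (this particular implementation is mine, not from the slides):

```java
// Binary search over a sorted int array: returns the index of target,
// or -1 if it is absent. Each comparison halves the search space,
// so the loop runs at most about log2(n) + 1 times.
public class BinarySearch {
    public static int find(int[] array, int target) {
        int left = 0;
        int right = array.length - 1;
        while (left <= right) {
            int middle = left + (right - left) / 2; // avoids int overflow
            if (array[middle] == target)
                return middle;
            else if (array[middle] < target)
                left = middle + 1;   // middle was too low: search upper half
            else
                right = middle - 1;  // middle was too high: search lower half
        }
        return -1; // narrowed down to nothing: target is not present
    }
}
```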

How long can it take? What if you never find what you're looking for? Well, then you've narrowed it down to a single spot in the array that doesn't have what you want. And what's the maximum amount of time that could have taken?

We can apply this idea to a guessing game. First we tell the computer that we are going to pick a number between 1 and n. We pick, and it tries to narrow down the number. It should only take about log n tries; remember that log2(1,000,000) is only about 20.
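A sketch of the game in Java (the class and method names are hypothetical): the computer plays the guesser, halving the range on each "higher" or "lower" answer and counting its guesses.

```java
// Simulate the guessing game: narrow down a secret number between 1 and n
// by guessing the middle of the remaining range, counting the guesses.
public class GuessingGame {
    public static int guessesNeeded(int n, int secret) {
        int low = 1, high = n, guesses = 0;
        while (low <= high) {
            int guess = low + (high - low) / 2;
            guesses++;
            if (guess == secret)
                return guesses;
            else if (guess < secret)
                low = guess + 1;   // the answer was "higher"
            else
                high = guess - 1;  // the answer was "lower"
        }
        return guesses; // unreachable when 1 <= secret <= n
    }
}
```

Even for n = 1,000,000 the loop never needs more than 20 guesses.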

This is a classic interview question asked by Microsoft, Amazon, and similar companies. Imagine that you have 9 red balls. One of them is just slightly heavier than the others, but so slightly that you can't feel it. You have a very accurate two-pan balance you can use to compare balls. Find the heaviest ball in the smallest number of weighings.

It's got to be 8 or fewer weighings, since we could easily test one ball against every other ball. But there must be some cleverer way to divide them up, something related somehow to binary search.

We could divide the balls in half each time, say 4 against 4 with one left out. If those all balance, the heavy ball must be the one we left out to begin with.

How? The key is that you can actually cut the number of balls into three parts each time. We weigh 3 against 3; if they balance, then we know the 3 left out contain the heavy ball. When it's down to 3, weigh 1 against 1, again knowing that the one left out is the heavy one if they balance.
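The weighing strategy can be simulated in Java. This is just an illustration (HeavyBall and its names are mine, not from the slides): a "weighing" compares the summed weights of the first two thirds, and each weighing discards two thirds of the candidates.

```java
// Find the index of the single heavy ball in weight[left..right] by
// splitting the candidates into three groups, comparing the first two
// groups on the "balance," and recursing on whichever group must hold
// the heavy ball. Nine balls take ceil(log3 9) = 2 weighings.
public class HeavyBall {
    public static int weighings = 0; // counts balance uses, for illustration

    public static int findHeavy(int[] weight, int left, int right) {
        if (left == right)
            return left; // only one candidate remains
        int size = (right - left + 3) / 3; // a third, rounded up
        long first = 0, second = 0;
        for (int i = 0; i < size; i++) {
            first += weight[left + i];          // left pan
            second += weight[left + size + i];  // right pan
        }
        weighings++;
        if (first > second)
            return findHeavy(weight, left, left + size - 1);
        else if (second > first)
            return findHeavy(weight, left + size, left + 2 * size - 1);
        else
            return findHeavy(weight, left + 2 * size, right); // it was left out
    }
}
```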

The cool thing is… yes, this is "cool" in the CS sense, not in the real sense. Anyway, the cool thing is that we are trisecting the search space each time. This means that it takes log3 n weighings to find the heaviest ball. We could do 27 balls in 3 weighings, 81 balls in 4 weighings, etc.

- Sorting
- Lab 12

- Finish Project 4
- Due Friday before midnight