Object-Oriented Design CSC 212

Announcements
- Ask more questions! Your fellow students have the same questions (remember, I grade the daily quizzes), and a different explanation can often help clarify the matter.
- Homework #1 is on the web: due before class on Thursday. No user I/O is required; I state explicitly when it is.

Recursion
A recursive function defines new values using earlier results.
- E.g., the Fibonacci sequence: 1, 1, 2, 3, 5, 8, 13, 21, …
- The value of each new term is the sum of the two preceding ones.

Recursion
Recursive definitions have two parts:
- Base case: solved non-recursively (often with constant definitions).
  n! = 1, if n = 1
  The first two terms of the Fibonacci sequence are defined as 1.
- Recursive case: solved using the function being defined.
  n! = n * (n-1)!, for n > 1

Recursion Continued
Using recursion can simplify code:
    public static int factorial(int num) {
        if (num == 1)
            return 1;
        else
            return num * factorial(num - 1);
    }
But recursion can also bring problems:
- What does factorial(-2) return?
- Recursion can also be very slow.
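As an aside (not on the slide): factorial(-2) never reaches the base case, so the calls pile up until the JVM throws a StackOverflowError. A minimal guarded sketch, assuming we simply want to reject such inputs:

    // Hypothetical guarded variant, not the slide's version: reject inputs
    // that would never reach the base case instead of recursing forever.
    public static int safeFactorial(int num) {
        if (num < 1) {
            throw new IllegalArgumentException("num must be at least 1");
        } else if (num == 1) {
            return 1;                               // base case
        } else {
            return num * safeFactorial(num - 1);    // recursive case
        }
    }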

Another Recursion Example
- Can have multiple base cases.
- Recursion can also be done multiple times.
    public int fibonacci(int n) {
        if (n < 0) {
            return 0;
        } else if (n < 2) {
            return 1;
        } else {
            return fibonacci(n - 1) + fibonacci(n - 2);
        }
    }
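To see why this version can be very slow (my instrumentation, not the slide's), count how many calls it makes:

    // Same logic as above, plus a call counter; fibonacciCounted(30) makes
    // over two million calls because subproblems are recomputed repeatedly.
    static long calls = 0;

    static int fibonacciCounted(int n) {
        calls++;
        if (n < 0)      return 0;
        else if (n < 2) return 1;
        else            return fibonacciCounted(n - 1) + fibonacciCounted(n - 2);
    }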

Mutual Recursion
Recursion can also occur across methods:
    public static int factEven(int n) {
        return n * factOdd(n - 1);
    }
    public static int factOdd(int n) {
        if (n == 1)
            return 1;
        else
            return n * factEven(n - 1);
    }
We will see more complex methods of recursion later.
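For intuition, a hypothetical driver (mine, not the slide's) tracing how a call bounces between the two methods:

    // factEven(4) -> 4 * factOdd(3) -> 4 * 3 * factEven(2)
    //            -> 4 * 3 * 2 * factOdd(1) -> 4 * 3 * 2 * 1 = 24
    public static void main(String[] args) {
        System.out.println(factEven(4));    // prints 24
    }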

Aside: Logarithms
If B^K = N, then log_B N = K.
- B is the base of the logarithm.
- Unless stated otherwise, in CSC the logarithm base is 2, so log N really means log₂ N.
- log N = K if and only if 2^K = N.
- log 16 = 4; log 1,024 = 10; log 1,000,000,000 ≈ 30.
- Logarithmic functions grow very slowly.

Logarithm Examples
The number of bits required to store a binary number is logarithmic:
- 8 bits store 256 values; log 256 = 8.
- The maximum value of a Java int is 2,147,483,647 (just under 2^31); log 2,147,483,648 = 31.
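A minimal Java sketch of this relationship (my example, not the slide's): repeatedly halving n strips one binary digit per step, so the loop below runs ⌊log₂ n⌋ times.

    // Returns floor(log2 n) for n >= 1 by counting how many halvings reach 1.
    static int log2Floor(int n) {
        if (n < 1) throw new IllegalArgumentException("n must be positive");
        int k = 0;
        while (n > 1) {
            n /= 2;     // drop one binary digit
            k++;
        }
        return k;
    }

For example, log2Floor(256) returns 8 and log2Floor(1024) returns 10.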

Logarithm Examples
The inventor of chess asked the Emperor to be paid like this:
- 1 grain of rice on the first square, 2 on the next, and so on: each square holds twice the grain of the previous one.
- This function grows exponentially, i.e., 2^n, the inverse of a logarithm.
- Emperors like clever games, but not always the game designers: the chess inventor was beheaded.

Analysis Techniques
Running time is important when coding:
- Obviously true for real-time systems.
- But it also holds for most other systems.
It is not always possible to compare the times of all algorithms:
- There are lots of ways to solve a single problem.
- Many different implementations are possible for each solution.

Analysis Techniques
We want a way of examining an algorithm that ignores the effect of the compiler, hardware, etc. Consider the algorithm across different inputs, including (especially) the worst possible case. How do we do this analysis without dealing with implementation issues?

The Pseudo-Code Answer
Analysis is only for human eyes:
- Don't bother with the details needed to make code compile.
- Instead, use "pseudo-code".
Pseudo-code isn't real:
- It is the name used when writing an algorithm in a computer-language-like manner.

The Pseudo-Code Answer
Pseudo-code includes all the important code:
- E.g., loops, assignments, method calls, etc.
- This helps us analyze the algorithm.
Pseudo-code isn't formal; it is only used to understand the algorithm:
- Ignore unimportant punctuation and formalisms.
- Write pseudo-code so people can understand and analyze it.

Pseudo-code Example
What is this function computing?
    int exampleFunction(int n, n > 0)
        returnVariable ← 1
        while (n > 0)
            returnVariable ← returnVariable * n
            n ← n - 1
        return returnVariable
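One possible Java rendering of the pseudo-code above (not part of the slides; ← is read as assignment):

    // Direct translation of the pseudo-code, assuming n > 0 as stated.
    static int exampleFunction(int n) {
        int returnVariable = 1;
        while (n > 0) {
            returnVariable = returnVariable * n;
            n = n - 1;
        }
        return returnVariable;
    }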

Algorithm Analysis
When comparing algorithms, we do not want to measure exact times:
- We do not want to do all the coding and testing.
- Instead, we want back-of-the-envelope measures that provide a quick and easy evaluation and comparison.
- The implementation often affects execution times anyway!

Big-Oh Notation
Big-Oh describes code complexity:
- Provides a worst-case analysis of performance.
- Execution time is related to code complexity.
- Enables comparison between algorithms:
  - Can use a pseudo-code description of the algorithm.
  - Do not need to implement all approaches.
  - Avoids comparing details not related to the algorithms, e.g., compiler, CPU, the user's typing speed.

Algorithmic Analysis

Algorithm Analysis
Approximate time to run a program with n inputs on a 1 GHz machine:

               | n = 10   | n = 50    | n = 100        | n = 1000  | n = 10^6
    O(n log n) | 35 ns    | 200 ns    | 700 ns         | 10000 ns  | 20 ms
    O(n^2)     | 100 ns   | 2500 ns   | 10000 ns       | 1 ms      | 17 min
    O(n^5)     | 0.1 ms   | 0.3 s     | 10.8 s         | 11.6 days | 3×10^13 years
    O(2^n)     | 1000 ns  | 13 days   | 4×10^13 years  | Too long! | Too long!
    O(n!)      | 4 ms     | Too long! | Too long!      | Too long! | Too long!

Big-Oh Notation
We want correct results for any data set, so we only consider details affecting large data sets:
- Ignore multipliers: O(5n) = O(2n) = O(n).
  - Constant multipliers are affected by the implementation.
  - Coding tricks can often reduce these factors anyway.
- Use only the dominating term: O(n^5 + n^2) = O(n^5).
  - Does an extra 17 minutes matter after 3×10^13 years?

Analysis of Algorithms
Individual statements:
- E.g., method calls, assignments, arithmetic, …
- O(1) complexity, also called "constant time".
- This also holds for a sequence of statements, provided the statements (including loops) execute a constant number of times for all input sizes.
- Remember: we only want a rough estimate, so we ignore constant multipliers.

Analysis of Algorithms
Simple loops:
    for (int i = 0; i < n; i++) {
        S
    }
- The for statement is executed n times.
- If S is a simple sequence of statements (i.e., complexity O(1)), the total complexity is n * O(1) = O(n).
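A concrete instance of this pattern (my example, not the slide's): summing an array visits each element once, so the O(1) loop body runs n times and the whole method is O(n).

    // O(n): one pass over the array; the body does O(1) work per element.
    static int sum(int[] a) {
        int total = 0;
        for (int i = 0; i < a.length; i++) {
            total += a[i];
        }
        return total;
    }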

Analysis of Algorithms, cont.
Slightly more complicated loops:
    for (int i = 0; i < n; i += 2) {
        S
    }
- i takes the values 0, 2, 4, ... until it is larger than n.
- The for loop executes n/2 times.
- If S executes in O(1) time, the loop complexity is n/2 * O(1) = O(n/2) = O(½n) = O(n).

Analysis of Algorithms, cont.
Nested loops:
    for (int i = 0; i < n; i++) {
        for (int j = 0; j < n; j++) {
            S
        }
    }
- If S executes in constant time (i.e., O(1)), the complexity of the j loop is n * O(1) = O(n).
- The i loop's complexity = n * (the j loop's complexity) = n * O(n) = O(n^2).
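Another concrete instance (my example, not the slide's): comparing every array entry against every other entry is a doubly nested loop over n, hence O(n^2) in the worst case.

    // O(n^2): the O(1) comparison runs up to n times for each of the n outer iterations.
    static boolean hasDuplicate(int[] a) {
        for (int i = 0; i < a.length; i++) {
            for (int j = 0; j < a.length; j++) {
                if (i != j && a[i] == a[j]) {
                    return true;                // found two equal entries
                }
            }
        }
        return false;
    }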

Analysis of Algorithms, cont.
Complex nested loops:
    for (int m = 0; m < n; m++) {
        for (int i = 0; i < m; i++) {
            S
        }
    }
- The outer loop executes n times.
- The inner loop executes m times.
- Assume S executes in constant time.

Analysis of Algorithms, cont.
The total number of executions of S is:
    1 + 2 + 3 + … + (n-1)
    = (1 + n-1) + (2 + n-2) + … + ((n-1)/2 + (n+1)/2)
    = n + n + … + n            (about n/2 pairs, each summing to n)
    ≈ n * n/2 = 0.5 * n^2 = O(n^2)
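One quick way to sanity-check this arithmetic (my sketch, not the slides'): count the inner-loop executions directly and compare them to n(n-1)/2.

    // Counts how many times S would run in the nested m/i loops above,
    // then prints it next to the closed form n*(n-1)/2 (the two always match).
    static void checkInnerCount(int n) {
        long count = 0;
        for (int m = 0; m < n; m++) {
            for (int i = 0; i < m; i++) {
                count++;                        // stands in for one execution of S
            }
        }
        System.out.println(count + " vs " + (long) n * (n - 1) / 2);
    }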

Analysis of Algorithms, cont.
Complex nested loops:
- The Big-Oh notation matches the previous nested loops.
- The minor improvement in the inner loop didn't change the big picture, even though the execution time may be half as much.
- Big-Oh cannot be used to measure small improvements.

Analysis of Algorithms, cont.
Loops with 'jumps':
    for (int i = 1; i < n; i *= 2) {
        S
    }
- i equals 1, 2, 4, ... until it exceeds n.
- The for loop executes about 1 + log₂ n times.
- If S executes in O(1) time, the loop complexity is (log₂ n + 1) * O(1) = O(log n + 1) = O(log n).
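A small sketch (mine, not the slide's) that counts the doubling-loop iterations so you can compare against log₂ n:

    // Counts iterations of the doubling loop above; the result is roughly log2(n).
    static int countDoublings(int n) {
        int iterations = 0;
        for (int i = 1; i < n; i *= 2) {
            iterations++;
        }
        return iterations;      // e.g., countDoublings(1024) returns 10
    }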

Quick Analysis Tricks
Analyzing nested loops:
- The complexity is the product of the loops' complexities.
- What is the complexity of the following code?
    for (int i = 0; i < n; i++)
        for (int j = 0; j < i; j++)
            for (int k = 0; k < n; k++)
                for (int m = 0; m < k; m += 2)
                    for (int q = 0; q < j; q++) {
                        S
                    }

Quick Analysis Tricks
Analyzing consecutive loops:
- The complexity will be the longest loop's complexity.
- What is the complexity of 5 consecutive loops: 4 loops from 1 to n and 1 loop from 1 to n^2?

Experimental Verification
Occasionally, you may want to verify that you have determined the correct complexity. To do this verification:
- Implement the algorithm in a real programming language.
- Pick an initial number of inputs (i.e., n).
- Measure the time needed to run the program.

Experimental Verification
Verifying the Big-Oh analysis: increase n by a factor of 10 and run it again (a timing sketch follows below).
- If logarithmic (O(log n)), it barely grows: only about log₂ 10 ≈ 3.3 extra steps.
- If linear (O(n)), it takes 10x longer.
- If O(n log n), it takes about 13x longer.
- If quadratic (O(n^2)), it takes 100x longer.
- If exponential (O(2^n)), it takes over 1000x longer.
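One way to run this check is a rough timing harness like the sketch below (mine, not the slides'; pass in whatever algorithm you implemented as the first argument):

    // Rough timing sketch: run the algorithm at size n and at 10*n and
    // report how much longer the bigger run took. System.nanoTime() is
    // noisy, so real measurements should repeat each size and average.
    static void compareSizes(java.util.function.IntConsumer algorithm, int n) {
        long start = System.nanoTime();
        algorithm.accept(n);                    // run at input size n
        long timeN = System.nanoTime() - start;

        start = System.nanoTime();
        algorithm.accept(10 * n);               // run at input size 10n
        long time10N = System.nanoTime() - start;

        System.out.println("time grew by a factor of " + (double) time10N / timeN);
    }

For example, passing a quadratic algorithm should print a factor near 100.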

Limitations of Big-Oh Analysis
Constants can make a difference:
- For n < 8, 3n is larger than n log n.
Big-Oh ignores differences in the time needed to access data in main memory versus on a disk:
- Disks can take thousands of times longer to read.
The worst case rarely happens:
- Big-Oh notation often overestimates the total time.

Daily Quiz #1
Finish writing the findMin and findMax methods without using any loops:
    public void printMinAndMax(int[] a) {
        if (a.length < 1) return;
        System.out.println("Min entry value: " + Integer.toString(findMin(a, 0)));
        System.out.println("Max entry value: " + Integer.toString(findMax(a, 0)));
    }
    public int findMin(int[] a, int n) { … }
    public int findMax(int[] a, int n) { … }
Hint: Use n to determine when to stop the recursion.

Daily Quiz #2
GIVEN:
    public abstract class Person { ... }
    public interface Worker { ... }
    public class Student extends Person { ... }
    public class Employee extends Person implements Worker { ... }
    public class StudentEmp extends Student implements Worker { ... }
WHICH OF THESE ARE ILLEGAL? OK? NEED CASTING?
    Person p1 = new Person();
    Person p2 = new Student();
    Person p3 = new Employee();
    Person p4 = new StudentEmp();
    Worker w2 = p2;
    Worker w3 = p3;
    Worker w4 = p4;
    Student s2 = p2;
    Student s3 = p3;
    Student s4 = p4;
    Employee e3 = p3;
    Employee e4 = p4;
    StudentEmp se3 = p3;
    StudentEmp se4 = p4;

Daily Quiz #2
Not using these slides, write each of the following:
- A method executing in O(1) time
- A method executing in O(n) time
- A method executing in O(n log n) time
- A method executing in O(n^2) time
- A method executing in O(n^4 log n) time