Object (Data and Algorithm) Analysis. Cmput 115, Lecture 5. Department of Computing Science, University of Alberta. ©Duane Szafron 1999.


Object (Data and Algorithm) Analysis. Cmput 115, Lecture 5. Department of Computing Science, University of Alberta. ©Duane Szafron 1999. Some code in this lecture is based on code from the book Java Structures by Duane A. Bailey, or the companion structure package. Revised 12/31/99.

About This Lecture
In this lecture we will learn how to compare the relative space and time efficiencies of different software solutions.

Outline
– Abstraction
– Implementation choice criteria
– Time and space complexity
– Asymptotic analysis
– Worst, best and average case complexity

Abstraction in Software Systems
A software system is an abstract model of a real-world phenomenon. It is abstract since some of the properties of the real world are ignored in the system. For example, a banking system does not usually model the height and weight of its customers, even though it represents its customers in the software. The Customer class is only an abstraction of a real customer, with details missing.

Design Choices
When a software system is designed, the designer has many choices about what abstractions to make. For example, in a banking system, the designer can choose to design a Customer object that knows its client number and password. Alternatively, each Customer object could know its BankCard object, which knows its client number and password.

Implementation Choices
After a software system is designed, the programmer has many choices to make when implementing classes. For example, in a banking system, should an Account object remember an Array of Transactions or a Vector of Transactions? Should a sort method use a selection sort or a merge sort? In this course, we will focus on choices at the class and method level.

Choice Criteria
What criteria should be used to make a choice at the implementation level? Historically, these two criteria have been considered the most important:
– The amount of space that an object uses and the amount of temporary storage used by its methods.
– The amount of time that a method takes to run.

Other Choice Criteria
However, we should also consider:
– The simplicity of an object representation.
– The simplicity of an algorithm that is implemented as a method.
Why? These criteria are important because they directly affect:
– The amount of time it takes to implement a class.
– The number of defects (bugs) in classes.
– The amount of time it takes to update a class when requirements change.

Comparing Methods
Which is better, a selection sort or a merge sort?
– We can implement each sort as a method.
– We can count the amount of storage space each method requires for sample data.
– We can time the two methods on sample data and see which one is faster.
– We could time how long it takes a programmer to write the code for each method.
– We can count the defects in the two implementations.

Comparing Average Methods
Unfortunately, our answers would depend on the sample data and the individual programmer chosen. To circumvent this problem, we could compute averages over many sample data sets and over many programmers. This would be a long and expensive process. We need a way to predict the answers by analysis rather than experimentation.

Time and Space Complexity
The minimum time and space requirements of a method can be determined from the algorithm. Since these values depend on the data size, we often describe them as functions of the data size (referred to as n). These functions are called, respectively, the time complexity, time(n), and the space complexity, space(n), of the algorithm.

Space Complexity
It is easy to compute the space complexity for many algorithms:
– A selection sort of one-word elements has space complexity space(n) = n + 1 words: it takes n words to store the elements and one extra word for swapping elements.
– A merge sort of one-word elements has space complexity space(n) = 2n words: it takes n words to store the elements and n words for swapping elements during merges.

Time Complexity - Operations
The time complexity of an algorithm is more difficult to compute:
– The time depends on the number and kind of machine instructions that are generated.
– The time depends on the time it takes to run machine instructions.
One approach is to pick some basic operation and count these operations. For example, if we count addition operations in an algorithm that adds the elements of a list, the time complexity for a list with n elements is n.
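The counting idea can be sketched in Java (my illustration; the class and field names are not from the lecture):

```java
public class AddCount {
    // Count the chosen basic operation (addition) while summing an array.
    // For a list of n elements there are exactly n additions, so the
    // operation count, as a function of the data size, is n.
    static int additions;

    static int sum(int[] list) {
        additions = 0;
        int total = 0;
        for (int i = 0; i < list.length; i++) {
            total = total + list[i];
            additions++;              // one basic operation per element
        }
        return total;
    }

    public static void main(String[] args) {
        int[] data = {4, 8, 15, 16, 23, 42};
        System.out.println(sum(data) + " computed in " + additions + " additions");
    }
}
```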

Data Size is Important
If we are sorting 100 values, it probably doesn't matter which sort we use:
– The time and space complexities are irrelevant.
– However, the simplicity may be a factor.
If we are sorting 20,000 values, it may matter:
– 16 sec (selection sort) versus 1 sec (merge sort) on current hardware.
– 20,001 words (selection sort) versus 40,000 words (merge sort).
– The simplicity difference is still the same.
If we are sorting 1,000,000 values, it does matter:
n^2 = 1,000,000,000,000
n*log2(n) = 1,000,000 * 20 = 20,000,000
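These growth rates can be tabulated directly (my sketch, not lecture code; it only evaluates the two functions, as rough proxies for selection sort and merge sort operation counts):

```java
public class GrowthRates {
    // n^2 grows vastly faster than n*log2(n) as n increases.
    static long quadratic(long n) {
        return n * n;
    }

    static long nLogN(long n) {
        // log2(x) = ln(x) / ln(2)
        return (long) (n * (Math.log(n) / Math.log(2)));
    }

    public static void main(String[] args) {
        for (long n : new long[]{100, 20_000, 1_000_000}) {
            System.out.printf("n=%d  n^2=%d  n*log2(n)=%d%n",
                              n, quadratic(n), nLogN(n));
        }
    }
}
```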

Asymptotic Analysis
We are often only interested in the time and space complexities for large data sizes n. That is, we are interested in the asymptotic behavior of time(n) and space(n) in the limit as n goes to infinity. A function f(n) is O(g(n)) ("big-O" of g(n)) if and only if:
lim (n → ∞) f(n)/g(n) = c, where c is a constant.
If you are uncomfortable with limits, then the textbook has an alternate definition.
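As a worked instance of this limit definition (my example, not from the slides), f(n) = 3n^2 + 2n + 7 is O(n^2):

```latex
\lim_{n \to \infty} \frac{3n^2 + 2n + 7}{n^2}
  = \lim_{n \to \infty} \left( 3 + \frac{2}{n} + \frac{7}{n^2} \right)
  = 3
```

The limit is a finite constant (c = 3), so f(n) is O(n^2); the lower-order terms 2n and 7 vanish in the limit.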

Constant Time - O(1) Algorithms
The time to assign a value to an array element:
– It takes a constant time, say c1.
– time(n) is O(1) since lim (n → ∞) {c1 / 1} = c1.
The time to add two integers stored in memory:
– Accessing each int takes a constant time, say c1.
– Adding the two ints takes a constant time, say c2.
– time(n) is O(1) since lim (n → ∞) {(c1 + c2) / 1} = c1 + c2.
    myArray[0] = 1;
    first + second

Linear Time - O(n) Algorithms
The time to add the values in an array is linear:
– Initializing the sum and index to zero takes c1.
– Each addition and store takes c2.
– Each loop index increment and comparison to the stop index takes c3.
– time(n) is O(n) since lim (n → ∞) {(c1 + n*c2 + n*c3) / n} = c2 + c3.
    sum = 0;
    for (i = 0; i < list.length; i++)
        sum = sum + list[i];

Polynomial Time - O(n^c) Algorithms
The time to add up the elements in a square matrix of size n is quadratic (n^c with c = 2):
– Initializing the running sum to zero takes c1.
– Each addition takes c2.
– Each outer loop index modification takes c3.
– Each inner loop index modification takes c3.
– time(n) is O(n^2) since lim (n → ∞) {(c1 + n^2*c2 + (n + n^2)*c3) / n^2} = c2 + c3.
    sum = 0;
    for (row = 0; row < aMatrix.height(); row++)
        for (column = 0; column < aMatrix.height(); column++)
            sum = sum + aMatrix.elementAt(row, column);

Exponential Time - O(c^n) Algorithms
The time to print each element of a full binary tree of depth n is exponential (c^n with c = 2):
– Each print takes c1.
– Each left node and right node access takes c2.
– Each stop check takes c3.
– time(n) is O(2^n) since lim (n → ∞) {(c1*(2^n - 1) + c2*(2^n - 2) + c3*(2^n - 2)) / 2^n} = c1 + c2 + c3.
    System.out.println(aNode.value());
    leftNode = aNode.left();
    if (leftNode != null) leftNode.print();
    rightNode = aNode.right();
    if (rightNode != null) rightNode.print();
The number of nodes in the tree is 2^n - 1.
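A runnable version of the tree print can be sketched as follows (my own minimal Node class; the lecture's structure package is not assumed):

```java
public class TreePrint {
    // Minimal binary tree node for the sketch; names are hypothetical.
    static class Node {
        int value;
        Node left, right;
        Node(int value, Node left, Node right) {
            this.value = value;
            this.left = left;
            this.right = right;
        }
        // Pre-order print into a buffer; returns the number of nodes
        // visited. A full tree of depth n has 2^n - 1 nodes, so the
        // work is O(2^n) in the depth.
        int print(StringBuilder out) {
            out.append(value).append(' ');
            int visited = 1;
            if (left != null)  visited += left.print(out);
            if (right != null) visited += right.print(out);
            return visited;
        }
    }

    // Build a full binary tree of the given depth (null when depth is 0).
    static Node fullTree(int depth) {
        if (depth == 0) return null;
        return new Node(depth, fullTree(depth - 1), fullTree(depth - 1));
    }

    public static void main(String[] args) {
        StringBuilder out = new StringBuilder();
        int count = fullTree(3).print(out);   // depth 3: 2^3 - 1 = 7 nodes
        System.out.println(count + " nodes printed: " + out);
    }
}
```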

Variable Time Algorithms
The time complexity of many algorithms depends not only on the size of the data (n), but also on the specific data values:
– The time to find a key value in an indexed container depends on the location of the key in the container.
– The time to sort a container depends on the initial order of the elements.

Worst, Best and Average Complexity
We define three common cases:
– The best case complexity is the smallest value for any problem of size n.
– The worst case complexity is the largest value for any problem of size n.
– The average case complexity is the weighted average over all problems of size n, where the weight of a problem is the probability that the problem occurs.

Worst Case Complexity Example
Assume we are doing a sequential search on a container of size n. The worst case is when the element is not in the container, so all n elements must be examined. [Index/value diagram omitted.]
The worst case complexity is O(n).

Best Case Complexity Example
Assume we are doing a sequential search on a container of size n. The best case is when the element is the first one in the container.
The best case complexity is O(1).
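Both cases can be seen in a small instrumented sequential search (my sketch; the class and field names are hypothetical):

```java
public class SequentialSearch {
    // Number of element comparisons made by the last call to search().
    static int comparisons;

    // Returns the index of key in a, or -1 if it is absent.
    // Best case: key at index 0, 1 comparison -> O(1).
    // Worst case: key absent, n comparisons -> O(n).
    static int search(int[] a, int key) {
        comparisons = 0;
        for (int i = 0; i < a.length; i++) {
            comparisons++;
            if (a[i] == key) return i;
        }
        return -1;
    }

    public static void main(String[] args) {
        int[] a = {7, 3, 9, 4};
        System.out.println(search(a, 7) + " after " + comparisons + " comparisons"); // best case
        System.out.println(search(a, 5) + " after " + comparisons + " comparisons"); // worst case
    }
}
```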

Average Case Complexity Example
To compute the average case complexity, we must make assumptions about the probability distribution of the data sets:
– Assume a sequential search on a container of size n.
– Assume a 50% chance that the key is in the container.
– Assume equal probabilities that the key is at any position.
The probability that the key is at any particular location in the container is (1/2) * (1/n) = 1/(2n).
The average case complexity is:
1*(1/2n) + 2*(1/2n) + … + n*(1/2n) + n*(1/2)
= (1/2n)*(1 + 2 + … + n) + (n/2)
= (1/2n)*(n(n+1)/2) + (n/2)
= (n+1)/4 + (n/2)
= (3n + 1)/4
= O(n)
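The closed form can be checked numerically (my sketch, assuming the same probability model as the slide):

```java
public class AverageCase {
    // Exact expected number of comparisons for sequential search under
    // the slide's assumptions: 50% chance the key is present, uniform
    // over the n positions; a miss always costs n comparisons.
    static double expectedComparisons(int n) {
        double sum = 0.0;
        for (int k = 1; k <= n; k++) {
            sum += k * (1.0 / (2.0 * n));   // key at position k: k comparisons
        }
        return sum + n * 0.5;               // key absent: n comparisons
    }

    public static void main(String[] args) {
        int n = 100;
        // Closed form from the slide: (3n + 1) / 4
        System.out.println(expectedComparisons(n) + " vs " + (3.0 * n + 1) / 4);
    }
}
```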

Trading Time and Space
It is often possible to construct two different algorithms, where one is faster but takes more space. For example:
– A selection sort has worst case time complexity O(n^2) while using n + 1 words of space.
– A merge sort has worst case time complexity O(n*log(n)) while using 2n words of space.
Analysis lets us choose the best algorithm based on our time and space requirements.

Principles
Big-O analysis is generally applicable only for large n.

Trace of Selection Sort
Each pass i selects the smallest element remaining in the unsorted portion and swaps it into position i, so the sorted portion grows by one element per pass. [Pass-by-pass trace table omitted.]
space(n) = n + 1
time(n) = n + (n-1) + (n-2) + … + 1 = n(n+1)/2 = O(n^2)
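A selection sort matching these counts can be sketched as follows (my code, not the lecture's):

```java
public class SelectionSort {
    // In-place selection sort: space(n) = n + 1 words (the array plus a
    // single temporary for swapping); the nested scans give the slide's
    // n(n+1)/2 step count, i.e. O(n^2) time.
    static void sort(int[] a) {
        for (int i = 0; i < a.length - 1; i++) {
            int min = i;
            for (int j = i + 1; j < a.length; j++) {
                if (a[j] < a[min]) min = j;   // find smallest remaining element
            }
            int temp = a[i];                  // the one extra word of space
            a[i] = a[min];
            a[min] = temp;
        }
    }

    public static void main(String[] args) {
        int[] a = {5, 2, 9, 1, 6};
        sort(a);
        System.out.println(java.util.Arrays.toString(a));
    }
}
```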

Trace of Merge Sort
The list is repeatedly decomposed into halves, and the sorted halves are then merged back together. [Trace diagram of the decompose and merge passes omitted.]
space(n) = 2n
time(n) = (1 + 2 + 4 + … + 2^(log(n)-1)) + (n*1 + (n/2)*2 + (n/4)*4 + … + 1*n)
= n + n*log(n) = O(n*log n)
Recall: x = 2^n means log2(x) = n; for example, 8 = 2^3, so log2(8) = 3.
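A merge sort with the 2n-word space usage can be sketched as follows (my code, not the lecture's):

```java
public class MergeSort {
    // Merge sort with one auxiliary array of n words, matching the
    // slide's space(n) = 2n: the data plus a same-sized merge buffer.
    static void sort(int[] a) {
        int[] aux = new int[a.length];        // the extra n words
        sort(a, aux, 0, a.length - 1);
    }

    private static void sort(int[] a, int[] aux, int lo, int hi) {
        if (lo >= hi) return;                 // 0 or 1 element: sorted
        int mid = (lo + hi) / 2;
        sort(a, aux, lo, mid);                // decompose left half
        sort(a, aux, mid + 1, hi);            // decompose right half
        merge(a, aux, lo, mid, hi);           // merge the sorted halves
    }

    private static void merge(int[] a, int[] aux, int lo, int mid, int hi) {
        for (int k = lo; k <= hi; k++) aux[k] = a[k];
        int i = lo, j = mid + 1;
        for (int k = lo; k <= hi; k++) {
            if (i > mid)              a[k] = aux[j++];  // left half exhausted
            else if (j > hi)          a[k] = aux[i++];  // right half exhausted
            else if (aux[j] < aux[i]) a[k] = aux[j++];  // take smaller element
            else                      a[k] = aux[i++];
        }
    }

    public static void main(String[] args) {
        int[] a = {5, 2, 9, 1, 6, 3};
        sort(a);
        System.out.println(java.util.Arrays.toString(a));
    }
}
```

Each merge level touches all n elements and there are about log2(n) levels, which is where the n*log(n) term in the trace comes from.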