

Algorithm Analysis (Big O) CS-341 Dick Steflik

Complexity In examining algorithm efficiency we must understand the idea of complexity: –Space complexity –Time complexity

Space Complexity When memory was expensive and machines didn’t have much, we focused on making programs as space efficient as possible, and developed schemes to make memory appear larger than it really was (virtual memory and memory paging schemes). Although not as important today, space complexity is still important in the field of embedded computing (handheld computer-based equipment like cell phones, palm devices, etc.).

Time Complexity Is the algorithm “fast enough” for my needs? How much longer will the algorithm take if I increase the amount of data it must process? Given a set of algorithms that accomplish the same thing, which is the right one to choose?

Cases to examine Best case –when the algorithm is executed, the fewest possible number of instructions are executed Average case –executing the algorithm produces path lengths that are, on average, the same Worst case –executing the algorithm produces path lengths that are always a maximum

Worst case analysis Of the three cases, the only really useful case (from the standpoint of program design) is the worst case. Worst case analysis helps answer the software lifecycle question of: –If it’s good enough today, will it be good enough tomorrow?

Frequency Count Examine a piece of code and predict the number of instructions to be executed: for each instruction, predict how many times it will be encountered as the code runs; totaling the counts produces the F.C. (frequency count). Ex:

  Code                          F.C.
  for (int i=0; i< n ; i++)     n+1
  {  cout << i;                 n
     p = p + i; }               n
                                ____
                                3n+1

Order of magnitude In the previous example best_case = avg_case = worst_case, because the example was based just on fixed iteration. Taken by itself, the F.C. is relatively meaningless, but expressed as an order of magnitude we can use it as an estimator of algorithm performance as we increase the amount of data. To convert a F.C. to an order of magnitude: –discard constant terms –disregard coefficients –pick the most significant term The order of magnitude of 3n+1 thus becomes n. If the F.C. is always calculated taking a worst case path through the algorithm, the order of magnitude is called Big O (i.e. O(n)).

Another example Ex:

  Code                            F.C.
  for (int i=0; i< n ; i++)       n+1
    for (int j=0; j < n; j++)     n(n+1) = n²+n
    {  cout << i;                 n·n = n²
       p = p + i; }               n²
                                  ____
                                  3n²+2n+1

discarding constant terms produces: 3n²+2n; clearing coefficients: n²+n; picking the most significant term: n²; Big O = O(n²)

What is Big O Big O is the rate at which performance of an algorithm degrades as a function of the amount of data it is asked to handle. For example: O(n) indicates that performance degrades at a linear rate; O(n²) indicates that the rate of degradation follows a quadratic path.

Common growth rates –O(1) constant –O(log₂n) logarithmic –O(n) linear –O(n log₂n) "n log n" –O(n²) quadratic –O(n³) cubic –O(2ⁿ) exponential

Small collections of data [graph comparing the log₂n and n² curves] For small collections of data, an algorithm with a worse overall big O may actually perform better than an algorithm with a better big O.

Combining Algorithms Sometimes we can increase the overall performance of an algorithm (but not its big O) by combining two algorithms. Suppose we have a large number of data elements to be sorted and the algorithm we pick is O(n log₂n), but we notice that it spends a lot of time doing housekeeping for small sub-collections of data. We can take advantage of the better performance of an O(n²) algorithm on small collections of data by switching algorithms when we notice that the collection size is getting small. This will improve the overall performance (the time that this specific case takes to run) but will not change the big O.

Summary Order-of-magnitude analysis and Big O measure an algorithm’s time requirement as a function of the problem size by using a growth-rate function. This allows you to analyze efficiency without regard to factors like computer speed and the skill of the person coding the solution. When comparing the efficiency of algorithms, you examine their growth-rate functions when the problems are large; only significant differences in growth-rate functions are meaningful. Worst case analysis considers the maximum amount of work an algorithm will require, while average case analysis considers the expected amount of work. Use order-of-magnitude analysis to help you select a particular implementation for an ADT. If your application frequently uses particular ADT operations, your implementation should be efficient for at least those operations.