Algorithm Efficiency Chapter 10

What Is a Good Solution?
A program incurs real and tangible costs:
- Computing time
- Memory required
- Difficulties encountered by users
- Consequences of incorrect actions by the program
A solution is good if the total cost it incurs over all phases of its life is minimal.

What Is a Good Solution?
Important elements of the solution:
- Good structure
- Good documentation
- Efficiency
Be concerned with efficiency when:
- Developing the underlying algorithm
- Choosing objects and designing the interactions between those objects

Measuring Efficiency of Algorithms
Important because the choice of algorithm has a significant impact. Examples:
- Responsive word processors
- Grocery checkout systems
- Automatic teller machines
- Video/gaming machines
- Life-support systems
- E-commerce web sites

Measuring Efficiency of Algorithms
Analysis of algorithms: the area of computer science that provides tools for contrasting the efficiency of different algorithms.
- Comparison of algorithms should focus on significant differences in efficiency (order-of-magnitude differences as input size increases).
- We compare algorithms, not programs.
How do we measure efficiency?
- Space utilization: amount of memory required
- Time efficiency: amount of time required to accomplish the task
Time efficiency depends on:
- size of input (and, for some algorithms, input order)
- speed of machine
- quality of source code
- quality of compiler
These factors vary from one platform to another.

Measuring Efficiency of Algorithms
Difficulties with comparing programs (instead of algorithms):
- How are the algorithms coded?
- What computer will be used?
- What data should the program use?
Algorithm analysis should be independent of specific implementations, computers, and data.
Instead, we can count the number of times instructions are executed; this gives us a measure of the efficiency of an algorithm.
So we measure computing time as:
f(n) = computing time of an algorithm for input of size n
     = number of times the instructions are executed

The Execution Time of Algorithms
An algorithm's execution time is related to the number of operations it requires.
Example: Towers of Hanoi
- The solution for n disks requires 2ⁿ − 1 moves.
- If each move requires time m, the solution requires (2ⁿ − 1) · m time units.
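
Not from the slides: a minimal recursive C++ sketch that counts Towers of Hanoi moves and confirms the 2ⁿ − 1 count. The function name hanoi and the peg labels are assumptions made for this illustration.

    #include <iostream>

    // Move n disks from peg 'from' to peg 'to' using peg 'spare',
    // returning the number of individual disk moves performed.
    long long hanoi(int n, char from, char to, char spare) {
        if (n == 0) return 0;                             // no disks, no moves
        long long moves = hanoi(n - 1, from, spare, to);  // clear n-1 disks off the largest
        ++moves;                                          // move the largest disk
        moves += hanoi(n - 1, spare, to, from);           // put the n-1 disks back on top
        return moves;                                     // totals 2^n - 1
    }

    int main() {
        for (int n = 1; n <= 10; ++n)                     // example sizes only
            std::cout << "n = " << n << ": " << hanoi(n, 'A', 'C', 'B') << " moves\n";
    }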

Example: Calculating the Mean

Task                               # times executed
Initialize the sum to 0            1
Initialize index i to 0            1
While i < n do the following       n + 1
  a) Add x[i] to sum               n
  b) Increment i by 1              n
Return mean = sum/n                1

Total: f(n) = 3n + 4
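
A runnable C++ sketch of the mean calculation tallied above; the function name mean and the raw-array interface are assumptions for illustration.

    #include <cstddef>

    // Computes the mean of the first n elements of x.
    // Operation counts (as in the table): 2 initializations,
    // n + 1 loop tests, n additions, n increments, 1 return -> f(n) = 3n + 4.
    double mean(const double x[], std::size_t n) {
        double sum = 0.0;        // 1: initialize the sum
        std::size_t i = 0;       // 1: initialize the index
        while (i < n) {          // tested n + 1 times (the last test fails)
            sum += x[i];         // n times
            ++i;                 // n times
        }
        return sum / n;          // 1
    }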

Computing Time Order of Magnitude
As the number of inputs increases, f(n) = 3n + 4 grows at a rate proportional to n. Thus f(n) has the "order of magnitude" n.
The computing time f(n) of an algorithm on input of size n is said to have order of magnitude g(n), written f(n) is O(g(n)), defined as:
f(n) is O(g(n)) if and only if there exist positive constants C and N0 such that 0 ≤ f(n) ≤ C·g(n) for all n ≥ N0.
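
A quick worked check of the definition, using the mean-calculation algorithm: f(n) = 3n + 4 ≤ 4n whenever n ≥ 4, so taking C = 4 and N0 = 4 shows that f(n) is O(n).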

Big Oh Notation
Another way of saying this: the complexity of the algorithm is O(g(n)).
Example: for the mean-calculation algorithm, f(n) is O(n).
Note that constants and multiplicative factors are ignored.
In the graph, f(x) ∈ O(g(x)) because there exist c > 0 (e.g., c = 1) and N0 (e.g., N0 = 5) such that f(x) ≤ c·g(x) whenever x ≥ N0. Here the x-axis is n and the y-axis is time.

Algorithm Growth Rates
- Measure an algorithm's time requirement as a function of the problem size.
- Most important thing to learn: how quickly the algorithm's time requirement grows as a function of the problem size.
- The accompanying graph demonstrates the contrast in growth rates.

Big Oh Notation
g(n) is usually a simple function: 1, log₂log₂n, log₂n, n, n log₂n, n², n³, 2ⁿ, ...
Note the graph of common computing times.

Big Oh Notation Graphs of common computing times

Algorithm Growth Rates Time requirements as a function of problem size n

Analysis and Big O Notation
The graphs of 3·n² and n² − 3·n + 10

Analysis and Big O Notation Order of growth of some common functions

Common Computing Time Functions

log₂log₂n   log₂n   n          n log₂n     n²           n³            2ⁿ
---         0       1          0           1            1             2
0.00        1       2          2           4            8             4
1.00        2       4          8           16           64            16
1.58        3       8          24          64           512           256
2.00        4       16         64          256          4096          65536
2.32        5       32         160         1024         32768         4294967296
2.58        6       64         384         4096         262144        1.84467E+19
3.00        8       256        2048        65536        16777216      1.15792E+77
3.32        10      1024       10240       1048576      1.07E+09      1.8E+308
4.32        20      1048576    20971520    1.1E+12      1.15E+18      6.7E+315652

Analysis and Big O Notation
Worst-case analysis:
- Usually the one considered
- Easier to calculate, thus more common
Average-case analysis:
- More difficult to perform
- Must determine the relative probabilities of encountering problems of a given size

Computing in Real Time
Suppose each instruction can be executed in 1 microsecond. For n = 256 instructions, how long do the various f(n) take?

Function      Time
log₂log₂n     3 microseconds
log₂n         8 microseconds
n             0.25 milliseconds
n log₂n       2 milliseconds
n²            65 milliseconds
n³            17 seconds
2ⁿ            3.7E+61 centuries!!
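
Not part of the slides: a short C++ sketch that reproduces the instruction counts behind this table for n = 256 at 1 microsecond per instruction; converting the large values to milliseconds, seconds, or centuries is plain arithmetic.

    #include <cmath>
    #include <iostream>

    int main() {
        const double n = 256.0;                 // problem size from the slide
        const double usPerInstruction = 1.0;    // 1 microsecond per instruction
        const double log2n = std::log2(n);

        const char*  names[]  = { "log2 log2 n", "log2 n", "n", "n log2 n",
                                  "n^2", "n^3", "2^n" };
        const double counts[] = { std::log2(log2n), log2n, n, n * log2n,
                                  n * n, n * n * n, std::pow(2.0, n) };

        for (int i = 0; i < 7; ++i)             // print each f(n) in microseconds
            std::cout << names[i] << ": "
                      << counts[i] * usPerInstruction << " microseconds\n";
    }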

Keeping Your Perspective
The ADT implementation used makes a difference:
- Array-based getEntry is O(1)
- Link-based getEntry is O(n)
When choosing an implementation of an ADT:
- Consider how frequently certain operations will occur
- Seldom-used but critical operations must also be efficient

Keeping Your Perspective
- If the problem size is always small, it may be possible to ignore an algorithm's efficiency.
- Weigh the trade-offs between an algorithm's time and memory requirements.
- Compare algorithms for both style and efficiency.

Efficiency of Searching Algorithms
Sequential search
- Worst case: O(n)
- Average case: O(n)
- Best case: O(1)
Binary search
- Worst case: O(log₂n)
- Average case: O(log₂n)
At the same time, maintaining the array in sorted order requires an overhead cost, which can be substantial.

Sequential Search
Algorithm: linear search(x : integer; a₁, ..., aₙ : distinct integers)
  i = 1
  while (i ≤ n and x ≠ aᵢ)
    i = i + 1
  if i ≤ n then location = i
  else location = 0
{location is the index (subscript) of the term that equals x, or 0 if x is not found}
Give f(n) for sequential search (count operations).
Give the worst-case g(n) for sequential search.
What is the average case for sequential search? On average, where do we find x in the array a?
Give the average-case O(g(n)) for sequential search.
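
A runnable C++ version of the pseudocode above, as a sketch only; because C++ arrays are 0-based, "not found" is reported as -1 here instead of the pseudocode's 0.

    #include <cstddef>

    // Sequential (linear) search: returns the index of the first element equal
    // to x, or -1 if x is not present. The worst case examines all n elements: O(n).
    int linearSearch(const int a[], std::size_t n, int x) {
        for (std::size_t i = 0; i < n; ++i) {   // up to n comparisons
            if (a[i] == x)
                return static_cast<int>(i);     // found: best case is O(1)
        }
        return -1;                              // not found
    }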

Sequential Search Average-Case Analysis
E[X] = Σ (x · Pr{X = x}), assuming a uniform probability distribution; read E[X] as the expectation (average) of the discrete random variable X.
X is a function that maps elements of a sample space to the real numbers. For sequential search, the sample space is the finite set of comparison counts required to find the key. We assume the key is located in the array a.

Comparisons (x)   Pr{X = x}   x · Pr{X = x}
1                 1/n         1/n
2                 1/n         2/n
3                 1/n         3/n
4                 1/n         4/n
5                 1/n         5/n
6                 1/n         6/n
7                 1/n         7/n
...               ...         ...
n                 1/n         n/n

Sum = (1/n) · Σ i, for 1 ≤ i ≤ n
    = n(n+1) / (2n)
    = (n+1)/2
On average, we find the key at the middle of the array.

Binary Search
Algorithm: binary search(x : integer; a₁, ..., aₙ : increasing integers)
  i = 1                      {i is the left endpoint}
  j = n                      {j is the right endpoint}
  while i < j
    m = ⌊(i + j)/2⌋
    if x > aₘ then i = m + 1
    else j = m
  if x = aᵢ then location = i
  else location = -1
{location is the index (subscript) of the term that equals x, or -1 if x is not found}
Give f(n) for binary search.
Give the worst-case g(n) for binary search.
What is the average case for binary search? On average, where do we find x in the array a?
Give the average-case O(g(n)) for binary search.
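
A runnable C++ sketch of the same idea; here the array is 0-indexed and "not found" is -1, which differs slightly from the pseudocode's 1-based indexing.

    #include <cstddef>

    // Binary search over a sorted (non-decreasing) array.
    // Each iteration halves the remaining range, so at most about log2(n)
    // iterations are needed: O(log2 n) in the worst case.
    int binarySearch(const int a[], std::size_t n, int x) {
        if (n == 0) return -1;
        std::size_t i = 0;                 // left endpoint
        std::size_t j = n - 1;             // right endpoint
        while (i < j) {
            std::size_t m = (i + j) / 2;   // midpoint (floor)
            if (x > a[m])
                i = m + 1;                 // key must be in the right half
            else
                j = m;                     // key, if present, is in the left half
        }
        return (a[i] == x) ? static_cast<int>(i) : -1;
    }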

Binary Search Average-Case Analysis
E[X] = Σ (x · Pr{X = x}); read E[X] as the expectation (average) of the discrete random variable X.
X is a function that maps elements of a sample space to the real numbers. For binary search, the sample space is the finite set of comparison counts required to find the key. We assume the key is located in the array a.

Comparisons (x)   Pr{X = x}   x · Pr{X = x}
(You fill in the table.)

On average, where should we find the key in the array?
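
Not part of the original slide: a small C++ sketch that empirically estimates the average number of binary-search comparisons by searching for every key in a sorted array; the size n = 1024 is an assumption, and the output can be used to check your filled-in table.

    #include <iostream>
    #include <vector>

    // Counts the comparisons the binary search above makes while locating
    // key x in the sorted vector a (x is assumed to be present).
    static int comparisons(const std::vector<int>& a, int x) {
        std::size_t i = 0, j = a.size() - 1;
        int count = 0;
        while (i < j) {
            std::size_t m = (i + j) / 2;
            ++count;                       // one comparison against a[m]
            if (x > a[m]) i = m + 1;
            else          j = m;
        }
        ++count;                           // final equality check
        return count;
    }

    int main() {
        const std::size_t n = 1024;
        std::vector<int> a(n);
        for (std::size_t k = 0; k < n; ++k) a[k] = static_cast<int>(k);

        long long total = 0;
        for (std::size_t k = 0; k < n; ++k) total += comparisons(a, a[k]);

        // For n = 1024 every search takes 10 loop comparisons plus the final
        // equality check, so the average printed is 11 -- on the order of log2 n.
        std::cout << "average comparisons: " << static_cast<double>(total) / n << '\n';
    }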

Examples
Assuming a linked list of n nodes, the statements
  Node *cur = head;
  while (cur != nullptr)
  {
    cout << cur->item << endl;
    cur = cur->next;
  } // end while
require ______ assignment(s). The code segment is O(___)?
Consider an algorithm that contains loops of this form (see the sketch below):
  for (i = 1 through n)
    for (j = 1 through i)
      for (k = 1 through 10)
        Task T
If task T requires t time units, the innermost loop on k requires ___ time units, the middle loop on j requires ___ time units, and the code segment is O(____)?
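
As an illustration (not from the slides), a runnable C++ version of the nested loops that counts how many times Task T executes; the problem size n = 100 is an assumption, and the printout lets you check the closed-form count in the comment.

    #include <iostream>

    int main() {
        const int n = 100;          // example problem size (an assumption)
        long long taskRuns = 0;     // how many times "Task T" executes

        for (int i = 1; i <= n; ++i)
            for (int j = 1; j <= i; ++j)
                for (int k = 1; k <= 10; ++k)
                    ++taskRuns;     // stand-in for Task T

        // The loops run 10 * (1 + 2 + ... + n) = 10 * n(n+1)/2 times,
        // which grows proportionally to n^2.
        std::cout << taskRuns << " = " << 10LL * n * (n + 1) / 2 << '\n';
    }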

Examples
Order the following functions from smallest growth rate to largest: n², n, 2ⁿ, log₂n.
Use Big-O notation to specify the asymptotic run time of the following code segments. Assume variables a and b are unsigned ints.
  while (a != 0)
  {
    cout << a << " ";
    a /= 2;
  }
is O(___)?
  if (a > b)
    cout << a << endl;
  else
    cout << b << endl;
is O(___)?

End Chapter 10