The Efficiency of Algorithms

The Efficiency of Algorithms Chapter 7

Chapter Contents
- Motivation
- Measuring an Algorithm's Efficiency
- Big Oh Notation
- Formalities
- Picturing Efficiency
- The Efficiency of Implementations of the ADT List
  - The Array-Based Implementation
  - The Linked Implementation
  - Comparing Implementations

Measuring Algorithm Efficiency
- An algorithm has both time and space requirements, together called its complexity.
- Two types of complexity: space complexity and time complexity.
- Analysis of algorithms: measuring the time or space complexity of an algorithm.
- We usually measure time complexity, since it is typically the more important of the two.
- We cannot compute the actual execution time of an algorithm. Instead, we give a function of the problem size that is directly proportional to the time requirement: the growth-rate function.
- This function measures how the time requirement grows as the problem size grows.
- We usually measure the worst-case time.

Measuring Algorithm Efficiency
Three algorithms for computing 1 + 2 + … + n for an integer n > 0.
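The slide's code figure did not survive this transcript. As a hedged sketch of what three such algorithms typically look like in Java (the method names sumA, sumB, and sumC are hypothetical):

    // Algorithm A: a single loop adds each value in turn -- O(n)
    public static long sumA(int n) {
        long sum = 0;
        for (int i = 1; i <= n; i++)
            sum = sum + i;
        return sum;
    }

    // Algorithm B: nested loops add 1 repeatedly to reach the same total -- O(n^2)
    public static long sumB(int n) {
        long sum = 0;
        for (int i = 1; i <= n; i++)
            for (int j = 1; j <= i; j++)
                sum = sum + 1;
        return sum;
    }

    // Algorithm C: the closed-form formula n(n + 1)/2 -- O(1)
    public static long sumC(int n) {
        return n * (n + 1L) / 2;
    }

These match the orders quoted on the Big Oh slide below: A is O(n), B is O(n^2), and C is O(1).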

Measuring Algorithm Efficiency
The number of operations required by the algorithms.

Measuring Algorithm Efficiency
The number of operations required by the algorithms as a function of n.

Big Oh Notation
Computer scientists use a notation to represent an algorithm's complexity. To say "Algorithm A has a worst-case time requirement proportional to n," we say A is O(n), read "Big Oh of n" or "order of at most n." For the other two algorithms: Algorithm B is O(n^2) and Algorithm C is O(1).

Big Oh Notation
The figure tabulates the magnitudes of typical growth-rate functions evaluated at increasing values of n; they grow in magnitude from left to right: 1 < log2 n < n < n*log2 n < n^2 < n^3 < 2^n. When analyzing the time efficiency of an algorithm, consider large problems; for small problems, the difference between execution times is usually insignificant.

Formalities
Formal mathematical definition of Big Oh: an algorithm's time requirement f(n) is of order at most g(n), written f(n) = O(g(n)), if a positive real number c and a positive integer N exist such that f(n) ≤ c•g(n) for all n ≥ N. Big Oh thus provides an upper bound on a function's growth rate: c•g(n) is an upper bound on f(n) whenever n is sufficiently large.

Formalities
An illustration of the definition of Big Oh.

Example
Show that f(n) = 5*n + 3 is O(n). Take g(n) = n, c = 6, and N = 3: then f(n) = 5n + 3 ≤ 5n + n = 6*g(n) for all n ≥ 3. Why don't we let g(n) = n^2? Taking g(n) = n^2, c = 8, and N = 1 also satisfies the definition, so the conclusion f(n) = O(n^2) is correct, but the bound is not as tight as possible. You want the upper bound to be as small as possible, and you want it to involve simple functions.

Formalities
The following identities hold for Big Oh notation:
- O(k * f(n)) = O(f(n)) for any constant k
- O(f(n)) + O(g(n)) = O(f(n) + g(n))
- O(f(n)) * O(g(n)) = O(f(n) * g(n))
By using these identities and ignoring the smaller terms in a growth-rate function, you can determine the order of complexity with little effort. For example, O(4*n^2 + 50*n - 10) = O(4*n^2) = O(n^2).

Picturing Efficiency
The body of the loop requires a constant amount of time, O(1); executing it once for each of the n loop iterations gives an O(n) algorithm.

Picturing Efficiency
An O(n^2) algorithm.

Picturing Efficiency
Another O(n^2) algorithm.
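The loop figures themselves did not survive the transcript. As a hedged sketch of the two shapes such slides usually show, both of which are O(n^2):

    int sum = 0;

    // Shape 1: a full n-by-n nested loop; the body runs n * n times -- O(n^2)
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
            sum++;

    // Shape 2: a triangular nested loop; the body runs
    // 1 + 2 + ... + n = n(n + 1)/2 times, which is still O(n^2)
    for (int i = 1; i <= n; i++)
        for (int j = 1; j <= i; j++)
            sum++;

By the identities above, n(n + 1)/2 = (1/2)*n^2 + (1/2)*n, and dropping the constant factor and the smaller term leaves O(n^2).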

Question
Consider the following loop:

    for (int i = 1; i <= n; i++)
    {
        for (int j = 1; j <= 5; j++)
            sum = sum + 1;
    } // end for

Using Big Oh notation, what is the order of the computation time? (The inner loop runs a fixed five times no matter how large n is, so the body executes 5n times in all: the computation is O(n).)

Get a Feel for Growth-rate Functions
The effect of doubling the problem size on an algorithm's time requirement: for example, doubling n leaves an O(1) time unchanged, doubles an O(n) time, and quadruples an O(n^2) time, since (2n)^2 = 4n^2.

Get a Feel for Growth-rate Functions
The time required to process a problem of size one million with algorithms of various orders, at a rate of one million operations per second. A programmer can use an O(n^2), O(n^3), or O(2^n) algorithm as long as the problem size is small.
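The figure's numbers did not survive the transcript; the rough times below are recomputed from the stated assumptions (problem size n = 10^6, one million operations per second), so treat them as approximations:

    Growth rate    Operations (approx.)    Time (approx.)
    log2 n         20                      0.00002 seconds
    n              10^6                    1 second
    n * log2 n     2 * 10^7                20 seconds
    n^2            10^12                   11.6 days
    n^3            10^18                   31,700 years
    2^n            10^301030               effectively forever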

Efficiency of Implementations of the ADT List
For the array-based implementation:
- Add to end of list: O(1)
- Add to list at a given position: O(n)
For the linked implementation:
- Add to end of list: O(n), or O(1) if a tail reference is maintained (see the sketch below)
- Retrieve an entry: O(n)
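To make the O(n)/O(1) distinction concrete, here is a minimal sketch (the class and method names are hypothetical, not the textbook's ADT): appending to a singly linked chain costs O(n) when we must walk from the head to find the last node, but O(1) when the list also keeps a tail reference.

    public class LinkedSketch<T> {
        private static class Node<E> {
            E data;
            Node<E> next;
            Node(E data) { this.data = data; }
        }

        private Node<T> head;   // first node, or null if the list is empty
        private Node<T> tail;   // last node; both methods keep it current

        // O(n): without using the tail reference we walk the whole chain
        public void addToEndSlow(T entry) {
            Node<T> newNode = new Node<>(entry);
            if (head == null) {
                head = newNode;
            } else {
                Node<T> current = head;
                while (current.next != null)    // n - 1 steps for n nodes
                    current = current.next;
                current.next = newNode;
            }
            tail = newNode;
        }

        // O(1): the tail reference jumps straight to the last node
        public void addToEndFast(T entry) {
            Node<T> newNode = new Node<>(entry);
            if (head == null)
                head = newNode;
            else
                tail.next = newNode;
            tail = newNode;
        }
    }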

Comparing Implementations
The time efficiencies of the ADT list operations for two implementations, expressed in Big Oh notation.
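The table itself did not survive this transcript. A typical summary consistent with the figures on the previous slide (a hedged reconstruction, not necessarily the book's exact table):

    Operation                          Array-based    Linked
    Add to end of list                 O(1)           O(n), or O(1) with a tail reference
    Add to list at a given position    O(n)           O(n)
    Retrieve an entry                  O(1)           O(n)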

Choosing an Implementation for an ADT
Consider the operations that your application requires. If it uses a particular operation frequently, that operation's implementation has to be efficient. Conversely, if it rarely uses an operation, you can afford one with an inefficient implementation.

Typical Growth-rate Functions
- 1: a problem whose time requirement is constant and therefore independent of the problem size n.
- log2 n: the time requirement of a logarithmic algorithm increases slowly as the problem size increases. If you square the problem size, you only double its time.
- n: the time requirement of a linear algorithm increases directly with the size of the problem. If you square the problem size, you also square its time requirement.
- n * log2 n: increases more rapidly than that of a linear algorithm. Such algorithms usually divide a problem into smaller problems that are each solved separately.
- n^2: the time requirement of a quadratic algorithm increases rapidly with the size of the problem. Algorithms that use two nested loops are often quadratic. Such algorithms are practical only for small problems.
- n^3: the time requirement of a cubic algorithm increases more rapidly than that of a quadratic algorithm. Algorithms that use three nested loops are often cubic.
- 2^n: the time requirement of an exponential algorithm usually increases too rapidly to be practical.
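As a concrete instance of a logarithmic algorithm (a sketch added for illustration, not from the original slides): binary search of a sorted array discards half of the remaining range on every iteration, so the loop runs at most about log2 n times.

    // Returns the index of target in the sorted array, or -1 if absent.
    // Each iteration halves the remaining range, so the loop body
    // executes O(log n) times.
    public static int binarySearch(int[] sorted, int target) {
        int low = 0;
        int high = sorted.length - 1;
        while (low <= high) {
            int mid = low + (high - low) / 2;   // avoids overflow of low + high
            if (sorted[mid] == target)
                return mid;                     // found it
            else if (sorted[mid] < target)
                low = mid + 1;                  // discard the lower half
            else
                high = mid - 1;                 // discard the upper half
        }
        return -1;                              // not present
    }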

Exercises
Using Big Oh notation, indicate the time requirement of each of the following tasks in the worst case. Describe any assumptions that you make.
a. After arriving at a party, you shake hands with each person there.
b. Each person in a room shakes hands with everyone else in the room.
c. You climb a flight of stairs.
d. You slide down the banister.
e. After entering an elevator, you press a button to choose a floor.
f. You ride the elevator from the ground floor up to the nth floor.
g. You read a book twice.

Exercises
Suppose that your implementation of a particular algorithm appears in Java as follows:

    for (int pass = 1; pass <= n; pass++)
    {
        for (int index = 0; index < n; index++)
            for (int count = 1; count < 10; count++)
            {
                . . .
            } // end for
    } // end for

The algorithm involves an array of n items. The previous code shows the only repetition in the algorithm, but it does not show the computations that occur within the loops. These computations, however, are independent of n. What is the order of the algorithm?

Exercises
What is the order of an algorithm that has a growth-rate function of:
a. 8*n^3 - 9*n
b. 7*log2 n + 20
c. 9*log2 n + n
d. n*log2 n + n^2
e. log2 (log2 n) + 3*log2 n + 4