Efficiency of Algorithms Csci 107 Lecture 6
Last time
- Algorithm for pattern matching
- Efficiency of algorithms

Today
- Efficiency of sequential search
- Data cleanup algorithms: copy-over, shuffle-left, converging pointers
- Efficiency of data cleanup algorithms
- Order of magnitude Θ(n), Θ(n²)

(Time) Efficiency of an algorithm
- Worst-case efficiency: the maximum number of steps that an algorithm can take for any collection of data values.
- Best-case efficiency: the minimum number of steps that an algorithm can take for any collection of data values.
- Average-case efficiency: the number of steps averaged over all possible inputs. This requires assuming a distribution of the input; we normally assume the uniform distribution (all inputs equally probable).
- If the input has size n, efficiency will be a function of n.
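The slides define best and worst case in terms of step counts. As a concrete illustration, here is a minimal Python sketch of sequential search that counts its comparisons (the function name and the counter are illustrative, not from the slides):

```python
def sequential_search(data, target):
    """Return (index, comparisons); index is -1 if target is absent."""
    comparisons = 0
    for i, value in enumerate(data):
        comparisons += 1           # one comparison per element examined
        if value == target:
            return i, comparisons  # best case: target is first, 1 comparison
    return -1, comparisons         # worst case: target absent, n comparisons

data = [4, 8, 15, 16, 23, 42]
print(sequential_search(data, 4))    # best case: found at index 0 after 1 step
print(sequential_search(data, 99))   # worst case: not found after n = 6 steps
```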

Order of Magnitude
- Worst case of sequential search: 3n+5 comparisons
- Are these constants accurate? Can we ignore them?
- Simplification: ignore the constants, look only at the order of magnitude
- n, 0.5n, 2n, 4n, 3n+5, 2n+100, 0.1n+3, ... are all linear; we say that their order of magnitude is n
  - 3n+5 is order of magnitude n: 3n+5 = Θ(n)
  - 2n+100 is order of magnitude n: 2n+100 = Θ(n)
  - 0.1n+3 is order of magnitude n: 0.1n+3 = Θ(n)

Efficiency of Copy-Over
- Best case: all values are zero; no copying, no extra space
- Worst case: no zero values; n elements copied, n extra space
  - Time: Θ(n)
  - Extra space: n
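One possible Python rendering of the copy-over idea: scan once, copying each nonzero value into a fresh list, so both the time and the extra space are proportional to the input size in the worst case.

```python
def copy_over(data):
    """Copy the nonzero values into a new list; the original is untouched."""
    result = []                 # worst case: grows to n extra cells
    for value in data:          # always n passes: time is Theta(n)
        if value != 0:
            result.append(value)
    return result

print(copy_over([5, 0, 3, 0, 7]))  # [5, 3, 7]
```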

Efficiency of Shuffle-Left
- Space: no extra space (except a few variables)
- Time:
  - Best case: no zero values, so no copying; order of n = Θ(n)
  - Worst case: all values are zero; every zero found requires copying n-1 values one position to the left, so n × (n-1) = n² - n, which is order of n² = Θ(n²) (why?)
  - Average case: half of the values are zero; n/2 × (n-1) = (n² - n)/2, still order of n² = Θ(n²)
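A possible Python sketch of shuffle-left (variable names are illustrative): when a zero is found, every value to its right shifts one slot left, and the count of legitimate values shrinks by one. The nested shifting loop is what makes the worst case Θ(n²).

```python
def shuffle_left(data):
    """Squeeze out zeros in place; return the count of legitimate values."""
    legit = len(data)
    i = 0
    while i < legit:
        if data[i] == 0:
            # shift everything right of position i one slot left: up to n-1 copies
            for j in range(i, legit - 1):
                data[j] = data[j + 1]
            legit -= 1          # one fewer legitimate value; re-examine slot i
        else:
            i += 1
    return legit

data = [5, 0, 3, 0, 7]
n = shuffle_left(data)
print(data[:n])  # [5, 3, 7], order preserved
```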

Efficiency of the Converging-Pointers Algorithm
- Space: no extra space used (except a few variables)
- Time:
  - Best case: no zero values, so no copying; order of n = Θ(n)
  - Worst case: all values are zero; one copy at each step, n-1 copies in total; order of n = Θ(n)
  - Average case: half of the values are zero; n/2 copies; order of n = Θ(n)
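A possible Python sketch of the converging-pointers idea: a left pointer and a right pointer move toward each other, and each zero costs only a single copy from the right end (the price is that the order of the values is not preserved).

```python
def converging_pointers(data):
    """Squeeze out zeros in place; return the count of legitimate values."""
    left, right = 0, len(data) - 1
    while left <= right:
        if data[left] == 0:
            data[left] = data[right]  # one copy, no matter how long the list is
            right -= 1                # the right-hand value has been used up
            # do not advance left: the copied value might itself be zero
        else:
            left += 1
    return right + 1

data = [5, 0, 3, 0, 7]
n = converging_pointers(data)
print(sorted(data[:n]))  # [3, 5, 7]: same values, order not preserved
```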

Data Cleanup Algorithms
- Copy-over
  - Worst case: time Θ(n), extra space n
  - Best case: time Θ(n), no extra space
- Shuffle-left
  - Worst case: time Θ(n²), no extra space
  - Best case: time Θ(n), no extra space
- Converging pointers
  - Worst case: time Θ(n), no extra space
  - Best case: time Θ(n), no extra space

Order of magnitude  (n 2 ) Any algorithm that does cn 2 work for any constant c –2n 2 is order of magnitude n 2 : 2n 2 =  (n 2 ) –.5n 2 is order of magnitude n 2 :.5n 2 =  (n 2 ) –100n 2 is order of magnitude n 2: 100n 2 =  (n 2 )

Another example
Problem: suppose we have n cities, and the distances between the cities are stored in a table, where entry [i,j] stores the distance from city i to city j.
- How many distances in total? n², one for each ordered pair of cities.
- An algorithm to write out these distances:
  - For each row 1 through n do
    - For each column 1 through n do
      - Print out the distance in this row and column
- Analysis? The print step runs n × n = n² times, so the algorithm is Θ(n²).
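The nested loop above can be sketched in Python. The distance table here is a made-up example for n = 3; the counter just confirms that the print step executes n² times.

```python
n = 3
# hypothetical distance table: dist[i][j] = distance from city i to city j
dist = [[0, 5, 9],
        [5, 0, 4],
        [9, 4, 0]]

count = 0
for i in range(n):          # for each row 1 through n
    for j in range(n):      # for each column 1 through n
        print(f"city {i} -> city {j}: {dist[i][j]}")
        count += 1

print(count)  # n * n = 9 distances printed: the loop is Theta(n^2)
```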

Comparison of  (n) and  (n 2 )  (n): n, 2n+5, 0.01n, 100n, 3n+10,..  (n 2 ): n 2, 10n 2, 0.01n 2,n 2 +3n, n 2 +10,… We do not distinguish between constants.. –Then…why do we distinguish between n and n 2 ?? –Compare the shapes: n 2 grows much faster than n Anything that is order of magnitude n 2 will eventually be larger than anything that is of order n, no matter what the constant factors are Fundamentally n 2 is more time consuming than n –  (n 2 ) is larger (less efficient) than  (n) 0.1n 2 is larger than 10n (for large enough n) n 2 is larger than 1000n (for large enough n)

The Tortoise and the Hare
Does algorithm efficiency matter? ...just buy a faster machine!
Example:
- Pentium Pro: 1 GHz (10^9 instructions per second), $2000
- Cray supercomputer: 10,000 GHz (10^13 instructions per second), $30 million
Run a Θ(n) algorithm on the Pentium and a Θ(n²) algorithm on the Cray. For what values of n is the Pentium faster? The Pentium takes n/10^9 seconds and the Cray takes n²/10^13 seconds, so for n > 10^4 the Pentium leaves the Cray in the dust.
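The crossover can be computed from the figures on the slide. This sketch models each machine's running time in seconds (hardware numbers taken from the slide; the function names are illustrative) and shows the Pentium overtaking the Cray just past n = 10^4.

```python
PENTIUM_SPEED = 10**9    # instructions per second, running a Theta(n) algorithm
CRAY_SPEED = 10**13      # instructions per second, running a Theta(n^2) algorithm

def pentium_time(n):
    return n / PENTIUM_SPEED        # linear algorithm on the slow machine

def cray_time(n):
    return n * n / CRAY_SPEED       # quadratic algorithm on the fast machine

# Pentium wins once n / 10^9 < n^2 / 10^13, i.e. once n > 10^4
for n in (10**3, 10**4, 10**5):
    print(n, "Pentium faster:", pentium_time(n) < cray_time(n))
```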