CSCI-256 Data Structures & Algorithm Analysis, Lecture 4. Note: Some slides by Kevin Wayne. Copyright © 2005 Pearson-Addison Wesley. All rights reserved.



Computational Tractability

A major focus of algorithm design is to find efficient algorithms for computational problems. What does it mean for an algorithm to be efficient?

"As soon as an Analytic Engine exists, it will necessarily guide the future course of the science. Whenever any result is sought by its aid, the question will arise – By what course of calculation can these results be arrived at by the machine in the shortest time?" – Charles Babbage (1864)
[Figure: Analytic Engine (schematic)]

Some Initial Attempts at Defining Efficiency

Proposed Definition of Efficiency (1): An algorithm is efficient if, when implemented, it runs quickly on real input instances
– Where does it run? Even bad algorithms can run quickly when applied to small test cases on extremely fast processors
– How is it implemented? Even good algorithms can run slowly when they are coded sloppily
– What is a "real" input instance? Some instances can be much harder than others
– How well, or badly, does the running time scale as problem sizes grow to unexpected levels?
– We need a concrete definition that is platform-independent, instance-independent, and of predictive value with respect to increasing input sizes

Worst-Case Analysis

Worst-case running time: Obtain a bound on the largest possible running time of the algorithm on any input of a given size N
– Draconian view, but hard to find an effective alternative
– Generally captures efficiency in practice
Average-case running time: Obtain a bound on the running time of the algorithm on random input as a function of input size N
– Hard (or impossible) to accurately model real instances by random distributions
– An algorithm tuned for a certain distribution may perform poorly on other inputs

Brute-Force Search

OK, but what is a reasonable analytical benchmark that can tell us whether a running-time bound is impressive or weak?
– For many non-trivial problems, there is a natural brute-force search algorithm that checks every possible solution (i.e., try all possibilities and see if any one works; N! possibilities for Stable Matching)
– Note that this is an intellectual cop-out; it provides us with absolutely no insight into the problem structure
– Thus, a first simple guide is comparison with brute-force search
Proposed Definition of Efficiency (2): An algorithm is efficient if it achieves qualitatively better worst-case performance than brute-force search
– Still vague: what is "qualitatively better performance"?
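To make the N! benchmark concrete, here is a small illustrative sketch (not from the slides; the three-person preference lists are made up) of a brute-force Stable Matching search that simply tries every one of the N! perfect matchings and checks each for stability:

```python
from itertools import permutations

# Made-up preference lists for a 3-man / 3-woman instance.
# Earlier in the list = more preferred.
men_pref = {0: [0, 1, 2], 1: [1, 0, 2], 2: [0, 1, 2]}
women_pref = {0: [1, 0, 2], 1: [0, 1, 2], 2: [0, 1, 2]}

def is_stable(matching):
    """matching[m] = w. Unstable iff some man m and woman w both
    prefer each other to their assigned partners (a blocking pair)."""
    partner_of_woman = {w: m for m, w in enumerate(matching)}
    for m, w_assigned in enumerate(matching):
        for w in men_pref[m]:
            if w == w_assigned:
                break  # m likes his own partner at least this much
            m2 = partner_of_woman[w]
            # m prefers w; does w also prefer m to her partner m2?
            if women_pref[w].index(m) < women_pref[w].index(m2):
                return False
    return True

def brute_force_stable_matching(n):
    # Try all n! ways to match man i with woman perm[i].
    for perm in permutations(range(n)):
        if is_stable(perm):
            return perm
    return None

print(brute_force_stable_matching(3))
```

Gale–Shapley solves the same problem in O(N²) proposals; the point of the sketch is only that checking all N! candidates is the naive baseline against which such algorithms look "qualitatively better".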

Polynomial Time as a Definition of Efficiency

Desirable scaling property: Algorithms with polynomial running time have the property that increasing the problem size by a constant factor increases the running time by at most a constant factor
An algorithm is polynomial-time if the above scaling property holds: there exist constants c > 0 and d > 0 such that on every input of size N, its running time is bounded by c·Nᵈ steps
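The scaling property can be seen in a toy sketch (the cost function below, with c = 3 and d = 2, is a made-up example, not from the slides): if T(N) = c·Nᵈ, then T(2N)/T(N) = 2ᵈ, a constant independent of N.

```python
# Hypothetical polynomial running time T(N) = 3 * N**2 (c = 3, d = 2).
def T(n):
    return 3 * n ** 2

# Doubling the input size multiplies the running time by exactly
# 2**d = 4, no matter how large N already is -- the scaling property.
for n in [10, 100, 1000]:
    print(n, T(2 * n) / T(n))  # ratio is always 4.0
```

An exponential time such as 2ᴺ fails this test badly: doubling N squares the running time rather than multiplying it by a fixed constant.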

Polynomial Time as a Definition of Efficiency

Proposed Definition of Efficiency (3): An algorithm is efficient if it has a polynomial running time
Justification: It really works in practice!
– Generally, polynomial time seems to capture the algorithms that are efficient in practice
– Although 6.02 × 10²³ · N²⁰ is technically polynomial-time, it would be useless in practice
– In practice, the polynomial-time algorithms that people develop almost always have low constants and low exponents
– Breaking through the exponential barrier of brute force typically exposes some crucial structure of the problem

Polynomial Time as a Definition of Efficiency

One further reason why the mathematical formalism and the empirical evidence line up well in the case of polynomial-time solvability is that the gulf between the growth rates of polynomial and exponential functions is enormous
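A quick illustrative comparison makes the gulf visible (the cost functions N³ and 2ᴺ are made-up step counts, not timings of any real algorithm):

```python
# Tabulate a polynomial (n^3) and an exponential (2^n) step count
# side by side; the exponential column explodes almost immediately.
for n in [10, 30, 100]:
    print(f"n={n:4d}  n^3={n**3:>10,d}  2^n={2**n:.3e}")
```

At N = 100, N³ is a million steps while 2ᴺ exceeds 10³⁰; even a machine doing a trillion steps per second would need longer than the age of the universe for the latter.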

Asymptotic Order of Growth

We could give a very concrete statement about the running time of an algorithm on inputs of size N, such as: on any input of size N, the algorithm runs for at most 1.62N² + 3.5N + 8 steps
– Finding such a precise bound may be an exhausting activity, and more detail than we wanted anyway
– Extremely detailed statements about the number of steps an algorithm executes are often meaningless. Why?

Why Ignore Constant Factors?

Constant factors are arbitrary
– They depend on the implementation
– They depend on the details of the model
Determining the constant factors is tedious and provides little insight

Why Emphasize Growth Rates?

The algorithm with the lower growth rate will be faster for all but a finite number of cases
Performance is most important for larger problem sizes
As memory prices continue to fall, bigger problem sizes become feasible
Improving the growth rate often requires new techniques
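The "faster for all but a finite number of cases" claim can be located concretely. In this sketch (the constant 100 and both cost functions are made-up illustrations, not from the slides), an n log n algorithm with a large constant is compared against an n² algorithm with constant 1:

```python
import math

# Hypothetical costs: a Theta(n^2) algorithm with constant 1 versus a
# Theta(n log n) algorithm burdened with a 100x larger constant.
def quadratic(n):
    return n * n

def nlogn(n):
    return 100 * n * math.log2(n)

# Despite its big constant, the lower-growth algorithm wins from some
# finite crossover point onward; find that point by linear scan.
n = 2
while quadratic(n) <= nlogn(n):
    n += 1
print("n log n is cheaper for every n >=", n)
```

Below the crossover (roughly n ≈ 1000 here) the big constant dominates; above it, growth rate dominates forever, which is why asymptotic analysis emphasizes the latter.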

Formalizing Growth Rates

Upper bounds: T(n) is O(f(n)) if there exist constants c > 0 and n₀ ≥ 0 such that for all n ≥ n₀ we have T(n) ≤ c · f(n)
Lower bounds: T(n) is Ω(f(n)) if there exist constants c > 0 and n₀ ≥ 0 such that for all n ≥ n₀ we have T(n) ≥ c · f(n)
Tight bounds: T(n) is Θ(f(n)) if T(n) is both O(f(n)) and Ω(f(n))
Ex: T(n) = 32n² + 17n + 32
– T(n) is O(n²), O(n³), Ω(n²), Ω(n), and Θ(n²)
– T(n) is not O(n), Ω(n³), Θ(n), or Θ(n³)
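The O(n²) claim for the example can be spot-checked mechanically. The witnesses below are my own choice, not from the slides: take c = 81 and n₀ = 1, since for n ≥ 1 we have 17n ≤ 17n² and 32 ≤ 32n², so T(n) ≤ (32 + 17 + 32)n² = 81n².

```python
# Spot-check the Big-O witnesses for T(n) = 32n^2 + 17n + 32.
# Hypothetical witnesses: c = 81, n0 = 1 (justified algebraically above).
def T(n):
    return 32 * n * n + 17 * n + 32

c, n0 = 81, 1
assert all(T(n) <= c * n * n for n in range(n0, 10_000))
print("T(n) <= 81 * n^2 for all sampled n >= 1")
```

A finite scan is of course only a sanity check, not a proof; the algebraic argument in the lead-in is what establishes the bound for all n ≥ n₀.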