Richard Anderson Autumn 2015 Lecture 4


CSE 421 Algorithms
Richard Anderson, Autumn 2015, Lecture 4

Announcements
Reading:
- Chapter 2.1, 2.2
- Chapter 3 (mostly review)
- Start on Chapter 4
Homework guidelines:
- Prove that your algorithm works; a proof is a "convincing argument"
- Give the run time for your algorithm and justify that the algorithm satisfies the runtime bound
- You may lose points for style

What does it mean for an algorithm to be efficient?

Definitions of efficiency
- Fast in practice
- Qualitatively better worst-case performance than a brute-force algorithm

Polynomial time efficiency
- An algorithm is efficient if it has a polynomial run time
- Run time is measured as a function of problem size
- Run time: count the number of instructions executed on an underlying model of computation
- T(n): maximum run time over all problems of size at most n

Polynomial Time
- Algorithms with polynomial run time have the property that increasing the problem size by a constant factor increases the run time by at most a constant factor (depending on the algorithm)
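A quick numeric sketch of this property (not from the slides; the step-count functions are stand-ins, not real algorithms): doubling the input multiplies a quadratic count by a fixed factor of 4, while the exponential count blows up by a factor that itself grows with n.

```python
def quadratic_steps(n):
    return n**2      # stand-in for an O(n^2) algorithm's step count

def exponential_steps(n):
    return 2**n      # stand-in for an exponential algorithm's step count

for n in (10, 20, 40):
    # Quadratic: doubling n always multiplies the count by exactly 4.
    print(quadratic_steps(2 * n) / quadratic_steps(n))
    # Exponential: doubling n multiplies the count by 2**n, which grows with n.
    print(exponential_steps(2 * n) / exponential_steps(n))
```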

Why Polynomial Time?
- Generally, polynomial time seems to capture the algorithms that are efficient in practice
- The class of polynomial time algorithms has many good mathematical properties

Polynomial vs. Exponential Complexity
- Suppose you have an algorithm which takes n! steps on a problem of size n
- If the algorithm takes one second for a problem of size 10, estimate the run time for problem sizes 12, 14, 16, 18, 20:
  - 10: 1 second
  - 12: 2 minutes
  - 14: 6 hours
  - 16: 2 months
  - 18: 50 years
  - 20: 20,000 years
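The estimates above can be reproduced by scaling the one-second baseline at n = 10 by the ratio of step counts, n!/10! (a minimal sketch; the helper name is ours, not the lecture's):

```python
import math

def estimated_seconds(n, base_n=10, base_seconds=1.0):
    """Scale the known runtime at base_n by the ratio of step counts (n!)."""
    return base_seconds * math.factorial(n) / math.factorial(base_n)

for n in (12, 14, 16, 18, 20):
    print(n, estimated_seconds(n), "seconds")
# 12 -> 132 s (~2 minutes), 14 -> 24024 s (~6.7 hours), ...
```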

Ignoring constant factors
- Express run time as O(f(n))
- Emphasize algorithms with slower growth rates
- Fundamental idea in the study of algorithms
- Basis of the Tarjan/Hopcroft Turing Award

Why ignore constant factors?
- Constant factors are arbitrary: they depend on the implementation and on the details of the model
- Determining the constant factors is tedious and provides little insight

Why emphasize growth rates?
- The algorithm with the lower growth rate will be faster for all but a finite number of cases
- Performance is most important for larger problem sizes
- As memory prices continue to fall, bigger problem sizes become feasible
- Improving the growth rate often requires new techniques
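The first bullet can be illustrated numerically (a hypothetical pair of step-count functions, assumed for this sketch): an algorithm with a large constant factor but a lower growth rate loses on small inputs, yet wins on every input past a crossover point.

```python
import math

def slow_growth(n):
    # Hypothetical O(n log n) algorithm with a 100x constant factor.
    return 100 * n * math.log2(n)

def fast_small(n):
    # Hypothetical O(n^2) algorithm with a small constant factor.
    return n * n

# Small input: the quadratic algorithm does fewer steps...
print(fast_small(100) < slow_growth(100))         # True
# ...but past the crossover the lower growth rate wins for all larger n.
print(fast_small(10_000) > slow_growth(10_000))   # True
```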

Formalizing growth rates
- T(n) is O(f(n))  [T : Z+ → R+]
- If n is sufficiently large, T(n) is bounded by a constant multiple of f(n)
- There exist c and n0 such that for n > n0, T(n) < c·f(n)
- "T(n) is O(f(n))" will be written as T(n) = O(f(n)); be careful with this notation

Prove 3n^2 + 5n + 20 is O(n^2)
- Recall: T(n) is O(f(n)) if there exist c and n0 such that for n > n0, T(n) < c·f(n)
- Choose c = 6, n0 = 5
- For n > 5: 5n < n^2 and 20 < n^2, so 3n^2 + 5n + 20 < 3n^2 + n^2 + n^2 = 5n^2 < 6n^2
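The witnesses c = 6, n0 = 5 can be sanity-checked numerically on a large sample of values (a spot check, not a proof; the proof is the algebra above):

```python
def T(n):
    return 3 * n**2 + 5 * n + 20

c, n0 = 6, 5
# Check T(n) < c * n^2 for every n in a large range beyond n0.
ok = all(T(n) < c * n**2 for n in range(n0 + 1, 10_000))
print(ok)  # True
```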

Order the following functions in increasing order by their growth rate:
- n
- log4 n
- 2n^2 + 10n
- 2^(n/100)
- 1000n + log8 n
- n^100
- 3^n
- 1000 log10 n
- n^(1/2)
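One way to check an answer (a sketch, not a substitute for asymptotic reasoning): evaluate log(f(n)) at a single large n and sort. Comparing logarithms avoids overflow on the exponential terms, but point evaluation can mislead near a crossover; for instance, n^(1/2) only overtakes 1000 log10 n around n ≈ 10^8, so n must be chosen large enough.

```python
import math

n = 10**9  # large enough that the asymptotic order is visible

# Compare log(f(n)) so 2^(n/100) and 3^n don't overflow.
logs = [
    ("log4 n",         math.log(math.log(n, 4))),
    ("1000 log10 n",   math.log(1000 * math.log10(n))),
    ("n^(1/2)",        0.5 * math.log(n)),
    ("n",              math.log(n)),
    ("1000n + log8 n", math.log(1000 * n + math.log(n, 8))),
    ("2n^2 + 10n",     math.log(2 * n**2 + 10 * n)),
    ("n^100",          100 * math.log(n)),
    ("2^(n/100)",      (n / 100) * math.log(2)),
    ("3^n",            n * math.log(3)),
]
for name, _ in sorted(logs, key=lambda item: item[1]):
    print(name)
```

Note that log4 n and 1000 log10 n have the same growth rate, Θ(log n); only their constant factors differ.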

Lower bounds
- T(n) is Ω(f(n)): T(n) is at least a constant multiple of f(n)
  - There exist n0 and ε > 0 such that T(n) > ε·f(n) for all n > n0
  - Warning: definitions of Ω vary
- T(n) is Θ(f(n)) if T(n) is O(f(n)) and T(n) is Ω(f(n))

Useful Theorems
- If lim (f(n) / g(n)) = c as n → ∞, for some c > 0, then f(n) = Θ(g(n))
- If f(n) is O(g(n)) and g(n) is O(h(n)), then f(n) is O(h(n))
- If f(n) is O(h(n)) and g(n) is O(h(n)), then f(n) + g(n) is O(h(n))
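The limit theorem can be illustrated with the polynomial from the earlier proof (a numeric sketch under the assumption f(n) = 3n^2 + 5n + 20, g(n) = n^2): the ratio settles at c = 3, so f(n) = Θ(n^2).

```python
def f(n):
    return 3 * n**2 + 5 * n + 20

def g(n):
    return n**2

# The ratio f(n)/g(n) approaches the constant 3 as n grows.
for n in (10, 1_000, 1_000_000):
    print(n, f(n) / g(n))
```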

Ordering growth rates
- For b > 1 and x > 0: logb n is O(n^x)
- For r > 1 and d > 0: n^d is O(r^n)
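The first bound holds even for a tiny exponent x; a quick numeric sketch (exponent x = 0.1 chosen for illustration) shows log2 n / n^0.1 eventually shrinking toward zero, even though it can grow for moderate n:

```python
import math

def ratio(n):
    # log2(n) divided by n^0.1; tends to 0 as n grows.
    return math.log2(n) / n**0.1

print(ratio(10**10))  # still above 3 here...
print(ratio(10**50))  # ...but far below 1 for very large n
```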