CSE 421 Algorithms
Richard Anderson
Lecture 3

Draw a picture of something from Seattle

What is the run time of the Stable Matching Algorithm?

    Initially all m in M and w in W are free
    While there is a free m
        w = highest-ranked woman on m's list that m has not yet proposed to
        if w is free
            match (m, w)
        else, supposing (m2, w) is currently matched
            if w prefers m to m2
                unmatch (m2, w)
                match (m, w)

The loop body is executed at most n^2 times.

O(1) time per iteration:
- Find a free m
- Find the next available w
- If w is matched, determine m2
- Test if w prefers m to m2
- Update the matching
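As a concrete illustration, here is a minimal Python sketch of the algorithm above (the names, such as men_prefs and next_proposal, are mine, not from the lecture). The auxiliary structures are exactly what make each step above O(1): a stack of free men, a next-proposal index per man, and a precomputed rank table for the "w prefers m to m2" test.

    def gale_shapley(men_prefs, women_prefs):
        # men_prefs[m]: women in m's preference order, best first
        # women_prefs[w]: men in w's preference order, best first
        n = len(men_prefs)

        # rank[w][m] = position of m on w's list, so "w prefers m to m2"
        # is a single O(1) comparison
        rank = [[0] * n for _ in range(n)]
        for w, prefs in enumerate(women_prefs):
            for r, m in enumerate(prefs):
                rank[w][m] = r

        next_proposal = [0] * n    # next woman on each m's list: O(1)
        match_of_w = [None] * n    # current partner of each w, or None
        free_men = list(range(n))  # stack of free men: O(1) to find one

        while free_men:
            m = free_men.pop()
            w = men_prefs[m][next_proposal[m]]
            next_proposal[m] += 1
            m2 = match_of_w[w]
            if m2 is None:
                match_of_w[w] = m          # w was free: match (m, w)
            elif rank[w][m] < rank[w][m2]:
                match_of_w[w] = m          # w prefers m: unmatch m2
                free_men.append(m2)
            else:
                free_men.append(m)         # w rejects m; m stays free
        return match_of_w

    # Tiny example with two men and two women
    print(gale_shapley([[0, 1], [1, 0]], [[0, 1], [0, 1]]))  # [0, 1]

Each man proposes to each woman at most once, so the loop body runs at most n^2 times, and every operation inside it is O(1), which is the claim on this slide.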

What does it mean for an algorithm to be efficient?

Definitions of efficiency
- Fast in practice
- Qualitatively better worst-case performance than a brute-force algorithm

Polynomial time efficiency
- An algorithm is efficient if it has a polynomial run time
- Run time is measured as a function of problem size
- Run time: count the number of instructions executed on an underlying model of computation
- T(n): the maximum run time over all problems of size at most n

Polynomial Time
Algorithms with polynomial run time have the property that increasing the problem size by a constant factor increases the run time by at most a constant factor (depending on the algorithm). For example, if T(n) = c * n^k, then T(2n) = 2^k * T(n).
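A tiny numeric check of that last claim, using a hypothetical cost model T(n) = n^3 rather than anything measured from a real program:

    def T(n, k=3):
        return n ** k  # hypothetical polynomial cost model

    for n in [100, 200, 400]:
        print(n, T(2 * n) / T(n))  # always 8.0 = 2^3: a constant factor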

Why Polynomial Time?
- Generally, polynomial time seems to capture the algorithms that are efficient in practice
- The class of polynomial-time algorithms has many good mathematical properties (for example, it is closed under addition and composition)

Polynomial vs. Exponential Complexity
Suppose you have an algorithm which takes n! steps on a problem of size n. If the algorithm takes one second for a problem of size 10, estimate the run time for problem sizes 12, 14, 16, 18, and 20:
- 10: 1 second
- 12: 2 minutes
- 14: 6 hours
- 16: 2 months
- 18: 50 years
- 20: 20,000 years
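Each estimate scales the one-second baseline by n!/10!; for example, 12!/10! = 11 * 12 = 132 seconds, about two minutes. A small Python helper (my own naming, not from the slides) reproduces the table:

    import math

    def estimated_runtime_seconds(n, base_n=10, base_seconds=1.0):
        # Scale the one-second baseline by how much n! has grown
        return base_seconds * math.factorial(n) / math.factorial(base_n)

    for n in [12, 14, 16, 18, 20]:
        secs = estimated_runtime_seconds(n)
        print(f"n = {n}: {secs:.2e} seconds (~{secs / 3.15e7:.2g} years)")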

Ignoring constant factors
- Express run time as O(f(n))
- Emphasize algorithms with slower growth rates
- A fundamental idea in the study of algorithms
- The basis of the Hopcroft/Tarjan Turing Award

Why ignore constant factors?
- Constant factors are arbitrary
  - They depend on the implementation
  - They depend on the details of the model
- Determining the constant factors is tedious and provides little insight

Why emphasize growth rates?
- The algorithm with the lower growth rate will be faster for all but a finite number of cases
- Performance matters most at larger problem sizes
- As memory prices continue to fall, bigger problem sizes become feasible
- Improving the growth rate often requires new techniques

Formalizing growth rates
T(n) is O(f(n))   [T : Z+ -> R+]
- If n is sufficiently large, T(n) is bounded by a constant multiple of f(n)
- There exist c and n0 such that T(n) < c * f(n) for all n > n0
"T(n) is O(f(n))" will be written as T(n) = O(f(n)). Be careful with this notation: the "=" is not a true equality (it is not symmetric, so T(n) = O(f(n)) does not mean O(f(n)) = T(n)).

Prove 3n^2 + 5n + 20 is O(n^2)
Recall: T(n) is O(f(n)) if there exist c and n0 such that T(n) < c * f(n) for all n > n0.
Choose c = 6 and n0 = 5. For n > 5 we have 5n < n^2 and 20 < n^2, so
3n^2 + 5n + 20 < 3n^2 + n^2 + n^2 = 5n^2 < 6n^2.
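A quick numeric sanity check of the chosen constants (spot-checking finitely many n is not a proof, but it does catch bad constants):

    def T(n):
        return 3 * n * n + 5 * n + 20

    c, n0 = 6, 5
    assert all(T(n) < c * n * n for n in range(n0 + 1, 100001))
    print("T(n) < 6 n^2 holds for all checked n > 5")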

Order the following functions in increasing order by their growth rate:
n, log4(n), 2n^2 + 10n, 2^n/100, 1000n + log8(n), n^100, 3^n, 1000 log10(n), n^(1/2)
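A rough empirical check in Python (an illustration, not a proof; the table of log-values is my own device). Comparing log10 of each value avoids ever evaluating the exponentials, and n has to be fairly large before the asymptotic order takes over: for instance, 1000 log10(n) exceeds n^(1/2) until n is roughly 6 * 10^7.

    import math

    # log10 of each function's value at n, computed analytically so the
    # exponentials never have to be evaluated outright
    log10_funcs = {
        "log4(n)":         lambda n: math.log10(math.log(n, 4)),
        "1000 log10(n)":   lambda n: math.log10(1000 * math.log10(n)),
        "n^(1/2)":         lambda n: 0.5 * math.log10(n),
        "n":               lambda n: math.log10(n),
        "1000n + log8(n)": lambda n: math.log10(1000 * n + math.log(n, 8)),
        "2n^2 + 10n":      lambda n: math.log10(2 * n * n + 10 * n),
        "n^100":           lambda n: 100 * math.log10(n),
        "2^n / 100":       lambda n: n * math.log10(2) - 2,
        "3^n":             lambda n: n * math.log10(3),
    }

    n = 10**10  # large enough that the asymptotic order has taken over
    for name, f in sorted(log10_funcs.items(), key=lambda kv: kv[1](n)):
        print(f"{name:16s} log10(value) = {f(n):.1f}")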

Lower bounds
T(n) is Ω(f(n))
- T(n) is at least a constant multiple of f(n)
- There exist n0 and ε > 0 such that T(n) > ε * f(n) for all n > n0
- Warning: definitions of Ω vary
T(n) is Θ(f(n)) if T(n) is O(f(n)) and T(n) is Ω(f(n))
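Continuing the running example, a numeric spot-check that 3n^2 + 5n + 20 is also Ω(n^2), and hence Θ(n^2):

    def T(n):
        return 3 * n * n + 5 * n + 20

    # With eps = 3 and n0 = 1: T(n) > 3 n^2 for all n > 1, so T(n) is
    # Omega(n^2); combined with the O(n^2) bound above, T(n) is Theta(n^2).
    assert all(T(n) > 3 * n * n for n in range(2, 100001))
    print("T(n) > 3 n^2 holds for all checked n > 1")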

Useful Theorems
- If lim (f(n)/g(n)) = c as n -> ∞, for some constant c > 0, then f(n) is Θ(g(n))
- If f(n) is O(g(n)) and g(n) is O(h(n)), then f(n) is O(h(n))
- If f(n) is O(h(n)) and g(n) is O(h(n)), then f(n) + g(n) is O(h(n))
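The limit rule applied to the running example: f(n)/g(n) approaches the constant 3 > 0, so f(n) is Θ(n^2).

    def f(n):
        return 3 * n * n + 5 * n + 20

    def g(n):
        return n * n

    for n in [10, 100, 10**4, 10**6]:
        print(n, f(n) / g(n))  # 3.7, 3.052, ... tends to 3.0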

Ordering growth rates
- For b > 1 and x > 0: logb(n) is O(n^x)
- For r > 1 and d > 0: n^d is O(r^n)
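A quick numeric illustration of the first fact, with b = 2 and x = 0.1 chosen arbitrarily: even n^0.1 eventually overtakes log2(n), although not until n is astronomically large.

    import math

    # Find the first power of two past n = 16 where n^0.1 overtakes log2(n)
    n = 16  # here log2(n) = 4 > n^0.1, which is about 1.3
    while math.log2(n) >= n ** 0.1:
        n *= 2
    print(f"crossover near n = 2^{int(math.log2(n))}")  # about 2^59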