Chapter Two: Algorithm Analysis


Chapter Two: Algorithm Analysis
- Empirical vs. theoretical analysis
- Space vs. time
- Worst case vs. average case
- Upper, lower, or tight bounds
- Determining the runtime of programs
- What about recursive programs?

What's the runtime?

int n;
cin >> n;
for (int i=0; i<n; i++)
  for (int j=0; j<n; j++)
    for (int k=0; k<n; k++)
      cout << "Hello world!\n";

Step count 2n^3 + n^2 + n + 2? O(n^3) runtime.
What if the last line is replaced by:
  string *s = new string("Hello world!\n");
Now the program uses O(n^3) time and space, and it has a memory leak: each new string is never deleted.

Resource Analysis
Runtime: we'd like to count steps, but an exact step count would be machine dependent. Instead, count steps equivalent to machine-language instructions, ignore constant factors, and use O() notation.
Space: we may also be interested in space usage; there, count the bytes used.
Examples on board.

Asymptotic notation
g(n) is said to be O(f(n)) if there exist positive constants c and n0 such that g(n) <= c f(n) for all n > n0.
g(n) is said to be Ω(f(n)) if there exist positive constants c and n0 such that g(n) >= c f(n) for all n > n0.
g(n) is said to be Θ(f(n)) if g(n) = O(f(n)) and g(n) = Ω(f(n)).
Intuition (asymptotically speaking, i.e. for all n > n0, ignoring constant factors and lower-order terms):
O is like <= for functions, Ω is like >=, Θ is like =.

Asymptotic notation: examples
What is the asymptotic runtime, in terms of O, Ω, Θ? Suppose the runtime of a function is:
- n^2 + 2n log n + 40
- 0.0000001 n^2 + 1000000 n^1.999
- n^3 + n^2 log n
- n^2.0001 + n^2 log n
- 2^n + 100 n^2
- 1.00001^n + 100 n^97

Asymptotic comparisons (compare the limit of the quotient of the functions)
- 0.0000001 n^2 = O(1000000 n^1.999)? No: a polynomial with a higher power dominates one with a lower power, regardless of constants.
- n^1.000001 = O(n log n)? No: any polynomial factor (here n^0.000001) dominates any polylog (log n).
- 1.0001^n = O(n^943)? No: every exponential dominates every polynomial.
- lg n = Θ(ln n)? Yes: logarithms with different bases differ only by a constant factor.

What's the runtime? Suppose the triple loop below appears twice in sequence:

int n;
cin >> n;
for (int i=0; i<n; i++)
  for (int j=0; j<n; j++)
    for (int k=0; k<n; k++)
      cout << "Hello world!\n";

Θ(n^3) + Θ(n^3) = Θ(n^3)
Statements or blocks in sequence: add their costs.

What's the runtime?

int n;
cin >> n;
for (int i=0; i<n; i++)
  for (int j=n; j>1; j/=2)
    cout << "Hello world!\n";

Loops: add up the cost of each iteration (or multiply the cost of one iteration by the number of iterations if they all take the same time). Here: n iterations of a log n inner loop, so Θ(n log n).

What's the runtime?

int n;
cin >> n;
for (int i=0; i<n; i++)
  for (int j=0; j<i; j++)
    cout << "Hello world!\n";

Loops: add up the cost of each iteration. The inner loop runs i times for i = 0, 1, ..., n-1, so the total is 0 + 1 + 2 + ... + (n-1) = n(n-1)/2 = O(n^2).

What's the runtime?

template <class Item>
void insert(Item a[], int l, int r)
{
  int i;
  for (i = r; i > l; i--) compexch(a[i-1], a[i]);
  for (i = l+2; i <= r; i++)
  {
    int j = i; Item v = a[i];
    while (v < a[j-1]) { a[j] = a[j-1]; j--; }
    a[j] = v;
  }
}

What's the runtime?

void myst(int n)
{
  if (n < 100)
    for (int i=0; i<n; i++)
      for (int j=0; j<n; j++)
        for (int k=0; k<n; k++)
          cout << "Hello world!\n";
  else
    for (int j=0; j<n; j++)
      cout << "Hello world!\n";
}

Estimate the runtime
Suppose an algorithm has runtime Θ(n^3), and solving a problem of size 1000 takes 10 seconds. How long to solve a problem of size 10000?
Runtime ≈ 10^-8 n^3; for n = 10000, runtime = 10^4 s ≈ 2.7 hr.
Suppose instead the algorithm has runtime Θ(n log n).
Runtime ≈ 10^-3 n lg n; for n = 10000, runtime ≈ 133 s.

Worst vs. average case
You might be interested in worst-, best-, or average-case analysis of an algorithm, and you can have upper, lower, or tight bounds on each of those functions.
Example: for each n, some problem instances of size n have runtime n and some have runtime n^2.
Worst case: Ω(log n), Ω(n), O(n^2), O(n^3); tight bound Θ(n^2).
Best case: Ω(log n), Ω(n), O(n^2); tight bound Θ(n).
Average case: Ω(log n), Ω(n), O(n^2), O(n^3); for a tight bound we need to know the distribution of inputs.

The Taxpayer Problem
Tax time is coming up, and the IRS needs to process tax forms. How should we access and update each taxpayer's info? What ADT fits?
ADT Dictionary: find(x), insert(x), delete(x). Implementation?

Array Implementation
insert(x): records[numRecs++] = x;  Runtime: O(1)
find(k):   for (int i=0; i<numRecs; i++) if (records[i].key == k) return i;  Runtime: O(n)
delete(i): records[i] = records[--numRecs];  Runtime: O(1)
Time for n operations? O(n^2)

Sorted Array Implementation
find(x): binary search. Runtime?

int bot = 0, top = numRecs - 1, mid;
while (bot <= top) {
  mid = (bot + top) / 2;
  if (data[mid] == x) return mid;
  if (data[mid] < x) bot = mid + 1;
  else top = mid - 1;
}
return -1;

Analysis of Binary Search
How many steps to search among n items? How many items are eliminated at each step? (Recall the definition of lg x.) Runtime: O(log n).

Sorted Array, cont.
insert(x)? O(n) (shift elements over to make room)
delete(x)? O(n) (shift elements over to close the gap)
Time for n insert, delete, and find ops? O(n^2)

Which implementation is better?

               find(x)    insert(x)   delete(x)
Array          O(n)       O(1)        O(1)
Sorted Array   O(log n)   O(n)        O(n)

Worst case for n operations? Array: O(n^2). Sorted Array: O(n^2).
What if some operations are more frequent than others?

Molecule viewer example
Java demos: molecule viewer (Example1, Example2, Example3)

Molecule Viewer Source Snippet (Java)

/* I use a bubble sort since from one iteration to the next, the sort
 * order is pretty stable, so I just use what I had last time as a
 * "guess" of the sorted order. With luck, this reduces O(N log N)
 * to O(N)
 */
for (int i = nvert - 1; --i >= 0;) {
    boolean flipped = false;
    for (int j = 0; j <= i; j++) {
        int a = zs[j];
        int b = zs[j + 1];
        if (v[a + 2] > v[b + 2]) {
            zs[j + 1] = a;
            zs[j] = b;
            flipped = true;
        }
    }
    if (!flipped)
        break;
}

Merge sort runtime?

void mergesort(first, last) {
  if (last - first >= 1) {
    mid = (last - first)/2 + first;
    mergesort(first, mid);
    mergesort(mid+1, last);
    merge(first, mid, last);
  }
}

T(n) = 2T(n/2) + c n, T(1) = b. This is called a recurrence relation.

Recurrence relations
In Discrete Math you'll learn how to solve these; in this class we'll say "look it up." But you will be responsible for knowing how to write down a recurrence relation for the runtime of a program.
Divide-and-conquer algorithms like merge sort, which divide the problem size by 2 and use O(n) time to conquer, satisfy T(n) = 2T(n/2) + c n and have runtime O(n log n).

Hanoi runtime?

void hanoi(int n, int from, int to, int spare) {
  if (n > 0) {
    hanoi(n-1, from, spare, to);
    cout << from << " - " << to << endl;
    hanoi(n-1, spare, to, from);
  }
}

T(n) = 2T(n-1) + c, T(0) = b. Look it up: T(n) = O(2^n).

Hanoi recurrence solution
T(n) = 2T(n-1) + c
     = 2[2T(n-2) + c] + c
     = 2^2 T(n-2) + 2c + c
     = 2^3 T(n-3) + 2^2 c + 2^1 c + 2^0 c
     ...
     = 2^k T(n-k) + 2^(k-1) c + 2^(k-2) c + ... + 2^1 c + 2^0 c
     = 2^k T(n-k) + c(2^k - 1)
We are done when n - k = 0, since we know T(0) = b. Setting k = n:
T(n) = 2^n b + c(2^n - 1) = Θ(2^n)

Binary Search recurrence?
Recurrence relation: T(n) = T(n/2) + c, T(1) = b. Look it up: Θ(log n).