1 CSE 326: Data Structures Program Analysis Lecture 3: Friday, Jan 8, 2003

2 Outline
Empirical analysis of algorithms
Formal analysis of algorithms
Reading assignment: sec. (maximum subsequence)

3 Determining the Complexity of an Algorithm
Empirical measurements:
–pro: discover if constant factors are significant
–con: may be running on “wrong” inputs
Formal analysis (proofs):
–pro: no interference from implementation/hardware details
–con: hides constants; may be hard
In theory, theory is the same as practice, but in practice it is not.

4 Measuring Empirical Complexity: Linear vs. Binary Search
Task: find an item in a sorted array of length N.
                           Linear Search    Binary Search
  Time to find one item:
  Time to find N items:
Binary search algorithm:

5
// Test driver: search for every element once
for (int i = 0; i < n; i++) a[i] = i;
for (int i = 0; i < n; i++) lfind(i, a, n);    // or bfind(i, a, -1, n)

// Linear search, recursive version
int lfind(int x, int a[], int n) {
  if (n == 0) return -1;
  if (x == a[n-1]) return n-1;
  return lfind(x, a, n-1);
}

// Binary search on the open interval (left, right)
int bfind(int x, int a[], int left, int right) {
  if (left+1 == right) return -1;
  int m = (left + right) / 2;
  if (x == a[m]) return m;
  if (x < a[m]) return bfind(x, a, left, m);
  else return bfind(x, a, m, right);
}
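To reproduce the measurements plotted on the next few slides, one can time the two drivers over a range of n. The following is a minimal sketch (not from the original slides), assuming the lfind/bfind definitions above sit in the same file; the sizes and the use of std::chrono are arbitrary choices.

#include <chrono>
#include <cstdio>
#include <vector>

int lfind(int x, int a[], int n);                // recursive linear search, defined above
int bfind(int x, int a[], int left, int right);  // binary search, defined above

int main() {
    // Keep n modest: the recursive lfind has recursion depth n.
    for (int n = 1000; n <= 32000; n *= 2) {
        std::vector<int> a(n);
        for (int i = 0; i < n; i++) a[i] = i;

        auto t0 = std::chrono::steady_clock::now();
        for (int i = 0; i < n; i++) lfind(i, a.data(), n);
        auto t1 = std::chrono::steady_clock::now();
        for (int i = 0; i < n; i++) bfind(i, a.data(), -1, n);
        auto t2 = std::chrono::steady_clock::now();

        std::printf("n=%6d  linear=%9.3f ms  binary=%9.3f ms\n", n,
                    std::chrono::duration<double, std::milli>(t1 - t0).count(),
                    std::chrono::duration<double, std::milli>(t2 - t1).count());
    }
    return 0;
}

Plotting these times against n, first on linear and then on log/log axes, is what the following slides examine.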

6 Graphical Analysis

7–8 (plots of the measured running times; these slides contain only graphs, no text)

9 slope ≈ 2 (linear search), slope ≈ 1 (binary search)
Recall: we search n times.
Linear = O(n²)
Binary = O(n log n)

10 Property of Log/Log Plots
On a linear plot, a linear function is a straight line.
On a log/log plot, any polynomial function is a straight line! Its slope, Δy/Δx (vertical axis over horizontal axis), equals the exponent.
Proof: suppose y = c·x^k. Then
log(y) = log(c·x^k)
log(y) = log(c) + log(x^k)
log(y) = log(c) + k·log(x)
so plotting log(y) against log(x) gives a straight line with slope k.
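As a quick worked check (not from the slides): if the measured time grows as t(n) = c·n^2, then between two sample sizes n and 2n the log/log slope is
(log t(2n) - log t(n)) / (log 2n - log n) = log(t(2n)/t(n)) / log 2 = log 4 / log 2 = 2,
which matches the exponent, as on slide 9.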

11 slope ≈ 1
Why does O(n log n) look like a straight line?
Because log(n·log n) = log n + log(log n), and log(log n) grows so slowly that the slope on a log/log plot stays only slightly above 1.

12 Empirical Complexity
Large data sets may be required to gain an accurate empirical picture.
When running time is expected to be polynomial, use log/log plots: slope = exponent.
When running time is expected to be exponential, use log on the y axis.
When running time is expected to be logarithmic, use log on the x axis.
Best: try all three, and see which one is linear.

13 Analyzing Code
primitive operations
consecutive statements
function calls
conditionals
loops
recursive functions

14 Conditionals
Conditional: if C then S1 else S2
Suppose you are doing an O( ) analysis:
Time(C) + Max(Time(S1), Time(S2)), or Time(C) + Time(S1) + Time(S2)
Suppose you are doing an Ω( ) analysis:
Time(C) + Min(Time(S1), Time(S2)), or Time(C)
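For instance (an illustrative fragment, not from the slides), here the condition costs O(1), one branch is O(1) and the other is O(n), so by the rules above the conditional is O(n) for an upper bound and Ω(1) for a lower bound.

// Upper bound: O(1) test + max(O(1), O(n)) = O(n)
// Lower bound: O(1) test + min(O(1), O(n)) = Omega(1)
long sum_if_nonpositive(int x, const int a[], int n) {
    if (x > 0) {                      // condition C: O(1)
        return x;                     // branch S1: O(1)
    } else {
        long s = 0;                   // branch S2: O(n) loop
        for (int i = 0; i < n; i++) s += a[i];
        return s;
    }
}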

15 Nested Loops
for i = 1 to n do
  for j = 1 to n do
    sum = sum + 1
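The body executes once per (i, j) pair, so the total count (worked out here; the slide shows only the loop) is
\sum_{i=1}^{n} \sum_{j=1}^{n} 1 = \sum_{i=1}^{n} n = n^2,
and the nested loop runs in Θ(n²) time.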

16 Nested Dependent Loops
for i = 1 to n do
  for j = i to n do
    sum = sum + 1

17 Nested Dependent Loops
for i = 1 to n do
  for j = i to n do
    sum = sum + 1
Compute it the hard way:
Compute it the smart way: substitute n - i + 1 with j
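The computations themselves were not transcribed; a reconstruction using standard algebra:
Hard way: \sum_{i=1}^{n} \sum_{j=i}^{n} 1 = \sum_{i=1}^{n} (n - i + 1) = n^2 - \frac{n(n+1)}{2} + n = \frac{n(n+1)}{2}.
Smart way: the terms n - i + 1 for i = 1, ..., n are just n, n-1, ..., 1, so substituting j = n - i + 1 gives \sum_{j=1}^{n} j = \frac{n(n+1)}{2}.
Either way, the loop runs in Θ(n²) time.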

18 Other Important Series
Sum of squares:
Sum of exponents:
Geometric series:
Novel series:
–Reduce to known series, or prove inductively
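The formulas themselves were not transcribed; the standard forms, which are presumably what the slide showed, are:
Sum of squares: \sum_{i=1}^{n} i^2 = \frac{n(n+1)(2n+1)}{6} \approx \frac{n^3}{3}
Sum of exponents: \sum_{i=1}^{n} i^k \approx \frac{n^{k+1}}{k+1} for k ≠ -1
Geometric series: \sum_{i=0}^{n} A^i = \frac{A^{n+1} - 1}{A - 1} for A ≠ 1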

19 Linear Search Analysis
int lfind(int x, int a[], int n) {
  for (int i = 0; i < n; i++)
    if (a[i] == x) return i;
  return -1;
}
Best case, tight analysis: Θ(1) (x is found at a[0]).
Worst case, tight analysis: Θ(n) (x is not in the array, so all n elements are examined).

20 Iterated Linear Search Analysis
for (int i = 0; i < n; i++) a[i] = i;
for (int i = 0; i < n; i++) lfind(i, a, n);
Easy worst-case upper bound: n calls, each O(n), so O(n²).
Worst-case tight analysis: Θ(n²) (see the count below).
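A reconstruction of the tight count (the slide's own derivation was not transcribed): with the loop version of lfind from slide 19, the call lfind(i, a, n) scans a[0], a[1], ... until it reaches a[i] = i, making i + 1 comparisons, so the total over all n calls is
\sum_{i=0}^{n-1} (i + 1) = \frac{n(n+1)}{2} = \Theta(n^2).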

21 Analyzing Recursive Programs
1. Express the running time T(n) as a recursive equation.
2. Solve the recursive equation.
For an upper-bound analysis, you can optionally simplify the equation to something larger.
For a lower-bound analysis, you can optionally simplify the equation to something smaller.

22 Binary Search
What is the worst-case upper bound?
int bfind(int x, int a[], int left, int right) {
  if (left+1 == right) return -1;
  int m = (left + right) / 2;
  if (x == a[m]) return m;
  if (x < a[m]) return bfind(x, a, left, m);
  else return bfind(x, a, m, right);
}

23 Binary Search
Introduce some constants…
b = time needed for the base case
c = time needed to get ready to do a recursive call
Size is n = right - left
Running time is thus:
T(1) ≤ b
T(n) ≤ T(n/2) + c for n > 1
(bfind code as on the previous slide)

24 Binary Search Analysis
One sub-problem, half as large.
Equation:
T(1) ≤ b
T(n) ≤ T(n/2) + c for n > 1
Solution:
T(n) ≤ T(n/2) + c                  write equation
     ≤ T(n/4) + c + c              expand
     ≤ T(n/8) + c + c + c          expand again
     ≤ T(n/2^k) + kc               inductive leap
     ≤ T(1) + c log n              select k = log n
     ≤ b + c log n = O(log n)      simplify

25 Solving Recursive Equations by Telescoping
Create a set of equations, take their sum.
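The equations themselves were not transcribed; a reconstruction for the same recurrence, T(n) ≤ T(n/2) + c, assuming n is a power of 2:
T(n) - T(n/2) ≤ c
T(n/2) - T(n/4) ≤ c
...
T(2) - T(1) ≤ c
Summing all log n equations, every intermediate term cancels, leaving T(n) - T(1) ≤ c log n, so T(n) ≤ b + c log n = O(log n).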

26 Inductive Proof
If you know the closed-form solution, you can validate it by ordinary induction.
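For example (a sketch of such a proof, not transcribed from the slide), to validate T(n) ≤ b + c log n for the binary-search recurrence when n is a power of 2:
Base case: T(1) ≤ b = b + c log 1.
Inductive step: assume T(n/2) ≤ b + c log(n/2). Then
T(n) ≤ T(n/2) + c ≤ b + c(log n - 1) + c = b + c log n.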

27 Amortized Analysis
Stack operations:
–push
–pop
–is_empty
Stack property: if x is on the stack before y is pushed, then x will be popped after y is popped.
What is the biggest problem with an array implementation? (The array has a fixed maximum size.)

28 Stretchy Stack Implementation
int[] data;
int maxsize;
int top;

void push(int e) {
  if (top == maxsize) {                 // array is full: double it
    int[] temp = new int[2*maxsize];
    for (int i = 0; i < maxsize; i++) temp[i] = data[i];
    data = temp;
    maxsize = 2*maxsize;
  }
  data[top++] = e;                      // top counts the elements on the stack
}

int pop() { return data[--top]; }

Best case Push = O(1) (no copying needed)
Worst case Push = O(n) (the whole array is copied)

29 Stretchy Stack Amortized Analysis
Consider a sequence of n push/pop operations:
push(e1) push(e2) pop() push(e3) push(e4) pop() ... push(ek)
where operation i takes time Ti (the first takes time T1, the last takes time Tn).
Amortized time = (T1 + T2 + … + Tn) / n
We compute this next.

30 Stretchy Stack Amortized Analysis
The length of the array increases like this: 1, 2, 4, 8, ..., 2^k, ..., n
For each Ti we have one of the following:
–Ti = O(1) for pop(), and for some push(ei)
–Ti = O(2^k) for some push(ei)
Hence:
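The bound after “Hence” was not transcribed; presumably it added up the cheap operations and the copying work, along the lines of
\sum_{i=1}^{n} T_i \le c \cdot n + c \cdot (1 + 2 + 4 + \cdots + 2^k),
where 2^k is the largest array length reached (so 2^k < 2n).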

31 Stretchy Stack Amortized Analysis
Let's compute this sum: 1 + 2 + 4 + … + 2^k = 2^(k+1) - 1 < 4n (since 2^k < 2n)
And therefore: amortized time = (T1 + … + Tn)/n ≤ (c·n + 4c·n)/n = O(1)
In an asymptotic sense, there is no overhead in using stretchy arrays rather than regular arrays!

32 Geometric Series
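The series itself was not transcribed; the instance slide 31 needs (ratio 2) follows from the shift trick: if S = 1 + 2 + 4 + … + 2^k, then 2S - S = 2^(k+1) - 1, so S = 2^(k+1) - 1.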

33 Stretchy Stack Amortized Analysis
Careful! We must be clever to get good amortized performance!
Consider “smart pop”, which halves the array as soon as it becomes half empty:
int pop() {
  int e = data[--top];
  if (top <= maxsize/2) {               // shrink immediately when at most half full
    maxsize = maxsize/2;
    int[] temp = new int[maxsize];
    for (int i = 0; i < maxsize; i++) temp[i] = data[i];
    data = temp;
  }
  return e;
}

34 Stretchy Stack Amortized Analysis
Take the following sequence of 3n push/pop operations:
push(e1) push(e2) ... push(en)   (the first n operations)
pop() push(en) pop() push(en) ... pop() push(en)   (the remaining 2n operations)
Suppose n = 2^k + 1. Then every pop() shrinks the array (copying about n elements) and every following push(en) doubles it again (copying about n elements), so each of the last 2n operations takes Θ(n) time.
Hence the amortized time is:
T = (Θ(1) + … + Θ(1) + Θ(n) + … + Θ(n)) / 3n = (n·Θ(1) + 2n·Θ(n)) / 3n = (2/3)·Θ(n)
Hence T = Θ(n)!!!