CSE 326: Data Structures Class #4 Analysis of Algorithms III Analysis of Recursive Algorithms Henry Kautz Winter 2002.


Exercise

Form groups of 5 people (split rows in half). The person sitting in the middle is the note-taker.
- Share your lists of steps for analyzing a recursive procedure. Come up with a revised list combining the best ideas. (5 minutes) Note-taker: copy the list onto a transparency.
- Then: use your method to analyze the following procedure. (10 minutes) Note-taker: copy the solution onto a transparency.

Recursive Selection Sort

Sort(int A[], int n) {
  if (n <= 1) return;
  int m = A[0];                 // m tracks the minimum seen so far
  for (int i = 1; i < n; i++) {
    if (m > A[i]) {             // swap the smaller value into m
      int tmp = A[i];
      A[i] = m;
      m = tmp;
    }
  }
  A[0] = m;                     // write the minimum back to the front
  Sort(&A[1], n - 1);           // sort the remaining n-1 elements
}

How I Analyze a Recursive Program

1. Write the recursive equation, using constants a, b, etc.
2. Expand the equation repeatedly, until you can see the pattern.
3. Write the equation that captures the pattern (make an inductive leap!) in terms of a new variable k.
4. Select a particular value for k in terms of n: pick a value that reduces the recursive term to a constant.
5. Simplify.

Along the way, you can throw out terms to simplify, if this is an upper-bound O( ) calculation.

Example: Sum of Integer Queue

sum_queue(Q) {
  if (Q.length == 0) return 0;
  else return Q.dequeue() + sum_queue(Q);
}

- One subproblem
- Linear reduction in size (decrease by 1)
- Combining: constant c (+), 1 × subproblem

Equation: T(0) ≤ b
          T(n) ≤ c + T(n – 1)   for n > 0

Sum, Continued

Equation: T(0) ≤ b
          T(n) ≤ c + T(n – 1)   for n > 0

Solution:
T(n) ≤ c + c + T(n – 2)          expand recursion
     ≤ c + c + c + T(n – 3)
     ≤ ck + T(n – k)             for all k: inductive leap
     ≤ cn + T(0)                 select k = n
     ≤ cn + b = O(n)             simplify

Example: Binary Search

One subproblem, half as large.

Equation: T(1) ≤ b
          T(n) ≤ T(n/2) + c     for n > 1

Solution:
T(n) ≤ T(n/2) + c               write equation
     ≤ T(n/4) + c + c           expand
     ≤ T(n/8) + c + c + c
     ≤ T(n/2^k) + kc            inductive leap
     ≤ T(1) + c log n           select k = log n
     ≤ b + c log n = O(log n)   simplify

Example: MergeSort

Split array in half, sort each half, merge together.
- 2 subproblems, each half as large
- linear amount of work to combine

T(1) ≤ b
T(n) ≤ 2T(n/2) + cn   for n > 1

T(n) ≤ 2T(n/2) + cn
     ≤ 2(2T(n/4) + cn/2) + cn = 4T(n/4) + cn + cn
     ≤ 4(2T(n/8) + cn/4) + cn + cn = 8T(n/8) + cn + cn + cn   expand
     ≤ 2^k T(n/2^k) + kcn                                     inductive leap
     ≤ nT(1) + cn log n           select k = log n
     = O(n log n)                 simplify

Lower Bound Analysis: Recursive Fibonacci

int Fib(int n) {
  if (n == 0 || n == 1) return 1;
  else return Fib(n - 1) + Fib(n - 2);
}

Lower bound analysis: Ω(n). Just like before, but be careful that the equations are all ≥.

Analysis Important: you introduce a new variable k! It is not necessarily the case that k=n!

Learning from Analysis

To avoid recursive calls:
- store all basis values in a table
- each time you calculate an answer, store it in the table
- before performing any calculation for a value n, check if a valid answer for n is in the table; if so, return it

Memoization: a form of dynamic programming.

How much time does the memoized version take?

Logs and exponents

We will be dealing mostly with binary numbers (base 2).

Definition: log_X B = A means X^A = B.

Any base is equivalent to base 2 within a constant factor. Why?
Because if R = log_2 B, S = log_2 X, and T = log_X B, then:
- 2^R = B, 2^S = X, and X^T = B
- 2^R = X^T = 2^(ST)
i.e., R = ST, and therefore T = R/S.

Properties of logs

We will assume logs to base 2 unless specified otherwise.
- log AB = log A + log B   (note: log AB ≠ log A · log B)
- log A/B = log A – log B  (note: log A/B ≠ log A / log B)
- log A^B = B log A        (note: log A^B ≠ (log A)^B)
- log log X: "log log X = Y" means 2^(2^Y) = X; log X grows slower than X, a "sub-linear" function
- log 1 = 0, log 2 = 1, log 1024 = 10

Normal scale plot
[Figure: running-time curves plotted on linear axes]

Log-Normal Plot
[Figure: the same curves with a logarithmic y-axis]
Why? What would give a straight line?

Log-log plot
[Figure: n^2 and n^3 plotted on log-log axes]

Kinds of Analysis

So far we have considered worst-case analysis. We may also want to know how an algorithm performs "on average". There are several distinct senses of "on average":
- amortized: average time per operation over a sequence of operations
- average case: average time over a random distribution of inputs
- expected case: average time for a randomized algorithm, over different random seeds, for any input

Amortized Analysis Consider any sequence of operations applied to a data structure –your worst enemy could choose the sequence! Some operations may be fast, others slow Goal: show that the average time per operation is still good

Stack ADT

Stack operations: push, pop, is_empty.
Stack property: if x is on the stack before y is pushed, then x will be popped after y is popped.
What is the biggest problem with an array implementation?

Stretchy Stack Implementation

int *data;
int maxsize;
int top;

Push(e) {
  if (top == maxsize) {               // full: stretch first
    temp = new int[2 * maxsize];
    for (i = 0; i < maxsize; i++) temp[i] = data[i];
    delete[] data;
    data = temp;
    maxsize = 2 * maxsize;
  }
  data[++top] = e;                    // store e whether or not we stretched
}

Best case Push = O(1)
Worst case Push = O(n)

Series

Arithmetic series: 1 + 2 + 3 + … + n = n(n+1)/2
Geometric series: 1 + 2 + 4 + … + 2^k = 2^(k+1) – 1

Stretchy Stack Amortized Analysis

Consider a sequence of n operations: push(3); push(19); push(2); …
What is the max number of stretches? log n.
What is the total time? Say a regular push takes time a, and stretching an array containing k elements takes time bk.
Amortized time = (an + b(2n – 1))/n = O(1)

Moral of the Story

To Do

Assignment #1 due:
- Electronic turn-in: midnight, Monday, Jan 21
- Hardcopy writeup due in class Wednesday, Jan 23

Finish reading Chapter 3. Be prepared to discuss these questions (bring written notes to refer to):
1. What is a call stack?
2. Could you write a compiler that did not use one?
3. What data structure does a printer queue use?