CSE 326: Data Structures Lecture #3 Analysis of Recursive Algorithms Alon Halevy Fall Quarter 2000.

Nested Dependent Loops

for i = 1 to n do
  for j = i to n do
    sum = sum + 1
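The inner loop depends on i: it runs n − i + 1 times, so the body executes n + (n − 1) + … + 1 = n(n+1)/2 times in total, which is O(n²). A small Python sketch (an illustration, not from the slides) confirms the count:

```python
def nested_dependent_loops(n):
    """Count how many times the innermost statement executes."""
    count = 0
    for i in range(1, n + 1):          # i = 1 to n
        for j in range(i, n + 1):      # j = i to n (dependent on i)
            count += 1                 # stands in for: sum = sum + 1
    return count

# Closed form: the body runs n*(n+1)/2 times, i.e. O(n^2).
```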

Recursion

A recursive procedure can often be analyzed by solving a recurrence equation.
Basic form:
  T(n) = if (base case) then some constant
         else (time to solve subproblems + time to combine solutions)
The result depends on:
– how many subproblems there are
– how much smaller the subproblems are
– how costly it is to combine the solutions (the coefficients)

Example: Sum of Integer Queue

sum_queue(Q){
  if (Q.length == 0) return 0;
  else return Q.dequeue() + sum_queue(Q);
}

– One subproblem per call
– Linear reduction in size (decrease by 1)
– Combining: one addition, constant cost c

Equation:
  T(0) ≤ b
  T(n) ≤ c + T(n – 1) for n > 0
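A runnable Python version of the slide's pseudocode might look as follows (a sketch using collections.deque as the queue; the names are mine, not from the slides):

```python
from collections import deque

def sum_queue(Q):
    """Recursively sum and empty a queue: one size-(n-1) subproblem per call."""
    if len(Q) == 0:                     # base case: constant time b
        return 0
    return Q.popleft() + sum_queue(Q)   # dequeue front, recurse on the rest: c + T(n-1)
```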

Sum, Continued

Equation:
  T(0) ≤ b
  T(n) ≤ c + T(n – 1) for n > 0

Solution:
  T(n) ≤ c + c + T(n – 2)
       ≤ c + c + c + T(n – 3)
       ≤ kc + T(n – k) for all k
       ≤ nc + T(0) for k = n
       ≤ cn + b = O(n)

Example: Binary Search

One subproblem, half as large.

Equation:
  T(1) ≤ b
  T(n) ≤ T(n/2) + c for n > 1

Solution:
  T(n) ≤ T(n/2) + c
       ≤ T(n/4) + c + c
       ≤ T(n/8) + c + c + c
       ≤ T(n/2^k) + kc
       ≤ T(1) + c log n where k = log n
       ≤ b + c log n = O(log n)
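For concreteness, here is a recursive binary search matching the T(n) ≤ T(n/2) + c recurrence (a sketch; the function and variable names are mine):

```python
def binary_search(a, x, lo=0, hi=None):
    """Return an index of x in sorted list a, or -1 if absent.

    Each call does constant work (c) and makes at most one
    half-size recursive call, giving T(n) <= T(n/2) + c.
    """
    if hi is None:
        hi = len(a) - 1
    if lo > hi:                                  # base case: empty range
        return -1
    mid = (lo + hi) // 2
    if a[mid] == x:
        return mid
    elif a[mid] < x:                             # recurse on the right half
        return binary_search(a, x, mid + 1, hi)
    else:                                        # recurse on the left half
        return binary_search(a, x, lo, mid - 1)
```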

Example: MergeSort

Split the array in half, sort each half, merge the halves together.
– 2 subproblems, each half as large
– linear amount of work to combine

  T(1) ≤ b
  T(n) ≤ 2T(n/2) + cn for n > 1

  T(n) ≤ 2T(n/2) + cn
       ≤ 2(2T(n/4) + cn/2) + cn = 4T(n/4) + cn + cn
       ≤ 4(2T(n/8) + c(n/4)) + cn + cn = 8T(n/8) + cn + cn + cn
       ≤ 2^k T(n/2^k) + kcn
       ≤ 2^k T(1) + cn log n where k = log n
       = O(n log n)
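The split/sort/merge structure above can be sketched in Python (illustrative, not the course's reference implementation):

```python
def merge_sort(a):
    """Sort a list: two half-size subproblems plus a linear-time merge."""
    if len(a) <= 1:                 # base case: T(1) = b
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])      # T(n/2)
    right = merge_sort(a[mid:])     # T(n/2)
    # Merge the two sorted halves: cn work.
    out = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])            # at most one of these is non-empty
    out.extend(right[j:])
    return out
```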

Example: Recursive Fibonacci

int Fib(n){
  if (n == 0 or n == 1) return 1;
  else return Fib(n - 1) + Fib(n - 2);
}

Running time: lower bound analysis
  T(0), T(1) ≥ 1
  T(n) ≥ T(n - 1) + T(n - 2) + c if n > 1

Note: T(n) ≥ Fib(n)
Fact: Fib(n) ≥ (3/2)^n for sufficiently large n, so the running time is Ω((3/2)^n). Why?

Direct Proof of Recursive Fibonacci

int Fib(n){
  if (n == 0 or n == 1) return 1;
  else return Fib(n - 1) + Fib(n - 2);
}

Lower bound analysis:
  T(0), T(1) ≥ b
  T(n) ≥ T(n - 1) + T(n - 2) + c if n > 1

Analysis: let φ = (1 + √5)/2, which satisfies φ² = φ + 1.
Show by induction on n that T(n) ≥ b·φ^(n-1).

Direct Proof, Continued

Basis: T(0) ≥ b > b·φ^(-1) and T(1) ≥ b = b·φ^0

Inductive step: assume T(m) ≥ b·φ^(m-1) for all m < n. Then
  T(n) ≥ T(n - 1) + T(n - 2) + c
       ≥ b·φ^(n-2) + b·φ^(n-3) + c
       = b·φ^(n-3)(φ + 1) + c
       = b·φ^(n-3)·φ² + c
       ≥ b·φ^(n-1)
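The bound can be checked empirically by instrumenting the recursion to count calls (an illustrative sketch with the constant b = 1; names are mine):

```python
from math import sqrt

def fib_calls(n):
    """Return (Fib(n), number of calls made).

    The call count satisfies T(0) = T(1) = 1 and
    T(n) = T(n-1) + T(n-2) + 1, matching the recurrence with b = c = 1.
    """
    if n == 0 or n == 1:
        return 1, 1
    f1, c1 = fib_calls(n - 1)
    f2, c2 = fib_calls(n - 2)
    return f1 + f2, c1 + c2 + 1

phi = (1 + sqrt(5)) / 2   # the golden ratio from the proof
```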

Fibonacci Call Tree

Learning from Analysis

To avoid redundant recursive calls:
– store all base-case values in a table
– each time you calculate an answer, store it in the table
– before performing any calculation for a value n, check whether a valid answer for n is already in the table; if so, return it

This technique is called memoization – a form of dynamic programming.
How much time does the memoized version take?
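A minimal memoized Fibonacci following the recipe above (a sketch; the shared mutable default dictionary is a deliberate shortcut for the table, seeded with the base cases):

```python
def fib_memo(n, table={0: 1, 1: 1}):
    """Memoized Fibonacci: each value is computed at most once, so O(n) time."""
    if n not in table:                              # table checked before computing
        table[n] = fib_memo(n - 1) + fib_memo(n - 2)
    return table[n]                                 # every answer is stored and reused
```

With memoization there are only O(n) distinct subproblems, each solved once in constant time, versus the Ω((3/2)^n) calls of the naive version.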

Kinds of Analysis

So far we have considered worst-case analysis. We may also want to know how an algorithm performs “on average”. There are several distinct senses of “on average”:
– amortized: the average time per operation over a sequence of operations
– average case: the average time over a random distribution of inputs
– expected case: the average time for a randomized algorithm over different random seeds, for any input

Amortized Analysis

Consider any sequence of operations applied to a data structure – your worst enemy could choose the sequence! Some operations may be fast, others slow.
Goal: show that the average time per operation over the whole sequence is still good.

Stack ADT

Stack operations:
– push
– pop
– is_empty

Stack property: if x is on the stack before y is pushed, then x will be popped after y is popped.
What is the biggest problem with an array implementation?

Stretchy Stack Implementation

int data[];
int maxsize;
int top;

Push(e){
  if (top == maxsize - 1){
    temp = new int[2*maxsize];
    copy data into temp;
    deallocate data;
    data = temp;
    maxsize = 2*maxsize;
  }
  data[++top] = e;
}

Best case Push = O( )
Worst case Push = O( )

Stretchy Stack Amortized Analysis

Consider a sequence of n push operations:
push(3); push(19); push(2); …

What is the maximum number of stretches? About log n.
What is the total time?
– Say a regular push takes time a, and stretching an array containing k elements takes time kb, for some constants a and b.
– Total time ≤ an + b(1 + 2 + 4 + … + n) ≤ an + b(2n – 1)

Amortized time = (an + b(2n – 1))/n = O(1)
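A quick simulation makes the O(1) amortized bound concrete (an illustrative sketch with a = b = 1; the function name and cost model are mine):

```python
def stretchy_push_cost(n, a=1, b=1):
    """Simulate n pushes into a doubling array; return (total cost, amortized cost).

    A regular push costs a; stretching an array holding k elements costs k*b.
    """
    maxsize, top, total = 1, 0, 0
    for _ in range(n):
        if top == maxsize:          # array is full: stretch before pushing
            total += b * maxsize    # copy every current element
            maxsize *= 2
        total += a                  # the push itself
        top += 1
    return total, total / n
```

With a = b = 1 and n = 1024, the stretches cost 1 + 2 + … + 512 = 1023 in total, so the average cost per push stays below a + 2b, independent of n.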

Wrapup

– Having math fun?
– Homework #1 out Wednesday – due in one week
– Programming assignment #1 handed out
– Next week: linked lists