Analysis of Algorithms and Inductive Proofs


Recitation 11: Analysis of Algorithms and Inductive Proofs

Review: Big O definition

f(n) is O(g(n)) iff there exist c > 0 and N > 0 such that f(n) ≤ c * g(n) for all n ≥ N.

[Figure: a plot of f(n) and c * g(n) against n; to the right of N, the curve c * g(n) stays above f(n).]

Explain the Big O definition, and explain that this is why constants don't matter: c * g(n) is our upper bound, and we can set c to any positive real number. The best way to "remember the definition" is to remember the picture and write the definition from it. Note: we don't write f(n) = O(g(n)), because it is not an equality.

Example: n + 6 is O(n)

    n + 6          ---this is f(n)
≤      <if 6 ≤ n, replace 6 by n>
    n + n
=      <arithmetic>
    2*n            <choose c = 2>
=
    c*n            ---this is c * g(n)

So choose c = 2 and N = 6. Recall the definition: f(n) is O(g(n)) iff there exist c > 0, N > 0 such that f(n) ≤ c * g(n) for n ≥ N.

When you show this proof, emphasize (1) the proof format, which is much better than the usual format in math because it shows the reason for each step, and (2) development: start at the top with n + 6 and make =, ≤, and < steps that end up at c*g(n), choosing N and c as you go to make it work.
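The choice c = 2, N = 6 can be sanity-checked numerically. A minimal sketch in Java (the class and method names here are ours, not from the recitation):

```java
// Check numerically that f(n) = n + 6 is bounded by c*g(n) = 2n for all n >= N = 6.
public class BigOCheck {
    static long f(long n) { return n + 6; }   // the function being bounded
    static long cg(long n) { return 2 * n; }  // c = 2, g(n) = n

    public static void main(String[] args) {
        for (long n = 6; n <= 100_000; n++) {
            if (f(n) > cg(n)) {
                throw new AssertionError("bound fails at n = " + n);
            }
        }
        System.out.println("f(n) <= 2n holds for all 6 <= n <= 100,000");
    }
}
```

A finite check is not a proof, of course; the algebra above is what establishes the bound for all n ≥ 6.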

Review: Big O

Big O is used to classify algorithms by how they respond to changes in input size n. Important vocabulary:

Constant time: O(1)
Logarithmic time: O(log n)
Linear time: O(n)
Quadratic time: O(n^2)
Exponential time: O(2^n)

For each of the vocabulary words, go over an example: adding two numbers is O(1), binary search is O(log n), linear search is O(n), selection sort is O(n^2).

Review: Big O

1. log(n) + 20 is O(log n) (logarithmic)
2. n + log(n) is O(n) (linear)
3. n/2 and 3*n are O(n)
4. n*log(n) + n is O(n log n)
5. n^2 + 2*n + 6 is O(n^2) (quadratic)
6. n^3 + n^2 is O(n^3) (cubic)
7. 2^n + n^5 is O(2^n) (exponential)
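The linear-vs-logarithmic distinction can be made concrete by counting the loop iterations of the two search examples mentioned above. A sketch (the class, counter, and method names are ours):

```java
// Linear vs. binary search, counting loop iterations to contrast O(n) with O(log n).
public class SearchCosts {
    static int iterations;  // reset before each search

    /** Return an index of x in a, or -1 if absent. O(n) iterations. */
    static int linearSearch(int[] a, int x) {
        for (int i = 0; i < a.length; i++) {
            iterations++;
            if (a[i] == x) return i;
        }
        return -1;
    }

    /** Return an index of x in sorted array a, or -1 if absent. O(log n) iterations. */
    static int binarySearch(int[] a, int x) {
        int lo = 0, hi = a.length - 1;
        while (lo <= hi) {
            iterations++;
            int mid = (lo + hi) / 2;
            if (a[mid] == x) return mid;
            else if (a[mid] < x) lo = mid + 1;
            else hi = mid - 1;
        }
        return -1;
    }

    public static void main(String[] args) {
        int[] a = new int[1024];
        for (int i = 0; i < a.length; i++) a[i] = i;
        iterations = 0; linearSearch(a, 1023);
        System.out.println("linear search iterations: " + iterations);  // ~n
        iterations = 0; binarySearch(a, 1023);
        System.out.println("binary search iterations: " + iterations);  // ~lg n
    }
}
```

On a 1024-element array, the linear search visits on the order of a thousand elements while the binary search visits on the order of ten.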

Merge Sort

Runtime of merge sort

/** Sort b[h..k]. */
public static void mS(Comparable[] b, int h, int k) {
    if (h >= k) return;
    int e = (h+k)/2;
    mS(b, h, e);
    mS(b, e+1, k);
    merge(b, h, e, k);
}

mS is mergeSort, shortened for readability. This is an implementation of merge sort: it sorts the elements of array b in place, so nothing is returned but b is changed. Go through this code step by step with your group. Method mS sorts the elements of b between indexes h and k. If h is at least k, there is nothing left to sort, so there's nothing to do; otherwise, create e midway between h and k, sort b[h..e] and b[e+1..k] recursively, then merge those two sorted parts. Merge here means: take two sorted parts of the array and turn them into one sorted part of the array. We'll look at merge's code later.

Runtime of merge sort

/** Sort b[h..k]. */
public static void mS(Comparable[] b, int h, int k) {
    if (h >= k) return;
    int e = (h+k)/2;
    mS(b, h, e);
    mS(b, e+1, k);
    merge(b, h, e, k);
}

We will count the number of comparisons mS makes. Use T(n) for the number of array-element comparisons that mS makes on an array segment of size n.

Calculating the number of steps the algorithm takes is our next task. We will count only array-element comparisons. Because of the recursion, we get a recurrence relation. Base case: when the number of elements to be sorted (the distance between h and k) is 0 or 1, mS makes 0 comparisons. Here, the only operations we consider to have significant cost are comparisons (comparing the values of things to be sorted). We'll watch to make sure we're not performing any other overly costly operations.

Runtime of merge sort

/** Sort b[h..k]. */
public static void mS(Comparable[] b, int h, int k) {
    if (h >= k) return;
    int e = (h+k)/2;
    mS(b, h, e);
    mS(b, e+1, k);
    merge(b, h, e, k);
}

Base cases:
T(0) = 0
T(1) = 0

Use T(n) for the number of array-element comparisons that mergeSort makes on an array of size n.

Runtime of merge sort

/** Sort b[h..k]. */
public static void mS(Comparable[] b, int h, int k) {
    if (h >= k) return;
    int e = (h+k)/2;
    mS(b, h, e);        // T(e+1-h) comparisons = T(n/2)
    mS(b, e+1, k);      // T(k-e) comparisons = T(n/2)
    merge(b, h, e, k);  // How long does merge take?
}

Each recursive call sorts half the segment, so each contributes T(n/2) comparisons. Once we know the number of comparisons merge takes, we must figure out how to deal with the recursion to count the comparisons of the algorithm as a whole. To this end, we use T().

Runtime of merge

Pseudocode for merge:

/** Pre: b[h..e] and b[e+1..k] are already sorted */
merge(Comparable[] b, int h, int e, int k)
    Copy both segments
    While both copies are non-empty:
        Compare the first element of each segment
        Set the next element of b to the smaller value
        Remove the smaller element from its segment

The loop runs at most k-h times, since each iteration removes one element and the loop stops once it empties one segment. Each iteration does one comparison, one add, and one remove. Runtime is O(k-h).
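One way the pseudocode above might be realized in Java, using int[] rather than Comparable[] to keep the sketch self-contained (this fills in details the slide leaves out, so treat it as one possible implementation, not the course's code):

```java
public class MergeSortDemo {
    /** Sort b[h..k]. */
    public static void mS(int[] b, int h, int k) {
        if (h >= k) return;
        int e = (h + k) / 2;
        mS(b, h, e);
        mS(b, e + 1, k);
        merge(b, h, e, k);
    }

    /** Pre: b[h..e] and b[e+1..k] are sorted. Post: b[h..k] is sorted. */
    static void merge(int[] b, int h, int e, int k) {
        // Copy the left segment; the right segment can be merged in place.
        int[] left = new int[e - h + 1];
        System.arraycopy(b, h, left, 0, left.length);
        int i = 0, j = e + 1, t = h;
        while (i < left.length && j <= k) {
            // One comparison per element placed: at most k-h comparisons total.
            if (left[i] <= b[j]) b[t++] = left[i++];
            else b[t++] = b[j++];
        }
        while (i < left.length) b[t++] = left[i++];  // drain the copy
        // Anything remaining in b[j..k] is already in its final place.
    }
}
```

Copying only the left segment is a common space optimization; copying both segments, as the pseudocode says, works equally well for the O(k-h) analysis.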

Runtime of merge sort

/** Sort b[h..k]. */
public static void mS(Comparable[] b, int h, int k) {
    if (h >= k) return;
    int e = (h+k)/2;
    mS(b, h, e);        // T(e+1-h) comparisons = T(n/2)
    mS(b, e+1, k);      // T(k-e) comparisons = T(n/2)
    merge(b, h, e, k);  // O(k-h) comparisons = O(n)
}

Recursive case: T(n) = 2T(n/2) + O(n)

Assume that the number of elements n being sorted is a power of 2. This makes the analysis easier, and it loses nothing for O() purposes. A non-rigorous justification: it is clear (when we see the code in full) that increasing the number of elements to sort will not make this algorithm run faster. Therefore, for any n that is not a power of two, the algorithm runs at least as fast as it would on the next power of 2 above n. This is sufficient for O() runtime analysis.

Runtime

We determined that:
T(1) = 0
T(n) = 2T(n/2) + n for n > 1

We will prove that T(n) = n log2 n (or n lg n for short).

Here we have a base case (sorting 1 element takes 0 comparisons) and a recursive case (sorting n elements takes twice the time to sort n/2 elements, plus n more). Remember that we're only using n = powers of 2, for simplicity.

Recursion tree

[Figure: the recursion tree for mergeSort, annotated with the merge time at each level. The root T(n) contributes n; the next level's two T(n/2) nodes contribute (n/2)*2 = n; the next level's four T(n/4) nodes contribute (n/4)*4 = n; and so on down to the bottom level of T(2) nodes, which contribute (n/2)*2 = n. There are lg n levels.]

This is the "recursion tree" for mergeSort. We calculate the time it takes to perform the merges at each level of recursion. At the highest level, merging n elements takes n time. Before that can happen, the algorithm must make two merges of n/2 elements each, which take n/2 comparisons each, for a total of n time. Before those can happen, each of those calls a pair of merges of n/4 elements; that's 4 merges of n/4 elements each, again a total of n time. There are lg n levels to this tree: the recursion depth of mergeSort is lg n. Therefore, there are lg n levels with n time spent on each, for a total of n lg n time to execute this algorithm. lg n levels * n comparisons per level is O(n log n).

Proof by induction

To prove T(n) = n lg n, we may assume the formula holds for smaller values of n (like recursion). In particular, assume T(n/2) = (n/2) lg(n/2). Then:

T(n) = 2T(n/2) + n
     = 2(n/2)lg(n/2) + n      <induction hypothesis>
     = n(lg n - lg 2) + n     <property of logarithms: lg(n/2) = lg n - lg 2>
     = n(lg n - 1) + n        <lg 2 = 1>
     = n lg n - n + n
     = n lg n
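The closed form can also be checked empirically by evaluating the recurrence directly for powers of two. A sketch (class and method names are ours):

```java
// Numerical check of the closed form: for powers of two,
// T(n) defined by T(1) = 0, T(n) = 2T(n/2) + n should equal n * lg n.
public class RecurrenceCheck {
    static long T(long n) {
        if (n <= 1) return 0;     // base case: T(0) = T(1) = 0
        return 2 * T(n / 2) + n;  // recursive case
    }

    public static void main(String[] args) {
        for (long n = 1; n <= (1L << 20); n *= 2) {
            long lg = 63 - Long.numberOfLeadingZeros(n);  // lg n, exact for powers of 2
            if (T(n) != n * lg) {
                throw new AssertionError("mismatch at n = " + n);
            }
        }
        System.out.println("T(n) == n lg n for all powers of two up to 2^20");
    }
}
```

As with any finite check, this only confirms the pattern; the induction above is the proof.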

Heap Sort

Heap Sort

Very simple idea:
1. Turn the array into a max-heap
2. Pull each element out

/** Sort b */
public static void heapSort(Comparable[] b) {
    heapify(b);
    for (int i = b.length-1; i >= 0; i--) {
        b[i] = poll(b, i);
    }
}
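The slide leaves heapify and poll unspecified. Below is one plausible sketch of helpers consistent with how heapSort calls them, again using int[] for self-containment; these are hypothetical implementations, not the course's code. Here poll(b, i) removes and returns the maximum of the heap occupying b[0..i], shrinking the heap to b[0..i-1].

```java
public class HeapSortDemo {
    /** Sort b ascending. */
    public static void heapSort(int[] b) {
        heapify(b);
        for (int i = b.length - 1; i >= 0; i--) {
            b[i] = poll(b, i);  // the max lands just past the shrinking heap
        }
    }

    /** Rearrange b into a max-heap by sifting down each internal node. */
    static void heapify(int[] b) {
        for (int i = b.length / 2 - 1; i >= 0; i--) siftDown(b, i, b.length);
    }

    /** Remove and return the max of the heap in b[0..i]; heap shrinks to b[0..i-1]. */
    static int poll(int[] b, int i) {
        int max = b[0];
        b[0] = b[i];         // move the last heap element to the root...
        siftDown(b, 0, i);   // ...and restore the heap invariant
        return max;
    }

    /** Sift b[i] down within the heap b[0..size-1]. */
    static void siftDown(int[] b, int i, int size) {
        while (2 * i + 1 < size) {
            int c = 2 * i + 1;                         // left child
            if (c + 1 < size && b[c + 1] > b[c]) c++;  // pick the larger child
            if (b[i] >= b[c]) return;                  // heap invariant holds
            int t = b[i]; b[i] = b[c]; b[c] = t;       // swap down
            i = c;
        }
    }
}
```

This sift-down heapify happens to run in O(n), but the O(n lg n) bound the slides use (e.g. heapify by n inserts) is all the analysis needs.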

Heap Sort

/** Sort b */
public static void heapSort(Comparable[] b) {
    heapify(b);
    for (int i = b.length-1; i >= 0; i--) {
        b[i] = poll(b, i);
    }
}

[Figure: the array during sorting. A shrinking max-heap occupies the front of the array while a growing sorted region fills in from the back; each poll moves the heap's maximum into the front of the sorted region.]

Why does it have to be a max-heap?

Heap Sort runtime

/** Sort b */
public static void heapSort(Comparable[] b) {
    heapify(b);                              // O(n lg n)
    for (int i = b.length-1; i >= 0; i--) {  // loops n times
        b[i] = poll(b, i);                   // O(lg n)
    }
}

Total runtime: O(n lg n) + n*O(lg n) = O(n lg n)