Help Session 3: Induction and Complexity Ezgi, Shenqi, Bran.

Quicksort

    algorithm quicksort(A, lo, hi) is
        if lo < hi then
            p := partition(A, lo, hi)
            quicksort(A, lo, p - 1)
            quicksort(A, p + 1, hi)

    algorithm partition(A, lo, hi) is
        pivot := A[hi]
        i := lo                      // place for swapping
        for j := lo to hi - 1 do
            if A[j] ≤ pivot then
                swap A[i] with A[j]
                i := i + 1
        swap A[i] with A[hi]
        return i

    Initial call: quicksort(A, 1, length(A))
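The pseudocode above translates almost line for line into runnable Java. A sketch (using 0-based array indices instead of the slides' 1-based ones, so the initial call becomes quicksort(a, 0, a.length - 1)):

```java
import java.util.Arrays;

public class Quicksort {
    // Lomuto partition, as in the pseudocode: pivot is the last element.
    static int partition(int[] a, int lo, int hi) {
        int pivot = a[hi];
        int i = lo;                              // next slot for an element <= pivot
        for (int j = lo; j < hi; j++) {
            if (a[j] <= pivot) {
                int tmp = a[i]; a[i] = a[j]; a[j] = tmp;
                i++;
            }
        }
        int tmp = a[i]; a[i] = a[hi]; a[hi] = tmp;  // move pivot into final place
        return i;
    }

    static void quicksort(int[] a, int lo, int hi) {
        if (lo < hi) {
            int p = partition(a, lo, hi);
            quicksort(a, lo, p - 1);
            quicksort(a, p + 1, hi);
        }
    }

    public static void main(String[] args) {
        int[] a = {5, 2, 9, 1, 5, 6};
        quicksort(a, 0, a.length - 1);
        System.out.println(Arrays.toString(a));  // [1, 2, 5, 5, 6, 9]
    }
}
```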

Proof of correctness (by induction)

1. Base cases:
   - n = 0: quicksort(A, 1, 0) has lo > hi, and the algorithm correctly returns the empty array.
   - n = 1: quicksort(A, 1, 1) has lo == hi, and the algorithm correctly returns the array A as it is.

2. Inductive hypothesis:
   Assume that quicksort correctly returns a sorted array for every size n with 1 ≤ n ≤ k (strong induction). Observe that n = hi - lo + 1 is the number of elements to be sorted.

Proof of correctness (by induction)

3. Inductive step:
   Prove that quicksort(A, 1, k+1) correctly returns a sorted array of size n = k+1, where k ≥ 1.

   Let's think out loud: we want to use our hypothesis for n ≤ k. Observe that the code calls quicksort twice: quicksort(A, 1, p-1) and quicksort(A, p+1, k+1). Intuitively, if these calls are on subarrays of size n ≤ k, we can use our hypothesis to say that they sort correctly. What else do we need? That the subarrays really are that small, and that combining them around the pivot sorts the whole array. Now to the formal proof!

Proof of correctness (by induction)

3. Inductive step (continued):
   For lo = 1, hi = k+1, and k ≥ 1, the condition lo < hi holds, so we call partition(A, 1, k+1).

   In partition(A, 1, k+1), the variable i starts at lo = 1. It is incremented by 1 for each j with 1 ≤ j ≤ k such that A[j] ≤ A[hi], so it is incremented at most k times. Thus 1 ≤ i ≤ k+1 at the end, and this value is returned and assigned to p, so 1 ≤ p ≤ k+1.

   Moreover, after partition(A, 1, k+1), we have A[1:p-1] ≤ A[p] and A[p+1:k+1] > A[p].

Proof of correctness (by induction)

After partition(A, 1, k+1), every element ≤ A[p] is put into A[1:p-1], and every element > A[p] is put into A[p+1:k+1]. Then we call quicksort(A, 1, p-1) and quicksort(A, p+1, k+1).

[Slide diagram: array A[1] A[2] A[3] ... A[k] A[k+1], with the pivot at position p; elements ≤ A[p] to its left, elements > A[p] to its right.]

Proof of correctness (by induction)

3. Inductive step (continued):
   Then we call quicksort(A, 1, p-1) and quicksort(A, p+1, k+1).

   For quicksort(A, 1, p-1): n = (p-1) - 1 + 1 = p-1. Since 1 ≤ p ≤ k+1, we have 0 ≤ n ≤ k. Thus, by the base cases and the inductive hypothesis, quicksort(A, 1, p-1) sorts A[1:p-1] correctly.

   For quicksort(A, p+1, k+1): n = (k+1) - (p+1) + 1 = k-p+1. Since 1 ≤ p ≤ k+1, we have 0 ≤ n ≤ k. Thus, by the base cases and the inductive hypothesis, quicksort(A, p+1, k+1) sorts A[p+1:k+1] correctly.

Proof of correctness (by induction)

Since A[1:p-1] ≤ A[p] and A[p+1:k+1] > A[p], and since quicksort(A, 1, p-1) sorts A[1:p-1] correctly and quicksort(A, p+1, k+1) sorts A[p+1:k+1] correctly, the whole array A[1:k+1] is sorted correctly. This completes the induction.

[Slide diagram: sorted elements ≤ A[p], then A[p] at position p, then sorted elements > A[p].]

Questions?

Spoiler: complexity towards the end of the class.

Find Big-Oh

f(n) = n! + 2^n

Note: f(n) is in O(g(n)) if there exist positive constants c and n0 such that f(n) ≤ c·g(n) for all n > n0.

1. n! is in O(n^n), proved by induction (next slides).
2. Since 2^n ≤ n^n for n ≥ 2, f(n) = n! + 2^n ≤ 2·n^n, so f(n) is in O(n^n).

Find Big-Oh

Prove n! ≤ n^n:

1. Base case: n = 1: 1! = 1 ≤ 1^1 = 1.
2. I.H.: For n = k, assume k! ≤ k^k.

Find Big-Oh

Prove n! ≤ n^n (continued):

2. I.H.: For n = k, assume k! ≤ k^k.
3. I.S.: For n = k+1:
   (k+1)! = k!·(k+1) ≤ k^k·(k+1) ≤ (k+1)^k·(k+1) = (k+1)^(k+1)
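The inequality n! ≤ n^n can also be sanity-checked numerically for small n. A sketch (class and method names are my own; long arithmetic keeps n small to avoid overflow):

```java
public class FactorialBound {
    static long factorial(int n) {
        long f = 1;
        for (int i = 2; i <= n; i++) f *= i;
        return f;
    }

    static long power(int n) {      // computes n^n
        long p = 1;
        for (int i = 0; i < n; i++) p *= n;
        return p;
    }

    public static void main(String[] args) {
        // Check n! <= n^n for n = 1..10 (10^10 still fits in a long).
        for (int n = 1; n <= 10; n++)
            System.out.println(n + "! = " + factorial(n)
                    + " <= " + power(n) + " : " + (factorial(n) <= power(n)));
    }
}
```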

Compare Run Time Increasing Rate

(i)   100n^3 is O(n^3)
(ii)  n^3 + n is O(n^3)
(iii) n·(log n)^1000 is O(n·(log n)^1000)
(iv)  (n+5)!/(n+4)! = n+5 is O(n)

Ordered by growth rate: (iv) grows slowest (fastest to run), then (iii), and (i) and (ii) grow fastest (slowest to run).
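The only non-obvious entry is (iv): the factorials telescope, (n+5)!/(n+4)! = n+5, which is linear. A small check (a sketch; the class name is my own, and long arithmetic limits n so that (n+5)! does not overflow):

```java
public class RatioCheck {
    static long factorial(int n) {
        long f = 1;
        for (int i = 2; i <= n; i++) f *= i;
        return f;
    }

    public static void main(String[] args) {
        // (n+5)!/(n+4)! should equal n+5 for every n (15! still fits in a long).
        for (int n = 1; n <= 10; n++) {
            long ratio = factorial(n + 5) / factorial(n + 4);
            System.out.println("n = " + n + ": (n+5)!/(n+4)! = " + ratio);  // prints n + 5
        }
    }
}
```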

Analyze Code

    void silly(int n) {
        for (int i = 0; i < n * n; i++) {
            for (int j = 0; j < n; ++j) {
                for (int k = 0; k < i; ++k)
                    System.out.println("k = " + k);
                for (int m = 0; m < 100; ++m)
                    System.out.println("m = " + m);
            }
        }
    }

Answer: O(n^5). The i loop runs n^2 times, the j loop n times, and inside them the k loop runs i < n^2 times while the m loop runs a constant 100 times, giving O(n^2 · n · n^2) = O(n^5).
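One way to convince yourself of the bound is to replace the prints with a counter and watch the count grow. A sketch (class and method names are my own) with the same loop structure:

```java
public class SillyCount {
    // Same loops as silly(n), counting inner-loop iterations instead of printing.
    static long count(int n) {
        long ops = 0;
        for (int i = 0; i < n * n; i++)
            for (int j = 0; j < n; ++j) {
                for (int k = 0; k < i; ++k) ops++;    // i iterations
                for (int m = 0; m < 100; ++m) ops++;  // 100 iterations
            }
        // Closed form: n * (n^2*(n^2-1)/2 + 100*n^2), dominated by n^5/2.
        return ops;
    }

    public static void main(String[] args) {
        System.out.println(count(2));  // 812
        System.out.println(count(3));  // 2808
    }
}
```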

For Loop Induction

To prove: Consider the following piece of code. Prove that after the kth iteration, x = k!
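The code itself was not captured in the transcript; a loop consistent with the proof on the next slides (x starts at 1 and is multiplied by i each iteration, so x = i! at the end of iteration i) would look like this hypothetical sketch:

```java
public class FactorialLoop {
    static long factorialLoop(int n) {
        long x = 1;
        for (int i = 1; i <= n; i++) {
            x = x * i;  // invariant: after this line, x = i!
        }
        return x;
    }

    public static void main(String[] args) {
        System.out.println(factorialLoop(5));  // 120
    }
}
```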

For Loop Induction

Base case: i = k = 1.
- Start of iteration: x = 1, i = 1.
- End of iteration: x = x * i = 1 * 1 = 1! = i!

For Loop Induction

Inductive case: i = k.
- Inductive hypothesis: after k-1 iterations, x = (k-1)!.
- Start of iteration: x = (k-1)!, i = k.
- End of iteration: x = x * i = (k-1)! * k = k! = i!

AVL Tree Induction

To prove: An AVL tree is always balanced.

To prove: An AVL tree is always balanced.

Base case: the empty tree.
Proof: If an AVL tree is empty, then it has no root, and therefore no nodes with left or right subtrees. Therefore the balance condition

    difference in height between the left and right subtrees of any node < 2

is trivially satisfied.

To prove: An AVL tree is always balanced.

Inductive hypothesis: after n operations, the AVL tree is still balanced.
Inductive step: two possible cases:
- the (n+1)th operation is an insert
- the (n+1)th operation is a remove

To prove: An AVL tree is always balanced.

Case 1: insert into a balanced tree. Two subcases:
- The tree is still balanced after the insert, with no rebalancing needed: nothing to prove.
- The insert unbalances the tree: rebalance with a tree rotation. After the rotation, the left and right subtrees satisfy the balance condition again. (A thorough proof would go over each specific rotation case.)

To prove: An AVL tree is always balanced.

Case 2: delete from a balanced tree. Two subcases:
- The tree is still balanced after the delete, with no rebalancing needed: nothing to prove.
- The delete unbalances the tree: rebalance with a tree rotation. After the rotation, the left and right subtrees satisfy the balance condition again. (A thorough proof would go over each specific rotation case.)
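The balance condition the proof relies on can be written down directly as a check. A sketch (the Node class is hypothetical, not from the slides; an empty tree is null, with height -1 by convention):

```java
public class AvlCheck {
    static class Node {
        Node left, right;
        Node(Node left, Node right) { this.left = left; this.right = right; }
    }

    // Height of a subtree; the empty tree has height -1.
    static int height(Node n) {
        return n == null ? -1 : 1 + Math.max(height(n.left), height(n.right));
    }

    // AVL balance condition: at every node, the left and right subtree
    // heights differ by less than 2.
    static boolean isBalanced(Node n) {
        if (n == null) return true;  // base case from the proof: empty tree
        return Math.abs(height(n.left) - height(n.right)) < 2
                && isBalanced(n.left) && isBalanced(n.right);
    }

    public static void main(String[] args) {
        Node balanced = new Node(new Node(null, null), new Node(null, null));
        Node chain = new Node(new Node(new Node(null, null), null), null);
        System.out.println(isBalanced(balanced));  // true
        System.out.println(isBalanced(chain));     // false
    }
}
```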

Complex Complexity Consider the following program: What is its complexity?
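The program was not captured in the transcript; based on the description on the next slide (Func calls itself n times, each time with parameter n-1), a hypothetical reconstruction, instrumented with a call counter, would be:

```java
public class Func {
    static long calls = 0;

    // Hypothetical reconstruction: each call makes n recursive calls with
    // argument n - 1, matching the recurrence T(n) = n * T(n-1).
    static void func(int n) {
        calls++;
        for (int i = 0; i < n; i++) {
            func(n - 1);
        }
    }

    public static void main(String[] args) {
        // Call counts satisfy C(n) = 1 + n*C(n-1): 1, 2, 5, 16, 65, ...
        for (int n = 0; n <= 4; n++) {
            calls = 0;
            func(n);
            System.out.println("n = " + n + ": " + calls + " calls");
        }
    }
}
```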

Complex Complexity

Func calls itself n times, each time with a parameter of n-1, so the recurrence relation is:

    T(n) = n * T(n-1)

Complex Complexity

    T(n) = n * T(n-1)
         = n * (n-1) * T(n-2) = (n^2 - n) * T(n-2) ≈ n^2 * T(n-2)
         = n * (n-1) * (n-2) * T(n-3) = (n^3 - 3n^2 + 2n) * T(n-3) ≈ n^3 * T(n-3)

In general, T(n) = n! * T(0), so T(n) = O(n!), which is in O(n^n).