CSE 310 Review 2/17/2016 Patrick Michaelson Ian Nall.


Topics
Divide and Conquer: algorithm base concept, Merge-Sort, Quick Sort
Analysis of Algorithms: recurrence, iterative
Insertion Sort: correctness through loop invariants
Heaps
Priority Queue
Decision Tree

Divide and Conquer
Split the problem into smaller pieces to make it easier to solve, usually through recursion; this makes solving sorting problems easier.
At each level:
Divide – break the problem into smaller subproblems (ideally of equal size)
Conquer – solve each subproblem recursively, bringing it down to a trivial size
Combine – put the solutions of the subproblems back together into a solution of the original problem
Can also be done iteratively, but the bookkeeping becomes much larger.

Merge-Sort
Divide: split the n-element sequence into two subproblems of n/2 elements each.
Conquer: sort the subproblems recursively using merge-sort; a subproblem of size 1 is already sorted, so the recursion bottoms out there.
Combine: merge the two sorted subproblems to produce a sorted sequence of n elements.

Example (figure: the dividing process, then the merging process)

Merge-Sort Pseudo Code
MERGE-SORT(A, p, r)
  if p < r then
    q = floor((p+r)/2)
    MERGE-SORT(A, p, q)
    MERGE-SORT(A, q+1, r)
    MERGE(A, p, q, r)

Merge Pseudo Code
MERGE(A, p, q, r)
  B[p..r] = A[p..r]   // a temporary array to hold the data
  i = p
  j = q+1
  z = p
  while i ≤ q and j ≤ r do
    if B[i] ≤ B[j] then
      A[z] = B[i]
      i = i+1
    else
      A[z] = B[j]
      j = j+1
    z = z+1
  if i ≤ q then
    A[z..r] = B[i..q]
  else
    A[z..r] = B[j..r]
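The pseudocode above can also be written as runnable code. A minimal 0-indexed Python sketch (the function names and the temporary-list index arithmetic are mine, not from the slides):

```python
def merge(a, p, q, r):
    """Merge the sorted runs a[p..q] and a[q+1..r] in place."""
    b = a[p:r + 1]          # temporary copy; b[k] corresponds to a[p + k]
    i, j, z = p, q + 1, p
    while i <= q and j <= r:
        if b[i - p] <= b[j - p]:
            a[z] = b[i - p]
            i += 1
        else:
            a[z] = b[j - p]
            j += 1
        z += 1
    while i <= q:           # copy any leftovers from the left run
        a[z] = b[i - p]
        i += 1
        z += 1
    while j <= r:           # copy any leftovers from the right run
        a[z] = b[j - p]
        j += 1
        z += 1

def merge_sort(a, p, r):
    """Sort a[p..r] in place, mirroring MERGE-SORT(A, p, r)."""
    if p < r:
        q = (p + r) // 2
        merge_sort(a, p, q)
        merge_sort(a, q + 1, r)
        merge(a, p, q, r)
```

Calling merge_sort(xs, 0, len(xs) - 1) sorts the whole list.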

Analysis
T(1) = constant
T(n) = 2T(n/2) + cn + b, if n > 1
The 2T(n/2) term comes from the two recursive merge-sort calls on the two halves of the problem.
The constant b accounts for the time steps spent finding the middle of the array.
The cn term comes from MERGE, which takes linear time in array operations.

Quick Sort
Divide: rearrange A[p..r] into two nonempty subarrays A[p..q] and A[q+1..r], such that every element of A[p..q] (the left subarray) is less than or equal to every element of A[q+1..r] (the index q is determined by the partition function).
Conquer: sort the subarrays A[p..q] and A[q+1..r] (the left and right arrays, respectively) recursively.
Combine: nothing to do; the subarrays are sorted in place, so A[p..r] is already sorted and no further work is needed.

Quick Sort Example
Pick an index of the array as the pivot: 10, 12, 7, 2, 15, 6. Our pivot choice will always be the last index.
6 is the pivot, so everything less than 6 goes to its left and everything greater than or equal to 6 goes to its right:
2 | 6 | 10, 12, 7, 15
We then take the last element of each remaining subarray as its pivot:
2 | 6 | 10, 12, 7 | 15
We continue this until all of the cells are single elements, then put them back together:
2 | 6 | 7 | 10, 12 | 15
2 | 6 | 7 | 10 | 12 | 15
2, 6, 7, 10, 12, 15

Quicksort Pseudo Code
QUICKSORT(A, p, r)
  if p < r then
    q = PARTITION(A, p, r)
    QUICKSORT(A, p, q-1)
    QUICKSORT(A, q+1, r)

Partition
The partition scheme can be set up arbitrarily by the designer: the pivot could be the median of the array, the first element, or the last element; it doesn't matter, though some choices make the logic harder to follow than others.
In practice:
Choose A[r] as the pivot element.
Scan from the left until an element ≥ A[r] is found.
Scan from the right until an element < A[r] is found.
Swap these two elements.
Continue until the scan pointers meet.
Swap A[r] with the element in the leftmost position of the right sublist (the element both pointers end up at).

Partition Subroutine
PARTITION(A, p, r)
  pivot = A[r]
  i = p
  j = r-1
  while TRUE do
    while j > p do          // scan from the right for an element < pivot
      if A[j] < pivot then break
      else j = j-1
    while i < j do          // scan from the left for an element ≥ pivot
      if A[i] ≥ pivot then break
      else i = i+1
    if i < j then
      exchange A[i] and A[j]
    else
      if A[i] < pivot then i = i+1   // pointers met on an element below the pivot
      exchange A[i] and A[r]
      return i
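As a sanity check, here is a runnable 0-indexed Python sketch of quicksort with this last-element-pivot, two-pointer partition scheme. The final `i += 1` adjustment when the pointers meet on an element smaller than the pivot is an assumption of mine needed to place the pivot correctly; it is not spelled out on the slide:

```python
def partition(a, p, r):
    """Partition a[p..r] around the pivot a[r]; return the pivot's final index."""
    pivot = a[r]
    i, j = p, r - 1
    while True:
        while j > p and a[j] >= pivot:   # scan right-to-left for an element < pivot
            j -= 1
        while i < j and a[i] < pivot:    # scan left-to-right for an element >= pivot
            i += 1
        if i < j:
            a[i], a[j] = a[j], a[i]      # swap the out-of-place pair and continue
        else:
            if a[i] < pivot:             # pointers met on an element below the pivot
                i += 1
            a[i], a[r] = a[r], a[i]      # place the pivot between the two sublists
            return i

def quicksort(a, p, r):
    """Sort a[p..r] in place, mirroring QUICKSORT(A, p, r)."""
    if p < r:
        q = partition(a, p, r)
        quicksort(a, p, q - 1)
        quicksort(a, q + 1, r)
```

Running quicksort on the example array 10, 12, 7, 2, 15, 6 reproduces the trace on the previous slide.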

Analysis
Expected total time complexity T(n) = Θ(nlogn); the worst case is Θ(n²), when the partitions are maximally unbalanced.

Analysis of Algorithms
Algorithms come in many forms, with different methods of analysis:
Recurrence
Recursion
Iterative

Recurrence
Multiple ways to solve recurrence relationships:
Recursion tree
Substitution
Master method

Recursion Tree – Merge-Sort example
Expand the recurrence down to the base case, n = 1 for Merge-Sort:
T(n) = 2T(n/2) + cn + b = 2(2T(n/4) + c(n/2) + b) + cn + b = … down to T(1), which is easier to see in tree form.

Level with k = 1 = 2^0 node: T(n), costing cn+b; level total cn+b
Level with k = 2 = 2^1 nodes: T(n/2) each, costing c(n/2)+b; level total 2(c(n/2)+b)
Level with k = 4 = 2^2 nodes: T(n/4) each, costing c(n/4)+b; level total 4(c(n/4)+b)
……..
In general, the level with k nodes has total cost k(c(n/k)+b).
Bottom level: T(1) … T(1), where k becomes n = 2^h (h is the height of the tree); each T(1) is constant, for a level total of n(c(n/n)+b).
Total: Σ k=1,2,4,8,…,n k(c(n/k)+b) = Σ k=1,2,4,8,…,n (cn+kb).

Σ k=1,2,4,8,…,n k(c(n/k)+b) = Σ k=1,2,4,8,…,n (cn+kb) = cn · Σ k=1,2,4,8,…,n 1 + b · Σ k=1,2,4,8,…,n k
(We can pull factors independent of k out of the summation.)
Since Σ k=1,2,4,8,…,n 1 = log₂n + 1 (how many times do we need to add 1? once per level; the tree has n = 2^h leaves, so h = log₂n and there are h+1 levels)
And Σ k=1,2,4,8,…,n k = 2n-1 (the sum 1+2+4+…+(n/4)+(n/2)+n; you can try n = 16 to verify this)
Thus, we get: Total = cn·(log₂n + 1) + b·(2n-1) = Θ(nlog₂n)
Therefore, Merge-Sort is Θ(nlog₂n), or, by convention (omitting the base), Θ(nlogn).
Merge sort is more efficient than insertion sort for large enough inputs.

Substitution
There are two steps to this process:
1. Guess what form the solution will take.
2. Use mathematical induction (i.e. weak induction, p→q) to find the constants and show that the solution works.
Steps of the mathematical induction:
Prove the base case (n = 1 for Merge-Sort); if it holds, our guessed solution works at least this far.
Prove that if the solution works for n = k, it also works for n = 2k (the next possible value of n, since n is a power of 2).
If it passes those steps, then we know it will work for any possible n.

Merge-Sort substitution example
T(n) = a, if n = 1
T(n) = 2T(n/2) + cn + b, if n > 1 (a, c, and b are constants)
A guess based on the type of recurrence: T(n) = cnlog₂n + 2bn - b (for n a power of 2)

Base case: n = 2
From the guess: T(2) = 2c·log₂2 + 2b·2 - b = 2c + 3b
From the given recurrence: T(2) = 2T(1) + 2c + b = 2c + 3b, if T(1) = b
Induction: assume T(k) = cklog₂k + 2bk - b. Then
T(2k) = 2T(k) + c(2k) + b (by the given recurrence)
= 2(cklog₂k + 2bk - b) + c(2k) + b
= c(2k)log₂k + 4bk - 2b + c(2k)log₂2 + b
= c(2k)(log₂k + log₂2) + 2b(2k) - b
= c(2k)log₂(2k) + 2b(2k) - b
So T(n) = cnlog₂n + 2bn - b is true for n = 2k if it is true for n = k.
Conclusion: T(n) = cnlog₂n + 2bn - b for n = 2, 4, 8, …, 2^i, …
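The closed form can also be checked mechanically. A small Python sketch (the constants c = 3, b = 5 are arbitrary choices of mine for the check) that evaluates the recurrence directly, with T(1) = b as in the base case above, and compares it against the guess for powers of 2:

```python
import math

def T(n, c, b):
    """Evaluate T(1) = b, T(n) = 2*T(n/2) + c*n + b directly (n a power of 2)."""
    if n == 1:
        return b
    return 2 * T(n // 2, c, b) + c * n + b

def closed_form(n, c, b):
    """The guessed solution: c*n*log2(n) + 2*b*n - b."""
    return c * n * math.log2(n) + 2 * b * n - b

for h in range(11):            # n = 1, 2, 4, ..., 1024
    n = 2 ** h
    assert T(n, 3, 5) == closed_form(n, 3, 5)
```

The loop passing for every power of 2 up to 1024 is exactly what the induction proves for all powers of 2.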

Master's Theorem
Master Theorem (Theorem 4.1): suppose T(n) = aT(n/b) + f(n), where a ≥ 1 and b > 1 are constants and f(n) is a function of n. Then:
1. If f(n) = O(n^(log_b a - ε)) for some constant ε > 0, then T(n) = Θ(n^(log_b a)).
2. If f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) · log₂n).
3. If f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0, and if a·f(n/b) ≤ c·f(n) for some constant c < 1 and all sufficiently large n, then T(n) = Θ(f(n)).

Master's Theorem Merge-Sort example
T(n) = 2T(n/2) + cn + b
a = 2, b = 2, f(n) = cn + b, log_b a = log₂2 = 1
Thus f(n) = Θ(n^(log_b a)) = Θ(n), so the second case applies: T(n) = Θ(n^(log_b a) · log₂n) = Θ(nlog₂n)

Iterative
A process that is repeated until it reaches a specific goal.
Example: Insertion Sort
INSERTION-SORT(A)   // overall running time is O(n²)
  for i = 1 to A.length-1 do       // the outer loop runs n-1 times
    key = A[i]
    j = i-1
    while j ≥ 0 and A[j] > key do  // shift larger elements right; up to n steps per pass
      A[j+1] = A[j]
      j = j-1
    A[j+1] = key                   // insert A[i] into its proper place among A[0..i-1]
  return A
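The iterative pattern can be sketched as runnable code; here is a minimal Python version of the standard textbook insertion sort (O(n²) overall):

```python
def insertion_sort(a):
    """Sort the list a in place and return it; O(n^2) overall."""
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]   # shift larger elements one slot to the right
            j -= 1
        a[j + 1] = key        # insert a[i] into place among a[0..i-1]
    return a
```

Each pass keeps the prefix a[0..i-1] sorted, which is exactly the loop invariant used in the correctness discussion below.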

Correctness
Loop invariant: a condition (among program variables) that is necessarily true immediately before and immediately after each iteration of a loop. (Note that this says nothing about its truth or falsity partway through an iteration.)
Initialization – the invariant holds before the first iteration of the loop
Maintenance – if it holds before an iteration, it still holds before the next iteration
Termination – when the loop ends, the invariant gives a useful property that helps show the algorithm is correct

Heaps
The binary heap data structure is an array object that can be viewed as a nearly complete binary tree.
A nearly complete binary tree is completely filled on all levels except possibly the lowest; e.g. a tree of height 2 would have 1 node at the root, then 2 nodes, then somewhere between 1 and 4 nodes at its lowest level.
The lowest level is filled from left to right.

Heap visual example (1-based array indices)
PARENT(i)   // parent of i in the tree
  return ⌊i/2⌋
LEFT(i)     // left child of i in the tree
  return 2i
RIGHT(i)    // right child of i in the tree
  return 2i+1

Procedures of Heaps
MAX-HEAPIFY: maintains the heap property (O(logn))
BUILD-MAX-HEAP: produces a heap from an unordered input array (O(n))
HEAPSORT: sorts an array in place (O(nlogn))
EXTRACT-MAX and INSERT: allow the heap data structure to be used as a priority queue (O(logn))

MAX-HEAPIFY pseudo code
MAX-HEAPIFY(A, i)   // T(n) = O(logn)
  l = LEFT(i)
  r = RIGHT(i)
  if l ≤ A.heap-size and A[l] > A[i] then
    largest = l
  else
    largest = i
  if r ≤ A.heap-size and A[r] > A[largest] then
    largest = r
  if largest ≠ i then
    exchange A[i] and A[largest]
    MAX-HEAPIFY(A, largest)

BUILD-MAX-HEAP pseudo code
BUILD-MAX-HEAP(A)
  A.heap-size = A.length
  for i = floor(A.length/2) down to 1 do
    MAX-HEAPIFY(A, i)
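The heap procedures translate directly into runnable code. A 0-indexed Python sketch (so LEFT and RIGHT become 2i+1 and 2i+2, and the heap size is passed explicitly); the heapsort routine follows the standard scheme of repeatedly swapping the root to the end:

```python
def max_heapify(a, i, heap_size):
    """Float a[i] down until the subtree rooted at i satisfies the max-heap property."""
    l, r = 2 * i + 1, 2 * i + 2
    largest = i
    if l < heap_size and a[l] > a[i]:
        largest = l
    if r < heap_size and a[r] > a[largest]:
        largest = r
    if largest != i:
        a[i], a[largest] = a[largest], a[i]
        max_heapify(a, largest, heap_size)

def build_max_heap(a):
    """Turn an unordered list into a max-heap, bottom-up."""
    for i in range(len(a) // 2 - 1, -1, -1):
        max_heapify(a, i, len(a))

def heapsort(a):
    """Sort a in place: build a heap, then repeatedly extract the maximum."""
    build_max_heap(a)
    for end in range(len(a) - 1, 0, -1):
        a[0], a[end] = a[end], a[0]   # move the current maximum to its final slot
        max_heapify(a, 0, end)        # restore the heap property on the prefix
```

After build_max_heap, the largest element sits at index 0, and every parent is at least as large as its children.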

Priority Queue
Maintains a set of elements we'll call S; each element has an associated value called a key.
Operations:
INSERT – inserts an element into the set
MAXIMUM – returns the element of S with the largest key
EXTRACT-MAX – removes and returns the element of S with the largest key
INCREASE-KEY – increases the value of an element's key to a new value
Can use a linked list or a heap to create a priority queue.

Priority Queue heap-based pseudo code
HEAP-MAXIMUM(A)
  return A[1]

HEAP-EXTRACT-MAX(A)   // running time O(logn)
  if A.heap-size < 1 then
    error "no element to extract"
  else
    max = A[1]
    A[1] = A[A.heap-size]
    A.heap-size = A.heap-size - 1
    MAX-HEAPIFY(A, 1)
    return max

HEAP-INCREASE-KEY(A, i, key)   // running time O(logn)
  if key < A[i] then
    error "new key is smaller than current key"
  A[i] = key
  while i > 1 and A[PARENT(i)] < A[i] do
    exchange A[i] and A[PARENT(i)]
    i = PARENT(i)

MAX-HEAP-INSERT(A, key)   // running time O(logn)
  A.heap-size = A.heap-size + 1
  A[A.heap-size] = -infinity
  HEAP-INCREASE-KEY(A, A.heap-size, key)
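Wrapped up as a class, the same operations give a heap-backed max-priority queue. A self-contained 0-indexed Python sketch (the class and method names are my own; insert floats the new key up the same way HEAP-INCREASE-KEY does):

```python
class MaxPriorityQueue:
    """Max-priority queue backed by a binary max-heap stored in a Python list."""

    def __init__(self):
        self.a = []

    def maximum(self):
        """Return (without removing) the largest key."""
        return self.a[0]

    def insert(self, key):
        """Append the key, then float it up while it beats its parent."""
        self.a.append(key)
        i = len(self.a) - 1
        while i > 0 and self.a[(i - 1) // 2] < self.a[i]:
            parent = (i - 1) // 2
            self.a[i], self.a[parent] = self.a[parent], self.a[i]
            i = parent

    def extract_max(self):
        """Remove and return the largest key; O(log n)."""
        if not self.a:
            raise IndexError("no element to extract")
        top = self.a[0]
        last = self.a.pop()
        if self.a:
            self.a[0] = last
            self._sift_down(0)
        return top

    def _sift_down(self, i):
        """Restore the heap property below index i (the MAX-HEAPIFY step)."""
        n = len(self.a)
        while True:
            l, r, largest = 2 * i + 1, 2 * i + 2, i
            if l < n and self.a[l] > self.a[largest]:
                largest = l
            if r < n and self.a[r] > self.a[largest]:
                largest = r
            if largest == i:
                return
            self.a[i], self.a[largest] = self.a[largest], self.a[i]
            i = largest
```

Repeated extract_max calls return the keys in decreasing order, which is exactly the priority-queue behavior the slide describes.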

Decision Trees
A model that shows the comparisons a sorting algorithm performs.
Shows all possible permutations of the input; only one permutation is reached for any given input.
Leaves correspond to permutations.
Internal nodes represent pairwise comparisons; the root is the first comparison.
An execution of the algorithm corresponds to tracing the path from the root to a leaf.

EXAMPLE - Decision tree for INSERTION-SORT operating on three elements a1, a2, a3 (figure). Each of the n! permutations of the elements must appear as a leaf of the tree for the sorting algorithm to sort properly.
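Since a binary tree of height h has at most 2^h leaves, a decision tree with n! leaves must have height at least log₂(n!) = Ω(nlogn); this is the classic lower bound for comparison sorting. A tiny Python sketch of that bound (the function name is my own):

```python
import math

def comparison_lower_bound(n):
    """Minimum worst-case comparisons any comparison sort needs: ceil(log2(n!))."""
    return math.ceil(math.log2(math.factorial(n)))
```

For three elements this gives 3, matching the height of the insertion-sort decision tree in the example above.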