1 CSE 310 Review 2/17/2016 Patrick Michaelson Ian Nall

2 Topics: Divide and Conquer algorithms (base concept, Merge-Sort, Quick Sort); Analysis of Algorithms (recurrence, iterative); Insertion Sort; Correctness through loop invariants; Heaps; Priority Queue; Decision Tree

3 Divide and Conquer: Split the problem apart to make it easier to solve; usually done through recursion. Makes solving sorting problems easier. At each level: Divide – create smaller subproblems; Conquer – solve the subproblems recursively, bringing them down to a trivial size; Combine – put the solutions of the subproblems back together into a solution of the original problem. The same work can be done iteratively, but the bookkeeping becomes much more involved.

4 Merge-Sort. Divide: split the problem of n elements into two subproblems of n/2 elements each. Conquer: sort the subproblems recursively with merge-sort; a subproblem of size 1 is already sorted, so that is the base case. Combine: merge the two sorted subproblems to produce a sorted sequence of all n elements.

5 Example: merge-sort of the array 5 1 4 7 6 3 9 2. The dividing process splits it down to single elements (5 1 4 7 | 6 3 9 2, then 5 1 | 4 7 | 6 3 | 9 2, and so on); the merging process combines the sorted pieces back up (1 5 | 4 7 | 3 6 | 2 9, then 1 4 5 7 | 2 3 6 9), ending with 1 2 3 4 5 6 7 9.

6 Merge-Sort Pseudo Code
MERGE-SORT(A, p, r)
  if p < r then
    q = floor((p + r) / 2)
    MERGE-SORT(A, p, q)
    MERGE-SORT(A, q + 1, r)
    MERGE(A, p, q, r)

7 Merge Pseudo code
MERGE(A, p, q, r)
  B[p..r] = A[p..r]    // a temporary array holding a copy of the data
  i = p                // scans the left half B[p..q]
  j = q + 1            // scans the right half B[q+1..r]
  z = p                // next position to fill in A
  while i ≤ q and j ≤ r do
    if B[i] ≤ B[j] then
      A[z] = B[i]; i = i + 1
    else
      A[z] = B[j]; j = j + 1
    z = z + 1
  if i ≤ q then A[z..r] = B[i..q]    // copy whichever half still has elements left
  else A[z..r] = B[j..r]
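The pseudocode above maps almost line for line onto C. The following is a minimal sketch, not taken from the slides: indices p and r are 0-based and inclusive, and the temporary array B is allocated per merge call for simplicity rather than speed.

#include <stdio.h>
#include <stdlib.h>

static void merge(int A[], int p, int q, int r) {
    int n = r - p + 1;
    int *B = malloc(n * sizeof(int));        /* temporary copy of A[p..r] */
    for (int k = 0; k < n; k++) B[k] = A[p + k];
    int i = 0;                               /* scans the left half  B[0 .. q-p]     */
    int j = q - p + 1;                       /* scans the right half B[q-p+1 .. r-p] */
    int z = p;                               /* next position to fill in A */
    while (i <= q - p && j <= r - p) {
        if (B[i] <= B[j]) A[z++] = B[i++];
        else              A[z++] = B[j++];
    }
    while (i <= q - p) A[z++] = B[i++];      /* copy whichever half still has elements */
    while (j <= r - p) A[z++] = B[j++];
    free(B);
}

static void merge_sort(int A[], int p, int r) {
    if (p < r) {
        int q = (p + r) / 2;                 /* floor of the midpoint */
        merge_sort(A, p, q);
        merge_sort(A, q + 1, r);
        merge(A, p, q, r);
    }
}

int main(void) {
    int A[] = {5, 1, 4, 7, 6, 3, 9, 2};      /* the array from the example slide */
    merge_sort(A, 0, 7);
    for (int k = 0; k < 8; k++) printf("%d ", A[k]);   /* prints 1 2 3 4 5 6 7 9 */
    printf("\n");
    return 0;
}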

8 Analysis. T(1) = constant. If n > 1: T(n) = 2T(n/2) + cn + b. The 2T(n/2) term comes from the two recursive merge-sort calls after the problem is split in half; the constant b accounts for the time steps needed to find the middle of the array; the cn term comes from the merge, which performs linear-time array operations.

9 Quick Sort. Divide: the array A[p..r] is rearranged into two (possibly empty) subarrays A[p..q-1] (the left array) and A[q+1..r] (the right array) such that every element of the left array is less than or equal to A[q], which in turn is less than or equal to every element of the right array (the index q is determined by the partition procedure). Conquer: sort the subarrays A[p..q-1] and A[q+1..r], the left and right arrays respectively, recursively. Combine: the subarrays are sorted in place, so A[p..r] is already sorted and no further work is needed.

10 Quick Sort Example. Pick an index of the array as the pivot: 10, 12, 7, 2, 15, 6. Our pivot choice will always be the last index, so 6 is the pivot; everything less than 6 goes to its left and everything greater than or equal to it goes to its right: 2 | 6 | 10, 12, 7, 15. We then take the last element of each remaining sublist as its pivot: 2 | 6 | 10, 12, 7 | 15. We continue this until every sublist holds a single element, then put them back together: 2 | 6 | 7 | 10, 12 | 15, then 2 | 6 | 7 | 10 | 12 | 15, giving 2, 6, 7, 10, 12, 15.

11 Quicksort Pseudo Code
QUICKSORT(A, p, r)
  if p < r then
    q = PARTITION(A, p, r)
    QUICKSORT(A, p, q - 1)
    QUICKSORT(A, q + 1, r)

12 Partition. The pivot can be chosen arbitrarily by the designer: it could be the median of the array, the first element, or the last element; it doesn't matter for correctness, though some choices make the logic harder to follow than others. In practice (and in these slides): choose A[r] as the pivot element; scan from the left until an element ≥ A[r] is found; scan from the right until an element < A[r] is found; swap these two elements; continue until the scan pointers meet; finally, swap A[r] with the element at the leftmost position of the right sublist (the element the pointers meet at).

13 Partition Subroutine
PARTITION(A, p, r)
  pivot = A[r]
  i = p
  j = r - 1
  while TRUE do
    while i < r and A[i] < pivot do i = i + 1   // scan right for an element ≥ pivot (A[r] itself stops the scan)
    while j > p and A[j] ≥ pivot do j = j - 1   // scan left for an element < pivot
    if i < j then
      exchange A[i] and A[j]
    else
      exchange A[i] and A[r]                    // place the pivot between the two sublists
      return i
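As a concrete reference, here is a minimal C sketch of this partition scheme (last element as pivot, two scan pointers) together with the QUICKSORT driver from the previous slide. Indices are 0-based and inclusive; this is one reasonable reading of the slides, not the only valid partition routine.

#include <stdio.h>

static void swap(int *a, int *b) { int t = *a; *a = *b; *b = t; }

static int partition(int A[], int p, int r) {
    int pivot = A[r];
    int i = p, j = r - 1;
    for (;;) {
        while (i < r && A[i] < pivot) i++;   /* scan right for an element >= pivot; A[r] stops the scan */
        while (j > p && A[j] >= pivot) j--;  /* scan left for an element < pivot */
        if (i < j) {
            swap(&A[i], &A[j]);              /* both elements are now on the correct side */
        } else {
            swap(&A[i], &A[r]);              /* drop the pivot between the two sublists */
            return i;
        }
    }
}

static void quicksort(int A[], int p, int r) {
    if (p < r) {
        int q = partition(A, p, r);
        quicksort(A, p, q - 1);
        quicksort(A, q + 1, r);
    }
}

int main(void) {
    int A[] = {10, 12, 7, 2, 15, 6};         /* the array from the example slide */
    quicksort(A, 0, 5);
    for (int k = 0; k < 6; k++) printf("%d ", A[k]);   /* prints 2 6 7 10 12 15 */
    printf("\n");
    return 0;
}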

14 Analysis. Total time complexity T(n) = Θ(nlogn) in the average case; the worst case (for example, an already-sorted input when the last element is always chosen as the pivot) is Θ(n^2).

15 Analysis of Algorithms. Many forms of algorithms, with different methods of analysis: recurrence, recursion, iterative.

16 Recurrence. There are multiple ways to solve recurrence relations: the recursion tree, substitution, and the master method.

17 Recursion Tree – Merge Sort example. Expand the recurrence down to the base case, n = 1 for merge sort: T(n) = 2T(n/2) + cn + b = 2(2T(n/4) + c(n/2) + b) + cn + b = … down to T(1). This is easier to see in tree form.

18 The recursion tree, level by level. Level k = 1 = 2^0: one node T(n), with local cost cn + b. Level k = 2 = 2^1: two nodes T(n/2), each with local cost c(n/2) + b, for a level total of 2(c(n/2) + b). Level k = 4 = 2^2: four nodes T(n/4), each costing c(n/4) + b, for a level total of 4(c(n/4) + b). In general, the level with k nodes contributes k(c(n/k) + b). At the bottom, the k = n leaves are all T(1) (here n = 2^h, where h is the height of the tree), each a constant, contributing n(c(n/n) + b). Total: Σ_{k=1,2,4,8,…,n} k(c(n/k) + b) = Σ_{k=1,2,4,8,…,n} (cn + kb).

19 Σ_{k=1,2,4,8,…,n} k(c(n/k) + b) = Σ_{k=1,2,4,8,…,n} (cn + kb) = cn·Σ_{k=1,2,4,8,…,n} 1 + b·Σ_{k=1,2,4,8,…,n} k (we can pull factors independent of k out of the summations). The first sum counts the levels of the tree: since n = 2^h (h is the height of the tree), there are h + 1 = log2(n) + 1 terms. The second sum is a geometric series: Σ_{k=1,2,4,8,…,n} k = 1 + 2 + 4 + 8 + … + (n/4) + (n/2) + n = 2n - 1 (you can try n = 16 to verify this). Thus we get: Total = cn·(log2(n) + 1) + b·(2n - 1) = Θ(n log2 n). Therefore Merge-Sort is Θ(n log2 n), or by convention (omitting the base) Θ(nlogn). Merge sort is more efficient than insertion sort for large enough inputs.
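If you want to sanity-check the two sums numerically, a few lines of C (not part of the original slides) will do it: the first sum has log2(n) + 1 terms and the second equals 2n - 1.

#include <stdio.h>

int main(void) {
    for (long n = 1; n <= 1024; n *= 2) {
        long terms = 0, sum_k = 0;
        for (long k = 1; k <= n; k *= 2) {   /* k runs over 1, 2, 4, ..., n */
            terms++;
            sum_k += k;
        }
        printf("n = %4ld: number of terms = %2ld, sum of k = %5ld (2n - 1 = %ld)\n",
               n, terms, sum_k, 2 * n - 1);
    }
    return 0;
}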

20 Substitution. There are two steps to this process: (1) guess what form the solution will take; (2) use mathematical induction (i.e., weak induction, p→q) to find the constants and show that the solution works. Steps of the mathematical induction: prove the base case (n = 1 for Merge-Sort), so our guessed solution works at least that far; then prove that if it works for n = k, it will also work for the next possible value of n (k + 1, or 2k when n ranges over powers of 2). If it passes those steps, we know it will work for any possible n.

21 Merge-Sort substitution example. T(n) = a if n = 1; T(n) = 2T(n/2) + cn + b if n > 1 (a, b, and c are constants). A guess for this type of recurrence: T(n) = cn·log2(n) + 2bn - b (for n a power of 2).

22 Base case: n = 2. T(2) = 2c·log2(2) + 4b - b = 2c + 3b. From the given recurrence, T(2) = 2T(1) + 2c + b = 2c + 3b if T(1) = b. Induction: assume T(k) = ck·log2(k) + 2bk - b. Then T(2k) = 2T(k) + c(2k) + b (by the given recurrence) = 2(ck·log2(k) + 2bk - b) + c(2k) + b = c(2k·log2(k)) + 4bk - 2b + c(2k·log2(2)) + b = c(2k)(log2(k) + log2(2)) + 2b(2k) - b = c(2k)·log2(2k) + 2b(2k) - b. So T(n) = cn·log2(n) + 2bn - b is true for n = 2k if it is true for n = k. Conclusion: T(n) = cn·log2(n) + 2bn - b for n = 2, 4, 8, …, 2^i, …

23 Master's Theorem. Master Theorem (Theorem 4.1): Suppose that T(n) = aT(n/b) + f(n), where a ≥ 1 and b > 1 are constants and f(n) is a function of n. Then:
1. If f(n) = O(n^(log_b a - ε)) for some constant ε > 0, then T(n) = Θ(n^(log_b a)).
2. If f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) · log2 n).
3. If f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0, and if a·f(n/b) ≤ c·f(n) for some constant c < 1 and all sufficiently large n, then T(n) = Θ(f(n)).

24 Master's Theorem Merge-Sort example. T(n) = 2T(n/2) + cn + b, so a = 2, b = 2, f(n) = cn + b, and log_b a = log2 2 = 1. Thus f(n) = Θ(n^(log_b a)) = Θ(n). This is the second case, so T(n) = Θ(n^(log_b a) · log2 n) = Θ(n log2 n).

25 Iterative. An iterative process repeats a loop body until it reaches a specific goal. Example: Insertion Sort.
INSERTION-SORT(A)                  // overall running time is O(n^2)
  for i = 2 to A.length do         // the outer loop runs n - 1 times
    key = A[i]
    j = i - 1
    while j > 0 and A[j] > key do  // in the worst case this shifts i - 1 elements, giving the n^2 bound
      A[j + 1] = A[j]
      j = j - 1
    A[j + 1] = key                 // insert A[i] into the already-sorted prefix A[1..i-1]
  return A
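A minimal C version of the same procedure (0-based indices, names chosen just for illustration), with the loop invariant from the next slide noted as a comment:

#include <stdio.h>

static void insertion_sort(int A[], int n) {
    for (int i = 1; i < n; i++) {
        /* Loop invariant: A[0..i-1] holds the original first i elements, in sorted order. */
        int key = A[i];
        int j = i - 1;
        while (j >= 0 && A[j] > key) {   /* shift larger elements one slot to the right */
            A[j + 1] = A[j];
            j--;
        }
        A[j + 1] = key;                  /* insert key into its place in the sorted prefix */
    }
}

int main(void) {
    int A[] = {10, 12, 7, 2, 15, 6};
    insertion_sort(A, 6);
    for (int k = 0; k < 6; k++) printf("%d ", A[k]);   /* prints 2 6 7 10 12 15 */
    printf("\n");
    return 0;
}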

26 Correctness. Loop Invariant – a loop invariant is a condition (among program variables) that is necessarily true immediately before and immediately after each iteration of a loop. (Note that this says nothing about its truth or falsity partway through an iteration.) The proof has three parts: Initialization – the invariant is true before the first iteration; Maintenance – if it is true before an iteration, it remains true before the next; Termination – when the loop ends, the invariant gives a useful property that helps show the algorithm is correct.

27 Heaps. The binary heap data structure is an array object that can be viewed as a nearly complete binary tree: a binary tree that is completely filled on all levels except possibly the lowest. For example, a tree with three levels has 1 node at the root, then 2 nodes, then somewhere between 1 and 4 nodes on its lowest level. The lowest level is filled from left to right.

28 Heap visual example: the array A = 20, 18, 10, 7, 12, 8, 9, 5, 4, 2, 1, 7 (indices 1 through 12), viewed as a tree with 20 at the root, 18 and 10 as its children, and so on down the levels.
PARENT(i)   // parent of i in the tree
  return ⌊i/2⌋
LEFT(i)     // left child of i in the tree
  return 2i
RIGHT(i)    // right child of i in the tree
  return 2i + 1

29 Procedures of Heaps. MAX-HEAPIFY: maintains the heap property (O(logn)). BUILD-MAX-HEAP: produces a heap from an unordered input array (O(n)). HEAPSORT: sorts an array in place (O(nlogn)). EXTRACT-MAX and INSERT: allow the heap data structure to be used as a priority queue (O(logn)).

30 MAX-HEAPIFY pseudo code
MAX-HEAPIFY(A, i)
  l = LEFT(i)
  r = RIGHT(i)
  if l ≤ A.heap-size and A[l] > A[i] then
    largest = l
  else
    largest = i
  if r ≤ A.heap-size and A[r] > A[largest] then
    largest = r
  if largest ≠ i then
    exchange A[i] and A[largest]
    MAX-HEAPIFY(A, largest)    // the subtree rooted at largest may now violate the heap property
T(n) = O(logn)

31 BUILD-MAX-HEAP pseudo code
BUILD-MAX-HEAP(A)
  A.heap-size = A.length
  for i = floor(A.length / 2) down to 1 do
    MAX-HEAPIFY(A, i)
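A minimal C sketch of MAX-HEAPIFY and BUILD-MAX-HEAP, not from the slides: to keep the parent/child arithmetic identical to the PARENT/LEFT/RIGHT formulas above, the heap is stored 1-based in A[1..heap_size] and A[0] is left unused.

#include <stdio.h>

#define LEFT(i)   (2 * (i))
#define RIGHT(i)  (2 * (i) + 1)

static void swap(int *a, int *b) { int t = *a; *a = *b; *b = t; }

static void max_heapify(int A[], int heap_size, int i) {
    int l = LEFT(i), r = RIGHT(i), largest = i;
    if (l <= heap_size && A[l] > A[i]) largest = l;
    if (r <= heap_size && A[r] > A[largest]) largest = r;
    if (largest != i) {
        swap(&A[i], &A[largest]);
        max_heapify(A, heap_size, largest);  /* the child we swapped into may now violate the heap property */
    }
}

static void build_max_heap(int A[], int n) {
    for (int i = n / 2; i >= 1; i--)         /* nodes n/2+1 .. n are leaves, already trivial heaps */
        max_heapify(A, n, i);
}

int main(void) {
    int A[] = {0, 5, 1, 4, 7, 6, 3, 9, 2};   /* A[0] unused; the heap elements are A[1..8] */
    build_max_heap(A, 8);
    for (int i = 1; i <= 8; i++) printf("%d ", A[i]);  /* prints a max-heap order with 9 at the root */
    printf("\n");
    return 0;
}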

32 Priority Queue. Maintains a set of elements we'll call S; each element has an associated value called a key. Operations: Insert – inserts an element into the set; Maximum – returns the element of S with the largest key; Extract-Max – removes and returns the element of S with the largest key; Increase-Key – increases the value of an element's key to a new, larger value. Can use linked lists or a heap to create a priority queue.

33 Priority Queue Heap based pseudo code
HEAP-MAXIMUM(A)
  return A[1]
HEAP-EXTRACT-MAX(A)    // running time O(logn)
  if A.heap-size < 1 then
    error: no element to extract
  else
    max = A[1]
    A[1] = A[A.heap-size]
    A.heap-size = A.heap-size - 1
    MAX-HEAPIFY(A, 1)
    return max

34 HEAP-INCREASE-KEY(A, i, key)    // running time O(logn)
  if key < A[i] then
    error: new key is smaller than current key
  A[i] = key
  while i > 1 and A[PARENT(i)] < A[i] do
    exchange A[i] and A[PARENT(i)]
    i = PARENT(i)
MAX-HEAP-INSERT(A, key)            // running time O(logn)
  A.heap-size = A.heap-size + 1
  A[A.heap-size] = -infinity
  HEAP-INCREASE-KEY(A, A.heap-size, key)
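A minimal C sketch of these priority-queue operations, again storing the heap 1-based in A[1..heap_size]. The max_heapify helper is the same as in the heap sketch above, INT_MIN stands in for -infinity, and error handling is reduced to a printed message.

#include <stdio.h>
#include <limits.h>

#define LEFT(i)   (2 * (i))
#define RIGHT(i)  (2 * (i) + 1)
#define PARENT(i) ((i) / 2)

static void swap(int *a, int *b) { int t = *a; *a = *b; *b = t; }

static void max_heapify(int A[], int heap_size, int i) {
    int l = LEFT(i), r = RIGHT(i), largest = i;
    if (l <= heap_size && A[l] > A[i]) largest = l;
    if (r <= heap_size && A[r] > A[largest]) largest = r;
    if (largest != i) { swap(&A[i], &A[largest]); max_heapify(A, heap_size, largest); }
}

static int heap_maximum(const int A[]) {                 /* O(1) */
    return A[1];
}

static int heap_extract_max(int A[], int *heap_size) {   /* O(logn) */
    if (*heap_size < 1) { printf("error: no element to extract\n"); return INT_MIN; }
    int max = A[1];
    A[1] = A[*heap_size];                                /* move the last element to the root */
    (*heap_size)--;
    max_heapify(A, *heap_size, 1);                       /* restore the heap property */
    return max;
}

static void heap_increase_key(int A[], int i, int key) { /* O(logn) */
    if (key < A[i]) { printf("error: new key is smaller than current key\n"); return; }
    A[i] = key;
    while (i > 1 && A[PARENT(i)] < A[i]) {               /* float the increased key up toward the root */
        swap(&A[i], &A[PARENT(i)]);
        i = PARENT(i);
    }
}

static void max_heap_insert(int A[], int *heap_size, int key) {  /* O(logn); assumes the array has room */
    (*heap_size)++;
    A[*heap_size] = INT_MIN;                             /* new slot starts at -infinity */
    heap_increase_key(A, *heap_size, key);
}

int main(void) {
    int A[16] = {0};                                     /* A[0] unused */
    int heap_size = 0;
    int input[] = {10, 12, 7, 2, 15, 6};
    for (int k = 0; k < 6; k++) max_heap_insert(A, &heap_size, input[k]);
    printf("maximum = %d\n", heap_maximum(A));           /* 15 */
    while (heap_size > 0)
        printf("%d ", heap_extract_max(A, &heap_size));  /* keys come out largest first: 15 12 10 7 6 2 */
    printf("\n");
    return 0;
}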

35 Decision Trees. A model of the comparisons a sorting algorithm makes. The tree shows all possible permutations of the input: each leaf corresponds to one permutation, and each input ordering leads to exactly one leaf. Internal nodes represent pair-wise comparisons; the root is the first comparison. An execution of the algorithm corresponds to tracing the path from the root to a leaf.

36 EXAMPLE – Decision tree for INSERTION-SORT operating on elements a1, a2, a3. Each of the n! permutations of the elements must appear as a leaf of the tree for the sorting algorithm to sort properly.

