Foundations of Algorithms, Fourth Edition

Presentation transcript:

Foundations of Algorithms, Fourth Edition. Richard Neapolitan and Kumarss Naimipour. Chapter 2: Divide-and-Conquer.

Divide and Conquer. In this approach a problem is divided into subproblems and the same algorithm is applied to each subproblem (often this is done recursively). Examples: Binary Search (review the algorithm in the book), Mergesort (review the algorithm in the book), Quicksort.

Figure 2.1: The steps done by a human when searching with Binary Search. (Note: x = 18)

Complexity of Binary Search. Since this and many other divide-and-conquer algorithms are recursive, recall that we can determine their complexity using recurrence relations. For Binary Search we have
T(n) = T(n/2) + 1
     = [T(n/4) + 1] + 1 = T(n/2^2) + 2
     = [T(n/8) + 1] + 2 = T(n/2^3) + 3
     ...
     = T(n/2^k) + k

What is T(n/2^k) + k? If we let k grow until n = 2^k, then k = log2 n. Why? Because n/2^k is 1 exactly when n = 2^k, and T(1) = 1. Consequently the relation becomes T(n) = T(1) + log2 n = 1 + log2 n, so T(n) ∈ Θ(log2 n).
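
To make the halving in this recurrence concrete, here is a minimal recursive Binary Search sketch in C++. It is not the book's exact pseudocode; the name binarySearch, the std::vector interface, and the index conventions are assumptions made for this example.

#include <vector>

// Return the index of x in the sorted range S[low..high], or -1 if absent.
// Each call discards half of the remaining range, giving T(n) = T(n/2) + 1.
int binarySearch(const std::vector<int>& S, int low, int high, int x) {
    if (low > high) return -1;            // empty range: x is not present
    int mid = low + (high - low) / 2;     // middle index (overflow-safe form)
    if (S[mid] == x) return mid;
    if (x < S[mid])
        return binarySearch(S, low, mid - 1, x);   // search the left half
    return binarySearch(S, mid + 1, high, x);      // search the right half
}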

MergeSort. Recall that in this algorithm we divide the array into two equal halves, sort each half recursively, and then merge them. The recurrence relation is clearly T(n) = 2T(n/2) + n. Recall that merging is O(n), right?
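
For concreteness, here is a minimal Mergesort sketch in C++. This is not the book's pseudocode; the helper name merge, the std::vector interface, and the index conventions are assumptions made just for this illustration.

#include <vector>

// Merge the two sorted halves S[low..mid] and S[mid+1..high] -- O(n) work.
void merge(std::vector<int>& S, int low, int mid, int high) {
    std::vector<int> tmp;
    tmp.reserve(high - low + 1);
    int i = low, j = mid + 1;
    while (i <= mid && j <= high)
        tmp.push_back(S[i] <= S[j] ? S[i++] : S[j++]);
    while (i <= mid)  tmp.push_back(S[i++]);
    while (j <= high) tmp.push_back(S[j++]);
    for (int k = 0; k < (int)tmp.size(); ++k) S[low + k] = tmp[k];
}

// Sort S[low..high]: two recursive calls on halves plus an O(n) merge,
// which is exactly the recurrence T(n) = 2T(n/2) + n.
void mergesort(std::vector<int>& S, int low, int high) {
    if (low >= high) return;              // 0 or 1 element: already sorted
    int mid = low + (high - low) / 2;
    mergesort(S, low, mid);
    mergesort(S, mid + 1, high);
    merge(S, low, mid, high);
}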

Figure 2.2: The steps done by a human when sorting with Mergesort.

Solving T(n) = 2T(n/2) + n:
T(n) = 2T(n/2) + n
     = 2[2T(n/2^2) + n/2] + n = 2^2 T(n/2^2) + 2n
     = 2^2[2T(n/2^3) + n/2^2] + 2n = 2^3 T(n/2^3) + 3n
     ...
     = 2^k T(n/2^k) + kn
If n = 2^k then we have T(n) = nT(1) + n log2 n = n + n log2 n = O(n log2 n).

QuickSort: works in situ (in place)!

void quicksort(int low, int high)
{
    int pivot;
    if (high > low) {
        partition(low, high, pivot);   // place one item in its final position
        quicksort(low, pivot - 1);     // sort the items left of the pivot
        quicksort(pivot + 1, high);    // sort the items right of the pivot
    }
}

Figure 2.3: The steps done by a human when sorting with Quicksort. The subarrays are enclosed in rectangles, whereas the pivot items are free.

Partition: study this carefully. (There are many ways to write this function; all have a complexity of O(n).)

void partition(int low, int high, int& pivot)
{
    int i, j, pivotitem;
    pivotitem = S[low];            // select the leftmost item as the pivot (hmmm)
    j = low;
    for (i = low + 1; i <= high; i++)
        if (S[i] < pivotitem) {
            j++;
            swap S[i] and S[j];    // pseudocode: exchange the two items
        }
    pivot = j;
    swap S[low] and S[pivot];      // put the pivot item in its final position
}
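
As a usage sketch only, the two routines above can be translated into compilable C++ as follows; the global array S (as in the book's pseudocode), the use of std::swap, and the sample data in main are assumptions made for illustration.

#include <iostream>
#include <utility>
#include <vector>

std::vector<int> S;   // the array being sorted; global, as in the book's pseudocode

// Partition S[low..high] around S[low]; on return, pivot is its final index.
void partition(int low, int high, int& pivot) {
    int pivotitem = S[low];
    int j = low;
    for (int i = low + 1; i <= high; i++)
        if (S[i] < pivotitem) {
            j++;
            std::swap(S[i], S[j]);
        }
    pivot = j;
    std::swap(S[low], S[pivot]);
}

// Sort S[low..high] in place.
void quicksort(int low, int high) {
    if (high > low) {
        int pivot;
        partition(low, high, pivot);
        quicksort(low, pivot - 1);
        quicksort(pivot + 1, high);
    }
}

int main() {
    S = {15, 22, 13, 27, 12, 10, 20, 25};       // example data, chosen arbitrarily
    quicksort(0, static_cast<int>(S.size()) - 1);
    for (int x : S) std::cout << x << ' ';      // prints 10 12 13 15 20 22 25 27
    std::cout << '\n';
}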

Complexity of Quicksort. The complexity of this algorithm depends on how good the pivot selection is. If the pivot value always lands in the middle of the array, then the best-case recurrence is T(n) = n + 2T(n/2), which we have already determined gives T(n) = O(n log2 n).

Worst case for Quicksort. This clearly occurs if each pivot value is less than (or greater than) all the other elements of the subarray, i.e., the array is split into pieces of size 1 and n-1. This gives the recurrence relation T(n) = T(1) + T(n-1) + (n-1), where T(1) is the time to sort the left subarray, T(n-1) the time to sort the right subarray, and n-1 the cost of the partition.

Worst-case analysis. T(n) = T(1) + T(n-1) + (n-1) = T(n-1) + (n-1), taking T(1) = 0. Assume the answer is n(n-1)/2 and check it out!
n(n-1)/2 = 0 + (n-1)(n-2)/2 + (n-1)
         = (n-1)(n-2)/2 + 2(n-1)/2
         = ((n-1)(n-2) + 2(n-1))/2
         = (n-1)(n-2+2)/2
         = n(n-1)/2 ☺
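
A throwaway numerical check of this closed form (the variable names are just for this sketch): it iterates T(n) = T(n-1) + (n-1) from T(1) = 0 and compares against n(n-1)/2.

#include <iostream>

int main() {
    long long T = 0;                       // T(1) = 0
    for (long long n = 2; n <= 20; ++n) {
        T += n - 1;                        // T(n) = T(n-1) + (n-1)
        long long closed = n * (n - 1) / 2;
        std::cout << "n=" << n << "  T(n)=" << T
                  << "  n(n-1)/2=" << closed
                  << (T == closed ? "  ok" : "  MISMATCH") << '\n';
    }
}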

Quicksort Analysis. Quicksort's worst case is Θ(n^2). Does this mean that Quicksort is just as bad as, say, selection sort, insertion sort, and/or bubble sort? No! It's all about average-case performance. The average-case performance for those three is Θ(n^2) as well. What is the average-case complexity for Quicksort?

Average Case Analysis. Assume each pivot position p is equally likely, with probability 1/n. Then
A(n) = \sum_{p=1}^{n} \frac{1}{n}\,[A(p-1) + A(n-p)] + (n-1)
     = \frac{1}{n} \sum_{p=1}^{n} [A(p-1) + A(n-p)] + (n-1)
     = \frac{2}{n} \sum_{p=1}^{n} A(p-1) + (n-1)
See HW 22, p. 86, for the last conversion.

Average case, continued. Multiplying by n:
n A(n) = 2 \sum_{p=1}^{n} A(p-1) + n(n-1)
(n-1) A(n-1) = 2 \sum_{p=1}^{n-1} A(p-1) + (n-1)(n-2)
Subtracting these equations we have
n A(n) - (n-1) A(n-1) = 2 A(n-1) + 2(n-1)
\frac{A(n)}{n+1} = \frac{A(n-1)}{n} + \frac{2(n-1)}{n(n+1)}

Average case of Quicksort, continued. Assume a(n) = A(n)/(n+1); we get
a(n) = a(n-1) + \frac{2(n-1)}{n(n+1)}, \qquad a(0) = 0
Applying some simple math we have a(n) ≈ 2 \ln n, which gives
A(n) ≈ (n+1) \cdot 2 \ln n = (n+1) \cdot 2(\ln 2)(\lg n) ≈ 1.38\,(n+1) \lg n ∈ Θ(n \lg n)
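
As a hedged sanity check of this approximation, the following sketch iterates the exact recurrence A(n) = (2/n) Σ A(p-1) + (n-1) from the earlier slide and compares it with 1.38(n+1) lg n; all names are local to this example.

#include <cmath>
#include <iostream>
#include <vector>

int main() {
    const int N = 1 << 16;                 // check up to n = 65536
    std::vector<double> A(N + 1, 0.0);     // A[0] = A[1] = 0
    double prefix = 0.0;                   // running sum A[0] + ... + A[n-1]
    for (int n = 1; n <= N; ++n) {
        prefix += A[n - 1];
        A[n] = 2.0 * prefix / n + (n - 1); // A(n) = (2/n) * sum A(p-1) + (n-1)
    }
    for (int n : {16, 256, 4096, 65536}) {
        double approx = 1.38 * (n + 1) * std::log2(n);
        std::cout << "n=" << n << "  A(n)=" << A[n]
                  << "  1.38(n+1)lg n=" << approx
                  << "  ratio=" << A[n] / approx << '\n';
    }
}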

Matrix Multiplication (Strassen). Let's look at the product of two 2-by-2 matrices:
\begin{pmatrix} c_{11} & c_{12} \\ c_{21} & c_{22} \end{pmatrix} = \begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{pmatrix} \times \begin{pmatrix} b_{11} & b_{12} \\ b_{21} & b_{22} \end{pmatrix}
Clearly (after you do homework #26) the products
m1 = (a11 + a22)(b11 + b22)        m5 = (a11 + a12) b22
m2 = (a21 + a22) b11               m6 = (a21 - a11)(b11 + b12)
m3 = a11 (b12 - b22)               m7 = (a12 - a22)(b21 + b22)
m4 = a22 (b21 - b11)
will give the following!

And the answer is:
C = \begin{pmatrix} m_1 + m_4 - m_5 + m_7 & m_3 + m_5 \\ m_2 + m_4 & m_1 + m_3 - m_2 + m_6 \end{pmatrix}
Original method: 8 multiplications, 4 additions/subtractions. Strassen's method: 7 multiplications and 18 additions/subtractions. Hmmmm! So what's the big deal?
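
As a quick throwaway check that the seven products really reproduce the ordinary 2-by-2 product, the following brute-force sketch compares both formulas on random integer matrices (the test values and names are assumptions for this example).

#include <cstdlib>
#include <iostream>

int main() {
    bool ok = true;
    for (int trial = 0; trial < 1000 && ok; ++trial) {
        // random 2x2 integer matrices with entries in [-9, 9]
        int a11 = rand() % 19 - 9, a12 = rand() % 19 - 9,
            a21 = rand() % 19 - 9, a22 = rand() % 19 - 9;
        int b11 = rand() % 19 - 9, b12 = rand() % 19 - 9,
            b21 = rand() % 19 - 9, b22 = rand() % 19 - 9;

        // Strassen's seven products
        int m1 = (a11 + a22) * (b11 + b22);
        int m2 = (a21 + a22) * b11;
        int m3 = a11 * (b12 - b22);
        int m4 = a22 * (b21 - b11);
        int m5 = (a11 + a12) * b22;
        int m6 = (a21 - a11) * (b11 + b12);
        int m7 = (a12 - a22) * (b21 + b22);

        // Compare against the ordinary 8-multiplication product
        ok = (m1 + m4 - m5 + m7 == a11 * b11 + a12 * b21) &&
             (m3 + m5           == a11 * b12 + a12 * b22) &&
             (m2 + m4           == a21 * b11 + a22 * b21) &&
             (m1 + m3 - m2 + m6 == a21 * b12 + a22 * b22);
    }
    std::cout << (ok ? "all trials match" : "mismatch found") << '\n';
}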

Big Matrices (n by n, with n a power of 2):
\begin{pmatrix} C_{11} & C_{12} \\ C_{21} & C_{22} \end{pmatrix} = \begin{pmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{pmatrix} \times \begin{pmatrix} B_{11} & B_{12} \\ B_{21} & B_{22} \end{pmatrix}
where C11 is the upper left-hand corner submatrix, of size n/2 by n/2; the others are similarly defined. Now m1 = (A11 + A22)(B11 + B22) is a sum and product of matrices.

Our function is then (pseudocode):

void strassen(int n, A, B, C)        // A, B, C are n x n matrices
{
    if (n <= threshold)
        compute C = A x B normally;  // fall back to the ordinary algorithm
    else {
        partition A and B into eight n/2 x n/2 submatrices;
        strassen(n/2, A11 + A22, B11 + B22, M1);
        strassen(n/2, A21 + A22, B11, M2);
        // ... and so on, making 7 recursive calls (NOT eight!)
        combine M1, ..., M7 into C;
    }
}

Complexity: T(n) = 7T(n/2) + cn^2. Using the general theorem, this is T(n) = Θ(n^lg 7) = O(n^2.81). The best known is the Coppersmith-Winograd algorithm, with a time complexity of O(n^2.376). Why am I using big O here?

Recalling the General Theorem (see page 588). Assume T(n) = a T(n/b) + c n^k for n > 1 and n a power of b, with T(1) = d. Then
T(n) \in \begin{cases} \Theta(n^k) & \text{if } a < b^k \\ \Theta(n^k \lg n) & \text{if } a = b^k \\ \Theta(n^{\log_b a}) & \text{if } a > b^k \end{cases}

Just a side note: suppose we had 8 recursive calls instead of 7 in the above case. Then the recurrence relation would be T(n) = 8T(n/2) + cn^2. This has a complexity of what?
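
Working it out with the general theorem above, as a quick sketch: here a = 8, b = 2, and k = 2, so a > b^k and

T(n) \in \Theta(n^{\log_2 8}) = \Theta(n^3),

the same order as the ordinary algorithm; this is exactly why saving the eighth multiplication matters.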

When not to use divide and conquer:
An instance of size n is divided into two or more instances, each of size almost n.
An instance of size n is divided into almost n instances of size n/c, where c is a constant.