Chapter 4: Divide and Conquer


The Design and Analysis of Algorithms
Chapter 4: Divide and Conquer
Master Theorem, Mergesort, Quicksort, Binary Search, Binary Trees

Chapter 4. Divide and Conquer Algorithms
- Basic Idea
- Merge Sort
- Quick Sort
- Binary Search
- Binary Tree Algorithms
- Closest Pair and Quickhull
- Conclusion

Basic Idea
1. Divide an instance of the problem into two or more smaller instances.
2. Solve the smaller instances recursively.
3. Obtain a solution to the original (larger) instance by combining these solutions.

Basic Idea
[figure: a problem of size n is divided into subproblem 1 and subproblem 2, each of size n/2; their solutions are combined into a solution to the original problem]
An example of the pattern follows.
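As a concrete illustration of the pattern (an assumed example, not taken from the slides), the following C++ sketch finds the maximum of an array by dividing the range in half, solving each half recursively, and combining the two answers.

#include <algorithm>

// Assumed sketch of the divide-and-conquer pattern: maximum of a[lo..hi].
int findMax(const int a[], int lo, int hi) {
    if (lo == hi)                              // base case: a single element
        return a[lo];
    int mid = lo + (hi - lo) / 2;              // divide
    int leftMax  = findMax(a, lo, mid);        // conquer the left half
    int rightMax = findMax(a, mid + 1, hi);    // conquer the right half
    return std::max(leftMax, rightMax);        // combine
}

Its recurrence is T(n) = 2T(n/2) + Θ(1), which the Master Theorem below resolves to Θ(n).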

General Divide-and-Conquer Recurrence
T(n) = aT(n/b) + f(n), where f(n) ∈ Θ(n^d), d ≥ 0

Master Theorem:
- If a < b^d, then T(n) ∈ Θ(n^d)
- If a = b^d, then T(n) ∈ Θ(n^d log n)
- If a > b^d, then T(n) ∈ Θ(n^(log_b a))

Examples:
- T(n) = T(n/2) + n ⇒ T(n) ∈ Θ(n). Here a = 1, b = 2, d = 1, so a < b^d.
- T(n) = 2T(n/2) + 1 ⇒ T(n) ∈ Θ(n^(log_2 2)) = Θ(n). Here a = 2, b = 2, d = 0, so a > b^d.

Examples
- T(n) = T(n/2) + 1 ⇒ T(n) ∈ Θ(log n). Here a = 1, b = 2, d = 0, so a = b^d.
- T(n) = 4T(n/2) + n ⇒ T(n) ∈ Θ(n^(log_2 4)) = Θ(n^2). Here a = 4, b = 2, d = 1, so a > b^d.
- T(n) = 4T(n/2) + n^2 ⇒ T(n) ∈ Θ(n^2 log n). Here a = 4, b = 2, d = 2, so a = b^d.
- T(n) = 4T(n/2) + n^3 ⇒ T(n) ∈ Θ(n^3). Here a = 4, b = 2, d = 3, so a < b^d.

Recurrence Examples

Recursion Tree for T(n)=3T(n/4)+(n2) cn2 cn2 c(n/4)2 T(n/16) (c) T(n/4) T(n/4) T(n/4) (a) (b) cn2 cn2 (3/16)cn2 c(n/4)2 c(n/4)2 c(n/4)2 log 4n (3/16)2cn2 c(n/16)2 c(n/16)2 c(n/16)2 c(n/16)2 c(n/16)2 c(n/16)2 c(n/16)2 c(n/16)2 c(n/16)2 (nlog 43) T(1) T(1) T(1) T(1) T(1) T(1) 3log4n= nlog 43 Total O(n2) (d)

Solution to T(n)=3T(n/4)+(n2) The height is log 4n, #leaf nodes = 3log 4n= nlog 43. Leaf node cost: T(1). Total cost T(n)=cn2+(3/16) cn2+(3/16)2 cn2+  +(3/16)log 4n-1 cn2+ (nlog 43) =(1+3/16+(3/16)2+  +(3/16)log 4n-1) cn2 + (nlog 43) <(1+3/16+(3/16)2+  +(3/16)m+ ) cn2 + (nlog 43) =(1/(1-3/16)) cn2 + (nlog 43) =16/13cn2 + (nlog 43) =O(n2).

Recursion Tree of T(n) = T(n/3) + T(2n/3) + O(n)

Mergesort
1. Split array A[0..n−1] into two about equal halves and make copies of each half in arrays B and C.
2. Sort arrays B and C recursively.
3. Merge the sorted arrays B and C into array A.

Mergesort Code

void mergesort(int a[], int left, int right) {
    if (right > left) {
        int middle = left + (right - left) / 2;   // midpoint, computed without overflow
        mergesort(a, left, middle);               // sort the left half
        mergesort(a, middle + 1, right);          // sort the right half
        merge(a, left, middle, right);            // merge the two sorted halves
    }
}
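The merge routine is not shown on the slide; the following is a minimal sketch of one possible implementation (assumed, not taken from the slides), combining the two sorted halves a[left..middle] and a[middle+1..right] through a temporary buffer.

#include <vector>

// Assumed sketch of merge: combines the sorted ranges a[left..middle] and a[middle+1..right].
void merge(int a[], int left, int middle, int right) {
    std::vector<int> tmp;                        // the Θ(n) extra space noted in the analysis
    tmp.reserve(right - left + 1);
    int i = left, j = middle + 1;
    while (i <= middle && j <= right)            // repeatedly take the smaller front element
        tmp.push_back(a[i] <= a[j] ? a[i++] : a[j++]);
    while (i <= middle) tmp.push_back(a[i++]);   // copy leftovers from the left half
    while (j <= right)  tmp.push_back(a[j++]);   // copy leftovers from the right half
    for (int k = 0; k < (int)tmp.size(); ++k)    // copy back into a[left..right]
        a[left + k] = tmp[k];
}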

Analysis of Mergesort
- Recurrence: T(n) = 2T(n/2) + n
- All cases have the same efficiency: Θ(n log n)
- Space requirement: Θ(n) (not in-place)
- Can be implemented without recursion (bottom-up); see the sketch below
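A minimal sketch of the bottom-up variant mentioned above (assumed, not from the slides), reusing the merge routine sketched earlier: merge runs of width 1, 2, 4, ... until a single run covers the whole array.

#include <algorithm>

// Assumed sketch of bottom-up (non-recursive) mergesort.
void mergesortBottomUp(int a[], int n) {
    for (int width = 1; width < n; width *= 2) {
        for (int left = 0; left + width < n; left += 2 * width) {
            int middle = left + width - 1;
            int right  = std::min(left + 2 * width - 1, n - 1);
            merge(a, left, middle, right);       // same merge as above
        }
    }
}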

Features of Mergesort
- All cases: Θ(n log n)
- Requires Θ(n) additional memory
- Recursive (a bottom-up version also exists)
- Stable, provided the merge takes from the left half on equal keys
- Rarely used for sorting in main memory; commonly used for external sorting

Quicksort
Let S be a set of N elements. Quicksort(S):
1. If N = 0 or N = 1, return.
2. Pick an element v in S (the pivot).
3. Partition S − {v} into two disjoint sets: S1 = {x ∈ S − {v} | x ≤ v} and S2 = {x ∈ S − {v} | x ≥ v}.
4. Return {quicksort(S1), followed by v, followed by quicksort(S2)}.

Quicksort: selecting the pivot (partitioning element)
Median-of-three algorithm:
1. Take the first, the last and the middle elements and sort them (within their positions).
2. Choose the median of these three elements as the pivot.
3. Hide the pivot: swap it with the next-to-last element.
Example:                 5 8 9 1 4 2 7
After sorting the three: 1 8 9 5 4 2 7
After hiding the pivot:  1 8 9 2 4 5 7
A sketch of this step follows.
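A minimal sketch of the median-of-three step, written to match the quicksort code later on this page; the helper name median3 is an assumption (the slides describe the technique but do not name a function).

#include <algorithm>

// Assumed sketch: median-of-three pivot selection.
// Sorts a[left], a[center], a[right] within their positions, hides the pivot at a[right-1],
// and returns the pivot value.
int median3(int a[], int left, int right) {
    int center = left + (right - left) / 2;
    if (a[center] < a[left])   std::swap(a[center], a[left]);
    if (a[right]  < a[left])   std::swap(a[right],  a[left]);
    if (a[right]  < a[center]) std::swap(a[right],  a[center]);
    std::swap(a[center], a[right - 1]);   // hide the pivot next to the last element
    return a[right - 1];
}

On the example above this yields 1 8 9 2 4 5 7 with pivot 5.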

Quicksort: partitioning
1. Rearrange the list so that all the elements in the first s positions are smaller than or equal to the pivot, and all the elements in the remaining n−s−1 positions are larger than or equal to the pivot. (The pivot itself does not participate in the rearranging.)
2. Exchange the pivot with the first element of the second subarray; the pivot is now in its final position.
3. Sort the two subarrays recursively.

Quicksort Code

// left  – the smallest valid index
// right – the largest valid index
void quicksort(int a[], int left, int right) {
    if (left + 10 <= right) {
        int pivot = median3(a, left, right);   // median-of-three; pivot value sits at a[right-1]
        int i = left, j = right - 1;
        for ( ; ; ) {
            while (a[++i] < pivot) {}          // scan right for an element >= pivot
            while (pivot < a[--j]) {}          // scan left for an element <= pivot
            if (i < j) std::swap(a[i], a[j]);
            else break;
        }
        std::swap(a[i], a[right - 1]);         // put the pivot into its final position
        quicksort(a, left, i - 1);             // sort the elements <= pivot
        quicksort(a, i + 1, right);            // sort the elements >= pivot
    }
    else
        insertionsort(a, left, right);         // small subarrays: use insertion sort instead
}
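The insertionsort call above handles the small subarrays (fewer than about 10 elements); the slides do not show it, so here is a minimal assumed sketch, followed by a typical top-level call.

// Assumed sketch of the insertion sort used for small subarrays.
void insertionsort(int a[], int left, int right) {
    for (int p = left + 1; p <= right; ++p) {
        int tmp = a[p];
        int j = p;
        while (j > left && tmp < a[j - 1]) {   // shift larger elements one slot to the right
            a[j] = a[j - 1];
            --j;
        }
        a[j] = tmp;                            // drop the saved element into its place
    }
}

// Typical top-level call for an array of n elements:
// quicksort(a, 0, n - 1);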

Analysis of Quicksort
- Best case (split in the middle): Θ(n log n), from T(n) = 2T(n/2) + n
- Worst case (already-sorted array): Θ(n^2), from T(n) = T(n−1) + n
- Average case (random arrays): Θ(n log n)
Quicksort is considered the method of choice for internal sorting of large files (n ≥ 10,000).

Features of Quicksort
- Best and average case: Θ(n log n); worst case: Θ(n^2), which happens when the pivot is repeatedly the smallest or the largest element
- In-place sorting: additional memory is needed only for swapping
- Recursive
- Not stable
- Not used for life-critical and mission-critical applications unless you budget for the worst-case response time

Binary Search
A very efficient algorithm for searching in a sorted array. Compare the key K with the middle element A[m]:
K  vs  A[0] . . . A[m] . . . A[n−1]
If K = A[m], stop (successful search); otherwise, continue searching by the same method in A[0..m−1] if K < A[m], and in A[m+1..n−1] if K > A[m]. A sketch follows.
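A minimal iterative sketch of the method just described; the function name binarySearch and the convention of returning −1 on an unsuccessful search are assumptions.

// Assumed sketch: binary search in a sorted array a[0..n-1].
// Returns the index of key, or -1 if key is not present.
int binarySearch(const int a[], int n, int key) {
    int lo = 0, hi = n - 1;
    while (lo <= hi) {
        int m = lo + (hi - lo) / 2;    // middle index
        if (key == a[m]) return m;     // successful search
        if (key < a[m])  hi = m - 1;   // continue in a[lo..m-1]
        else             lo = m + 1;   // continue in a[m+1..hi]
    }
    return -1;                         // unsuccessful search
}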

Analysis of Binary Search
- Time efficiency: T(n) = T(n/2) + 1
- Worst case: O(log n); best case: O(1)
- Optimal for searching a sorted array
- Limitation: the data must be in a sorted array (not a linked list)

Binary Tree Algorithms
A binary tree is a divide-and-conquer-ready structure.
Ex. 1: Classic traversals (preorder, inorder, postorder)

Algorithm Inorder(T)
    if T ≠ ∅
        Inorder(Tleft)
        print(root of T)
        Inorder(Tright)

Efficiency: Θ(n). A C++ sketch follows.
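A minimal C++ rendering of the same traversal, with an assumed Node structure (the slides give only the pseudocode above).

#include <cstdio>

// Assumed node structure for the sketches below.
struct Node {
    int   key;
    Node* left;
    Node* right;
};

// Inorder traversal: left subtree, root, right subtree.
void inorder(const Node* t) {
    if (t != nullptr) {                  // "if T is not empty"
        inorder(t->left);
        std::printf("%d\n", t->key);     // print the root of T
        inorder(t->right);
    }
}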

Binary Tree Algorithms
Ex. 2: Computing the height of a binary tree
h(T) = max{ h(TL), h(TR) } + 1 if T ≠ ∅, and h(∅) = −1
Efficiency: Θ(n). A sketch follows.
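A matching C++ sketch, reusing the Node structure assumed above.

#include <algorithm>

// Height of a binary tree: -1 for the empty tree, as in the formula above.
int height(const Node* t) {
    if (t == nullptr)
        return -1;                                        // h(empty) = -1
    return std::max(height(t->left), height(t->right)) + 1;
}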

Recursion tree for T(n) = aT(n/b) + f(n)