Divide and Conquer Applications Sanghyun Park Fall 2002 CSE, POSTECH

Merge Sort Sort the first half of the array using merge sort. Sort the second half of the array using merge sort. Merge the first half of the array with the second half.

Merge Algorithm Merge is an operation that combines two sorted arrays. Assume the result is to be placed in a separate array called result (already allocated). The two given arrays are called front and back. Both front and back are in increasing order. For the complexity analysis, the size of the input, n, is the sum n_front + n_back.

Merge Algorithm For each array keep track of the current position. REPEAT until all the elements of one of the given arrays have been copied into result: – Compare the current elements of front and back. – Copy the smaller into the current position of result (break the ties however you like). – Increment the current position of result and the array that was copied from. Copy all the remaining elements of the other given array into result.
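A minimal C++ sketch of this merge step follows; the names front, back, and result come from the slides, while the use of std::vector<int> and the exact function signature are assumptions made for illustration.

#include <cstddef>
#include <vector>

// Merge two sorted arrays, front and back, into result, which is assumed to be
// preallocated to front.size() + back.size(). Ties are broken by taking from front.
void merge(const std::vector<int>& front, const std::vector<int>& back,
           std::vector<int>& result)
{
    std::size_t f = 0, b = 0, r = 0;                    // current positions
    while (f < front.size() && b < back.size())         // until one array is exhausted
        result[r++] = (front[f] <= back[b]) ? front[f++] : back[b++];
    while (f < front.size()) result[r++] = front[f++];  // copy the remaining elements
    while (b < back.size())  result[r++] = back[b++];   // of the other array
}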

Merge Algorithm - Complexity Every element in front and back is copied exactly once. Each copy is two accesses, so the total number of accesses due to copying is 2n. The number of comparisons could be as small as min(n_front, n_back) or as large as n-1. Each comparison is two accesses.

Merge Algorithm - Complexity In the worst case the total number of accesses is 2n + 2(n-1) = O(n). In the best case the total number of accesses is 2n + 2*min(n_front, n_back) = O(n). The average case is between the worst and best case and is therefore also O(n).

Merge Sort Algorithm Split anArray into two non-empty parts any way you like. For example, front = the first n/2 elements in anArray, back = the remaining elements in anArray. Sort front and back by recursively calling MergeSort. Now you have two sorted arrays containing all the elements from the original array. Use merge to combine them and put the result in anArray.
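As a rough illustration (not the textbook's routine), the split/recurse/merge steps might look like this in C++, reusing the merge sketch above:

// Sort anArray by splitting it in half, sorting each half recursively,
// and merging the two sorted halves back into anArray.
void mergeSort(std::vector<int>& anArray)
{
    if (anArray.size() <= 1) return;                                  // base case: already sorted
    std::size_t mid = anArray.size() / 2;
    std::vector<int> front(anArray.begin(), anArray.begin() + mid);   // first n/2 elements
    std::vector<int> back(anArray.begin() + mid, anArray.end());      // remaining elements
    mergeSort(front);                                                 // sort each half recursively
    mergeSort(back);
    merge(front, back, anArray);                                      // combine; result goes into anArray
}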

MergeSort Call Graph (n=7) Each box represents one invocation of MergeSort. How many levels are there in general if the array is divided in half each time? [Call tree on index ranges: 0~6 splits into 0~2 and 3~6; 0~2 splits into 0~0 and 1~2 (then 1~1 and 2~2); 3~6 splits into 3~4 (then 3~3 and 4~4) and 5~6 (then 5~5 and 6~6).]

MergeSort Call Graph (general) Suppose n = 2^k. How many levels are there? How many boxes are on level j? What value is in each box at level j? [Figure: the box sizes are n at the top level, then n/2, n/4, and so on down the levels.]

MergeSort: Complexity Analysis Each invocation of mergesort on p array positions does the following: – Copies all p positions once (#accesses = O(p)) – Calls merge (#accesses = O(p)) Observe that p is the same for all invocations at the same level; therefore the total number of accesses at a given level j is O((#invocations at level j) * p_j).

MergeSort: Complexity Analysis The total number of accesses at level j is O((#invocations at level j) * p_j) = O(2^(j-1) * n/2^(j-1)) = O(n). In other words, the total number of accesses at each level is O(n). The total number of accesses for the entire mergesort is the sum of the accesses over all levels: (#levels) * O(n) = O(log n) * O(n) = O(n log n).
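Written out as a sum (assuming n = 2^k as on the earlier call-graph slide, so there are O(log n) levels of calls):

\[
\sum_{j=1}^{O(\log n)} 2^{\,j-1} \cdot O\!\left(\frac{n}{2^{\,j-1}}\right)
  \;=\; \sum_{j=1}^{O(\log n)} O(n)
  \;=\; O(n \log n).
\]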

MergeSort: Complexity Analysis Best case: O(n log n) Worst case: O(n log n) Average case: O(n log n)

Quick Sort Quicksort can be seen as a variation of mergesort in which front and back are defined in a different way.

Quicksort Algorithm Partition anArray into two non-empty parts. – Pick any value in the array as the pivot. – small = the elements in anArray < pivot – large = the elements in anArray > pivot – Place pivot in either part, so as to make sure neither part is empty. Sort small and large by recursively calling QuickSort. You could use merge to combine them, but because the elements in small are smaller than the elements in large, simply concatenate small and large and put the result into anArray.
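One way the partition/recurse/concatenate scheme could look in C++ (a sketch, not the text's routine: it always picks the first element as the pivot, builds small and large as separate vectors, and places the pivot between the two parts rather than inside one of them):

// Sort anArray by partitioning around a pivot and sorting each part recursively.
void quickSort(std::vector<int>& anArray)
{
    if (anArray.size() <= 1) return;                     // base case
    int pivot = anArray[0];                              // pick the first value as the pivot
    std::vector<int> small, large;
    for (std::size_t i = 1; i < anArray.size(); ++i)     // split the rest around the pivot
        (anArray[i] < pivot ? small : large).push_back(anArray[i]);
    quickSort(small);
    quickSort(large);
    anArray.clear();                                     // concatenate: small, pivot, large
    anArray.insert(anArray.end(), small.begin(), small.end());
    anArray.push_back(pivot);
    anArray.insert(anArray.end(), large.begin(), large.end());
}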

Quicksort: Complexity Analysis Like mergesort, a single invocation of quicksort on an array of size p has complexity O(p): – p comparisons = 2*p accesses – 2*p moves (copying) = 4*p accesses Best case: every pivot chosen by quicksort partitions the array into equal-sized parts. In this case quicksort has the same big-O complexity as mergesort: O(n log n).

Quicksort: Complexity Analysis Worst case: the pivot chosen is the largest or smallest value in the array. Partition creates one part of size 1 (containing only the pivot) and the other of size p-1. [Figure: the recursion degenerates into a chain: n splits into 1 and n-1, then n-1 splits into 1 and n-2, and so on.]

Quicksort: Complexity Analysis Worst case: There are n-1 invocations of quicksort (not counting base cases) with arrays of size p = n, n-1, n-2, …, 2. Since each of these does O(p) work, the total number of accesses is O(n) + O(n-1) + … + O(1) = O(n^2). Ironically, if the pivot is always taken from a fixed position such as the first element, the worst case occurs when the list is already sorted (or nearly sorted)!

Quicksort: Complexity Analysis The average case must lie between the best case O(n log n) and the worst case O(n^2). Analysis yields a complex recurrence relation. The average-case number of comparisons turns out to be approximately 1.386*n*log n - 2.846*n. Therefore the average-case time complexity is O(n log n).
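For reference, one common form of that recurrence (assuming every pivot rank is equally likely) counts the expected number of comparisons C(n):

\[
C(n) = (n-1) + \frac{1}{n}\sum_{k=0}^{n-1}\bigl(C(k) + C(n-1-k)\bigr),
\qquad C(0) = C(1) = 0,
\]

whose solution grows as roughly 2 n ln n, matching the 1.386*n*log n figure quoted above (since 2 ln 2 ≈ 1.386).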

Quicksort: Complexity Analysis Best case: O(n log n) Worst case: O(n^2) Average case: O(n log n) Note that this quicksort is inferior to insertion sort and merge sort if the list is sorted, nearly sorted, or reverse sorted.

Closest Pair Of Points Given n points in 2D, find the pair that are closest.

Applications We plan to drill holes in a metal sheet. If the holes are too close, the sheet will tear during drilling. Verify that no two holes are closer than a threshold distance (e.g., that all holes are at least 1 inch apart).

Air Traffic Control 3D --- Locations of airplanes flying in the neighborhood of a busy airport are known. We want to be sure that no two planes get closer than a given threshold distance.

Simple Solution For each of the n(n-1)/2 pairs of points, determine the distance between the points in the pair. Determine the pair with the minimum distance. O(n^2) time.
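A brute-force sketch in C++; the Point struct and dist helper are illustrative assumptions, not taken from the text:

#include <cmath>
#include <cstddef>
#include <utility>
#include <vector>

struct Point { double x, y; };

double dist(const Point& a, const Point& b)
{
    return std::hypot(a.x - b.x, a.y - b.y);   // Euclidean distance
}

// Return the indices of the closest pair by examining all n(n-1)/2 pairs.
// Assumes pts.size() >= 2.
std::pair<std::size_t, std::size_t> closestPairBruteForce(const std::vector<Point>& pts)
{
    std::pair<std::size_t, std::size_t> best{0, 1};
    double bestDist = dist(pts[0], pts[1]);
    for (std::size_t i = 0; i < pts.size(); ++i)
        for (std::size_t j = i + 1; j < pts.size(); ++j)
            if (dist(pts[i], pts[j]) < bestDist) {
                bestDist = dist(pts[i], pts[j]);
                best = {i, j};
            }
    return best;
}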

Divide and Conquer Solution When n is small, use simple solution. When n is large, – Divide the point set into two roughly equal parts A and B. – Determine the closest pair of points in A. – Determine the closest pair of points in B. – Determine the closest pair of points such that one point is in A and the other in B. – From the three closest pairs computed, select the one with least distance.

Example Divide so that points in A have x coordinate <= that of points in B.

Example Find the closest pair in A. Let d_1 be the distance between the points in this pair.

Example Find the closest pair in B. Let d_2 be the distance between the points in this pair.

Example Let d = min{d_1, d_2}. Is there a pair with one point in A and the other in B whose distance is less than d?

Example Candidate points lie within distance d of the dividing line. Call the strips of A and B within d of the line R_A and R_B, respectively.

Example Let q be a point in R_A. q needs to be paired only with those points in R_B whose y-coordinate is within d of q's y-coordinate.

Example The points that need to be paired with q lie in a d x 2d rectangle of R_B (the comparing region of q). The points of B in this rectangle are at least d apart from one another. (What is the maximum number of points in this rectangle?)

Example So the comparing region of q contains at most 6 points, and the number of pairs to check is <= 6 * |R_A| = O(n).
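In code, this step is often implemented by scanning all strip points in y-order and comparing each point only with the few points that follow it within d in y. A sketch reusing the Point and dist helpers above (it scans the whole strip rather than only R_A-to-R_B pairs, a common textbook variant that still finds the same minimum):

#include <algorithm>

// strip: points within distance d of the dividing line, sorted by y-coordinate.
// Returns the smallest pairwise distance found in the strip, or d if none is smaller.
double checkStrip(const std::vector<Point>& strip, double d)
{
    double best = d;
    for (std::size_t i = 0; i < strip.size(); ++i)
        // Only points whose y is within best of strip[i].y can improve the answer,
        // and only a constant number of such points can follow strip[i].
        for (std::size_t j = i + 1;
             j < strip.size() && strip[j].y - strip[i].y < best; ++j)
            best = std::min(best, dist(strip[i], strip[j]));
    return best;
}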

Time Complexity Create a sorted list of points (by x-coordinate): O(n log n) time. Create a sorted list of points (by y-coordinate): O(n log n) time. Using these two lists, the required pairs of points from R_A and R_B can be constructed in O(n) time. Let n < 4 define a small instance.

Time Complexity Let t(n) be the time to find the closest pair, excluding the time to create the two sorted lists. t(n) = c for n < 4, where c is a constant. When n >= 4, t(n) = t(ceil(n/2)) + t(floor(n/2)) + d*n, where d is a constant. To solve the recurrence, assume n is a power of 2 and use repeated substitution. t(n) = O(n log n)
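Carrying out the substitution under that assumption (n a power of 2, so the ceiling and floor disappear and the recursion stops once the subproblem size drops to 2, a small instance):

\[
t(n) = 2\,t(n/2) + dn = 4\,t(n/4) + 2dn = \cdots = 2^{k}\,t\!\left(\frac{n}{2^{k}}\right) + k\,dn .
\]

Stopping at \(n/2^{k} = 2\), i.e. \(k = \log_2 n - 1\), gives \(t(n) = \tfrac{n}{2}\,c + (\log_2 n - 1)\,dn = O(n \log n)\).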

Closest Pair Of Points Given n points in 2D, find the pair that are closest. C++ routines are given on pp of the text. The complete program is also available at