CSC 421: Algorithm Design & Analysis
Spring 2013

Divide & conquer
- divide-and-conquer approach
- familiar examples: merge sort, quick sort
- other examples: closest points, large integer multiplication
- tree operations: binary trees, BSTs

Divide & conquer
probably the best-known problem-solving paradigm:
1. divide the problem into several subproblems of the same type (ideally of about equal size)
2. solve the subproblems, typically using recursion
3. combine the solutions to the subproblems to obtain a solution to the original problem

Not always a win
some problems can be thought of as divide & conquer, but are not efficient to implement that way
- e.g., counting the number of 0 elements in a list of numbers:
  1. recursively count the number of 0's in the 1st half of the list
  2. recursively count the number of 0's in the 2nd half of the list
  3. add the two counts

Cost(N) = 2 Cost(N/2) + C
        = 2 [2 Cost(N/4) + C] + C = 4 Cost(N/4) + 3C
        = 4 [2 Cost(N/8) + C] + 3C = 8 Cost(N/8) + 7C
        = ...
        = N Cost(N/N) + (N-1)C = N C' + (N-1)C = (C + C')N - C  →  O(N)

the overhead of recursion makes this much slower than a simple iteration (i.e., decrease & conquer)
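As a rough illustration (not from the original slides; class and method names are invented for this sketch), here are both versions of the zero count in Java:

// Hypothetical illustration of the zero-count example above.
public class ZeroCount {

    // Divide & conquer: count zeros in list[lo..hi] by splitting the range in half.
    // Still O(N) work overall, but pays recursion overhead on every element.
    public static int countZerosDC(int[] list, int lo, int hi) {
        if (lo > hi) {                      // empty range
            return 0;
        }
        if (lo == hi) {                     // single element
            return list[lo] == 0 ? 1 : 0;
        }
        int mid = (lo + hi) / 2;
        return countZerosDC(list, lo, mid) + countZerosDC(list, mid + 1, hi);
    }

    // Simple iteration (decrease & conquer): same O(N), far less overhead.
    public static int countZerosIterative(int[] list) {
        int count = 0;
        for (int value : list) {
            if (value == 0) {
                count++;
            }
        }
        return count;
    }
}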

Master Theorem
Suppose Cost(N) = a Cost(N/b) + C N^d
- that is, divide & conquer is performed with polynomial-time overhead O(N^d)

          O(N^d)            if a < b^d
Cost(N) = O(N^d log N)      if a = b^d
          O(N^(log_b a))    if a > b^d

- e.g., the zero count algorithm has Cost(N) = 2 Cost(N/2) + C
  a = 2, b = 2, d = 0  →  2 > 2^0  →  O(N^(log_2 2)) = O(N)
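Purely as an illustration (not part of the slides; the class and method names are hypothetical), a tiny Java helper that applies the three cases to given a, b, d:

// Hypothetical helper: classify Cost(N) = a*Cost(N/b) + C*N^d via the Master Theorem.
public class MasterTheorem {

    public static String classify(double a, double b, double d) {
        double threshold = Math.pow(b, d);               // compare a against b^d
        if (a < threshold) {
            return "O(N^" + d + ")";
        } else if (a == threshold) {                     // exact compare is fine for small integer inputs
            return "O(N^" + d + " log N)";
        } else {
            double exponent = Math.log(a) / Math.log(b); // log_b(a)
            return "O(N^" + exponent + ")";
        }
    }

    public static void main(String[] args) {
        System.out.println(classify(2, 2, 0));   // zero count:          O(N^1.0)
        System.out.println(classify(2, 2, 1));   // merge sort:          O(N^1.0 log N)
        System.out.println(classify(3, 2, 0));   // d&c multiplication:  O(N^1.58...)
    }
}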

Merge sort
we have seen divide-and-conquer algorithms that are more efficient than brute force
e.g., merge sort of list[0..N-1]:
1. if N <= 1, then DONE
2. otherwise,
   a) merge sort list[0..N/2]
   b) merge sort list[N/2+1..N-1]
   c) merge the two sorted halves

recall: Cost(N) = 2 Cost(N/2) + CN
- merging is O(N), but requires O(N) additional storage and copying
- we can show this is O(N log N) by unwinding, or
- a = 2, b = 2, d = 1  →  2 = 2^1  →  O(N log N) by the Master Theorem
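A compact Java sketch of merge sort along these lines (an illustration, not the course's official code; the class and helper names are invented):

// Illustrative merge sort on an int array.
import java.util.Arrays;

public class MergeSortDemo {

    public static void mergeSort(int[] list) {
        if (list.length <= 1) {
            return;                                            // base case: already sorted
        }
        int mid = list.length / 2;
        int[] left = Arrays.copyOfRange(list, 0, mid);         // O(N) extra storage
        int[] right = Arrays.copyOfRange(list, mid, list.length);
        mergeSort(left);                                       // sort the two halves...
        mergeSort(right);
        merge(left, right, list);                              // ...then merge them (O(N))
    }

    private static void merge(int[] left, int[] right, int[] dest) {
        int i = 0, j = 0, k = 0;
        while (i < left.length && j < right.length) {
            dest[k++] = (left[i] <= right[j]) ? left[i++] : right[j++];
        }
        while (i < left.length)  { dest[k++] = left[i++]; }
        while (j < right.length) { dest[k++] = right[j++]; }
    }
}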

Why is halving most common?
consider a variation of merge sort that divides the list into 3 parts:
1. if N <= 1, then DONE
2. otherwise,
   a) merge sort list[0..N/3]
   b) merge sort list[N/3+1..2N/3]
   c) merge sort list[2N/3+1..N-1]
   d) merge the three sorted parts

Cost(N) = 3 Cost(N/3) + CN
a = 3, b = 3, d = 1  →  3 = 3^1  →  O(N log N)

in general:
  Cost(N) = X Cost(N/X) + CN  →  O(N log N)
  Cost(N) = X Cost(N/X) + C   →  O(N)
- dividing into halves is simplest, and just as efficient as thirds, quarters, ...

Quick sort
quick sort is another O(N log N) sort which is often faster in practice (in Java, it underlies Arrays.sort for primitive types)
e.g., quick sort list[0..N-1]:
1. if N <= 1, then DONE
2. otherwise,
   a) select a pivot element (e.g., list[0], list[N/2], list[random], ...)
   b) partition the list into [items <= pivot] and [items > pivot]
   c) quick sort the partitions

best case: pivot is the median
  Cost(N) = 2 Cost(N/2) + CN  →  O(N log N)
worst case: pivot is the smallest or largest value
  Cost(N) = Cost(N-1) + CN  →  O(N^2)
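A minimal in-place quick sort sketch in Java (illustrative only; for simplicity it uses the last element as the pivot rather than any particular choice from the slide, and the names are invented):

// Illustrative in-place quick sort using a Lomuto-style partition.
public class QuickSortDemo {

    public static void quickSort(int[] list, int lo, int hi) {
        if (lo >= hi) {
            return;                                  // 0 or 1 elements: done
        }
        int p = partition(list, lo, hi);             // pivot ends up at index p
        quickSort(list, lo, p - 1);                  // sort items <= pivot
        quickSort(list, p + 1, hi);                  // sort items > pivot
    }

    // Partition list[lo..hi] around the pivot list[hi]; return the pivot's final index.
    private static int partition(int[] list, int lo, int hi) {
        int pivot = list[hi];
        int i = lo;
        for (int j = lo; j < hi; j++) {
            if (list[j] <= pivot) {
                swap(list, i++, j);
            }
        }
        swap(list, i, hi);
        return i;
    }

    private static void swap(int[] list, int a, int b) {
        int tmp = list[a]; list[a] = list[b]; list[b] = tmp;
    }
}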

Quick sort (cont.)
average case: O(N log N)
there are variations that make the worst case even more unlikely:
- switch to selection sort when the partitions get small
- median-of-three partitioning: instead of just taking the first item (or a random item) as the pivot, take the median of the first, middle, and last items in the list
  - if the list is partially sorted, the middle element will be close to the overall median
  - if the list is random, the odds of selecting an item near the median are improved
refinements like these can improve runtime by 20-25%; however, O(N^2) degradation is still possible
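One possible median-of-three pivot selection, as a hedged sketch (a generic version, not code from the course):

// Hypothetical median-of-three pivot selection: returns the index of the median
// of list[lo], list[mid], and list[hi], to be used as the quick sort pivot.
public class MedianOfThree {

    public static int medianOfThreeIndex(int[] list, int lo, int hi) {
        int mid = lo + (hi - lo) / 2;
        int a = list[lo], b = list[mid], c = list[hi];
        if ((a <= b && b <= c) || (c <= b && b <= a)) {
            return mid;          // b is the median
        } else if ((b <= a && a <= c) || (c <= a && a <= b)) {
            return lo;           // a is the median
        } else {
            return hi;           // c is the median
        }
    }
}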

Closest pair
given a set of N points, find the pair with minimum distance
- brute force approach: consider every pair of points, compare distances & take the minimum -- big Oh? O(N^2)
- there exists an O(N log N) divide-and-conquer solution

assume the points are sorted by x-coordinate (can be done in O(N log N)):
1. partition the points into two equal parts using a vertical line in the plane
2. recursively determine the closest pair on the left side (Ldist) and the closest pair on the right side (Rdist)
3. find the closest pair that straddles the line (Cdist), considering only points within min(Ldist, Rdist) of the line (can be done in O(N))
4. answer = min(Ldist, Rdist, Cdist)

Cost(N) = 2 Cost(N/2) + CN  →  O(N log N)
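A condensed Java sketch of this approach (illustrative only; the names are invented, and for simplicity the strip is re-sorted by y at each level, which gives O(N log^2 N) rather than the strict O(N log N) version described above):

import java.util.ArrayList;
import java.util.Arrays;
import java.util.Comparator;
import java.util.List;

// Illustrative divide & conquer closest pair on (x, y) points stored as double[2].
public class ClosestPair {

    // returns the minimum distance between any two of the given points
    public static double closest(double[][] pts) {
        double[][] byX = pts.clone();
        Arrays.sort(byX, Comparator.comparingDouble((double[] q) -> q[0]));   // pre-sort by x
        return closestRec(byX, 0, byX.length - 1);
    }

    private static double closestRec(double[][] p, int lo, int hi) {
        if (hi - lo < 3) {                               // 3 or fewer points: brute force
            double best = Double.POSITIVE_INFINITY;
            for (int i = lo; i <= hi; i++)
                for (int j = i + 1; j <= hi; j++)
                    best = Math.min(best, dist(p[i], p[j]));
            return best;
        }
        int mid = (lo + hi) / 2;
        double midX = p[mid][0];
        double d = Math.min(closestRec(p, lo, mid),      // Ldist
                            closestRec(p, mid + 1, hi)); // Rdist

        // consider only points within d of the dividing line (the "strip")
        List<double[]> strip = new ArrayList<>();
        for (int i = lo; i <= hi; i++)
            if (Math.abs(p[i][0] - midX) < d) strip.add(p[i]);
        strip.sort(Comparator.comparingDouble((double[] q) -> q[1]));   // sort strip by y

        // each strip point only needs to be compared to neighbors that are close in y
        for (int i = 0; i < strip.size(); i++)
            for (int j = i + 1; j < strip.size()
                    && strip.get(j)[1] - strip.get(i)[1] < d; j++)
                d = Math.min(d, dist(strip.get(i), strip.get(j)));
        return d;
    }

    private static double dist(double[] a, double[] b) {
        return Math.hypot(a[0] - b[0], a[1] - b[1]);
    }
}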

Multiplying large integers
many CS applications, e.g., cryptography, involve manipulating large integers
- long multiplication (the standard grade-school algorithm) of two N-digit integers requires N^2 digit multiplications

Divide & conquer multiplication
can solve (a × b) by splitting the numbers in half & factoring
consider a1 = 1st N/2 digits of a, a0 = 2nd N/2 digits of a (similarly for b1 and b0)

(a × b) = (a1·10^(N/2) + a0) × (b1·10^(N/2) + b0)
        = (a1 × b1)·10^N + (a1 × b0 + a0 × b1)·10^(N/2) + (a0 × b0)
        = c2·10^N + c1·10^(N/2) + c0
where
- c2 = a1 × b1
- c0 = a0 × b0
- c1 = (a1 + a0) × (b1 + b0) - (c2 + c0)

EXAMPLE: a = 123456, b = 213121 (so a1 = 123, a0 = 456, b1 = 213, b0 = 121)
- c2 = a1 × b1 = 123 × 213 = 26199
- c0 = a0 × b0 = 456 × 121 = 55176
- c1 = (a1 + a0) × (b1 + b0) - (c2 + c0) = (123 + 456) × (213 + 121) - (26199 + 55176)
     = 579 × 334 - 81375 = 193386 - 81375 = 112011

(a × b) = c2·10^N + c1·10^(N/2) + c0 = 26199·10^6 + 112011·10^3 + 55176 = 26311066176
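A small Java sketch of this idea using BigInteger (illustrative only; the class name is invented, the decimal split just mirrors the slide's base-10 presentation, and real libraries use more refined algorithms):

import java.math.BigInteger;

// Illustrative Karatsuba-style multiplication, splitting on decimal digits as above.
// Assumes non-negative inputs.
public class KaratsubaDemo {

    public static BigInteger multiply(BigInteger a, BigInteger b) {
        int n = Math.max(a.toString().length(), b.toString().length());
        if (n <= 2) {
            return a.multiply(b);                    // small enough: multiply directly
        }
        int half = n / 2;
        BigInteger shift = BigInteger.TEN.pow(half); // 10^(N/2)

        BigInteger[] aParts = a.divideAndRemainder(shift);   // a = a1*10^half + a0
        BigInteger[] bParts = b.divideAndRemainder(shift);   // b = b1*10^half + b0
        BigInteger a1 = aParts[0], a0 = aParts[1];
        BigInteger b1 = bParts[0], b0 = bParts[1];

        BigInteger c2 = multiply(a1, b1);                     // a1 * b1
        BigInteger c0 = multiply(a0, b0);                     // a0 * b0
        BigInteger c1 = multiply(a1.add(a0), b1.add(b0))      // (a1+a0)(b1+b0)
                        .subtract(c2).subtract(c0);           //   - (c2 + c0)

        return c2.multiply(shift.pow(2)).add(c1.multiply(shift)).add(c0);
    }

    public static void main(String[] args) {
        System.out.println(multiply(new BigInteger("123456"),
                                    new BigInteger("213121")));   // 26311066176
    }
}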

Efficiency of d&c multiplication
in order to multiply N-digit numbers:
- calculate c2, c1, and c0, each of which involves multiplying N/2-digit numbers

MultCost(N) = 3 MultCost(N/2) + C
from the Master Theorem: a = 3, b = 2, d = 0  →  3 > 2^0  →  O(N^(log_2 3)) ≈ O(N^1.585)

worry: in minimizing the number of multiplications, we have increased the number of additions & subtractions
- a similar analysis can show that AddCost(N) is also O(N^(log_2 3))

does O(N^(log_2 3)) really make a difference vs. O(N^2)?
- it has been shown to improve performance for N as small as 8
- for N = 300, it can run more than twice as fast

Dividing & conquering trees
since trees are recursive structures, most tree traversal and manipulation operations can be classified as divide & conquer algorithms
- a tree can be divided into root + left subtree + right subtree
- most tree operations handle the root as a special case, then recursively process the subtrees
- e.g., to display all the values in a (nonempty) binary tree, divide into:
  1. displaying the root
  2. (recursively) displaying all the values in the left subtree
  3. (recursively) displaying all the values in the right subtree
- e.g., to count the number of nodes in a (nonempty) binary tree, divide into:
  1. (recursively) counting the nodes in the left subtree
  2. (recursively) counting the nodes in the right subtree
  3. adding the two counts + 1 for the root

BinaryTree class

public class BinaryTree<E> {
    protected TreeNode<E> root;

    public BinaryTree() {
        this.root = null;
    }

    public void add(E value) { … }
    public boolean remove(E value) { … }
    public boolean contains(E value) { … }
    public int size() { … }
    public String toString() { … }
}

to implement a binary tree, we need to store the root node
- the root field is "protected" instead of "private" to allow for inheritance
- the empty tree has a null root
- then, we must implement methods for the basic operations on the collection
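The slides do not show the TreeNode class itself; a minimal version consistent with the calls used below (getData, getLeft, getRight) might look like this (an assumption, not the course's actual code):

// Hypothetical TreeNode<E>, inferred from the accessor calls used in the slides.
public class TreeNode<E> {
    private E data;
    private TreeNode<E> left;
    private TreeNode<E> right;

    public TreeNode(E data, TreeNode<E> left, TreeNode<E> right) {
        this.data = data;
        this.left = left;
        this.right = right;
    }

    public E getData() { return this.data; }
    public TreeNode<E> getLeft() { return this.left; }
    public TreeNode<E> getRight() { return this.right; }

    public void setData(E data) { this.data = data; }
    public void setLeft(TreeNode<E> left) { this.left = left; }
    public void setRight(TreeNode<E> right) { this.right = right; }
}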

size method

public int size() {
    return this.size(this.root);
}

private int size(TreeNode<E> current) {
    if (current == null) {
        return 0;
    }
    else {
        return this.size(current.getLeft()) + this.size(current.getRight()) + 1;
    }
}

divide-and-conquer approach:
BASE CASE: if the tree is empty, the number of nodes is 0
RECURSIVE: otherwise, the number of nodes is (# nodes in left subtree) + (# nodes in right subtree) + 1 for the root

note: the recursive implementation requires passing the root as a parameter
- so each operation has a public "front" method, which calls the recursive "worker" method

contains method

public boolean contains(E value) {
    return this.contains(this.root, value);
}

private boolean contains(TreeNode<E> current, E value) {
    if (current == null) {
        return false;
    }
    else {
        return value.equals(current.getData()) ||
               this.contains(current.getLeft(), value) ||
               this.contains(current.getRight(), value);
    }
}

divide-and-conquer approach:
BASE CASE: if the tree is empty, the item is not found
BASE CASE: otherwise, if the item is at the root, then it is found
RECURSIVE: otherwise, search the left and then the right subtree

toString method

public String toString() {
    if (this.root == null) {
        return "[]";
    }
    String recStr = this.toString(this.root);
    return "[" + recStr.substring(0, recStr.length()-1) + "]";
}

private String toString(TreeNode<E> current) {
    if (current == null) {
        return "";
    }
    return this.toString(current.getLeft()) + current.getData().toString() + "," +
           this.toString(current.getRight());
}

must traverse the entire tree and build a string of the items
- there are numerous patterns that can be used, e.g., an in-order traversal:
  BASE CASE: if the tree is empty, then there is nothing to traverse
  RECURSIVE: recursively traverse the left subtree, then access the root, then recursively traverse the right subtree

Other tree operations?
add? remove? numOccur? height? weight?
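For instance, a height method following the same divide & conquer pattern might look like this (a sketch added to the BinaryTree<E> class above, not the course's solution; here the height of an empty tree is taken to be -1 and a single node has height 0):

// Hypothetical height method for the BinaryTree<E> class above.
public int height() {
    return this.height(this.root);
}

private int height(TreeNode<E> current) {
    if (current == null) {
        return -1;                                   // empty tree: height -1
    }
    // height = 1 + the height of the taller subtree
    return 1 + Math.max(this.height(current.getLeft()),
                        this.height(current.getRight()));
}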

Binary search trees
a binary search tree is a binary tree in which, for every node:
- the item stored at the node is ≥ all items stored in its left subtree
- the item stored at the node is < all items stored in its right subtree

are BST operations divide & conquer? contains? add? remove?
what about balanced BSTs? (e.g., red-black trees, AVL trees)
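As one example, contains on a BST only needs to descend into one subtree at each step; a sketch under the assumption that the elements are Comparable (a bound the generic BinaryTree<E> above does not impose), with an invented subclass name:

// Hypothetical BST subclass; the ordering property lets contains search only one
// subtree per level, giving O(log N) on a balanced tree.
public class BinarySearchTree<E extends Comparable<E>> extends BinaryTree<E> {

    public boolean contains(E value) {
        return this.contains(this.root, value);      // root is protected in BinaryTree
    }

    private boolean contains(TreeNode<E> current, E value) {
        if (current == null) {
            return false;                            // empty tree: not found
        }
        int cmp = value.compareTo(current.getData());
        if (cmp == 0) {
            return true;                             // found at this node
        } else if (cmp < 0) {
            return this.contains(current.getLeft(), value);   // can only be in left subtree
        } else {
            return this.contains(current.getRight(), value);  // can only be in right subtree
        }
    }
}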