CSC 213 – Large Scale Programming

Today's Goals
- Review past discussion of data sorting algorithms
  - Weaknesses of past approaches & when we use them
- Can we find a limit on how long sorting must take?
  - What does this mean for sorting & for those past sorts?
- Get a good idea of how merge sort is executed
  - What is the algorithm and what will it require?
  - What are execution trees & how do they show runtime?

Ghosts of Sorts Past
- We have already seen & discussed 4 sorts:
  - Bubble-sort -- O(n²) time sort; slowest sort
  - Selection-sort -- O(n²) time sort; PQ concept
  - Insertion-sort -- O(n²) time sort; PQ concept
  - Heap-sort -- O(n log n) time sort; requires a PQ
- All of these sorts are of limited usefulness
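As a quick refresher on what one of these O(n²) sorts looks like in code, here is a minimal insertion-sort sketch; the InsertionSortDemo class below is my own illustration on an int array, not code from the course:

// Minimal insertion sort on an int array: O(n^2) comparisons in the worst case.
public class InsertionSortDemo {
    static void insertionSort(int[] a) {
        for (int i = 1; i < a.length; i++) {
            int key = a[i];
            int j = i - 1;
            // Shift larger elements right until key's slot is found
            while (j >= 0 && a[j] > key) {
                a[j + 1] = a[j];
                j--;
            }
            a[j + 1] = key;
        }
    }

    public static void main(String[] args) {
        int[] data = {7, 2, 9, 4, 3, 8, 6, 1};
        insertionSort(data);
        System.out.println(java.util.Arrays.toString(data)); // [1, 2, 3, 4, 6, 7, 8, 9]
    }
}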

Counting Comparisons
- Consider a sort as a path in a decision tree
  - Each node is a single yes/no decision needed for sorting, e.g. "Is x_i > x_j?"
- Traveling from root to leaf sorts the data
- The tree's height is a lower bound on sorting complexity
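One informal way to watch these comparisons being counted (my own illustration, not from the slides) is to wrap a Comparator so it tallies every call; for 1,000 random values a good library sort lands near n·log₂ n ≈ 10,000 comparisons:

import java.util.Arrays;
import java.util.Comparator;
import java.util.Random;

// Wraps a Comparator so that every comparison the sort makes is counted.
public class ComparisonCounter {
    static long count = 0;

    public static void main(String[] args) {
        Integer[] data = new Random(42).ints(1_000, 0, 10_000).boxed().toArray(Integer[]::new);
        Comparator<Integer> counting = (a, b) -> { count++; return a.compareTo(b); };
        Arrays.sort(data, counting);
        // For n = 1000, expect on the order of n*log2(n) ≈ 10,000 comparisons
        System.out.println("comparisons = " + count);
    }
}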

Decision Tree Height
- Need a unique leaf for each possible initial ordering of the data
  - Needed to ensure we sort different inputs differently
- Consider 4, 5 as data to be sorted using a tree
  - Could be entered in 2 possible orders: 4, 5 or 5, 4
  - Need two leaves for this sort unless (4 < 5) == (5 < 4)
- A sequence of n numbers can be arranged n! ways
  - A tree with n! leaves is needed to sort n numbers
  - Given this many leaves, what is the height of the tree?

Decision Tree Height
- With n! external nodes, a binary tree's height is at least log₂(n!)
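To make this bound concrete, here is a short worked instance in LaTeX (my addition, not from the slides): a binary tree with L leaves has height at least ⌈log₂ L⌉, so sorting even n = 3 elements already forces 3 comparisons in the worst case.

\[
  h \;\ge\; \lceil \log_2 L \rceil \qquad \text{for a binary tree with } L \text{ leaves}
\]
\[
  n = 3:\quad L = 3! = 6, \qquad h \;\ge\; \lceil \log_2 6 \rceil = 3 \text{ comparisons}
\]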

The Lower Bound
- But what does log(n!) equal?
  n! = n * (n-1) * (n-2) * ... * (n/2) * ... * 2 * 1
  n! ≥ (n/2)^(n/2)            (the larger half of the series has n/2 factors, each at least n/2)
  log(n!) ≥ log((n/2)^(n/2))
  log(n!) ≥ (n/2) * log(n/2)
- So log(n!) grows at least as fast as n log n (and, since n! ≤ n^n, no faster): log(n!) is Θ(n log n)
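As an informal sanity check on that bound (my addition, not from the slides), the snippet below computes log₂(n!) as a sum of logs and compares it with n·log₂ n for several n; the ratio approaches 1 as n grows:

// Compares log2(n!) (computed as a sum of logs) against n*log2(n).
public class LowerBoundCheck {
    static double log2Factorial(int n) {
        double sum = 0.0;
        for (int k = 2; k <= n; k++) {
            sum += Math.log(k) / Math.log(2);   // log2(k)
        }
        return sum;
    }

    public static void main(String[] args) {
        for (int n : new int[] {8, 64, 1024, 1 << 20}) {
            double lhs = log2Factorial(n);
            double rhs = n * (Math.log(n) / Math.log(2));
            System.out.printf("n=%d  log2(n!)=%.1f  n*log2(n)=%.1f  ratio=%.2f%n",
                              n, lhs, rhs, lhs / rhs);
        }
    }
}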

Lower Bound on Sorting
- The smallest number of comparisons any sort can guarantee is the tree's height
  - A decision tree sorting n elements has n! leaves
  - At least log(n!) height is needed for this many leaves
  - As we saw, this means at least Ω(n log n) height
- Ω(n log n) time is needed to sort by comparing data!
  - Means that heap-sort is the fastest possible (in big-Oh terms)
  - But it is a pain to code & requires an external heap
- Is there a simpler sort using only a Sequence?
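To make the "requires an external heap" point concrete, here is a minimal heap-sort sketch built on java.util.PriorityQueue (a binary heap); this is just one possible O(n log n) implementation, not the course's priority-queue code:

import java.util.PriorityQueue;

// Heap-sort: insert everything into a binary heap, then remove items in order.
// Each offer/poll is O(log n), so sorting n items is O(n log n) overall.
public class HeapSortDemo {
    static void heapSort(int[] a) {
        PriorityQueue<Integer> heap = new PriorityQueue<>();
        for (int x : a) {
            heap.offer(x);              // build the heap: n insertions
        }
        for (int i = 0; i < a.length; i++) {
            a[i] = heap.poll();         // remove the minimum n times
        }
    }

    public static void main(String[] args) {
        int[] data = {7, 2, 9, 4, 3, 8, 6, 1};
        heapSort(data);
        System.out.println(java.util.Arrays.toString(data)); // [1, 2, 3, 4, 6, 7, 8, 9]
    }
}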

Julius, Seize Her!
- Formula for Roman success:
  - Divide peoples before an attack
  - Then conquer the weakened armies
- Common programming paradigm:
  - Divide: split into 2 partitions
  - Recur: solve for each partition
  - Conquer: combine the solutions

Divide-and-Conquer
- Like all recursive algorithms, it needs a base case
  - The base case has an immediate solution to a simple problem
  - Sorting is not that easy: sorting 2+ items takes real work
  - A 1-item Sequence is already sorted, since nothing can be out of order
  - Sorting a Sequence with 0 items is even easier
- The recursive step simplifies the problem & combines the results
  - Begins by splitting the data into two equal-sized Sequences
  - Merges the sub-Sequences after they have been sorted

Merge-Sort
Algorithm mergeSort(Sequence S, Comparator C):
  if S.size() < 2 then              // Base case
    return S
  else                              // Recursive case
    // Split S into two equal-sized partitions S1 and S2
    mergeSort(S1, C)
    mergeSort(S2, C)
    S ← merge(S1, S2, C)
    return S

Merging Sorted Sequences
Algorithm merge(S1, S2, C):
  Sequence retVal = …               // Code instantiating a Sequence
  while ¬S1.isEmpty() && ¬S2.isEmpty()
    if C.compare(S1.get(0), S2.get(0)) < 0
      retVal.insertLast(S1.removeFirst())
    else
      retVal.insertLast(S2.removeFirst())
    endif
  end
  while ¬S1.isEmpty()
    retVal.insertLast(S1.removeFirst())
  end
  while ¬S2.isEmpty()
    retVal.insertLast(S2.removeFirst())
  end
  return retVal
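For comparison with the pseudocode above, here is a self-contained Java sketch following the same divide/recur/merge structure, with java.util.ArrayDeque standing in for the course's Sequence ADT (the class name and data-structure choice are mine, not the course's):

import java.util.ArrayDeque;
import java.util.Arrays;
import java.util.Comparator;
import java.util.Deque;

public class MergeSortDemo {
    // Sorts the given deque, returning a new deque in sorted order.
    static <E> Deque<E> mergeSort(Deque<E> s, Comparator<E> c) {
        if (s.size() < 2) {              // Base case: 0 or 1 elements are already sorted
            return s;
        }
        // Divide: split s into two equal-sized halves s1 and s2
        Deque<E> s1 = new ArrayDeque<>();
        Deque<E> s2 = new ArrayDeque<>();
        int half = s.size() / 2;
        while (s1.size() < half) {
            s1.addLast(s.removeFirst());
        }
        while (!s.isEmpty()) {
            s2.addLast(s.removeFirst());
        }
        // Recur on each half, then conquer by merging the sorted results
        return merge(mergeSort(s1, c), mergeSort(s2, c), c);
    }

    // Merges two sorted deques into one sorted deque.
    static <E> Deque<E> merge(Deque<E> s1, Deque<E> s2, Comparator<E> c) {
        Deque<E> retVal = new ArrayDeque<>();
        while (!s1.isEmpty() && !s2.isEmpty()) {
            if (c.compare(s1.peekFirst(), s2.peekFirst()) < 0) {
                retVal.addLast(s1.removeFirst());
            } else {
                retVal.addLast(s2.removeFirst());
            }
        }
        while (!s1.isEmpty()) { retVal.addLast(s1.removeFirst()); }
        while (!s2.isEmpty()) { retVal.addLast(s2.removeFirst()); }
        return retVal;
    }

    public static void main(String[] args) {
        Deque<Integer> data = new ArrayDeque<>(Arrays.asList(7, 2, 9, 4, 3, 8, 6, 1));
        System.out.println(mergeSort(data, Comparator.naturalOrder()));
        // [1, 2, 3, 4, 6, 7, 8, 9]
    }
}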

Execution Tree
- Depicts a divide-and-conquer execution
  - Each oval node represents one recursive call
  - The original Sequence is shown at the start of the oval
  - The sorted Sequence is shown at the end of the oval
- The initial call is at the root of the (binary) tree
- The bottom of the tree has leaves for the base cases

Execution Example
Merge sort applied to the Sequence 7 2 9 4 3 8 6 1 (the per-step execution-tree pictures from the original slides do not survive in this transcript, so each step is summarized):
- Not in a base case, so split into S1 = 7 2 9 4 and S2 = 3 8 6 1
- Recursively call merge-sort on S1
- Not in a base case, so split into 7 2 and 9 4
- Recursively call merge-sort on the left half, 7 2
- Still no base case, so split again & recurse: 7 and 2
- Enjoy the base case – literally no work to do! (7 is already sorted)
- Recurse on the other element and solve that base case (2)
- Merge the two solutions to complete this call: 2 7
- Recurse on 9 4 and sort this sub-Sequence: its base cases are 9 and 4
- Merge the 2 solutions to sort this Sequence: 4 9
- I feel an urge, an urge to merge: 2 7 and 4 9 combine into 2 4 7 9
- Let's do the merge sort again! (with S2 = 3 8 6 1, which sorts to 1 3 6 8 the same way)
- Merge the last call to get the final result: 1 2 3 4 6 7 8 9
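To reproduce an execution tree like this on your own machine, the small trace program below (my own sketch, not course code) prints one line per recursive call, indented by recursion depth, showing each call's input and its sorted output:

import java.util.ArrayList;
import java.util.List;

// Prints the execution tree of merge sort: one "sort" line when a call starts
// and one "done" line when it returns, indented by recursion depth.
public class MergeSortTrace {
    static List<Integer> mergeSort(List<Integer> s, int depth) {
        String indent = "  ".repeat(depth);
        System.out.println(indent + "sort " + s);
        if (s.size() < 2) {                       // Base case: already sorted
            System.out.println(indent + "done " + s);
            return s;
        }
        int half = s.size() / 2;
        List<Integer> left  = mergeSort(new ArrayList<>(s.subList(0, half)), depth + 1);
        List<Integer> right = mergeSort(new ArrayList<>(s.subList(half, s.size())), depth + 1);
        // Merge the two sorted halves
        List<Integer> merged = new ArrayList<>();
        int i = 0, j = 0;
        while (i < left.size() && j < right.size()) {
            merged.add(left.get(i) <= right.get(j) ? left.get(i++) : right.get(j++));
        }
        while (i < left.size())  { merged.add(left.get(i++)); }
        while (j < right.size()) { merged.add(right.get(j++)); }
        System.out.println(indent + "done " + merged);
        return merged;
    }

    public static void main(String[] args) {
        mergeSort(List.of(7, 2, 9, 4, 3, 8, 6, 1), 0);
    }
}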

For Next Lecture
- A new weekly assignment was posted for this week
  - It discusses sorts that require few concepts to code
  - We will return to it soon enough; do not worry about it yet
- Keep reviewing the requirements for program #2
  - Preliminary deadlines arrive before the final version
  - Time spent on requirements & design saves coding
- Reading on quick sort for this Friday
  - Guess what? It can be really, really quick
  - Severe drawbacks also exist; read to understand them