
Sorting

Typically, organizations spend a large percentage of computing time (25%-50%) on sorting. Internal sorting: the list is small enough to sort entirely in main memory. Methods covered: bubble sort, insertion sort, quick sort, shell sort, heap sort, bucket sort, radix sort.

Divide-And-Conquer Sorting Small instance: n <= 1 elements (or, say, n <= 10; we'll use n <= 1 for now). Large instance: divide into k >= 2 smaller instances (k = 2, 3, 4, ...?). What does each smaller instance look like? Sort the smaller instances recursively. How do you combine the sorted smaller instances?

Insertion Sort k = 2. The first n - 1 elements (a[0:n-2]) define one of the smaller instances; the last element (a[n-1]) defines the second smaller instance. a[0:n-2] is sorted recursively; a[n-1] is already a small (sorted) instance.

Insertion Sort Combining: insert a[n-1] into the sorted a[0:n-2]. Complexity is O(n^2). Usually implemented nonrecursively.

Insertion Sort For each j between 1 and n-1, insert list[j] into the already sorted sublist list[0],...,list[j-1]. Left Out of Order (LOO) measures relative disorder: R_i is LOO iff R_i < max{R_j : 0 <= j < i}. The insertion step does real work only for records that are LOO. Time complexity: best case O(n), worst case O(n^2). Does little work if the list is nearly sorted.

Insertion Sort Example: n = 5, input sequence (5, 4, 3, 2, 1). Every record after the first (R_1, R_2, R_3, R_4) is LOO, so each insertion does maximal work.

Insertion Sort Example: n = 5, input sequence (2, 3, 4, 5, 1). Only R_4 is LOO, so only one insertion does real work.

Insertion Sort

void insertion_sort(element list[], int n)
{ /* perform an insertion sort on the list */
  int i, j;
  element next;
  for (i = 1; i < n; i++) {
    next = list[i];
    for (j = i - 1; (j >= 0) && (next.key < list[j].key); j--)
      list[j + 1] = list[j];
    list[j + 1] = next;
  }
}

Selection Sort k = 2. To divide a large instance into two smaller instances, first find the largest element. The largest element defines one of the smaller instances; the remaining n-1 elements define the second smaller instance.

Selection Sort The second smaller instance is sorted recursively. Append the first smaller instance (the largest element) to the right end of the sorted smaller instance. Complexity is O(n^2). Usually implemented nonrecursively.

Bubble Sort Compare list[i] and list[i+1] for i = 0, 1, ..., n-2; if out of order, swap list[i] and list[i+1]. Each pass bubbles the largest remaining key to the right end, so the size of the unsorted part decreases by 1 after each pass. If no swap occurs on a pass, the list is already sorted.

Bubble Sort (figure: a bubble_sort pass over the list, marking where the last swap occurred)

Bubble Sort

void bubble_sort(element list[], int n)
{ /* perform a bubble sort on the list */
  int i, j;
  int flag = 1;
  for (i = n - 1; flag > 0; i--) {
    flag = 0;
    for (j = 0; j < i; j++)
      if (list[j].key > list[j + 1].key) {
        swap(&list[j], &list[j + 1]);
        flag = 1;
      }
  }
}

Bubble Sort Optimization: if on some pass the last swap occurs at positions j and j+1, the next pass only needs to run up to position j, so set i to the last value of j. Time complexity: worst case O(n^2).

Bubble Sort Bubble sort may also be viewed as a k = 2 divide-and-conquer sorting method. Insertion sort, selection sort and bubble sort divide a large instance into one smaller instance of size n - 1 and another of size 1. All three methods take O(n^2) time.

Divide And Conquer Divide-and-conquer algorithms generally have best complexity when a large instance is divided into smaller instances of approximately the same size. When k = 2 and n = 24, divide into two smaller instances of size 12 each. When k = 2 and n = 25, divide into two smaller instances of size 13 and 12, respectively.

Quick Sort Divide and conquer with two phases: split (partition) and control. Uses recursion, so a stack is needed. Best average time: O(n log2 n).

Quick Sort A small instance has n <= 1; every small instance is already sorted. To sort a large instance, select a pivot element from among the n elements. Partition the n elements into 3 groups: left, middle and right. The middle group contains only the pivot element; all elements in the left group are <= pivot; all elements in the right group are >= pivot. Sort the left and right groups recursively. The answer is the sorted left group, followed by the middle group, followed by the sorted right group.

Choice of Pivot Pivot is leftmost element in list that is to be sorted.  When sorting a[6:20], use a[6] as the pivot.  Text implementation does this. Randomly select one of the elements to be sorted as the pivot.  When sorting a[6:20], generate a random number r in the range [6, 20]. Use a[r] as the pivot.

Choice of Pivot Median-of-Three rule. From the leftmost, middle, and rightmost elements of the list to be sorted, select the one with median key as the pivot.  When sorting a[6:20], examine a[6], a[13] ((6+20)/2), and a[20]. Select the element with median (i.e., middle) key.  If a[6].key = 30, a[13].key = 2, and a[20].key = 10, a[20] becomes the pivot.  If a[6].key = 3, a[13].key = 2, and a[20].key = 10, a[6] becomes the pivot.  If a[6].key = 30, a[13].key = 25, and a[20].key = 10, a[13] becomes the pivot.

Choice of Pivot When the pivot is picked at random or when the median-of-three rule is used, we can still use the quick sort code of the text, provided we first swap the leftmost element and the chosen pivot.

Partitioning Into Three Groups Sort a = [6, 2, 8, 5, 11, 10, 4, 1, 9, 7, 3]. The leftmost element (6) is the pivot. When another array b is available: scan a from left to right (omitting the pivot), placing elements <= pivot at the left end of b and the remaining elements at the right end of b; the pivot is placed in the remaining position of b.

Partitioning Example Using Additional Array (figure: the array a above partitioned into b around the pivot 6). Sort the left and right groups recursively.

In-place Partitioning 1. Find leftmost element (bigElement) > pivot. 2. Find rightmost element (smallElement) < pivot. 3. Swap bigElement and smallElement provided bigElement is to the left of smallElement. 4. Repeat.

In-Place Partitioning (figure: scanning from both ends of a[0:n-1] — the first element greater than the pivot X is swapped with the first element, from the right, smaller than X)

In-Place Partitioning (figure: the scans continue until they cross; the pivot X then moves to its final sorted position)

In-Place Partitioning Example (figure: successive states of the array a during in-place partitioning). When bigElement is not to the left of smallElement, terminate the process and swap the pivot and smallElement.

Quick Sort Example: quicksort on an input file of 10 records (26, 5, 37, 1, 61, 11, 59, 15, 48, 19) (figure: simulation of quicksort).

Quick Sort

void quicksort(element list[], int left, int right)
{
  int pivot, i, j;
  element temp;
  if (left < right) {
    i = left; j = right + 1;
    pivot = list[left].key;
    do { /* scan inward from both ends for swap candidates */
      do i++; while (list[i].key < pivot);
      do j--; while (list[j].key > pivot);
      if (i < j) SWAP(list[i], list[j], temp);
    } while (i < j);
    SWAP(list[left], list[j], temp); /* swap the pivot with the j-th cell */
    quicksort(list, left, j - 1);
    quicksort(list, j + 1, right);
  }
}

Complexity Time complexity, average case: O(n log2 n), when each partition splits into roughly equal sizes. Let T(n) be the average time to sort n records:
T(n) <= c·n + 2·T(n/2)
     <= c·n + 2(c·n/2 + 2·T(n/4)) = 2·c·n + 4·T(n/4)
     ...
     <= c·n·log2 n + n·T(1) = O(n log2 n)
Worst case: O(n^2), e.g. when the input list is already sorted.

Complexity T(n) is maximum when either |left| = 0 or |right| = 0 after each partitioning, i.e. when the pivot is always the smallest or largest element. For the worst case, T(n) = T(n-1) + cn, n > 1; repeated substitution gives T(n) = O(n^2). The best case arises when |left| and |right| are equal (or differ by 1) after each partitioning, so the best-case complexity is O(n log n). To improve performance, define a small instance to be one with n <= 15 (say) and sort small instances using insertion sort.

Optimal Sorting Time How quickly can we sort a list of n objects? The best possible comparison-based time is O(n log2 n). (figure: decision tree for sorting a list (X0, X1, X2) — internal nodes compare keys K_i and K_j, and each leaf stops with one of the sorted permutations [0,1,2], [0,2,1], [1,0,2], [1,2,0], [2,0,1], [2,1,0])

Optimal Sorting Time Theorem: Any decision tree that sorts n distinct elements has a height of at least log2(n!) + 1. A decision tree for n elements has n! leaves, and a binary tree of height k has at most 2^(k-1) leaves; hence the height of the decision tree is at least log2(n!) + 1. Theorem: Any algorithm that sorts by comparisons only must have a worst-case computing time of Ω(n log2 n). Since n! >= (n/2)^(n/2), log2(n!) >= (n/2) log2(n/2) = Ω(n log2 n).

Merge Sort k = 2 First ceil(n/2) elements define one of the smaller instances; remaining floor(n/2) elements define the second smaller instance. Each of the two smaller instances is sorted recursively. The sorted smaller instances are combined using a process called merge. Complexity is O(n log n). Usually implemented nonrecursively.

Merge Two Sorted Lists A = (2, 5, 6) B = (1, 3, 8, 9, 10) C = () Compare smallest elements of A and B and merge smaller into C. A = (2, 5, 6) B = (3, 8, 9, 10) C = (1)

Merge Two Sorted Lists A = (5, 6) B = (3, 8, 9, 10) C = (1, 2) A = (5, 6) B = (8, 9, 10) C = (1, 2, 3) A = (6) B = (8, 9, 10) C = (1, 2, 3, 5)

Merge Two Sorted Lists A = () B = (8, 9, 10) C = (1, 2, 3, 5, 6) When one of A and B becomes empty, append the other list to C. O(1) time needed to move an element into C. Total time is O(n + m), where n and m are, respectively, the number of elements initially in A and B.

Merge Sort (recursion tree, downward pass)
[8, 3, 13, 6, 2, 14, 5, 9, 10, 1, 7, 12, 4]
[8, 3, 13, 6, 2, 14, 5]  [9, 10, 1, 7, 12, 4]
[8, 3, 13, 6] [2, 14, 5]  [9, 10, 1] [7, 12, 4]
[8, 3] [13, 6] [2, 14] [5]  [9, 10] [1] [7, 12] [4]
[8] [3] [13] [6] [2] [14]  [9] [10]  [7] [12]

Merge Sort (recursion tree, upward pass)
[8] [3] -> [3, 8]; [13] [6] -> [6, 13]; -> [3, 6, 8, 13]
[2] [14] -> [2, 14]; with [5] -> [2, 5, 14]
-> [2, 3, 5, 6, 8, 13, 14]
[9] [10] -> [9, 10]; with [1] -> [1, 9, 10]
[7] [12] -> [7, 12]; with [4] -> [4, 7, 12]
-> [1, 4, 7, 9, 10, 12]
-> [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 12, 13, 14]

Time Complexity Let t(n) be the time required to sort n elements. t(0) = t(1) = c, where c is a constant. When n > 1, t(n) = t(ceil(n/2)) + t(floor(n/2)) + dn, where d is a constant. To solve the recurrence, assume n is a power of 2 and use repeated substitution. t(n) = O(n log n)
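Writing out the repeated substitution for n a power of 2 (with t(1) = c):

```latex
\begin{aligned}
t(n) &= 2\,t(n/2) + dn \\
     &= 4\,t(n/4) + 2dn \\
     &\;\;\vdots \\
     &= 2^k\,t(n/2^k) + k\,dn \qquad (2^k = n)\\
     &= n\,t(1) + dn\log_2 n = cn + dn\log_2 n = O(n\log n).
\end{aligned}
```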

Merge Sort Downward pass over the recursion tree. Divide large instances into small ones. Upward pass over the recursion tree.  Merge pairs of sorted lists. Number of leaf nodes is n. Number of non-leaf nodes is n-1.

Time Complexity Downward pass.  O(1) time at each node.  O(n) total time at all nodes. Upward pass.  O(n) time merging at each level that has a non-leaf node.  Number of levels is O(log n)  Total time is O(n log n)

Nonrecursive Version Eliminate downward pass. Start with sorted lists of size 1 and do pairwise merging of these sorted lists as in the upward pass.

Nonrecursive Merge Sort
[8] [3] [13] [6] [2] [14] [5] [9] [10] [1] [7] [12] [4]
[3, 8] [6, 13] [2, 14] [5, 9] [1, 10] [7, 12] [4]
[3, 6, 8, 13] [2, 5, 9, 14] [1, 7, 10, 12] [4]
[2, 3, 5, 6, 8, 9, 13, 14] [1, 4, 7, 10, 12]
[1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 12, 13, 14]


Merge Sort (1/3)

void merge(element initList[], element mergedList[], int i, int m, int n)
{ /* initList[i:m] and initList[m+1:n] are sorted lists; they are
     merged into the sorted list mergedList[i:n] */
  int j, k, t;
  j = m + 1;
  k = i;
  while (i <= m && j <= n) {
    if (initList[i].key <= initList[j].key)
      mergedList[k++] = initList[i++];
    else
      mergedList[k++] = initList[j++];
  }
  if (i > m) /* mergedList[k:n] = initList[j:n] */
    for (t = j; t <= n; t++)
      mergedList[t] = initList[t];
  else /* mergedList[k:n] = initList[i:m] */
    for (t = i; t <= m; t++)
      mergedList[k + t - i] = initList[t];
}

Merge Sort (2/3)

void mergePass(element initList[], element mergedList[], int n, int s)
{ /* adjacent pairs of sublists of length s are merged from initList
     to mergedList; n is the number of records in initList */
  int i, j;
  for (i = 1; i <= n - 2*s + 1; i += 2*s)
    merge(initList, mergedList, i, i + s - 1, i + 2*s - 1);
  if ((i + s - 1) < n) /* one full and one partial sublist remain */
    merge(initList, mergedList, i, i + s - 1, n);
  else /* at most one sublist remains: copy it over */
    for (j = i; j <= n; j++)
      mergedList[j] = initList[j];
}

void mergeSort(element a[], int n)
{
  int s = 1; /* current segment size */
  element extra[MAX_SIZE];
  while (s < n) {
    mergePass(a, extra, n, s);
    s *= 2;
    mergePass(extra, a, n, s);
    s *= 2;
  }
}


Complexity Sorted segment size is 1, 2, 4, 8, … Number of merge passes is ceil(log 2 n). Each merge pass takes O(n) time. Total time is O(n log n). Need O(n) additional space for the merge. Merge sort is slower than insertion sort when n <= 15 (approximately). So define a small instance to be an instance with n <= 15. Sort small instances using insertion sort. Start with segment size = 15.


Natural Merge Sort Initial sorted segments are the naturally occurring sorted segments in the input. Input = [8, 9, 10, 2, 5, 7, 9, 11, 13, 15, 6, 12, 14]. Initial segments are: [8, 9, 10] [2, 5, 7, 9, 11, 13, 15] [6, 12, 14]. 2 (instead of 4) merge passes suffice. Segment boundaries have a[i] > a[i+1].


Heap Sort Utilizes the max heap structure; the max heap is implemented with an array. Time complexity: average case O(n log2 n), worst case O(n log2 n). Adjusting the binary tree to re-establish the heap takes O(d) time, where d is the depth of the tree.

Heap Sort Example: heap sorting process on the input list (26, 5, 77, 1, 61, 11, 59, 15, 48, 19). (figure: the array a[1:10] interpreted as a complete binary tree)

Heap Sort Initial max heap construction

Heap Sort

void adjust(element a[], int root, int n)
{
  int child, rootkey;
  element temp;
  temp = a[root];
  rootkey = a[root].key;
  child = 2 * root; /* left child */
  while (child <= n) {
    if ((child < n) && (a[child].key < a[child+1].key))
      child++;
    if (rootkey > a[child].key) break;
    else {
      a[child/2] = a[child];
      child *= 2;
    }
  }
  a[child/2] = temp;
}

void heapSort(element a[], int n)
{
  int i;
  element temp;
  for (i = n/2; i > 0; i--) /* initial heap construction */
    adjust(a, i, n);
  for (i = n - 1; i > 0; i--) {
    SWAP(a[1], a[i+1], temp);
    adjust(a, 1, i);
  }
}


Heap Sort Worst case time complexity: log2 n + log2(n-1) + ··· + log2 2 = O(n·log2 n) (the heap has height h = O(log2 n), and each deletion adjusts a heap of at most that height).

Radix Sort A kind of distributive sort. Repeat the following steps for each digit, from the least significant digit to the most significant: 1) comparison on the current digit 2) distribution into buckets 3) merging the buckets back into one list.

Radix Sort Example (figure: simulation of the distribution and merging passes of radix sort).

Summary of Internal Sorting Insertion sort works well when the list is already partially sorted or n is small. Merge sort has the best worst-case behavior, but requires more storage than heap sort and has slightly more overhead than quick sort. Quick sort has the best average behavior, but its worst-case behavior is O(n^2). In practice, combine insertion sort, quick sort and merge sort: merge sort uses quick sort for sublists of size < 45, and quick sort uses insertion sort for sublists of size < 20.