
1 Sorting

2 Sorting Sorting is the process of arranging a group of items into a defined order, ascending or descending, based on some criterion. Example: an unordered list (start) is rearranged into ascending order (end).

3 Bubble-Sort The bubble-sort is the oldest and simplest sort in use. Unfortunately, it is also the slowest. It sorts a list by repeatedly comparing neighboring elements and swapping them when they are out of order.

4 Bubble-Sort Idea: Scan through the list, comparing adjacent elements and swapping them if they are out of relative order; this pass bubbles the largest value to the last position in the list. Then scan the list again, bubbling the second-largest value into the second-to-last position. Repeat this process until all elements are in their correct order.

5 Bubble-Sort (Animation: the first traversal finishes, a second traversal starts and finishes, and so on.)

6 Bubble-Sort
void bubbleSort(int[] data, int first, int n) {
    int position, scan;  // loop variables
    int temp;            // used while swapping two array values
    for (position = n - 1; position >= first; position--)
        for (scan = first; scan <= position - 1; scan++)
            if (data[scan] > data[scan + 1]) {
                // swap data[scan] with data[scan+1]
                temp = data[scan + 1];
                data[scan + 1] = data[scan];
                data[scan] = temp;
            }
}

7 Running Time for Bubble-Sort
One traversal moves the maximum element to the end. Traversal #i: n – i comparisons. Number of comparisons: (n – 1) + (n – 2) + … + 1 = (n – 1) n / 2 = O(n²). The number of comparisons is the same in every case (best, average, and worst).

8 Running Time for Bubble-Sort
In the worst case (when the list is in reverse order), bubble-sort performs O(n²) swaps. The best case, when all elements are already ordered, requires no swaps.
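
The best-case behavior is easier to exploit with a common early-exit variant (not shown on these slides): track whether a pass performs any swap, and stop after the first pass that performs none, so an already-sorted list costs a single O(n) pass. A sketch, following the same parameter convention as the bubbleSort code above:

// Early-exit bubble sort: stops after the first pass that makes no swap.
static void bubbleSortEarlyExit(int[] data, int first, int n) {
    for (int position = n - 1; position >= first; position--) {
        boolean swapped = false;
        for (int scan = first; scan <= position - 1; scan++) {
            if (data[scan] > data[scan + 1]) {
                int temp = data[scan + 1];
                data[scan + 1] = data[scan];
                data[scan] = temp;
                swapped = true;
            }
        }
        if (!swapped) return;  // a pass with no swaps means the list is sorted
    }
}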

9 The Selection-Sort Algorithm
The picture shows an array of six integers, indexed [0] through [5], that we want to sort from smallest to largest.

10 The Selection-Sort Algorithm
Start by finding the smallest entry.

11 The Selection-Sort Algorithm
Start by finding the smallest entry. Swap the smallest entry with the first entry.

12 The Selection-Sort Algorithm
Start by finding the smallest entry. Swap the smallest entry with the first entry.

13 The Selection-Sort Algorithm
Part of the array is now sorted. (Diagram: sorted side on the left, unsorted side on the right.)

14 The Selection-Sort Algorithm
Find the smallest element in the unsorted side.

15 The Selection-Sort Algorithm
Find the smallest element in the unsorted side. Swap it with the front of the unsorted side.

16 The Selection-Sort Algorithm
We have increased the size of the sorted side by one element.

17 The Selection-Sort Algorithm
The process continues: find the smallest element in the unsorted side...

18 The Selection-Sort Algorithm
The process continues: swap it with the front of the unsorted side...

19 The Selection-Sort Algorithm
The process continues: the sorted side is now bigger...

20 The Selection-Sort Algorithm
The process keeps adding one more number to the sorted side. The sorted side has the smallest numbers, arranged from small to large.

21 The Selection-Sort Algorithm
We can stop when the unsorted side has just one number, since that number must be the largest number.

22 The Selection-Sort Algorithm
The array is now sorted. We repeatedly selected the smallest element and moved it to the front of the unsorted side.

23 The Selection-Sort Algorithm
public static void selectionsort(int[] data, int first, int n) {
    int i, j, temp;
    int min;  // index of the smallest value in data[i..first+n-1]
    for (i = first; i < first + n - 1; i++) {
        min = i;
        for (j = i + 1; j < first + n; j++)  // scan the entire unsorted side
            if (data[j] < data[min])
                min = j;
        // swap array elements
        temp = data[i];
        data[i] = data[min];
        data[min] = temp;
    }
}

24 Analysis of Selection-Sort
Number of comparisons: (n – 1) + (n – 2) + … + 1 = (n – 1) n / 2 = O(n²). The number of comparisons is the same in every case (best, average, and worst). In the worst case (when the list is in reverse order), selection-sort performs only O(n) swaps. The best case, when all elements are already ordered, requires no swaps.

25 Advantages of Selection-Sort
Can be done “in-place”: no need for a second array. Minimizes the number of swaps.

26 The Insertion-Sort Algorithm
The insertion-sort algorithm also views the array as having a sorted side and an unsorted side.

27 The Insertion-Sort Algorithm
The sorted side starts with just the first element, which is not necessarily the smallest element.

28 The Insertion-Sort Algorithm
The sorted side grows by taking the front element from the unsorted side...

29 The Insertion-Sort Algorithm
...and inserting it in the place that keeps the sorted side arranged from small to large.

30 The Insertion-Sort Algorithm
In this example, the new element goes in front of the element that was already in the sorted side.

31 The Insertion-Sort Algorithm
Sometimes we are lucky and the newly inserted item doesn't need to move at all.

32 The Insertion-Sort Algorithm
Sometimes we are lucky twice in a row.

33 How to Insert One Element
Copy the new element to a separate location.

34 How to Insert One Element
Shift elements in the sorted side, creating an open space for the new element.

35 How to Insert One Element
Shift elements in the sorted side, creating an open space for the new element.

36 How to Insert One Element
Continue shifting elements...

37 How to Insert One Element
Continue shifting elements...

38 How to Insert One Element
...until you reach the location for the new element.

39 How to Insert One Element
Copy the new element back into the array, at the correct location.

40 How to Insert One Element
The last element must also be inserted. Start by copying it...

41 How to Insert One Element
How many shifts will occur before we copy this element back into the array?

42 How to Insert One Element
Four items are shifted.

43 How to Insert One Element
Four items are shifted. And then the element is copied back into the array.

44 The Insertion-Sort Algorithm
public static void insertionsort(int[] data, int first, int n) {
    int i, j;
    int entry;
    for (i = 1; i < n; i++) {
        entry = data[first + i];  // copy the new element to a separate location
        for (j = first + i; (j > first) && (data[j - 1] > entry); j--)
            data[j] = data[j - 1];  // shift elements to open a space
        data[j] = entry;            // copy the new element back into the array
    }
}
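
A minimal driver (not part of the slides) exercising the three quadratic sorts above; it assumes the three methods are static members of one class, and the test data is arbitrary:

public class SimpleSortsDemo {
    public static void main(String[] args) {
        int[] a = {5, 1, 4, 2, 8};
        int[] b = a.clone();
        int[] c = a.clone();
        bubbleSort(a, 0, a.length);       // the slide methods, assumed in scope
        selectionsort(b, 0, b.length);
        insertionsort(c, 0, c.length);
        System.out.println(java.util.Arrays.toString(a));  // [1, 2, 4, 5, 8]
        System.out.println(java.util.Arrays.toString(b));  // same result
        System.out.println(java.util.Arrays.toString(c));  // same result
    }
}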

45 Analysis of Insertion-Sort
Worst case: elements initially in reverse of sorted order; O(n²) comparisons and swaps. Average case: same analysis as the worst case. Best case: elements initially in sorted order; no swaps, O(n) comparisons.

46 Advantages of Insertion-Sort
Can be done “in-place”. If the data is in “nearly sorted” order, it runs in O(n) time.

47 Timing and Other Issues
Bubble-sort, selection-sort, and insertion-sort have a worst-case time of O(n²), making them impractical for large arrays. But they are easy to program and easy to debug. Insertion-sort also performs well when the array is nearly sorted to begin with. More sophisticated (divide-and-conquer) sorting algorithms are needed when good performance is required in all cases for large arrays.

48 Divide-and-Conquer: a recursive design technique
Solve a small problem directly. Divide a large problem into two subproblems, each approximately half the size of the original problem. Solve each subproblem with a recursive call. Combine the solutions of the two subproblems to obtain a solution of the larger problem.

49 Divide-and-Conquer Sorting
General algorithm: divide the elements to be sorted into two lists of (approximately) equal size; sort each smaller list (using divide-and-conquer, unless the size of the list is 1); combine the two sorted lists into one larger sorted list.

50 Divide-and-Conquer Sorting
Design decisions: How is the list partitioned into two lists? How are the two sorted lists combined? Common techniques: merge-sort (trivial partition, merge to combine) and quick-sort (sophisticated partition, no work to combine).

51 Merge-Sort Divide the array into the first half and the last half.
Sort each subarray with a recursive call. Merge the two sorted subarrays together, using a temporary array to do the merge.

52 Merge-Sort Algorithm
public static void mergesort(int[] data, int first, int n) {
    int n1, n2;  // sizes of the subarrays
    if (n > 1) {
        // divide
        n1 = n / 2;
        n2 = n - n1;
        // recursive calls on the smaller subsequences
        mergesort(data, first, n1);
        mergesort(data, first + n1, n2);
        // conquer: merge the sorted halves
        merge(data, first, n1, n2);
    }
}

53 Merging Two Sorted Sequences
The conquer step of merge-sort consists of merging two sorted sequences A and B into a sorted sequence S. Let A = a1, a2, …, an and B = b1, b2, …, bm; we want to insert B into A. We scan A from the left for the correct position for b1. We can then continue, without going back, to scan for the correct position for b2, and so on. Merging two sorted sequences of n and m elements and copying them into S takes O(n + m) time.

54 Merging Two Sorted Sequences
public static void merge(int[] data, int first, int n1, int n2) {
    int[] temp = new int[n1 + n2];  // holds the merged result
    int i;
    int c = 0, c1 = 0, c2 = 0;      // cursors into temp and the two halves
    while ((c1 < n1) && (c2 < n2)) {
        if (data[first + c1] < data[first + n1 + c2])
            temp[c++] = data[first + (c1++)];
        else
            temp[c++] = data[first + n1 + (c2++)];
    }
    while (c1 < n1)
        temp[c++] = data[first + (c1++)];
    while (c2 < n2)
        temp[c++] = data[first + n1 + (c2++)];
    for (i = 0; i < n1 + n2; i++)
        data[first + i] = temp[i];
}

55 Merge-Sort Tree
An execution of merge-sort is depicted by a binary tree: each node represents a recursive call of merge-sort and stores the unsorted sequence before the execution (with its partition) and the sorted sequence at the end of the execution. The root is the initial call; the leaves are calls on subsequences of size 0 or 1. (Diagram: the merge-sort tree for the sequence 7 2 9 4, whose sorted result is 2 4 7 9.)

56 Execution Example: Partition
(Slides 56–65 animate the merge-sort tree for the input 7 2 9 4 3 8 6 1, whose sorted result is 1 2 3 4 6 7 8 9; only the step names are reproduced here.)

57 Execution Example: Recursive call, partition

58 Execution Example: Recursive call, partition

59 Execution Example: Recursive call, base case

60 Execution Example: Recursive call, base case

61 Execution Example: Merge

62 Execution Example: Recursive call, …, base case, merge

63 Execution Example: Merge

64 Execution Example: Recursive call, …, merge, merge

65 Execution Example: Merge

66 Analysis of Merge-Sort
First we examine the running time spent at each node: we perform only partitioning and merging, and both take time linear in the number of elements at that node. The total number of elements is the same at every depth of the tree (at depth i there are 2ⁱ sequences of size n/2ⁱ), so the overall work done at the nodes of depth i is O(n). The height h of the merge-sort tree is O(log n), since each recursive call divides the sequence in half. Thus, the total running time of merge-sort is O(n log n) (best, average, and worst cases).
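
The same bound can be read off the standard merge-sort recurrence; a short derivation, with c and k my notation rather than the slides':

\begin{aligned}
T(n) &= 2\,T(n/2) + cn, \qquad T(1) \le c \\
     &= 2^k\,T(n/2^k) + k\,cn && \text{(after unrolling $k$ levels)} \\
     &= n\,T(1) + cn\log_2 n && \text{(taking $k = \log_2 n$)} \\
     &= O(n \log n).
\end{aligned}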

67 Advantages of Merge-Sort
Conceptually simple. Suited to sorting linked lists of elements, because merge traverses each linked list sequentially. Suited to sorting external files: it divides the data into smaller files until each can be stored in an array in memory. Stable performance (O(n log n) in every case). However, it needs O(n) extra space for a temporary array to hold the data between steps (its main drawback).

68 Sorting Huge Files
A huge file cannot fit in even the largest possible array.

69 Sorting Huge Files Sort smaller subfiles
(Slides 69–74 animate the process: split the file into subfiles that fit in memory, sort each subfile, then merge the sorted subfiles back together.)

70 Sorting Huge Files Sort smaller subfiles

71 Sorting Huge Files Merge subfiles

72 Sorting Huge Files Merge subfiles

73 Sorting Huge Files Merge subfiles

74 Sorting Huge Files Merge subfiles into one file

75 Sorting Huge Files: k-way merge
Merge k subfiles at once.
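
A k-way merge is typically driven by a min-heap holding one cursor per subfile. A sketch over in-memory int arrays standing in for sorted subfiles ("runs"); class and method names are mine, and a real external sort would stream each run from disk in blocks:

import java.util.PriorityQueue;

public class KWayMerge {
    /** Merges k sorted int arrays ("runs") into one sorted array. */
    public static int[] mergeK(int[][] runs) {
        int total = 0;
        for (int[] run : runs) total += run.length;

        // Min-heap of {runIndex, positionInRun}, ordered by the value pointed at.
        PriorityQueue<int[]> heap = new PriorityQueue<>(
                (x, y) -> Integer.compare(runs[x[0]][x[1]], runs[y[0]][y[1]]));
        for (int r = 0; r < runs.length; r++)
            if (runs[r].length > 0) heap.add(new int[] {r, 0});

        int[] out = new int[total];
        int c = 0;
        while (!heap.isEmpty()) {
            int[] top = heap.poll();               // cursor at the smallest front element
            out[c++] = runs[top[0]][top[1]];
            if (top[1] + 1 < runs[top[0]].length)  // advance the cursor within its run
                heap.add(new int[] {top[0], top[1] + 1});
        }
        return out;
    }
}

Each poll and add costs O(log k), so merging N total elements across k runs takes O(N log k) time.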

76 Quick-Sort The quick-sort algorithm sorts a sequence S recursively, based on the divide-and-conquer technique. Divide: if S has at least 2 elements (with fewer it is already sorted), select a random element x (called the pivot) and partition S into two sequences: L, storing the elements less than (or equal to) x, and G, storing the elements greater than x. (Design decision: how to choose the pivot.) Recursively sort the sequences L and G. Conquer: put the elements back into S by concatenating L, x, and G.

77 Quick-Sort Tree An execution of quick-sort is depicted by a binary tree. Each node represents a recursive call of quick-sort and stores the unsorted sequence before the execution (with its pivot) and the sorted sequence at the end of the execution. The root is the initial call; the leaves are calls on subsequences of size 0 or 1. (Diagram: a small quick-sort tree.)

78 Execution Example: Pivot selection
(Slides 78–87 animate quick-sort on the input 7 2 9 4 3 7 6 1, whose sorted result is 1 2 3 4 6 7 7 9; only the step names are reproduced here.)

79 Execution Example: Partition, recursive call, pivot selection

80 Execution Example: Partition, recursive call, base case

81 Execution Example: Partition, recursive call, pivot selection

82 Execution Example: Partition, recursive call, base case

83 Execution Example: Base case, join

84 Execution Example: Recursive call, …, base case, join

85 Execution Example: Recursive call, pivot selection

86 Execution Example: Partition, …, recursive call, base case

87 Execution Example: Join, join

88 Worst-case Running Time
We get the worst-case running time W(n) when the sequence of n elements is already in sorted order: each step partitions n – 1 elements, then recursively sorts 0 elements and n – 1 elements. The worst-case running time is therefore (n – 1) + (n – 2) + … + 1 = O(n²).

89 Expected Running Time Consider a recursive call of quick-sort on a sequence of size n with a randomly selected pivot. We say we made a good call if the sizes of L and G are each at least n/4 and at most 3n/4, and a bad call otherwise. A call is good with probability 1/2, since half of the possible pivots cause good calls.

90 Expected Running Time In the average case we expect most recursive calls to be good calls, so on average each partition step splits the sequence roughly in half. The height of the quick-sort tree is then O(log n), and the average time complexity of quick-sort is O(n log n).

91 Quick-Sort Implementation
We use two indices, the left-most index l and the right-most index r. In the divide step, index l scans the sequence from left to right and index r scans the sequence from right to left, until they cross. A swap is performed when l is at an element larger than the pivot and r is at an element smaller than the pivot. A final swap with the pivot completes one divide step.

92 Quick-Sort Implementation
(Diagram: indices l and r scan toward each other, swapping out-of-place elements, until they cross; a final swap puts the pivot in place.)

93 Quick-Sort Implementation
Algorithm QuickSort(S, a, b)
    Input: array S, integers a and b
    Output: array S with the elements originally at indices a to b, inclusive, sorted in non-decreasing order from indices a to b
    if a ≥ b then return       // there is at most one element
    p := S[b]                  // the pivot
    l := a                     // will scan rightward
    r := b - 1                 // will scan leftward
    while l ≤ r do
        while l ≤ r and S[l] ≤ p do   // find an element larger than the pivot
            l := l + 1
        while r ≥ l and S[r] ≥ p do   // find an element smaller than the pivot
            r := r - 1
        if l < r then
            swap the elements at S[l] and S[r]
    swap the elements at S[l] and S[b]   // put the pivot into its final place
    QuickSort(S, a, l - 1)
    QuickSort(S, l + 1, b)
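
A direct Java rendering of the pseudocode above; a sketch, not part of the slides. It follows the same scheme (last element as pivot), with r starting at b - 1 so that l cannot run past the pivot slot:

public static void quickSort(int[] s, int a, int b) {
    if (a >= b) return;              // at most one element
    int p = s[b];                    // pivot: the last element
    int l = a;                       // scans rightward
    int r = b - 1;                   // scans leftward
    while (l <= r) {
        while (l <= r && s[l] <= p) l++;   // find an element larger than the pivot
        while (r >= l && s[r] >= p) r--;   // find an element smaller than the pivot
        if (l < r) {                       // swap the out-of-place pair
            int t = s[l]; s[l] = s[r]; s[r] = t;
        }
    }
    int t = s[l]; s[l] = s[b]; s[b] = t;   // put the pivot into its final place
    quickSort(s, a, l - 1);
    quickSort(s, l + 1, b);
}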

94 Choosing Pivot How does the method choose the pivot?
The first (or last) element of the sub-array is the pivot; this gives poor partitioning if the data is sorted or nearly sorted. Alternative strategies for choosing the pivot: use the middle element of the sub-array, or look at three elements of the sub-array and choose the middle of the three values (median-of-three), as sketched below.
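
A sketch of the median-of-three selection (helper name is mine): find which of the first, middle, and last values is the median, then swap it into the pivot slot before partitioning:

// Returns the index (among lo, mid, hi) holding the median of the three values.
static int medianOfThree(int[] s, int lo, int hi) {
    int mid = lo + (hi - lo) / 2;
    int a = s[lo], b = s[mid], c = s[hi];
    if ((a <= b && b <= c) || (c <= b && b <= a)) return mid;
    if ((b <= a && a <= c) || (c <= a && a <= b)) return lo;
    return hi;
}

// Usage inside quickSort, before taking p = s[b]:
//   int m = medianOfThree(s, a, b);
//   int t = s[m]; s[m] = s[b]; s[b] = t;   // move the median into the pivot slot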

95 Quick-Sort Improvements
Instead of stopping the recursion when a sub-array has 1 element, stop once sub-arrays are small, and sort each small sub-array without recursion (e.g., with insertion-sort). Alternatively, at the end, the small sub-arrays can all be sorted in one pass with insertion-sort, which is efficient for nearly sorted arrays. Another improvement is to use a non-recursive implementation of quick-sort.

96 Sorting with Binary Trees
Using heaps (see the lecture on heaps): how do we sort using a heap (heap-sort)? Using binary search trees (see the lecture on BSTs): how do we sort using a BST?

97 Heap-Sort Interpret the array as a binary tree. Convert the tree into a heap.
Extract the elements from the heap, placing them into sorted position in the array.

98 Overview of Heap-Sort - 1
Two-stage process. First, heapify the array: rearrange the values in the array so that the corresponding complete binary tree is a heap. The largest element is now at the root position, the first location in the array.

99 Overview of Heap-Sort - 2
Second, repeat: swap the elements in the first and last locations of the heap; now the largest element is in the last position, its correct position in sorted order. The element at the root is out of place, so reheapify downward. The heap shrinks by one, and the sorted sequence grows by one. The next-largest element is now at the root position.
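
A sketch of the two-stage process in Java (method names are mine, not from the slides); the array is viewed as a complete binary tree with the children of node i at indices 2i+1 and 2i+2:

public static void heapSort(int[] data) {
    int n = data.length;
    // Stage 1: heapify; sift down every internal node, from last to first.
    for (int i = n / 2 - 1; i >= 0; i--)
        siftDown(data, i, n);
    // Stage 2: repeatedly swap the root (largest) into its final slot and reheapify.
    for (int end = n - 1; end > 0; end--) {
        int t = data[0]; data[0] = data[end]; data[end] = t;
        siftDown(data, 0, end);   // the heap now occupies data[0..end-1]
    }
}

// Restores the max-heap property for the subtree rooted at i within data[0..size-1].
private static void siftDown(int[] data, int i, int size) {
    while (2 * i + 1 < size) {
        int child = 2 * i + 1;                        // left child
        if (child + 1 < size && data[child + 1] > data[child])
            child++;                                  // right child is larger
        if (data[i] >= data[child]) return;           // heap property holds
        int t = data[i]; data[i] = data[child]; data[child] = t;
        i = child;
    }
}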

100 Analysis of Heap-Sort Time to build the initial heap: O(n log n).
Time to remove the elements from the heap and place them in the sorted array: O(n log n). Overall time: O(n log n) in the best, average, and worst cases.

101 Advantages of Heap-Sort
In-place (doesn’t require a temporary array). Asymptotic analysis is the same as merge-sort and the average case of quick-sort. On average it takes about twice as long as quick-sort.

102 Sorting with BST Use binary search trees for sorting.
Start with an unsorted sequence. Insert all elements into a BST. Traverse the tree… how? Running time? (One possible answer is sketched below.)
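
A hedged sketch of an answer, using Java's TreeMap (a red-black tree, so a balanced BST) with occurrence counts so duplicate keys are not lost; an in-order traversal of the entries reads the keys back in sorted order, giving O(n log n) for the insertions plus O(n) for the traversal:

import java.util.Map;
import java.util.TreeMap;

public class TreeSortDemo {
    /** Sorts in place by inserting into a balanced BST and traversing in order. */
    public static void treeSort(int[] data) {
        // Count occurrences so duplicates survive the round trip.
        TreeMap<Integer, Integer> tree = new TreeMap<>();
        for (int x : data) tree.merge(x, 1, Integer::sum);     // O(log n) per insert
        int i = 0;
        for (Map.Entry<Integer, Integer> e : tree.entrySet())  // in-order traversal
            for (int k = 0; k < e.getValue(); k++)
                data[i++] = e.getKey();
    }
}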

103 Summary of Sorting Algorithms

Algorithm        Time                  Notes
bubble-sort      O(n²)                 slow (good for small inputs)
selection-sort   O(n²)                 slow (good for small inputs)
insertion-sort   O(n²)                 slow (good for small inputs)
quick-sort       O(n log n) expected   in-place, randomized; fastest (good for large inputs)
heap-sort        O(n log n)            fast (good for large inputs)
merge-sort       O(n log n)            sequential data access; fast (good for huge inputs)

104 Comparison-Based Sorting?
What is the Lower Bound of Comparison-Based Sorting?

105 Comparison-Based Sorting
Many sorting algorithms are comparison based: they sort by making comparisons between pairs of objects. Examples: bubble-sort, selection-sort, insertion-sort, heap-sort, merge-sort, quick-sort, … Let us therefore derive a lower bound on the running time of any algorithm that uses comparisons to sort n elements, x1, x2, …, xn. (Diagram: a single comparison node “Is xi < xj?” with yes/no branches.)

106 Counting Comparisons Let us just count comparisons then.
Each possible run of the algorithm corresponds to a root-to-leaf path in a decision tree

107 Decision Tree Height Each leaf is labeled by the permutation of orders that the algorithm determines. How many leaves does the decision tree have?

108 Decision Tree Height The height of this decision tree is a lower bound on the running time. Every possible input permutation must lead to a separate leaf output; if not, some input …4…5… would have the same output ordering as …5…4…, which would be wrong. Since there are n! = 1·2·…·n leaves, the height is at least log(n!).

109 The Lower Bound log(n!) ≈ n log n – 1.44 n
Any comparison-based sorting algorithm requires at least log(n!) comparisons, and log(n!) ≈ n log n – 1.44 n.
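
Where the n log n term comes from can also be seen directly, without Stirling's formula; a standard estimate (notation mine):

\log_2(n!) = \sum_{i=1}^{n} \log_2 i
  \;\ge\; \sum_{i=\lceil n/2 \rceil}^{n} \log_2 i
  \;\ge\; \frac{n}{2} \log_2 \frac{n}{2}
  \;=\; \Omega(n \log n),

and Stirling's approximation sharpens the constant: \log_2(n!) \approx n \log_2 n - n \log_2 e \approx n \log_2 n - 1.44\,n.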

110 Bucket-Sort Assumes the input is generated by a random process that distributes elements uniformly over [0, 1). Idea: divide [0, 1) into n equal-sized buckets; distribute the n input values into the buckets; sort each bucket; then go through the buckets in order, listing the elements in each one. Input: A[1 . . n], where 0 ≤ A[i] < 1 for all i. Auxiliary array: B[0 . . n − 1] of linked lists, each list initially empty.

111 Bucket-Sort Algorithm
Algorithm bucketSort(A, n)
    Input: array A
    Output: array A sorted in non-decreasing order
    // Assumes the input is in the n-element array A and each element satisfies 0 ≤ A[i] < 1.
    // Also needs an auxiliary array B[0 . . n − 1] of linked lists (buckets).
    for i = 0 to n − 1 do
        insert A[i] into list B[⌊n · A[i]⌋]
    sort each list B[i] with insertion-sort
    concatenate the lists B[0], B[1], . . . , B[n − 1] together in order
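
A sketch of the algorithm in Java for doubles in [0, 1), using ArrayLists as buckets; Collections.sort stands in for the slides' per-bucket insertion sort (for the expected O(1)-sized buckets the difference is immaterial):

import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class BucketSort {
    /** Sorts an array of doubles assumed uniform over [0, 1). */
    public static void bucketSort(double[] a) {
        int n = a.length;
        // One bucket per element; bucket j covers [j/n, (j+1)/n).
        List<List<Double>> buckets = new ArrayList<>(n);
        for (int j = 0; j < n; j++) buckets.add(new ArrayList<>());
        for (double x : a)
            buckets.get((int) (n * x)).add(x);   // floor(n * A[i])
        int i = 0;
        for (List<Double> bucket : buckets) {
            Collections.sort(bucket);            // the slides use insertion-sort here
            for (double x : bucket) a[i++] = x;  // concatenate in bucket order
        }
    }
}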

112 Bucket-Sort Example
Input: 29 25 3 49 9 37 21 43
Buckets: 0–9: 3 9 | 10–19: (empty) | 20–29: 21 25 29 | 30–39: 37 | 40–49: 43 49
Output: 3 9 21 25 29 37 43 49

113 Correctness of Bucket-Sort
Consider A[i] and A[j], and assume without loss of generality that A[i] ≤ A[j]. Then ⌊n · A[i]⌋ ≤ ⌊n · A[j]⌋, so A[i] is placed into the same bucket as A[j] or into a bucket with a lower index. If it is the same bucket, insertion-sort fixes up the order; if it is an earlier bucket, the concatenation of the lists fixes up the order.

114 Analysis of Bucket-Sort
Relies on no bucket getting too many values. All lines of the algorithm except the insertion sorting take Θ(n) altogether. If we have n elements and use n buckets, then on average we expect one element per bucket, so it takes O(1) expected time to sort each bucket ⇒ O(n) sort time for all buckets.

115 Radix-Sort Can we perform bucket sort on any array of (non-negative) integers? Yes, but note that the number of buckets will depend on the maximum integer value. If you are sorting 1000 integers and the maximum value is 999,999, you will need 1 million buckets (in order to have one record per bucket)! Can we do better?

116 Radix Sort Idea: repeatedly sort by digit; perform multiple bucket sorts on S, starting with the rightmost digit. If the maximum value is 999,999, only ten buckets (not 1 million) will be necessary. Use this strategy when the keys are integers and there is a reasonable limit on their values. The number of passes (bucket sort stages) will depend on the number of digits in the maximum value. (A sketch follows.)
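
A sketch of the idea in Java for non-negative ints, using a stable counting pass per decimal digit in place of an explicit bucket sort; the backward scan is what keeps each pass stable:

public class RadixSort {
    /** LSD radix sort for non-negative integers, one stable pass per decimal digit. */
    public static void radixSort(int[] a) {
        int max = 0;
        for (int x : a) max = Math.max(max, x);
        int[] out = new int[a.length];
        for (long exp = 1; exp <= max; exp *= 10) {  // one pass per digit
            int[] count = new int[10];
            for (int x : a) count[(int) (x / exp % 10)]++;
            for (int d = 1; d < 10; d++) count[d] += count[d - 1];  // prefix sums
            for (int i = a.length - 1; i >= 0; i--)  // backward scan: stable
                out[--count[(int) (a[i] / exp % 10)]] = a[i];
            System.arraycopy(out, 0, a, 0, a.length);
        }
    }
}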

117 Example: first pass Sorting by the rightmost digit: 12 58 37 64 52 36 99 63 18 9 20 88 47 becomes 20 12 52 63 64 36 37 47 58 18 88 9 99.

118 Example: second pass Sorting by the next digit: 20 12 52 63 64 36 37 47 58 18 88 9 99 becomes 9 12 18 20 36 37 47 52 58 63 64 88 99.

119 Example: 1st and 2nd passes
12 58 37 64 52 36 99 63 18 9 20 88 47 → (sort by rightmost digit) → 20 12 52 63 64 36 37 47 58 18 88 9 99 → (sort by leftmost digit) → 9 12 18 20 36 37 47 52 58 63 64 88 99

120 Radix-Sort and Stability
Radix sort works as long as the bucket sort stages are stable sorts. Stable sort: in case of ties, the relative order of elements is preserved in the resulting array. Suppose two elements have the same first digit, for example 52 and 58: if 52 occurs before 58 in the array prior to the sorting stage, 52 should occur before 58 in the resulting array. This way, the work carried out in the previous bucket sort stages is preserved.

121 Time Complexity If there is a fixed number p of bucket sort stages (six stages in the case where the maximum value is 999,999), then radix sort is O(n): there are p bucket sort stages, each taking O(n) time. Strictly speaking, the time complexity is O(pn), where p is the number of digits (note that p ≈ log₁₀ m, where m is the maximum value in the list).

122 About Radix-Sort Note that only 10 buckets are needed regardless of the number of stages, since the buckets are reused at each stage. Radix sort can also apply to words: set a limit on the number of letters in a word and use 27 buckets (or more, depending on the letters/characters allowed), one for each letter plus a “blank” character. The word-length limit is exactly the number of bucket sort stages needed.

123 Summary Bucket sort and radix sort are O(n) algorithms only because we have imposed restrictions on the input list to be sorted. Sorting, in general, can be done in O(n log n) time.

