PREVIOUS SORTING ALGORITHMS

BUBBLE SORT
- Time complexity: O(n^2)
- For each item, make up to (n - 1) comparisons
- Gives: comparisons = (n - 1) + (n - 2) + ... + 1 = n(n - 1)/2 = O(n^2)

INSERTION SORT
- Also O(n^2)
- Only slightly more efficient than bubble sort, but has low overhead and is easy to write
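As a reminder of where that comparison count comes from, here is a minimal bubble sort sketch (assumed code, not from the slides); pass i performs n - 1 - i comparisons, which sums to n(n - 1)/2:

void BubbleSort(int a[], int n)
{
    for (int i = 0; i < n - 1; i++)            // n - 1 passes
    {
        for (int j = 0; j < n - 1 - i; j++)    // comparisons in pass i
        {
            if (a[j] > a[j + 1])               // adjacent pair out of order: swap
            {
                int temp = a[j];
                a[j] = a[j + 1];
                a[j + 1] = temp;
            }
        }
    }
}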
PREVIOUS SORTING ALGORITHMS (cont.)

SELECTION SORT
- Also O(n^2)
- Can be more efficient than bubble sort on larger lists, since each item is moved to its final position as soon as it is located (one swap per pass)
- In the worst case, each of the n passes makes up to n - 1 comparisons, giving O(n^2)
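A minimal selection sort sketch (assumed code, not from the slides), showing how each pass makes up to n - 1 comparisons but only one swap, placing an item in its final position:

void SelectionSort(int a[], int n)
{
    for (int i = 0; i < n - 1; i++)         // pass i fixes position i
    {
        int min = i;
        for (int j = i + 1; j < n; j++)     // up to n - 1 comparisons
        {
            if (a[j] < a[min])
                min = j;
        }
        // one swap per pass: a[min] moves to its final position
        int temp = a[i];
        a[i] = a[min];
        a[min] = temp;
    }
}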
SLIGHTLY FASTER SORT ALGORITHMS

SHELL SORT
- Does insertion sort on items that are far apart, using fewer comparisons
- Better than insertion sort because items are quickly moved toward their final destinations
- The algorithm picks a "gap" and sorts items that are "gap" elements apart
- The "gap" is reduced and the process repeated until the gap is 1, at which point the list is sorted
SHELL SORT (code)

void ShellSort(int a[], int n)
{
    int gap, i, j, temp;
    // create the "gap": start at n/2 and halve it each time
    for (gap = n / 2; gap > 0; gap /= 2)
    {
        // insertion-sort the items that are "gap" elements apart
        for (i = gap; i < n; i++)
        {
            for (j = i - gap; j >= 0 && a[j] > a[j + gap]; j -= gap)
            {
                // swap the out-of-order pair
                temp = a[j];
                a[j] = a[j + gap];
                a[j + gap] = temp;
            }
        }
    }
}
SHELL SORT (Example)

Original list: 75 70 84 78 100 1 18

How would the trace of this algorithm go? (See the sketch below.)
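One way to answer that (a sketch, not from the slides): instrument the ShellSort code above to print the list after each gap pass.

#include <iostream>
using namespace std;

void ShellSortTrace(int a[], int n)
{
    for (int gap = n / 2; gap > 0; gap /= 2)
    {
        for (int i = gap; i < n; i++)
        {
            for (int j = i - gap; j >= 0 && a[j] > a[j + gap]; j -= gap)
            {
                int temp = a[j];
                a[j] = a[j + gap];
                a[j + gap] = temp;
            }
        }
        // show the list after finishing this gap
        cout << "gap = " << gap << ": ";
        for (int k = 0; k < n; k++)
            cout << a[k] << " ";
        cout << endl;
    }
}

int main()
{
    int list[] = {75, 70, 84, 78, 100, 1, 18};
    ShellSortTrace(list, 7);
    return 0;
}

With this list and n = 7, the gaps are 3 and 1: after the gap = 3 pass the list becomes 18 70 1 75 100 84 78, and the gap = 1 pass finishes the sort.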
SHELL SORT (Analysis)

Time complexity (depends on the gap sequence; the proofs are quite involved):
- roughly O(n^(3/2)) in the worst case for well-chosen gaps
- about O(n^(7/6)) in the average case
FASTER SORT ALGORITHMS (cont.)

QUICK SORT
- Improves slightly on shell sort (in the average case)
- Partitions large lists into smaller lists and sorts the partitions (based on a pivot element)
- The process is repeated until lists of size 1 remain, at which point the list is sorted
- Uses a divide-and-conquer strategy (recursive calls)
- Uses a special function called Partition() that is responsible for splitting the lists
QUICK SORT (code)

void QuickSort(int a[], int low, int high)
{
    int part;
    if (low < high)
    {
        // partition the list around a pivot
        part = Partition(a, low, high);
        // sort the left side
        QuickSort(a, low, part - 1);
        // sort the right side
        QuickSort(a, part + 1, high);
    }
    // else: a list of 0 or 1 elements is already sorted
}
QUICK SORT (code cont.)

int Partition(int a[], int low, int high)
{
    int up, down, pivot, temp;
    pivot = a[low];     // use the first element as the pivot
    up = high;
    down = low;
    while (down < up)
    {
        // scan from the left for an item greater than the pivot
        while ((a[down] <= pivot) && (down < high))
            down++;
        // scan from the right for an item not greater than the pivot
        while (a[up] > pivot)
            up--;
QUICK SORT (Partition Function)

        // swap the out-of-place pair if the scans have not crossed
        if (down < up)
        {
            temp = a[up];
            a[up] = a[down];
            a[down] = temp;
        }
    }   // end while (down < up)

    // put the pivot into its final position
    a[low] = a[up];
    a[up] = pivot;
    return up;
}
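A small driver (a sketch, not from the slides) showing how QuickSort might be called on the list used in the example that follows; it assumes the QuickSort and Partition functions above:

#include <iostream>
using namespace std;

// QuickSort and Partition as defined above

int main()
{
    int list[] = {75, 70, 84, 78, 100, 55, 61, 81, 65};
    int n = 9;
    QuickSort(list, 0, n - 1);        // sort positions 0 through n - 1
    for (int i = 0; i < n; i++)
        cout << list[i] << " ";       // prints: 55 61 65 70 75 78 81 84 100
    cout << endl;
    return 0;
}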
EXAMPLE – QUICKSORT

Original list: 75 70 84 78 100 55 61 81 65

1st call: the pivot (P) is 75, the first element:
  75 70 84 78 100 55 61 81 65

- Now search from the left (D) for items > pivot and from the right (U) for items <= pivot:
  D stops at 84, U stops at 65

- Swap D and U and continue the search:
  75 70 65 78 100 55 61 81 84
QUICKSORT (cont.)

- Swap D and U again (D stops at 78, U stops at 61), and continue:
  75 70 65 61 100 55 78 81 84

- Again swap D and U (D stops at 100, U stops at 55), and continue:
  75 70 65 61 55 100 78 81 84

- D and U have now met, so that spot is swapped with the pivot:
  55 70 65 61 75 100 78 81 84

- Now everything left of the pivot is less than the pivot and everything right of the pivot is greater than the pivot
- The position of the pivot is returned, the list is split at that point, and the process is repeated on each side
QUICK SORT (Analysis)

Time complexity:
- average case: O(n log n)
- worst case: O(n^2), but not likely in practice

Explanation: the running time is the time of the two recursive calls plus the time spent in Partition.
- Partition: n - 1 comparisons
- Recursive calls:
  - average case: the pivot splits the array roughly evenly, giving T(n) = O(n log n)
  - worst case: the pivot value ends up being the smallest or largest element (e.g., the first element of an already-sorted array), so one side of every split is empty
  - ex: a list of size 8 is reduced to sizes 7, 6, 5, ... so the recursion is n levels deep, with O(n) partitioning work per level
  - therefore, the overall worst case is T(n) = O(n^2)
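The recursion behind these bounds can be summarized with the usual recurrences (a sketch of the standard argument, not from the slides):

  Average case (the pivot splits the array roughly in half):
    T(n) = 2*T(n/2) + c*n      which solves to  T(n) = O(n log n)

  Worst case (the pivot is the smallest or largest element, so one side is empty):
    T(n) = T(n - 1) + c*n      which solves to  T(n) = c*(n + (n - 1) + ... + 1) = O(n^2)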
FASTER SORT ALGORITHMS (cont.)

MERGE SORT
- Has a better worst-case time than the O(n^2) sorts above
- Divides the list into 2 halves and sorts them
- The two sorted halves are then "merged" (combined by repeatedly taking the smaller of the two front elements)
- Uses recursive function calls and partitioning similar to Quick Sort
- Algorithm:
  - If the input sequence has only one element, return.
  - Otherwise:
    - Partition the input sequence into two halves.
    - Sort the two subsequences using the same algorithm.
    - Merge the two sorted subsequences to form the output sequence.
  - The recursion repeats this until sublists of 1 element remain.
MERGE SORT (code)

void MergeSort(Item A[], int First, int Last)
{
    int Mid;                 // will be the index of the middle element
    if (First < Last)        // only do this if the array has more than one element
    {
        Mid = (First + Last) / 2;
        // sort the first half
        MergeSort(A, First, Mid);
        // sort the second half
        MergeSort(A, Mid + 1, Last);
        // now merge the two sorted halves
        Merge(A, First, Mid, Last);
    }
}   // end MergeSort
MERGE SORT (MERGE function)

void Merge(Item A[], int F, int Mid, int L)
{
    Item TempArray[MAXARRAY];    // temporary array
    int First1 = F;              // beginning of first subarray
    int Last1 = Mid;             // end of first subarray
    int First2 = Mid + 1;        // beginning of second subarray
    int Last2 = L;               // end of second subarray
    int Index = First1;          // next available location in the temp array

    // while both subarrays still have elements, copy the smaller front element
    for (; (First1 <= Last1) && (First2 <= Last2); ++Index)
    {
        // first case: take the element from the first subarray
        if (A[First1] < A[First2])
        {
            TempArray[Index] = A[First1];
            First1++;
        }
MERGE SORT (Merge Function cont.)

        // otherwise: take the element from the second subarray
        else
        {
            TempArray[Index] = A[First2];
            First2++;
        }   // end if
    }   // end for

    // copy anything left over in the first subarray
    for (; First1 <= Last1; First1++, Index++)
        TempArray[Index] = A[First1];

    // now finish off the second subarray
    for (; First2 <= Last2; First2++, Index++)
        TempArray[Index] = A[First2];

    // now copy back from the temporary array to the original one
    for (Index = F; Index <= L; Index++)
        A[Index] = TempArray[Index];
}   // end Merge
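A small driver (a sketch, not from the slides) showing how MergeSort might be called on the list used in the next example; Item and MAXARRAY are assumed definitions and would need to appear before the functions above in a real file:

#include <iostream>
using namespace std;

typedef int Item;              // assumed: Item is a plain int in these examples
const int MAXARRAY = 100;      // assumed capacity; must be at least the list size

// MergeSort and Merge as defined above

int main()
{
    Item list[] = {38, 16, 27, 39, 12, 26};
    MergeSort(list, 0, 5);            // sort positions 0 through 5
    for (int i = 0; i < 6; i++)
        cout << list[i] << " ";       // prints: 12 16 26 27 38 39
    cout << endl;
    return 0;
}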
EXAMPLE – MERGE SORT

Original list: 38 16 27 39 12 26

1st call: the list is split into two halves:
  L1: 38 16 27    L2: 39 12 26

2nd call: L1 is split into two halves:
  L3: 38 16    L4: 27    L2: 39 12 26

3rd call: L3 is split into two halves:
  L5: 38    L6: 16    L4: 27    L2: 39 12 26
MERGE SORT (cont.)

L5 and L6 are now single elements, so those two are merged:
  L7: 16 38    L4: 27    L2: 39 12 26

Then L7 is merged with L4:
  L8: 16 27 38    L2: 39 12 26

4th call: L2 is split into two halves:
  L8: 16 27 38    L9: 39 12    L10: 26
MERGE SORT (cont.)

5th call: L9 is now split:
  L8: 16 27 38    L11: 39    L12: 12    L10: 26

Now L11 and L12 are single elements, so they are merged:
  L8: 16 27 38    L13: 12 39    L10: 26

Then L13 and L10 are merged:
  L8: 16 27 38    L14: 12 26 39

Finally, L8 and L14 are merged, giving the final sorted list:
  Final list: 12 16 26 27 38 39
MERGE SORT (Analysis)

- To determine the efficiency of the splitting part of the algorithm, consider how many times the data has to be split.
- A data set of size 4 has to be split twice: once into two sets of two and then again into four sets of one.
- A data set of size 8 has to be split 3 times, 16 pieces of data have to be split 4 times, 32 needs 5 splits, and so on.
- This behavior is described by the logarithm:
  log2(4) = 2,  log2(8) = 3,  log2(16) = 4,  log2(32) = 5
- This means the splitting produces O(log n) levels of sublists.
MERGE SORT (Analysis cont.)

- The merging is done by doing one comparison for each pair of elements at the front of the two sublists.
- For example, to merge the subarrays (2 4) and (0 1 7), the following steps take place:
  compare 0 & 2, 1 & 2, 2 & 7, 4 & 7, then copy the remaining 7 with no comparison.
- That is roughly one operation per element, so merging a whole level of sublists is O(n).
- Because there are log(n) levels of sublists to merge, the efficiency of mergesort is O(n log(n)).
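A rough back-of-the-envelope comparison (not from the slides) of what this buys for a list of 32 items:

  merge sort:      log2(32) = 5 levels of merging, about 32 operations per level, roughly 160 operations
  an O(n^2) sort:  32 * 31 / 2 = 496 comparisons in the worst case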
QUESTIONS? - Get ready for TEST #2!!!