1 Sorting Algorithms Written by J.J. Shepherd

2 Sorting Review For each one of these sorting problems we are assuming ascending order, so smallest to largest. Two you've seen before: – Selection – Bubble

3 Selection Sort Scans through the data structure, finds the smallest element, and swaps that element with the first element. Then it looks for the next smallest and does the same. This is repeated until the end of the data structure is reached.

4 Selection Sort Look for the smallest element in the array, since the smallest value goes first.
index: 0 1 2 3 4 5 6 7 8 9
value: 9 5 6 7 3 4 10 1 2 8

5 Selection Sort The first value is assumed to be the smallest.
index: 0 1 2 3 4 5 6 7 8 9
value: 9 5 6 7 3 4 10 1 2 8

6 Selection Sort The next value is examined, and it is smaller than the value at the first index, so that is now assumed to be the smallest value. Store that index.
index: 0 1 2 3 4 5 6 7 8 9
value: 9 5 6 7 3 4 10 1 2 8

7 Selection Sort This value is larger, so keep going.
index: 0 1 2 3 4 5 6 7 8 9
value: 9 5 6 7 3 4 10 1 2 8

8 Selection Sort This value is larger, so keep going.
index: 0 1 2 3 4 5 6 7 8 9
value: 9 5 6 7 3 4 10 1 2 8

9 Selection Sort This value is smaller, so store this index.
index: 0 1 2 3 4 5 6 7 8 9
value: 9 5 6 7 3 4 10 1 2 8

10 Selection Sort This value is larger, so move on.
index: 0 1 2 3 4 5 6 7 8 9
value: 9 5 6 7 3 4 10 1 2 8

11 Selection Sort This value is larger, so move on.
index: 0 1 2 3 4 5 6 7 8 9
value: 9 5 6 7 3 4 10 1 2 8

12 Selection Sort This value is smaller, so save this index.
index: 0 1 2 3 4 5 6 7 8 9
value: 9 5 6 7 3 4 10 1 2 8

13 Selection Sort This value is larger, so move on.
index: 0 1 2 3 4 5 6 7 8 9
value: 9 5 6 7 3 4 10 1 2 8

14 Selection Sort This value is larger, so move on.
index: 0 1 2 3 4 5 6 7 8 9
value: 9 5 6 7 3 4 10 1 2 8

15 Selection Sort Now we've reached the end, so we swap the stored smallest value with the value at the first index.
index: 0 1 2 3 4 5 6 7 8 9
value: 9 5 6 7 3 4 10 1 2 8

16 Selection Sort That index is complete, so we never test it again. We move on to finding the next smallest value.
index: 0 1 2 3 4 5 6 7 8 9
value: 1 5 6 7 3 4 10 9 2 8

17 Selection Sort It starts on the next index.
index: 0 1 2 3 4 5 6 7 8 9
value: 1 5 6 7 3 4 10 9 2 8

18 Selection Sort After a while we discover that this is the next smallest value.
index: 0 1 2 3 4 5 6 7 8 9
value: 1 5 6 7 3 4 10 9 2 8

19 Selection Sort Swap these values.
index: 0 1 2 3 4 5 6 7 8 9
value: 1 5 6 7 3 4 10 9 2 8

20 Selection Sort Start the process again for the next smallest value.
index: 0 1 2 3 4 5 6 7 8 9
value: 1 2 6 7 3 4 10 9 5 8

21 Selection Sort Eventually this is the result.
index: 0 1 2 3 4 5 6 7 8 9
value: 1 2 3 4 5 6 7 8 9 10
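
Below is a minimal Java sketch of selection sort as walked through above. It is an illustration only (the class, method, and variable names are my own, not the original slides' code):

public class SelectionSortDemo {
    // Repeatedly find the smallest remaining value and swap it into place.
    public static void selectionSort(int[] data) {
        for (int i = 0; i < data.length - 1; i++) {
            int smallestIndex = i;                    // assume the first untested value is the smallest
            for (int j = i + 1; j < data.length; j++) {
                if (data[j] < data[smallestIndex]) {  // found something smaller: store that index
                    smallestIndex = j;
                }
            }
            int temp = data[i];                       // swap the stored smallest value into position i
            data[i] = data[smallestIndex];
            data[smallestIndex] = temp;
        }
    }

    public static void main(String[] args) {
        int[] values = {9, 5, 6, 7, 3, 4, 10, 1, 2, 8};
        selectionSort(values);
        System.out.println(java.util.Arrays.toString(values)); // [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
    }
}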


23 Selection Sort Theoretically, how long does this take in the worst case scenario? Again, let's remember Big O: one function (f(x)) is bounded by another (g(x)) multiplied by some large constant (M).

24 Selection Sort Let's assume the data structure has n elements. Then how many times will this iteration run?

25 Selection Sort
– Search for the smallest element = n
– Search for the next smallest = n-1
– Search for the next smallest = n-2
– …
– The final element = 1

26 Selection Sort If we add all of these searches together, we can say it takes roughly n² steps to sort every element. Thus O(n²).
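
To see where the n² comes from (a quick check, not on the original slide): the searches take n + (n-1) + (n-2) + … + 1 = n(n+1)/2 comparisons, which is roughly n²/2, and dropping the constant factor gives O(n²).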

27 Bubble Sort The idea is that you keep swapping adjacent values that are out of order until no more swaps are made; the largest values "bubble up" to the top of the data structure.

28 Bubble Sort Examine two side-by-side elements; if the left one is larger than the right one, swap them.
index: 0 1 2 3 4 5 6 7 8 9
value: 9 5 6 7 3 4 10 1 2 8

29 Bubble Sort Left is larger than right. SWAP!
index: 0 1 2 3 4 5 6 7 8 9
value: 9 5 6 7 3 4 10 1 2 8

30 Bubble Sort Move forward.
index: 0 1 2 3 4 5 6 7 8 9
value: 5 9 6 7 3 4 10 1 2 8

31 Bubble Sort Left is larger than right. SWAP!
index: 0 1 2 3 4 5 6 7 8 9
value: 5 9 6 7 3 4 10 1 2 8

32 Bubble Sort Move forward.
index: 0 1 2 3 4 5 6 7 8 9
value: 5 6 9 7 3 4 10 1 2 8

33 Bubble Sort Left is larger than right. SWAP!
index: 0 1 2 3 4 5 6 7 8 9
value: 5 6 9 7 3 4 10 1 2 8

34 Bubble Sort Move forward.
index: 0 1 2 3 4 5 6 7 8 9
value: 5 6 7 9 3 4 10 1 2 8

35 Bubble Sort Left is larger than right. SWAP!
index: 0 1 2 3 4 5 6 7 8 9
value: 5 6 7 9 3 4 10 1 2 8

36 Bubble Sort Move forward.
index: 0 1 2 3 4 5 6 7 8 9
value: 5 6 7 3 9 4 10 1 2 8

37 Bubble Sort Left is larger than right. SWAP!
index: 0 1 2 3 4 5 6 7 8 9
value: 5 6 7 3 9 4 10 1 2 8

38 Bubble Sort Move forward.
index: 0 1 2 3 4 5 6 7 8 9
value: 5 6 7 3 4 9 10 1 2 8

39 Bubble Sort Left is less than right, so this pair is already in order. Move forward.
index: 0 1 2 3 4 5 6 7 8 9
value: 5 6 7 3 4 9 10 1 2 8

40 Bubble Sort Left is larger than right. SWAP!
index: 0 1 2 3 4 5 6 7 8 9
value: 5 6 7 3 4 9 10 1 2 8

41 Bubble Sort Move forward.
index: 0 1 2 3 4 5 6 7 8 9
value: 5 6 7 3 4 9 1 10 2 8

42 Bubble Sort Left is larger than right. SWAP!
index: 0 1 2 3 4 5 6 7 8 9
value: 5 6 7 3 4 9 1 10 2 8

43 Bubble Sort Move forward.
index: 0 1 2 3 4 5 6 7 8 9
value: 5 6 7 3 4 9 1 2 10 8

44 Bubble Sort Left is larger than right. SWAP!
index: 0 1 2 3 4 5 6 7 8 9
value: 5 6 7 3 4 9 1 2 10 8

45 Bubble Sort We've reached the end, but since there was at least one swap the process has to start all over again from the beginning.
index: 0 1 2 3 4 5 6 7 8 9
value: 5 6 7 3 4 9 1 2 8 10
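
Below is a minimal Java sketch of bubble sort matching the description above; again an illustration only, with my own names rather than the slides' code:

public class BubbleSortDemo {
    // Keep sweeping the array, swapping adjacent out-of-order pairs,
    // until a full pass makes no swaps.
    public static void bubbleSort(int[] data) {
        boolean swapped = true;
        while (swapped) {
            swapped = false;
            for (int i = 0; i < data.length - 1; i++) {
                if (data[i] > data[i + 1]) {      // left is larger than right: swap
                    int temp = data[i];
                    data[i] = data[i + 1];
                    data[i + 1] = temp;
                    swapped = true;               // at least one swap, so start over again
                }
            }
        }
    }

    public static void main(String[] args) {
        int[] values = {9, 5, 6, 7, 3, 4, 10, 1, 2, 8};
        bubbleSort(values);
        System.out.println(java.util.Arrays.toString(values)); // [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
    }
}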


47 Bubble Sort Theoretically, how long does Bubble Sort run in the worst case scenario? What is the worst case scenario for bubble sort?

48 Bubble Sort The worst case scenario is that we are given a data structure of n values that are sorted… backwards. Let's examine the swaps involved in this case.

49 Bubble Sort
– The first iteration takes n swaps
– The next takes n-1 swaps
– The next takes n-2 swaps
– …
– Finally, 0 swaps

50 Bubble Sort If we add all of these swaps together, we can say it takes roughly n² swaps to sort every element. Thus O(n²).


53 Merge Sort A divide-and-conquer algorithm that splits a data structure in half over and over again and then merges the elements back together piece by piece. Similar in concept to binary search, but applied to sorting.

54 Merge Sort Split the structure in half until single elements remain.
index: 0 1 2 3 4 5 6 7 8 9
value: 9 5 6 7 3 4 10 1 2 8
→ [9, 5, 6, 7, 3]   [4, 10, 1, 2, 8]

55 Merge Sort Split the structure in half until single elements remain.
[9, 5, 6, 7, 3]   [4, 10, 1, 2, 8]
→ [9, 5, 6]   [7, 3]   [4, 10, 1]   [2, 8]

56 Merge Sort Split the structure in half until single elements remain.
[9, 5, 6]   [7, 3]   [4, 10, 1]   [2, 8]
→ [9, 5]   [6]   [7]   [3]   [4, 10]   [1]   [2]   [8]

57 Merge Sort Split the structure in half until single elements remain.
[9, 5]   [6]   [7]   [3]   [4, 10]   [1]   [2]   [8]
→ [9]   [5]   [6]   [7]   [3]   [4]   [10]   [1]   [2]   [8]

58 Merge Sort Finally we have single elements, so we can start merging.
[9]   [5]   [6]   [7]   [3]   [4]   [10]   [1]   [2]   [8]

59 Merge Sort It's sort of hard to see how merging works in the first step, as it's just one comparison.
[5, 9]   [6, 7]   [3, 4]   [1, 10]   [2, 8]

60 Merge Sort The idea of merging is that each smaller data structure is assumed to have already been sorted in the previous step. That way we never need to re-sort those structures, only merge them against each other.
[5, 9]   [6, 7]   [3, 4]   [1, 10]   [2, 8]

61 Merge Sort Now we continue to merge.
[5, 9]   [6, 7]   [3, 4]   [1, 10]   [2, 8]

62 Merge Sort Check the first two values. The smaller one is added to the new data structure and its index is moved forward. The other remains the same.
merging [5, 9] and [6, 7] → new structure: [ ]

63 Merge Sort Check the indexed values. The smaller one is added to the new data structure and its index is moved forward. The other remains the same.
merging [5, 9] and [6, 7] → new structure: [5]

64 Merge Sort Check the indexed values. The smaller one is added to the new data structure and its index is moved forward. The other remains the same.
merging [5, 9] and [6, 7] → new structure: [5, 6]

65 Merge Sort The second data structure has reached its end, so the rest of the first data structure is simply added to the end.
merging [5, 9] and [6, 7] → new structure: [5, 6, 7]

66 Merge Sort The second data structure has reached its end, so the rest of the first data structure is simply added to the end.
merging [5, 9] and [6, 7] → new structure: [5, 6, 7, 9]

67 Merge Sort Similarly, let's look at the next merge.
done so far: [5, 6, 7, 9]
merging [3, 4] and [1, 10] → new structure: [ ]

68 Merge Sort Similarly, let's look at the next merge.
merging [3, 4] and [1, 10] → new structure: [1]

69 Merge Sort Similarly, let's look at the next merge.
merging [3, 4] and [1, 10] → new structure: [1, 3]

70 Merge Sort Similarly, let's look at the next merge.
merging [3, 4] and [1, 10] → new structure: [1, 3, 4]

71 Merge Sort The first list is exhausted, so the remaining 10 is added to the end.
merging [3, 4] and [1, 10] → new structure: [1, 3, 4, 10]

72 Merge Sort And the next one.
merging [5, 6, 7, 9] and [1, 3, 4, 10] → new structure: [ ]   (with [2, 8] still waiting)

73 Merge Sort And the next one.
merging [5, 6, 7, 9] and [1, 3, 4, 10] → [1]

74 Merge Sort And the next one.
merging [5, 6, 7, 9] and [1, 3, 4, 10] → [1, 3]

75 Merge Sort And the next one.
merging [5, 6, 7, 9] and [1, 3, 4, 10] → [1, 3, 4]

76 Merge Sort And the next one.
merging [5, 6, 7, 9] and [1, 3, 4, 10] → [1, 3, 4, 5]

77 Merge Sort And the next one.
merging [5, 6, 7, 9] and [1, 3, 4, 10] → [1, 3, 4, 5, 6]

78 Merge Sort And the next one.
merging [5, 6, 7, 9] and [1, 3, 4, 10] → [1, 3, 4, 5, 6, 7]

79 Merge Sort And the next one.
merging [5, 6, 7, 9] and [1, 3, 4, 10] → [1, 3, 4, 5, 6, 7, 9]

80 Merge Sort The first list is exhausted, so the remaining 10 is added to the end.
merging [5, 6, 7, 9] and [1, 3, 4, 10] → [1, 3, 4, 5, 6, 7, 9, 10]

81 Merge Sort Finally, after the last merge with [2, 8], this is the result.
index: 0 1 2 3 4 5 6 7 8 9
value: 1 2 3 4 5 6 7 8 9 10
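
Below is a minimal Java sketch of merge sort following the split-then-merge idea above; it is an illustration with my own names, not the original slides' code:

import java.util.Arrays;

public class MergeSortDemo {
    // Recursively split the array in half, sort each half, then merge them.
    public static int[] mergeSort(int[] data) {
        if (data.length <= 1) {
            return data;                                   // a single element is already sorted
        }
        int mid = data.length / 2;
        int[] left = mergeSort(Arrays.copyOfRange(data, 0, mid));
        int[] right = mergeSort(Arrays.copyOfRange(data, mid, data.length));
        return merge(left, right);
    }

    // Merge two already-sorted arrays into one sorted array.
    private static int[] merge(int[] left, int[] right) {
        int[] result = new int[left.length + right.length];
        int i = 0, j = 0, k = 0;
        while (i < left.length && j < right.length) {
            if (left[i] <= right[j]) {
                result[k++] = left[i++];                   // take the smaller front value
            } else {
                result[k++] = right[j++];
            }
        }
        while (i < left.length) result[k++] = left[i++];   // one side ran out: copy the rest of the other
        while (j < right.length) result[k++] = right[j++];
        return result;
    }

    public static void main(String[] args) {
        int[] values = {9, 5, 6, 7, 3, 4, 10, 1, 2, 8};
        System.out.println(Arrays.toString(mergeSort(values))); // [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
    }
}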


83 Merge Sort Theoretically, how long does merge sort take? There are essentially two steps that work in conjunction with each other: – Dividing the structure – Merging it back together

84 Merge Sort We can actually visualize how long it takes with the recursion tree: the size halves at each level, n → n/2 → n/4 → … → 1.

85 Merge Sort Dividing the structure takes lg(n) time: the size halves at each level (n, n/2, n/4, …, 1), so the tree has lg(n) levels.

86 Merge Sort Merging takes n time: each level of the tree merges a total of n elements.

87 Merge Sort If we combine the dividing with the merging parts, we finally get that it takes n·lg(n) time. Thus O(n lg n).
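
Another way to check this (not from the slides) is the usual merge sort recurrence T(n) = 2T(n/2) + cn: every level of the recursion tree does about cn work merging, and there are about lg(n) levels, so the total is roughly cn·lg(n), i.e. O(n lg n).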

88 Was All of This Really Worth it?

89 Common Big O Complexities

90 Quick Sort Partition the structure around a pivot value (here the pivot is 4). Scanning from the left (index i), look for the first element that is larger than the pivot; index j will scan from the right.
index: 0 1 2 3 4 5 6 7 8 9
value: 9 5 6 7 3 4 10 1 2 8

91 Quick Sort Scanning from the left, the first element larger than the pivot is 9 (index 0).
index: 0 1 2 3 4 5 6 7 8 9
value: 9 5 6 7 3 4 10 1 2 8

92 Quick Sort Now scan from the right, looking for the first element that is less than the pivot.
index: 0 1 2 3 4 5 6 7 8 9
value: 9 5 6 7 3 4 10 1 2 8

93 Quick Sort Scanning from the right, the first element less than the pivot is 2 (index 8).
index: 0 1 2 3 4 5 6 7 8 9
value: 9 5 6 7 3 4 10 1 2 8

94 Quick Sort Swap those elements!
index: 0 1 2 3 4 5 6 7 8 9
value: 2 5 6 7 3 4 10 1 9 8

95 Quick Sort Repeat that process. Scanning from the left, look for the next element that is greater than the pivot.
index: 0 1 2 3 4 5 6 7 8 9
value: 2 5 6 7 3 4 10 1 9 8

96 Quick Sort Scanning from the right, look for the next element that is less than the pivot.
index: 0 1 2 3 4 5 6 7 8 9
value: 2 5 6 7 3 4 10 1 9 8

97 Quick Sort Swap!
index: 0 1 2 3 4 5 6 7 8 9
value: 2 1 6 7 3 4 10 5 9 8

98 Quick Sort Continue on: scan from the left for the next element greater than the pivot.
index: 0 1 2 3 4 5 6 7 8 9
value: 2 1 6 7 3 4 10 5 9 8

99 Quick Sort Continue on: scan from the right for the next element less than the pivot.
index: 0 1 2 3 4 5 6 7 8 9
value: 2 1 6 7 3 4 10 5 9 8

100 Quick Sort Continue on: the left scan stops at 6 and the right scan stops at 3.
index: 0 1 2 3 4 5 6 7 8 9
value: 2 1 6 7 3 4 10 5 9 8

101 Quick Sort Swap!
index: 0 1 2 3 4 5 6 7 8 9
value: 2 1 3 7 6 4 10 5 9 8

102 Quick Sort Now, since i = j (the two scans have met), we need to split the data structure and put the pivot in the center.
index: 0 1 2 3 4 5 6 7 8 9
value: 2 1 3 7 6 4 10 5 9 8

103 Quick Sort Now we repeat the same process for the smaller structures.
[2, 1, 3]   [4]   [7, 6, 10, 5, 9, 8]
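
Below is a minimal Java sketch of a quick sort in the same spirit as the walkthrough, using the common two-scan (Hoare-style) partition with the middle element as the pivot. It is an illustration only; the exact pivot rule and swap details may differ from the slides:

public class QuickSortDemo {
    // Sort data[low..high] by partitioning around a pivot and recursing on each side.
    public static void quickSort(int[] data, int low, int high) {
        if (low >= high) {
            return;                                    // zero or one element: already sorted
        }
        int pivot = data[(low + high) / 2];            // pick the middle element as the pivot
        int i = low, j = high;
        while (i <= j) {
            while (data[i] < pivot) i++;               // scan from the left for a value >= pivot
            while (data[j] > pivot) j--;               // scan from the right for a value <= pivot
            if (i <= j) {
                int temp = data[i];                    // swap the out-of-place pair
                data[i] = data[j];
                data[j] = temp;
                i++;
                j--;
            }
        }
        quickSort(data, low, j);                       // repeat the process on the smaller structures
        quickSort(data, i, high);
    }

    public static void main(String[] args) {
        int[] values = {9, 5, 6, 7, 3, 4, 10, 1, 2, 8};
        quickSort(values, 0, values.length - 1);
        System.out.println(java.util.Arrays.toString(values)); // [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
    }
}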


105 Quick Sort How long does this take theoretically? What is its worst case scenario?

106 Quick Sort Strangely enough, its worst case scenario is an already sorted array. In this one unique case the pivot is selected every time and is swapped in and out of place n times for a data structure of size n, so technically it is O(n²).

107 Quick Sort However, since this is a rare case, and assuming the pivot is randomly chosen and not fixed, the average case becomes Θ(n lg n).
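
As a small illustration of that assumption (my own sketch, not from the slides), one common fix is to move a randomly chosen element into the pivot position before partitioning, so an already sorted input no longer hits the fixed-pivot worst case:

import java.util.concurrent.ThreadLocalRandom;

public class RandomPivotDemo {
    // Illustrative helper: swap a randomly chosen element of data[low..high]
    // into the middle position, so a partition that uses the middle element
    // as its pivot effectively gets a random pivot.
    static void randomizePivot(int[] data, int low, int high) {
        int randomIndex = ThreadLocalRandom.current().nextInt(low, high + 1);
        int mid = (low + high) / 2;
        int temp = data[randomIndex];
        data[randomIndex] = data[mid];
        data[mid] = temp;
    }

    public static void main(String[] args) {
        int[] values = {1, 2, 3, 4, 5, 6, 7, 8, 9, 10};   // already sorted: the bad case for a fixed pivot
        randomizePivot(values, 0, values.length - 1);
        System.out.println(java.util.Arrays.toString(values));
    }
}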

108 Wrapping up asymptotics Big O (O) – is the worst case Big Omega (Ω) – is the best case scenario Big Theta (Θ) – is the average case scenario

109 Formal Definitions Big O for f(n) = O(g(n)) means there are positive constants c and k, such that 0 ≤ f(n) ≤ cg(n) for all n ≥ k. The values of c and k must be fixed for the function f and must not depend on n.


111 Formal Definitions Big Omega for f(n) = Ω(g(n)) means there are positive constants c and k, such that 0 ≤ cg(n) ≤ f(n) for all n ≥ k. The values of c and k must be fixed for the function f and must not depend on n.

112 Formal Definitions Big Theta for f(n) = Θ(g(n)) means there are positive constants c₁, c₂, and k, such that 0 ≤ c₁g(n) ≤ f(n) ≤ c₂g(n) for all n ≥ k. The values of c₁, c₂, and k must be fixed for the function f and must not depend on n. I.e., it is in between Big O and Big Omega.
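
As a quick worked example (my own, not from the slides): for f(n) = 3n² + 2n we have 3n² ≤ f(n) ≤ 4n² for all n ≥ 2 (since 2n ≤ n² once n ≥ 2), so f(n) = O(n²) with c = 4 and k = 2, f(n) = Ω(n²) with c = 3 and k = 2, and therefore f(n) = Θ(n²) with c₁ = 3, c₂ = 4, and k = 2.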

