
1 Data Structures and Algorithm Analysis Algorithm Analysis and Sorting
Lecturer: Jing Liu Homepage:

2 What Are Algorithms? An algorithm is a procedure consisting of a finite set of instructions that, given an input from some set of possible inputs, enables us to obtain an output, if such an output exists, through a systematic execution of the instructions, or else to obtain nothing at all if there is no output for that particular input.

3 Inputs (Problems) → Instructions executed on Computers → Outputs (Answers)

4 Software Systems: Programming Languages, Data Structures, Algorithms

5 Binary Search Let A[1…n] be a sequence of n elements. Consider the problem of determining whether a given element x is in A.

6 Binary Search Example: A[1…14]=1 4 5 7 8 9 10 12 15 22 23 27 32 35
Does x exist in A? How many comparisons do you need to give the answer?

7 Binary Search Inputs: (1) An array A[1…n] of n elements sorted in nondecreasing order; (2) x; Output: j if x=A[j], and 0 otherwise;
1. low ← 1; high ← n; j ← 0;
2. while (low ≤ high) and (j = 0)
3.   mid ← ⌊(low+high)/2⌋;
4.   if x = A[mid] then j ← mid;
5.   else if x < A[mid] then high ← mid-1;
6.   else low ← mid+1;
7. end while;
8. return j;

8 Binary Search What's the performance of the algorithm Binary Search on a sorted array of size n? What's the minimum number of comparisons we need, and in which case does it happen? What's the maximum number of comparisons we need, and in which case does it happen?

9 Binary Search The number of comparisons performed by Algorithm BINARYSEARCH on a sorted array of size n is at most ⌊log n⌋ + 1.

10 Binary Search Can you implement the Binary Search in a certain kind of computer language?
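As a concrete illustration, a minimal C version of the pseudocode on slide 7 might look as follows (the function name binary_search is illustrative, and the indices are 0-based rather than 1-based, so a missing element is signalled with -1 instead of 0):

/* Return the 0-based index of x in the sorted array a[0..n-1], or -1 if x is absent. */
int binary_search(const int a[], int n, int x)
{
    int low = 0, high = n - 1;
    while (low <= high) {
        int mid = low + (high - low) / 2;   /* midpoint; written this way to avoid overflow */
        if (a[mid] == x)
            return mid;                     /* found: j <- mid in the pseudocode */
        else if (x < a[mid])
            high = mid - 1;                 /* search the left half  */
        else
            low = mid + 1;                  /* search the right half */
    }
    return -1;                              /* plays the role of returning 0 */
}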

11 Merging Two Sorted Lists
Suppose we have an array A[1…m] and three indices p, q, and r, with 1 ≤ p ≤ q < r ≤ m, such that both the subarrays A[p…q] and A[q+1…r] are individually sorted in nondecreasing order. We want to rearrange the elements in A so that the elements in the subarray A[p…r] are sorted in nondecreasing order.

12 Merging Two Sorted Lists
Example: A[1…7]= p=1, q=3, r=7 A[1…3] and A[4…7] are already sorted in nondecreasing order; sort A in nondecreasing order. How many comparisons do you need to give the answer?

13 Merging Two Sorted Lists
Inputs: (1) A[1…m]; (2) p, q, and r, with 1 ≤ p ≤ q < r ≤ m, such that both A[p…q] and A[q+1…r] are sorted individually in nondecreasing order; Output: A[p…r] contains the result of merging the two subarrays A[p…q] and A[q+1…r];

14 Merging Two Sorted Lists
1. s ← p; t ← q+1; k ← p;
2. while s ≤ q and t ≤ r
3.   if A[s] ≤ A[t] then
4.     B[k] ← A[s];
5.     s ← s+1;
6.   else
7.     B[k] ← A[t];
8.     t ← t+1;
9.   end if;
10.  k ← k+1;
11. end while;
12. if s = q+1 then B[k…r] ← A[t…r]
13. else B[k…r] ← A[s…q]
14. end if
15. A[p…r] ← B[p…r]
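A C sketch of this procedure, assuming 0-based indices and a caller-supplied scratch buffer b at least as long as a (the names merge, a, and b are illustrative):

/* Merge the sorted runs a[p..q] and a[q+1..r] into nondecreasing order, using b as scratch space. */
void merge(int a[], int b[], int p, int q, int r)
{
    int s = p, t = q + 1, k = p;
    while (s <= q && t <= r)                 /* compare the heads of the two runs */
        b[k++] = (a[s] <= a[t]) ? a[s++] : a[t++];
    while (s <= q)                           /* copy whatever remains of the left run  */
        b[k++] = a[s++];
    while (t <= r)                           /* copy whatever remains of the right run */
        b[k++] = a[t++];
    for (k = p; k <= r; k++)                 /* copy the merged result back into a     */
        a[k] = b[k];
}

Only one of the two leftover-copy loops actually runs, which corresponds to steps 12-13 of the pseudocode.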

15 Merging Two Sorted Lists
What's the performance of the algorithm Merging Two Sorted Lists? What's the minimum number of comparisons we need, and in which case does it happen? What's the maximum number of comparisons we need, and in which case does it happen? What's the minimum number of element assignments we need, and in which case does it happen? What's the maximum number of element assignments we need, and in which case does it happen?

16 Merging Two Sorted Lists
The number of element comparisons performed by Algorithm MERGE to merge two nonempty arrays of sizes n1 and n2, respectively, where n1 ≤ n2, into one sorted array of size n=n1+n2 is between n1 and n-1.

17 Merging Two Sorted Lists
The number of element assignments performed by Algorithm MERGE to merge two arrays into one sorted array of size n is exactly 2n.

18 Problems and Algorithms
Each algorithm is designed to solve one problem, but for one problem we can design many different algorithms. What is the difference between these algorithms? Why do we design different algorithms for the same problem? Example Problem: Sorting. Algorithms: Selection Sort, Insertion Sort, Bottom-Up Merge Sorting.

19 Selection Sort Let A[1…n] be an array of n elements.
A simple and straightforward algorithm to sort the entries in A works as follows. First, we find the minimum element and store it in A[1]. Next, we find the minimum of the remaining n-1 elements and store it in A[2]. We continue this way until the second largest element is stored in A[n-1].

20 Selection Sort Input: A[1…n];
Output: A[1…n] sorted in nondecreasing order;
1. for i ← 1 to n-1
2.   k ← i;
3.   for j ← i+1 to n
4.     if A[j] < A[k] then k ← j;
5.   end for;
6.   if k ≠ i then interchange A[i] and A[k];
7. end for;
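A direct C translation of this pseudocode could look like the following sketch (0-based indices; the name selection_sort is illustrative):

/* Sort a[0..n-1] in nondecreasing order by repeatedly selecting the minimum of the unsorted suffix. */
void selection_sort(int a[], int n)
{
    for (int i = 0; i < n - 1; i++) {
        int k = i;
        for (int j = i + 1; j < n; j++)
            if (a[j] < a[k])
                k = j;                       /* k tracks the smallest element in a[i..n-1] */
        if (k != i) {                        /* at most 3 element assignments per outer iteration */
            int tmp = a[i];
            a[i] = a[k];
            a[k] = tmp;
        }
    }
}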

21 Selection Sort What’s the performance of the algorithm Selection Sort?
What's the minimum number of comparisons we need, and in which case does it happen? What's the maximum number of comparisons we need, and in which case does it happen? What's the minimum number of element assignments? What's the maximum number of element assignments?

22 Selection Sort The number of element comparisons performed by Algorithm SELECTIONSORT to sort an array with n elements is n(n-1)/2.

23 Selection Sort The number of element assignments performed by Algorithm SELECTIONSORT to sort an array with n elements is between 0 and 3(n-1).

24 Insertion Sort We begin with the subarray of size 1, A[1], which is already sorted. Next, A[2] is inserted before or after A[1] depending on whether it is smaller than A[1] or not. Continuing this way, in the ith iteration, A[i] is inserted into its proper position in the sorted subarray A[1 … i-1]. This is done by scanning the elements from index i-1 down to 1, each time comparing A[i] with the element at the current position.

25 Insertion Sort In each iteration of the scan, an element is shifted one position up to a higher index. This process of scanning, comparing, and shifting continues until an element less than or equal to A[i] is found, or until the entire sorted sequence so far is exhausted. At this point, A[i] is inserted in its proper position, and the process of inserting element A[i] in its proper place is complete.

26 Insertion Sort Example: A[1 … 4]=

27 Insertion Sort Input: An array A[1…n] of n elements;
Output: A[1…n] sorted in nondecreasing order;
1. for i ← 2 to n
2.   x ← A[i];
3.   j ← i-1;
4.   while (j > 0) and (A[j] > x)
5.     A[j+1] ← A[j];
6.     j ← j-1;
7.   end while;
8.   A[j+1] ← x;
9. end for;
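The same algorithm in C might be sketched as follows (0-based indices; the name insertion_sort is illustrative):

/* Sort a[0..n-1] in nondecreasing order by inserting each element into the sorted prefix before it. */
void insertion_sort(int a[], int n)
{
    for (int i = 1; i < n; i++) {
        int x = a[i];                        /* element to insert */
        int j = i - 1;
        while (j >= 0 && a[j] > x) {         /* shift larger elements one position up */
            a[j + 1] = a[j];
            j--;
        }
        a[j + 1] = x;                        /* drop x into its proper position */
    }
}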

28 Insertion Sort What’s the performance of the algorithm Insertion Sort?
What's the minimum number of comparisons we need, and in which case does it happen? What's the maximum number of comparisons we need, and in which case does it happen? What's the minimum number of element assignments? What's the maximum number of element assignments?

29 Insertion Sort The number of element comparisons performed by Algorithm INSERTIONSORT is between n-1 and n(n-1)/2.

30 Insertion Sort The number of element assignments performed by Algorithm INSERTIONSORT is equal to the number of element comparisons plus n-1.

31 Bottom-Up Merge Sorting
Example: A[1 … 8]=

32 Bottom-Up Merge Sorting
Example: A[1 … 11]=

33 Bottom-Up Merge Sorting
Let A be an array of n elements that is to be sorted. We first merge ⌊n/2⌋ consecutive pairs of elements to yield ⌊n/2⌋ sorted sequences of size 2. If there is one remaining element, then it is passed to the next iteration. Next, we merge ⌊n/4⌋ pairs of consecutive 2-element sequences to yield ⌊n/4⌋ sorted sequences of size 4. If there are one or two remaining elements, then they are passed to the next iteration. If there are three elements left, then two (sorted) elements are merged with one element to form a 3-element sorted sequence.

34 Bottom-Up Merge Sorting
Continuing this way, in the jth iteration, we merge ⌊n/2^j⌋ pairs of sorted sequences of size 2^(j-1) to yield ⌊n/2^j⌋ sorted sequences of size 2^j. If there are k remaining elements, where 1 ≤ k ≤ 2^(j-1), then they are passed to the next iteration. If there are k remaining elements, where 2^(j-1) < k < 2^j, then these are merged to form a sorted sequence of size k.

35 Bottom-Up Merge Sorting
Input: An array A[1…n] of n elements; Output: A[1…n] sorted in nondecreasing order;
1. t ← 1;
2. while t < n
3.   s ← t; t ← 2s; i ← 0;
4.   while i + t ≤ n
5.     MERGE(A, i+1, i+s, i+t);
6.     i ← i + t;
7.   end while;
8.   if i + s < n then MERGE(A, i+1, i+s, n);
9. end while;
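A C sketch of this loop structure, reusing the merge routine shown after slide 14 and again assuming 0-based indices and a scratch buffer b of length n (names illustrative):

/* Bottom-up merge sort of a[0..n-1]; merge() is the two-run merge sketched earlier. */
void bottom_up_sort(int a[], int b[], int n)
{
    int t = 1;
    while (t < n) {
        int s = t;                            /* size of the runs being merged   */
        t = 2 * s;                            /* size of the runs being produced */
        int i = 0;
        while (i + t <= n) {                  /* merge full pairs of runs        */
            merge(a, b, i, i + s - 1, i + t - 1);
            i += t;
        }
        if (i + s < n)                        /* merge a full run with the shorter tail */
            merge(a, b, i, i + s - 1, n - 1);
    }
}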

36 Bottom-Up Merge Sorting
What's the performance of the algorithm Bottom-Up Merge Sorting? What's the minimum number of comparisons we need, and in which case does it happen? What's the maximum number of comparisons we need, and in which case does it happen? What's the minimum number of element assignments? What's the maximum number of element assignments?

37 Bottom-Up Merge Sorting
The total number of element comparisons performed by Algorithm BOTTOMUPSORT to sort an array of n elements, where n is a power of 2, is between (n log n)/2 and n log n - n + 1.

38 Bottom-Up Merge Sorting
The total number of element assignments performed by Algorithm BOTTOMUPSORT to sort an array of n elements, where n is a power of 2, is exactly 2n log n.

39 Time Complexity Selection Sort: n(n-1)/2 Merge Sort: n log n - n + 1
If each comparison needs 10^-6 second, then for n=128: Merge ≈ 0.0008 seconds, Selection ≈ 0.008 seconds; for n=2^20: Merge ≈ 20 seconds, Selection ≈ 6.4 days.
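These figures follow from the comparison counts above (a quick check, using the stated 10^-6 second per comparison): for n = 128, merge sort performs at most n log n - n + 1 = 128·7 - 128 + 1 = 769 comparisons, about 0.0008 seconds, while selection sort performs n(n-1)/2 = 8128 comparisons, about 0.008 seconds. For n = 2^20, merge sort performs roughly 2^20·20 - 2^20 + 1 ≈ 2.0×10^7 comparisons, about 20 seconds, while selection sort performs 2^20(2^20 - 1)/2 ≈ 5.5×10^11 comparisons, about 5.5×10^5 seconds, or roughly 6.4 days.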

40 Time Complexity When analyzing the running time:
(1) We usually compare its behavior with that of another algorithm that solves the same problem.
(2) It is desirable for an algorithm to be not only machine independent, but also capable of being expressed in any language.
(3) It should be technology independent, that is, we want our measure of the running time of an algorithm to survive technological advances.
(4) Our main concern is not with small input sizes; we are mostly concerned with the behavior of the algorithm on large input instances.

41 Time Complexity Therefore, usually only Elementary Operations are used to evaluate the time complexity. Elementary Operation: any computational step whose cost is always upper-bounded by a constant amount of time regardless of the input data or the algorithm used.

42 Time Complexity Examples of elementary operations:
(1) Arithmetic operations: addition, subtraction, multiplication, and division
(2) Comparisons and logical operations
(3) Assignments, including assignments of pointers

43 Time Complexity Usually, we care about how the number of elementary operations increases with the size of the input, namely the rate of growth or the order of growth of the running time. This can be expressed by a function, for example: f(n) = n^2 log n + 10n^2 + n. Once we dispose of lower order terms and leading constants from a function that expresses the running time of an algorithm, we say that we are measuring the asymptotic running time of the algorithm, namely its time complexity.

44 Time Complexity Functions that are widely used to represent the running times of algorithms:
(1) Sublinear: n^c, n^c log^k n, 0 < c < 1
(2) Linear: cn
(3) Logarithmic: log^k n
(4) Subquadratic: n log n, n^1.5
(5) Quadratic: cn^2
(6) Cubic: cn^3

45 Time Complexity In order to formalize the notions of time complexity, special mathematical notations have been widely used. (1) O-notation: An upper bound on the running time. The running time of an algorithm is O(g(n)) if, whenever the input size is equal to or exceeds some threshold n0, its running time can be bounded above by some positive constant c times g(n). (2) Ω-notation: A lower bound on the running time. The running time of an algorithm is Ω(g(n)) if, whenever the input size is equal to or exceeds some threshold n0, its running time can be bounded below by some positive constant c times g(n).

46 Time Complexity (3) Θ-notation: An exact bound. The running time of an algorithm is of order Θ(g(n)) if, whenever the input size is equal to or exceeds some threshold n0, its running time can be bounded below by c1·g(n) and above by c2·g(n), where 0 < c1 ≤ c2.
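Written out formally, these are the standard definitions the slides paraphrase (n0, c, c1, c2 are positive constants that may depend on the algorithm but not on n):
f(n) = O(g(n)) if there exist c > 0 and n0 such that f(n) ≤ c·g(n) for all n ≥ n0.
f(n) = Ω(g(n)) if there exist c > 0 and n0 such that f(n) ≥ c·g(n) for all n ≥ n0.
f(n) = Θ(g(n)) if there exist c1, c2 > 0 and n0 such that c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0.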

47 Time Complexity (1) f(n) is Ω(g(n)) if and only if g(n) is O(f(n)).
(2) f(n) = Θ(g(n)) if and only if f(n) = O(g(n)) and f(n) = Ω(g(n)).

48 Time Complexity It may be helpful to think of O as similar to ≤, Ω as similar to ≥, and Θ as similar to =. Do not confuse the exact relations with the asymptotic notations:
100n ≥ n, but there exists c such that when n > n0, 100n ≤ cn, so 100n = O(n).
n ≤ 100n, but there exists c such that when n > n0, n ≥ c·100n, so n = Ω(100n); for example, c ≤ 0.01.
n ≠ 100n, but there exist c1, c2 such that when n > n0, c1·100n ≤ n ≤ c2·100n, so n = Θ(100n); for example, c1 ≤ 0.01 and c2 ≥ 0.01.

49 Time Complexity How to use the notations O, Ω, Θ to represent the time complexity of the three previous sorting algorithms?

50 Time Complexity How to use the notations O, Ω, Θ to represent the following functions? Any constant function; log n^2; log n^k; log n!

51 Space Complexity We define the space used by an algorithm to be the number of memory cells (or words) needed to carry out the computational steps required to solve an instance of the problem excluding the space allocated to hold the input. That is, it is only the work space required by the algorithm.

52 Space Complexity The work space cannot exceed the running time of an algorithm, as writing into each memory cell requires at least a constant amount of time. Let T(n) and S(n) denote, respectively, the time and space complexities of an algorithm, then S(n)=O(T(n))

53 Space Complexity Selection Sort and Insertion Sort: Θ(1); Merge: Θ(n);
Merge Sort: Θ(n)

54 Radix Sort Let L={a1, a2, …, an} be a list of n numbers, each consisting of exactly k digits. That is, each number is of the form dk dk-1 … d1, where each di is a digit between 0 and 9. If the numbers are first distributed into the lists by their least significant digit, then a very efficient algorithm results. Suppose that the numbers are sorted lexicographically according to their least significant k-1 digits, i.e., digits dk-1, dk-2, …, d1. After sorting them on their kth digit, they will eventually be sorted.

55 Radix Sort First, distribute the numbers into 10 lists L0, L1, …, L9 according to digit d1 so that those numbers with d1=0 constitute list L0, those with d1=1 constitute list L1 and so on. Next, the lists are coalesced in the order L0, L1, …, L9. Then, they are distributed into 10 lists according to digit d2, coalesced in order, and so on. After distributing them according to dk and collecting them in order, all numbers will be sorted.

56 Radix Sort Example: Sort A in nondecreasing order.

57 Radix Sort Input: A linked list of numbers L={a1, a2, …, an} and k, the number of digits. Output: L sorted in nondecreasing order.
1. for j ← 1 to k
2.   Prepare 10 empty lists L0, L1, …, L9;
3.   while L is not empty
4.     a ← next element in L;
5.     Delete a from L;
6.     i ← jth digit in a;
7.     Append a to list Li;
8.   end while;
9.   L ← L0;
10.  for i ← 1 to 9
11.    L ← L, Li  // Append list Li to L
12.  end for;
13. end for;
14. return L;

58 Radix Sort What’s the performance of the algorithm Radix Sort?
Time Complexity? Space Complexity?

59 Radix Sort Time Complexity: Θ(n) Space Complexity: Θ(n)

60 Radix Sort Write codes to implement the Radix Sort.
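One possible C sketch, replacing the linked lists of the pseudocode with a counting pass into 10 buckets per digit (the function name radix_sort is illustrative, and the keys are assumed to be nonnegative integers with at most k decimal digits):

#include <stdlib.h>

/* LSD radix sort of a[0..n-1], where every key has at most k decimal digits. */
void radix_sort(int a[], int n, int k)
{
    int *b = malloc(n * sizeof *b);            /* scratch space for one distribution pass */
    if (b == NULL) return;
    long long divisor = 1;                     /* selects the current digit, least significant first */
    for (int j = 0; j < k; j++) {
        int count[10] = {0};
        for (int i = 0; i < n; i++)            /* count how many keys land in each bucket */
            count[(a[i] / divisor) % 10]++;
        for (int d = 1; d < 10; d++)           /* prefix sums: end position of each bucket */
            count[d] += count[d - 1];
        for (int i = n - 1; i >= 0; i--)       /* stable placement, scanning right to left */
            b[--count[(a[i] / divisor) % 10]] = a[i];
        for (int i = 0; i < n; i++)            /* coalesce the buckets back into a */
            a[i] = b[i];
        divisor *= 10;
    }
    free(b);
}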

61 Quicksort Let A[low…high] be an array of n numbers, and x=A[low].
We consider the problem of rearranging the elements in A so that all elements less than or equal to x precede x, which in turn precedes all elements greater than x. After permuting the elements in the array, x will be A[w] for some w, low ≤ w ≤ high. The action of rearrangement is also called splitting or partitioning around x, which is called the pivot or splitting element.

62 Quicksort We say that an element A[j] is in its proper position or correct position if it is neither smaller than the elements in A[low…j-1] nor larger than the elements in A[j+1…high]. After partitioning an array A using x ∈ A as a pivot, x will be in its correct position.

63 Quicksort Input: An array of elements A[low…high];
Output: (1) A with its elements rearranged, if necessary; (2) w, the new position of the splitting element A[low];
1. i ← low;
2. x ← A[low];
3. for j ← low+1 to high
4.   if A[j] ≤ x then
5.     i ← i+1;
6.     if i ≠ j then interchange A[i] and A[j];
7.   end if;
8. end for;
9. interchange A[low] and A[i];
10. w ← i;
11. return A and w;

64 Quicksort Example: Adjust 5 to be in the correct position.

65 Quicksort The number of element comparisons performed by Algorithm SPLIT is exactly n-1. Thus, its time complexity is Θ(n). The only extra space used is that needed to hold its local variables. Therefore, the space complexity is Θ(1).

66 Quicksort Input: An array A[1…n] of n elements;
Output: The elements in A sorted in nondecreasing order;
1. quicksort(A, 1, n);

quicksort(A, low, high)
1. if low < high then
2.   SPLIT(A[low…high], w)  // w is the new position of A[low]
3.   quicksort(A, low, w-1);
4.   quicksort(A, w+1, high);
5. end if;

67 Quicksort The average number of comparisons performed by Algorithm QUICKSORT to sort an array of n elements is Θ(n log n).

68 Quicksort Write codes to implement the Quicksort.
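Putting the SPLIT routine and the recursive driver together, one possible C sketch is the following (0-based indices; the names split and quicksort mirror the pseudocode but are otherwise illustrative):

/* Partition a[low..high] around the pivot x = a[low]; return the pivot's final index. */
static int split(int a[], int low, int high)
{
    int i = low;
    int x = a[low];
    for (int j = low + 1; j <= high; j++) {
        if (a[j] <= x) {
            i++;
            if (i != j) {                          /* move the small element into the left block */
                int tmp = a[i]; a[i] = a[j]; a[j] = tmp;
            }
        }
    }
    int tmp = a[low]; a[low] = a[i]; a[i] = tmp;   /* place the pivot in its correct position */
    return i;
}

/* Sort a[low..high] in nondecreasing order; call as quicksort(a, 0, n - 1). */
void quicksort(int a[], int low, int high)
{
    if (low < high) {
        int w = split(a, low, high);
        quicksort(a, low, w - 1);
        quicksort(a, w + 1, high);
    }
}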

69 Homework Exercises: 7.1 7.13 Implement Quicksort in C

