
1 2-1 Chapter 2 The Complexity of Algorithms and the Lower Bounds of Problems

2 2-2 The goodness of an algorithm
Time complexity (more important)
Space complexity
For a parallel algorithm: time-processor product
For a VLSI circuit: area-time (AT, AT^2)

3 2-3 Measure the goodness of an algorithm
Time complexity of an algorithm
efficient (algorithm)
worst-case
average-case
amortized
We can use the number of comparisons to measure a sorting algorithm.

4 2-4 Measure the difficulty of a problem
NP-complete?
Undecidable?
Is the algorithm best?
optimal (algorithm)
We can also use the number of comparisons to measure a sorting problem.

5 2-5 Asymptotic notations
Def: f(n) = O(g(n)) ("at most") iff there exist c and n0 such that |f(n)| ≤ c|g(n)| for all n ≥ n0.
e.g. f(n) = 3n^2 + 2, g(n) = n^2: with n0 = 2 and c = 4, f(n) = O(n^2)
e.g. f(n) = n^3 + n = O(n^3)
e.g. f(n) = 3n^2 + 2 = O(n^3) or O(n^100)
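
As a concrete illustration of the definition (not part of the original slides; the program and the range it tests are only a sketch), the following C program checks numerically that the witness pair c = 4, n0 = 2 indeed gives 3n^2 + 2 ≤ 4n^2:

#include <stdio.h>

/* Check the witness pair (c, n0) = (4, 2) for f(n) = 3n^2 + 2 = O(n^2).
   The loop only tests a finite range; for n >= 2 the inequality
   3n^2 + 2 <= 4n^2 is equivalent to n^2 >= 2, which always holds. */
int main(void) {
    const long long c = 4, n0 = 2;
    for (long long n = n0; n <= 1000000; n++) {
        long long f = 3 * n * n + 2;   /* f(n)        */
        long long g = n * n;           /* g(n) = n^2  */
        if (f > c * g) {
            printf("violated at n = %lld\n", n);
            return 1;
        }
    }
    printf("3n^2 + 2 <= 4n^2 for all tested n >= 2\n");
    return 0;
}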

6 2-6
Def: f(n) = Ω(g(n)) ("at least", "lower bound") iff there exist c and n0 such that |f(n)| ≥ c|g(n)| for all n ≥ n0.
e.g. f(n) = 3n^2 + 2 = Ω(n^2) or Ω(n)
Def: f(n) = Θ(g(n)) iff there exist c1, c2 and n0 such that c1|g(n)| ≤ |f(n)| ≤ c2|g(n)| for all n ≥ n0.
e.g. f(n) = 3n^2 + 2 = Θ(n^2)
Def: f(n) = o(g(n)) iff lim_{n→∞} (f(n) − g(n))/g(n) = 0, i.e. f(n)/g(n) → 1.
e.g. f(n) = 3n^2 + n = o(3n^2)

7 2-7 Time complexity functions vs. problem size n

function \ n   10          10^2         10^3        10^4
log2 n         3.3         6.6          10          13.3
n              10          10^2         10^3        10^4
n log2 n       0.33x10^2   0.7x10^3     10^4        1.3x10^5
n^2            10^2        10^4         10^6        10^8
2^n            1024        1.3x10^30    >10^100     >10^100
n!             3x10^6      >10^100      >10^100     >10^100

8 2 -8 Rate of growth of common computing time functions

9 2-9 Common computing time functions
O(1) ⊆ O(log n) ⊆ O(n) ⊆ O(n log n) ⊆ O(n^2) ⊆ O(n^3) ⊆ O(2^n) ⊆ O(n!) ⊆ O(n^n)
Exponential algorithm: O(2^n)
Polynomial algorithm
Algorithm A: O(n^3), algorithm B: O(n). Must algorithm B run faster than A?
NO! That is guaranteed only when n is large enough.

10 2-10 Analysis of algorithms
Best case: easiest
Worst case
Average case: hardest

11 2-11 Example: loop (1/3)
Given an integer N, determine whether N is a prime number.
How would you answer this question?

12 2-12 Example: loop (2/3)

count = 0;                       /* counts the number of trial divisions (assumes N >= 2) */
for (i = 2; i < N; i++) {
    count++;
    if (N % i == 0)
        return IS_NOT_A_PRIME;   /* i divides N, so N is composite */
}
return IS_A_PRIME;               /* no divisor found in 2..N-1 */

13 2-13 Example: loop (3/3)

N   Prime?  # divide        N   Prime?  # divide
11  prime   9               21          2
12          1               22          1
13  prime   11              23  prime   21
14          1               24          1
15          2               25          4
16          1               26          1
17  prime   15              27          2
18          1               28          1
19  prime   17              29  prime   27
20          1               30          1
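
The table above can be reproduced with a small, self-contained program built around the loop on slide 12 (a sketch; the helper name check_prime and the output format are mine, while the macros IS_A_PRIME / IS_NOT_A_PRIME are taken from the slide):

#include <stdio.h>

#define IS_A_PRIME     1
#define IS_NOT_A_PRIME 0

/* Trial division as on slide 12; *divides receives the number of
   times the loop body runs (the "# divide" column above). */
int check_prime(int N, int *divides) {
    int count = 0;
    for (int i = 2; i < N; i++) {
        count++;
        if (N % i == 0) {
            *divides = count;
            return IS_NOT_A_PRIME;
        }
    }
    *divides = count;
    return IS_A_PRIME;
}

int main(void) {
    for (int N = 11; N <= 30; N++) {
        int d;
        int p = check_prime(N, &d);
        printf("%2d  %-6s  %2d\n", N, p == IS_A_PRIME ? "prime" : "", d);
    }
    return 0;
}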

14 2-14 Example: recursive
T(n) = T(n-1) + 1       (selection sort)
T(n) = 2T(n-1) + 1      (Hanoi tower)
T(n) = T(n/2) + 1       (binary search)
T(n) = aT(n/b) + f(n)   ??
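
To make the growth of the first three recurrences concrete (this is not on the slide, and the base case T(1) = 1 is an assumption of mine), they can be evaluated directly:

#include <stdio.h>

/* T1(n) = T1(n-1) + 1,  T2(n) = 2*T2(n-1) + 1,  T3(n) = T3(n/2) + 1,
   all with the assumed base case T(1) = 1. */
long long T1(int n) { return n == 1 ? 1 : T1(n - 1) + 1; }
long long T2(int n) { return n == 1 ? 1 : 2 * T2(n - 1) + 1; }
long long T3(int n) { return n == 1 ? 1 : T3(n / 2) + 1; }

int main(void) {
    for (int n = 1; n <= 32; n *= 2)
        printf("n = %2d   T1 = %3lld   T2 = %12lld   T3 = %2lld\n",
               n, T1(n), T2(n), T3(n));
    return 0;
}

The printed values grow like n, 2^n and log n respectively, which is the kind of behavior the general method on the following slides captures.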

15 2-15 A General Method of Solving Divide-and-Conquer Recurrences

16 Deduction-1: given the recurrence, rewrite it into the template. [formulas not transcribed]

17 Example 1, part 1: the recurrence and the start of its solution, labeled (1). [formulas not transcribed]

18 Deduction-2: by iteration and cancellation, and since there are log n terms, ... [formulas not transcribed]

19 Deduction-3
Ο: upper bound
Ω: lower bound
Θ: exact order
ο: exact, even in the constant
Different g(n) correspond to different bounds. [example formulas not transcribed]

20 Example 1, part 2: continuing the solution from (1). [formulas not transcribed]

21 Example 2: recurrence and solution. [formulas not transcribed]

22 General representation: in general, given the recurrence, the solutions still differ only by a constant, provided g(n) is monotone. [formulas not transcribed]

23 Example 3 (Pan's matrix multiplication): recurrence and solution. [formulas not transcribed]

24 More general representation: more generally, given the recurrence, ... [formulas not transcribed]

25 Example 4: recurrence and solution. [formulas not transcribed]

26 Reference: J. L. Bentley, D. Haken, and J. B. Saxe, "A General Method for Solving Divide-and-Conquer Recurrences," SIGACT News, Fall 1980, pp. 36-44.

27 2-27 Straight insertion sort
input: 7,5,1,4,3
7,5,1,4,3
5,7,1,4,3
1,5,7,4,3
1,4,5,7,3
1,3,4,5,7

28 2-28 Straight insertion sort
Algorithm 2.1 Straight Insertion Sort
Input: x_1, x_2, ..., x_n
Output: The sorted sequence of x_1, x_2, ..., x_n
For j := 2 to n do
Begin
    i := j-1
    x := x_j
    While x < x_i and i > 0 do
    Begin
        x_{i+1} := x_i
        i := i-1
    End
    x_{i+1} := x
End
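
A direct C rendering of Algorithm 2.1 (only a sketch of mine, not the slides' own code; note the array is 0-indexed here, while the pseudocode is 1-indexed):

/* Straight insertion sort of a[0..n-1]. */
void insertion_sort(int a[], int n) {
    for (int j = 1; j < n; j++) {
        int x = a[j];                  /* element to be inserted          */
        int i = j - 1;
        while (i >= 0 && x < a[i]) {   /* shift larger elements one right */
            a[i + 1] = a[i];
            i--;
        }
        a[i + 1] = x;                  /* drop x into its final position  */
    }
}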

29 2-29 Analysis of straight insertion sort
best case: O(n) (sorted sequence)
worst case: O(n^2) (reverse order)
average case: O(n^2)

30 2-30 Analysis of # of exchanges
Method 1 (straightforward)
x_j is being inserted into the sorted sequence x_1 x_2 ... x_{j-1}.
If x_j is the kth (1 ≤ k ≤ j) largest, it takes (k−1) exchanges.
e.g. insert 4 into 1 5 7:  1 5 7 4 → 1 5 4 7 → 1 4 5 7  (2 exchanges)
# of exchanges required for x_j to be inserted, averaged over the j equally likely positions:
(0 + 1 + ... + (j−1)) / j = (j−1)/2

31 2-31 # of exchanges for sorting (on average):
Σ_{j=2..n} (j−1)/2 = n(n−1)/4

33 2-33 Binary search
sorted sequence: 1 4 5 7 9 10 12 15   (search for 9)
step 1: compare 9 with 7
step 2: compare 9 with 10
step 3: compare 9 with 9 (found)
best case: 1 step = O(1)
worst case: ⌊log2 n⌋ + 1 steps = O(log n)
average case: O(log n) steps
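
An iterative C version for reference (a sketch; it returns the index of x in the sorted array a[0..n-1], or -1 for an unsuccessful search):

int binary_search(const int a[], int n, int x) {
    int low = 0, high = n - 1;
    while (low <= high) {
        int mid = low + (high - low) / 2;  /* middle element          */
        if (a[mid] == x) return mid;       /* successful search       */
        if (a[mid] < x)  low = mid + 1;    /* continue in right half  */
        else             high = mid - 1;   /* continue in left half   */
    }
    return -1;                             /* unsuccessful search     */
}

Each iteration discards about half of the remaining elements, which is where the ⌊log2 n⌋ + 1 worst-case bound above comes from.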

34 2-34
n cases for successful search
n+1 cases for unsuccessful search
Average # of comparisons done in the binary tree (assuming n = 2^k − 1):
A(n) = (1/(2n+1)) ( Σ_{i=1..k} i·2^(i−1) + k(n+1) ), where k = ⌊log n⌋ + 1

35 2-35
A(n) ≈ k when n is very large
     ≈ log n
     = O(log n)

37 2-37 Straight selection sort
7 5 1 4 3
1 5 7 4 3
1 3 7 4 5
1 3 4 7 5
1 3 4 5 7
Comparison vs. data movement:
In each pass: comparisons O(n), data movements O(1).
In the worst case (whole sort): comparisons O(n^2), data movements O(n).
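
A straightforward implementation (a sketch) that makes the comparison/data-movement trade-off visible: each pass scans the remaining elements (O(n) comparisons) but performs at most one swap (O(1) data movement):

void selection_sort(int a[], int n) {
    for (int i = 0; i < n - 1; i++) {
        int min = i;
        for (int j = i + 1; j < n; j++)    /* O(n) comparisons per pass  */
            if (a[j] < a[min])
                min = j;
        if (min != i) {                    /* at most one swap per pass  */
            int tmp = a[i];
            a[i] = a[min];
            a[min] = tmp;
        }
    }
}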

39 2-39 Quick sort
Pick an element to split the list around, then recursively apply the same procedure to each sublist.
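
One common way to realize this (a sketch; the slides do not fix a particular partitioning scheme, so the choice of the last element as the splitting element is mine):

/* Sort a[low..high] in place, using the last element to split. */
void quick_sort(int a[], int low, int high) {
    if (low >= high) return;
    int pivot = a[high], i = low, tmp;
    for (int j = low; j < high; j++) {            /* partition around the pivot   */
        if (a[j] < pivot) {
            tmp = a[i]; a[i] = a[j]; a[j] = tmp;
            i++;
        }
    }
    tmp = a[i]; a[i] = a[high]; a[high] = tmp;    /* place the pivot              */
    quick_sort(a, low, i - 1);                    /* recurse on the left sublist  */
    quick_sort(a, i + 1, high);                   /* recurse on the right sublist */
}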

40 2-40 Best case: O(n log n)
The list is split into two sublists of almost equal size each time.
log n rounds are needed.
In each round, n comparisons (ignoring the element used to split) are required.

41 2-41 Worst case: O(n^2)
In each round, the number used to split is either the smallest or the largest.

42 2-42 Average case: O(n log n)

45 2-45 2-D ranking finding
Def: Let A = (a1, a2) and B = (b1, b2). A dominates B iff a1 > b1 and a2 > b2.
Def: Given a set S of n points, the rank of a point x is the number of points dominated by x.
Example (points A, B, C, D, E in the figure): rank(A) = 0, rank(B) = 1, rank(C) = 1, rank(D) = 3, rank(E) = 0.

46 2-46 Straightforward algorithm: compare all pairs of points: O(n^2)
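
The straightforward algorithm as code (a sketch; it assumes the points are given as two coordinate arrays x[] and y[]):

/* rank[i] = number of points dominated by point i; O(n^2) pairwise tests. */
void ranks_naive(const double x[], const double y[], int rank[], int n) {
    for (int i = 0; i < n; i++) {
        rank[i] = 0;
        for (int j = 0; j < n; j++)
            if (x[i] > x[j] && y[i] > y[j])   /* point i dominates point j */
                rank[i]++;
    }
}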

47 2-47 Divide-and-conquer 2-D ranking finding
Input: A set S of planar points P1, P2, ..., Pn
Output: The rank of every point in S
Step 1: (Split the points along the median line L into A and B.)
  a. If S contains only one point, return its rank as 0.
  b. Otherwise, choose a cut line L perpendicular to the x-axis such that n/2 points of S have X-values less than L (call this set of points A) and the remaining points have X-values greater than L (call this set B). Note that L is a median X-value of this set.
Step 2: Find the ranks of points in A and the ranks of points in B, recursively.
Step 3: Sort the points in A and B according to their y-values. Scan these points sequentially and determine, for each point in B, the number of points in A whose y-values are less than its y-value. The rank of this point is equal to its rank among the points in B, plus the number of points in A whose y-values are less than its y-value.

49 2-49 Lower bound
Def: A lower bound of a problem is the least time complexity required for any algorithm which can be used to solve this problem.
☆ worst case lower bound
☆ average case lower bound
The lower bound for a problem is not unique.
e.g. Ω(1), Ω(n), Ω(n log n) are all lower bounds for sorting. (Ω(1) and Ω(n) are trivial.)

50 2-50
At present, if the highest known lower bound of a problem is Ω(n log n) and the time complexity of the best algorithm is O(n^2):
We may try to find a higher lower bound.
We may try to find a better algorithm.
Both the lower bound and the algorithm may be improved.
If the present lower bound is Ω(n log n) and there is an algorithm with time complexity O(n log n), then the algorithm is optimal.

51 2-51 The worst case lower bound of sorting
6 permutations for 3 data elements a1, a2, a3:
1 2 3
1 3 2
2 1 3
2 3 1
3 1 2
3 2 1

52 2-52 Straight insertion sort:
input data: (2, 3, 1)
(1) a1:a2
(2) a2:a3, a2 ↔ a3
(3) a1:a2, a1 ↔ a2
input data: (2, 1, 3)
(1) a1:a2, a1 ↔ a2
(2) a2:a3

53 2 -53 Decision tree for straight insertion sort

54 2-54 Decision tree for bubble sort

55 2-55 Lower bound of sorting
To find the lower bound, we have to find the smallest depth of a binary decision tree.
n! distinct permutations → n! leaf nodes in the binary decision tree.
A balanced tree has the smallest depth: ⌈log(n!)⌉ = Ω(n log n).
Lower bound for sorting: Ω(n log n)

56 2 -56 Method 1:
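
The formula for Method 1 did not survive in this transcript. A standard argument that avoids Stirling's approximation, and may be what the slide showed, is:

n! = n(n−1)(n−2) ... 1 ≥ n(n−1) ... ⌈n/2⌉ ≥ (n/2)^(n/2)
⇒ log(n!) ≥ (n/2) log(n/2) = Ω(n log n)

Together with the trivial bound log(n!) ≤ n log n, this gives ⌈log(n!)⌉ = Θ(n log n).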

57 2-57 Method 2: Stirling approximation
n! ≈ sqrt(2πn) (n/e)^n
log n! ≈ (1/2) log(2πn) + n log(n/e) ≈ n log n = Θ(n log n)

59 2-59 Heapsort — An optimal sorting algorithm
A heap: parent ≥ son

60 2-60 Heapsort: (1) construction, (2) output.
Output the maximum and restore the heap:

61 2-61 Phase 1: construction
input data: 4, 37, 26, 15, 48
Restore the subtree rooted at A(2); then restore the tree rooted at A(1).

62 2 -62 Phase 2: output

63 2-63 Implementation
Use a linear array, not a binary tree.
The sons of A(h) are A(2h) and A(2h+1).
Time complexity: O(n log n)
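
A compact array-based sketch of both phases (my own code, not the slides'; indices are 0-based here, so the sons of a[h] are a[2h+1] and a[2h+2] rather than A(2h) and A(2h+1)):

/* Restore the heap property for the subtree rooted at i, inside a[0..n-1]. */
void restore(int a[], int n, int i) {
    while (2 * i + 1 < n) {
        int son = 2 * i + 1;                      /* left son               */
        if (son + 1 < n && a[son + 1] > a[son])   /* pick the larger son    */
            son++;
        if (a[i] >= a[son]) break;                /* parent >= son: a heap  */
        int tmp = a[i]; a[i] = a[son]; a[son] = tmp;
        i = son;
    }
}

void heap_sort(int a[], int n) {
    for (int i = n / 2 - 1; i >= 0; i--)          /* Phase 1: construction   */
        restore(a, n, i);
    for (int m = n - 1; m > 0; m--) {             /* Phase 2: output the max */
        int tmp = a[0]; a[0] = a[m]; a[m] = tmp;  /* and restore the heap    */
        restore(a, m, 0);
    }
}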

64 2 -64 Time complexity Phase 1: construction
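
The derivation on this slide was a figure; a standard way to bound the construction phase (which may differ in detail from the slide) counts, for each node at depth L, the at most d − L levels it can sift down, where d = ⌊log n⌋:

Σ_{L=0..d−1} 2^L (d − L) = 2^(d+1) − d − 2 ≤ 2n

Since each level of sifting costs only a constant number of comparisons, Phase 1 takes O(n) time.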

65 2 -65 Time complexity Phase 2: output
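
Again the slide's derivation was a figure; a standard bound (a sketch) is: each of the n − 1 output steps restores a heap of height at most ⌊log n⌋, at a constant number of comparisons per level, so

Phase 2 = Σ_{m=1..n−1} O(log m) = O(n log n)

which dominates Phase 1 and gives the O(n log n) total stated above.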

66 2-66 Average case lower bound of sorting
By the binary decision tree:
The average time complexity of a sorting algorithm = (the external path length of the binary decision tree) / n!
The external path length is minimized if the tree is balanced (all leaf nodes on level d or level d−1).

67 2 -67 Average case lower bound of sorting

68 2-68 Compute the min external path length
1. Depth of a balanced binary tree with c leaf nodes: d = ⌈log c⌉. Leaf nodes can appear only on level d or d−1.
2. x1 leaf nodes on level d−1, x2 leaf nodes on level d:
   x1 + x2 = c
   x1 + x2/2 = 2^(d−1)
   ⇒ x1 = 2^d − c, x2 = 2(c − 2^(d−1))

69 2-69
3. External path length:
   M = x1(d−1) + x2·d
     = (2^d − c)(d−1) + 2(c − 2^(d−1))d
     = c(d−1) + 2(c − 2^(d−1)),   where d−1 = ⌊log c⌋
     = c⌊log c⌋ + 2(c − 2^⌊log c⌋)
4. With c = n!:
   M = n!⌊log n!⌋ + 2(n! − 2^⌊log n!⌋)
   M/n! = ⌊log n!⌋ + 2 − 2^(⌊log n!⌋+1)/n! = ⌊log n!⌋ + c', where 0 ≤ c' ≤ 1
        = Ω(n log n)
Average case lower bound of sorting: Ω(n log n)

70 2-70 Quicksort & Heapsort
Quicksort is optimal in the average case (Θ(n log n) on average).
(i) The worst case time complexity of heapsort is O(n log n).
(ii) The average case lower bound of sorting is Ω(n log n).
⇒ The average case time complexity of heapsort is Θ(n log n).
Heapsort is optimal in the average case.

71 2-71 Improving a lower bound through oracles
Problem P: merge two sorted sequences A and B with lengths m and n.
(1) Binary decision tree:
There are C(m+n, n) ways to interleave the two sequences, so there are C(m+n, n) leaf nodes in the binary decision tree.
⇒ The lower bound for merging: ⌈log C(m+n, n)⌉ ≤ m + n − 1 (the number of comparisons used by conventional merging).
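
A short calculation (not on the slide) shows why this decision-tree bound is unsatisfying for m = n: by Stirling's formula C(2m, m) ≈ 4^m / sqrt(πm), so

⌈log C(2m, m)⌉ ≈ 2m − (1/2) log(πm)

which, for larger m, is noticeably smaller than the 2m − 1 comparisons that the oracle argument on the following slides shows are actually necessary.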

73 2-73
(2) Oracle:
The oracle tries its best to cause the algorithm to work as hard as it might (i.e., it gives a very hard data set).
Sorted sequences (take m = n):
A: a1 < a2 < … < am
B: b1 < b2 < … < bm
The very hard case: a1 < b1 < a2 < b2 < … < am < bm

74 2-74
We must compare:
a1 : b1
b1 : a2
a2 : b2
:
b(m−1) : am
am : bm
Otherwise, we may get a wrong result for some input data.
e.g. If b1 and a2 are not compared, we cannot distinguish
a1 < b1 < a2 < b2 < … < am < bm  and  a1 < a2 < b1 < b2 < … < am < bm.
Thus, at least 2m − 1 comparisons are required.
The conventional merging algorithm is optimal for m = n.

75 2-75 Finding lower bound by problem transformation
Problem A reduces to problem B (A ∝ B) iff A can be solved by using any algorithm which solves B.
If A ∝ B, B is more difficult.
Note: if T(tr1) + T(tr2) < T(B), then
T(A) ≤ T(tr1) + T(tr2) + T(B) = O(T(B))

76 2-76 The lower bound of the convex hull problem
sorting ∝ convex hull
   A          B
an instance of A: (x1, x2, …, xn)
   ↓ transformation
an instance of B: {(x1, x1^2), (x2, x2^2), …, (xn, xn^2)}
assume: x1 < x2 < … < xn

77 2-77
If the convex hull problem can be solved, we can also solve the sorting problem.
The lower bound of sorting: Ω(n log n)
⇒ The lower bound of the convex hull problem: Ω(n log n)

78 2-78 The lower bound of the Euclidean minimal spanning tree (MST) problem
sorting ∝ Euclidean MST
   A          B
an instance of A: (x1, x2, …, xn)
   ↓ transformation
an instance of B: {(x1, 0), (x2, 0), …, (xn, 0)}
Assume x1 < x2 < x3 < … < xn
⇒ there is an edge between (xi, 0) and (x(i+1), 0) in the MST, where 1 ≤ i ≤ n−1

79 2-79
If the Euclidean MST problem can be solved, we can also solve the sorting problem.
The lower bound of sorting: Ω(n log n)
⇒ The lower bound of the Euclidean MST problem: Ω(n log n)

