
1 Randomized Algorithms
CSc 4520/6520 Design & Analysis of Algorithms, Fall 2013
Slides adapted from Dmitri Kaznachey, George Mason University, and Maciej Ciesielski, University of Massachusetts

2 The Hiring Problem
The goal is to hire a new assistant through an employment agency.
– The agency sends one candidate each day.
– You are committed to always having the best person seen so far in the job.
– Whenever the interviewed candidate is better than the current assistant, he/she is hired in place of the current one.
– There is a small cost to pay for each interview.
– There is usually a much larger cost associated with the fire/hire process.

3 The Hiring Problem: Algorithm
Hire-Assistant(n)
1  best = 0            // candidate 0 is least qualified
2  for i = 1 to n
3      interview candidate i
4      if candidate i is better than candidate best
5          best = i
6          hire candidate i
Assume interviewing has cost c_i, whereas the more expensive hiring has cost c_h. Let m be the number of people hired. Then the cost of the above algorithm is O(n·c_i + m·c_h).
The quantity m varies with each run and determines the overall cost of the algorithm. It is estimated using probabilistic analysis.
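A minimal executable sketch of this loop (an illustration, not part of the slides): candidates are assumed to be given as comparable quality scores, and the interview_cost and hire_cost parameters are stand-ins for c_i and c_h.

    def hire_assistant(candidates, interview_cost=1, hire_cost=10):
        best = float("-inf")              # plays the role of "candidate 0"
        hires = 0
        total_cost = 0
        for quality in candidates:        # one candidate per day
            total_cost += interview_cost  # every candidate is interviewed
            if quality > best:            # better than the current assistant
                best = quality
                hires += 1
                total_cost += hire_cost   # fire the old assistant, hire the new one
        return hires, total_cost          # total cost is n*c_i + m*c_h

    print(hire_assistant([5, 3, 7, 6, 9, 2]))   # 3 hires: 5, then 7, then 9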

4 Indicator Random Variables
Given a sample space S and an event A, the indicator random variable I{A} is defined as
    I{A} = 1 if A occurs, 0 otherwise.
Let X_A denote the random variable associated with the event A, i.e. X_A = I{A}. Then the expected value of X_A is
    E[X_A] = Pr{A}.
Proof: E[X_A] = E[I{A}] = 1·Pr{A} + 0·Pr{¬A} = Pr{A}.
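A tiny simulation of this fact (an illustration, not from the slides) for the event A = "a fair die shows a 6": averaging the indicator I{A} over many trials approximates E[X_A] = Pr{A} = 1/6.

    import random

    def estimate_indicator_expectation(trials=100_000):
        # X_A = I{die shows 6}; the sample mean of X_A estimates E[X_A] = Pr{A}
        hits = sum(1 for _ in range(trials) if random.randint(1, 6) == 6)
        return hits / trials

    print(estimate_indicator_expectation())   # close to 1/6 ≈ 0.1667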

5 The Hiring Problem: Analysis
Let X_i be the indicator random variable associated with the event that candidate i is hired:
    X_i = I{candidate i is hired}
Let X be the random variable whose value equals the number of times we hire a new candidate:
    X = X_1 + ... + X_n
Note that E[X_i] = Pr{candidate i is hired}. We now need to compute Pr{candidate i is hired}.

6 The Hiring Problem: Analysis (cont.)
Candidate i is hired (lines 5–6) when he/she is better than all of the previous i−1 candidates. Since the candidates arrive in random order, each of the first i candidates is equally likely to be the best seen so far. Therefore:
    E[X_i] = Pr{candidate i is hired} = 1/i
We can now compute E[X]:
    E[X] = E[X_1 + ... + X_n] = 1 + 1/2 + ... + 1/n = ln(n) + O(1)
Hence, when candidates are presented in random order, the algorithm Hire-Assistant has an expected total hiring cost of O(c_h ln(n)).
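A quick empirical check of this result (an illustration, not from the slides): over random permutations of n distinct candidates, the average number of hires should be close to the harmonic number H_n = ln(n) + O(1).

    import math
    import random

    def average_hires(n=100, trials=2000):
        total = 0
        for _ in range(trials):
            candidates = list(range(n))
            random.shuffle(candidates)      # candidates arrive in random order
            best, hires = -1, 0
            for quality in candidates:
                if quality > best:          # same rule as Hire-Assistant
                    best = quality
                    hires += 1
            total += hires
        return total / trials

    print(average_hires(), math.log(100) + 0.5772)   # both roughly 5.2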

7 Randomized Algorithms
In a randomized algorithm we impose the distribution on the inputs ourselves, rather than assuming the inputs arrive in random order. In particular, in the randomized version of the Hire-Assistant algorithm we randomly permute the candidates first:
Randomized-Hire-Assistant(n)
1  randomly permute the list of candidates
2  best = 0            // candidate 0 is least qualified
3  for i = 1 to n
4      interview candidate i
5      if candidate i is better than candidate best
6          best = i
7          hire candidate i
According to the earlier computations, the expected cost of the above algorithm is O(n·c_i + c_h ln(n)).
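A sketch of the randomized version (again an illustration, not the slides' code): the only change from the earlier sketch is that the algorithm shuffles the candidates itself, so the O(n·c_i + c_h·ln n) expected cost holds for every input order.

    import random

    def randomized_hire_assistant(candidates, interview_cost=1, hire_cost=10):
        candidates = list(candidates)
        random.shuffle(candidates)            # line 1: randomly permute the candidates
        best = float("-inf")
        total_cost = 0
        for quality in candidates:
            total_cost += interview_cost      # interview candidate i
            if quality > best:
                best = quality
                total_cost += hire_cost       # hire candidate i
        return total_cost

    # Even a sorted (adversarial) input now costs O(n*c_i + c_h*ln n) in expectation
    print(randomized_hire_assistant([1, 2, 3, 4, 5, 6]))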

8 Randomized Version
We want to make the running time independent of the input ordering.
Randomized-Partition(A, p, r)
    i := Random(p, r)
    A[r] ↔ A[i]
    return Partition(A, p, r)
Randomized-Quicksort(A, p, r)
    if p < r then
        q := Randomized-Partition(A, p, r)
        Randomized-Quicksort(A, p, q − 1)
        Randomized-Quicksort(A, q + 1, r)
    fi
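A runnable Python sketch of this version (an illustration): the Partition subroutine is not shown on the slide, so the standard Lomuto partition, which uses A[r] as the pivot, is assumed here.

    import random

    def partition(A, p, r):
        # Lomuto partition (assumed, not shown on the slide):
        # uses A[r] as the pivot and returns the pivot's final index.
        x = A[r]
        i = p - 1
        for j in range(p, r):
            if A[j] <= x:
                i += 1
                A[i], A[j] = A[j], A[i]
        A[i + 1], A[r] = A[r], A[i + 1]
        return i + 1

    def randomized_partition(A, p, r):
        i = random.randint(p, r)          # i := Random(p, r)
        A[r], A[i] = A[i], A[r]           # A[r] <-> A[i]
        return partition(A, p, r)

    def randomized_quicksort(A, p, r):
        if p < r:
            q = randomized_partition(A, p, r)
            randomized_quicksort(A, p, q - 1)
            randomized_quicksort(A, q + 1, r)

    data = [7, 2, 9, 4, 3, 7, 6, 1]
    randomized_quicksort(data, 0, len(data) - 1)
    print(data)                            # [1, 2, 3, 4, 6, 7, 7, 9]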

9 Quick-Sort
[Figure: a quick-sort tree for the input 7 4 9 6 2, sorted to 2 4 6 7 9; the subtrees sort 4 2 → 2 4 and 7 9 → 7 9.]

10 Quick-Sort
Quick-sort is a randomized sorting algorithm based on the divide-and-conquer paradigm:
– Divide: pick a random element x (called the pivot) and partition S into
    L: elements less than x
    E: elements equal to x
    G: elements greater than x
– Recur: sort L and G
– Conquer: join L, E and G
[Figure: the input sequence S split around the pivot x into L, E and G.]

11 Partition
We partition an input sequence as follows:
– we remove, in turn, each element y from S, and
– we insert y into L, E or G, depending on the result of its comparison with the pivot x.
Each insertion and removal is at the beginning or at the end of a sequence, and hence takes O(1) time. Thus, the partition step of quick-sort takes O(n) time.

Algorithm partition(S, p)
    Input: sequence S, position p of the pivot
    Output: subsequences L, E, G of the elements of S less than, equal to, or greater than the pivot, respectively
    L, E, G ← empty sequences
    x ← S.remove(p)
    while ¬S.isEmpty()
        y ← S.remove(S.first())
        if y < x
            L.insertLast(y)
        else if y = x
            E.insertLast(y)
        else  { y > x }
            G.insertLast(y)
    return L, E, G
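A compact Python sketch covering slides 10 and 11 together (an illustration): list comprehensions stand in for the slide's sequence operations (remove / insertLast), but the structure, random pivot, three-way split into L, E, G, recursion on L and G, then join, is the same.

    import random

    def quick_sort(S):
        # Base case: sequences of size 0 or 1 are already sorted
        if len(S) <= 1:
            return list(S)
        x = S[random.randrange(len(S))]   # Divide: pick a random pivot x
        L = [y for y in S if y < x]       # elements less than x
        E = [y for y in S if y == x]      # elements equal to x
        G = [y for y in S if y > x]       # elements greater than x
        return quick_sort(L) + E + quick_sort(G)   # Recur on L and G, then join

    print(quick_sort([7, 2, 9, 4, 3, 7, 6, 1]))    # [1, 2, 3, 4, 6, 7, 7, 9]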

12 Quick-Sort Tree
An execution of quick-sort is depicted by a binary tree:
– each node represents a recursive call of quick-sort and stores
    the unsorted sequence before the execution and its pivot, and
    the sorted sequence at the end of the execution;
– the root is the initial call;
– the leaves are calls on subsequences of size 0 or 1.
[Figure: the quick-sort tree for the input 7 4 9 6 2 → 2 4 6 7 9, with subtrees 4 2 → 2 4 and 7 9 → 7 9.]

13 Execution Example
Pivot selection.
[Figure: quick-sort tree for the example input 7 2 9 4 3 7 6 1, sorted to 1 2 3 4 6 7 7 9; the pivot of the root call is selected.]

14 Execution Example (cont.)
Partition, recursive call, pivot selection.
[Figure: snapshot of the quick-sort tree at this step.]

15 Execution Example (cont.)
Partition, recursive call, base case.
[Figure: snapshot of the quick-sort tree at this step.]

16 Execution Example (cont.)
Recursive call, …, base case, join.
[Figure: snapshot of the quick-sort tree at this step.]

17 Execution Example (cont.)
Recursive call, pivot selection.
[Figure: snapshot of the quick-sort tree at this step.]

18 Execution Example (cont.)
Partition, …, recursive call, base case.
[Figure: snapshot of the quick-sort tree at this step.]

19 Execution Example (cont.)
Join, join.
[Figure: the completed quick-sort tree, with the fully sorted sequence 1 2 3 4 6 7 7 9 at the root.]

20 Worst-case Running Time
The worst case for quick-sort occurs when the pivot is the unique minimum or maximum element. One of L and G then has size n − 1 and the other has size 0, so the running time is proportional to the sum
    n + (n − 1) + … + 2 = n(n + 1)/2 − 1
Thus, the worst-case running time of quick-sort is O(n²).

    depth    time
    0        n
    1        n − 1
    …        …
    n − 1    1

21 Expected Running Time
Consider a recursive call of quick-sort on a sequence of size s:
– good call: the sizes of L and G are each less than 3s/4;
– bad call: one of L and G has size greater than 3s/4.
A call is good with probability 1/2, since half of the possible pivots cause good calls: if the s elements are ranked 1 … s, the middle half of the ranks are good pivots, while the lowest and highest quarters are bad pivots.
[Figure: a good call and a bad call on the sequence 7 2 9 4 3 7 6 1, and the good/bad pivot ranks for s = 16 (the middle ranks 5–12 are good).]

22 Expected Running Time, Part 2
Probabilistic fact: the expected number of coin tosses required in order to get k heads is 2k.
For a node of depth i, we therefore expect that i/2 of its ancestors correspond to good calls, so the size of the input sequence for the current call is at most (3/4)^(i/2) · n.
Therefore:
– for a node of depth 2·log_{4/3} n, the expected input size is one;
– the expected height of the quick-sort tree is O(log n).
The amount of work done at the nodes of the same depth is O(n). Thus, the expected running time of quick-sort is O(n log n).
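A short worked version of the depth bound, filling in the algebra the slide leaves implicit:

\[
\left(\tfrac{3}{4}\right)^{i/2} n = 1
\iff \left(\tfrac{4}{3}\right)^{i/2} = n
\iff \tfrac{i}{2} = \log_{4/3} n
\iff i = 2\log_{4/3} n = O(\log n),
\]

so the expected height of the tree is O(log n); with O(n) work per level, the expected total is O(n) · O(log n) = O(n log n).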

