
1 Introduction to Randomized Algorithms Srikrishnan Divakaran DA-IICT

2 Talk Outline
Preliminaries and Motivation
Analysis of
– Randomized Quick Sort
– Karger's Min-cut Algorithm
Basic Analytical Tools
Yao's Lower Bounding Technique
References

3 Preliminaries and Motivation

4 Quick Sort
Select: pick an arbitrary element x in S to be the pivot.
Partition: rearrange elements so that elements with value less than x go to list L to the left of x and elements with value greater than x go to list R to the right of x.
Recursion: recursively sort the lists L and R.
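A minimal Python sketch of this Select/Partition/Recursion scheme (illustrative only; the function names and the choice of the first element as the "arbitrary" pivot are ours, not from the slides):

```python
def quick_sort(items):
    """Quick Sort following the Select / Partition / Recursion scheme above."""
    if len(items) <= 1:
        return items
    pivot = items[0]                       # Select: an arbitrary element (here the first)
    rest = items[1:]
    L = [x for x in rest if x < pivot]     # Partition: values less than the pivot
    R = [x for x in rest if x >= pivot]    # values greater than (or equal to) the pivot
    return quick_sort(L) + [pivot] + quick_sort(R)   # Recursion

print(quick_sort([3, 7, 1, 9, 4, 1]))      # [1, 1, 3, 4, 7, 9]
```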

WORST CASE PARTITIONING OF QUICK SORT 5
The worst case arises when every pivot splits the input into sub-lists of sizes 0 and n-1, giving the recurrence T(n) = T(n-1) + Θ(n) = Θ(n²).

BEST CASE PARTITIONING OF QUICK SORT 6
In the best case every pivot splits the input into two halves, giving T(n) = 2T(n/2) + Θ(n) = Θ(n lg n).

AVERAGE CASE OF QUICK SORT 7
On average the splits are balanced enough that the expected running time is Θ(n lg n); even a consistently constant-fraction split (say 9-to-1) still gives a recursion tree of depth O(lg n).

RANDOMIZED QUICK SORT 8

Randomized-Partition(A, p, r)
    i ← Random(p, r)
    exchange A[r] ↔ A[i]
    return Partition(A, p, r)

Randomized-Quicksort(A, p, r)
    if p < r
        q ← Randomized-Partition(A, p, r)
        Randomized-Quicksort(A, p, q-1)
        Randomized-Quicksort(A, q+1, r)
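The pseudocode above, translated into a runnable Python sketch (an illustration rather than code from the deck; `partition` is a Lomuto-style Partition and the helper names are ours):

```python
import random

def partition(A, p, r):
    """Lomuto partition around A[r]; returns the pivot's final index."""
    x = A[r]
    i = p - 1
    for j in range(p, r):
        if A[j] <= x:
            i += 1
            A[i], A[j] = A[j], A[i]
    A[i + 1], A[r] = A[r], A[i + 1]
    return i + 1

def randomized_partition(A, p, r):
    """Exchange A[r] with a uniformly random element of A[p..r], then partition."""
    i = random.randint(p, r)
    A[r], A[i] = A[i], A[r]
    return partition(A, p, r)

def randomized_quicksort(A, p, r):
    if p < r:
        q = randomized_partition(A, p, r)
        randomized_quicksort(A, p, q - 1)
        randomized_quicksort(A, q + 1, r)

data = [5, 2, 9, 1, 7]
randomized_quicksort(data, 0, len(data) - 1)
print(data)   # [1, 2, 5, 7, 9]
```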

RANDOMIZED QUICK SORT 9
Exchange A[r] with an element chosen at random from A[p…r] in Partition. The pivot element is then equally likely to be any of the input elements. For any given input, the behavior of Randomized Quick Sort is determined not only by the input but also by the random choices of the pivot. We add randomization to Quick Sort so that, for every input, the expected performance of the algorithm is good.

DETERMINISTIC ALGORITHMS 10
Goal: Prove that for all input instances the algorithm solves the problem correctly and the number of steps is bounded by a polynomial in the size of the input.
(Diagram: INPUT → ALGORITHM → OUTPUT)

RANDOMIZED ALGORITHMS 11
In addition to the input, the algorithm takes a source of random numbers and makes random choices during execution. Its behavior can therefore vary even on a fixed input.
(Diagram: INPUT + RANDOM NUMBERS → ALGORITHM → OUTPUT)

LAS VEGAS RANDOMIZED ALGORITHMS 12
Goal: Prove that for all input instances the algorithm solves the problem correctly and the expected number of steps is bounded by a polynomial in the input size.
Note: The expectation is over the random choices made by the algorithm.
(Diagram: INPUT + RANDOM NUMBERS → ALGORITHM → OUTPUT)

PROBABILISTIC ANALYSIS OF ALGORITHMS 13
The input is assumed to be drawn from a probability distribution.
Goal: Show that for all inputs the algorithm works correctly, and that for most inputs the number of steps is bounded by a polynomial in the size of the input.
(Diagram: RANDOM INPUT drawn from a DISTRIBUTION → ALGORITHM → OUTPUT)

14 Min-cut for Undirected Graphs
Given an undirected graph, a global min-cut is a cut (S, V-S) minimizing the number of crossing edges, where a crossing edge is an edge (u, v) such that u ∈ S and v ∈ V-S.

15 GRAPH CONTRACTION
For an undirected graph G, we can construct a new graph G' by contracting two vertices u, v in G as follows:
– u and v become one vertex {u,v} and the edge (u,v) is removed;
– the other edges incident to u or v in G are now incident on the new vertex {u,v} in G'.
Note: There may be multi-edges between two vertices. We just keep them.
(Figure: graph G with vertices u, v, a, b, c, d, e and the contracted graph G' with the merged vertex {u,v}.)
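A small Python sketch of contraction on a multigraph stored as an edge list (the edge-list representation and the helper name `contract` are our own illustration, not from the deck):

```python
def contract(edges, u, v):
    """Contract edge (u, v): relabel v as u, drop self-loops, keep parallel edges."""
    merged = []
    for (a, b) in edges:
        a = u if a == v else a
        b = u if b == v else b
        if a != b:                      # the (u, v) edges become self-loops and are removed
            merged.append((a, b))
    return merged

G = [("u", "v"), ("u", "a"), ("u", "b"), ("v", "c"), ("v", "d"), ("a", "c")]
print(contract(G, "u", "v"))
# [('u', 'a'), ('u', 'b'), ('u', 'c'), ('u', 'd'), ('a', 'c')]
```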

16 Karger's Min-cut Algorithm (example)
(i) Graph G on vertices A, B, C, D → (ii) contract nodes C and D → (iii) contract nodes A and CD → (iv) the remaining cut is C = {(A,B), (B,C), (B,D)}.
Note: C is a cut, but not necessarily a min-cut.

17 Karger's Min-cut Algorithm

For i = 1 to 100n²
    repeat
        randomly pick an edge (u, v)
        contract u and v
    until two vertices are left
    c_i ← the number of edges between them
Output min_i c_i
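A runnable Python sketch of the algorithm above, with contraction inlined (the edge-list representation, the helper names, and the small example graph are our assumptions for illustration):

```python
import random

def karger_trial(edges):
    """One trial: contract random edges until two vertices remain; return the crossing-edge count."""
    edges = list(edges)
    vertices = {x for e in edges for x in e}
    while len(vertices) > 2:
        u, v = random.choice(edges)                        # uniformly random remaining edge
        # contract: relabel v as u, then drop the self-loops this creates
        edges = [(u if a == v else a, u if b == v else b) for (a, b) in edges]
        edges = [(a, b) for (a, b) in edges if a != b]
        vertices.discard(v)
    return len(edges)

def karger_min_cut(edges, trials):
    """Repeat many independent trials and keep the smallest cut seen."""
    return min(karger_trial(edges) for _ in range(trials))

G = [("A", "B"), ("B", "C"), ("B", "D"), ("C", "D"), ("A", "C")]
n = 4                                                      # number of vertices
print(karger_min_cut(G, 100 * n * n))                      # the global min cut of G is 2
```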

18 KEY IDEA
Let C* = {c_1*, c_2*, …, c_k*} be a min-cut in G and C_i be the cut determined by Karger's algorithm during some iteration i.
C_i will be a min-cut for G if during iteration i none of the edges in C* is contracted.
If we can show that C_i is a min-cut with probability Ω(1/n²), where n = |V|, then repeating the procedure O(n²) times and taking the minimum gives the min-cut with high probability.

19 Monte Carlo Randomized Algorithms
Goal: Prove that the algorithm
– solves the problem correctly with high probability;
– for every input, runs in an expected number of steps bounded by a polynomial in the input size.
Note: The expectation is over the random choices made by the algorithm.
(Diagram: INPUT + RANDOM NUMBERS → ALGORITHM → OUTPUT)

MONTE CARLO VERSUS LAS VEGAS 20
A Monte Carlo algorithm produces an answer that is correct with non-zero probability, whereas a Las Vegas algorithm always produces the correct answer.
The running time of both types of randomized algorithms is a random variable whose expectation is bounded, say, by a polynomial in the input size. These expectations are only over the random choices made by the algorithm, independent of the input.
Thus independent repetitions of a Monte Carlo algorithm drive down the failure probability exponentially.

MOTIVATION FOR RANDOMIZED ALGORITHMS 21
– Simplicity;
– Performance;
– Reflects reality better (Online Algorithms);
– For many hard problems, helps obtain better complexity bounds than deterministic approaches.

22 Analysis of Randomized Quick Sort

23 Linearity of Expectation
If X_1, X_2, …, X_n are random variables, then
E[X_1 + X_2 + … + X_n] = E[X_1] + E[X_2] + … + E[X_n].

NOTATION
Rename the elements of A as z_1, z_2, ..., z_n, with z_i being the i-th smallest element (rank i).
Define the set Z_ij = {z_i, z_{i+1}, ..., z_j} to be the set of elements between z_i and z_j, inclusive.

EXPECTED NUMBER OF TOTAL COMPARISONS IN PARTITION 25
Let X_ij = I{z_i is compared to z_j} be an indicator random variable, and let X be the total number of comparisons performed by the algorithm. Then
X = Σ_{i=1}^{n-1} Σ_{j=i+1}^{n} X_ij.
By linearity of expectation, the expected number of comparisons performed by the algorithm is
E[X] = Σ_{i=1}^{n-1} Σ_{j=i+1}^{n} E[X_ij] = Σ_{i=1}^{n-1} Σ_{j=i+1}^{n} Pr{z_i is compared to z_j}.

COMPARISONS IN PARTITION 26
Observation 1: Each pair of elements is compared at most once during the entire execution of the algorithm.
– Elements are compared only to the pivot!
– The pivot is excluded from future calls to PARTITION.
Observation 2: Only the pivot is compared with elements in both partitions; elements in different partitions are never compared with each other.
Example: if the pivot is 7, the two partitions are Z_1,6 = {1, 2, 3, 4, 5, 6} and Z_8,10 = {8, 9, 10}.

COMPARISONS IN PARTITION
Case 1: the pivot x is chosen such that z_i < x < z_j
– z_i and z_j will never be compared.
Case 2: z_i or z_j is the pivot
– z_i and z_j will be compared only if one of them is chosen as pivot before any other element in the range z_i to z_j.

28 EXPECTED NUMBER OF COMPARISONS IN PARTITION
Pr{z_i is compared with z_j}
  = Pr{z_i or z_j is chosen as pivot before the other elements in Z_ij}
  = 2 / (j - i + 1).
Therefore
E[X] = Σ_{i=1}^{n-1} Σ_{j=i+1}^{n} 2/(j - i + 1) = O(n lg n).
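A quick empirical sanity check of this bound (a rough simulation sketch; the counting helper and all parameters are our own illustration): it compares the analytic sum Σ 2/(j-i+1) with the average number of comparisons observed over repeated runs of randomized Quick Sort on n distinct keys.

```python
import random

def rqsort_comparisons(A):
    """Count element-to-pivot comparisons made by randomized Quick Sort on list A."""
    if len(A) <= 1:
        return 0
    pivot = random.choice(A)
    L = [x for x in A if x < pivot]
    R = [x for x in A if x > pivot]
    return (len(A) - 1) + rqsort_comparisons(L) + rqsort_comparisons(R)

n = 200
analytic = sum(2.0 / (j - i + 1) for i in range(1, n) for j in range(i + 1, n + 1))
empirical = sum(rqsort_comparisons(list(range(n))) for _ in range(200)) / 200
print(analytic, empirical)    # the two values should be close; both are Theta(n log n)
```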

29 Analysis of Karger’s Min-Cut Algorithm

30 ANALYSIS OF KARGER'S ALGORITHM
Let k be the number of edges of a min cut (S, V-S). If we never pick a crossing edge in the algorithm, then the number of edges between the two last vertices is the correct answer.
The probability that in step 1 of an iteration a crossing edge is not picked is (|E| - k)/|E|.
By the definition of a min cut, every vertex v has degree at least k; otherwise the cut ({v}, V-{v}) would be lighter. Thus |E| ≥ nk/2 and (|E| - k)/|E| = 1 - k/|E| ≥ 1 - 2/n.

31 Analysis of Karger's Algorithm
In step 1, Pr[no crossing edge picked] ≥ 1 - 2/n.
Similarly, in step 2, Pr[no crossing edge picked] ≥ 1 - 2/(n-1).
In general, in step j, Pr[no crossing edge picked] ≥ 1 - 2/(n-j+1).
Pr{the n-2 contractions never contract a crossing edge}
  = Pr[first step good] × Pr[second step good after surviving first step] × … × Pr[(n-2)-th step good after surviving first n-3 steps]
  ≥ (1 - 2/n)(1 - 2/(n-1)) … (1 - 2/3)
  = [(n-2)/n] [(n-3)/(n-1)] … [1/3]
  = 2/[n(n-1)] = Ω(1/n²).
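A tiny numeric check of the telescoping product (illustrative only; the value of n is arbitrary):

```python
from math import prod

n = 10
product = prod(1 - 2 / (n - j + 1) for j in range(1, n - 1))   # steps j = 1 .. n-2
print(product, 2 / (n * (n - 1)))                              # both print 0.0222...
```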

32 Basic Analytical Tools

33 TAIL BOUNDS
In the analysis of randomized algorithms, we need to know how much an algorithm's run-time/cost deviates from its expected run-time/cost. That is, we need an upper bound on Pr[X deviates from E[X] a lot]. This is what we refer to as a tail bound on X.

34 MARKOV AND CHEBYSHEV'S INEQUALITIES
Markov's Inequality: If X ≥ 0, then Pr[X ≥ a] ≤ E[X]/a.
Proof. Suppose Pr[X ≥ a] > E[X]/a. Then, since X ≥ 0, E[X] ≥ a·Pr[X ≥ a] > a·E[X]/a = E[X], a contradiction.
Chebyshev's Inequality: Pr[|X - E[X]| ≥ a] ≤ Var[X]/a².
Proof.
Pr[|X - E[X]| ≥ a] = Pr[|X - E[X]|² ≥ a²]
  = Pr[(X - E[X])² ≥ a²]
  ≤ E[(X - E[X])²]/a²   // Markov applied to (X - E[X])²
  = Var[X]/a².
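A small simulation sketch comparing both bounds with an observed tail probability (the choice of X as the number of heads in 100 fair coin flips, and all parameters, are our own illustration):

```python
import random

# Empirical check of both bounds for X = number of heads in 100 fair coin flips,
# so E[X] = 50 and Var[X] = 25.
trials = 20_000
samples = [sum(random.randint(0, 1) for _ in range(100)) for _ in range(trials)]

a = 60
tail = sum(x >= a for x in samples) / trials      # observed Pr[X >= 60]
markov = 50 / a                                   # Markov:    Pr[X >= a] <= E[X]/a
chebyshev = 25 / (a - 50) ** 2                    # Chebyshev: Pr[X >= 60] <= Pr[|X-50| >= 10] <= Var[X]/10^2
print(tail, markov, chebyshev)                    # the observed tail lies below both bounds
```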

35 YAO’S INEQUALITY FOR ESTABLISHING LOWER BOUNDS

ALGORITHM ANALYSIS IN TERMS OF TWO-PLAYER ZERO-SUM GAMES 36
– The sum of the payoffs of the two players is zero in each cell of the table.
– The row player (the one who maximizes cost) is the adversary, responsible for designing malicious inputs.
– The column player (the one who minimizes cost) is the algorithm designer, responsible for designing efficient algorithms.
(Payoff matrix M: rows are the input instances Input 1, …, Input m; columns are the deterministic algorithms ALG 1, …, ALG n; entry M(r, c) is the cost of algorithm ALG c on Input r.)

PURE STRATEGY For the column player (minimizer)  Each pure strategy corresponds to a deterministic algorithm. For the row player (maximizer)  Each pure strategy corresponds to a particular input instance. 37

MIXED STRATEGIES For the column player (minimizer)  Each mixed strategy corresponds to a Las Vegas randomized algorithm. For the row player (maximizer)  Each mixed strategy corresponds to a probability distribution over all the input instances. 38

VON NEUMANN'S MINIMAX THEOREM 39
For any two-person zero-sum game with payoff matrix M,
max_p min_q p^T M q = min_q max_p p^T M q,
where p and q range over the mixed strategies of the row and column players respectively.

LOOMIS' THEOREM 40
For any two-person zero-sum game, the value is unchanged if the inner optimization is restricted to pure strategies:
max_p min_{A ∈ A} E[C(A, I_p)] = min_q max_{I ∈ I} E[C(I, A_q)],
where the inner min is over pure column strategies (deterministic algorithms) and the inner max is over pure row strategies (input instances).

YAO'S INTERPRETATION OF LOOMIS' THEOREM 41
For any probability distribution p over the input instances and any randomized algorithm A_q,
min_{A ∈ A} E[C(A, I_p)] ≤ max_{I ∈ I} E[C(I, A_q)].
That is, the expected cost of the best deterministic algorithm on inputs drawn from p is a lower bound on the worst-case expected cost of every randomized algorithm.

HOW TO USE YAO'S INEQUALITY? 42
Task 1: Design a probability distribution p for the input instances.
Task 2: Obtain a lower bound on the expected running time of any deterministic algorithm running on I_p.

APPLICATION OF YAO'S INEQUALITY 43
Find-bill problem: There are n boxes; exactly one box contains a dollar bill, and the rest of the boxes are empty. A probe is defined as opening a box to see if it contains the dollar bill. The objective is to locate the box containing the dollar bill while minimizing the number of probes performed.

Randomized Find
1. select x ∈ {H, T} uniformly at random
2. if x = H then
   (a) probe boxes in order from 1 through n and stop if the bill is located
3. else
   (a) probe boxes in order from n through 1 and stop if the bill is located

The expected number of probes made by the algorithm is (n+1)/2, since if the dollar bill is in the i-th box, then with probability 0.5, i probes are made and with probability 0.5, (n - i + 1) probes are needed.
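A Python sketch of Randomized Find together with an empirical check of the (n+1)/2 expectation (representing the boxes as a boolean list, and all parameters, are our own assumptions):

```python
import random

def randomized_find(boxes):
    """Probe front-to-back or back-to-front with probability 1/2 each;
    return the number of probes used to locate the bill (the True entry)."""
    order = range(len(boxes)) if random.random() < 0.5 else reversed(range(len(boxes)))
    for probes, i in enumerate(order, start=1):
        if boxes[i]:
            return probes

n = 101
boxes = [False] * n
boxes[random.randrange(n)] = True             # hide the bill uniformly at random
avg = sum(randomized_find(boxes) for _ in range(10_000)) / 10_000
print(avg)                                    # close to (n + 1) / 2 = 51
```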

APPLICATION OF YAO'S LEMMA 44
Lemma: A lower bound on the expected number of probes required by any randomized algorithm to solve the Find-bill problem is (n + 1)/2.
Proof: Assume that the bill is located in any one of the n boxes uniformly at random. We only need to consider deterministic algorithms that do not probe the same box twice, and by symmetry we can assume that the probe order of the deterministic algorithm is 1 through n. By Yao's inequality,
min_{A ∈ A} E[C(A, I_p)] = Σ_{i=1}^{n} i/n = (n+1)/2 ≤ max_{I ∈ I} E[C(I, A_q)].
Therefore any randomized algorithm A_q requires at least (n+1)/2 probes in expectation.

REFERENCES 45
1. Amihood Amir, Karger's Min-cut Algorithm, Bar-Ilan University, 2009.
2. George Bebis, Randomizing Quick Sort, Lecture Notes of CS 477/677: Analysis of Algorithms, University of Nevada.
3. Avrim Blum and Amit Gupta, Lecture Notes on Randomized Algorithms, CMU.
4. Hsueh-I Lu, Yao's Theorem, Lecture Notes on Randomized Algorithms, National Taiwan University.
5. Rada Mihalcea, Quick Sort, Lecture Notes of CSCE3110: Data Structures, University of North Texas.
6. Rajeev Motwani and Prabhakar Raghavan, Randomized Algorithms, Cambridge University Press, 1995.
7. Prabhakar Raghavan, AMS talk on Randomized Algorithms, Stanford University.