
1 Dependent Randomized Rounding in Matroid Polytopes (& Related Results). Chandra Chekuri (Univ. of Illinois), Jan Vondrak (IBM Research), Rico Zenklusen (MIT).

2 Example: Congestion Minimization. [Figure: graph G with source-sink pairs (s_1, t_1), (s_2, t_2), (s_3, t_3).] Choose a path for each pair. Minimize the maximum number of paths using any edge (the congestion). Special case: Edge-Disjoint Paths.

3 Example: Congestion Minimization. [Figure: graph G with the pairs (s_i, t_i) and fractional path values from the LP solution.] Choose a path for each pair; minimize the maximum number of paths using any edge (the congestion). Special case: Edge-Disjoint Paths. [Raghavan-Thompson'87]: Solve the multicommodity-flow relaxation (LP); randomly pick a path according to the fractional solution; Chernoff bounds give an approximation ratio of O(log n / log log n).

4 Chernoff-Hoeffding Concentration Bounds. X_1, X_2, ..., X_n independent {0,1} random variables with E[X_i] = Pr[X_i = 1] = x_i; a_1, a_2, ..., a_n numbers in [0,1]; μ = E[Σ_i a_i X_i] = Σ_i a_i x_i. Theorem: Pr[Σ_i a_i X_i > (1+δ)μ] ≤ (e^δ / (1+δ)^{1+δ})^μ and Pr[Σ_i a_i X_i < (1−δ)μ] ≤ exp(−μδ²/2).
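To make the bound concrete, here is a small Python sketch (my own illustration, not part of the slides) comparing the empirical upper tail of a weighted sum of independent Bernoulli variables against the stated bound; the parameter values are arbitrary.

```python
import math
import random

def chernoff_upper_bound(mu, delta):
    # Upper-tail bound: Pr[sum > (1+delta)*mu] <= (e^delta / (1+delta)^(1+delta))^mu
    return (math.exp(delta) / (1 + delta) ** (1 + delta)) ** mu

def empirical_upper_tail(x, a, delta, trials=100_000):
    # x[i] = Pr[X_i = 1]; a[i] in [0,1] are the weights.
    mu = sum(ai * xi for ai, xi in zip(a, x))
    exceed = sum(
        1
        for _ in range(trials)
        if sum(ai for ai, xi in zip(a, x) if random.random() < xi) > (1 + delta) * mu
    )
    return exceed / trials, chernoff_upper_bound(mu, delta)

if __name__ == "__main__":
    n = 50
    emp, bound = empirical_upper_tail([0.3] * n, [1.0] * n, delta=0.5)
    print(f"empirical tail ~ {emp:.4f}, Chernoff bound = {bound:.4f}")
```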

5 Example: Multipath Routing. [Figure: graph G with pairs (s_i, t_i) and demands k_1 = 2, k_2 = 1, k_3 = 2.] Choose k_i paths for pair (s_i, t_i) (assume the paths for a pair are disjoint). Minimize the maximum number of paths using any edge (the congestion).

6 Example: Multipath Routing. [Figure: graph G with fractional flow values from the LP solution.] Choose k_i paths for pair (s_i, t_i) (assume the paths for a pair are disjoint); minimize the maximum number of paths using any edge (the congestion). [Srinivasan'99]: Solve the multicommodity-flow relaxation (LP); randomized pipage rounding; O(log n / log log n) approximation via negative correlation.

7 Dependent Randomized Rounding. Randomized rounding while maintaining some dependency/correlation between variables.

8 Dependent Randomized Rounding. Randomized rounding while maintaining some dependency/correlation between variables. Several variants in the literature. This talk: dependent randomized rounding to satisfy a matroid base constraint while retaining concentration bounds similar to independent rounding. Briefly, related work on matroid intersection and non-bipartite graph matchings.

9 Crossing Spanning Trees and ATSP. Undirected graph G = (V, E); cuts S_1, S_2, ..., S_m. Find a spanning tree T that minimizes the maximum number of T's edges crossing any of the given cuts. [Bilo-Goyal-Ravi-Singh'04] [Fekete-Lubbecke-Meijer'04]

10 Crossing Spanning Trees and ATSP. [Figure: graph with fractional edge values x_e from the LP solution.] Undirected graph G = (V, E); cuts S_1, S_2, ..., S_m; find a spanning tree T that minimizes the maximum number of edges crossing any given cut. [Asadpour et al.]: Solve an LP to get a point x in the spanning tree polytope of G; dependent rounding via maximum entropy sampling; O(log m / log log m) approximation. Also O(log n / log log n) for ATSP (several other ideas).

11 Tool: Negative Correlation. X_1, X_2 two binary ({0,1}) random variables. X_1, X_2 are negatively correlated if E[X_1 X_2] ≤ E[X_1] E[X_2]; equivalently, Pr[X_1 = 1 | X_2 = 1] ≤ Pr[X_1 = 1] and Pr[X_2 = 1 | X_1 = 1] ≤ Pr[X_2 = 1].

12 Tool: Negative Correlation. X_1, X_2 two binary random variables. X_1, X_2 are negatively correlated if E[X_1 X_2] ≤ E[X_1] E[X_2]; equivalently, Pr[X_1 = 1 | X_2 = 1] ≤ Pr[X_1 = 1] and Pr[X_2 = 1 | X_1 = 1] ≤ Pr[X_2 = 1]. This also implies that (1 − X_1), (1 − X_2) are negatively correlated.

13 Negative Correlation. X_1, X_2, ..., X_n binary random variables. X_1, X_2, ..., X_n are negatively correlated if for any index set J ⊆ {1, 2, ..., n}: E[∏_{i∈J} X_i] ≤ ∏_{i∈J} E[X_i] and E[∏_{i∈J} (1 − X_i)] ≤ ∏_{i∈J} E[1 − X_i].

14 Negative Correlation and Concentration. X_1, X_2, ..., X_n binary random variables that are negatively correlated (they can be dependent); E[X_i] = Pr[X_i = 1] = x_i; a_1, a_2, ..., a_n numbers in [0,1]; μ = E[Σ_i a_i X_i] = Σ_i a_i x_i. Theorem [Panconesi-Srinivasan'97]: Pr[Σ_i a_i X_i > (1+δ)μ] ≤ (e^δ / (1+δ)^{1+δ})^μ and Pr[Σ_i a_i X_i < (1−δ)μ] ≤ exp(−μδ²/2).

15 Connecting the dots... What is common between the two applications? Integer program: min λ s.t. A x ≤ λ b, x is a base in a matroid (A a non-negative matrix; the rows are packing constraints). Multipath: x corresponds to choosing k_i paths for pair (s_i, t_i) from the path collection P_i. Crossing tree: x induces a spanning tree; λ measures the congestion.

16 Matroids. M = (N, I) where N is a finite ground set and I ⊆ 2^N is a set of independent sets such that: I is not empty; I is downward closed: B ∈ I and A ⊆ B ⇒ A ∈ I; A, B ∈ I and |A| < |B| implies there is i ∈ B\A such that A + i ∈ I.

17 Matroid Examples. Uniform matroid: I = { S : |S| ≤ k }. Partition matroid: I = { S : |S ∩ N_j| ≤ k_j, 1 ≤ j ≤ h } where N_1, ..., N_h partition N and the k_j are integers. Graphic matroid: G = (V, E) is a graph and M = (E, I) where I = { S ⊆ E : S induces a forest }.
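For concreteness, here is a minimal Python sketch (my own, not from the talk) of independence oracles for the three examples; the class and method names are invented for illustration.

```python
class UniformMatroid:
    """Independent sets are all sets of size at most k."""
    def __init__(self, k):
        self.k = k
    def is_independent(self, S):
        return len(S) <= self.k

class PartitionMatroid:
    """parts: disjoint sets N_1, ..., N_h; caps: integers k_1, ..., k_h."""
    def __init__(self, parts, caps):
        self.parts, self.caps = parts, caps
    def is_independent(self, S):
        return all(len(set(S) & Nj) <= kj for Nj, kj in zip(self.parts, self.caps))

class GraphicMatroid:
    """Ground set: edges of a graph; independent sets are forests."""
    def __init__(self, vertices):
        self.vertices = list(vertices)
    def is_independent(self, edges):
        parent = {v: v for v in self.vertices}
        def find(v):                       # union-find with path compression
            while parent[v] != v:
                parent[v] = parent[parent[v]]
                v = parent[v]
            return v
        for u, w in edges:
            ru, rw = find(u), find(w)
            if ru == rw:
                return False               # this edge would close a cycle
            parent[ru] = rw
        return True
```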

18 Bases in a Matroid. B ∈ I is a base of a matroid M = (N, I) if B is a maximal independent set. All bases have the same cardinality. Matroids can also be defined via their bases. Example: spanning trees in a graph.

19 Base Exchange Theorem. B' and B'' are distinct bases in a matroid M = (N, I). Strong Base Exchange Theorem: there are elements i ∈ B'\B'' and j ∈ B''\B' such that B' − i + j and B'' − j + i are both bases. [Figure: Venn diagram of B' and B'' with i ∈ B'\B'' and j ∈ B''\B'.]

20 Dependent Rounding in Matroids. M = (N, I) is a matroid with |N| = n; B(M) is the base polytope conv{ 1_B : B is a base }; x is a fractional point in B(M). Round x to a random base B such that Pr[i ∈ B] = x_i for each i ∈ N, and the X_i (indicator for i ∈ B) variables are negatively correlated.

21 Our Work. Two methods for arbitrary matroids: 1. Randomized pipage rounding for matroids [Calinescu-C-Pal-Vondrak'07,'09]; 2. Randomized swap rounding [C-Vondrak-Zenklusen'09]. This talk: randomized swap rounding.

22 Randomized Swap Rounding. Express x = Σ_{j=1}^{m} β_j 1_{B_j} (a convex combination of bases). Set C_1 = B_1 and β = β_1. For k = 1 to m−1: randomly merge β C_k and β_{k+1} B_{k+1} into (β + β_{k+1}) C_{k+1}, and update β ← β + β_{k+1}. Output C_m.
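A minimal Python sketch of this outer loop (my own illustration; `merge_bases` is assumed here as a black box and is spelled out after the merge slides below):

```python
def swap_rounding(decomposition, merge_bases):
    """decomposition: list of (beta_j, B_j) pairs with the beta_j summing to 1,
    each B_j a base given as a set of elements.
    merge_bases(B1, B2, p): returns a random base so that
    Pr[i in output] = p*[i in B1] + (1-p)*[i in B2] for every element i."""
    (beta, C), rest = decomposition[0], decomposition[1:]
    for beta_next, B_next in rest:
        p = beta / (beta + beta_next)      # accumulated weight vs. weight of the next base
        C = merge_bases(C, B_next, p)
        beta += beta_next
    return C
```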

23 Swap Rounding. x = 0.2 B_1 + 0.1 B_2 + 0.5 B_3 + 0.15 B_4 + 0.05 B_5, with C_1 = B_1. Successive merges: 0.2 C_1 + 0.1 B_2 + 0.5 B_3 + 0.15 B_4 + 0.05 B_5 → 0.3 C_2 + 0.5 B_3 + 0.15 B_4 + 0.05 B_5 → 0.8 C_3 + 0.15 B_4 + 0.05 B_5 → 0.95 C_4 + 0.05 B_5 → C_5.

24 [Figure: a graph with fractional edge values, illustrating a point in the spanning tree polytope to be rounded.]

25 Merging two Bases. Merge B' and B'' into a random base B that looks like B' with probability p and like B'' with probability (1 − p).

26 Merging two Bases. Merge B' and B'' into a random base B that looks like B' with probability p and like B'' with probability (1 − p). Option: pick B' with prob. p and B'' with prob. (1 − p)? This will not have the negative correlation properties!

27 Merging two Bases. [Figure: Venn diagram of B' and B'' with i ∈ B'\B'' and j ∈ B''\B'.] Base Exchange Theorem: B' − i + j and B'' − j + i are both bases.

28 Merging two Bases. [Figure: the elementary swap step.] Pick i ∈ B'\B'' and j ∈ B''\B' as in the exchange theorem. With probability p: B'' ← B'' − j + i (both bases now contain i). With probability 1 − p: B' ← B' − i + j (both bases now contain j). Repeat until B' = B''.
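A sketch of this merge step in Python (my own illustration; `find_exchange` is an assumed matroid-specific helper returning a pair (i, j) as in the strong base exchange theorem, and a spanning tree version of it is sketched after the next slides):

```python
import random

def merge_bases(B1, B2, p, find_exchange):
    """Merge two bases into one random base by repeated elementary swaps.
    find_exchange(B1, B2) -> (i, j) with i in B1\\B2, j in B2\\B1 such that
    B1 - i + j and B2 - j + i are both bases (strong base exchange)."""
    B1, B2 = set(B1), set(B2)
    while B1 != B2:
        i, j = find_exchange(B1, B2)
        if random.random() < p:
            B2.discard(j); B2.add(i)   # with prob p: move B2 toward B1 (element i survives)
        else:
            B1.discard(i); B1.add(j)   # with prob 1-p: move B1 toward B2 (element j survives)
    return B1
```

To plug this into the outer loop sketched earlier, one would fix the exchange oracle, e.g. pass `lambda B1, B2, p: merge_bases(B1, B2, p, my_exchange_oracle)` as the `merge_bases` argument of `swap_rounding`.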

29 Merging Spanning Trees. [Figure: two spanning trees with weights 0.3 and 0.6 that differ in a few edges.]

30 Merging Spanning Trees. [Figure: one elementary swap between the two trees; the swap keeping the first tree's edge is taken with probability 0.3/(0.3+0.6), and the swap keeping the second tree's edge with probability 0.6/(0.3+0.6).]
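For the spanning tree case the exchange pair can be found explicitly. Here is a self-contained Python sketch (my own, with invented helper names) that, given two spanning trees as sets of edges (u, w) with u < w, returns a strong exchange pair:

```python
def cut_side(tree, removed_edge, vertices):
    """Vertices reachable from removed_edge[0] in tree minus removed_edge (DFS)."""
    adj = {v: [] for v in vertices}
    for (u, w) in tree:
        if (u, w) != removed_edge:
            adj[u].append(w); adj[w].append(u)
    side, stack = {removed_edge[0]}, [removed_edge[0]]
    while stack:
        v = stack.pop()
        for w in adj[v]:
            if w not in side:
                side.add(w); stack.append(w)
    return side

def tree_path_edges(tree, start, end, vertices):
    """Edges on the unique start-end path in the tree (DFS with parent pointers)."""
    adj = {v: [] for v in vertices}
    for (u, w) in tree:
        adj[u].append((w, (u, w))); adj[w].append((u, (u, w)))
    parent, stack = {start: None}, [start]
    while stack:
        v = stack.pop()
        for w, e in adj[v]:
            if w not in parent:
                parent[w] = (v, e); stack.append(w)
    path, v = [], end
    while parent[v] is not None:
        u, e = parent[v]; path.append(e); v = u
    return path

def find_exchange(T1, T2, vertices):
    """Strong exchange for spanning trees: i in T1\\T2 and j in T2\\T1 such that
    T1 - i + j and T2 - j + i are both spanning trees."""
    i = next(iter(T1 - T2))                              # any edge of T1 not in T2
    side = cut_side(T1, i, vertices)                     # cut induced by removing i from T1
    for j in tree_path_edges(T2, i[0], i[1], vertices):  # fundamental cycle of i in T2
        if (j[0] in side) != (j[1] in side):             # j also crosses the T1 cut
            return i, j
    raise AssertionError("a strong exchange pair always exists for spanning trees")
```

With these pieces (`swap_rounding`, `merge_bases`, `find_exchange`) one can round a convex combination of spanning trees of a small graph and check empirically that Pr[e ∈ output] matches x_e.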

31 Swap Rounding for Matroids. Theorem: Randomized Swap Rounding with x ∈ B(M) outputs a random base B such that Pr[i ∈ B] = x_i for each i ∈ N, and the X_i (indicator for i ∈ B) variables are negatively correlated. Negative correlation gives concentration bounds for linear functions of the X_i's.

32 Swap Rounding for Matroids. Theorem: Randomized Swap Rounding with x ∈ B(M) outputs a random base B such that Pr[i ∈ B] = x_i for each i ∈ N, and the X_i (indicator for i ∈ B) variables are negatively correlated. Additional properties for submodular functions: E[f(B)] ≥ F(x), where F is the multilinear extension of f, and Pr[f(B) < (1−δ) F(x)] ≤ exp(−F(x) δ²/8) (concentration for the lower tail of submodular functions).

33 Several Applications. Can handle a matroid constraint plus packing constraints: x ∈ B(M) and Ax ≤ b. (1 − 1/e) approximation for submodular functions subject to a matroid plus O(1) knapsack/packing constraints (or many "loose" packing constraints). Simpler rounding and proof for "thin" spanning trees in the ATSP application ([Asadpour et al.'10])...

34 Proof Idea for Negative Correlation. The process is a vector-valued martingale: each iteration merges two bases, and merging bases proceeds by swapping elements one step at a time. In each step only two elements i and j are involved.

35 Proof Idea for Negative Correlation. In each step only two elements i and j are involved. Let X_i, X_j be the values before the swap step and X'_i, X'_j after it. 1. E[X'_i | X_i, X_j] = X_i and E[X'_j | X_i, X_j] = X_j. 2. X'_i + X'_j = X_i + X_j.

36 Proof Idea for Negative Correlation. In each step only two elements i and j are involved. Let X_i, X_j be the values before the swap step and X'_i, X'_j after it. 1. E[X'_i | X_i, X_j] = X_i and E[X'_j | X_i, X_j] = X_j. 2. X'_i + X'_j = X_i + X_j. Then E[X'_i X'_j | X_i, X_j] = ¼ E[(X'_i + X'_j)² | X_i, X_j] − ¼ E[(X'_i − X'_j)² | X_i, X_j] = ¼ (X_i + X_j)² − ¼ E[(X'_i − X'_j)² | X_i, X_j] ≤ ¼ (X_i + X_j)² − ¼ (X_i − X_j)² = X_i X_j, where property 2 gives the first equality and the inequality uses E[(X'_i − X'_j)² | X_i, X_j] ≥ (E[X'_i − X'_j | X_i, X_j])² = (X_i − X_j)² (Jensen plus property 1).

37 Beyond matroids? Question: Can we obtain negative correlation for other combinatorial structures/polytopes?

38 Beyond matroids? Question: Can we obtain negative correlation for other combinatorial structures/polytopes? Answer: No. Negative correlation implies the polytope is “essentially” a matroid base polytope

39 Other Comments. Swap rounding advantage: it identifies the exchange property as the key. The idea generalizes/inspires work for other structures such as matroid intersection and b-matchings with some restrictions. The lower tail for submodular functions uses a martingale analysis (it does not follow from negative correlation). Negative correlation is not needed for concentration.

40 Do we need negative correlation for concentration? No. The lower tail for submodular functions is shown via a martingale method. One can also show concentration for linear functions in the matroid intersection polytope and the non-bipartite matching polytope (at the loss of a bit in the expectation).

41 Example: Rounding in the bipartite-matching polytope. x_e = ½ on each edge. Can we round x to a matching?

42 Example: Rounding in the bipartite-matching polytope. x_e = ½ on each edge. Can we round x to a matching? If we want to preserve the expectation of x, the only choice is to pick one of the two perfect matchings, each with probability ½. Large positive correlation!

43 Informal Statements. For any point x in the bipartite matching polytope: Can round x to a matching preserving the expectation, and negative correlation holds for the edge variables incident to any single vertex [Srinivasan'99]. Can round x to a matching x' s.t. E[x'] = (1 − γ) x and concentration holds for any linear function of x (the exponent in the tail bound depends on γ) [CVZ]. The above results generalize to matroid intersection and non-bipartite matchings [CVZ].

44 Questions?

45 Thanks!

46 Submodular Functions. Non-negative submodular set functions: f(A) ≥ 0 for all A. Monotone submodular set functions: f(∅) = 0 and f(A) ≤ f(B) for all A ⊆ B. Symmetric submodular set functions: f(A) = f(N\A) for all A.

47 Multilinear Extension of f. [CCPV'07], inspired by [Ageev-Sviridenko]. For f: 2^N → R_+, define F: [0,1]^N → R_+ as follows: for x = (x_1, x_2, ..., x_n) ∈ [0,1]^N, F(x) = E[f(R_x)] = Σ_{S⊆N} f(S) p_x(S) = Σ_{S⊆N} f(S) ∏_{i∈S} x_i ∏_{i∈N\S} (1 − x_i), where R_x is the random set containing each i independently with probability x_i (so p_x(S) = Pr[R_x = S]).
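Since F(x) has exponentially many terms, it is typically estimated by sampling the random set R_x; a small illustrative Python sketch (my own, not from the slides):

```python
import random

def estimate_F(f, x, samples=20_000):
    """Monte Carlo estimate of the multilinear extension F(x) = E[f(R_x)],
    where R_x contains element i independently with probability x[i].
    f: set function on frozensets of indices; x: list of probabilities."""
    total = 0.0
    for _ in range(samples):
        R = frozenset(i for i in range(len(x)) if random.random() < x[i])
        total += f(R)
    return total / samples

# Example: a small coverage function (monotone submodular).
ground_sets = [{1, 2}, {2, 3}, {3, 4}]
coverage = lambda S: len(set().union(*(ground_sets[i] for i in S))) if S else 0
print(estimate_F(coverage, [0.5, 0.5, 0.5]))
```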

48 Multilinear Extension of f. For f: 2^N → R_+, define F: [0,1]^N → R_+ as F(x) = Σ_{S⊆N} f(S) ∏_{i∈S} x_i ∏_{i∈N\S} (1 − x_i). F is smooth submodular ([Vondrak'08]): ∂F/∂x_i ≥ 0 for all i (monotonicity), and ∂²F/∂x_i∂x_j ≤ 0 for all i, j (submodularity).

49 Optimizing F(x) [Vondrak'08]. Theorem: For a monotone submodular f and any down-monotone polytope P ⊆ [0,1]^n, max F(x) s.t. x ∈ P can be approximated to within a factor (1 − 1/e) if we can do linear optimization over P. Algorithm: Continuous Greedy.
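A heavily simplified Python sketch of the continuous greedy dynamics (my own illustration; in practice the gradient of F is estimated by sampling, and the linear optimization over P is done, e.g., by the matroid greedy algorithm):

```python
def continuous_greedy(grad_F, linear_opt, n, steps=100):
    """Start at x = 0 and, for `steps` iterations, move a 1/steps fraction
    toward the vertex of P maximizing the current (estimated) gradient.
    grad_F(x): length-n list approximating the gradient of F at x.
    linear_opt(w): vertex v of P maximizing sum_i w[i] * v[i]."""
    x = [0.0] * n
    for _ in range(steps):
        v = linear_opt(grad_F(x))
        x = [xi + vi / steps for xi, vi in zip(x, v)]
    return x   # fractional point in P; F(x) >= (1 - 1/e) * OPT for monotone submodular f
```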

