Algorithms for Max-min Optimization

Presentation on theme: "Algorithms for Max-min Optimization" — Presentation transcript:

1 Algorithms for Max-min Optimization
Anupam Gupta, Carnegie Mellon University. Based on joint work with Viswanath Nagarajan (IBM Watson) and R. Ravi (CMU), Aaron Roth (MSR/Penn), Grant Schoenebeck (Princeton), and Kunal Talwar (MSR).

2 maximization problems
Given: a universe U of elements, a collection I of "independent" subsets of U, and a "value" function f: 2^U → nonnegative reals.
Goal: max f(S) s.t. S ∈ I.
When can we solve this?
Example #1: Universe = vertices of a graph; "independent" set = no edge between two chosen vertices; f(S) = Σ_{v ∈ S} w_v. This is max-weight independent set.

3 maximization problems
Given: a universe U of elements, a collection I of "independent" subsets of U, and a "value" function f: 2^U → nonnegative reals.
Goal: max f(S) s.t. S ∈ I.
When can we solve this?
Example #2: Universe = edges of a graph; "independent" set = a set of edges with no cycles; f(S) = Σ_{e ∈ S} w_e. This is max-weight spanning tree.

4 general outline of this talk
Problem: max f(S) s.t. S ∈ I.
- A quick survey of linear and submodular maximization, i.e., when f(S) is either linear or submodular.
- What is max-min optimization? I.e., cases where f(S) is itself defined by a minimization (covering) problem.
- A couple of general theorems about these topics, the ideas behind some of the algorithms, and a couple of simple proofs.

5 case I: linear fn maximization
Problem: max f(S) s.t. S ∈ I. Linear case: f(S) = Σ_{e ∈ S} w_e.
If I forms a matroid, we can solve this exactly! Recall: a family of independent sets I forms a matroid if (a) it is closed under taking subsets, and (b) whenever A and B are independent with |A| < |B|, there is an element e ∈ B \ A such that A + e is independent.
In fact, the greedy algorithm suffices; e.g., max-weight spanning tree.
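As a concrete sketch of the greedy algorithm on one matroid: for the graphic matroid (independent = acyclic edge sets), greedy is exactly Kruskal's algorithm for a max-weight spanning forest. The code below is illustrative; the union-find structure serves as the independence oracle.

```python
# Greedy for max-weight independent set in the graphic matroid
# (i.e., Kruskal's algorithm for a max-weight spanning forest).

class UnionFind:
    def __init__(self, n):
        self.parent = list(range(n))

    def find(self, x):
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x

    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra == rb:
            return False  # edge would close a cycle: not independent
        self.parent[ra] = rb
        return True

def greedy_max_weight_forest(n, weighted_edges):
    """Greedy over a matroid: scan elements by decreasing weight,
    keep each one whose addition preserves independence."""
    uf = UnionFind(n)
    total, chosen = 0, []
    for w, u, v in sorted(weighted_edges, reverse=True):
        if uf.union(u, v):  # independence oracle for the graphic matroid
            total += w
            chosen.append((u, v, w))
    return total, chosen
```

On a triangle with edge weights 3, 2, 1, greedy keeps the two heaviest edges and rejects the third, which would close a cycle.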

6 case I: linear fn maximization
Problem: max f(S) s.t. S ∈ I. Linear case: f(S) = Σ_{e ∈ S} w_e.
If I is the intersection of two matroids, the problem is also solvable exactly; e.g., max-weight bipartite matching, max-weight arborescence.
The greedy algorithm is no longer optimal… but it is a 2-approximation.

7 case I: linear fn maximization
Problem: max f(S) s.t. S ∈ I. Linear case: f(S) = Σ_{e ∈ S} w_e.
If I is the intersection of p > 2 matroids, the problem is NP-hard in general. But the greedy algorithm is still a p-approximation! One can do slightly better via a local-search algorithm.

8 case II: submodular maximization
Problem: max f(S) s.t. S ∈ I, for a submodular function f.
Recall f is submodular if, for every subset A ⊆ B and element e ∉ B,
f(A + e) − f(A) ≥ f(B + e) − f(B),
and f is monotone if f(A) ≤ f(B) whenever A ⊆ B.

9 two quick examples
(1) Given a collection of sets A1, A2, …, Am: universe = {1, 2, …, m}, and e.g. f({1, 3, 7}) = |A1 ∪ A3 ∪ A7|. This "set-coverage" function is monotone submodular.
(2) Given an undirected graph G = (V, E): universe = V, and f(S) = # of edges going from S to V \ S. This cut function is (non-monotone) submodular.
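These two examples can be written down directly, and on small universes the submodular inequality can even be verified by brute force. A sketch (all instance data below is made up for illustration):

```python
from itertools import combinations

def coverage(sets):
    """Set-coverage function: f(S) = |union of the chosen sets| (monotone)."""
    def f(S):
        return len(set().union(*(sets[i] for i in S)))
    return f

def cut(edges):
    """Graph cut function: f(S) = # edges crossing (S, V \\ S) (non-monotone)."""
    def f(S):
        return sum(1 for u, v in edges if (u in S) != (v in S))
    return f

def is_submodular(f, universe):
    """Brute-force check of f(A+e) - f(A) >= f(B+e) - f(B) for all A ⊆ B, e ∉ B."""
    subsets = [frozenset(c) for r in range(len(universe) + 1)
               for c in combinations(universe, r)]
    for A in subsets:
        for B in subsets:
            if not A <= B:
                continue
            for e in set(universe) - B:
                if f(A | {e}) - f(A) < f(B | {e}) - f(B):
                    return False
    return True
```

For instance, `coverage([{1, 2}, {2, 3}, {4}])` and the cut function of a triangle both pass the check.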

10 case II: submodular maximization
Problem: max f(S) s.t. S ∈ I, for a monotone submodular function f.
If I is the intersection of p matroids, the greedy algorithm is a (p+1)-approximation!
S = { }
While ∃ e such that S + e is independent:
  pick the element e maximizing f(S + e) − f(S), and add it to S.
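The greedy loop above can be sketched with the constraint family abstracted behind an independence oracle (e.g., `lambda S: len(S) <= k` for a cardinality constraint). All names here are illustrative:

```python
def greedy_submodular(universe, f, independent):
    """Greedy maximization: repeatedly add the feasible element with the
    largest marginal gain; stop when no element gives positive gain or no
    feasible extension exists. `independent(S)` is the membership oracle for I."""
    S = set()
    while True:
        best, best_gain = None, 0
        for e in universe - S:
            if independent(S | {e}):
                gain = f(S | {e}) - f(S)
                if gain > best_gain:
                    best, best_gain = e, gain
        if best is None:
            break
        S.add(best)
    return S
```

Running it on a small coverage instance with k = 2 picks the two sets whose union is largest one marginal step at a time.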

11 the greedy algorithm: a proof
Suppose we want to maximize a monotone submodular f(S) subject to the cardinality constraint I = { S ⊆ U : |S| ≤ k }.
Theorem: f(S_final) ≥ ½ f(S_final ∪ OPT) ≥ ½ f(OPT).
(N.b.: monotonicity is used only in the last step.)
Proof: Suppose not. Then f(S_final) < ½ f(S_final ∪ OPT), i.e.,
f(S_final ∪ OPT) − f(S_final) > f(S_final).
Since OPT \ S_final has at most k elements, submodularity gives
∃ e ∈ OPT \ S_final such that f(S_final + e) − f(S_final) > f(S_final)/k.
By submodularity again, at every step of greedy (with current set S ⊆ S_final),
f(S + e) − f(S) ≥ f(S_final + e) − f(S_final) > f(S_final)/k,
so greedy's improvement at each of its k steps must have been at least as much. Summing over the k steps gives f(S_final) > k · f(S_final)/k = f(S_final). Contradiction.
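On small instances the guarantee can be sanity-checked by brute force: run greedy on a coverage function and compare against the exact optimum over all k-subsets. (For cardinality constraints greedy in fact achieves the stronger (1 − 1/e) factor; the ½ bound is what the proof above gives.) The instance below is made up for illustration:

```python
from itertools import combinations

# Made-up coverage instance: 5 sets over ground elements 1..7, k = 2.
sets = [{1, 2, 3}, {3, 4}, {4, 5}, {5, 6, 7}, {1, 7}]
k = 2

def f(S):
    return len(set().union(*(sets[i] for i in S)))

# Greedy under the cardinality constraint: k steps of best marginal gain.
S = set()
for _ in range(k):
    e = max(set(range(len(sets))) - S, key=lambda e: f(S | {e}) - f(S))
    S.add(e)

# Exact optimum over all k-subsets, by enumeration.
opt = max(f(set(c)) for c in combinations(range(len(sets)), k))
assert 2 * f(S) >= opt  # f(S_final) >= 1/2 f(OPT), as the theorem guarantees
```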

12 case II: submodular maximization
Problem: max f(S) s.t. S ∈ I, for a non-monotone submodular function f.
If I is 2^U (there are no constraints at all), the problem is already APX-hard! However, the following algorithm is a 4-approximation (in expectation): let S be a uniformly random subset of the universe U.
Again, one can do better (we won't talk about that today)…

13 case II: submodular maximization
Problem: max f(S) s.t. S ∈ I, for a non-monotone submodular function f.
If I is the intersection of p matroids, the greedy algorithm is not good! However, the following algorithm is an O(p)-approximation:
S1 ← Greedy(U)
Find the subset T of S1 maximizing f(T)
S2 ← Greedy(U \ S1)
Return the better of T and S2

14 the greedy algorithm: non-monotone proof
For cardinality constraints:
f(S1) ≥ ½ f(S1 ∪ OPT)  (from the analysis of the monotone case)
f(T) ≥ ¼ f(S1 ∩ OPT)  (from unconstrained maximization)
f(S2) ≥ ½ f(S2 ∪ (OPT \ S1))  (from the monotone analysis again)
Combining these three bounds via submodularity, the right-hand sides together cover f(OPT), so max{ f(T), f(S2) } ≥ f(OPT)/12.
Recall the algorithm:
S1 ← Greedy(U)
Find the subset T of S1 maximizing f(T)
S2 ← Greedy(U \ S1)
Return the better of T and S2

15 quick recap
Problem: max f(S) s.t. S ∈ I. If f is submodular and I is the intersection of p matroids, then we can get an O(p)-approximation.
What can we do for more general functions f( )? For the rest of the talk, we focus on cardinality constraints I = { S ⊆ U : |S| ≤ k } (some of our results extend to more general constraints).

16 max-min covering problems
Problem: max f(S) s.t. |S| ≤ k.
Fix a graph G; universe = set of nodes.
f_ST(S) = min-cost Steiner tree on S ("which k nodes are most expensive to connect up?")
f_MC(S) = min-cost cut separating S from a root r ("which k nodes are most expensive to disconnect from r?")
Fix a collection of sets A1, A2, …, Am.
f_SC(S) = cost of a min-cost set cover of S using these sets A_i ("which k elements are most expensive to cover?")
Of course, if f(S) is submodular (or close to it), we can use the previous results… but one can prove, e.g., that f_SC is very far from any submodular function.

17 max-min covering: results (1)
Problem: max f(S) s.t. |S| ≤ k. For general covering problems…
Theorem: if you can solve some covering problem offline, and you can solve the problem online (deterministically), then you can solve the max-min version of the problem. The resulting approximation ratio is usually (offline ratio) × (online ratio).
These algorithms extend to intersections of p matroids and q knapsacks…

18 max-min covering: results (2)
Problem: max f(S) s.t. |S| ≤ k. For the cardinality max-min problem:
- O(log m + log n)-approximation for set cover
- O(1)-approximation for Steiner tree/forest
- O(log² n)-approximation for multicut
…all using the "same" algorithmic idea.

19 what's the algorithm?
Recall: we want to find S with |S| ≤ k that maximizes f(S) = "cost(S)"; call this optimal value T*.
Generic idea: for each guessed value T, do the following: while some element costs "much more" than T/k to satisfy, add it to a set X_T. Then let T_alg be the smallest T such that cost(X_T) is not "much more" than T.
For the proof, we need to show that:
(a) for T ≈ T*, we have cost(X_T) ≈ T*, so all k-sets cost at most O(T*);
(b) for T a bit smaller than T*, the set X_T contains a subset S of size k with cost(S) > T, so there exists a k-set with cost Ω(T*).
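The generic scheme can be sketched with the covering problem abstracted behind two oracles: a marginal cost of "satisfying" one more element, and the total cost of a set. The toy instantiation below (points on a line, connection cost to a root at 0) and all names are made up for illustration:

```python
def generic_maxmin(universe, cost, marginal_cost, k, thresholds, c=4):
    """For each guessed value T: greedily collect into X_T every element whose
    marginal cost to satisfy exceeds c*T/k; return the smallest T for which
    cost(X_T) is at most c*T. The constant c plays the role of 'much more'."""
    for T in sorted(thresholds):
        X = set()
        changed = True
        while changed:
            changed = False
            for e in sorted(universe - X):
                if marginal_cost(e, X) > c * T / k:
                    X.add(e)
                    changed = True
        if cost(X) <= c * T:
            return T, X
    return None

# Toy covering problem: elements are points on a line with a root at 0.
# Satisfying a point = connecting it; its marginal cost is the distance to
# the nearest already-chosen point (or the root), and cost(X) is the length
# of the segment spanning X and the root.
def line_marginal(e, X):
    return min(abs(e - x) for x in X | {0})

def line_cost(X):
    pts = X | {0}
    return max(pts) - min(pts)
```

With points {1, 2, 10} and k = 2, small guesses of T fail (the far-away point 10 forces cost(X_T) above c·T) until T is large enough.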

20 rest of the talk
Generic algorithm: for each value T, while some element costs "much more" than T/k to satisfy, add it to a set X_T; T_alg = the smallest T such that cost(X_T) is not "much more" than T.
We now show how this generic algorithm applies to: max-min Steiner tree, and (time permitting) max-min set cover.

21 Steiner tree
Given a metric space (V, ℓ), a root r, and a "scenario" S of terminals, find the least-cost network connecting S to r.
Results: the MST is a 2-approximation; there is a polytime 1.39-approximation algorithm [Byrka Grandoni Rothvoss Sanita '10]; the problem is APX-hard [Bern Plassmann '89].

22 maxmin Steiner tree
Given a metric space (V, ℓ) and a root r, find S approximately achieving max SteinerTree(S) s.t. |S| ≤ k.

23 the precise algorithm
Algorithm: For each value T do:
add the root r to X_T;
while there exists a vertex v such that distance(v, X_T) > 4T/k, add v to X_T;
find the smallest T such that MST(X_T) < 4T.
Fact: the cost of connecting any set of k vertices to MST(X_T) is ≤ 4T (each is within 4T/k of X_T once the loop ends).
Corollary: the cost of connecting any set of k vertices is ≤ 4T + MST(X_T).
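A runnable sketch of this algorithm on an explicit metric (Euclidean points in the plane), with the MST computed by Prim's algorithm on the complete distance graph. The instance data and names are illustrative:

```python
import math

def dist(a, b):
    return math.dist(a, b)

def mst_cost(points):
    """Prim's algorithm on the complete metric graph over `points`."""
    pts = list(points)
    if len(pts) <= 1:
        return 0.0
    in_tree = {pts[0]}
    total = 0.0
    while len(in_tree) < len(pts):
        d, p = min((min(dist(p, q) for q in in_tree), p)
                   for p in pts if p not in in_tree)
        total += d
        in_tree.add(p)
    return total

def maxmin_steiner(V, root, k, thresholds):
    """For each guess T: grow X_T from the root by repeatedly adding any
    vertex farther than 4T/k from X_T; return the smallest T with
    MST(X_T) < 4T."""
    for T in sorted(thresholds):
        X = {root}
        while True:
            far = [v for v in V if min(dist(v, x) for x in X) > 4 * T / k]
            if not far:
                break
            X.add(far[0])
        if mst_cost(X) < 4 * T:
            return T, X
    return None
```

On a toy instance with one far-away terminal, small guesses of T force that terminal into X_T and make MST(X_T) too large, until T crosses the right scale.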

24 the analysis (1)
Proof that any k-subset can be connected at cost < 8T*, and some k-subset has cost > T*/4.
Theorem 1: For T = T*, cost of MST(X_T) < 4T. Combined with the Fact ("cost of connecting any k vertices ≤ 4T + MST(X_T)"), this shows any k-subset can be connected at cost < 8T*.
Lemma 2: For T = T*/8, cost of MST(X_T) ≥ 4T. Proof: if not, we could connect up any k points at cost < 4T + 4T = 8T = T*, contradicting the definition of T*. Now:
- if |X_T| ≤ k, then X_T itself is a small set whose MST costs ≥ 4T = T*/2, so we have found a k-subset with Steiner-tree cost > T*/4;
- if |X_T| is large, pick any k points from it: their mutual distances exceed 4T/k, so their Steiner-tree cost is > k · (4T/k) / 2 = 2T = T*/4.

25 remains to prove…
Theorem 1: For T = T*, cost of MST(X_T) < 4T.
Lower bound: the points of X_T are pairwise more than 4T*/k apart, so the optimal tree on X_T costs ≥ |X_T| × (4T*/k) × ½ = 2T* × |X_T|/k.
Upper bound: every k-subset can be connected at cost ≤ T*, so (grouping X_T into blocks of k) the optimal tree on X_T costs ≤ (|X_T|/k + 1) × T*.
Combining: 2T* × |X_T|/k ≤ (|X_T|/k + 1) × T*, i.e., |X_T| × T*/k ≤ T*, so |X_T| ≤ k. Hence the optimal tree on X_T costs ≤ 2T*, and since the MST is a 2-approximation, MST(X_T) < 4T*.

26 wrapping up maxmin Steiner
Algorithm: For each value T do: add the root r to X_T; while there exists a vertex v with distance(v, X_T) > 4T/k, add v to X_T. T_alg = the smallest T such that MST(X_T) < 4T.
This value T_alg shows that all k-subsets can be connected at cost O(T_alg), and the set X_T for T = T_alg/2 gives a k-subset of cost Ω(T_alg). So we get a constant-factor approximation.

27 rest of the talk
Generic algorithm: for each value T, while some element costs "much more" than T/k to satisfy, add it to a set X_T; T_alg = the smallest T such that cost(X_T) is not "much more" than T.
We have shown how this applies to max-min Steiner tree; next (time permitting): max-min set cover.

28 to conclude
We studied constrained maximization problems, max f(S) s.t. S ∈ I:
- good algorithms for the submodular cases;
- algorithms for max-min problems (f = a minimization/covering problem): an O(log m + log n)-approximation for set cover, and results for Steiner tree, Steiner forest, min-cut, and multicut, via the "same" simple algorithm and a "dual-rounding" analysis.
Open question: for which other functions f can we do constrained maximization well?
The papers are on the arXiv. Thanks!!

