Non-LP-Based Approximation Algorithms Fabrizio Grandoni IDSIA


1 Non-LP-Based Approximation Algorithms Fabrizio Grandoni IDSIA fabrizio@idsia.ch

2 Some Techniques
We will next describe a few techniques in the design of approximation algorithms:
1. Greedy heuristics
2. Local Search
3. Dynamic Programming
4. Randomization
We intentionally skip the LP-based techniques, which are of major relevance in approximation algorithms and will be discussed separately

3 Randomization

4 Randomization
A randomized algorithm is an algorithm that has access to a source of random bits that can be used to make decisions
These algorithms are required to find the desired solution with large enough probability
A typical advantage of randomized algorithms is that they are simple(r) to define and analyze
A typical goal of a randomized approximation algorithm is to compute a solution that is always feasible and whose expected cost is close to the optimum

5 Maximum Satisfiability

6 Max-SAT
Def (Max-SAT): given a CNF formula F, find a truth assignment that satisfies the largest number of clauses
Def (Max-3SAT): the special case of Max-SAT where all clauses contain exactly 3 literals
Example: ( x ∨ y ∨ z ) ∧ ( x̄ ∨ y ∨ z ) ∧ ( ȳ ∨ z ) ∧ ( z̄ ),   OPT = 3

7 Max-3SAT
Thr: the following algorithm is an 8/7 apx for Max-3SAT
1. Set each variable independently to true with probability ½ and to false otherwise
Consider a given clause c: c is false iff all 3 of its literals are false
The latter event happens with probability (1/2)^3 = 1/8
By linearity of expectation, the expected number of satisfied clauses in the approximate solution APX is (7/8)·m, where m is the number of clauses
Trivially OPT ≤ m
Altogether, E[APX] ≥ (7/8)·m ≥ (7/8)·OPT
Rem: the above argument shows that it is always possible to satisfy a fraction 7/8 of the clauses
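Below is a minimal Python sketch of this randomized algorithm (not from the slides). The clause encoding is an assumption made for illustration: each clause is a list of signed integers, where +i means variable i and -i means its negation.

```python
import random

def random_assignment(num_vars):
    # Set each variable independently to true with probability 1/2.
    return {i: random.random() < 0.5 for i in range(1, num_vars + 1)}

def satisfied(clause, assignment):
    # A clause is satisfied iff at least one of its literals evaluates to true.
    return any(assignment[abs(l)] == (l > 0) for l in clause)

def count_satisfied(formula, assignment):
    return sum(satisfied(c, assignment) for c in formula)

# Example usage: each 3-literal clause is satisfied with probability 7/8,
# so the expected number of satisfied clauses is (7/8) * m.
formula = [[1, 2, 3], [-1, 2, 3], [-2, 3], [-3]]
assignment = random_assignment(3)
print(count_satisfied(formula, assignment))
```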

8 Derandomization
We might prefer a deterministic algorithm to a randomized one
There are a few techniques that allow one to turn some randomized algorithms into deterministic ones (derandomization techniques)
One of the simplest and most useful such techniques is the method of conditional expectations

9 Method of Conditional Expectations
Let us focus on a maximization problem
The algorithm uses random bits B_1,...,B_h. Let APX(b_1,...,b_i) be the cost of the solution computed assuming (B_1,...,B_i) = (b_1,...,b_i): suppose we are able to compute E[APX(b_1,...,b_i)]
Note that E[APX(b_1,...,b_{i-1})] = Pr[B_i=0]·E[APX(b_1,...,b_{i-1},0)] + Pr[B_i=1]·E[APX(b_1,...,b_{i-1},1)]
Hence max{ E[APX(b_1,...,b_{i-1},0)], E[APX(b_1,...,b_{i-1},1)] } ≥ E[APX(b_1,...,b_{i-1})]
Let us compute inductively (and deterministically!) b*_i so that E[APX(b*_1,...,b*_{i-1},b*_i)] = max{ E[APX(b*_1,...,b*_{i-1},0)], E[APX(b*_1,...,b*_{i-1},1)] }
This way we guarantee that E[APX(b*_1,...,b*_i)] ≥ E[APX(b*_1,...,b*_{i-1})]
Altogether we obtain deterministically a solution of cost APX(b*_1,...,b*_h) = E[APX(b*_1,...,b*_h)] ≥ ... ≥ E[APX(b*_1)] ≥ E[APX]

10 Derandomization for Max-3SAT
Let F(b_1,...,b_i) be the simplified formula where we fix (x_1,...,x_i) = (b_1,...,b_i):
k' clauses become true, and the remaining clauses might lose some literals, which are set to false (a clause might become empty, hence false)
It is easy to compute the expected number k'' of satisfied clauses in F(b_1,...,b_i) under a random assignment of the remaining variables: a clause with h≥1 remaining literals is not satisfied with probability 1/2^h
One has E[APX(b_1,...,b_i)] = k' + k''
Therefore we can apply the method of conditional expectations to obtain a deterministic 8/7 approximation
Thr [Håstad’01]: there is no (8/7-ε)-approximation for Max-3SAT unless P=NP
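A hedged sketch of how this conditional-expectation computation could be implemented, reusing the signed-integer clause encoding from the earlier sketch (an assumption); the helper names are made up for illustration.

```python
def conditional_expectation(formula, partial):
    # partial: dict mapping already-fixed variables to True/False.
    total = 0.0
    for clause in formula:
        # k' part: clause already satisfied by a fixed literal.
        if any(abs(l) in partial and partial[abs(l)] == (l > 0) for l in clause):
            total += 1.0
            continue
        free = [l for l in clause if abs(l) not in partial]
        if free:
            # k'' part: h >= 1 remaining literals, satisfied with prob 1 - 2^-h.
            total += 1.0 - 0.5 ** len(free)
        # A clause with no free literals and no true literal is empty, hence false.
    return total

def derandomized_max3sat(formula, num_vars):
    partial = {}
    for v in range(1, num_vars + 1):
        # Fix each bit to the value with the larger conditional expectation;
        # the expectation never decreases, so we end with >= (7/8) * m clauses.
        e_true = conditional_expectation(formula, {**partial, v: True})
        e_false = conditional_expectation(formula, {**partial, v: False})
        partial[v] = e_true >= e_false
    return partial
```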

11 Max-SAT
Rem: the same randomized algorithm provides a 2 apx for Max-SAT. The worst case is given by unit clauses, i.e. clauses with one literal
Idea 1: Let us assume for the sake of simplicity that there are no duplicated clauses. Suppose there are k pairs of unit clauses of the type (x) and (x̄). Then OPT ≤ m-k
Idea 2: Use a biased coin

12 Max-SAT
1. For each pair of clauses (x) and (x̄), delete (x̄)
2. Rename variables so that no clause of type (x̄) exists
3. Set each variable independently to true with probability p=(√5-1)/2 and to false otherwise
Thr: the algorithm above is a 2/(√5-1) ≅ 1.618 apx for Max-SAT
A unit clause is satisfied with probability p
A clause with h≥2 literals, among which k are positive, is satisfied with probability 1-(1-p)^k·p^(h-k) ≥ 1-p^h ≥ 1-p^2 (since 1 ≥ p > 1/2)
Therefore in expectation we satisfy a fraction at least min{p, 1-p^2} = p of the remaining m-k ≥ OPT clauses
Hmw: derandomize this algorithm. Hint: the conditional expectations method works also with biased bits
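A possible sketch of the biased-coin step (step 3), again with the assumed signed-integer clause encoding; it presumes steps 1–2 have already been applied, so every remaining unit clause is a positive literal.

```python
import random
from math import sqrt

P = (sqrt(5) - 1) / 2  # ~0.618, chosen so that p = 1 - p^2

def biased_assignment(num_vars):
    # Set each variable independently to true with probability p.
    return {i: random.random() < P for i in range(1, num_vars + 1)}

def count_satisfied(formula, assignment):
    return sum(any(assignment[abs(l)] == (l > 0) for l in clause)
               for clause in formula)

# Each remaining clause is satisfied with probability at least
# min{p, 1 - p^2} = p, so E[APX] >= p * (m - k) >= OPT / 1.618...
```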

13 Item Pricing

14 Item Pricing
In a typical item pricing problem, we are given a collection of items that we want to sell
A set of clients is interested in buying a subset of the items
Clients have budget constraints
We have to fix prices on the items and assign items to clients so that budget constraints are respected, and we maximize the total profit
Rem: in the unlimited supply version of the problem we have an unbounded number of copies of each item (e.g., digital goods)
Rem: in the single-minded version of the problem, each client wants to buy a whole subset of items (bundle) or nothing
Rem: in the envy-free version of the problem, if a client can pay for the (bundle of) items she wants, she must get the items

15 Item Pricing
[Figure: items with example prices 1$, 2$, 1$, and one item whose price ?$ is to be chosen]
Def (Item Pricing): given a set S of n item types and m clients, where each client i has a budget B(i)≥0 and a bundle S(i) ⊆ S, assign a price p(j)≥0 to each item j so as to maximize the profit Σ_{i: p(S(i))≤B(i)} p(S(i)), where p(S(i)) = Σ_{j∈S(i)} p(j)

16 Item Pricing
Exe: Give a polynomial time algorithm for the case that each bundle has size 1

17 Item Pricing
Exe: Give a polynomial time algorithm for the case that each bundle has size 1
[Figure: price/profit table for a single item and clients with budgets 1, 2, 4; the best price yields profit 4$]
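One possible solution sketch for the size-1 exercise (not from the slides), assuming the input is given per item as the list of budgets of the clients interested in that item: since items are independent, it suffices to try every incident budget as the price of each item.

```python
def best_price_for_item(budgets):
    # budgets: budgets of the clients whose bundle is this single item.
    # Only prices equal to some budget can be optimal.
    best_price, best_profit = 0.0, 0.0
    for p in sorted(set(budgets), reverse=True):
        buyers = sum(1 for b in budgets if b >= p)  # clients who can afford price p
        if p * buyers > best_profit:
            best_price, best_profit = p, p * buyers
    return best_price, best_profit

# Example usage with hypothetical budgets 1, 2, 4 for one item:
print(best_price_for_item([1, 2, 4]))  # best achievable profit is 4
```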

18 Vertex Pricing
Def (Vertex Pricing): the special case of Item Pricing where each bundle has size 2 (items can be seen as the nodes of a graph, and each client as an edge between the two items in her bundle)

19 Vertex Pricing
1. Bipartition the nodes randomly, and delete the edges not crossing the partition
2. Set to zero the prices of one random side of the partition
3. Choose the remaining prices optimally

20 Vertex Pricing
[Figure: example run of the algorithm; the residual size-1 instance is priced optimally (price 2, profit 4)]
1. Bipartition the nodes randomly, and delete the edges not crossing the partition
2. Set to zero the prices of one random side of the partition
3. Choose the remaining prices optimally

21 Vertex Pricing
1. Bipartition the nodes randomly, and delete the edges not crossing the partition
2. Set to zero the prices of one random side of the partition
3. Choose the remaining prices optimally
Thr: the algorithm above is a 4 apx for Vertex Pricing
By the initial random bipartition we lose each edge with probability ½
By linearity of expectation, the optimal solution for the residual instance has profit at least OPT/2 in expectation
By setting to zero the prices of a random side, in expectation we lose ½ of the profit of each edge
By linearity of expectation, the best solution with zero prices on the chosen side has profit at least OPT/4 in expectation
The residual instance is equivalent to a problem with bundles of size 1, which we can solve optimally
Rem: this is still the best known approximation!
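A sketch of this randomized 4-approximation in Python; the edge-list representation of clients (one tuple per client, with her two items and budget) and the helper names are assumptions made for illustration.

```python
import random
from collections import defaultdict

def vertex_pricing_4apx(edges, nodes):
    # edges: list of (u, v, budget) triples, one per client; nodes: iterable of items.

    # 1. Bipartition the nodes at random; keep only the edges crossing the cut.
    side = {v: random.random() < 0.5 for v in nodes}
    crossing = [(u, v, b) for (u, v, b) in edges if side[u] != side[v]]

    # 2. Set the prices on one random side of the partition to zero.
    zero_side = random.random() < 0.5
    incident = defaultdict(list)  # budgets of edges whose priced endpoint is this node
    for (u, v, b) in crossing:
        priced_node = v if side[u] == zero_side else u
        incident[priced_node].append(b)

    # 3. The residual instance has bundles of size 1: price each remaining node
    #    optimally by trying every incident budget as its price.
    prices = {v: 0.0 for v in nodes}
    profit = 0.0
    for node, budgets in incident.items():
        best_p = max(budgets, key=lambda p: p * sum(1 for b in budgets if b >= p))
        prices[node] = best_p
        profit += best_p * sum(1 for b in budgets if b >= best_p)
    return prices, profit
```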

22 Item Pricing
Def (Item Pricing): given a set S of n item types and m clients, where each client i has a budget B(i)≥0 and a bundle S(i) ⊆ S, assign a price p(j)≥0 to each item j so as to maximize the profit Σ_{i: p(S(i))≤B(i)} p(S(i)), where p(S(i)) = Σ_{j∈S(i)} p(j)
Hmw: Suppose all bundles have size k=O(1). Give a constant (depending on k) approximation algorithm for this case

