
Presentation on theme: "Reuven Bar-Yehuda, Gleb Polevoy, Dror Rawitz (Technion)". Presentation transcript:

1 Reuven Bar-Yehuda, Gleb Polevoy, Dror Rawitz (Technion)

2 The Model
  Base stations {1, 2, …, i, …, n}
  Interferences: each base station i has an interference value < 1
  Users {1, 2, …, j, …, m}
  Frequencies {1, 2, …, t, …, f}
  User j has a set of bandwidth requests from base station i: R_ij = {I_ij1, …, I_ijk, …}
  Each request I_ijk has a profit P_ijk > 0
Optimization problem: allocate a subset of the demands with maximum profit, subject to:
  At most one demand per user
  All demands satisfied by a base station are independent
[Figure: a frequency t allocated by base station i to user j, drawn from the request set R_ij.]
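As a concrete rendering of this model, here is a minimal Python sketch. The class and field names are mine, and treating each request I_ijk as a contiguous frequency interval is an assumption suggested by the interval notation, not something the slide states.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Request:
    """One bandwidth request I_ijk (field names are illustrative)."""
    base_station: int   # i in {1, ..., n}
    user: int           # j in {1, ..., m}
    freq_start: int     # assumed: the request occupies the contiguous
    freq_end: int       # frequency interval [freq_start, freq_end)
    profit: float       # P_ijk > 0

def total_profit(allocation):
    """Objective value of a chosen set of requests (feasibility is checked separately)."""
    return sum(r.profit for r in allocation)

def at_most_one_demand_per_user(allocation):
    """First constraint of the optimization problem: one satisfied demand per user."""
    users = [r.user for r in allocation]
    return len(users) == len(set(users))
```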

3 The Local-Ratio Technique: Basic definitions
Given a profit [penalty] vector p:
  Maximize [Minimize] p·x subject to feasibility constraints F(x).
  x is an r-approximation if F(x) holds and p·x ≥ [≤] r · p·x*.
  An algorithm is an r-approximation if, for any p and F, it returns an r-approximation.

4 The Local-Ratio Theorem
  x is an r-approximation with respect to p₁
  x is an r-approximation with respect to p - p₁
  ⇒ x is an r-approximation with respect to p
Proof (for maximization), writing p₂ = p - p₁:
  p₁·x ≥ r · p₁*
  p₂·x ≥ r · p₂*
  ⇒ p·x ≥ r · (p₁* + p₂*) ≥ r · (p₁ + p₂)*
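The same argument in LaTeX, with the inequalities spelled out (this is only a transcription of the slide's proof; x₁*, x₂*, x* denote optima with respect to p₁, p₂, p):

```latex
\textbf{Local-Ratio Theorem.} If $x$ is an $r$-approximation w.r.t.\ $p_1$ and
w.r.t.\ $p_2 = p - p_1$, then $x$ is an $r$-approximation w.r.t.\ $p$.

\emph{Proof (maximization).}
\begin{align*}
p \cdot x = p_1 \cdot x + p_2 \cdot x
  &\ge r\,(p_1 \cdot x_1^*) + r\,(p_2 \cdot x_2^*) \\
  &\ge r\,(p_1 \cdot x^* + p_2 \cdot x^*) = r\,(p \cdot x^*),
\end{align*}
where the last inequality holds because the optimum $x^*$ w.r.t.\ $p$ is a
feasible (hence not better than optimal) solution for $p_1$ and for $p_2$. \qed
```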

5 Special case: optimization is a 1-approximation
  x is optimal with respect to p₁
  x is optimal with respect to p - p₁
  ⇒ x is optimal with respect to p

6 A Local-Ratio Schema for Maximization [Minimization] Problems
Algorithm r-ApproxMax[Min](Set, p):
  If Set = Φ then return Φ;
  If ∃ I ∈ Set with p(I) ≤ 0 then return r-ApproxMax(Set - {I}, p);
  [If ∃ I ∈ Set with p(I) = 0 then return {I} ∪ r-ApproxMin(Set - {I}, p);]
  Define a "good" p₁;
  REC = r-ApproxMax[Min](Set, p - p₁);
  If REC is not an r-approximation w.r.t. p₁ then "fix it";
  return REC;
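Because the schema leaves the "good" p₁ and the "fix it" step abstract, the following Python skeleton only sketches the shape of the recursion for the maximization case; choose_p1, is_r_approx and fix_up are hypothetical callbacks, not part of the talk.

```python
def local_ratio_max(items, p, choose_p1, is_r_approx, fix_up):
    """Generic local-ratio recursion for maximization (a sketch, not the talk's code).

    items       : list of candidate elements
    p           : dict element -> current profit
    choose_p1   : (items, p) -> dict, the problem-specific 'good' decomposition p1
    is_r_approx : (solution, p1) -> bool, the r-approximation test w.r.t. p1
    fix_up      : (solution, p1) -> solution, the problem-specific 'fix it' step
    """
    if not items:
        return set()
    # Drop an element with non-positive profit, if any.
    nonpositive = next((i for i in items if p[i] <= 0), None)
    if nonpositive is not None:
        rest = [i for i in items if i is not nonpositive]
        return local_ratio_max(rest, p, choose_p1, is_r_approx, fix_up)
    # Decompose p = p1 + (p - p1) and recurse on the remainder.
    p1 = choose_p1(items, p)
    remainder = {i: p[i] - p1.get(i, 0) for i in items}
    rec = local_ratio_max(items, remainder, choose_p1, is_r_approx, fix_up)
    # By the Local-Ratio Theorem, it suffices to make rec an r-approximation w.r.t. p1.
    return rec if is_r_approx(rec, p1) else fix_up(rec, p1)
```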

7 The Local-Ratio Theorem: Applications
Applications to some optimization algorithms (r = 1):
  (MST) Minimum Spanning Tree (Kruskal)
  (SHORTEST-PATH) s-t Shortest Path (Dijkstra)
  (LONGEST-PATH) s-t DAG Longest Path (can be done with dynamic programming)
  (INTERVAL-IS) Independent Set in Interval Graphs (usually done with dynamic programming)
  (LONG-SEQ) Longest (weighted) monotone subsequence (can be done with dynamic programming)
  (MIN_CUT) Minimum Capacity s-t Cut (e.g., Ford, Dinitz)
Applications to some 2-approximation algorithms (r = 2):
  (VC) Minimum Vertex Cover (Bar-Yehuda and Even)
  (FVS) Vertex Feedback Set (Becker and Geiger)
  (GSF) Generalized Steiner Forest (Williamson, Goemans, Mihail, and Vazirani)
  (Min 2SAT) Minimum Two-Satisfiability (Gusfield and Pitt)
  (2VIP) Two Variable Integer Programming (Bar-Yehuda and Rawitz)
  (PVC) Partial Vertex Cover (Bar-Yehuda)
  (GVC) Generalized Vertex Cover (Bar-Yehuda and Rawitz)
Applications to some other approximations:
  (SC) Minimum Set Cover (Bar-Yehuda and Even)
  (PSC) Partial Set Cover (Bar-Yehuda)
  (MSP) Maximum Set Packing (Arkin and Hassin)
Applications to Resource Allocation and Scheduling: …

8 Fatal interference, one request per user
Here R_ij = {I_ij}: each user has a single request.
Maximize Σ_I p(I)·x(I)
s.t.  For each instance I: x(I) ∈ {0, 1}
      For each time t: Σ_{I : t ∈ I} x(I) ≤ 1
[Figure: requests I1, …, I9 drawn as intervals on an axis; a base station i serving user j.]

9 Fatal interference, one request per user: How to select p₁ to get optimization?
Let Î be an interval that ends first.
For all intervals I define: p₁(I) = 1 if I is in conflict with Î, and 0 otherwise.
For every feasible x: p₁·x ≤ 1.
For every Î-maximal x: p₁·x ≥ 1.
(x is Î-maximal if Î cannot be added to it, i.e., x already contains Î or some interval in conflict with Î.)
Hence every Î-maximal solution is optimal with respect to p₁.
[Figure: intervals Activity1-Activity9 on a time axis; p₁ = 1 on the intervals in conflict with Î, p₁ = 0 on the rest.]

10 Fatal interference, one request per user: An optimization algorithm
Algorithm MaxIS(S, p):
  If S = Φ then return Φ;
  If ∃ I ∈ S with p(I) ≤ 0 then return MaxIS(S - {I}, p);
  Let Î ∈ S be an interval that ends first;
  For all I ∈ S define: p₁(I) = p(Î) · [I is in conflict with Î];
  IS = MaxIS(S, p - p₁);
  If IS is Î-maximal then return IS, else return IS ∪ {Î};
[Figure: intervals Activity1-Activity9 on a time axis; p₁ = p(Î) on the intervals in conflict with Î, p₁ = 0 on the rest.]
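A self-contained Python sketch of MaxIS for the interval case. Intervals are (start, end, profit) tuples treated as half-open in time, and every name here is my own choice rather than the slides'.

```python
def max_is(intervals):
    """Local-ratio algorithm for maximum-profit independent set of intervals.

    intervals: list of (start, end, profit) tuples with end > start.
    Two intervals conflict iff they overlap in time (half-open [start, end)).
    Returns a set of chosen, pairwise non-conflicting intervals.
    """
    def conflict(a, b):
        return a[0] < b[1] and b[0] < a[1]

    def solve(items, p):
        items = [i for i in items if p[i] > 0]         # discard non-positive profits
        if not items:
            return set()
        i_hat = min(items, key=lambda i: i[1])         # interval that ends first
        delta = p[i_hat]
        # p1(I) = p(i_hat) for every I in conflict with i_hat (including i_hat itself).
        remainder = {i: p[i] - (delta if conflict(i, i_hat) else 0) for i in items}
        chosen = solve(items, remainder)
        # Make the solution i_hat-maximal: add i_hat unless something blocks it.
        if any(conflict(i, i_hat) for i in chosen):
            return chosen
        return chosen | {i_hat}

    return solve(intervals, {i: i[2] for i in intervals})

# Illustrative data (not the slides' running example):
# print(max_is([(0, 3, 5.0), (2, 5, 3.0), (4, 7, 6.0), (1, 6, 9.0)]))
```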

11 Fatal interference, one request per user: Running example
[Figure: animation of the recursion on six intervals with initial profits P(I1) = 5, P(I2) = 3, P(I3) = 5, P(I4) = 9, P(I5) = 3, P(I6) = 6; at each level the amount p(Î) (here 5, then 4, then 2) is subtracted from the intervals in conflict with Î, and intervals whose profit drops to 0 or below are discarded.]

12 Single Machine Scheduling
Each activity has a set of alternative instances; at most one instance per activity may be scheduled, and the machine runs one instance at a time.
Maximize Σ_I p(I)·x(I)
s.t.  For each instance I: x(I) ∈ {0, 1}
      For each time t: Σ_{I : t ∈ I} x(I) ≤ 1
      For each activity A: Σ_{I ∈ A} x(I) ≤ 1
Known results:
  Bar-Noy, Guha, Naor and Schieber (STOC 99): 1/2, LP-based
  Berman, DasGupta (STOC 00): 1/2
  This talk (STOC 00, independent): 1/2
[Figure: activities Activity1-Activity9, each with candidate instances on a time axis.]

13 Single Machine Scheduling: How to select p₁ to get a ½-approximation?
Let Î be an interval that ends first.
For all intervals I define: p₁(I) = 1 if I is in conflict with Î, and 0 otherwise.
For every feasible x: p₁·x ≤ 2.
For every Î-maximal x: p₁·x ≥ 1.
Hence every Î-maximal solution is a ½-approximation with respect to p₁.
[Figure: activities Activity1-Activity9 on a time axis; p₁ = 1 on the instances in conflict with Î, p₁ = 0 on the rest.]
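Spelling the two bounds out, under my reading that I is "in conflict with Î" when it overlaps Î in time or belongs to the same activity:

```latex
\begin{align*}
\text{feasible } x:\quad & p_1 \cdot x \le 2 \\
\hat{I}\text{-maximal } x:\quad & p_1 \cdot x \ge 1
  = \tfrac{1}{2}\cdot 2 \ \ge\ \tfrac{1}{2}\,\max_{\text{feasible } y}\ p_1 \cdot y
\end{align*}
% The bound of 2: every instance overlapping \hat{I} in time contains \hat{I}'s
% right endpoint, so a feasible x picks at most one of them, and the activity
% constraint lets x pick at most one instance of \hat{I}'s activity.
```

By the Local-Ratio Theorem with r = ½, the algorithm on the next slide therefore returns a ½-approximation.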

14 The ½-approximation Algorithm
Algorithm MaxIS(S, p):
  If S = Φ then return Φ;
  If ∃ I ∈ S with p(I) ≤ 0 then return MaxIS(S - {I}, p);
  Let Î ∈ S be an interval that ends first;
  For all I ∈ S define: p₁(I) = p(Î) · [I is in conflict with Î];
  IS = MaxIS(S, p - p₁);
  If IS is Î-maximal then return IS, else return IS ∪ {Î};
[Figure: activities Activity1-Activity9 on a time axis with Î marked.]
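The same recursion in Python for the scheduling setting: instances are (activity, start, end, profit) tuples, and the only change from the interval sketch above is the broader conflict test. Names are again mine, not the talk's.

```python
def schedule_half_approx(instances):
    """Local-ratio 1/2-approximation for single machine scheduling with activities.

    instances: list of (activity, start, end, profit) tuples with end > start.
    Two instances conflict iff they overlap in time or belong to the same activity.
    """
    def conflict(a, b):
        same_activity = a[0] == b[0]
        overlap = a[1] < b[2] and b[1] < a[2]
        return same_activity or overlap

    def solve(items, p):
        items = [i for i in items if p[i] > 0]          # discard non-positive profits
        if not items:
            return set()
        i_hat = min(items, key=lambda i: i[2])          # instance that ends first
        delta = p[i_hat]
        remainder = {i: p[i] - (delta if conflict(i, i_hat) else 0) for i in items}
        chosen = solve(items, remainder)
        # Ensure i_hat-maximality, which by the p1 bound costs at most a factor of 2.
        if any(conflict(i, i_hat) for i in chosen):
            return chosen
        return chosen | {i_hat}

    return solve(instances, {i: i[3] for i in instances})
```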

