Seminar 236813: Approximation Algorithms for LP/IP Optimization Problems
Reuven Bar-Yehuda, Technion IIT
Slides and papers at: http://www.cs.technion.ac.il/~cs236813
Example: VC (Vertex Cover)
Given a graph G=(V,E) and a penalty p_v ∈ Z for each v ∈ V:
Min Σ_v p_v·x_v
s.t.: x_v ∈ {0,1} for each v ∈ V
      x_v + x_u ≥ 1 for each {v,u} ∈ E
Linear Programming (LP) / Integer Programming (IP)
Given a profit [penalty] vector p:
Maximize [Minimize] p·x
Subject to: linear constraints F(x)
IP: "x is an integer vector" is an additional constraint.
Example: VC
Given a graph G=(V,E) and a penalty vector p ∈ Z^n:
Minimize p·x
Subject to: x ∈ {0,1}^n
            x_i + x_j ≥ 1 for each {i,j} ∈ E
Example: SC (Set Cover)
Given a collection S_1, S_2, …, S_n of subsets of {1,2,3,…,m} and a penalty vector p ∈ Z^n:
Minimize p·x
Subject to: x ∈ {0,1}^n
            Σ_{i : j ∈ S_i} x_i ≥ 1 for each j = 1..m
Example: Min Cut
Given a network N=(V,E), s,t ∈ V, and a capacity vector p ∈ Z^|E|:
Minimize p·x
Subject to: x ∈ {0,1}^|E|
            Σ_{e ∈ P} x_e ≥ 1 for each s-t path P
Example: Min Path (s-t Shortest Path)
Given a digraph G=(V,E), s,t ∈ V, and a length vector p ∈ Z^|E|:
Minimize p·x
Subject to: x ∈ {0,1}^|E|
            Σ_{e ∈ P} x_e ≥ 1 for each s-t cut P
Example: MST (Minimum Spanning Tree)
Given a graph G=(V,E) and a length vector p ∈ Z^|E|:
Minimize p·x
Subject to: x ∈ {0,1}^|E|
            Σ_{e ∈ P} x_e ≥ 1 for each cut P
Example: Minimum Steiner Tree
Given a graph G=(V,E), a terminal set T ⊆ V, and a length vector p ∈ Z^|E|:
Minimize p·x
Subject to: x ∈ {0,1}^|E|
            Σ_{e ∈ P} x_e ≥ 1 for each cut P separating T
Example: Generalized Steiner Forest
Given a graph G=(V,E), terminal sets T_1, T_2, …, T_k ⊆ V, and a length vector p ∈ Z^|E|:
Min p·x
s.t.: x ∈ {0,1}^|E|
      Σ_{e ∈ P} x_e ≥ 1 for each i and each cut P separating T_i
Example: IS (Maximum Independent Set)
Given a graph G=(V,E) and a profit vector p ∈ Z^n:
Maximize p·x
Subject to: x ∈ {0,1}^n
            x_i + x_j ≤ 1 for each {i,j} ∈ E
Maximum Independent Set in Interval Graphs
[Figure: activity instances Activity1–Activity9 drawn as intervals on a time axis]
Maximize Σ_I p_I·x_I
s.t. For each instance I: x_I ∈ {0,1}
     For each time t: Σ_{I active at t} x_I ≤ 1
The Local-Ratio Technique: Basic Definitions
Given a penalty [profit] vector p:
Minimize [Maximize] p·x
Subject to: feasibility constraints F(x)
x is an r-approximation if F(x) holds and p·x ≤ r·(p·x*)  [p·x ≥ r·(p·x*)], where x* is an optimal solution.
An algorithm is an r-approximation if, for every p and F, it returns an r-approximation.
The Local-Ratio Theorem:
If x is an r-approximation with respect to p_1,
and x is an r-approximation with respect to p_2 = p − p_1,
then x is an r-approximation with respect to p.
Proof (for minimization):
p_1·x ≤ r × p_1*
p_2·x ≤ r × p_2*
⇒ p·x ≤ r × (p_1* + p_2*) ≤ r × (p_1 + p_2)*
The Local-Ratio Theorem (Proof 2):
If x is an r-approximation with respect to p_1,
and x is an r-approximation with respect to p_2 = p − p_1,
then x is an r-approximation with respect to p.
Proof 2 (for minimization): Let x*, x_1*, x_2* be optimal solutions for p, p_1, p_2 respectively.
p_1·x ≤ r × (p_1·x_1*)
p_2·x ≤ r × (p_2·x_2*)
⇒ p·x ≤ r × (p_1·x_1* + p_2·x_2*) ≤ r × (p_1·x* + p_2·x*) = r × (p·x*)
Special case: Optimization is a 1-approximation
If x is optimal with respect to p_1,
and x is optimal with respect to p − p_1,
then x is optimal with respect to p.
A Local-Ratio Schema for Minimization [Maximization] problems:
Algorithm r-ApproxMin[Max](Set, p)
  If Set = ∅ then return ∅;
  If ∃ I ∈ Set with p(I) = 0 then return {I} ∪ r-ApproxMin(Set − {I}, p);
  [If ∃ I ∈ Set with p(I) ≤ 0 then return r-ApproxMax(Set − {I}, p);]
  Define a "good" p_1;
  REC = r-ApproxMin[Max](Set, p − p_1);
  If REC is not an r-approximation w.r.t. p_1 then "fix it";
  return REC;
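A minimal Python sketch of the minimization schema, under the assumption that the problem is given as a set of elements with a penalty dictionary; the hooks define_p1 (choose a "good", r-effective weight function) and fix_up (repair the recursive solution w.r.t. p_1) are illustrative names, not part of the slides.

    # Generic local-ratio schema for minimization (sketch).
    def lr_approx_min(elements, p, define_p1, fix_up):
        if not elements:
            return set()
        # Zero-penalty elements can be taken for free.
        for e in elements:
            if p[e] == 0:
                return {e} | lr_approx_min(elements - {e}, p, define_p1, fix_up)
        # Subtract an r-effective weight function p1 and recurse on p - p1.
        p1 = define_p1(elements, p)
        p_rest = {e: p[e] - p1.get(e, 0) for e in elements}
        rec = lr_approx_min(elements, p_rest, define_p1, fix_up)
        # By the Local-Ratio Theorem it suffices to make rec good w.r.t. p1.
        return fix_up(rec, p1)

In the vertex-cover instantiations below, define_p1 places weight on the endpoints of a single edge (or on a whole star), and no fixing step is needed.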
The Local-Ratio Theorem: Applications
Applications to some optimization algorithms (r = 1):
- (MST) Minimum Spanning Tree (Kruskal)
- (SHORTEST-PATH) s-t Shortest Path (Dijkstra)
- (LONGEST-PATH) s-t DAG Longest Path (can be done with dynamic programming)
- (INTERVAL-IS) Independent Set in Interval Graphs (usually done with dynamic programming)
- (LONG-SEQ) Longest (weighted) monotone subsequence (can be done with dynamic programming)
- (MIN-CUT) Minimum Capacity s-t Cut (e.g. Ford, Dinitz)
Applications to some 2-approximation algorithms (r = 2):
- (VC) Minimum Vertex Cover (Bar-Yehuda and Even)
- (FVS) Vertex Feedback Set (Becker and Geiger)
- (GSF) Generalized Steiner Forest (Williamson, Goemans, Mihail, and Vazirani)
- (Min 2SAT) Minimum Two-Satisfiability (Gusfield and Pitt)
- (2VIP) Two Variable Integer Programming (Bar-Yehuda and Rawitz)
- (PVC) Partial Vertex Cover (Bar-Yehuda)
- (GVC) Generalized Vertex Cover (Bar-Yehuda and Rawitz)
Applications to some other approximations:
- (SC) Minimum Set Cover (Bar-Yehuda and Even)
- (PSC) Partial Set Cover (Bar-Yehuda)
- (MSP) Maximum Set Packing (Arkin and Hassin)
Applications to Resource Allocation and Scheduling: ….
The creative part: finding α-effective weights
p_1 is α-effective if every feasible solution is an α-approximation w.r.t. p_1, i.e. p_1·x ≤ α·p_1*.
For VC (vertex cover): edge, matching, greedy (star), and homogeneous weights.
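To make the definition concrete, here is the standard argument (implicit in the edge-by-edge algorithm on the next slide, though not spelled out there) that placing ε on both endpoints of a single edge {u,w} is 2-effective for VC:

\[
p_1(v)=\begin{cases}\varepsilon & v\in\{u,w\}\\ 0 & \text{otherwise}\end{cases}
\qquad\Longrightarrow\qquad
p_1\cdot x \;\le\; 2\varepsilon \;\le\; 2\,(p_1\cdot x^{*})
\quad\text{for every feasible } x,
\]

since every feasible cover contains u or w (so p_1·x* ≥ ε) and can contain at most both (so p_1·x ≤ 2ε).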
VC: Recursive implementation (edge by edge)
VC(V, E, p)
  If E = ∅ return ∅;
  If ∃v with p(v) = 0 return {v} ∪ VC(V − {v}, E − E(v), p);
  Let {x,y} ∈ E;
  Let ε = min{p(x), p(y)};
  Define p_1(v) = ε if v ∈ {x,y} and 0 otherwise;
  Return VC(V, E, p − p_1)
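A runnable Python transcription of this recursion, assuming the graph is given as a vertex set V, an edge list E of pairs, and a penalty dictionary p (the names are illustrative):

    def vc_recursive(V, E, p):
        if not E:
            return set()
        # A zero-penalty vertex is taken for free, together with its edges.
        for v in V:
            if p[v] == 0:
                return {v} | vc_recursive(V - {v},
                                          [e for e in E if v not in e], p)
        # Otherwise subtract the 2-effective edge weight function p1:
        # eps on both endpoints of an arbitrary edge.
        x, y = E[0]
        eps = min(p[x], p[y])
        p2 = dict(p)
        p2[x] -= eps
        p2[y] -= eps
        return vc_recursive(V, E, p2)

After the subtraction at least one endpoint has penalty zero, so the next call makes progress; by the Local-Ratio Theorem the returned cover is a 2-approximation.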
VC: Iterative implementation (edge by edge)
VC(V, E, p)
  for each e ∈ E:
    let ε = min{p(v) | v ∈ e};
    for each v ∈ e: p(v) = p(v) − ε;
  return {v | p(v) = 0};
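The same one-pass algorithm as a Python sketch, again with illustrative names (edges as pairs, penalties as a dictionary that is modified in place):

    def vc_iterative(E, p):
        for u, v in E:
            eps = min(p[u], p[v])
            p[u] -= eps
            p[v] -= eps
        # After the pass every edge has a zero-penalty endpoint,
        # so the zero-penalty vertices form a 2-approximate cover.
        return {v for v in p if p[v] == 0}

For example (a hypothetical triangle instance), vc_iterative([("a","b"), ("b","c"), ("a","c")], {"a": 3, "b": 1, "c": 2}) returns a vertex cover of total penalty at most twice the optimum.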
[Figure: grocery items priced 15, 5, 8, 12, 20, 6, 10]
Min 5x_Bisli + 8x_Tea + 12x_Water + 10x_Bamba + 20x_Shampoo + 15x_Popcorn + 6x_Chocolate
s.t. x_Shampoo + x_Water ≥ 1
Movie: 1 for the price of 2
VC: Iterative implementation (edge by edge), run on an example
[Figure: graph with vertex penalties 10, 15, 100, 2, 30, 90, 50, 80]
VC(V, E, p)
  for each e ∈ E:
    let ε = min{p(v) | v ∈ e};
    for each v ∈ e: p(v) = p(v) − ε;
  return {v | p(v) = 0};
VC: Greedy (an O(H(Δ))-approximation), where H(Δ) = 1 + 1/2 + 1/3 + … + 1/Δ = O(ln Δ)
Greedy_VC(V, E, p)
  C = ∅;
  while E ≠ ∅:
    let v = arg min p(v)/d(v);
    C = C + {v};
    V = V − {v} (removing v's incident edges from E);
  return C;
[Figure: tight example with vertex groups of sizes n/Δ, …, n/4, n/3, n/2, and n]
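A Python sketch of this greedy rule, with d(v) recomputed as the current degree (names and graph representation are illustrative):

    def greedy_vc(V, E, p):
        V, E, C = set(V), {frozenset(e) for e in E}, set()
        while E:
            deg = {v: sum(1 for e in E if v in e) for v in V}
            # Pick the vertex with the smallest penalty per covered edge.
            v = min((u for u in V if deg[u] > 0), key=lambda u: p[u] / deg[u])
            C.add(v)
            V.remove(v)
            E = {e for e in E if v not in e}
        return C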
VC: LR-Greedy (star by star)
LR_Greedy_VC(V, E, p)
  C = ∅;
  while E ≠ ∅:
    let v = arg min p(v)/d(v);
    let ε = p(v)/d(v);
    C = C + {v};
    V = V − {v};
    for each u ∈ N(v): p(u) = p(u) − ε;
  return C;
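A Python sketch of the star-by-star rule: the chosen vertex's penalty is exactly consumed by the star weight function (ε·d(v) on the center), and each neighbour loses ε. The representation below is illustrative.

    def lr_greedy_vc(V, E, p):
        V, E, C, p = set(V), {frozenset(e) for e in E}, set(), dict(p)
        while E:
            deg = {v: sum(1 for e in E if v in e) for v in V}
            v = min((u for u in V if deg[u] > 0), key=lambda u: p[u] / deg[u])
            eps = p[v] / deg[v]
            C.add(v)
            # Subtract eps from every neighbour of v (the star's leaves).
            for e in E:
                if v in e:
                    (u,) = e - {v}
                    p[u] -= eps
            V.remove(v)
            E = {e for e in E if v not in e}
        return C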
VC: LR-Greedy by repeatedly subtracting a 2-effective homogeneous weight function
Homogeneous = all vertices have the same "greedy value" p(v)/d(v)
LR_Greedy_VC(V, E, p)
  C = ∅;
  Repeat
    Let ε = min p(v)/d(v);
    For each v ∈ V: p(v) = p(v) − ε·d(v);
    Move from V to C all zero-weight vertices;
    Remove from V all zero-degree vertices;
  Until E = ∅;
  Return C;
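A Python sketch of the homogeneous variant; degrees are recomputed every round, and a small tolerance stands in for exact zero tests on the (possibly fractional) penalties. Representation and names are illustrative.

    def lr_greedy_homogeneous_vc(V, E, p, tol=1e-9):
        V, E, C, p = set(V), {frozenset(e) for e in E}, set(), dict(p)
        while E:
            deg = {v: sum(1 for e in E if v in e) for v in V}
            eps = min(p[v] / deg[v] for v in V if deg[v] > 0)
            # Subtract the homogeneous weight function eps * d(v).
            for v in V:
                p[v] -= eps * deg[v]
            zero = {v for v in V if deg[v] > 0 and p[v] < tol}
            C |= zero                    # zero-weight vertices join the cover
            V -= zero
            E = {e for e in E if not (e & zero)}
        return C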
Example: MST (Minimum Spanning Tree)
Given a graph G=(V,E) and a length vector p ∈ Z^|E|:
Minimize p·x
Subject to: x ∈ {0,1}^|E|
            Σ_{e ∈ P} x_e ≥ 1 for each cut P
MST: Recursive implementation (homogeneous)
MST(V, E, p)
  If V = ∅ return ∅;
  If there is a self-loop e, return MST(V, E − {e}, p);
  If p(e) = 0, return {e} ∪ MST(V shrunk along e, E shrunk along e, p);
  Let ε = min{p(e) : e ∈ E};
  Define p_1(e) = ε for all e ∈ E;
  Return MST(V, E, p − p_1)
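Processed iteratively, the recursion above takes edges in order of increasing length: subtracting the minimum remaining length uniformly (the homogeneous weight function) always drives a globally cheapest edge to zero, and "shrinking" it corresponds to merging components. A standard union-find sketch of that iterative view, in Python (not code from the slides):

    def mst(V, E, p):
        # V: vertices; E: list of (u, v) edges; p: dict of edge lengths.
        parent = {v: v for v in V}

        def find(v):
            while parent[v] != v:
                parent[v] = parent[parent[v]]   # path halving
                v = parent[v]
            return v

        tree = []
        for u, v in sorted(E, key=lambda e: p[e]):
            ru, rv = find(u), find(v)
            if ru != rv:                        # skip self-loops after shrinking
                parent[ru] = rv
                tree.append((u, v))
        return tree

This is exactly Kruskal's rule, which is the point of the next slide.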
MST: Iterative implementation (homogeneous)
MST(V, E, p) = Kruskal's algorithm
Some effective weights (for VC, IS, MST, S. Path, Steiner, FVS, Min Cut):
Edge: 2
Matching: 2
Cycle: 2 − 1/k
Clique: (k−1)/k
Star: 2, (k+1)/2, 1
Homogeneous: 2, k, 1, 2, 2
Special trick: 1