Resparsification of Graphs

Presentation on theme: "Resparsification of Graphs"— Presentation transcript:

1 Resparsification of Graphs
Richard Peng (Georgia Tech), Rasmus Kyng (Yale), Jakub Pachocki (Harvard), Sushant Sachdeva (Google → U of Toronto)

2 Outline
Graph Sparsification
The Resparsification Game
Concentration Bounds
(Matrix) Martingales

3 Graph Sparsification
Reduce edge count while preserving some property. Any undirected graph can be `sparsified' to O(n log n) edges while preserving:
Distance: spanners [Peleg `89]
Cuts / flows [BK `96]
Spectrum / operator: [ST `04, BSS `09] (this talk)
Error is relative; we will assume ε = constant for the rest of this talk.

4 Spectral Sparsification
Goal: approximate the graph Laplacian matrix spectrally.
Graph Laplacian L: diagonal entries are the weighted degrees; off-diagonal entries are the negated edge weights.
Goal: (1−ε) x^T L_G x ≤ x^T L_H x ≤ (1+ε) x^T L_G x for all x.
Implies: all cuts in G and H are similar, and L_H is a good preconditioner for L_G.
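The guarantee above can be checked numerically for small graphs. A minimal sketch, assuming connected graphs (so both Laplacians share the all-ones null space); `laplacian` and `is_spectral_approx` are illustrative helpers, not from the talk:

```python
import numpy as np

def laplacian(n, edges):
    """Graph Laplacian: diagonal = weighted degree, off-diagonal = -weight."""
    L = np.zeros((n, n))
    for u, v, w in edges:
        L[u, u] += w; L[v, v] += w
        L[u, v] -= w; L[v, u] -= w
    return L

def is_spectral_approx(LG, LH, eps, tol=1e-9):
    """Check (1-eps) x^T LG x <= x^T LH x <= (1+eps) x^T LG x for all x."""
    w, V = np.linalg.eigh(LG)
    keep = w > tol                         # restrict to the range space of LG
    R = V[:, keep] / np.sqrt(w[keep])      # columns realize LG^{-1/2} on that space
    ev = np.linalg.eigvalsh(R.T @ LH @ R)  # spectrum of LG^{-1/2} LH LG^{-1/2}
    return ev.min() >= 1 - eps - tol and ev.max() <= 1 + eps + tol
```

The check reduces the quadratic-form condition to the eigenvalues of L_G^{-1/2} L_H L_G^{-1/2} lying in [1−ε, 1+ε].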

5 Applications of Spectral sparsifiers
Solving linear systems Lx = b
Combinatorial optimization
Data structures
Common algorithmic use: trade graph density/size for accuracy/iteration count via (recursive) algorithmic calls.

6 Resparsification
Common in many algorithms: a sequence of graph sparsifiers, each constructed from the previous one.
Issue: O(1)^t approximation after t steps.
Standard fix: set the (relative) error to 1/t.
Our main result: for spectral / cut sparsifiers, this overhead is not necessary.

7 Application: semi-streaming sparsification
Edges arrive in some order; maintain a sparsifier using O(n log^{O(1)} n) space.
[Ahn-Guha `09]: cut sparsification
[McGregor] bucketing: O(n log^3 n)
[KL `12]: O(n log^2 n) space
[CMP `16]: O(n log n log(W_max/W_min))
Implication of our result: resparsifying whenever the edge count exceeds O(n log n) works, giving O(n log n) space.
Amortized work: O(log^2 n) per edge discarded.
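The resparsify-on-overflow pattern behind this space bound can be sketched as follows; the `sparsify` argument stands in for any one-shot sparsification routine, and `cap` plays the role of the ~n log n edge budget:

```python
def semi_streaming(edge_stream, cap, sparsify):
    """Maintain a small edge set under arrivals: buffer each incoming edge
    and resparsify whenever the buffer exceeds the cap (~ n log n)."""
    H = []
    for e in edge_stream:
        H.append(e)
        if len(H) > cap:
            H = sparsify(H)  # must return a (much) smaller edge list
    return H
```

With a real sampling routine plugged in, the talk's result says these repeated calls incur only O(1) total relative error rather than compounding.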

8 Outline
Graph Sparsification
The Resparsification Game
Concentration Bounds
(Matrix) Martingales

9 Importance Sampling of Graphs
Get H by keeping each edge e of G with probability p_e. Need E[L_H] = L_G: rescale w_e by 1/p_e if kept.
Simple scheme: uniform, p_e ∝ n/m.
Degree sampling: for e = uv, p_e ∝ 1/d_u + 1/d_v. Works well on expanders.
Problem: on a long path, removing any edge changes connectivity. (Hard part: graphs containing both structures.)
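The rescaled sampling step can be sketched directly; any probabilities p_e give an unbiased estimate, and the function name is ours:

```python
import random

def importance_sample(edges, probs, rng=None):
    """Keep edge e w.p. p_e and rescale its weight by 1/p_e,
    so that the sampled Laplacian satisfies E[L_H] = L_G."""
    rng = rng or random.Random(0)
    H = []
    for (u, v, w), p in zip(edges, probs):
        if rng.random() < p:
            H.append((u, v, w / p))
    return H
```

The choice of p_e controls the variance, not the expectation; that is where effective resistances enter on the next slide.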

10 Sampling By Effective Resistances
[Spielman-Srivastava `08]: any p_e ≥ O(log n) × weight × effective resistance produces a spectral approximation w.h.p.
weight × effective resistance is:
proportional to the commute time
the fraction of spanning trees involving e
the statistical leverage score of e
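For small graphs these sampling weights can be computed directly from the pseudoinverse. This is a dense O(n^3) sketch, nothing like the fast routines the literature here is about; for a connected graph the scores sum to n − 1:

```python
import numpy as np

def leverage_scores(n, edges):
    """tau_e = w_e * (effective resistance of e) = w_e * b_e^T L^+ b_e."""
    L = np.zeros((n, n))
    for u, v, w in edges:
        L[u, u] += w; L[v, v] += w
        L[u, v] -= w; L[v, u] -= w
    Lp = np.linalg.pinv(L)              # pseudoinverse of the Laplacian
    scores = []
    for u, v, w in edges:
        b = np.zeros(n)                 # signed incidence vector of edge e
        b[u], b[v] = 1.0, -1.0
        scores.append(w * (b @ Lp @ b))
    return scores
```

On the unit-weight triangle, each edge has effective resistance 2/3, so all three scores are 2/3 and they sum to n − 1 = 2.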

11 Resparsification game
Attempt to break things adversarially. Iteratively:
(A)dversary picks an edge e.
(B) tosses a coin with p ≥ min(1, O(log n · w_e r_e)).
If `discard', remove the edge; if `keep', w_e ← w_e / p.
(A) can choose e based on the current graph, just not on future coins; (B) computes w_e r_e from the current graph.
Our result: w.h.p. the (A)dversary incurs only O(1) relative error.
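One round of the game, playing B's side, might look like the sketch below; the constant `c` and the `tau` oracle are illustrative, and per the slide w_e r_e would be recomputed from the current graph before each toss:

```python
import math
import random

def resparsify_step(weights, tau, e, n, c=4.0, rng=None):
    """B's move: toss a coin with p = min(1, c * log(n) * tau_e);
    discard the edge on failure, upscale its weight by 1/p on success."""
    rng = rng or random.Random(0)
    p = min(1.0, c * math.log(n) * tau[e])
    if rng.random() < p:
        weights[e] /= p   # keep: reweight so the expectation is preserved
    else:
        weights[e] = 0.0  # discard
    return weights
```

When tau_e is large enough that p = 1, the step is deterministic and the edge is simply kept unchanged, which is why high-leverage edges are safe.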

12 Application: Combinatorial Sparsifiers
[Notes by Spielman]: "just need a way to identify many edges of low effective resistance … better algorithms for doing this remain to be found."
[Koutis `14]: O(log^2 n) spanners allow one to sparsify a graph with m edges down to O(m/2 + n log^2 n) edges.
[ADKKP `16]: maintain such sparsifiers for dynamic graphs.
When m is large: need to repeat O(log n) times.
Our result implies the errors don't accumulate; we also simplify the construction in this paper.

13 Outline
Graph Sparsification
The Resparsification Game
Concentration Bounds
(Matrix) Martingales

14 Sparsification via Matrix CONCENTRATION
Goal: approximate the graph Laplacian matrix spectrally (diagonal: degrees; off-diagonal: negated weights).
Matrix concentration bounds: convergence of this sampling process, where each element is a positive semi-definite matrix.

15 (one Step) Matrix Concentration
[Tropp `12]: relative error is w.h.p. bounded by O(log n) × the sum of variances.
X_e: the additive (matrix) error of coin e:
keep: X_e = (1/p_e − 1) L_e, w.p. p_e
discard: X_e = −L_e, w.p. 1 − p_e
Variance dominated by the `keep' case:
(1/p_e) (L_G^{-1/2} X_e L_G^{-1/2})^2 = (1/p_e) L_G^{-1/2} L_e L_G^{-1} L_e L_G^{-1/2}
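A quick check, implicit in the slide, that these coins are mean-zero, which is what makes the summed errors a martingale:

```latex
\mathbb{E}[X_e]
  = p_e\Bigl(\tfrac{1}{p_e}-1\Bigr)L_e + (1-p_e)\,(-L_e)
  = (1-p_e)\,L_e - (1-p_e)\,L_e
  = 0 .
```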

16 Role Played by Resistance
Σ variances ≤ O(1) × Σ_e (1/p_e) L^{-1/2} L_e L^{-1} L_e L^{-1/2}
Alternate definition of w_e r_e: the minimum c such that L_e ⪯ c · L.
Implication of p_e ≥ O(log n) · w_e r_e:
(1/p_e) L_e L^{-1} L_e ⪯ (1/O(log n)) L_e
Sum of variances:
Σ_e (1/p_e) L^{-1/2} L_e L^{-1} L_e L^{-1/2} ⪯ (1/O(log n)) Σ_e L^{-1/2} L_e L^{-1/2} = (1/O(log n)) I

17 Outline
Graph Sparsification
The Resparsification Game
Concentration Bounds
(Matrix) Martingales

18 Scalar Martingales
Simplification: only use p_e = 1/2.
Have positive numbers x_1, …, x_m such that x_1 + … + x_m = 1. Iteratively:
(A) picks some x_i < 1/O(log n).
(B) tosses a fair coin: remove x_i on `heads', replace it with 2x_i on `tails'.
One can check that the sum stays close to 1 w.h.p.
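The scalar game is easy to simulate. In the sketch below the adversary always attacks the smallest eligible x_i, which is just one arbitrary policy, and `threshold` plays the role of 1/O(log n):

```python
import random

def scalar_game(xs, rounds, threshold, rng=None):
    """Play the game: pick the smallest x_i below the threshold, then
    remove it on heads or double it on tails. Return the final sum."""
    rng = rng or random.Random(1)
    xs = list(xs)
    for _ in range(rounds):
        cands = [i for i, x in enumerate(xs) if 0 < x < threshold]
        if not cands:
            break              # no eligible entry: the game is over
        i = min(cands, key=lambda j: xs[j])
        xs[i] = 0.0 if rng.random() < 0.5 else 2 * xs[i]
    return sum(xs)
```

Each step changes the sum by ±x_i with equal probability, so the sum is a martingale started at 1; the concentration claim is that it stays near 1 w.h.p.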

19 What This Is Not
Start with (and keep) a sum s. Iteratively:
(B) partitions s = x_1 + … + x_m.
(A) picks some x_i < 1/O(log n).
(B) tosses a fair coin: remove x_i on `heads', replace it with 2x_i on `tails'.
Error compounds in this version. Black-box use of sparsification can be viewed as this.

20 Difference Between These Versions
Every time we `keep' an edge, its weight gets larger. (Figure: coins are tossed only on the low-weight `green' edges.) Higher-weight edges have less `potential' for errors.

21 Analysis: matrix martingales
Freedman's inequality: the probability of a factor-t relative error distortion in a sequence of (possibly dependent) coin flips is bounded by
n · exp(−t^2 / (Rt + σ^2)),
where R is the max `relative size' of a sample and σ^2 is the total variance.
As with matrix Chernoff, t/R ≤ O(log n).
It suffices to bound the sum of variances over the coin tosses: ‖ Σ_i (L^{-1/2} X_i L^{-1/2})^2 ‖_2

22 Bounding Variance Per Edge
(Figure: a kept edge's weight doubles each time: w_e, 2w_e, 4w_e, 8w_e, …)
Weight doubles every time the edge is `kept'; the variance of a step increases 4×, but each further step is reached only w.p. 1/2.
The geometric sum is dominated by its last term: O(variance of the last step), i.e., the variance of one-step sampling with p_e = O(log n · w_e r_e).
Total: O(1) overhead, absorbed by the O(log n) sampling multiplier.

23 Open Questions
Extend these ideas to turnstile streams (which allow removal of edges)?
Analyze resparsification for cut sparsifiers?
Better combinatorial spectral sparsification algorithms: our routine currently takes O(n log^4 n).
Semi-streaming sparsification in O(n) memory?

