Presentation on theme: "Density Independent Algorithms for Sparsifying k-Step Random Walks". Presentation transcript:

1 Density Independent Algorithms for Sparsifying k-Step Random Walks
Gorav Jindal, Pavel Kolev (MPI-INF), Richard Peng, Saurabh Sawlani (Georgia Tech). August 18, 2017

2 Talk Outline
Definitions: random walk graphs
Our result
Sparsification by resistances
Walk sampling algorithm

3 Spectral Sparsification
Sparsification: removing (many) edges from a graph while approximating some property. Here: the spectral properties of the graph Laplacian L = D - A.
E.g., for the 3-vertex path G with edge weights 2 and 1:
L_G = [ 2 -2 0 ; -2 3 -1 ; 0 -1 1 ]
Formally, find a sparse H such that
(1 - ε) x^T L_G x ≤ x^T L_H x ≤ (1 + ε) x^T L_G x for all x.
This preserves eigenvalues and cuts.
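These quantities are easy to check numerically. A minimal NumPy sketch, using the slide's 3-vertex example (the edge-list encoding is illustrative, not from the talk):

```python
import numpy as np

# The slide's example: a path on 3 vertices with edge weights 2 and 1,
# encoded as (u, v, weight) triples (this encoding is illustrative).
edges = [(0, 1, 2.0), (1, 2, 1.0)]
n = 3

A = np.zeros((n, n))
for u, v, w in edges:
    A[u, v] += w
    A[v, u] += w
D = np.diag(A.sum(axis=1))
L = D - A                      # graph Laplacian L = D - A

# L matches the matrix on the slide: [[2,-2,0],[-2,3,-1],[0,-1,1]].

# The quadratic form x^T L x equals the sum over edges of w(uv) * (x_u - x_v)^2,
# which is exactly what sparsification must preserve up to (1 +/- eps).
x = np.array([1.0, -2.0, 0.5])
quad = x @ L @ x
by_edges = sum(w * (x[u] - x[v]) ** 2 for u, v, w in edges)
assert np.isclose(quad, by_edges)
```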

4 Applications of Sparsification
Process huge graphs faster.
Sparsify dense graphs that arise inside algorithms:
Partial states of Gaussian elimination
k-step random walks

5 Random Walk Graphs
Walk along the edges of G. When at vertex u, choose the next edge with probability proportional to its weight; e.g., with neighbors a, b, c:
Pr[u -> a] = w(ua) / (w(ua) + w(ub) + w(uc)) = w(ua) / D(u)
Each step corresponds to the matrix D^{-1} A, so the k-step transition matrix is (D^{-1} A)^k.
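As a sanity check, the k-step matrix can be formed directly; a small NumPy sketch on a made-up 3-vertex weighted graph:

```python
import numpy as np

# Made-up weighted graph; rows of D^{-1} A are the transition probabilities.
A = np.array([[0.0, 2.0, 1.0],
              [2.0, 0.0, 1.0],
              [1.0, 1.0, 0.0]])
D_inv = np.diag(1.0 / A.sum(axis=1))
P = D_inv @ A                        # one step: Pr[u -> v] = w(uv) / D(u)

k = 3
Pk = np.linalg.matrix_power(P, k)    # k-step transition matrix (D^{-1} A)^k

# Each row of P^k is still a probability distribution.
assert np.allclose(Pk.sum(axis=1), 1.0)
assert (Pk >= 0).all()
```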

6 Random Walk Graphs
Special case for this talk: D = I, so the weights are themselves probabilities.
Transition matrix of the k-step walk: A^k. Laplacian: I - A^k.

7 Random Walk Graphs
Example: [figure] a weighted triangle G on vertices a, b, c and its two-step walk graph G^2, with edge probabilities (0.2, 0.32, 0.68, 0.8) shown on the drawing.

8 Our Result
Assume ε is constant; Õ hides log log terms.

Spielman-Srivastava '08: Õ(m log^2.5 n) (only k = 1)
Kapralov-Panigrahi '11: Õ(m log^3 n) (k = 1, combinatorial)
Koutis-Levin-Peng '12: Õ(m log^2 n), also Õ(m + n log^10 n)
Peng-Spielman '14, Koutis '14: Õ(m log^4 n) (k ≤ 2, combinatorial)
Cheng-Cheng-Liu-Peng-Teng '15: Õ(k^2 m log^O(1) n) (k ≥ 1)
Jindal-Kolev '15: Õ(m log^2 n + n log^4 n log^5 k) (only k = 2^i)
Our result: Õ(m + k^2 n log^4 n) (k ≥ 1); Õ(m + k^2 n log^6 n) (combinatorial)

The density-independent bounds are the ones of the form Õ(m + ...).

9 Density Independence
Our runtime: Õ(m + k^2 n log^4 n). Sparsifying only pays off when m >> the size of the sparsifier.
SS '08 + KLP '12: an O(n log n)-edge sparsifier in O(m log^2 n) time, so on inputs worth sparsifying (m ≥ n log n) the actual cost is at least O(n log^3 n).
Density-independent bounds have the form Õ(m) + n * overhead, which gives a clearer picture of the runtime.

10 Algorithm
Sample an edge (u, v) in G.
Pick an integer i u.a.r. between 0 and k - 1.
Walk i steps from u and k - 1 - i steps from v.
Add the corresponding edge of G^k to the sparsifier (with rescaling).
Walk sampling has analogs in personalized PageRank algorithms and triangle counting / sampling.

11 Effective Resistances
View G as an electrical circuit in which edge e has resistance R_e = 1 / w(e).
The effective resistance ER(u, v) between two vertices is the voltage difference required between them to drive one unit of current from one to the other.
Leverage score of an edge: w(uv) * ER(uv). It measures the edge's importance and gives an intuitive way of looking at a graph.
Sparsification by ER is extremely useful (next slide).

12 Sparsification using ER
Suppose we have upper bounds on the leverage scores of the edges: τ'_e ≥ w_e * ER(e).
Algorithm: repeat N = O(ε^{-2} * Σ_e τ'_e * log n) times:
Pick an edge e with probability τ'_e / Σ_e τ'_e.
Add it to H with appropriate re-weighting.
[Tropp '12]: the output H is an ε-sparsifier of G.
Need: leverage score bounds for the edges of G^k.
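A minimal sketch of this sampling loop, assuming the bounds τ'_e are given; the function name and the constant inside N are illustrative, not from the talk:

```python
import numpy as np

rng = np.random.default_rng(0)

def sparsify_by_scores(edges, tau, eps, n):
    """Sample N = O(eps^-2 * sum(tau) * log n) edges by the bounds tau and
    reweight, so each edge's expected total weight in H equals its weight in G.
    (Hypothetical helper; the constant factor is illustrative.)"""
    total = tau.sum()
    N = int(np.ceil(eps ** -2 * total * np.log(n)))
    probs = tau / total
    H = {}
    for idx in rng.choice(len(edges), size=N, p=probs):
        u, v, w = edges[idx]
        H[(u, v)] = H.get((u, v), 0.0) + w / (N * probs[idx])
    return H

# Toy input: a triangle with assumed leverage-score upper bounds tau.
edges = [(0, 1, 1.0), (1, 2, 2.0), (0, 2, 1.0)]
tau = np.array([0.5, 0.9, 0.5])
H = sparsify_by_scores(edges, tau, eps=0.5, n=3)
```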

13 Tools for Bounding Leverage Scores
Odd-even lemma [CCLPT '15]:
For odd k: ER_{G^k}(u, v) ≤ 2 * ER_G(u, v)
For even k: ER_{G^k}(u, v) ≤ ER_{G^2}(u, v)
Triangle inequality for ER along a path u_0 ... u_k:
ER_G(u_0, u_k) ≤ Σ_{i=0}^{k-1} ER_G(u_i, u_{i+1})
We will use these to implicitly select edges by leverage score.
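The triangle inequality is easy to verify numerically, since ER_G(u, v) = (e_u - e_v)^T L^+ (e_u - e_v) where L^+ is the pseudoinverse of the Laplacian; a small sketch on a random dense graph:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6
W = rng.random((n, n))
W = np.triu(W, 1)
W = W + W.T                          # random weighted graph (all pairs connected)
L = np.diag(W.sum(axis=1)) - W
Lp = np.linalg.pinv(L)               # Laplacian pseudoinverse

def er(u, v):
    chi = np.zeros(n)
    chi[u], chi[v] = 1.0, -1.0
    return chi @ Lp @ chi            # ER_G(u, v)

path = [0, 3, 1, 5]                  # an arbitrary walk u_0 ... u_k
lhs = er(path[0], path[-1])
rhs = sum(er(path[i], path[i + 1]) for i in range(len(path) - 1))
assert lhs <= rhs + 1e-9             # ER(u_0, u_k) <= sum of ERs along the path
```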

14 Analysis
Simplifications: assume k is odd (even k needs one more idea), and assume access to the exact effective resistances of G (available from previous work).
Goal: sample each edge (u_0, u_k) of G^k with probability proportional to
w_{G^k}(u_0, u_k) * ER_{G^k}(u_0, u_k)
Claim: walk sampling achieves this!

15 Analysis
Claim: it suffices to sample each path u_0 ... u_k with probability proportional to
w(u_0, u_1, ..., u_k) * Σ_{i=0}^{k-1} ER_G(u_i, u_{i+1})
≥ w(u_0, ..., u_k) * ER_G(u_0, u_k)   (triangle inequality)
≥ (1/2) * w(u_0, ..., u_k) * ER_{G^k}(u_0, u_k)   (odd-even lemma; the constant is absorbed by oversampling)
Summing the last line over all k-step paths from u_0 to u_k gives
(1/2) * w_{G^k}(u_0, u_k) * ER_{G^k}(u_0, u_k)

16 Walk Sampling Algorithm
To pick an edge of G^k:
Choose an edge (u, v) in G with probability proportional to w(uv) * ER(uv).
Pick u.a.r. an index i in the range [0, k - 1].
Walk i steps from u and k - 1 - i steps from v.
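A sketch of these three steps, assuming D = I (so the rows of A are transition probabilities) and that a matrix `er` of effective resistances of G is given; all names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)

def sample_walk_edge(A, er, k):
    """Return the endpoints of one sampled edge of G^k."""
    n = A.shape[0]
    # 1. Choose an edge (u, v) of G with probability proportional to w(uv) * ER(uv).
    scores = np.triu(A * er, 1)
    u, v = np.unravel_index(rng.choice(n * n, p=scores.ravel() / scores.sum()),
                            (n, n))
    # 2. Pick an index i u.a.r. in [0, k - 1].
    i = rng.integers(k)
    # 3. Walk i steps from u and k - 1 - i steps from v.
    def walk(start, steps):
        cur = start
        for _ in range(steps):
            cur = rng.choice(n, p=A[cur])
        return cur
    return walk(u, i), walk(v, k - 1 - i)

# Toy input: unweighted triangle with D = I, and dummy unit resistances.
A = np.full((3, 3), 0.5) - 0.5 * np.eye(3)
e0, e1 = sample_walk_edge(A, er=np.ones((3, 3)), k=3)
```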

17 Analysis of Walk Sampling
Pr[sampling the walk (u_0, u_1, ..., u_k)]
∝ Σ_{i=0}^{k-1} Pr[select edge (u_i, u_{i+1})] x Pr[index = i] x Pr[walk from u_i to u_0] x Pr[walk from u_{i+1} to u_k]
= Σ_{i=0}^{k-1} (1/k) * w(u_i, u_{i+1}) ER_G(u_i, u_{i+1}) * Π_{j=0}^{i-1} w(u_j, u_{j+1}) * Π_{j=i+1}^{k-1} w(u_j, u_{j+1})
= Σ_{i=0}^{k-1} (1/k) * ER_G(u_i, u_{i+1}) * Π_{j=0}^{k-1} w(u_j, u_{j+1})
= (1/k) * w(u_0, u_1, ..., u_k) * Σ_{i=0}^{k-1} ER_G(u_i, u_{i+1})

18 π‘˜ = even: 𝐺 2 𝐸 𝑅 πΊβˆ—π‘ƒ2 π‘Ž1,𝑏1 =𝐸 𝑅 𝐺 2 (π‘Ž,𝑏)
𝐺 2 is still dense and cannot be computed! Compute product of G and length 2 path, return ER from that .68 .68 d a b .2 .8 π‘Ž1 π‘Ž2 a .32 b 𝑏1 𝑏2 .68 .32 .8 .2 𝑐1 𝑐2 c c 𝑑1 𝑑2 .68 𝐺 𝐺×𝑃2 𝐺 2 𝐸 𝑅 πΊβˆ—π‘ƒ2 π‘Ž1,𝑏1 =𝐸 𝑅 𝐺 2 (π‘Ž,𝑏) 8/18/2017
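Assuming G x P_2 denotes the bipartite double cover of G (adjacency [[0, A], [A, 0]]), the identity can be checked numerically; below on a 5-cycle with uniform weights 1/2, so that D = I:

```python
import numpy as np

n = 5
A = np.zeros((n, n))
for i in range(n):                   # 5-cycle, weight 1/2 per edge (D = I)
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 0.5

# Bipartite double cover G x P_2: vertex v splits into v_1 (0..n-1), v_2 (n..2n-1).
H = np.block([[np.zeros((n, n)), A], [A, np.zeros((n, n))]])
L_H = np.eye(2 * n) - H              # its Laplacian (all degrees are 1)
L_G2 = np.eye(n) - A @ A             # Laplacian of the 2-step walk graph G^2

def er(L, u, v):
    chi = np.zeros(L.shape[0])
    chi[u], chi[v] = 1.0, -1.0
    return chi @ np.linalg.pinv(L) @ chi

a, b = 0, 2
# ER between a_1, b_1 in the double cover matches ER between a, b in G^2.
assert np.isclose(er(L_H, a, b), er(L_G2, a, b))
```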

19 Future Work
This result: Õ(m + k^2 n log^4 n) time.
Log dependency on k (as in JK '15)?
Better runtime of Õ(m + n log^2 n)? (combinatorial algorithm)

20 ER estimates for G (or G x P_2)
Iterative improvement similar to KLP '12:
Create a sequence of graphs G_1 ... G_d, each more tree-like than the previous.
O(1)-sparsify the last graph to get H_d.
Use the sparsifier H_{i+1} to construct an O(1)-sparsifier H_i of G_i.

