Tighter Cut-Based Bounds for k-pairs Communication Problems. Nick Harvey, Robert Kleinberg.

Presentation transcript:

Tighter Cut-Based Bounds for k-pairs Communication Problems Nick Harvey Robert Kleinberg

Overview
- Definitions: the sparsity and meagerness bounds
- Show that these bounds can be very loose
- Define informational meagerness, based on informational dominance
- Show that it, too, can be slightly loose

k-pairs Communication Problem
[Figure: the butterfly network. Sources S(1), S(2) emit messages M_1, M_2; the middle edge carries M_1 ⊕ M_2; the sinks are T(1), T(2).]

Concurrent Rate
Source i desires communication rate d_i. Rate r is achievable if the rate vector [r d_1, r d_2, …, r d_k] is achievable; the set of achievable rates is an interval of R_+.
Def: "Network coding rate" (or NCR) := sup { r : r is achievable }.

k-pairs Communication Problem (example)
[Figure: the butterfly network again.] With d_1 = d_2 = 1 and c_e = 1 for all e ∈ E, rate 1 is achievable.
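To make the claimed rate-1 solution concrete, here is a minimal simulation of the butterfly coding scheme (my own illustration, not from the slides; the one-bit alphabet and the naming of intermediate values are assumptions consistent with the figure): each sink XORs the coded middle message with the plain message it receives on its side edge.

```python
# Minimal sketch of the rate-1 butterfly network-coding solution over GF(2).
# S(1), S(2) emit bits M1, M2; the middle edge carries M1 xor M2; each sink
# also receives the other source's bit directly on a side edge.
import itertools

def butterfly(m1, m2):
    coded = m1 ^ m2        # message on the middle (bottleneck) edge
    at_t1 = coded ^ m2     # T(1) combines it with M2 received from S(2)
    at_t2 = coded ^ m1     # T(2) combines it with M1 received from S(1)
    return at_t1, at_t2

# Each edge carries one bit per time step, so all capacities c_e = 1 are
# respected while both unit demands are met simultaneously (rate 1).
for m1, m2 in itertools.product([0, 1], repeat=2):
    assert butterfly(m1, m2) == (m1, m2)
print("both sinks decode correctly: rate 1 is achievable")
```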

Upper bounds on rate
- [Classical]: sparsity bound for multicommodity flows
- [CT91]: general bound for multicommodity information networks
- [B02]: application of CT91 to directed network coding instances; equivalent to sparsity
- [KS03]: bound for undirected networks with arbitrary two-way channels
- [HKL04]: meagerness
- [SYC03], [HKL05]: LP bound
- [KS05]: bound based on iterative d-separation

Vertex-Sparsity
Def: For U ⊆ V,
VS(U) := (capacity of edges crossing between U and V \ U) / (demand of commodities separated by U),
and VS(G) := min over U ⊆ V of VS(U).
Claim: NCR ≤ VS(G).

Edge-Sparsity
Def: For A ⊆ E,
ES(A) := (capacity of edges in A) / (demand of commodities separated in G \ A),
and ES(G) := min over A ⊆ E of ES(A).
Claim: Max-Flow ≤ ES(G). But: sometimes NCR > ES(G).
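To make the definition concrete, here is a brute-force sketch (my own illustration; the dictionary-based graph encoding and the vertex names are assumptions) that computes ES(G) for the butterfly instance by enumerating edge subsets A and checking by reachability which commodities are separated in G \ A:

```python
# Brute-force edge sparsity: ES(G) = min over edge sets A of
# cap(A) / (total demand of commodities i with no S(i)->T(i) path in G\A).
from itertools import combinations
from collections import defaultdict, deque

edges = {  # directed edge -> capacity (the butterfly example)
    ("S1", "U"): 1, ("S2", "U"): 1, ("U", "W"): 1,
    ("W", "T1"): 1, ("W", "T2"): 1, ("S2", "T1"): 1, ("S1", "T2"): 1,
}
commodities = [("S1", "T1", 1), ("S2", "T2", 1)]  # (source, sink, demand)

def reachable(src, removed):
    """Vertices reachable from src after deleting the edges in `removed`."""
    adj = defaultdict(list)
    for (u, v) in edges:
        if (u, v) not in removed:
            adj[u].append(v)
    seen, queue = {src}, deque([src])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                queue.append(v)
    return seen

def edge_sparsity():
    best = float("inf")
    edge_list = list(edges)
    for k in range(1, len(edge_list) + 1):
        for A in combinations(edge_list, k):
            removed = set(A)
            sep_demand = sum(d for (s, t, d) in commodities
                             if t not in reachable(s, removed))
            if sep_demand > 0:
                best = min(best, sum(edges[e] for e in A) / sep_demand)
    return best

print(edge_sparsity())  # prints 0.5
```

The minimizer is the single middle edge ("U", "W"): removing it separates both unit-demand commodities, giving ES(G) = 1/2, even though (as the next slide shows) network coding achieves rate 1.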

NCR > Edge-Sparsity
[Figure: the butterfly network; e is the middle edge.]
The cut {e} separates commodities 1 and 2 (S(1) from T(1) and S(2) from T(2)), so ES({e}) = 1/2. But rate 1 is achievable!

Meagerness
Def: For A ⊆ E and P ⊆ [k], A isolates P if for all i, j ∈ P, S(i) and T(j) are disconnected in G \ A.
M(A) := min over P isolated by A of (capacity of edges in A) / (demand of commodities in P),
and M(G) := min over A ⊆ E of M(A).
Claim: NCR ≤ M(G).
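To fill in the standard argument behind this claim (under the usual assumptions that the sources are mutually independent and that H(S(i)) = r d_i in a rate-r solution; Src(P), Snk(P), and \bar{P} are my shorthand for the sources of P, the sinks of P, and the remaining commodities): if A isolates P, then every path from a source of P to a sink of P crosses A, so the messages at Snk(P) are a deterministic function of the messages on A together with the sources outside P. Therefore

\begin{align*}
H\big(\mathrm{Src}(P) \mid A, \mathrm{Src}(\bar P)\big)
 &= H\big(\mathrm{Src}(P) \mid A, \mathrm{Src}(\bar P), \mathrm{Snk}(P)\big)
 \;\le\; H\big(\mathrm{Src}(P) \mid \mathrm{Snk}(P)\big) \;=\; 0, \\
r \cdot \mathrm{dem}(P) \;=\; H\big(\mathrm{Src}(P)\big)
 &= I\big(\mathrm{Src}(P); A, \mathrm{Src}(\bar P)\big)
 \;=\; I\big(\mathrm{Src}(P); A \mid \mathrm{Src}(\bar P)\big)
 \;\le\; H(A) \;\le\; \mathrm{cap}(A),
\end{align*}

so r ≤ cap(A) / dem(P) for every pair (A, P) with A isolating P, and hence NCR ≤ M(G).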

Meagerness & Vertex-Sparsity are weak
Thm: M(G_n) = VS(G_n) = Ω(1), but NCR ≤ 1/n.
[Figure: the graph G_n, with sources S(1), …, S(n), sinks T(1), …, T(n), and edges f_1, …, f_{n-1}, g_1, …, g_n, h_1, …, h_{n-1}.]

A Proof Tool
Def: Let A, B ⊆ E. B is downstream of A if B is disconnected from the sources in G \ A. Notation: A ⇒ B.
Claim: If A ⇒ B then H(A) = H(A, B).
Pf: Because sources → A → B form a Markov chain.
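Spelling out the Markov-chain step: in any network coding solution every edge message is a function of the source messages, and when B is disconnected from all sources in G \ A, every source-to-B path crosses A, so the messages on B are a deterministic function of the messages on A. Hence

\[
H(A, B) \;=\; H(A) + H(B \mid A) \;=\; H(A) + 0 \;=\; H(A).
\]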

Lemma: NCR ≤ 1/n.
Proof: Follow the chain (each step shown on the figure of G_n):
{g_n} ⇒ {g_n, T(1), h_1} ⇒ {S(1), f_1, g_1, h_1} ⇒ {S(1), f_1, T(2), h_2} ⇒ {S(1), S(2), f_2, g_2, h_2} ⇒ {S(1), S(2), f_2, T(3), h_3} ⇒ … ⇒ {S(1), S(2), …, S(n)}.
Thus 1 ≥ H(g_n) ≥ H(S(1), …, S(n)) = n · r, so 1/n ≥ r.

Towards a stronger bound
Our focus: cut-based bounds. Given A ⊆ E, we want to infer that H(A) = H(A, P) where P ⊆ {S(1), …, S(k)}.
Meagerness uses Markovicity: (sources in P) → A → (sinks in P). But Markovicity is sometimes not enough…

Informational Dominance
Def: A dominates B if the information in A determines the information in B in every network coding solution. Denoted A ⇒_i B. This trivially implies H(A) = H(A, B).
How can we determine whether A dominates B? [HKL05] give a combinatorial characterization and an efficient algorithm to test whether A dominates B.

Informational Meagerness
Def: For A ⊆ E and P ⊆ {S(1), …, S(k)}, A informationally isolates P if A ⇒_i P (A informationally dominates the sources in P).
iM(A) := min over P informationally isolated by A of (capacity of edges in A) / (demand of commodities in P),
and iM(G) := min over A ⊆ E of iM(A).
Claim: NCR ≤ iM(G).
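Under this reading of the definition (A informationally dominates the sources in P, which is exactly the inference the previous slide says we want), the bound follows just as for meagerness, assuming independent sources with H(S(i)) = r d_i:

\[
r \cdot \mathrm{dem}(P) \;=\; H(P) \;\le\; H(A \cup P) \;=\; H(A) \;\le\; \mathrm{cap}(A),
\]

so r ≤ cap(A)/dem(P) for every informationally isolated P, hence r ≤ iM(A), and minimizing over A gives NCR ≤ iM(G).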

iMeagerness Example
[Figure: a four-terminal network with sources s_1, s_2 and sinks t_1, t_2.]
"Obviously" NCR = 1. But no two edges disconnect t_1 and t_2 from both sources!

iMeagerness Example (continued)
[Figure: the same network with a cut A marked.]
After removing A, there is still a path from s_2 to t_1!

Informational Dominance Example
[Figure: the same network and cut A.]
Our characterization shows A ⇒_i {t_1, t_2}, so H(A) ≥ H(t_1, t_2) and iM(G) = 1.

A bad example: H_n
Thm: The iMeagerness gap of H_n is Ω(log |V|).
[Figure: H_2, with sources s(ε), s(0), s(1), s(00), s(01), s(10), s(11), matching sinks t(·), nodes q(i), r(i) for the leaves i, and edges (q(i), r(i)) of capacity 2^{-n}.]

T_n = binary tree of depth n. There is a source S(i) for every node i ∈ T_n.

T_n = binary tree of depth n, with a source S(i) and a sink T(i) for every node i ∈ T_n.

Nodes q(i) and r(i) for every leaf i of T_n.

A complete bipartite graph of edges between the sources and the q-nodes.

An edge (r(a), t(b)) whenever b is an ancestor of a in T_n.

An edge (s(a), t(b)) whenever a and b are cousins in T_n.

All edges have capacity ∞, except the edges (q(i), r(i)), which have capacity 2^{-n}.

The demand of the commodity whose source is at depth i is 2^{-i}.

Properties of H_n
Lemma: iM(H_n) = Ω(1).
Lemma: NCR < 1/n.
Corollary: the iMeagerness gap is n = Θ(log |V|) (since |V| = Θ(2^n)).

Properties of H_n
Lemma: iM(H_n) = Ω(1).
Lemma: NCR < 1/n. (We will prove this.)
Corollary: the iMeagerness gap is n = Θ(log |V|).

Proof Ingredients
- Entropy moneybags, i.e., sets of random variables
- Entropy investments: buying sources and edges and putting them into a moneybag; loans may be necessary
- Profit: via downstreamness or informational dominance, earn new sources or edges for the moneybag
- Corporate mergers: via submodularity; new investment opportunities and debt consolidation
- Debt repayment

Submodularity of Entropy
Claim: Let A and B be sets of random variables. Then H(A) + H(B) ≥ H(A ∪ B) + H(A ∩ B).
Pf: Equivalent to I(X; Y | Z) ≥ 0.
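Spelling out the equivalence: write X = A \ B, Y = B \ A, Z = A ∩ B. Then

\begin{align*}
H(A) + H(B) - H(A \cup B) - H(A \cap B)
&= H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z) \\
&= I(X; Y \mid Z) \;\ge\; 0 .
\end{align*}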

Lemma: NCR < 1/n.
Proof: Use two entropy moneybags for each node a:
F(a) = { S(b) : b is not an ancestor of a }
E(a) = F(a) ∪ { (q(b), r(b)) : b is a descendant of a }

Entropy Investment
Let a be a leaf of T_n. Take a loan and buy E(a).

Earning Profit
Claim: E(a) ⇒ T(a).
Pf: The cousin-edges into t(a) are not from ancestors of a (so their sources lie in F(a)), and the vertex r(00) is blocked by E(a).

Earning Profit (continued)
Claim: E(a) ⇒ T(a). Result: E(a) gives a free upgrade to E(a) ∪ {S(a)}. Profit = S(a).
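In entropy terms, the "free upgrade" is the chain (using E(a) ⇒ T(a) and the requirement that the sink T(a) decode S(a)):

\[
H\big(E(a)\big) = H\big(E(a) \cup \{T(a)\}\big) = H\big(E(a) \cup \{T(a), S(a)\}\big) \ge H\big(E(a) \cup \{S(a)\}\big) \ge H\big(E(a)\big),
\]

so H(E(a)) = H(E(a) ∪ {S(a)}): the source S(a) joins the moneybag at no extra entropy cost.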

[Figure: two copies of H_2, one highlighting E(a_L) ∪ {S(a_L)} and the other E(a_R) ∪ {S(a_R)}, for sibling leaves a_L and a_R.]

Applying submodularity
[Figure: the two copies again, now showing (E(a_L) ∪ {S(a_L)}) ∪ (E(a_R) ∪ {S(a_R)}) and (E(a_L) ∪ {S(a_L)}) ∩ (E(a_R) ∪ {S(a_R)}).]

New Investment
The union term (E(a_L) ∪ {S(a_L)}) ∪ (E(a_R) ∪ {S(a_R)}) = E(a) has more edges, so we can use downstreamness or informational dominance again!

Debt Consolidation
The intersection term (E(a_L) ∪ {S(a_L)}) ∩ (E(a_R) ∪ {S(a_R)}) = F(a) has only sources, so it cannot earn new profit; it is used for later "debt repayment".

What have we shown?
Let a_L, a_R be sibling leaves and let a be their parent. Then H(E(a_L)) + H(E(a_R)) ≥ H(E(a)) + H(F(a)).
Iterate this and sum over all nodes in the tree, where ε is the root.
Note: E(v) = F(v) ∪ {(q(v), r(v))} when v is a leaf.
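One way to carry out the "iterate and sum" step (my reconstruction of the omitted display): apply the inequality at every internal node a with children a_L, a_R, sum, and cancel the terms H(E(v)) for internal non-root v. Writing ε for the root, this leaves

\[
\sum_{\text{leaves } v} H\big(E(v)\big) \;\ge\; H\big(E(\varepsilon)\big) + \sum_{\text{internal } a} H\big(F(a)\big).
\]

By the note, for a leaf v we have H(E(v)) ≤ H(F(v)) + 2^{-n}, since the extra edge (q(v), r(v)) has capacity 2^{-n}; summing over the 2^n leaves costs at most 1 in total, and the remaining comparison involves only the source-only sets F(·), consistent with the counting argument mentioned next.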

Debt Repayment
Claim: …
Pf: Simple counting argument.

Finishing up
Combining the previous inequalities, where α denotes the rate of the solution: α < 1/n.