Tighter Cut-Based Bounds for k-pairs Communication Problems
Nick Harvey, Robert Kleinberg
Overview:
- Definitions
- The sparsity and meagerness bounds; show these bounds are very loose
- Define informational meagerness, based on informational dominance; show that it can still be slightly loose
k-pairs Communication Problem
[Figure: sources S(1), S(2) emitting messages M_1, M_2; one edge carries M_1 ⊕ M_2; sinks T(2), T(1).]
Concurrent Rate. Source i desires communication rate d_i. Rate r is achievable if the rate vector [r·d_1, r·d_2, …, r·d_k] is achievable. The rate region is an interval of R_+. Def: the "network coding rate" (NCR) := sup { r : r is achievable }.
k-pairs Communication Problem (example). With d_1 = d_2 = 1 and c_e = 1 for every e ∈ E, rate 1 is achievable. [Figure: the same network as on the previous slide.]
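A toy sketch (ours, not from the talk) of the coding solution suggested by the figure's labels, assuming the standard layout in which the shared edge carries M_1 ⊕ M_2 and each sink also hears the other source directly:

```python
# Toy simulation: the bottleneck edge carries M1 XOR M2, each sink also hears the
# *other* source directly, and both sinks decode their own message -- so every edge
# carries one bit per time step and rate 1 is achieved.
import itertools

def simulate(m1, m2):
    """Route one-bit messages: m1 is demanded by T(1), m2 by T(2)."""
    bottleneck = m1 ^ m2          # the only edge shared by both commodities
    at_t1 = bottleneck ^ m2       # T(1) hears m2 directly and recovers m1
    at_t2 = bottleneck ^ m1       # T(2) hears m1 directly and recovers m2
    return at_t1, at_t2

for m1, m2 in itertools.product((0, 1), repeat=2):
    assert simulate(m1, m2) == (m1, m2)
print("both sinks decode correctly for every message pair")
```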
Upper bounds on rate:
- [Classical]: sparsity bound for multicommodity flows
- [CT91]: general bound for multicommodity information networks
- [B02]: application of CT91 to directed network coding instances; equivalent to sparsity
- [KS03]: bound for undirected networks with arbitrary two-way channels
- [HKL04]: meagerness
- [SYC03], [HKL05]: LP bound
- [KS05]: bound based on iterative d-separation
Vertex-Sparsity. Def: For U ⊆ V, VS(U) := (capacity of edges crossing between U and Ū) / (demand of commodities separated by U), and VS(G) := min_{U ⊆ V} VS(U). Claim: NCR ≤ VS(G).
Edge-Sparsity. Def: For A ⊆ E, ES(A) := (capacity of edges in A) / (demand of commodities separated in G\A), and ES(G) := min_{A ⊆ E} ES(A). Claim: Max-Flow ≤ ES(G). But sometimes NCR > ES(G).
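For tiny instances, ES(G) can be computed by brute force; a minimal sketch (our code, exponential-time, for illustration only):

```python
# Brute-force edge-sparsity: ES(A) = capacity(A) / (demand of commodities whose
# source-sink connectivity is destroyed in G \ A); ES(G) = min over such A.
from itertools import combinations

def reachable(edges, src):
    """Set of vertices reachable from src along the directed edges."""
    seen, stack = {src}, [src]
    while stack:
        u = stack.pop()
        for a, b in edges:
            if a == u and b not in seen:
                seen.add(b)
                stack.append(b)
    return seen

def edge_sparsity(edges, capacity, commodities, demand):
    """edges: list of (u, v); commodities[i] = (S(i), T(i)); returns ES(G)."""
    best = float("inf")
    for k in range(1, len(edges) + 1):
        for A in combinations(edges, k):
            rest = [e for e in edges if e not in A]
            separated = [i for i, (s, t) in enumerate(commodities)
                         if t not in reachable(rest, s)]
            if separated:
                ratio = sum(capacity[e] for e in A) / sum(demand[i] for i in separated)
                best = min(best, ratio)
    return best
```

On the example of the next slide, the single bottleneck edge already attains the minimum: capacity 1 over separated demand 1 + 1 gives 1/2.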
NCR > Edge-Sparsity. [Figure: the example network again, with its bottleneck edge e marked.] The cut {e} separates both commodities, so ES({e}) = 1/2. But rate 1 is achievable!
Meagerness. Def: For A ⊆ E and P ⊆ [k], A isolates P if for all i, j ∈ P, S(i) and T(j) are disconnected in G\A. M(A) := min_{P isolated by A} (capacity of edges in A) / (demand of commodities in P), and M(G) := min_{A ⊆ E} M(A). Claim: NCR ≤ M(G).
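The same kind of exhaustive search computes M(G) on tiny instances; a sketch (ours), mirroring the definition directly:

```python
# Brute-force meagerness for tiny directed instances (exponential; illustration only).
from itertools import combinations

def reachable(edges, src):                     # same helper as in the edge-sparsity sketch
    seen, stack = {src}, [src]
    while stack:
        u = stack.pop()
        for a, b in edges:
            if a == u and b not in seen:
                seen.add(b)
                stack.append(b)
    return seen

def meagerness(edges, capacity, commodities, demand):
    """commodities[i] = (S(i), T(i)); returns M(G)."""
    k = len(commodities)
    best = float("inf")
    for na in range(1, len(edges) + 1):
        for A in combinations(edges, na):
            remaining = [e for e in edges if e not in A]
            reach = {i: reachable(remaining, commodities[i][0]) for i in range(k)}
            cap = sum(capacity[e] for e in A)
            for size in range(1, k + 1):
                for P in combinations(range(k), size):
                    # A isolates P: no S(i) reaches any T(j), for all i, j in P.
                    if all(commodities[j][1] not in reach[i] for i in P for j in P):
                        best = min(best, cap / sum(demand[i] for i in P))
    return best
```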
Meagerness & Vertex-Sparsity are weak. Thm: M(G_n) = VS(G_n) = Ω(1), but NCR ≤ 1/n. [Figure: the graph G_n, with sources S(1),…,S(n), sinks T(1),…,T(n), and edges f_i, g_i, h_i.]
A Proof Tool. Def: Let A, B ⊆ E. B is downstream of A if B is disconnected from the sources in G\A. Notation: A ⇒ B. Claim: If A ⇒ B then H(A) ≥ H(A,B). Pf: Because sources → A → B form a Markov chain.
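A minimal sketch (ours) of the downstreamness test just defined: B is downstream of A exactly when, after deleting the edges in A, nothing in B is reachable from any source. An edge is treated as reachable through its tail; the function name and graph representation are ours.

```python
# Downstreamness check:  A => B  iff B is disconnected from the sources in G \ A.
def is_downstream(edges, sources, A, B):
    """edges: list of directed edges (u, v); A: collection of edges to delete;
    B: collection of edges and/or vertices (assumed disjoint from A)."""
    remaining = [e for e in edges if e not in A]
    seen = set(sources)
    stack = list(sources)
    while stack:                               # forward reachability from all sources in G \ A
        u = stack.pop()
        for tail, head in remaining:
            if tail == u and head not in seen:
                seen.add(head)
                stack.append(head)
    def cut_off(x):
        if isinstance(x, tuple):               # an edge is cut off iff its tail is unreachable
            return x[0] not in seen
        return x not in seen                   # a vertex is cut off iff it is unreachable
    return all(cut_off(x) for x in B)
```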
Lemma: NCR ≤ 1/n for G_n.
Proof: a chain of downstreamness steps,
{g_n} ⇒ {g_n, T(1), h_1} ⇒ {S(1), f_1, g_1, h_1} ⇒ {S(1), f_1, T(2), h_2} ⇒ {S(1), S(2), f_2, g_2, h_2} ⇒ {S(1), S(2), f_2, T(3), h_3} ⇒ … ⇒ {S(1), S(2), …, S(n)}.
Thus 1 ≥ H(g_n) ≥ H(S(1), …, S(n)) = n·r, so r ≤ 1/n.
[Figure: the graph G_n, with the current set highlighted at each step of the chain.]
Towards a stronger bound. Our focus: cut-based bounds. Given A ⊆ E, we want to infer that H(A) ≥ H(A,P), where P ⊆ {S(1),…,S(k)}. Meagerness uses Markovicity: (sources in P) → A → (sinks in P). But Markovicity is sometimes not enough…
Informational Dominance. Def: A dominates B if the information in A determines the information in B in every network coding solution. Denoted A →_i B. This trivially implies H(A) ≥ H(A,B). How do we determine whether A dominates B? [HKL05] give a combinatorial characterization and an efficient algorithm to test whether A →_i B.
Informational Meagerness. Def: For A ⊆ E and P ⊆ {S(1),…,S(k)}, A informationally isolates P if A ∪ P̄ →_i P (where P̄ denotes the sources not in P). iM(A) := min over P informationally isolated by A of (capacity of edges in A) / (demand of commodities in P), and iM(G) := min_{A ⊆ E} iM(A). Claim: NCR ≤ iM(G).
iMeagerness Example. "Obviously" NCR = 1. But no two edges disconnect t_1 and t_2 from both sources! [Figure: network with sources s_1, s_2 and sinks t_1, t_2.]
iMeagerness Example (continued). After removing A, there is still a path from s_2 to t_1! [Figure: the same network, with the cut A marked.]
Informational Dominance Example. Our characterization shows A →_i {t_1, t_2}, hence H(A) ≥ H(t_1, t_2), and iM(G) = 1. [Figure: the same network, with the cut A marked.]
A bad example: H_n. Thm: the iMeagerness gap of H_n is Ω(log |V|). [Figure: H_2, with sources s(·), nodes q(·) and r(·), sinks t(·); the edges (q(i), r(i)) have capacity 2^{-n}.]
Construction of H_n (a code sketch follows below):
- T_n = the binary tree of depth n.
- A source S(i) and a sink T(i) for every node i ∈ T_n.
- Nodes q(i) and r(i) for every leaf i of T_n.
- A complete bipartite graph of edges from the sources to the q's.
- An edge (r(a), t(b)) whenever b is an ancestor of a in T_n.
- An edge (s(a), t(b)) whenever a and b are cousins in T_n.
- All edges have capacity ∞, except the edges (q(i), r(i)), which have capacity 2^{-n}.
- The demand of the source at depth i is 2^{-i}.
[Figure: the construction, drawn for n = 2.]
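A sketch (ours) of the construction just described. Three readings are assumptions on our part: the non-bottleneck edges get capacity ∞, "ancestor" includes the node itself (so each leaf sink t(a) hears r(a)), and "a and b are cousins" means neither is an ancestor of the other; the function and variable names are ours.

```python
# Sketch of the H_n construction from the preceding slides (assumptions noted above).
from itertools import product
from math import inf

def tree_nodes(n):
    """Nodes of the binary tree T_n, encoded as binary strings; '' is the root."""
    return ["".join(bits) for d in range(n + 1) for bits in product("01", repeat=d)]

def is_ancestor(b, a):
    """b is an ancestor of a (read as including a itself): b is a prefix of a."""
    return a.startswith(b)

def build_H(n):
    nodes = tree_nodes(n)
    leaves = [v for v in nodes if len(v) == n]
    edges = {}                                        # (tail, head) -> capacity
    for a in nodes:                                   # complete bipartite graph: sources -> q's
        for i in leaves:
            edges[(f"s({a})", f"q({i})")] = inf
    for i in leaves:                                  # the only finite-capacity edges
        edges[(f"q({i})", f"r({i})")] = 2.0 ** (-n)
    for a in leaves:                                  # (r(a), t(b)) whenever b is an ancestor of a
        for b in nodes:
            if is_ancestor(b, a):
                edges[(f"r({a})", f"t({b})")] = inf
    for a in nodes:                                   # (s(a), t(b)) whenever a and b are cousins
        for b in nodes:
            if not is_ancestor(a, b) and not is_ancestor(b, a):
                edges[(f"s({a})", f"t({b})")] = inf
    # one commodity per tree node; the demand of the source at depth i is 2^{-i}
    commodities = [(f"s({a})", f"t({a})", 2.0 ** (-len(a))) for a in nodes]
    return edges, commodities
```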
Properties of H_n. Lemma: iM(H_n) = Ω(1). Lemma: NCR < 1/n (we will prove this). Corollary: the iMeagerness gap is n = Ω(log |V|).
Proof Ingredients:
- Entropy moneybags, i.e., sets of random variables.
- Entropy investments: buying sources and edges and putting them into the moneybag; loans may be necessary.
- Profit: via downstreamness or informational dominance, earn new sources or edges for the moneybag.
- Corporate mergers: via submodularity; new investment opportunities and debt consolidation.
- Debt repayment.
Submodularity of Entropy. Claim: Let A and B be sets of random variables. Then H(A) + H(B) ≥ H(A ∪ B) + H(A ∩ B). Pf: Equivalent to I(X; Y | Z) ≥ 0.
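A quick numerical sanity check (ours) of the claim, taking A = {X, Z} and B = {Y, Z} over random joint distributions of three bits, so the inequality becomes H(X,Z) + H(Y,Z) ≥ H(X,Y,Z) + H(Z), i.e., I(X;Y|Z) ≥ 0:

```python
# Numerically check submodularity of entropy on random joint distributions of three bits.
import itertools
import math
import random

def entropy(dist, coords):
    """Shannon entropy (in bits) of the marginal of `dist` on the given coordinates."""
    marg = {}
    for outcome, p in dist.items():
        key = tuple(outcome[i] for i in coords)
        marg[key] = marg.get(key, 0.0) + p
    return -sum(p * math.log2(p) for p in marg.values() if p > 0)

for _ in range(1000):
    weights = [random.random() for _ in range(8)]
    total = sum(weights)
    dist = {xyz: w / total for xyz, w in zip(itertools.product((0, 1), repeat=3), weights)}
    lhs = entropy(dist, (0, 2)) + entropy(dist, (1, 2))      # H(X,Z) + H(Y,Z)
    rhs = entropy(dist, (0, 1, 2)) + entropy(dist, (2,))     # H(X,Y,Z) + H(Z)
    assert lhs >= rhs - 1e-9
print("submodularity held on all random trials")
```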
Lemma: NCR < 1/n. Proof: define two entropy moneybags for each node a of T_n:
F(a) := { S(b) : b is not an ancestor of a }
E(a) := F(a) ∪ { (q(b), r(b)) : b is a descendant of a }
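The two moneybags can be written down concretely in the string encoding of the build_H sketch above. We read "ancestor" as including a itself (our assumption, which is what makes the later upgrade by S(a) non-trivial), and likewise "descendant":

```python
# Moneybag helpers in the tree-of-strings encoding (names and encoding are ours).
from itertools import product

def tree_nodes(n):
    """Nodes of T_n as binary strings; '' is the root (same encoding as build_H above)."""
    return ["".join(bits) for d in range(n + 1) for bits in product("01", repeat=d)]

def F(a, n):
    """{ S(b) : b is not an ancestor of a }, reading 'ancestor' as including a itself."""
    return {f"s({b})" for b in tree_nodes(n) if not a.startswith(b)}

def E(a, n):
    """F(a) together with the bottleneck edges (q(b), r(b)) at the leaves below a."""
    leaves = [v for v in tree_nodes(n) if len(v) == n]
    return F(a, n) | {(f"q({b})", f"r({b})") for b in leaves if b.startswith(a)}

# Sanity checks echoing the upcoming slides: for sibling leaves aL, aR with parent a,
#   (E(aL) | {s(aL)}) | (E(aR) | {s(aR)}) == E(a)   and   (...) & (...) == F(a).
n = 3
aL, aR, a = "000", "001", "00"
assert (E(aL, n) | {f"s({aL})"}) | (E(aR, n) | {f"s({aR})"}) == E(a, n)
assert (E(aL, n) | {f"s({aL})"}) & (E(aR, n) | {f"s({aR})"}) == F(a, n)
```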
Entropy Investment. Let a be a leaf of T_n. Take a loan and buy E(a). [Figure: the moneybag E(a) highlighted for the leaf a = 00.]
Earning Profit. Claim: E(a) →_i T(a). Pf: the cousin-edges into t(a) do not come from ancestors of a, and the vertex r(00) is blocked by E(a). [Figure: the leaf a = 00, with t(00)'s in-edges and the blocked vertex r(00).]
Earning Profit (continued). Claim: E(a) →_i T(a). Result: since T(a) must decode S(a), E(a) gives a free upgrade to E(a) ∪ {S(a)}. Profit = S(a). [Figure: as before, now with S(a) added to the moneybag.]
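One way (ours) to spell out why the upgrade is free, assuming the domination claim E(a) →_i T(a) from the previous slide and the fact that the sink T(a) must decode the message of S(a):

```latex
% Entropy accounting for the free upgrade.
\begin{align*}
H\bigl(E(a)\bigr)
  &= H\bigl(E(a),\, T(a)\bigr)            && \text{since } E(a) \to_i T(a)\\
  &= H\bigl(E(a),\, T(a),\, S(a)\bigr)    && \text{since } T(a) \text{ decodes } S(a)\\
  &\ge H\bigl(E(a) \cup \{S(a)\}\bigr)    && \text{drop } T(a) \text{ (monotonicity)}.
\end{align*}
```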
[Figure: the two moneybags E(a_L) ∪ {S(a_L)} and E(a_R) ∪ {S(a_R)}, for sibling leaves a_L and a_R.]
Applying submodularity:
H(E(a_L) ∪ {S(a_L)}) + H(E(a_R) ∪ {S(a_R)}) ≥ H((E(a_L) ∪ {S(a_L)}) ∪ (E(a_R) ∪ {S(a_R)})) + H((E(a_L) ∪ {S(a_L)}) ∩ (E(a_R) ∪ {S(a_R)})).
[Figure: the union and the intersection of the two moneybags.]
New Investment. The union term has more edges, so downstreamness or informational dominance can be used again!
(E(a_L) ∪ {S(a_L)}) ∪ (E(a_R) ∪ {S(a_R)}) = E(a), where a is the parent of a_L and a_R.
[Figure: the union, highlighted on the graph.]
Debt Consolidation. The intersection term has only sources, so it cannot earn new profit; it is used for later "debt repayment".
(E(a_L) ∪ {S(a_L)}) ∩ (E(a_R) ∪ {S(a_R)}) = F(a).
[Figure: the intersection, highlighted on the graph.]
What have we shown? Let a_L, a_R be sibling leaves and let a be their parent. Then H(E(a_L)) + H(E(a_R)) ≥ H(E(a)) + H(F(a)). Iterate and sum this over all nodes in the tree, where r is the root. (Note: E(v) = F(v) ∪ {(q(v), r(v))} when v is a leaf.)
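A sketch (ours) of how the summation telescopes, assuming the inequality above holds at every internal node a with children a_L, a_R (not only at sibling leaves); r denotes the root:

```latex
% Summing H(E(a_L)) + H(E(a_R)) >= H(E(a)) + H(F(a)) over all internal nodes a:
% every non-root node occurs exactly once as a child on the left,
% every internal node occurs exactly once on the right.
\[
  \sum_{v \neq r} H\bigl(E(v)\bigr)
    \;\ge\; \sum_{\text{internal } a} H\bigl(E(a)\bigr)
          + \sum_{\text{internal } a} H\bigl(F(a)\bigr).
\]
% Cancelling H(E(v)) for every internal node v other than r leaves:
\[
  \sum_{\text{leaf } v} H\bigl(E(v)\bigr)
    \;\ge\; H\bigl(E(r)\bigr) + \sum_{\text{internal } a} H\bigl(F(a)\bigr).
\]
```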
Debt Repayment. Claim: … Pf: simple counting argument.
Finishing up: combining the accumulated inequalities (where α = rate of the solution) yields α < 1/n, i.e., Rate < 1/n.