
1 Network Coding: Mixin' it up. Sidharth Jaggi, Michelle Effros, Michael Langberg, Tracey Ho, Philip Chou, Kamal Jain, Muriel Médard, Peter Sanders, Ludo Tolhuizen, Sebastian Egner

2 Network Coding. R. Ahlswede, N. Cai, S.-Y. R. Li, and R. W. Yeung, "Network information flow," IEEE Trans. on Information Theory, vol. 46, pp. 1204-1216, 2000. Bibliography: http://tesla.csl.uiuc.edu/~koetter/NWC/Bibliography.html (131 papers as of last night, in roughly 2 years). NetCod Workshop, DIMACS working group, ISIT 2005 (4+ sessions). Several patents, theses.

3 "The core notion of network coding is to allow and encourage mixing of data at intermediate network nodes." (Network Coding homepage) But... what is it?

4 Point-to-point flows. (Figure: a single source s and sink t with min-cut C.) Min-cut Max-flow (Menger's) Theorem [M27]; Ford-Fulkerson Algorithm [FF62].
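The max-flow computation behind the theorem can be sketched as follows; this is a minimal Edmonds-Karp variant of Ford-Fulkerson (shortest augmenting paths via BFS), not the talk's own code, and the example graph is illustrative:

```python
from collections import deque

def max_flow(capacity, s, t):
    """Edmonds-Karp: repeatedly push flow along shortest augmenting paths.
    capacity: dict mapping edge (u, v) -> capacity."""
    residual = dict(capacity)
    adj = {}
    for (u, v) in capacity:
        residual.setdefault((v, u), 0)   # reverse edges for the residual graph
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    flow = 0
    while True:
        # BFS for a shortest s-t path with residual capacity.
        parent = {s: None}
        q = deque([s])
        while q and t not in parent:
            u = q.popleft()
            for v in adj.get(u, ()):
                if v not in parent and residual[(u, v)] > 0:
                    parent[v] = u
                    q.append(v)
        if t not in parent:
            return flow  # no augmenting path left: flow equals the min cut
        # Walk back from t, find the bottleneck, then augment.
        path, v = [], t
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        bottleneck = min(residual[e] for e in path)
        for (u, v) in path:
            residual[(u, v)] -= bottleneck
            residual[(v, u)] += bottleneck
        flow += bottleneck

cap = {('s', 'a'): 3, ('s', 'b'): 2, ('a', 't'): 2, ('b', 't'): 3, ('a', 'b'): 1}
print(max_flow(cap, 's', 't'))  # 5: the cut around s has capacity 3 + 2
```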

5 Multicasting. Applications: webcasting, P2P networks, sensor networks. (Figure: sources s_1, ..., s_|S| multicast through a network to terminals t_1, ..., t_|T|.)

6 Justifications revisited - I: Throughput. (Figure: the butterfly network [ACLY00]. Source s sends bits b_1 and b_2 toward sinks t_1 and t_2; the shared bottleneck edge carries the mix b_1 + b_2, and each sink combines it with the uncoded bit it receives directly, so both sinks recover (b_1, b_2).)
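The figure's coding trick can be checked with toy bit arithmetic; over F_2 the sum b_1 + b_2 is just XOR, so a two-line sketch suffices (not the talk's code, just an illustration):

```python
# Butterfly network: source s wants both sinks t1 and t2 to get (b1, b2),
# but the middle edge can carry only one bit. Coding: send b1 XOR b2 there.
b1, b2 = 1, 0

bottleneck = b1 ^ b2           # coded packet on the shared bottleneck edge

# Sink t1 hears b1 directly plus the coded packet:
t1 = (b1, b1 ^ bottleneck)     # second component recovers b2
# Sink t2 hears b2 directly plus the coded packet:
t2 = (b2 ^ bottleneck, b2)     # first component recovers b1

assert t1 == (b1, b2) and t2 == (b1, b2)  # both sinks decode both bits
```

Pure routing must choose which of b_1, b_2 the bottleneck forwards, so one sink is always short a bit; mixing serves both at once.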

7 Gap without coding. (Figure: example due to Sanders et al., collaborators.) Coding capacity = h, while routing capacity ≤ 2, so the gap grows with h.

8 Multicasting. Upper bound for the multicast capacity C: C ≤ min_i {C_i}, where C_i is the min-cut to sink t_i. [ACLY00]: this bound is achievable! [LYC02]: linear codes suffice!! [KM01]: "finite field" linear codes suffice!!!

9 Multicasting: F(2^m)-linear networks [KM01]. Source: group together m bits into one symbol. Every node: output linear combinations β_1 b_1 + ... + β_k b_k of its incoming symbols b_1, ..., b_k over the finite field F(2^m).
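A node's operation can be sketched for m = 8. The slides leave the field representation unspecified; the reduction polynomial x^8 + x^4 + x^3 + x + 1 (0x11B, the AES choice) is one illustrative irreducible polynomial, and the coefficients below are arbitrary:

```python
def gf256_mul(a, b, poly=0x11B):
    """Multiply in GF(2^8): carry-less shift-and-add, reducing by an
    irreducible degree-8 polynomial (here x^8+x^4+x^3+x+1)."""
    result = 0
    while b:
        if b & 1:
            result ^= a        # addition in GF(2^m) is bitwise XOR
        b >>= 1
        a <<= 1
        if a & 0x100:          # degree reached 8: reduce
            a ^= poly
    return result

def node_output(coeffs, packets):
    """One coded symbol at a node: sum of beta_i * b_i over GF(2^8)."""
    out = 0
    for beta, b in zip(coeffs, packets):
        out ^= gf256_mul(beta, b)
    return out

print(node_output([1, 1], [5, 3]))  # with unit coefficients: 5 XOR 3 = 6
```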

10 Multicasting (continued). Upper bound for multicast capacity C: C ≤ min_i {C_i}. [ACLY00]: achievable! [LYC02]: linear codes suffice!! [KM01]: "finite field" linear codes suffice!!! [JCJ03], [SET03]: polynomial-time code design!!!!

11 Thms: Deterministic Codes. For m ≥ log(|T|), there exists an F(2^m)-linear network code, and it can be designed in O(|E||T|C(C+|T|)) time [JCJ03], [SET03]. There exist networks for which the minimum m ≈ 0.5 log(|T|) [JCJ03], [LL03].

12 Justifications revisited - II: Robustness/distributed design. (Figure: the butterfly network with one link broken.)

13 Justifications revisited - II: Robustness/distributed design. (Figure: the butterfly network coding b_1 + b_2 on one path and b_1 + 2b_2 on another, using finite-field arithmetic. Each sink still receives two independent combinations of (b_1, b_2), so decoding survives the failure.)

14 Thm: Random Robust Codes. (Figure: original network with sink min-cuts C_1, ..., C_|T|.) C = min_i {C_i}.

15 Thm: Random Robust Codes. (Figure: faulty network with reduced min-cuts C_1', ..., C_|T|'.) C' = min_i {C_i'}. If the value of C' is known to s, the same code can achieve rate C'! (Interior nodes are oblivious to the faults.)

16 Thm: Random Robust Codes. For m sufficiently large and rate R < C: choose the coefficients [β] at each node at random. The probability over [β] that the code works is > 1 - |E||T|2^(-m(C-R)+|V|) [JCJ03], [HKMKE03] (different notions of linearity). This gives decentralized design, versus a probability of error: a necessary evil? Much "sparser" linear operations on the bits b_1, ..., b_m are also possible, O(m) instead of O(m^2) [JCE06?].
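The random-design claim can be sketched on the butterfly topology. Two hedges: the slides use F(2^m), while this sketch uses a small prime field so arithmetic stays plain modular integers, and the topology and field size are illustrative only:

```python
import random

Q = 257  # prime field size; the talk's codes live in GF(2^m) instead

def random_butterfly_code(rng):
    """Pick random coefficients on the butterfly network; return the 2x2
    matrices of global coding vectors seen at the two sinks."""
    # Source emits random mixes of (x1, x2): vector u on t1's direct path,
    # vector v on t2's direct path; the bottleneck node remixes u and v.
    u = [rng.randrange(Q), rng.randrange(Q)]
    v = [rng.randrange(Q), rng.randrange(Q)]
    a, b = rng.randrange(Q), rng.randrange(Q)
    w = [(a * u[i] + b * v[i]) % Q for i in range(2)]  # bottleneck vector
    return [u, w], [v, w]   # rows received by sink t1 and by sink t2

def decodable(mat):
    """A sink decodes iff its 2x2 coefficient matrix is invertible."""
    return (mat[0][0] * mat[1][1] - mat[0][1] * mat[1][0]) % Q != 0

rng = random.Random(0)
trials = 2000
ok = sum(decodable(m1) and decodable(m2)
         for m1, m2 in (random_butterfly_code(rng) for _ in range(trials)))
print(ok / trials)  # close to 1: failure probability shrinks like O(1/Q)
```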

17 Zero-error Decentralized Codes. No a priori network topological information is available; information can only be percolated down links. Desired: zero-error code design. One additional resource: each node v_i has a unique ID number i (GPS coordinates/IP address/...). Need to use yet other types of linear codes [JHE06?].

18 Inter-relationships between notions of linearity [JEHM04]. (Figure: a taxonomy relating code classes. Legend: A = algebraic, B = block, C = convolutional; M = multicast, G = general connections; a = acyclic; local vs. global linearity, with local I/O equal or unequal; some reductions incur an ε rate loss, and some combinations do not exist.)


20 Justifications revisited - III: Security. An evil adversary hiding in the network, eavesdropping and injecting false information [JLHE05].

21 Multicasting: simplifying assumptions (for this talk). Single source. Directed, acyclic graph. Each link has unit capacity. Links have zero delay.

22 Kinds of linearity. (Figure: algebraic codes combine the symbols b_1, ..., b_m with coefficients β_1, ..., β_k; block codes operate on blocks b_0, ..., b_{m-1}; convolutional codes operate on streams b_0, b_1, ...)

23 Model 1 - Results. (Plot: capacity C vs. noise parameter p, with a transition at p = 0.5.)

24 Model 1 - Encoding. Each edge i carries its data T_i (n(1-ε) symbols), a random symbol r_i, and hash symbols D_i1, ..., D_i|E| (the remaining nε symbols), where D_ij = T_j(1)·1 + T_j(2)·r_i + ... + T_j(n(1-ε))·r_i^(n(1-ε)).

25 Model 1 - Encoding (continued: the same hash matrix D, now viewed along row i, i.e. the hashes edge i computes of every other edge's data).

26 Model 1 - Transmission. The adversary may alter what each edge carries: the receiver sees possibly corrupted versions T_1', ..., T_|E|', r_1', ..., r_|E|', and D_11', ..., D_|E||E|'.

27 Model 1 - Decoding. "Quick consistency check": does D_ij' = T_j'(1)·1 + T_j'(2)·r_i' + ... + T_j'(n(1-ε))·r_i'^(n(1-ε))?

28 Model 1 - Decoding. Check both directions: D_ij' = T_j'(1)·1 + T_j'(2)·r_i' + ... + T_j'(n(1-ε))·r_i'^(n(1-ε)), and D_ji' = T_i'(1)·1 + T_i'(2)·r_j' + ... + T_i'(n(1-ε))·r_j'^(n(1-ε))?

29 Model 1 - Decoding. Call edge i consistent with edge j if both checks pass: D_ij' = T_j'(1)·1 + ... + T_j'(n(1-ε))·r_i'^(n(1-ε)) and D_ji' = T_i'(1)·1 + ... + T_i'(n(1-ε))·r_j'^(n(1-ε)). This defines a consistency graph on the edges.

30 Model 1 - Decoding. (Figure: a consistency graph on edges 1-5, built from the pairwise checks; self-loops are not important.)

31 Model 1 - Decoding. Detection: select vertices connected to at least |E|/2 other vertices in the consistency graph. Decode using the T_i's on the corresponding edges.

32 Model 1 - Proof. If the adversary changes T_j to T_j' but the check D_ij = T_j'(1)·1 + T_j'(2)·r_i + ... + T_j'(n(1-ε))·r_i^(n(1-ε)) still passes against the honest D_ij = T_j(1)·1 + T_j(2)·r_i + ... + T_j(n(1-ε))·r_i^(n(1-ε)), then Σ_k (T_j(k) - T_j'(k))·r_i^k = 0. This is a nonzero polynomial in r_i of degree at most n over F_q, and the value of r_i is unknown to Zorba, so the probability of error is < n/q << 1.
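The hash and check can be sketched as polynomial evaluation over a prime field (the talk's F_q with q >> n; the field size, packet length, and tamper position below are all illustrative):

```python
import random

Q = 2_000_003  # prime field size q >> n, so the error bound n/q is tiny

def poly_hash(packet, r):
    """D = T(1)*1 + T(2)*r + ... + T(n)*r^(n-1) over F_q: the consistency
    value an edge attaches for a packet T, evaluated at its secret r."""
    acc, power = 0, 1
    for symbol in packet:
        acc = (acc + symbol * power) % Q
        power = (power * r) % Q
    return acc

rng = random.Random(1)
n = 1000
packet = [rng.randrange(Q) for _ in range(n)]
r = rng.randrange(1, Q)   # secret nonzero evaluation point, hidden from Zorba
d = poly_hash(packet, r)

# The honest packet passes the consistency check.
assert poly_hash(packet, r) == d

# A tampered packet passes only if r is a root of the nonzero difference
# polynomial, which happens with probability < n/Q (Schwartz-Zippel).
tampered = list(packet)
tampered[7] = (tampered[7] + 1) % Q
assert poly_hash(tampered, r) != d
```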

33 Greater throughput, robust against random errors... Aha! Network Coding!!!


36 Xavier, Yvonne, and Zorba. (Figure: Xavier sends to Yvonne across a network; Zorba lurks on unknown links.)

37 Unicast: order of play. 1. Code (X, Y, Z). 2. Message (X, Z). 3. Bad links (Z). 4. Coin (X). 5. Transmission (Y, Z). 6. Decode correctly (Y). Eureka!

38 Unicast setup. Xavier sends to Yvonne over |E| directed unit-capacity links. Zorba (hidden to Xavier and Yvonne) controls |Z| links Z; p = |Z|/|E|. Xavier and Yvonne share no resources (no private key, no common randomness). Zorba is computationally unbounded; Xavier and Yvonne can only perform "simple" computations. Zorba knows the protocols and already knows almost all of Xavier's message (everything except Xavier's private coin tosses). Goal: transmit at "high" rate and decode correctly w.h.p.

39 Background: noisy channel models (Shannon, ...). Binary Symmetric Channel: each bit flipped with probability p. (Plot: capacity C = 1 - H(p) vs. p, reaching 0 at p = 0.5.)

40 Background: noisy channel models (Shannon, ...). Binary Erasure Channel: each bit erased (replaced by E) with probability p. (Plot: capacity C = 1 - p vs. p.)
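The two capacity curves behind these plots, C = 1 - H(p) for the BSC and C = 1 - p for the BEC, as a small sketch:

```python
from math import log2

def h2(p):
    """Binary entropy H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p):
    """Binary Symmetric Channel: C = 1 - H(p); zero at p = 0.5."""
    return 1 - h2(p)

def bec_capacity(p):
    """Binary Erasure Channel: C = 1 - p."""
    return 1 - p

print(bsc_capacity(0.5))  # 0.0: a fully random channel carries nothing
```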

41 Background: adversarial channel models. "Limited-flip" adversary (Hamming, Gilbert-Varshamov, McEliece et al. ...); shared randomness, private key, computationally bounded adversary... (Plot: C vs. p.)

42 Unicast - Results. (Plot: capacity C vs. noise parameter p, showing the line 1 - p.)

43 Unicast - Results. (Plot: C vs. p; the achievable region between p = 0 and p = 0.5 is marked with question marks, and C = 0 beyond.)

44 Unicast - Results. (Just for this talk, Zorba is causal.) (Plot: C vs. p.)

45 Ignorant Zorba: order of play. 1. Code (X, Y, Z). 2. Message X_p, X_s (X). 3. Bad links (Z). 4. Coin (X). 5. Transmission (Y, Z). 6. Decode correctly (Y, Z). Here I(Z; X_s) = 0: Zorba knows nothing about the secret part X_s. Eureka!

46 General Multicast Networks. p = |Z|/h, where h is the multicast capacity. (Plot: capacity normalized by h vs. p, with a transition at 0.5.) (Figure: source S, adversary Z inside the network, receivers R_1, ..., R_|T|.) Slightly more intricate proof.

47 Unicast - Encoding. (Figure: of the |E| links, only |E| - |Z| are uncorrupted, so the code targets |E| - |Z| useful symbols.)

48 Unicast - Encoding. Work with block length n over the finite field F_q. An MDS code given by a Vandermonde matrix maps the |E| - |Z| message rows x_1, ..., x_{|E|-|Z|} to |E| coded rows T_1, ..., T_|E|, each of length n(1-ε). The rate fudge-factor ε leaves nε symbols of F_q per edge for "easy to use consistency information".
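The Vandermonde/MDS step can be sketched over a prime field (the talk's F_q; the field size 929 and toy parameters |E| = 7, |Z| = 4, hence k = |E| - |Z| = 3, are illustrative). Any k of the |E| coded symbols determine the message, because every k x k Vandermonde submatrix at distinct points is invertible:

```python
Q = 929  # prime field; the slides work over F_q with a Vandermonde matrix

def mds_encode(msg, n):
    """Evaluate the message polynomial at n distinct points i = 1..n:
    x_i = msg[0] + msg[1]*i + ... + msg[k-1]*i^(k-1) mod Q."""
    return [sum(m * pow(i, j, Q) for j, m in enumerate(msg)) % Q
            for i in range(1, n + 1)]

def mds_decode(points, k):
    """Recover the k message symbols from any k (position, value) pairs
    by Gauss-Jordan elimination on the Vandermonde system mod Q."""
    rows = [[pow(i, j, Q) for j in range(k)] + [v] for i, v in points[:k]]
    for col in range(k):
        piv = next(r for r in range(col, k) if rows[r][col])
        rows[col], rows[piv] = rows[piv], rows[col]
        inv = pow(rows[col][col], Q - 2, Q)      # Fermat inverse mod prime Q
        rows[col] = [x * inv % Q for x in rows[col]]
        for r in range(k):
            if r != col and rows[r][col]:
                f = rows[r][col]
                rows[r] = [(a - f * b) % Q for a, b in zip(rows[r], rows[col])]
    return [rows[j][k] for j in range(k)]

msg = [5, 17, 400]                   # k = |E| - |Z| = 3 message symbols
coded = mds_encode(msg, 7)           # |E| = 7 coded symbols, one per edge
survivors = [(i + 1, coded[i]) for i in (0, 3, 6)]  # any 3 symbols suffice
assert mds_decode(survivors, 3) == msg
```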

49 Unicast - Encoding. Append a single random symbol r and hashes D_1, ..., D_|E| (the nε consistency symbols), where D_i = T_i(1)·1 + T_i(2)·r + ... + T_i(n(1-ε))·r^(n(1-ε)).

50 Unicast - Transmission. The receiver sees possibly corrupted versions T_1', ..., T_|E|', r', and D_1', ..., D_|E|'.

51 Unicast - Quick Decoding. Choose the majority value of (r, D_1, ..., D_|E|) across links. For each i, check whether D_i = T_i'(1)·1 + T_i'(2)·r + ... + T_i'(n(1-ε))·r^(n(1-ε)); if so, accept T_i, else reject it. A forged T_i' passes only if Σ_k (T_i(k) - T_i'(k))·r^k = 0, a polynomial in r of degree at most n over F_q with r unknown to Zorba, so the probability of error is < n/q << 1. Use the accepted T_i's to decode.

52 General Multicast Networks. (Figure: the multicast network again, with the adversary's links unknown.)

53 General Multicast Networks. p = |Z|/h. (Plot: capacity normalized by h vs. p.) Observation: can treat the adversaries as new sources S'_1, S'_2, ..., S'_|Z| alongside S, feeding receivers R_1, ..., R_|T|.

54 General Multicast Networks. Without the adversary, receiver i sees y_i = T_i x, where x is the source input and T_i is the network transfer matrix to receiver R_i.

55 General Multicast Networks. With adversarial injections a_i, receiver i sees y_i' = T_i x + T_i' a_i. The source symbols (x(1), x(2), ..., x(n)) span an R-dimensional subspace X, and the injected symbols (a_i(1), a_i(2), ..., a_i(n)) span a |Z|-dimensional subspace A_i. W.h.p. over the network code design, TX and TA_i do not intersect (robust codes...), so w.h.p. over x, (y(1), y(2), ..., y(R+|Z|)) forms a basis for TX ⊕ TA_i. But the receiver already knows a basis for TX, and therefore can obtain a basis for TA_i.

56 Variations - Feedback. (Plot: C vs. p.)

57 Variations - Know thy enemy. (Plots: C vs. p under two different knowledge assumptions.)

58 Variations - Omniscient but not Omnipresent. Achievability: Gilbert-Varshamov, Algebraic Geometry codes. Converse: generalized MRRW bound. (Plot: C vs. p, with C = 0 beyond p = 0.5.)

59 Variations - Random Noise: SEPARATION. (Plot: C vs. p against the noisy-channel capacity C_N.)

60 Ignorant Zorba - Results. (Plot: rate vs. p. The secret part X_s achieves rate 1 - 2p; the combined message X_p + X_s reaches rate 1.)

61 Ignorant Zorba - Results. (Plot as before.) The construction uses an MDS code: e.g., send the Vandermonde combinations a + b + c, a + 2b + 4c, a + 3b + 9c of the message symbols a, b, c.

62 Overview of results. Centralized design: deterministic. Decentralized design: randomized; deterministic. Complexity: lower bounds; sparse codes. Types of linearity: interrelationships. Adversaries.

63 THE END

