
Slide 1: Qiwen Wang, Sidharth Jaggi, Shuo-Yen Robert Li, Institute of Network Coding (INC), The Chinese University of Hong Kong. October 19, 2011, IEEE Information Theory Workshop 2011, Paraty, Brazil.

Slide 2: Outline. 1. Motivation; 2. Model; 3. Main Results; 4. Discussion; 5. Conclusion.

Slide 3: Motivation: links have a varying noise level; the bit-flip probability is only known to lie in an interval (p-ε, p+ε).

Slide 4: On top of the varying noise level, errors propagate through the network because intermediate nodes mix and forward packets.

Slide 5: In addition, the coding kernels (the local coding coefficients such as [f_1,3, f_1,5], [f_2,4, f_2,7], [f_3,6, f_4,6], [f_6,8, f_6,9] in the example network figure) are unknown a priori, alongside the varying noise level and error propagation through mix-and-forward.

Slide 6: Alice communicates with Bob over a network whose min-cut is C.

Slide 7: Over a block, Alice injects C packets, each of length n, into the network with min-cut C.

Slide 8: In binary form, Alice transmits a Cm x n matrix X and Bob receives a Cm x n matrix Y.

Slide 9: One packet is n symbols S_1, S_2, ..., S_n over GF(2^m); writing each symbol S_j as its m bits S_j1, ..., S_jm gives an m x n binary matrix, i.e., mn bits per packet.
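
As a concrete illustration, the sketch below (my own example; function and variable names are not from the slides) expands a packet of n symbols over GF(2^m) into the m x n binary matrix described above, with column j holding the m bits of symbol S_j.

```python
import numpy as np

def packet_to_matrix(symbols, m):
    """Expand n symbols over GF(2^m), given as ints in [0, 2^m), into an m x n 0/1 matrix."""
    # Row i holds bit i of every symbol, so column j is the m-bit expansion of symbol j.
    return np.array([[(s >> i) & 1 for s in symbols] for i in range(m)], dtype=int)

packet = [0b101, 0b011, 0b110, 0b001]   # n = 4 symbols over GF(2^3), i.e. m = 3
X = packet_to_matrix(packet, m=3)
assert X.shape == (3, 4)                # one packet = m x n binary matrix = mn bits
```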

Slide 10: A symbol T over GF(2^m) can equivalently be represented as an m x m binary matrix [T_11 ... T_1m; ...; T_m1 ... T_mm]. The product TS over GF(2^m) then becomes a multiplication over the binary field: the m x m matrix of T times the m-bit column (S_11, S_21, ..., S_m1) holding the bits of a symbol of S.
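
The equivalence between GF(2^m) multiplication and binary matrix multiplication can be checked directly on a small field. The sketch below is my own illustration (it assumes GF(4) with the irreducible polynomial x^2 + x + 1; nothing here is code from the talk): it builds the m x m binary matrix of an element t and verifies that t*s over GF(2^m) matches that matrix times the bit vector of s over GF(2).

```python
import numpy as np

M_BITS = 2              # m = 2, i.e. GF(4)
IRRED = 0b111           # x^2 + x + 1, irreducible over GF(2)

def gf_mul(a, b):
    """Multiply a and b in GF(2^m): carry-less product, then reduce mod IRRED."""
    prod = 0
    for i in range(M_BITS):
        if (b >> i) & 1:
            prod ^= a << i
    for i in range(2 * M_BITS - 2, M_BITS - 1, -1):   # clear terms of degree >= m
        if (prod >> i) & 1:
            prod ^= IRRED << (i - M_BITS)
    return prod

def to_bits(a):
    """m-bit column vector of a field element (least significant bit first)."""
    return np.array([(a >> i) & 1 for i in range(M_BITS)], dtype=int)

def binary_matrix(t):
    """m x m binary matrix of t: column j is the bit vector of t * x^j."""
    return np.column_stack([to_bits(gf_mul(t, 1 << j)) for j in range(M_BITS)])

# The field product t*s equals the binary matrix-vector product for every pair.
for t in range(1 << M_BITS):
    for s in range(1 << M_BITS):
        assert np.array_equal(to_bits(gf_mul(t, s)), binary_matrix(t) @ to_bits(s) % 2)
print("GF(2^m) multiplication = binary matrix multiplication (checked for GF(4))")
```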

Slide 11: Noiseless network: Y = T X over the binary field, where X (Cm x n) is the transmitted matrix, T (Cm x Cm) is the network transfer matrix, and Y (Cm x n) is the received matrix.

Slide 12: Bit-flip errors occur on the links (illustrated in the figure by flipped bits in the example bit streams on links A and B). Errors can be arbitrarily distributed, with an upper bound of fraction p, and the worst possible damage can happen to the received packets.

Slide 13: Z is the Em x n worst-case bit-flip error matrix: it contains no more than pEmn ones, arbitrarily distributed, where E is the number of edges in the network.

Slide 14: The rows of Z are grouped by edge; the figure highlights the error bits on the first edge (edge 1).

Slide 15: The bits flipped on edge 1 appear as ones in the corresponding entries of Z.

Slide 16: With worst-case bit-flip errors, Y = T X + T̂ Z over the binary field, where Y is Cm x n, T is Cm x Cm, X is Cm x n, T̂ is the Cm x Em transfer matrix acting on the link errors, and Z is Em x n.
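
To make the dimensions concrete, here is a small simulation sketch of this channel. All parameter values and the random choices of T, T̂, and Z are illustrative assumptions of mine, not taken from the slides.

```python
import numpy as np

rng = np.random.default_rng(0)
C, E, m, n, p = 2, 5, 4, 20, 0.02          # illustrative values only

X    = rng.integers(0, 2, size=(C*m, n))   # transmitted batch: C packets, each m x n
T    = rng.integers(0, 2, size=(C*m, C*m)) # end-to-end transfer matrix
That = rng.integers(0, 2, size=(C*m, E*m)) # transfer matrix acting on the link errors

Z = np.zeros((E*m, n), dtype=int)          # worst-case bit-flip error matrix
budget = int(p * E * m * n)                # at most pEmn ones in total
flips = rng.choice(E * m * n, size=budget, replace=False)
Z.flat[flips] = 1

Y = (T @ X + That @ Z) % 2                 # what the receiver observes
```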

Slide 17: (Figure only: bit-level view of the products in Y = T X + T̂ Z.)

Slide 18: For a codeword X(i) and a received matrix Y(i), d_i is the minimum number of columns of T̂ that need to be added to T X(i) to obtain Y(i). Claim: this d is a distance metric.
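
A brute-force sketch of this column-counting distance for a single received column (my own illustration; finding the minimum-weight combination of columns of T̂ is essentially syndrome decoding, which is hard in general, so this is only meant for tiny examples):

```python
from itertools import combinations
import numpy as np

def column_distance(That, delta):
    """Minimum number of columns of That summing (mod 2) to delta.

    delta is one column of Y(i) - T X(i) over GF(2); returns None if delta
    cannot be written as a sum of columns of That.
    """
    num_cols = That.shape[1]
    if not delta.any():
        return 0
    for w in range(1, num_cols + 1):
        for cols in combinations(range(num_cols), w):
            if np.array_equal(That[:, list(cols)].sum(axis=1) % 2, delta):
                return w
    return None
```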

Slide 19: Theorem 1. For all p less than C/(2Em), an upper bound on the achievable rate of any code over the worst-case binary-error network is a 1 - (Em/(Cm))H(p) fraction of the noiseless rate.

Slide 20: Proof of Theorem 1 (sketch). The total number of Cm x n binary matrices (the volume of the whole space) is 2^(Cmn). Lower bound on the volume of the balls around codewords: consider those Z's in which every column has pEm ones; distinct such Z result in distinct T̂ Z, so the number of distinct T̂ Z is at least binom(Em, pEm)^n ~ 2^(nEmH(p)). The size of any codebook is therefore at most 2^(Cmn) / 2^(nEmH(p)). Asymptotically in n, this yields the Hamming-type upper bound of Theorem 1.
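
The counting argument can be written out compactly as below. This is my restatement: \mathcal{X} denotes the codebook, and the rate is assumed to be measured as a fraction of the Cmn bits injected per block, so the noiseless rate is 1.

```latex
\[
|\mathcal{X}| \;\le\; \frac{2^{Cmn}}{\binom{Em}{pEm}^{\,n}}
\;\approx\; \frac{2^{Cmn}}{2^{\,nEmH(p)}},
\qquad\text{so}\qquad
R \;=\; \frac{\log_2 |\mathcal{X}|}{Cmn} \;\le\; 1-\frac{Em}{Cm}\,H(p)+o(1).
\]
```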

Slide 21: Coherent NC: the receiver knows the internal coding coefficients, and hence knows T and T̂. Non-coherent NC: the coding coefficients, and hence T and T̂, are unknown in advance; this is the more realistic setting, since the random linear network coding coefficients are usually chosen on the fly.


Slide 25: Theorem 2. Coherent GV-type network codes achieve a rate of at least a 1 - (Em/(Cm))H(2p) fraction of the noiseless rate. Theorem 3. Non-coherent GV-type network codes achieve, asymptotically in n, the same rate.

Slide 26: Proof of Theorem 2 (sketch). Here we need an upper bound on the volume of the balls of radius 2pEmn around the TX(i), instead of the lower bound on ball volume used in Theorem 1 (sphere packing vs. sphere covering). The number of different Y in such a ball, or equivalently of different T̂ Z, is bounded above by the number of different Z, which equals the sum of binom(Emn, i) for i from 0 to 2pEmn; this summation is bounded from above by ~ 2^(EmnH(2p)). This gives a lower bound of 2^(Cmn) / 2^(EmnH(2p)) on the size of the codebook. Asymptotically in n, this yields the rate of coherent GV-type NC in Theorem 2.
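
In the same notation and normalization as the sketch above, the ball-volume bound and the resulting GV-type rate read:

```latex
\[
\bigl|\{\hat{T}Z : \mathrm{wt}(Z)\le 2pEmn\}\bigr|
\;\le\; \sum_{i=0}^{2pEmn}\binom{Emn}{i}
\;\le\; 2^{\,EmnH(2p)} \quad(\text{for } 2p\le\tfrac12),
\qquad
|\mathcal{X}| \;\ge\; \frac{2^{Cmn}}{2^{\,EmnH(2p)}},
\qquad
R \;\ge\; 1-\frac{Em}{Cm}\,H(2p)-o(1).
\]
```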

Slide 27: Proof of Theorem 3 (sketch). The crucial difference from the proof of Theorem 2 is the process of choosing codewords. Consider all possible values of T̂, of which there are at most 2^(CmEm) (and hence all possible values of T, since T comprises a specific subset of C columns of T̂). The number of potential codewords that can be chosen for the codebook is then at least 2^(Cmn) / (2^(CmEm) · 2^(EmnH(2p))). Since the 2^(CmEm) factor does not grow with n, asymptotically in n this leads to the same rate as coherent NC in Theorem 2.
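
Dividing out the factor that counts the possible T̂ (again in my notation from the earlier sketches) shows why the asymptotic rate is unchanged, since that factor does not depend on n:

```latex
\[
|\mathcal{X}| \;\ge\; \frac{2^{Cmn}}{2^{\,CmEm}\cdot 2^{\,EmnH(2p)}},
\qquad
R \;\ge\; 1-\frac{Em}{Cm}\,H(2p)-\frac{Em}{n}
\;\xrightarrow{\;n\to\infty\;}\; 1-\frac{Em}{Cm}\,H(2p).
\]
```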

Slide 28: Claim: for all sufficiently small p, both the Hamming-type and the GV-type bounds hold. Proof: Theorem 1 (the Hamming-type upper bound) requires that p < C/(2Em); for the GV-type bounds in Theorems 2 and 3 to give non-negative rates, H(2p) must be at most Cm/(Em). When p is small, both conditions are satisfied.
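
A quick numeric comparison of the two conditions, assuming the bound forms reconstructed above and purely illustrative parameters C = 3, E = 10, m = 8:

```python
import math

def H(q):
    """Binary entropy function."""
    return 0.0 if q in (0.0, 1.0) else -q * math.log2(q) - (1 - q) * math.log2(1 - q)

C, E, m = 3, 10, 8                     # illustrative parameters only
p_hamming = C / (2 * E * m)            # requirement of Theorem 1

# Largest p with H(2p) <= Cm/(Em) = C/E, found by bisection on [0, 1/4].
lo, hi = 0.0, 0.25
for _ in range(60):
    mid = (lo + hi) / 2
    lo, hi = (mid, hi) if H(2 * mid) <= C / E else (lo, mid)
p_gv = lo

print(f"Hamming-type needs p < {p_hamming:.4f}; GV-type rate stays >= 0 for p < {p_gv:.4f}")
# For these parameters the Hamming-type condition is the stricter of the two.
```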

Slide 29: Summary. Worst-case bit-flip error model. Hamming-type upper bound: a 1 - (Em/(Cm))H(p) fraction of the noiseless rate. Coherent/non-coherent GV-type lower bound: a 1 - (Em/(Cm))H(2p) fraction. GV-type codes have an end-to-end nature, with complexity polynomial in the block length.

