Network RS Codes for Efficient Network Adversary Localization
Sidharth Jaggi, Minghua Chen, Hongyi Yao

Disease Localization (Figure: a human body with the heart highlighted.)

Network Adversary Localization. Adversarial errors: the corrupted packets are carefully crafted by the adversaries for specific purposes. Our objective: locating the network adversaries.

Network Coding. Network coding suffices to achieve the optimal throughput for multicast [RNSY00]. Random linear network coding also suffices, while being distributed and of low design complexity [TMJMD03]. (Figure: the butterfly network — source S multicasts m1 and m2 to receivers r1 and r2; the bottleneck edge carries the combination m1 + m2, or a·m1 + b·m2 under random coding.)
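
To make the butterfly example concrete, here is a minimal sketch (an illustration, not from the slides) of random linear coding over the small prime field GF(7); the field size, the message values, and the assumption that the receivers know the coding coefficients (e.g., from packet headers) are all assumptions of the sketch.

```python
import random

p = 7  # field size (an assumption for this sketch)

m1, m2 = 3, 5                                            # the two source messages
a, b = random.randrange(1, p), random.randrange(1, p)    # nonzero random coefficients

# The bottleneck edge carries the random linear combination a*m1 + b*m2 (mod p).
bottleneck = (a * m1 + b * m2) % p

# Receiver r1 gets m1 on its direct path plus the coded packet, and solves for m2;
# receiver r2 gets m2 directly and solves for m1.
m2_at_r1 = (bottleneck - a * m1) * pow(b, p - 2, p) % p  # b^(-1) via Fermat's little theorem
m1_at_r2 = (bottleneck - b * m2) * pow(a, p - 2, p) % p

assert (m1_at_r2, m2_at_r1) == (m1, m2)                  # both receivers recover both messages
```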

Network Coding Aids Localization. Example: source s sends the probe messages M = [1, 2] on edges e1 and e2 to node u; node u forwards packets on edges e3 and e4 to receiver r. The adversary sits on edge e1 and adds 2 to its packet.

Routing at u: x(e3) = x(e1), x(e4) = x(e2). The receiver expects Y_M = [1, 2] but observes Y_E = [3, 2], so the error is E = Y_E - Y_M = [2, 0]. Edges e1 and e3 produce the same error signature [1, 0], so routing is not enough for r to locate the adversarial edge e1.

Random linear network coding (RLNC) at u: x(e3) = x(e1) + 2·x(e2), x(e4) = x(e1) + x(e2). The receiver expects Y_M = [5, 3] but observes Y_E = [7, 5], so E = Y_E - Y_M = [2, 2]. Only edge e1 has signature [1, 1], a multiple of E, so network coding is enough for r to locate the adversarial edge e1.
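
The following small script (illustrative only; the IRVs of e3 and e4 are taken to be the unit vectors since those edges feed the receiver directly, and a single adversarial edge is assumed) reproduces the arithmetic above and checks which edges could explain the observed error signature.

```python
# The adversary adds 2 to the packet on edge e1; arithmetic over the integers as on the slide.
probe = [1, 2]          # x(e1), x(e2)
err_on_e1 = 2

def receive(coeffs, x1, x2):
    # coeffs[k] = (alpha, beta): the k-th received symbol is alpha*x1 + beta*x2
    return [a * x1 + b * x2 for (a, b) in coeffs]

for name, coeffs in [("routing", [(1, 0), (0, 1)]),
                     ("RLNC",    [(1, 2), (1, 1)])]:
    y_clean = receive(coeffs, probe[0], probe[1])
    y_err   = receive(coeffs, probe[0] + err_on_e1, probe[1])
    E = [ye - yc for ye, yc in zip(y_err, y_clean)]
    # Impulse response of each edge as seen by the receiver:
    irv = {"e1": [c[0] for c in coeffs], "e2": [c[1] for c in coeffs],
           "e3": [1, 0], "e4": [0, 1]}
    # Assuming a single adversarial edge, an edge is a candidate iff E is collinear with its IRV.
    suspects = [e for e, v in irv.items()
                if any(v) and E[0] * v[1] == E[1] * v[0]]
    print(name, "error signature:", E, "candidate edges:", suspects)
# routing -> candidates e1 and e3 (ambiguous); RLNC -> only e1.
```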

RLNC for Adversary Localization [YSM10]
Desired features of RLNC:
- Distributed implementation.
- Achieving the communication capacity.
- Locating the maximum number of adversaries.

RLNC for Adversary Localization [YSM10]
Drawbacks of RLNC:
- Requires topology information.
- Locating adversaries is a computationally hard problem.

Our Contribution: Network Reed-Solomon Codes
Network RS codes preserve all the desired features of RLNC:
- Distributed implementation.
- Achieving the communication capacity.
- Locating the maximum number of adversaries.
Furthermore, network RS codes:
- Do not require topology information.
- Locate network adversaries efficiently.

Concept: IRV
Edge Impulse Response Vector (IRV): the linear transform from an edge to the receiver. Using IRVs we can locate failures.
Relation between IRVs and the network structure: when edge e1 feeds a node whose outgoing edges are e2 and e3, IRV(e1) lies in the linear space spanned by IRV(e2) and IRV(e3). In the figure the three IRVs are [1 0 0], [0 3 2], and [2 9 6]; indeed [2 9 6] = 2·[1 0 0] + 3·[0 3 2].
Unique mapping from edge to IRV: two independent edges can have independent IRVs.
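
A quick arithmetic check of the span relation in the figure, under the assumption that the upstream edge carries the IRV [2 9 6] and the two downstream edges carry [1 0 0] and [0 3 2]:

```python
# Verify that the upstream IRV is a linear combination of the downstream IRVs
# (over the integers, as in the example above).
irv_down_1 = [1, 0, 0]
irv_down_2 = [0, 3, 2]
irv_up     = [2, 9, 6]

combo = [2 * a + 3 * b for a, b in zip(irv_down_1, irv_down_2)]
assert combo == irv_up   # IRV(upstream) = 2*IRV(down_1) + 3*IRV(down_2)
```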

Adversary Localization by IRVs
Using network error-correcting codes [JLKHKM07], the error vector E can be decoded at the receiver. E is in fact a linear combination of the IRVs {IRV(e1), IRV(e2), …, IRV(em)}; that is, E = c1·IRV(e1) + c2·IRV(e2) + … + cm·IRV(em). In particular, only the IRVs of the adversarial edges have nonzero coefficients in this combination.

Adversary Localization by IRVs
Without loss of generality, assume e1, e2, …, ez are the adversarial edges. Thus E = c1·IRV(e1) + c2·IRV(e2) + … + cz·IRV(ez). The number of adversarial edges z is much smaller than the total number of edges m. Therefore, locating the adversaries is mathematically equivalent to sparsely decomposing E into IRVs.
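
As an illustration of this equivalence (and not the paper's algorithm), the brute-force sketch below searches over candidate edge subsets for the sparsest combination of IRVs that equals E; the field size, the toy IRVs, and the helper names are assumptions. Its running time grows exponentially with the support size, which is exactly the difficulty the next slide points out.

```python
from itertools import combinations

p = 11  # small prime field (assumption for the sketch)

def solve_gf(A, b, p):
    """Gaussian elimination over GF(p); returns one solution of A x = b or None."""
    rows, cols = len(A), len(A[0])
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    piv_col_of_row, r = [], 0
    for c in range(cols):
        piv = next((i for i in range(r, rows) if M[i][c] % p), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        inv = pow(M[r][c], p - 2, p)
        M[r] = [x * inv % p for x in M[r]]
        for i in range(rows):
            if i != r and M[i][c] % p:
                f = M[i][c]
                M[i] = [(M[i][j] - f * M[r][j]) % p for j in range(cols + 1)]
        piv_col_of_row.append(c)
        r += 1
    if any(all(M[i][j] % p == 0 for j in range(cols)) and M[i][cols] % p
           for i in range(rows)):
        return None                                     # inconsistent system
    x = [0] * cols
    for i, c in enumerate(piv_col_of_row):
        x[c] = M[i][cols] % p
    return x

def locate(E, irvs, p):
    """Smallest edge set S and coefficients c with sum_{e in S} c_e * IRV(e) = E."""
    m = len(irvs)
    for z in range(1, m + 1):                           # exponentially many subsets
        for S in combinations(range(m), z):
            A = [[irvs[j][i] for j in S] for i in range(len(E))]   # columns = chosen IRVs
            c = solve_gf(A, E, p)
            if c is not None and all(c):                # every chosen edge contributes
                return list(S), c
    return None

# Toy instance: 5 edges with 3-dimensional IRVs; the adversary is edge 2 with coefficient 4.
irvs = [[1, 0, 0], [0, 1, 0], [1, 1, 1], [0, 0, 1], [2, 3, 5]]
E = [(4 * v) % p for v in irvs[2]]
print(locate(E, irvs, p))   # -> ([2], [4])
```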

Why is RLNC not good enough?
Locating the adversaries is mathematically equivalent to sparsely decomposing E into IRVs. For RLNC, the IRVs are sensitive to the network topology, and they are chosen at random; sparse decomposition over randomly chosen vectors is computationally hard [V97].

Key Idea of Network RS Codes
Motivated by classical Reed-Solomon (RS) codes [MS77]. We want the IRV of each edge ei to equal its RS IRV, denoted IRV'(ei), which is a randomly chosen column of the parity-check matrix of an RS code. (Figure: a 4×5 RS parity-check matrix H with entries (A_j)^i; its columns serve as the RS IRVs, shown in the order IRV'(e2), IRV'(e4), IRV'(e1), IRV'(e5), IRV'(e3).)

Nice Properties of the RS Parity-Check Matrix H
Assume E is a sparse linear combination of the columns of H. We can recover this sparse decomposition of E into columns of H in a computationally efficient manner. Thus, if every edge IRV equals its RS IRV, we can locate the network adversaries efficiently.
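
The sketch below illustrates why the RS structure helps, under several simplifying assumptions: a prime field GF(97), edge j assigned the field element j+1 (rather than a random column), and a Peterson-style locator in place of whatever decoder the paper actually uses. Given E equal to a sparse combination of RS IRVs, it recovers the support in polynomial time.

```python
p = 97           # field size (assumption)
t = 3            # locate up to t adversarial edges (assumption)
m = 20           # number of edges (assumption)
alphas = [j + 1 for j in range(m)]                 # distinct nonzero field elements

def rs_irv(a):
    """Column of the RS parity-check matrix for field element a: (a^1, ..., a^(2t))."""
    return [pow(a, i, p) for i in range(1, 2 * t + 1)]

def solve_gf(A, b):
    """Gaussian elimination over GF(p) for a square system; None if singular."""
    n = len(A)
    M = [A[i][:] + [b[i]] for i in range(n)]
    for c in range(n):
        piv = next((r for r in range(c, n) if M[r][c] % p), None)
        if piv is None:
            return None
        M[c], M[piv] = M[piv], M[c]
        inv = pow(M[c][c], p - 2, p)
        M[c] = [x * inv % p for x in M[c]]
        for r in range(n):
            if r != c and M[r][c] % p:
                f = M[r][c]
                M[r] = [(M[r][k] - f * M[c][k]) % p for k in range(n + 1)]
    return [M[i][n] for i in range(n)]

def locate(E):
    """Indices of the (at most t) RS IRVs whose combination gives E."""
    S = [0] + E                                    # S[i] = i-th syndrome, i = 1..2t
    for v in range(t, 0, -1):                      # try v errors, largest first
        A = [[S[i + k] for k in range(v)] for i in range(1, v + 1)]
        b = [(-S[v + i]) % p for i in range(1, v + 1)]
        lam = solve_gf(A, b)                       # lam = (lambda_v, ..., lambda_1)
        if lam is None:
            continue
        coeffs = [1] + lam[::-1]                   # Lambda(x) = 1 + lambda_1 x + ... + lambda_v x^v
        roots = [j for j, a in enumerate(alphas)
                 if sum(c * pow(pow(a, p - 2, p), k, p)
                        for k, c in enumerate(coeffs)) % p == 0]
        if len(roots) == v:
            return roots
    return []

# Toy check: adversaries on edges 4 and 11 with coefficients 7 and 30.
E = [(7 * x + 30 * y) % p for x, y in zip(rs_irv(alphas[4]), rs_irv(alphas[11]))]
print(locate(E))   # -> [4, 11]
```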

To Achieve RS IRVs
Each node, say u with incoming edge e3 and outgoing edges e1 and e2, performs local coding as follows:
- Node u assumes e1 and e2 already have their RS IRVs, i.e., IRV(e1) = IRV'(e1) and IRV(e2) = IRV'(e2).
- Recall that IRV(e3) lies in the span of IRV(e1) and IRV(e2).
- Node u chooses its local coding coefficients such that IRV(e3) = IRV'(e3).
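
A minimal sketch of this coefficient selection at node u, assuming (as the construction arranges) that the incoming edge's RS IRV lies in the span of the two outgoing edges' RS IRVs; the field size and the example vectors are illustrative.

```python
p = 97  # field size (assumption)

def inv(x):
    return pow(x % p, p - 2, p)

def local_coefficients(v1, v2, v3):
    """Find (a, b) over GF(p) with a*v1 + b*v2 == v3, or None if impossible."""
    n = len(v1)
    for i in range(n):
        for j in range(i + 1, n):
            det = (v1[i] * v2[j] - v1[j] * v2[i]) % p
            if det:                                    # invertible 2x2 minor -> Cramer's rule
                a = (v3[i] * v2[j] - v3[j] * v2[i]) * inv(det) % p
                b = (v1[i] * v3[j] - v1[j] * v3[i]) * inv(det) % p
                ok = all((a * v1[k] + b * v2[k] - v3[k]) % p == 0 for k in range(n))
                return (a, b) if ok else None
    return None

# Toy example where v3 is a combination of v1 and v2 by construction:
v1 = [3, 1, 4, 1]                                      # RS IRV of outgoing edge e1 (assumed)
v2 = [5, 9, 2, 6]                                      # RS IRV of outgoing edge e2 (assumed)
v3 = [(2 * x + 7 * y) % p for x, y in zip(v1, v2)]     # target RS IRV of incoming edge e3
a, b = local_coefficients(v1, v2, v3)
print(a, b)   # -> 2 7
# Node u then uses a as the coefficient from e3 onto e1 and b from e3 onto e2,
# so that IRV(e3) = a*IRV(e1) + b*IRV(e2) equals the prescribed RS IRV IRV'(e3).
```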

To Achieve RS IRVs
Surprisingly, this local coding scheme guarantees the desired global property: every edge's IRV equals the corresponding RS IRV. The implementation is distributed, and no topology information is needed.

Summary of Our Contribution

Code Type         | Implementation | Communication Capacity | Number of Locatable Adversaries
RLNC [YJM10]      | Distributed    | Achieved               | Maximum
Network RS codes  | Distributed    | Achieved               | Maximum

Code Type         | Topology Information | Computational Complexity
RLNC [YJM10]      | Required             | Exponential time
Network RS codes  | Not needed           | Polynomial time

Thanks! Questions?
Details in: Hongyi Yao, Sidharth Jaggi, and Minghua Chen, "Passive network tomography: A network coding approach," under submission to IEEE Trans. on Information Theory, and on arXiv.