Index Coding: Part II of tutorial, NetCod 2013. Michael Langberg, Open University of Israel; Caltech (sabbatical).






Outline
This part of the tutorial shows an equivalence between the network coding and index coding problems.
Outline:
Preliminary: the Network Coding model.
Preliminary: the Index Coding model.
Equivalence for linear encoding/decoding [ElRouayhebSprintsonGeorghiades].
Equivalence for general encoding/decoding [EffrosElRouayhebLangberg].
Multicast vs. Unicast Index Coding [MalekiCadambeJafar].
Open questions.

General theme
We will show an equivalence between the network coding (NC) and index coding (IC) problems: an efficient reduction that allows one to solve NC using any scheme that solves IC (solve the IC instance, then obtain a solution to the NC instance).
[Figure: an NC instance mapped to an IC instance; solving the IC instance yields a solution to the NC instance.]
Network communication is challenging: it combines topology with information.
The reduction separates information from topology.
This significantly simplifies the study of network communication.
Index Coding is a simple but representative instance of general network communication.

General Network Coding
Directed acyclic network N.
Edge e has capacity c_e.
Source vertices S.
Terminal vertices T.
Requirement matrix: transfer information from S to T.
Objective: an information flow, using network coding, that satisfies the terminals.
[Figure: a directed acyclic network with sources s_1, s_2, s_3 and terminals t_1, t_2, t_3.]

Assumptions
Sources S_i hold independent information.
Zero error in communication.
We consider the multiple-unicast communication requirement (w.l.o.g. [DoughertyZeger]): k source/terminal pairs (S_i, T_i) that wish to communicate over N.
[Figure: a network N with source/terminal pairs (S_1, T_1), ..., (S_4, T_4).]

NC preliminaries
Communication at rate R = (R_1, ..., R_k) is achievable over instance NC with block length n if there exist random variables {S_i}, {X_e} such that:
Rate: source S_i is an independent random variable, uniform over [2^{R_i n}].
Edge capacity: for each edge e, X_e is a random variable with support [2^{c_e n}].
Functionality: for each edge e there is a function f_e mapping the incoming random variables X_{a_1}, ..., X_{a_in(e)} to X_e (i.e., X_e = f_e(X_{a_1}, ..., X_{a_in(e)})).
Decoding: for each terminal t_i we define a decoding function yielding the sources required by t_i.
R = (R_1, ..., R_k) is "n-feasible" if such a code with block length n exists; alternatively, we say that NC is (R, n)-feasible.
[Figure: a network with sources s_1, ..., s_4, terminals t_1, ..., t_4, and an edge e computing X_e = f_e(X_1, X_2, X_3).]
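For reference, a compact restatement of these feasibility conditions as displayed equations (a paraphrase of the definitions above; the decoder name g_i is my notation, borrowed from the later slides):

```latex
% (R, n)-feasibility of an NC instance, restated from the conditions above.
% The decoder name g_i is my notation (it is introduced later in the slides).
\[
  S_i \sim \mathrm{Unif}\!\left[2^{R_i n}\right] \text{ (mutually independent)}, \qquad
  X_e \in \left[2^{c_e n}\right], \qquad
  X_e = f_e\!\left(X_{a_1},\dots,X_{a_{\mathrm{in}(e)}}\right),
\]
\[
  \text{and for every terminal } t_i: \quad
  g_i\!\left(\{X_e : e \in \mathrm{In}(t_i)\}\right) = S_i \quad \text{(zero error)}.
\]
```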

Index Coding [Birk, Bar-Yossef et al.]
IC is a special case of NC.
A set S of sources; a set T of terminals.
Each terminal has some subset of the sources (as side information) and wants some subset of the sources.
The broadcast link has capacity c_B; all other links have unlimited capacity.
Objective: satisfy all terminals using broadcast rate c_B.
[Figure: sources s_1, ..., s_4 feed a broadcast link of capacity c_B that reaches terminals t_1, ..., t_4.]

Index Coding
Communication at rate R = (R_1, ..., R_k) is achievable with block length n if there exist random variables {S_i}, X_B such that:
Rate: source S_i is an independent random variable, uniform over [2^{R_i n}].
Encoding: X_B = f_B(S_1, ..., S_k) is a random variable with support [2^{c_B n}].
Decoding: for each terminal t_i we define a decoding function g_i taking as input the broadcast message X_B and the side information of t_i, and returning the sources wanted by t_i.
R = (R_1, ..., R_k) is "n-feasible" if such a code with block length n exists; we use the notation: IC is (R, n)-feasible.
IC is a simple instance of the NC problem: there is only a single encoding node.
[Figure: sources s_1, ..., s_4, a broadcast link of capacity c_B, and terminals t_1, ..., t_4.]
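As a concrete warm-up (not from the slides), here is a minimal sketch of the classic two-terminal index coding example with one-bit packets: each terminal holds its own packet as side information and wants the other's, so broadcasting the XOR satisfies both at broadcast rate c_B = 1.

```python
# Minimal index coding sketch (illustrative; assumes 1-bit packets).
# Terminal 1 wants s2 and has s1; terminal 2 wants s1 and has s2.

def broadcast_encode(s1: int, s2: int) -> int:
    """Server broadcasts a single bit: the XOR of both packets."""
    return s1 ^ s2

def decode(x_b: int, side_info: int) -> int:
    """Each terminal XORs the broadcast with its side information."""
    return x_b ^ side_info

for s1 in (0, 1):
    for s2 in (0, 1):
        x_b = broadcast_encode(s1, s2)
        assert decode(x_b, s1) == s2  # terminal 1 recovers s2
        assert decode(x_b, s2) == s1  # terminal 2 recovers s1
print("One broadcast bit satisfies both terminals.")
```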

Connecting NC to IC
Step 1: define a reduction from NC to IC.
Step 2: prove that NC is (R, n)-feasible iff IC is (R', n)-feasible.
We would like the reduction and the code constructions to be very efficient.
[Figure: an NC instance mapped to an IC instance; solving the IC instance yields a solution to the NC instance.]

Outline
Step 1: present the reduction from NC to IC.
Step 2: equivalence for linear and general encoding/decoding [ElRouayhebSprintsonGeorghiades], [EffrosElRouayhebLangberg].
Theorem: for any NC and R one can construct IC and R' such that, for any n, NC is (R, n)-feasible iff IC is (R', n)-feasible.
[Figure: the NC-to-IC reduction.]

The reduction
[Figure: the NC network (sources, edges, terminals) mapped to an IC instance with sources for the NC sources and NC edges, and terminals for the NC terminals and NC edges.]
Index Coding instance:
Sources corresponding to the NC sources and the NC edges.
Terminals corresponding to the NC terminals, the NC edges, and one special terminal.
For each edge e: terminal t_e in IC wants the IC source X_e and has, as side information, all IC sources incoming to e in NC.
IC encodes the topology of NC in its terminals!

The reduction in more detail
Sources: |S| + |E| sources, one for each source of NC and one for each edge of NC: {S_i'} and {S_e'}.
Terminals: |T| + |E| + 1 terminals:
One terminal t_i' for each terminal t_i: wants S_i' and has {S_e'} for e in In(t_i).
One terminal t_e' for each edge e: wants S_e' and has {S_a'} for a in In(e).
One special terminal t_all: wants {S_e'} and has {S_i'}.
[Figure: the NC network and the corresponding IC instance.]

The reduction in more detail (continued)
The broadcast (bottleneck) edge has capacity c_B = Σ_e c_e.
Given a rate vector R = (R_1, ..., R_k) we construct the rate vector R' = ({R_i'}; {R_e'}) with R_i' = R_i and R_e' = c_e.

Terminal    Has                          Wants
t_i'        {S_e'} for e in In(t_i)      S_i'
t_e'        {S_a'} for a in In(e)        S_e'
t_all       {S_i'}                       {S_e'}

Reduction    NC               IC
Sources      S_1, ..., S_k    {S_i'}, {S_e'}
Terminals    t_1, ..., t_k    {t_i'}, {t_e'}, t_all
Capacities   c_e              c_B = Σ_e c_e
Rates        R_1, ..., R_k    {R_i'}, {R_e'} with R_i' = R_i, R_e' = c_e

Theorem: for any NC and R one can construct IC and R' such that, for any n, NC is (R, n)-feasible iff IC is (R', n)-feasible.
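To make the construction concrete, here is a minimal sketch of the reduction at the data-structure level. The NC input format and the tiny example at the end are illustrative assumptions, not part of the original slides; the mapping of sources, terminals, c_B, and rates follows the tables above.

```python
# Sketch of the NC-to-IC reduction described above (data-structure level only).
# The input format for the NC instance is an illustrative assumption, not a fixed API:
#   nc_sources:   list of source names
#   nc_edges:     {edge: (capacity, list of incoming sources/edges)}
#   nc_terminals: {terminal: (wanted source, list of incoming edges)}
#   rates:        {source: R_i}

def reduce_nc_to_ic(nc_sources, nc_edges, nc_terminals, rates):
    # One IC source per NC source (rate R_i' = R_i) and per NC edge (rate R_e' = c_e).
    ic_sources = {f"S'_{s}": rates[s] for s in nc_sources}
    ic_sources.update({f"S'_{e}": cap for e, (cap, _) in nc_edges.items()})

    ic_terminals = {}
    for t, (want, in_edges) in nc_terminals.items():       # terminal t_i'
        ic_terminals[f"t'_{t}"] = {"wants": [f"S'_{want}"],
                                   "has": [f"S'_{e}" for e in in_edges]}
    for e, (_, in_items) in nc_edges.items():               # terminal t_e'
        ic_terminals[f"t'_{e}"] = {"wants": [f"S'_{e}"],
                                   "has": [f"S'_{a}" for a in in_items]}
    ic_terminals["t_all"] = {"wants": [f"S'_{e}" for e in nc_edges],
                             "has": [f"S'_{s}" for s in nc_sources]}

    c_B = sum(cap for cap, _ in nc_edges.values())          # c_B = sum_e c_e
    return ic_sources, ic_terminals, c_B

# Tiny illustrative example: one source s1, a path of two edges e1 -> e2, one terminal t1.
src, term, c_B = reduce_nc_to_ic(
    nc_sources=["s1"],
    nc_edges={"e1": (1, ["s1"]), "e2": (1, ["e1"])},
    nc_terminals={"t1": ("s1", ["e2"])},
    rates={"s1": 1},
)
```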

Theorem
Theorem: NC is (R, n)-feasible iff IC is (R', n)-feasible.
(The reduction and the Has/Wants table are as on the previous slide.)
[Figure: the NC instance and the corresponding IC instance.]

Geometric view
Theorem: NC is (R, n)-feasible iff IC is (R', n)-feasible.
[Figure: the NC and IC rate regions; the IC rates are constrained to R_e' = c_e for every edge e.]

What now?
Outline:
NC feasible implies IC feasible (works for both linear and non-linear codes).
IC feasible implies NC feasible (we will show a new proof for the linear case that extends to the non-linear case).
Theorem: for any NC and R one can construct IC and R' such that, for any n, NC is (R, n)-feasible iff IC is (R', n)-feasible.

NC ⇒ IC
Use the global NC encoding functions.
We have seen that X_e = f_e(X_In(e)). Each edge e also has a global function F_e with X_e = F_e(S_1, ..., S_k).
IC: we need to define X_B of rate c_B. Recall that c_B = Σ_e c_e and that X_e has rate c_e.
For every edge e let X_B(e) = S_e' + F_e(S_1', ..., S_k').
Note the separation between {S_i'} and {S_e'}.
Let us see that this works (decoding follows).
(Recall the NC model definitions from the NC preliminaries slide; the reduction table is as before.)

NC ⇒ IC
Use the global encoding functions of NC: each edge e has a function F_e such that F_e(S_1, ..., S_k) = X_e.
Recall that X_e has support [2^{c_e n}] and that c_B = Σ_e c_e; we need to define X_B of total support [2^{c_B n}].
Let X_B(e) = S_e' + F_e(S_1', ..., S_k').
Decoding (basic idea: each IC terminal simulates the NC solution):
Consider terminal t_e': it wants S_e', has {S_a'} for edges a in In(e), and also receives the broadcast X_B.
For each a it computes X_B(a) - S_a' = S_a' + F_a(S_1', ..., S_k') - S_a' = F_a(S_1', ..., S_k').
It then uses the local encoding function f_e to compute f_e(F_a1(S_1', ..., S_k'), ..., F_a3(S_1', ..., S_k')) = F_e(S_1', ..., S_k').
Finally it computes X_B(e) - F_e(S_1', ..., S_k') = S_e' + F_e(S_1', ..., S_k') - F_e(S_1', ..., S_k') = S_e'.
The same process works for the other terminals.
In short, t_e' simulates the NC solution on edge e.
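A minimal sketch of this direction on a toy line network, with symbols taken modulo q. The network, the value q = 16, and the function names are illustrative assumptions; the broadcast carries X_B(e) = S_e' + F_e(S'), and the edge terminal t_e' recovers S_e' by simulating the NC edge function, exactly as in the decoding steps above.

```python
# Sketch of the NC-to-IC encoding/decoding above, over integers mod q.
# The tiny line network (s1 -> e1 -> e2 -> t) and q = 16 are illustrative.
q = 16

# Global NC encoding functions F_e(S): here every edge simply forwards s1.
F = {"e1": lambda s: s["s1"] % q,
     "e2": lambda s: s["s1"] % q}
# Local NC encoding: e2's input is the value carried on e1.
f_local = {"e2": lambda x_e1: x_e1 % q}

def broadcast(sources, edge_sources):
    """X_B(e) = S_e' + F_e(S_1', ..., S_k') for every edge e."""
    return {e: (edge_sources[e] + F[e](sources)) % q for e in F}

# Example run.
sources = {"s1": 7}                    # IC copies of the NC sources
edge_sources = {"e1": 3, "e2": 11}     # independent IC sources S_e'
x_b = broadcast(sources, edge_sources)

# Terminal t'_{e2}: has S'_{e1} as side information and wants S'_{e2}.
F_e1 = (x_b["e1"] - edge_sources["e1"]) % q   # recover F_{e1}(S')
F_e2 = f_local["e2"](F_e1)                    # simulate the NC edge function
s_e2 = (x_b["e2"] - F_e2) % q                 # recover the wanted source
assert s_e2 == edge_sources["e2"]
```

The same three steps (strip the side information, apply the local NC function, strip the result from X_B(e)) are what the terminals t_i' and t_all perform on their own inputs.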

What now?
Outline:
NC feasible implies IC feasible (works for both linear and non-linear codes).
IC feasible implies NC feasible (we will show a new proof for the linear case that extends to the non-linear case).

Linear: IC ⇒ NC
Given a linear code for IC, how do we build one for NC?
The IC encoding includes a linear encoding function f_B:
f_B({S_i'}, {S_e'}) = A_S · S_I' + A_E · S_E'.
Rates: the sources S' = ({S_i'}, {S_e'}) have supports [2^{R_i' n}], [2^{R_e' n}]; the bottleneck X_B = f_B({S_i'}, {S_e'}) has support [2^{c_B n}].
One can prove that A_E is square and full rank (using the special terminal t_all, which has only {S_i'} and wants all of {S_e'}).
Crucial property: fix any value s_I' for S_I' = (S_1', ..., S_k'); there exists a unique value s_E' for S_E' = (S_e1', ..., S_em') such that f_B(s_I', s_E') = 0.
This will allow the construction of an NC code!

Linear: IC ⇒ NC
f_B({S_i'}, {S_e'}) = A_S · S_I' + A_E · S_E', where A_E is square and full rank.
Crucial property: for every s_I' there exists a (unique) s_E' such that f_B(s_I', s_E') = 0.
[Figure: in the (s_I', s_E') plane, the set of pairs on which f_B evaluates to 0.]
We will define the NC code by "projecting" f_B onto this zero set (the white curve in the figure)!
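A small numerical sketch of the crucial property for linear codes, with illustrative matrices and working over the rationals rather than a finite field: since A_E is square and full rank, s_E' = -A_E^{-1} A_S s_I' is the unique value with f_B(s_I', s_E') = 0. Over a finite field the same argument holds verbatim.

```python
# Sketch of the "crucial property": if A_E is square and full rank, then for any
# s_I there is a unique s_E with A_S @ s_I + A_E @ s_E = 0, namely
# s_E = -A_E^{-1} A_S s_I.  Matrices are illustrative; the slides work over a field.
import numpy as np

A_S = np.array([[1.0, 2.0],
                [0.0, 1.0],
                [3.0, 1.0]])          # acts on the k source symbols
A_E = np.array([[1.0, 1.0, 0.0],
                [0.0, 1.0, 1.0],
                [1.0, 0.0, 1.0]])     # square and full rank: acts on the edge symbols

def f_B(s_I, s_E):
    return A_S @ s_I + A_E @ s_E

s_I = np.array([5.0, -2.0])
s_E = -np.linalg.solve(A_E, A_S @ s_I)   # the unique zero of f_B(s_I, .)
assert np.allclose(f_B(s_I, s_E), 0.0)
```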

Linear: IC ⇒ NC
Consider an edge e in NC. We will define its local encoding function f_e(X_a1, X_a2, X_a3) = X_e.
We define f_e from the IC decoding function g_e' of terminal t_e', which satisfies g_e'(S_a1', S_a2', S_a3', f_B({S_i'}, {S_e'})) = S_e'.
Set f_e(X_a1, X_a2, X_a3) = g_e'(X_a1, X_a2, X_a3, 0).
Then f_e is a valid local encoding function; NC decoding is defined similarly.

Linear: IC ⇒ NC
Consider a terminal i in NC. We need to define its local decoding function g_i(X_a1, X_a2, X_a3) = S_i.
We define g_i from the IC decoding function g_i' of terminal t_i', which satisfies g_i'(S_a1', S_a2', S_a3', f_B({S_i'}, {S_e'})) = S_i'.
Set g_i(X_a1, X_a2, X_a3) = g_i'(X_a1, X_a2, X_a3, 0).
Recall: f_e(X_a1, X_a2, X_a3) = g_e'(X_a1, X_a2, X_a3, 0).
Both f_e and g_i are valid encoding/decoding functions.
It remains to prove correct decoding!

Decoding: IC ⇒ NC
Recall: f_e(X_a1, X_a2, X_a3) = g_e'(X_a1, X_a2, X_a3, 0) and g_i(X_a1, X_a2, X_a3) = g_i'(X_a1, X_a2, X_a3, 0).
Consider source information s_I = s_I' = (s_1, ..., s_k), and let s_E' be the corresponding value on the zero curve.
Basic idea: the NC code simulates the IC solution. We show by induction that running NC on input s_I corresponds to running IC on input (s_I', s_E').
Inductive claim: the information x_e carried by edge e in NC is exactly s_e'. Indeed,
x_e = f_e(x_a1, x_a2, x_a3) = g_e'(x_a1, x_a2, x_a3, 0) = g_e'(x_a1, x_a2, x_a3, f_B(s_I', s_E')) = g_e'(s_a1', s_a2', s_a3', f_B(s_I', s_E')) = s_e',
where we used f_B(s_I', s_E') = 0 (the choice of s_E') and, by induction, x_a = s_a' on each incoming edge a.
To decode s_i at terminal i we use g_i:
Decoder_i = g_i(x_a1, x_a2, x_a3) = g_i'(x_a1, x_a2, x_a3, 0) = g_i'(s_a1', s_a2', s_a3', f_B(s_I', s_E')) = s_i' = s_i.
We obtain a valid NC code!

What now?
Outline:
NC feasible implies IC feasible (works for both linear and non-linear codes).
IC feasible implies NC feasible (we showed a proof for the linear case; next we adapt it to the non-linear case).

General: IC ⇒ NC
We use exactly the same proof! Where did we use linearity?
Crucial property: for all s_I' there exists s_E' such that f_B(s_I', s_E') = 0.
We need to prove this property for general encoding functions.
The property follows from terminal t_all: given S_I' and X_B = f_B(S_I', S_E'), it must be able to decode S_E'.
Thus, for any fixed s_I', f_B is 1-1 as a function of S_E'.
The support of X_B equals the support of S_E' (the number of different X_B values is exactly the number of different S_E' values), so for each fixed s_I' the map is a bijection ("each row is a permutation").
Thus the property holds!
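The counting argument can be summarized in a short displayed form (my restatement of the slide's reasoning):

```latex
% Counting argument behind the crucial property, restated from the slide.
\[
  |\mathrm{supp}(X_B)| = 2^{c_B n} = 2^{n \sum_e c_e}
  = \prod_e 2^{c_e n} = |\mathrm{supp}(S_E')| .
\]
% Decodability at t_all: from (S_I', X_B) one can recover S_E', so for every fixed
% s_I' the map  s_E' -> f_B(s_I', s_E')  is injective; an injective map between
% finite sets of equal size is a bijection, hence some s_E' is mapped to 0:
\[
  \forall\, s_I' \ \exists!\, s_E' : \quad f_B(s_I', s_E') = 0 .
\]
```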

What now?
Outline:
NC feasible implies IC feasible (works for both linear and non-linear codes).
IC feasible implies NC feasible (a new proof for the linear case that extends to the non-linear case).
Multicast IC can be represented by unicast IC (linear codes only) [MalekiCadambeJafar].

Multicast vs. Unicast
In the previous reduction we used an IC instance which "multicasts" information to different terminals: the same information is wanted by more than one terminal.
In NC, any (multiple) multicast can be reduced to (multiple) unicast [DoughertyZeger].
Does the same phenomenon hold for IC?

Multicast vs. Unicast
In the previous reduction we used an IC instance which "multicasts" information to different terminals: the same information is wanted by more than one terminal.
In NC, any (multiple) multicast can be reduced to (multiple) unicast.
Does the same phenomenon hold for IC?
Recent work by [MalekiCadambeJafar] shows that unicast suffices in this case (if restricted to linear encoding/decoding).
This implies that, for linear encoding, NC reduces to (multiple) unicast IC!
In the unicast setting each terminal wants a different message, there are the same number of sources and terminals, and IC can be characterized by a side-information graph rather than a side-information hypergraph.

Insufficiency of Linear Codes for multiple-unicast IC (slides by Viveck Cadambe)
[Figures over several slides: a groupcast (GC) instance and the multiple-unicast (MU) instance constructed from it; the rate relation shown holds for linear schemes.]
In general, for linear schemes, auxiliary destinations are needed to achieve the equivalence of linear schemes.

Some open problems
Multicast vs. unicast for general encoding.
Such an equivalence would be surprising: there are problems known to be more difficult in the multiple-multicast setting (e.g., IC via cycle packing [ChaudhryAsadSprintsonLangberg]).
Capacity: can one determine whether a rate R is in the capacity region of NC via knowledge of the capacity region of IC?
The reduction is not robust enough to withstand the closure operation in the definition of capacity.
The answer is yes for the linear case, and also for co-located NC sources [WongLangbergEffros].
[Figure: the rate-region view with R_e' = c_e for every edge e.]

Some open problems
ε-error vs. zero error in communication: does allowing some error increase the achievable rate in NC/IC?
For IC there is no advantage [LangbergEffros] to allowing a small error in communication; can this extend to NC?
For NC this is not known! In NC, "no advantage" is known for co-located sources [ChanGrant] [LangbergEffros] and other cases.
Can we use the equivalence between NC and IC?
There are intriguing connections to other problems such as the edge removal problem [HoEffrosJalali].
Algorithms: wide open, both in the NC setting and in the IC setting.

Conclusions
Network communication is challenging: it combines topology with information.
We discussed an equivalence between the network coding and index coding problems.
The reduction separates information from topology and significantly simplifies the study of network communication.
Index Coding is a simple but representative instance of general network communication.
Thanks!