1
Zero-error source-channel coding with source side information at the decoder
J. Nayak, E. Tuncel and K. Rose
University of California, Santa Barbara
2
Outline
The problem
Asymptotically vanishing error case
Zero error: unrestricted input, restricted input
How large are the gains?
Conclusions
3
The Problem
Is separate source and channel coding optimal?
[Block diagrams: a joint source-channel encoder/decoder pair (sc) versus a separate source encoder/decoder cascaded with a channel encoder/decoder; in both, U^n is sent over the channel p(y|x), the decoder also observes the side information V^n and outputs the reconstruction Û^n.]
Does an encoder-decoder pair exist?
4
Asymptotically Vanishing Probability of Error
Source coding: R > H(U|V), achievable by a Slepian-Wolf code
Channel coding: R < C
Source-channel code (Shamai et al.)
Communication is not possible if H(U|V) > C
Separate source and channel coding is asymptotically optimal
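As a quick numeric illustration of the separation condition (a minimal sketch; the joint distribution and the binary symmetric channel below are made-up examples, not taken from the talk): separate coding at one channel use per source symbol works exactly when H(U|V) < C.

```python
import math

# Illustrative joint pmf p(u, v) on a binary source pair (not from the talk):
# V is a noisy observation that helps the decoder predict U.
p_uv = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def conditional_entropy(p_uv):
    """H(U|V) = sum over (u,v) of p(u,v) * log2( p(v) / p(u,v) ), in bits."""
    p_v = {}
    for (u, v), p in p_uv.items():
        p_v[v] = p_v.get(v, 0.0) + p
    h = 0.0
    for (u, v), p in p_uv.items():
        if p > 0:
            h -= p * math.log2(p / p_v[v])
    return h

def bsc_capacity(eps):
    """Capacity of a binary symmetric channel with crossover eps, in bits per use."""
    if eps in (0.0, 1.0):
        return 1.0
    h2 = -eps * math.log2(eps) - (1 - eps) * math.log2(1 - eps)
    return 1.0 - h2

h_u_given_v = conditional_entropy(p_uv)   # = h2(0.2) ~ 0.722 bits
c = bsc_capacity(0.03)                    # ~ 0.806 bits/use
print(f"H(U|V) = {h_u_given_v:.3f} bits, C = {c:.3f} bits/use")
print("separate coding feasible at 1 channel use per symbol:", h_u_given_v < c)
```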
5
Channel
Channel transition probability p(y|x), y ∈ Y, x ∈ X
Characteristic graph of the channel: G_X = (X, E_X), where two inputs x, x' are connected iff some output y satisfies p(y|x) > 0 and p(y|x') > 0, i.e. the decoder could confuse them
Examples
Noiseless channel: edge-free graph
Conventional channel (every transition probability positive): complete graph
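A small sketch of this construction, assuming the standard zero-error definition (two inputs are connected iff some output is reachable from both); the example channels below are illustrative only.

```python
from itertools import combinations

def channel_characteristic_graph(p_y_given_x):
    """Characteristic (confusability) graph of a channel.

    p_y_given_x: dict mapping input x -> dict mapping output y -> probability.
    Two inputs x, x' are connected iff some output y has positive probability
    under both, i.e. the receiver could confuse them.
    Returns (vertices, set of frozenset edges).
    """
    vertices = sorted(p_y_given_x)
    edges = set()
    for x1, x2 in combinations(vertices, 2):
        outputs = set(p_y_given_x[x1]) | set(p_y_given_x[x2])
        if any(p_y_given_x[x1].get(y, 0) > 0 and p_y_given_x[x2].get(y, 0) > 0
               for y in outputs):
            edges.add(frozenset((x1, x2)))
    return vertices, edges

# Noiseless binary channel: no two inputs share an output -> edge-free graph.
noiseless = {0: {0: 1.0}, 1: {1: 1.0}}
print(channel_characteristic_graph(noiseless))   # ([0, 1], set())

# "Conventional" channel with all transitions positive -> complete graph.
noisy = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.2, 1: 0.8}}
print(channel_characteristic_graph(noisy))       # ([0, 1], {frozenset({0, 1})})
```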
6
Channel Code
Code = symbols from an independent set of G_X
1-use capacity = log₂ α(G_X), with α the independence number
n uses of the channel: graph = G_X^n, the n-fold AND product of G_X
Zero-error capacity: C(G_X) = lim_{n→∞} (1/n) log₂ α(G_X^n)
Depends only on the characteristic graph G_X
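The pentagon makes this concrete. The sketch below (illustrative, and it only certifies a lower bound) checks Shannon's classical independent set of size 5 in the 2-fold AND product of C5, which gives C(C5) ≥ (1/2) log₂ 5; Lovász later showed this is exactly the capacity.

```python
import math
from itertools import combinations

def and_product_adjacent(edges, u, v):
    """Adjacency in the n-fold AND (strong) product of the graph given by `edges`:
    u != v, and in every coordinate the symbols are equal or adjacent."""
    if u == v:
        return False
    return all(a == b or frozenset((a, b)) in edges for a, b in zip(u, v))

# Pentagon C5: vertices 0..4, i adjacent to i+1 (mod 5).
c5_edges = {frozenset((i, (i + 1) % 5)) for i in range(5)}

# Shannon's independent set of size 5 in the 2-fold AND product of C5:
ind_set = [(i, (2 * i) % 5) for i in range(5)]
assert all(not and_product_adjacent(c5_edges, u, v)
           for u, v in combinations(ind_set, 2))

# 5 pairwise non-confusable input pairs over 2 channel uses, hence
# zero-error capacity C(C5) >= (1/2) * log2(5) bits per use.
print("lower bound on C(C5):", round(0.5 * math.log2(5), 3), "bits/use")
```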
7
Source With Side Information
(U, V) ∈ U × V, (U, V) ~ p(u,v)
Support set S_UV = {(u,v) ∈ U × V : p(u,v) > 0}
Confusability graph on U: G_U = (U, E_U), where u and u' are connected iff (u,v) and (u',v) both lie in S_UV for some v
Examples
U = V: edge-free graph
U, V independent: complete graph
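A minimal sketch of the confusability-graph construction, using made-up joint distributions that reproduce the two example cases on the slide.

```python
from itertools import combinations

def source_confusability_graph(p_uv):
    """Confusability graph G_U for a source with decoder side information V.

    p_uv: dict mapping (u, v) -> probability.
    u and u' are connected iff some v has p(u, v) > 0 and p(u', v) > 0,
    i.e. the side information cannot separate them.
    """
    support = {uv for uv, p in p_uv.items() if p > 0}
    u_vals = sorted({u for u, _ in support})
    side_vals = {v for _, v in support}
    edges = {frozenset((u1, u2))
             for u1, u2 in combinations(u_vals, 2)
             if any((u1, v) in support and (u2, v) in support for v in side_vals)}
    return u_vals, edges

# Toy example (illustrative): U = V  ->  edge-free graph.
print(source_confusability_graph({(0, 0): 0.5, (1, 1): 0.5}))

# U, V independent -> every pair of source symbols shares some v -> complete graph.
print(source_confusability_graph({(u, v): 0.25 for u in (0, 1) for v in (0, 1)}))
```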
8
Source Code
Rate depends only on G_U
Connected nodes cannot receive the same codeword
Encoding = coloring of G_U
Rate = log₂ χ(G_U), with χ the chromatic number
Two cases: unrestricted inputs, restricted inputs
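A brute-force illustration of "encoding = coloring", practical only for tiny graphs: computing χ of the pentagon used as a source confusability graph, so a scalar code for it has rate log₂ 3.

```python
import math
from itertools import product

def chromatic_number(vertices, edges):
    """Smallest k such that the graph has a proper k-coloring
    (exhaustive search; intended only for very small graphs)."""
    for k in range(1, len(vertices) + 1):
        for colors in product(range(k), repeat=len(vertices)):
            coloring = dict(zip(vertices, colors))
            if all(coloring[u] != coloring[v]
                   for e in edges for u, v in [tuple(e)]):
                return k
    return len(vertices)

# Pentagon as a source confusability graph: chi(C5) = 3,
# so the scalar (1-instance) source code rate is log2(3) bits.
c5 = list(range(5))
c5_edges = {frozenset((i, (i + 1) % 5)) for i in range(5)}
k = chromatic_number(c5, c5_edges)
print("chi(C5) =", k, "-> rate log2(chi) =", round(math.log2(k), 3), "bits")
```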
9
Unrestricted Input
(u,v) not necessarily in S_UV
Correct decoding required only if (u,v) ∈ S_UV
1-instance rate: log₂ χ(G_U)
n-instance graph = G_U^(n), the n-fold OR product of G_U (tuples are connected iff they are connected in at least one coordinate)
Asymptotic rate for a UI code: R_UI(G_U) = lim_{n→∞} (1/n) log₂ χ(G_U^(n)) = log₂ χ_f(G_U), the fractional chromatic number
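A sketch of the OR-product construction together with the easy direction: reusing one fixed proper coloring of G_U in every coordinate is always a valid multi-instance UI code, so the asymptotic rate is at most log₂ χ(G_U); the interesting question is how far below that it can drop.

```python
from itertools import product, combinations

def or_product_edges(vertices, edges, n):
    """n-fold OR product G^(n): distinct tuples u, v are adjacent iff
    u_i ~ v_i in G for at least one coordinate i."""
    tuples = list(product(vertices, repeat=n))
    return tuples, {frozenset((u, v))
                    for u, v in combinations(tuples, 2)
                    if any(frozenset((a, b)) in edges for a, b in zip(u, v))}

# Pentagon source graph, 2 instances.
c5 = list(range(5))
c5_edges = {frozenset((i, (i + 1) % 5)) for i in range(5)}
verts2, edges2 = or_product_edges(c5, c5_edges, 2)
print(len(verts2), "vertices,", len(edges2), "edges in the 2-fold OR product of C5")

# Coloring each instance with the same proper 3-coloring of C5 is proper on the
# OR product, so instance-by-instance coding always works at 2*log2(3) bits here;
# the asymptotic UI rate for C5 is only log2(5/2) bits per instance.
c1 = {0: 0, 1: 1, 2: 0, 3: 1, 4: 2}          # a proper 3-coloring of C5
col = {v: (c1[v[0]], c1[v[1]]) for v in verts2}
assert all(col[u] != col[v] for e in edges2 for u, v in [tuple(e)])
```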
10
Restricted Input
(u,v) always in S_UV
1-instance rate: log₂ χ(G_U)
n-instance graph = G_U^n, the n-fold AND product of G_U (tuples are connected iff every coordinate is equal or connected)
Asymptotic rate = Witsenhausen rate of the source graph: R_W(G_U) = lim_{n→∞} (1/n) log₂ χ(G_U^n)
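For the pentagon, an explicit 5-coloring of the 2-fold AND product certifies χ(C5^2) ≤ 5 and hence R_W(C5) ≤ (1/2) log₂ 5 = log₂ √5, the value quoted on the later "Restricted Input" example slide. This is a verification sketch, not from the talk.

```python
import math
from itertools import product, combinations

def and_product_edges(vertices, edges, n):
    """n-fold AND (strong) product G^n: distinct tuples u, v are adjacent iff
    every coordinate is equal or adjacent in G."""
    tuples = list(product(vertices, repeat=n))
    return tuples, {frozenset((u, v))
                    for u, v in combinations(tuples, 2)
                    if all(a == b or frozenset((a, b)) in edges for a, b in zip(u, v))}

c5 = list(range(5))
c5_edges = {frozenset((i, (i + 1) % 5)) for i in range(5)}
verts2, edges2 = and_product_edges(c5, c5_edges, 2)

# A proper 5-coloring of the 2-fold AND product: color (u1, u2) by u1 + 2*u2 mod 5.
col = {(u1, u2): (u1 + 2 * u2) % 5 for u1, u2 in verts2}
assert all(col[u] != col[v] for e in edges2 for u, v in [tuple(e)])

# 25 source pairs fit into 5 codewords over 2 instances, so
# R_W(C5) <= (1/2) * log2(5) = log2(sqrt(5)) bits per instance.
print("upper bound on R_W(C5):", round(0.5 * math.log2(5), 3), "bits/instance")
```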
11
Source-Channel Coding
1 source instance - 1 channel use code: encoder sc₁, decoder
If u₁ and u₂ are not distinguishable given the side information, then sc₁(u₁) and sc₁(u₂) should not result in the same output y
Equivalently: u₁ and u₂ connected in G_U ⟹ sc₁(u₁) ≠ sc₁(u₂) and sc₁(u₁), sc₁(u₂) not connected in G_X
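A small checker for exactly this condition; the pentagon example also previews the next slide's point that a graph together with its complement is always a compatible pair, since the identity map satisfies the condition. The example graphs are the ones used later in the talk; the checker itself is an illustrative sketch.

```python
from itertools import combinations

def is_scalar_code(sc, gu_edges, gx_edges):
    """sc: dict mapping source symbols to channel inputs.
    Valid iff every edge (u1, u2) of the source confusability graph G_U is mapped
    to two distinct, non-adjacent inputs of the channel characteristic graph G_X,
    so the channel outputs for u1 and u2 can never coincide."""
    return all(sc[u1] != sc[u2] and frozenset((sc[u1], sc[u2])) not in gx_edges
               for e in gu_edges for u1, u2 in [tuple(e)])

# Pentagon source graph G_U = C5; channel graph G_X = complement of C5
# (which is again a 5-cycle, on the "diagonals").
verts = list(range(5))
gu_edges = {frozenset((i, (i + 1) % 5)) for i in range(5)}
gx_edges = {frozenset(p) for p in combinations(verts, 2)} - gu_edges

identity = {u: u for u in verts}
print(is_scalar_code(identity, gu_edges, gx_edges))   # True: (G, complement of G) is compatible
```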
12
Source-Channel Coding
If an n-n UI (RI) code exists for some n, (G_U, G_X) is a UI (RI) compatible pair
(G, complement of G) is always a UI and RI compatible pair: the identity map satisfies the condition above
13
Unrestricted Input
Source graph G_U5: pentagon on {a, b, c, d, e}; channel graph G_X5: pentagon on {A, B, C, D, E}
Source = complement of channel
C(G_X5) = log₂ √5
R_UI(G_U5) = log₂ (5/2) > C(G_X5)
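A sketch verifying the numbers in this example: weighting the five 2-element independent sets of C5 by 1/2 each gives a fractional coloring of total weight 5/2 (the fractional chromatic number behind R_UI = log₂(5/2)), which indeed exceeds C = log₂ √5.

```python
import math

# The five independent sets of size 2 in C5: vertex i together with i+2 mod 5.
ind_sets = [frozenset((i, (i + 2) % 5)) for i in range(5)]

# Weight 1/2 on each set covers every vertex with total weight exactly 1,
# so it is a valid fractional coloring of total weight 5 * 1/2 = 5/2.
coverage = {v: sum(0.5 for s in ind_sets if v in s) for v in range(5)}
assert all(abs(w - 1.0) < 1e-12 for w in coverage.values())

r_ui = math.log2(5 / 2)        # unrestricted-input rate of C5  ~ 1.322 bits
cap = math.log2(math.sqrt(5))  # zero-error capacity of C5      ~ 1.161 bits
print(f"R_UI = {r_ui:.3f} > C = {cap:.3f}")
```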
14
Restricted Input
Previous example not useful: R_W(G_U5) = log₂ √5 = C(G_X5)
Source graph G_U = complement of channel graph G_X
Approach: find f such that C(G_X) ≤ f(G_X) ≤ R_W(G_U)
If either inequality is strict, done!
15
Lovász theta function ϑ(G): computable via a semidefinite program (see the sketch below)
Lovász: C(G) ≤ log₂ ϑ(G)
Key result: log₂ ϑ(G_X) ≤ R_W(G_U) whenever G_U is the complement of G_X
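A sketch of one standard semidefinite-programming formulation of ϑ(G), evaluated on the pentagon where ϑ(C5) = √5. It assumes the cvxpy package with an SDP-capable solver (e.g. SCS) is installed; this is an illustration, not necessarily the formulation used in the talk.

```python
import math
import cvxpy as cp   # assumes cvxpy with an SDP-capable solver is installed

def lovasz_theta(n, edges):
    """One standard SDP formulation of the Lovász theta function:
    maximize the sum of all entries of X, subject to X PSD, trace(X) = 1,
    and X_ij = 0 for every edge (i, j) of the graph."""
    X = cp.Variable((n, n), symmetric=True)
    constraints = [X >> 0, cp.trace(X) == 1]
    constraints += [X[i, j] == 0 for (i, j) in edges]
    problem = cp.Problem(cp.Maximize(cp.sum(X)), constraints)
    return problem.solve()

c5_edges = [(i, (i + 1) % 5) for i in range(5)]
theta = lovasz_theta(5, c5_edges)
print(round(theta, 3), "~ sqrt(5) =", round(math.sqrt(5), 3))  # Lovász: C(C5) = log2 sqrt(5)
```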
16
Restricted Input
G_U = Schläfli graph (27-vertex graph), G_X = complement of G_U
Haemers: C(G_X) < log₂ ϑ(G_X), hence R_W(G_U) > C(G_X) and separate coding cannot reach one channel use per source symbol
Code exists since G_U = complement of G_X
17
How large are the gains?
Measured in channel uses per source symbol needed by separate coding
Alon '90 provides graphs with C(G) ≤ log k for which the gap to the required source rate is unbounded: given any l, there exist G such that separate coding needs more than l channel uses per source symbol, while the joint code uses only one
18
Conclusions
Under a zero-error constraint, separate source and channel coding is in general asymptotically suboptimal; this is not so in the asymptotically vanishing error case.
In the zero-error case, the gains from joint coding can be arbitrarily large.
19
Scalar Code Design Complexity
Instance: source graph G
Question: does a scalar source-channel code exist from G to channel H?
Equivalent to a graph homomorphism problem from G into a fixed target graph determined by the channel (the complement of its characteristic graph)
NP-complete whenever the target graph is non-bipartite (Hell & Nešetřil '90)
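An exhaustive-search sketch of this decision problem, exponential in the number of source symbols, consistent with the hardness result above. The instances below are the pentagon examples used earlier in the talk, reused here purely for illustration.

```python
from itertools import combinations, product

def scalar_code_exists(gu_verts, gu_edges, gx_verts, gx_edges):
    """Brute-force search for a scalar source-channel code: a map sc sending every
    G_U edge to a distinct, non-adjacent pair in G_X (equivalently, a homomorphism
    from G_U into the complement of G_X). Runs over all |X|^|U| maps."""
    for images in product(gx_verts, repeat=len(gu_verts)):
        sc = dict(zip(gu_verts, images))
        if all(sc[u1] != sc[u2] and frozenset((sc[u1], sc[u2])) not in gx_edges
               for e in gu_edges for u1, u2 in [tuple(e)]):
            return sc
    return None

# Pentagon source graph over a channel whose characteristic graph is its complement:
verts = list(range(5))
gu_edges = {frozenset((i, (i + 1) % 5)) for i in range(5)}
gx_edges = {frozenset(p) for p in combinations(verts, 2)} - gu_edges
print(scalar_code_exists(verts, gu_edges, verts, gx_edges))   # finds a valid map

# Same source over a "conventional" channel (complete characteristic graph): impossible.
complete = {frozenset(p) for p in combinations(verts, 2)}
print(scalar_code_exists(verts, gu_edges, verts, complete))   # None
```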
20
Future Work
Do UI compatible pairs (G_U, G_X) exist with R_W(G_U) < C(G_X) < R_UI(G_U)?
For what classes of graphs is separate coding optimal?