1
Distributed Joint Source-Channel Coding on a Multiple Access Channel with Side Information Vinod Sharma
2
Outline: Model; General Theorem; Special cases; Example; GMAC; Discrete Sources; Gaussian Sources; Orthogonal channels; Fading Channel; Hierarchical Networks; Conclusions
3
Joint Source-Channel Coding on a Multiple Access Channel [Block diagram: sources U_1^n and U_2^n, with encoder side information Z_1^n and Z_2^n, are mapped by Encoder 1 and Encoder 2 to channel inputs X_1^n and X_2^n; the MAC output Y^n, together with decoder side information Z^n, is fed to the decoder.]
4
Joint source-channel coding on a multiple access channel (contd.) U_{in}, n ≥ 1: iid data generated by user i; (U_{1n}, U_{2n}) are correlated. Z_{in}: side information at encoder i. Z_n: side information at the decoder. X_{in}: channel input from the i-th encoder at time n. Y_n: channel output at time n. The MAC is memoryless: p(y_n | x_1^n, x_2^n, y^{n-1}) = p(y_n | x_{1n}, x_{2n}). Aim: to transmit the data over the MAC so that the decoder can reconstruct the sources within given distortions. Source-channel separation does not hold for this system, so joint source-channel coding is needed.
5
Transmission of data from sensor nodes to fusion node [Diagram: sensor nodes in each cluster transmit over a multiple access channel to their cluster head; cluster heads forward the data to the fusion node; side information is also available.]
6
Transmission of data from Sensor nodes to Fusion Node Sensor nodes transmit data to their cluster heads. Cluster heads are more powerful nodes. Cluster heads transmit data directly to the fusion node. Within a cluster it becomes a Multiple Access channel. Due to proximity of sensor nodes, data generated by them is correlated. MAC is a building block for general sensor networks also.
7
Theorem (R Rajesh, V K Varshneya, V Sharma, "Distributed joint source channel coding on a multiple access channel with side information", ISIT 08): The sources can be transmitted over the multiple access channel with distortions (D_1, D_2) if there exist random variables (W_1, W_2, X_1, X_2) and a function f_D : 𝒲_1 × 𝒲_2 × 𝒵 → (Û_1, Û_2) such that 1) the joint distribution factors as p(u_1, u_2, z_1, z_2, z) p(w_1 | u_1, z_1) p(w_2 | u_2, z_2) p(x_1 | w_1) p(x_2 | w_2), 2) the estimates (Û_1, Û_2) = f_D(W_1, W_2, Z) satisfy E[d_i(U_i, Û_i)] ≤ D_i, i = 1, 2,
8
Theorem (contd.) 3) and the constraints
I(U_1, Z_1; W_1 | W_2, Z) < I(X_1; Y | X_2, W_2, Z),
I(U_2, Z_2; W_2 | W_1, Z) < I(X_2; Y | X_1, W_1, Z),
I(U_1, U_2, Z_1, Z_2; W_1, W_2 | Z) < I(X_1, X_2; Y | Z),
where 𝒲_i is the set in which W_i takes values.
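These conditions are inequalities between conditional mutual informations. As a minimal sketch (not from the paper; the helper names are ours), the snippet below evaluates I(A; B | C) from a joint pmf over small finite alphabets, which is enough to check conditions of this form numerically once a candidate joint distribution of all the variables is fixed.

```python
import numpy as np

def entropy_bits(p):
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def cond_mutual_info(pmf, axes_a, axes_b, axes_c=()):
    """I(A; B | C) = H(A,C) + H(B,C) - H(A,B,C) - H(C); pmf axes index the variables."""
    all_axes = set(range(pmf.ndim))
    def H(keep):
        keep = set(keep)
        drop = tuple(sorted(all_axes - keep))
        return entropy_bits(pmf.sum(axis=drop) if drop else pmf)
    a, b, c = set(axes_a), set(axes_b), set(axes_c)
    return H(a | c) + H(b | c) - H(a | b | c) - H(c)

# Toy usage: axis 0 = U1, axis 1 = U2, with the joint pmf of Example 1 below.
p = np.array([[1/3, 1/6],
              [1/6, 1/3]])
print(cond_mutual_info(p, (0,), (1,)))   # I(U1; U2) ≈ 0.082 bits
```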
9
Theorem (Contd) Proof of theorem uses : -Vector quantization. -Joint source-channel coding. -Decoding by joint weak typicality. -Estimation of the sources Comments : Correlations in (U 1, U 2 ) help in reducing LHS and increasing RHS
10
Generalizations The above theorem can be generalized to: the multiple-user case; recovering functions of the sources; discrete/continuous source and channel alphabets (includes the practically important Gaussian MAC); unbounded distortion measures (includes mean square error, MSE).
11
Special Cases 1. Lossless MAC with correlated sources: (Z_1, Z_2, Z) independent of (U_1, U_2), W_i = U_i, i = 1, 2. Then our result reduces to: H(U_1|U_2) < I(X_1;Y|X_2,U_2), H(U_2|U_1) < I(X_2;Y|X_1,U_1), H(U_1,U_2) < I(X_1,X_2;Y). Recovers Cover, El Gamal, Salehi (1980).
12
Special Cases 2. Lossy multiple access communication: take (Z_1, Z_2, Z) independent of (U_1, U_2, W_1, W_2). Then our result reduces to I(U_1;W_1|W_2) < I(X_1;Y|X_2,W_2), I(U_2;W_2|W_1) < I(X_2;Y|X_1,W_1), I(U_1,U_2;W_1,W_2) < I(X_1,X_2;Y). Generalizes Cover, El Gamal, Salehi (1980) to the lossy case.
13
Special Cases 3. Lossy distributed source coding with side information: the MAC is taken as a dummy channel with Y = (X_1, X_2). R_1 > I(U_1,Z_1;W_1|W_2,Z), R_2 > I(U_2,Z_2;W_2|W_1,Z), R_1 + R_2 > I(U_1,U_2,Z_1,Z_2;W_1,W_2|Z). Generalizes Slepian-Wolf (1973), Wyner and Ziv (1976), Gastpar (2004). 4. Lossy GMAC: Y = X_1 + X_2 + N. Generalizes Lapidoth and Tinguely (2006).
14
Special Cases 5. Compound MAC and interference channel with side information: two decoders; decoder i has access to Y_i and Z_i. Take U_i = W_i, i = 1, 2. Applying the theorem twice (receiver-only side information) for i = 1, 2: H(U_1|U_2,Z_i) < I(X_1;Y_i|X_2,U_2,Z_i), H(U_2|U_1,Z_i) < I(X_2;Y_i|X_1,U_1,Z_i), H(U_1,U_2|Z_i) < I(X_1,X_2;Y_i|Z_i). This gives sufficient conditions for interference channels in the strong interference case. Recovers the results of D. Gunduz and E. Erkip (ISIT 07).
15
Special Cases Also recovers the results: Lossless transmission over a MAC with receiver side information – D Gunduz, E Erkip UCSD ITA Workshop 07 Mixed side information- M Fleming, M Effros IEEE TIT 06 Lossless Multiple access communication with common information – Slepian and Wolf - 1973 Correlated sources over orthogonal channels- J Barros and S Servetto- ISIT 03
16
Example 1 (U 1,U 2 ) have joint distribution p(0, 0) = p(1, 1) =1/3 p(1, 0) = p(0, 1) =1/6 H(U 1 ) = H(U 2 ) = 1. For lossless transmission Using independent coding we need rates R 1 ≥ H(U 1 ),R 2 ≥ H(U 2 ) Exploiting correlations using Slepian-Wolf coding we need R 1 ≥ H(U 1 |U 2 ) = 0.918,R 2 ≥ 0.918 and R 1 + R 2 ≥ H(U 1,U 2 ) = 1.918.
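These rate values can be checked directly from the joint pmf; the short script below (written for this note, not part of the slides) reproduces them.

```python
import numpy as np

p = np.array([[1/3, 1/6],
              [1/6, 1/3]])                 # rows index U1, columns index U2

def H(q):
    q = np.asarray(q, dtype=float).ravel()
    q = q[q > 0]
    return float(-(q * np.log2(q)).sum())

H_joint = H(p)                              # H(U1,U2) ≈ 1.918
H_U1 = H(p.sum(axis=1))                     # H(U1) = 1
H_U2 = H(p.sum(axis=0))                     # H(U2) = 1
print(H_U1, H_joint - H_U2, H_joint)        # 1, H(U1|U2) ≈ 0.918, 1.918
```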
17
Example (contd.) MAC: Y = X_1 + X_2, where X_1, X_2 take values in {0, 1} and Y in {0, 1, 2}. This MAC does not satisfy the source-channel separation conditions. The sum capacity of this channel with independent X_1, X_2 is 1.5 bits per channel use. Hence even Slepian-Wolf coding will not provide lossless transmission, since H(U_1,U_2) = 1.918 > 1.5.
18
Example (contd.) Joint source-channel coding: take X_i = U_i, i = 1, 2. Then the channel can carry log2(3) = 1.585 bits per use, so we still cannot transmit losslessly (1.585 < 1.918). With distortion: consider Hamming distortion with allowable distortion 4%. Then we need R_1 ≥ H(U_1) - H_b(d) = 1 - H_b(0.04) ≈ 0.758. Thus with independent coding we will not get this distortion (the required sum rate 2 × 0.758 ≈ 1.52 exceeds the 1.5-bit sum capacity), but with correlated (X_1, X_2) it is possible.
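The channel-side numbers can also be reproduced with a short script (a sketch written for this note): with independent uniform inputs the adder MAC gives I(X_1,X_2;Y) = H(Y) = 1.5 bits, with X_i = U_i it gives H(Y) = log2(3) ≈ 1.585 bits, and the binary rate-distortion bound at Hamming distortion 0.04 is 1 - H_b(0.04) ≈ 0.758.

```python
import numpy as np

def H(pmf):
    pmf = np.asarray(pmf, dtype=float)
    pmf = pmf[pmf > 0]
    return float(-(pmf * np.log2(pmf)).sum())

# Adder MAC Y = X1 + X2 is noiseless given the inputs, so I(X1,X2;Y) = H(Y).
print(H([0.25, 0.5, 0.25]))        # independent uniform inputs: 1.5 bits
print(H([1/3, 1/3, 1/3]))          # X_i = U_i with the Example 1 pmf: log2(3) ≈ 1.585 bits

# Binary rate-distortion bound at Hamming distortion d = 0.04: R(d) = H(U1) - H_b(d).
d = 0.04
print(1 - H([d, 1 - d]))           # ≈ 0.758 bits per source symbol
```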
19
Example (contd.) With side information: let Z_1 be obtained from U_2 via a binary symmetric channel with crossover probability 0.3; similarly for Z_2. Z = (Z_1, Z_2, V), where V = U_1.U_2.N, with N independent of (U_1, U_2) and P[N = 0] = P[N = 1] = 0.5. The sum rate needed for lossless transmission is: ≥ 1.8 bits with Z_1 only; ≥ 1.683 bits with Z_1 and Z_2; ≥ 1.606 bits with Z only; ≥ 1.412 bits with Z_1, Z_2 and Z. Thus with Z_1, Z_2, Z we can transmit losslessly even with independent coding (1.412 < 1.5).
20
Gaussian MAC (GMAC) Discrete alphabet sources over a GMAC: Y = X_1 + X_2 + N, with N independent of (X_1, X_2) and N ~ N(0, σ_N²). Initially we consider no side information. Average power constraint: E[X_i²] ≤ P_i. For lossless transmission we need H(U_1|U_2) < I(X_1;Y|X_2,U_2), H(U_2|U_1) < I(X_2;Y|X_1,U_1), H(U_1,U_2) < I(X_1,X_2;Y), (1) where X_1 - U_1 - U_2 - X_2 forms a Markov chain. Comment: the above inequalities are not explicit enough; we make them more explicit below.
21
Lemma H(U_1|U_2) < I(X_1;Y|X_2,U_2) < I(X_1;Y|X_2), H(U_2|U_1) < I(X_2;Y|X_1,U_1) < I(X_2;Y|X_1), H(U_1,U_2) < I(X_1,X_2;Y). (1a) Lemma 1: I(X_1;Y|X_2), I(X_2;Y|X_1) and I(X_1,X_2;Y) are maximized by jointly Gaussian (X_1, X_2) with zero mean and correlation ρ.
22
Comments The conditions obtained by relaxing the RHS in (1) are more explicit and are used to obtain efficient coding schemes. For a coding scheme, check whether (1) is satisfied; otherwise change the scheme appropriately. We develop a distributed coding scheme that maps U_i to X_i such that (X_1, X_2) are jointly Gaussian with a given correlation ρ.
23
Lemma Lemma 2: If X_1 - U_1 - U_2 - X_2 is a Markov chain and (X_1, X_2) are jointly Gaussian, then the correlation of (X_1, X_2) is upper bounded (this is the bound used in Example 2 below). Lemma 3: Any two-dimensional jointly Gaussian density can be approximated arbitrarily closely by a weighted sum of products of marginal Gaussian densities.
24
A joint source-channel coding scheme (R Rajesh, V Sharma, "A joint source-channel coding scheme for transmission of discrete correlated sources over a Gaussian MAC", ISITA 08) The weights p_i, q_i in Lemma 3 can be negative. Thus, to approximate f(x_1, x_2), the desired jointly Gaussian density with correlation ρ, find the approximation g from Lemma 3 that minimizes the approximation error subject to p_i ≥ 0, q_i ≥ 0, Σ_i p_i q_i = 1 and zero mean.
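A minimal numerical sketch of this idea is given below. It is not the authors' algorithm: it assumes a fixed grid of component means and a common component standard deviation, and it fits one non-negative weight per product term (rather than the separable p_i q_i form of Lemma 3) by non-negative least squares on a grid.

```python
import numpy as np
from scipy.stats import norm, multivariate_normal
from scipy.optimize import nnls

rho, sigma_c = 0.6, 0.5                   # target correlation; component std (assumed)
means = np.linspace(-2.5, 2.5, 9)         # grid of component means (assumed)

x = np.linspace(-3, 3, 41)                # evaluation grid for the least-squares fit
X1, X2 = np.meshgrid(x, x)
pts = np.column_stack([X1.ravel(), X2.ravel()])
target = multivariate_normal(mean=[0, 0], cov=[[1, rho], [rho, 1]]).pdf(pts)

# Each basis function is a product of two one-dimensional Gaussian densities.
basis = [norm.pdf(pts[:, 0], m1, sigma_c) * norm.pdf(pts[:, 1], m2, sigma_c)
         for m1 in means for m2 in means]
A = np.column_stack(basis)

w, resid = nnls(A, target)                # non-negative weights
w /= w.sum()                              # renormalize so the mixture is a proper density
print(f"residual {resid:.4f}, {np.count_nonzero(w > 1e-6)} active components")
```

This only illustrates the density-approximation step; the slides use it to construct distributed mappings from U_i to X_i.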
25
Discrete alphabet sources over a GMAC (contd.) Thus taking (X_1, X_2) jointly Gaussian with zero mean, correlation ρ and Var(X_i) = P_i provides the conditions for lossless transmission:
H(U_1|U_2) < 0.5 log2(1 + (1 - ρ²) P_1 / σ_N²),
H(U_2|U_1) < 0.5 log2(1 + (1 - ρ²) P_2 / σ_N²),   (2)
H(U_1,U_2) < 0.5 log2(1 + (P_1 + P_2 + 2ρ√(P_1 P_2)) / σ_N²).
26
Example 2 (U_1, U_2) has distribution p(0, 0) = p(1, 1) = p(0, 1) = 1/3, p(1, 0) = 0. Power constraints: P_1 = 3, P_2 = 4, σ_N² = 1, H(U_1,U_2) = 1.585. For independent (X_1, X_2), the RHS of the third inequality in (2) is 0.5 log2(8) = 1.5. Thus (U_1, U_2) cannot be transmitted over the GMAC with independent (X_1, X_2). Any ρ in [0.144, 0.7024] satisfies all three constraints.
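As a numerical check (a script written for this note), sweeping ρ against the conditions in (2) recovers the stated interval:

```python
import numpy as np

P1, P2, var_N = 3.0, 4.0, 1.0
p = np.array([[1/3, 1/3],
              [0.0, 1/3]])                  # p[u1, u2] for Example 2

def H(q):
    q = np.asarray(q, dtype=float).ravel()
    q = q[q > 0]
    return float(-(q * np.log2(q)).sum())

H12 = H(p)                                  # H(U1,U2) = log2(3) ≈ 1.585
H1_2 = H12 - H(p.sum(axis=0))               # H(U1|U2) ≈ 0.667
H2_1 = H12 - H(p.sum(axis=1))               # H(U2|U1) ≈ 0.667

rhos = np.linspace(0, 0.999, 2000)
ok = [(H1_2 < 0.5 * np.log2(1 + (1 - r**2) * P1 / var_N)) and
      (H2_1 < 0.5 * np.log2(1 + (1 - r**2) * P2 / var_N)) and
      (H12 < 0.5 * np.log2(1 + (P1 + P2 + 2 * r * np.sqrt(P1 * P2)) / var_N))
      for r in rhos]
feasible = rhos[np.array(ok)]
print(feasible[0], feasible[-1])            # ≈ 0.144 and ≈ 0.7024
```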
27
Example 2 (contd.) For ρ = 0.3 and 0.6 we obtained the density from the above optimization problem. The upper bound on ρ from Lemma 2 is 0.546.
28
Example 2 (contd)
29
GAUSSIAN SOURCES OVER A GAUSSIAN MAC (R Rajesh and V Sharma, "Source-channel coding for Gaussian sources over a Gaussian multiple access channel", Allerton 07) (U_1, U_2): zero mean, jointly Gaussian with covariance matrix [[σ_1², ρσ_1σ_2], [ρσ_1σ_2, σ_2²]]. Y = X_1 + X_2 + N, with N ~ N(0, σ_N²) independent of (X_1, X_2). P_i is the average power constraint for user i. There is no side information. This is a particularly relevant model for the change detection problem in sensor networks. Source-channel separation does not hold.
30
GAUSSIAN SOURCES OVER A GMAC We consider joint source-channel coding: (i) Amplify and forward (AF) (ii) Separation-Based (SB) (iii) Lapidoth-Tinguely (LT)
31
Amplify and Forward scheme (AF) [Block diagram: U_1 and U_2 are scaled to X_1 and X_2, sent over the GMAC, and the decoder estimates the sources from Y.] X_i is a scaled version of U_i such that the average power constraint P_i, i = 1, 2, is satisfied. For the two-user symmetric case AF is optimal for SNR ≤ ρ/(1 - ρ²) (Lapidoth and Tinguely (2006)). The resulting distortions are the MMSEs of U_i given Y (a sketch is given below).
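A minimal sketch of the AF distortions, assuming X_i = √(P_i/Var(U_i))·U_i and that the decoder applies the linear MMSE estimator of U_i from Y (optimal here since all quantities are jointly Gaussian); the function and parameter names are ours:

```python
import numpy as np

def af_distortions(P1, P2, var1, var2, rho, var_N):
    """Per-source MSE of amplify-and-forward over the scalar GMAC Y = X1 + X2 + N."""
    a1, a2 = np.sqrt(P1 / var1), np.sqrt(P2 / var2)           # amplification gains
    var_Y = P1 + P2 + 2 * rho * np.sqrt(P1 * P2) + var_N
    cov1 = a1 * var1 + a2 * rho * np.sqrt(var1 * var2)        # Cov(U1, Y)
    cov2 = a2 * var2 + a1 * rho * np.sqrt(var1 * var2)        # Cov(U2, Y)
    return var1 - cov1**2 / var_Y, var2 - cov2**2 / var_Y

# Symmetric example: unit-variance sources, correlation 0.1, unit noise power.
print(af_distortions(1.0, 1.0, 1.0, 1.0, rho=0.1, var_N=1.0))
```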
32
Separation-Based approach (SB) Vector quantization followed by Slepian-Wolf coding. The rate region for a given (D_1, D_2) is given in Viswanath (IEEE TIT 08). X_1, X_2 independent; capacity region of the GMAC: R_1 ≤ I(X_1;Y|X_2) = 0.5 log(1 + P_1/σ_N²), R_2 ≤ I(X_2;Y|X_1) = 0.5 log(1 + P_2/σ_N²), R_1 + R_2 ≤ I(X_1,X_2;Y) = 0.5 log(1 + (P_1 + P_2)/σ_N²).
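A small sketch (assumed helper, base-2 logs) for checking whether a pair of quantizer rates fits inside this capacity region:

```python
import numpy as np

def fits_gmac(R1, R2, P1, P2, var_N):
    c1 = 0.5 * np.log2(1 + P1 / var_N)
    c2 = 0.5 * np.log2(1 + P2 / var_N)
    c12 = 0.5 * np.log2(1 + (P1 + P2) / var_N)
    return R1 <= c1 and R2 <= c2 and R1 + R2 <= c12

print(fits_gmac(0.7, 0.7, P1=3.0, P2=3.0, var_N=1.0))   # example rates (assumed)
```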
33
Joint source-channel coding (of Lapidoth and Tinguely) Vector quantize the sources, i = 1, 2: generate 2^{nR_i} iid Gaussian codewords with variance 1; encode U_i^n by mapping it to the nearest codeword; scale codeword i to average power P_i. The resulting correlation between the codewords is determined by ρ and the rates (R_1, R_2).
34
Joint source-channel coding (LT) If R_1, R_2 satisfy the appropriate rate conditions, then we obtain the corresponding distortions. Recovers the result in Lapidoth and Tinguely (2006).
35
Comparison of the three schemes [Plot: SNR vs distortion for AF, SB and LT] U_i ~ N(0,1), i = 1, 2, ρ = 0.1
36
SNR vs distortion performance [Plot: SNR vs distortion for AF, SB and LT] ρ = 0.75
37
Conclusions from the above plots AF is close to the necessary conditions, and hence optimal, at low SNR; the other two schemes perform worse at low SNR. SB and LT perform better than AF at high SNR, and LT performs better than SB. The performance of SB and LT is close for low ρ at all SNRs, and for high ρ at low SNR. For the asymmetric case, in AF, investing all of P_i is suboptimal; we have developed optimal power allocation schemes.
38
Conclusions with side information (R Rajesh and V Sharma, "Joint source-channel coding for Gaussian sources over a Gaussian multiple access channel with side information", NCC 09) Decoder-only side information is much more useful than encoder-only side information. The reduction in distortion is proportional to the quality of the side information. AF is optimal at low SNR with or without side information. The distortions in AF do not go to zero as the channel SNR increases, with or without side information. LT is always better than SB in the no-side-information case, but with side information SB is sometimes better than LT. In the asymmetric case, when the difference between the powers is large, encoder side information is more useful at lower ρ and at higher side-channel SNR.
39
Transmission of correlated sources over orthogonal MAC (R Rajesh and V Sharma, "Correlated Gaussian sources over orthogonal Gaussian channels", ISITA 08) Y = (Y_1, Y_2), P(y_1, y_2 | x_1, x_2) = P(y_1 | x_1) P(y_2 | x_2). Source-channel separation holds even with side information for lossless transmission (we have derived the exact region). Source coding of (U_1, U_2) via Slepian-Wolf (vector quantize first in the case of continuous sources). Optimal signaling is by independent (X_1, X_2), which does not depend on the sources (U_1, U_2).
40
Correlated Gaussian sources over an orthogonal GMAC (U_1, U_2): zero mean, correlation ρ, Var(U_i) = σ_i². Y_i = X_i + N_i, with N_i independent of X_i and (N_1, N_2) zero mean and independent. The optimal scheme is SB: send X_1 independent of X_2. AF performs close to the optimal scheme (a sketch of its distortions is given below).
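A minimal sketch of AF over the two orthogonal channels, assuming X_i = √(P_i/σ_i²)·U_i and joint linear MMSE estimation of (U_1, U_2) from (Y_1, Y_2); the function and parameter names are ours:

```python
import numpy as np

def af_orthogonal_distortions(P1, P2, var1, var2, rho, varN1, varN2):
    """Per-source MSE of AF over Y_i = sqrt(P_i/var_i) * U_i + N_i, i = 1, 2."""
    A = np.diag([np.sqrt(P1 / var1), np.sqrt(P2 / var2)])         # amplification gains
    S_U = np.array([[var1, rho * np.sqrt(var1 * var2)],
                    [rho * np.sqrt(var1 * var2), var2]])          # source covariance
    S_N = np.diag([varN1, varN2])
    S_UY = S_U @ A.T                                              # Cov(U, Y)
    S_Y = A @ S_U @ A.T + S_N                                     # Cov(Y)
    err = S_U - S_UY @ np.linalg.inv(S_Y) @ S_UY.T                # MMSE error covariance
    return tuple(np.diag(err))                                    # (D1, D2)

print(af_orthogonal_distortions(1.0, 1.0, 1.0, 1.0, rho=0.7, varN1=1.0, varN2=1.0))
```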
41
Comparison of AF and SB [Plot: SNR vs distortion performance] ρ = 0.7
42
Comparison of AF and SB [Plot: SNR vs distortion performance] ρ = 0.3
43
MAC with Fading {h_{in}, n ≥ 1}: iid fading for sensor i, known at the transmitter and the receiver. Aim: to transmit data over the fading MAC such that the decoder can decode within a given distortion. Theorem (R Rajesh and V Sharma, "Transmission of correlated sources over a fading Multiple Access Channel", Allerton 08): conditions analogous to those of the main theorem, now involving the fading gains, must hold (conditions (3)).
44
Gaussian Fading MAC Y_n = H_{1n} X_{1n} + H_{2n} X_{2n} + N_n, with N ~ N(0, σ_N²) independent of (X_1, X_2). P_i is the average power constraint for user i. Distortion measure: MSE. Source-channel separation does not hold.
45
Gaussian Fading MAC If (X_1, X_2) has correlation ρ, then we obtain the corresponding rate conditions with the fading gains included.
46
Power Allocation Maximize the RHS of the third inequality over the power allocation such that all the conditions are satisfied. Compare with RTDMA, MRTDMA, UPA.
47
Generalizations Partial channel state information at the transmitter and partial state information at the decoder (partial CSIT, partial CSIR). Partial CSIT, perfect CSIR.
48
Special Cases Lossless transmission of independent sources with partial CSIT, perfect CSIR - G. Keshet, Y. Steinberg, and N. Merhav (NOW publications) Transmission of independent sources with Perfect CSIT, no CSIR - S. Sigurjonsson and Y. H. Kim (ISIT 05) Transmission of independent sources and Gaussian MAC with Partial CSIT, perfect CSIR, lossless transmission – M Mecking ( ISIT 02)
49
Hierarchical Network (R Rajesh and V Sharma, "Amplify and Forward for correlated data gathering over hierarchical sensor networks", WCNC 09, to appear) The MAC with side information is the building block. Identify the network components: sensor nodes and relay nodes. Multiple-user behaviour of AF and SB-TDMA. AF with multiple users.
50
AF vs SB-TDMA Comparison of AF and SB-TDMA (Slepian-Wolf-type source coding followed by transmission over TDMA links). AF performs well for both sensor nodes and relay nodes as the number of nodes increases.
51
Hierarchical Network SNR vs sum of distortions; 2 cluster heads, 2 nodes per cluster head; correlation structure = {0.8, 0.6, 0.1, 0.3}

SNR (dB)   AF     SB-TDMA
0          2.70   3.5
3          2.22   2.95
7          1.84   2.3
10         1.7    1.93
13         1.62   1.63
14.7       1.59   1.48
52
Hierarchical Network SNR vs sum of distortions; 5 nodes per cluster head; correlation structure = {0.8, 0.6, 0.5, 0.2}

SNR (dB)   AF     SB-TDMA
0          5.27   6.64
3          4.39   5.53
5          4.01   4.88
7          3.73   4.30
10         3.49   3.66
12         3.41   3.29

AF performs well for a 3-hop network also.
53
Conclusions Studied the transmission of correlated sources over a MAC with side information; this generalizes the existing results in the literature. The GMAC, a practically important system, is studied in detail. For orthogonal channels with lossless transmission and side information, the exact capacity region is obtained. Obtained several joint source-channel coding schemes in different scenarios.
54
Conclusions Studied the fading MAC with partial CSIT and partial CSIR. Power allocation schemes: showed that the Knopp-Humblet scheme (RTDMA) is no longer optimal for correlated sources. Studied efficient ways of combining side information with the main information. Hierarchical networks: AF is a good coding scheme in the sensor network scenario.
55
Publications Book Chapter: Vinod Sharma and R. Rajesh, "Distributed joint source-channel coding on a multiple access channel", to appear in Handbook on Selected Topics in Information and Coding Theory, World Scientific, 2008. Journal Papers (under submission): R. Rajesh, Vinod Sharma and V. K. Varshneya, "Distributed joint source-channel coding on a multiple access channel with side information"; R. Rajesh and Vinod Sharma, "Distributed joint source-channel coding of correlated sources over a Gaussian Multiple Access Channel".
56
Publications CONFERENCE PAPERS R. Rajesh and V. Sharma, Amplify and Forward for Correlated Data Gathering over Hierarchical Sensor Networks, in Proc. IEEE Wireless Communications and Networking Conference (WCNC09), Budapest, Hungary, April 2009. R. Rajesh and V. Sharma, Joint Source-Channel Coding for Correlated Gaussian Sources Over a Gaussian MAC with Side Information, National Conference on Communications 2009 (NCC09), IIT Guwahati, January 2009. R. Rajesh and V. Sharma, Correlated Gaussian sources over orthogonal Gaussian channels, to be presented at the 2008 International Symposium on Information Theory and its Applications (ISITA 2008), Auckland, New Zealand, December 2008. R. Rajesh and V. Sharma, A joint source-channel coding scheme for transmission of discrete correlated sources over a Gaussian multiple access channel, to be presented at the 2008 International Symposium on Information Theory and its Applications (ISITA 2008), Auckland, New Zealand, December 2008.
57
Publications R. Rajesh and V. Sharma, Transmission of correlated sources over a fading multiple access channel, to be presented at the 46th Annual Allerton Conference on Communication, Control and Computing, USA, September 2008. R. Rajesh, V. K. Varshneya and V. Sharma, Distributed joint source channel coding on a multiple access channel with side information, in Proc. IEEE International Symposium on Information Theory (ISIT), Toronto, Canada, July 2008. R. Rajesh and V. Sharma, Source-channel coding for Gaussian sources over a Gaussian multiple access channel, 45th Annual Allerton Conference on Communication, Control and Computing, USA, September 2007.
58
THANK YOU