
Distributed Joint Source-Channel Coding on a Multiple Access Channel with Side Information
Vinod Sharma

Outline: Model; General theorem; Special cases; Example; GMAC: discrete sources, Gaussian sources; Orthogonal channels; Fading channel; Hierarchical networks; Conclusions.

Joint Source-Channel Coding on a Multiple Access Channel
[Figure: sources U_1n and U_2n enter encoders Enc 1 and Enc 2 together with encoder side information Z_1n and Z_2n, producing channel inputs X_1n and X_2n; the MAC output Y_n is decoded with side information Z_n.]

Joint source-channel coding on a multiple access channel (contd.) U_in, n ≥ 1: iid data generated by user i; (U_1n, U_2n) are correlated. Z_in: side information at encoder i. Z_n: side information at the decoder. X_in: channel input from the i-th encoder at time n. Y_n: channel output at time n. The MAC is memoryless. Aim: to transmit the data over the MAC so that the decoder can reconstruct it within the given distortions. Source-channel separation does not hold for this system, so joint source-channel coding is needed.
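A minimal LaTeX sketch of the memoryless property and the distortion criterion, assuming the standard formulations of both:

    % memoryless MAC: the current output depends only on the current inputs
    p\big(y_n \mid x_1^n, x_2^n, y^{n-1}\big) = p\big(y_n \mid x_{1n}, x_{2n}\big)

    % distortion criterion for user i
    \limsup_{n \to \infty} \mathbb{E}\!\left[\frac{1}{n}\sum_{m=1}^{n} d_i\big(U_{im}, \hat{U}_{im}\big)\right] \le D_i, \qquad i = 1, 2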

Transmission of data from sensor nodes to a fusion node
[Figure: sensor nodes transmit to a cluster head over a multiple access channel within each cluster; cluster heads transmit to the fusion node; side information is available at the receivers.]

Transmission of data from sensor nodes to a fusion node. Sensor nodes transmit data to their cluster heads. Cluster heads are more powerful nodes and transmit data directly to the fusion node. Within a cluster, the channel to the cluster head is a multiple access channel. Due to the proximity of the sensor nodes, the data they generate are correlated. The MAC is also a building block for general sensor networks.

Theorem (R. Rajesh, V. K. Varshneya, V. Sharma, "Distributed joint source channel coding on a multiple access channel with side information", ISIT 08). The sources can be transmitted over the multiple access channel with distortions (D1, D2) if there exist random variables (W1, W2, X1, X2) such that
1) the joint distribution factors as p(u1, u2, z1, z2, z, w1, w2, x1, x2) = p(u1, u2, z1, z2, z) p(w1|u1, z1) p(w2|u2, z2) p(x1|w1) p(x2|w2);
2) there exists a function f_D: W1 x W2 x Z → (Û1, Û2) such that E[d_i(U_i, Û_i)] ≤ D_i, i = 1, 2;

3) the following constraints hold:
I(U1, Z1; W1 | W2, Z) < I(X1; Y | X2, W2, Z),
I(U2, Z2; W2 | W1, Z) < I(X2; Y | X1, W1, Z),
I(U1, U2, Z1, Z2; W1, W2 | Z) < I(X1, X2; Y | Z),
where W_i is the set in which W_i takes values.
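For finite alphabets, these constraints can be checked numerically for candidate auxiliary variables. A minimal sketch, assuming the joint pmf is available as a numpy array with one axis per variable; the helper names are illustrative, not from the paper:

    import numpy as np

    def entropy_bits(p, axes):
        # Entropy (bits) of the marginal of joint pmf p over the given axes.
        other = tuple(i for i in range(p.ndim) if i not in axes)
        m = p.sum(axis=other)
        m = m[m > 0]
        return float(-(m * np.log2(m)).sum())

    def cond_mi(p, a, b, c):
        # I(A; B | C) = H(A,C) + H(B,C) - H(C) - H(A,B,C), all in bits.
        return (entropy_bits(p, a + c) + entropy_bits(p, b + c)
                - entropy_bits(p, c) - entropy_bits(p, a + b + c))

    # Toy usage: a random joint pmf over three variables (axes 0, 1, 2).
    rng = np.random.default_rng(1)
    p = rng.random((2, 2, 3))
    p /= p.sum()
    print(cond_mi(p, (0,), (2,), (1,)))  # I(axis0; axis2 | axis1) >= 0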

Theorem (contd.) The proof of the theorem uses: vector quantization; joint source-channel coding; decoding by joint weak typicality; estimation of the sources. Comment: correlations in (U1, U2) help by reducing the left sides and increasing the right sides of the constraints.

Generalizations. The above theorem can be generalized to: the multiple-user case; recovering functions of the sources; discrete/continuous source and channel alphabets (this includes the practically important Gaussian MAC); unbounded distortion measures (this includes mean square error (MSE)).

Special cases. 1. Lossless MAC with correlated sources: take (Z1, Z2, Z) independent of (U1, U2) and W_i = U_i, i = 1, 2. Then our result reduces to:
H(U1|U2) < I(X1; Y | X2, U2),
H(U2|U1) < I(X2; Y | X1, U1),
H(U1, U2) < I(X1, X2; Y).
Recovers Cover, El Gamal, Salehi (1980).

Special cases. 2. Lossy multiple access communication: take (Z1, Z2, Z) independent of (U1, U2, W1, W2). Then our result reduces to:
I(U1; W1 | W2) < I(X1; Y | X2, W2),
I(U2; W2 | W1) < I(X2; Y | X1, W1),
I(U1, U2; W1, W2) < I(X1, X2; Y).
Generalizes Cover, El Gamal, Salehi (1980) to the lossy case.

Special cases. 3. Lossy distributed source coding with side information: take the MAC to be a dummy channel with Y = (X1, X2). Then:
R1 > I(U1, Z1; W1 | W2, Z),
R2 > I(U2, Z2; W2 | W1, Z),
R1 + R2 > I(U1, U2, Z1, Z2; W1, W2 | Z).
Generalizes Slepian-Wolf (1973), Wyner and Ziv (1976), and Gastpar (2004).
4. Lossy GMAC: Y = X1 + X2 + N. Generalizes Lapidoth and Tinguely (2006).

Special cases. 5. Compound MAC and interference channel with side information: two decoders, where decoder i has access to Y_i and Z_i. Take U_i = W_i, i = 1, 2. Applying the theorem twice (receiver-only side information case), for i = 1, 2:
H(U1 | U2, Z_i) < I(X1; Y_i | X2, U2, Z_i),
H(U2 | U1, Z_i) < I(X2; Y_i | X1, U1, Z_i),
H(U1, U2 | Z_i) < I(X1, X2; Y_i | Z_i).
This gives sufficient conditions for interference channels in the strong-interference case. Recovers the results of D. Gunduz and E. Erkip (ISIT 07).

Special cases. The theorem also recovers the following results: lossless transmission over a MAC with receiver side information (D. Gunduz and E. Erkip, UCSD ITA Workshop 07); mixed side information (M. Fleming and M. Effros, IEEE TIT 06); lossless multiple access communication with common information (Slepian and Wolf); correlated sources over orthogonal channels (J. Barros and S. Servetto, ISIT 03).

Example 1. (U1, U2) have the joint distribution p(0,0) = p(1,1) = 1/3, p(1,0) = p(0,1) = 1/6, so H(U1) = H(U2) = 1. For lossless transmission with independent coding we need rates R1 ≥ H(U1), R2 ≥ H(U2). Exploiting the correlation via Slepian-Wolf coding, we need only R1 ≥ H(U1|U2) = 0.918, R2 ≥ H(U2|U1) = 0.918, and R1 + R2 ≥ H(U1, U2) = 1.918.

Example (contd.) MAC: Y = X1 + X2, where X1, X2 take values in {0, 1} and Y in {0, 1, 2}. This MAC does not satisfy the source-channel separation conditions. The sum capacity of this channel with independent X1, X2 is 1.5 bits/channel use. Since H(U1, U2) = 1.918 > 1.5, even Slepian-Wolf coding will not provide lossless transmission.

Example (contd.) Joint source-channel coding: take X_i = U_i, i = 1, 2. Then the channel carries I(X1, X2; Y) = H(Y) = log2 3 ≈ 1.585 bits/channel use; still we cannot transmit losslessly, since 1.585 < H(U1, U2) = 1.918. With distortion: consider Hamming distance with allowable distortion 4%. Then we need R1 ≥ H(p) - H(d). With independent coding we cannot achieve this distortion, but with correlated (X1, X2) it is possible.
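A quick numerical check of the quantities in this example; the identification p = P(U1 ≠ U2) = 1/3 in the last bound is an assumption made for illustration:

    from math import log2

    p = {(0, 0): 1/3, (1, 1): 1/3, (0, 1): 1/6, (1, 0): 1/6}  # joint pmf of (U1, U2)
    H12 = -sum(v * log2(v) for v in p.values())               # H(U1,U2)
    h = lambda q: -q * log2(q) - (1 - q) * log2(1 - q)        # binary entropy
    print(H12, H12 - 1)                 # 1.918..., 0.918... = H(U1|U2)

    # Adder MAC with independent uniform inputs: Y ~ (1/4, 1/2, 1/4).
    print(2 * 0.25 * log2(4) + 0.5 * log2(2))                 # sum capacity 1.5
    # With Xi = Ui: Y is uniform on {0,1,2}, so I(X1,X2;Y) = H(Y) = log2 3.
    print(log2(3))                                            # 1.585...
    # Lossy part: H(p) - H(d) with p = 1/3 (assumed) and d = 0.04.
    print(h(1/3) - h(0.04))                                   # 0.676...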

Example (contd.) With side information: let Z1 be obtained from U2 via a binary symmetric channel with a given crossover probability, and similarly for Z2. Let Z = (Z1, Z2, V), where V = U1 · U2 · N, with N independent of (U1, U2) and P[N = 0] = P[N = 1] = 0.5. The sum rate needed for lossless transmission is ≥ 1.8 bits with Z1 only; it is smaller with (Z1, Z2) and with Z only, and smaller still with (Z1, Z2, Z). Thus with Z1, Z2, Z we can transmit losslessly with independent coding.

Gaussian MAC (GMAC). Discrete alphabet sources over a GMAC: Y = X1 + X2 + N, with N independent of (X1, X2) and N ~ N(0, σ_N²). Initially we consider no side information, with an average power constraint P_i for user i. For lossless transmission we need
H(U1|U2) < I(X1; Y | X2, U2),
H(U2|U1) < I(X2; Y | X1, U1),   (1)
H(U1, U2) < I(X1, X2; Y),
where X1 - U1 - U2 - X2 forms a Markov chain. Comment: the above inequalities are not explicit enough; we make them more explicit.

Lemma. Relaxing the right sides of (1):
H(U1|U2) < I(X1; Y | X2, U2) < I(X1; Y | X2),
H(U2|U1) < I(X2; Y | X1, U1) < I(X2; Y | X1),   (1a)
H(U1, U2) < I(X1, X2; Y).
Lemma 1: I(X1; Y | X2), I(X2; Y | X1) and I(X1, X2; Y) are maximized by jointly Gaussian (X1, X2) with mean zero and correlation ρ.

Comments. The conditions obtained by relaxing the right sides in (1) are more explicit and can be used to obtain efficient coding schemes. For a given coding scheme, check whether (1) is satisfied; otherwise change ρ appropriately. We develop a distributed coding scheme that maps U_i to X_i such that (X1, X2) are jointly Gaussian with a given correlation ρ.

Lemmas. Lemma 2: If X1 - U1 - U2 - X2 is a Markov chain and (X1, X2) are jointly Gaussian, then the correlation of (X1, X2) is bounded above by a quantity determined by the joint distribution of (U1, U2). Lemma 3: Any two-dimensional jointly Gaussian density can be arbitrarily closely approximated by a weighted sum of products of marginal Gaussian densities.

A joint source-channel coding scheme (R. Rajesh, V. Sharma, "A joint source-channel coding scheme for transmission of discrete correlated sources over a Gaussian MAC", ISITA 08). The p_i, q_i in Lemma 3 can be negative. Thus, to approximate f_ρ(x1, x2), the desired jointly Gaussian density with correlation ρ, find g from Lemma 3 that minimizes the approximation error subject to p_i ≥ 0, q_i ≥ 0, Σ Σ p_i q_j = 1 and zero mean.
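A small numerical sketch of this fitting step, assuming a simplified variant with a single nonnegative weight per product term (the paper's parameterization with separate p_i, q_j may differ); the component means and widths are illustrative choices:

    import numpy as np
    from scipy.optimize import nnls
    from scipy.stats import multivariate_normal, norm

    rho = 0.3
    g = np.linspace(-3.0, 3.0, 41)
    X1, X2 = np.meshgrid(g, g)
    target = multivariate_normal([0, 0], [[1, rho], [rho, 1]]).pdf(
        np.dstack([X1, X2])).ravel()

    # Dictionary of product densities N(m1, s^2) * N(m2, s^2) on the grid.
    means, s = np.linspace(-1.5, 1.5, 5), 0.8
    cols = [(norm.pdf(X1, m1, s) * norm.pdf(X2, m2, s)).ravel()
            for m1 in means for m2 in means]
    A = np.stack(cols, axis=1)

    w, resid = nnls(A, target)   # nonnegative least-squares weights
    w /= w.sum()                 # renormalize so the mixture integrates to ~1
    print(f"residual {resid:.4f}, active components {(w > 1e-6).sum()}")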

Discrete alphabet sources over a GMAC (contd.) Thus taking (X1, X2) jointly Gaussian with mean zero, correlation ρ and Var(X_i) = P_i provides explicit conditions for lossless transmission, denoted (2) and sketched below.
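A sketch of conditions (2), assuming the mutual informations evaluate to the standard jointly-Gaussian MAC expressions with correlation ρ and Var(X_i) = P_i:

    H(U_1 \mid U_2) < \tfrac{1}{2}\log_2\!\left(1 + \frac{(1-\rho^2)P_1}{\sigma_N^2}\right)
    H(U_2 \mid U_1) < \tfrac{1}{2}\log_2\!\left(1 + \frac{(1-\rho^2)P_2}{\sigma_N^2}\right)
    H(U_1, U_2) < \tfrac{1}{2}\log_2\!\left(1 + \frac{P_1 + P_2 + 2\rho\sqrt{P_1 P_2}}{\sigma_N^2}\right)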

Example 2. (U1, U2) has the distribution p(0,0) = p(1,1) = p(0,1) = 1/3, p(1,0) = 0. Power constraints: P1 = 3, P2 = 4, σ_N² = 1; H(U1, U2) = log2 3 ≈ 1.585. For independent (X1, X2), the RHS of the third inequality in (2) is 1.5. Thus (U1, U2) cannot be transmitted over the GMAC with independent (X1, X2), but any ρ ∈ [0.144, 0.702] satisfies all three constraints.
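A numerical scan of the feasible correlation range under the sketched conditions (2); this reproduces the lower endpoint 0.144:

    import numpy as np
    from math import log2, sqrt

    P1, P2, s2 = 3.0, 4.0, 1.0
    H12 = log2(3)                                  # H(U1,U2) for this pmf
    h = lambda q: -q * log2(q) - (1 - q) * log2(1 - q)
    Hc1 = H12 - h(2/3)                             # H(U1|U2), since P(U2=1) = 2/3
    Hc2 = H12 - h(1/3)                             # H(U2|U1), since P(U1=1) = 1/3

    def feasible(r):
        return (Hc1 < 0.5 * log2(1 + (1 - r*r) * P1 / s2)
                and Hc2 < 0.5 * log2(1 + (1 - r*r) * P2 / s2)
                and H12 < 0.5 * log2(1 + (P1 + P2 + 2*r*sqrt(P1*P2)) / s2))

    ok = [r for r in np.linspace(0, 0.999, 100_000) if feasible(r)]
    print(f"rho in [{min(ok):.3f}, {max(ok):.3f}]")  # ~ [0.144, 0.702]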

Example 2 (contd.) For ρ = 0.3 and ρ = 0.6 we obtained the density from the above optimization problem. The upper bound on ρ from Lemma 2 must also be respected.

Example 2 (contd.) [Figure: the approximating densities obtained from the optimization.]

Gaussian sources over a Gaussian MAC (R. Rajesh and V. Sharma, "Source-channel coding for Gaussian sources over a Gaussian multiple access channel", Allerton 07). (U1, U2): zero mean, jointly Gaussian with correlation ρ. Y = X1 + X2 + N, with N ~ N(0, σ_N²) independent of (X1, X2). P_i is the average power constraint for user i. There is no side information. This is a particularly relevant model for the change detection problem in sensor networks. Source-channel separation does not hold.

Gaussian sources over a GMAC. We consider three joint source-channel coding schemes: (i) amplify and forward (AF); (ii) separation-based (SB); (iii) Lapidoth-Tinguely (LT).

Amplify and forward scheme (AF). [Figure: U1 and U2 are scaled and sent directly over the GMAC to the decoder.] The X_i are scaled versions of U_i such that the average power constraints P_i, i = 1, 2, are satisfied. For the two-user symmetric case, AF is optimal for SNR ≤ ρ/(1 - ρ²) (Lapidoth and Tinguely (2006)). The resulting distortions are sketched below.
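A sketch of the AF distortions, assuming X_i = \sqrt{P_i}\,U_i/\sigma_i and scalar MMSE estimation of U_i from Y (which is linear, since everything is jointly Gaussian):

    D_1 = \sigma_1^2\left(1 - \frac{\big(\sqrt{P_1} + \rho\sqrt{P_2}\big)^2}{P_1 + P_2 + 2\rho\sqrt{P_1 P_2} + \sigma_N^2}\right)

with D_2 obtained by exchanging the indices.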

Separation-based approach (SB). Vector quantization followed by Slepian-Wolf coding; the rate region for a given (D1, D2) is given in Viswanath (IEEE TIT 08). X1, X2 are independent, and the capacity region of the GMAC is
R1 ≤ I(X1; Y | X2) = 0.5 log(1 + P1/σ_N²),
R2 ≤ I(X2; Y | X1) = 0.5 log(1 + P2/σ_N²),
R1 + R2 ≤ I(X1, X2; Y) = 0.5 log(1 + (P1 + P2)/σ_N²).

Joint source-channel coding of Lapidoth and Tinguely (LT). Vector quantize the sources, i = 1, 2: generate 2^{nR_i} iid Gaussian codewords with variance 1, encode U_i^n by mapping it to the nearest codeword, and scale codeword i to average power P_i. The correlation between the codewords is sketched below.
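A sketch of the codeword correlation, assuming the standard rate-distortion quantization property E[\hat{U}_i U_i] = E[\hat{U}_i^2] = \sigma_i^2 (1 - 2^{-2R_i}):

    \tilde{\rho} = \rho\,\sqrt{\big(1 - 2^{-2R_1}\big)\big(1 - 2^{-2R_2}\big)}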

Joint source-channel coding (LT) (contd.) If R1, R2 satisfy the corresponding channel-coding conditions, we obtain the achievable distortions; this recovers the result in Lapidoth and Tinguely (2006).

Comparison of the three schemes. [Figure: distortion comparison of AF, SB and LT; U_i ~ N(0,1), i = 1, 2, ρ = 0.1.]
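The AF curve in such plots can be reproduced from the distortion sketch above; a minimal sketch for the symmetric case (unit-variance sources, P1 = P2 = P, σ_N² = 1, so SNR = P):

    import numpy as np

    def d_af(P, rho):
        # Symmetric AF distortion from the sketch above, with sigma^2 = 1.
        return 1.0 - P * (1 + rho) ** 2 / (2 * P * (1 + rho) + 1)

    snr_db = np.linspace(-10.0, 30.0, 9)
    for rho in (0.1, 0.75):
        D = d_af(10 ** (snr_db / 10), rho)
        print(f"rho={rho}:", np.round(D, 3))
        # As SNR -> inf, D -> 1 - (1 + rho)/2 > 0: the AF distortion floor.
        print("floor:", 1 - (1 + rho) / 2)

The nonzero floor matches the later conclusion that AF distortions do not go to zero as the channel SNR increases.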

[Figure: SNR vs distortion performance, ρ = 0.75.]

Conclusions from the above plots. AF is close to the necessary conditions, and hence optimal, at low SNR; the other two schemes perform worse there. SB and LT perform better than AF at high SNR, with LT better than SB. The performance of SB and LT is close for low ρ at all SNR, and for high ρ at low SNR. For the asymmetric case, investing all of P_i in AF is suboptimal; we have developed optimal power allocation schemes.

Conclusions with side information (R. Rajesh and V. Sharma, "Joint source-channel coding for Gaussian sources over a Gaussian multiple access channel with side information", NCC 09). Decoder-only side information is much more useful than encoder-only side information, and the reduction in distortion is proportional to the side-information quality. AF is optimal at low SNR with or without side information, but AF distortions do not go to zero as the channel SNR increases, with or without side information. LT is always better than SB in the no-side-information case; with side information, SB is sometimes better than LT. In the asymmetric case, when the difference between the powers is large, encoder side information is more useful at lower ρ and at higher side-channel SNR.

Transmission of correlated sources over orthogonal MACs (R. Rajesh and V. Sharma, "Correlated Gaussian sources over orthogonal Gaussian channels", ISITA 08). Y = (Y1, Y2), P(y1, y2 | x1, x2) = P(y1 | x1) P(y2 | x2). Source-channel separation holds, even with side information, for lossless transmission (we have derived the exact region; a sketch follows below). Source coding of (U1, U2) is done via Slepian-Wolf coding (vector quantize first in the case of continuous sources). Optimal signaling is by independent (X1, X2) that do not depend on the sources (U1, U2).
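A sketch of the lossless region with decoder side information Z, assuming it takes the Slepian-Wolf-intersect-capacity form of Barros and Servetto, with C_i the capacity of channel i:

    H(U_1 \mid U_2, Z) < C_1, \qquad H(U_2 \mid U_1, Z) < C_2, \qquad H(U_1, U_2 \mid Z) < C_1 + C_2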

Correlated Gaussian sources over orthogonal GMACs. (U1, U2): zero mean, correlation ρ, Var(U_i) = σ_i². Y_i = X_i + N_i, with N_i independent of X_i and (N1, N2) zero mean and independent. The optimal scheme is SB, sending X1 independent of X2. AF performs close to the optimal scheme.

Comparison of AF and SB. [Figure: SNR vs distortion performance, ρ = 0.7.]

Comparison of AF and SB. [Figure: SNR vs distortion performance, ρ = 0.3.]

MAC with fading. {h_in, n ≥ 1}: iid fading for sensor i, known at the transmitter and the receiver. Aim: to transmit data over the fading MAC such that the decoder can decode within the given distortions. Theorem (R. Rajesh and V. Sharma, "Transmission of correlated sources over a fading Multiple Access Channel", Allerton 08): conditions analogous to 1)-3) of the general theorem must hold, with the fading states included in the channel mutual-information terms (3).

Gaussian fading MAC. Y_n = H_1n X_1n + H_2n X_2n + N_n, with N ~ N(0, σ_N²) independent of (X1, X2). P_i is the average power constraint for user i. Distortion measure: MSE. Source-channel separation does not hold.

Gaussian fading MAC (contd.) If (X1, X2) has correlation ρ, we get the corresponding explicit conditions, as in the non-fading GMAC case but with the powers scaled by the fading gains.

Power allocation. Maximize the RHS of the third inequality subject to satisfying all the conditions. Compare with RTDMA, MRTDMA and UPA.
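A sketch of this optimization, assuming the third-inequality RHS takes the fading form 0.5 log2(1 + (h1 p1 + h2 p2 + 2ρ√(h1 h2 p1 p2))/σ²) and Rayleigh fading; the fade-state discretization and all parameter values are illustrative, and the remaining conditions of the theorem must still be checked separately:

    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)
    K = 50                                    # discretized fade states
    h1, h2 = rng.exponential(1.0, K), rng.exponential(1.0, K)  # power gains
    P1bar, P2bar, rho, s2 = 3.0, 4.0, 0.3, 1.0

    def neg_rate(p):
        p1, p2 = p[:K], p[K:]
        s = h1 * p1 + h2 * p2 + 2 * rho * np.sqrt(h1 * h2 * p1 * p2)
        return -np.mean(0.5 * np.log2(1 + s / s2))  # ergodic sum-rate bound

    cons = [{'type': 'ineq', 'fun': lambda p: P1bar - np.mean(p[:K])},
            {'type': 'ineq', 'fun': lambda p: P2bar - np.mean(p[K:])}]
    x0 = np.concatenate([np.full(K, P1bar), np.full(K, P2bar)])
    res = minimize(neg_rate, x0, bounds=[(0, None)] * (2 * K),
                   constraints=cons, method='SLSQP')
    print("ergodic rate bound:", -res.fun)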

Generalizations. Partial channel state information at the transmitter (CSIT) and partial state information at the decoder; partial CSIT with perfect CSIR.

Special cases. Lossless transmission of independent sources with partial CSIT and perfect CSIR: G. Keshet, Y. Steinberg and N. Merhav (NOW Publications). Transmission of independent sources with perfect CSIT and no CSIR: S. Sigurjonsson and Y. H. Kim (ISIT 05). Transmission of independent sources over a Gaussian MAC with partial CSIT, perfect CSIR and lossless transmission: M. Mecking (ISIT 02).

Hierarchical networks (R. Rajesh and V. Sharma, "Amplify and Forward for correlated data gathering over hierarchical sensor networks", WCNC 09, to appear). The MAC with side information is the building block. Identify the network components: sensor nodes and relay nodes. Study the multiple-user behaviour of AF and SB-TDMA. [Figure: AF with multiple users.]

AF vs SB-TDMA. Comparison of AF and SB-TDMA (Slepian-Wolf-type source coding followed by transmission through TDMA links). AF performs well for both sensor nodes and relay nodes as the number of nodes increases!

Hierarchical network. [Table: SNR (dB) vs sum of distortions for AF and SB-TDMA.] Two cluster heads, two nodes per cluster head; correlation structure ρ = {0.8, 0.6, 0.1, 0.3}.

Hierarchical network. [Table: SNR (dB) vs sum of distortions for AF and SB-TDMA, for the given number of nodes per cluster head.] Correlation structure ρ = {0.8, 0.6, 0.5, 0.2}. AF performs well for a 3-hop network also!

Conclusions. We studied the transmission of correlated sources over a MAC with side information, generalizing the existing results in the literature. The GMAC, a practically important system, is studied in detail. For orthogonal channels with lossless transmission and side information, we obtained the exact capacity region. We obtained several joint source-channel coding schemes for different scenarios.

Conclusions (contd.) We studied the fading MAC with partial CSIT and partial CSIR, and power allocation schemes: we showed that the Knopp-Humblet scheme (RTDMA) is no longer optimal for correlated sources. We studied efficient ways of combining the side information with the main information. For hierarchical networks, AF is a good coding scheme in the sensor network scenario.

Publications. Book chapter: Vinod Sharma and R. Rajesh, "Distributed joint source-channel coding on a multiple access channel", to appear in Handbook on Selected Topics in Information and Coding Theory, World Scientific, 2008. Journal papers (under submission): R. Rajesh, Vinod Sharma and V. K. Varshneya, "Distributed joint source-channel coding on a multiple access channel with side information"; R. Rajesh and Vinod Sharma, "Distributed joint source-channel coding of correlated sources over a Gaussian Multiple Access Channel".

Publications. Conference papers: R. Rajesh and V. Sharma, "Amplify and Forward for correlated data gathering over hierarchical sensor networks", in Proc. IEEE Wireless Communications and Networking Conference (WCNC 09), Budapest, Hungary, April 2009. R. Rajesh and V. Sharma, "Joint source-channel coding for correlated Gaussian sources over a Gaussian MAC with side information", National Conference on Communications (NCC 09), IIT Guwahati, January 2009. R. Rajesh and V. Sharma, "Correlated Gaussian sources over orthogonal Gaussian channels", 2008 International Symposium on Information Theory and its Applications (ISITA 2008), Auckland, New Zealand, December 2008. R. Rajesh and V. Sharma, "A joint source-channel coding scheme for transmission of discrete correlated sources over a Gaussian multiple access channel", 2008 International Symposium on Information Theory and its Applications (ISITA 2008), Auckland, New Zealand, December 2008.

Publications (contd.) R. Rajesh and V. Sharma, "Transmission of correlated sources over a fading multiple access channel", 46th Annual Allerton Conference on Communication, Control and Computing, USA, September 2008. R. Rajesh, V. K. Varshneya and V. Sharma, "Distributed joint source channel coding on a multiple access channel with side information", in Proc. IEEE International Symposium on Information Theory (ISIT), Toronto, Canada, July 2008. R. Rajesh and V. Sharma, "Source-channel coding for Gaussian sources over a Gaussian multiple access channel", 45th Annual Allerton Conference on Communication, Control and Computing, USA, September 2007.

THANK YOU