A Comprehensive View of Duality in Multiuser Source Coding and Channel Coding
S. Sandeep Pradhan, University of Michigan, Ann Arbor
Joint work with K. Ramchandran, Univ. of California, Berkeley

Acknowledgements: Jim Chou (Univ. of California), Phillip Chou (Microsoft Research), David Tse (Univ. of California), Pramod Viswanath (Univ. of Illinois), Michael Gastpar (Univ. of California), Prakash Ishwar (Univ. of California), Martin Vetterli (EPFL)

Outline
- Motivation, related work and background
- Duality between source coding and channel coding: the role of the source distortion measure and the channel cost measure
- Extension to the case of side information
- MIMO source coding and channel coding with one-sided collaboration
- Future work: extensions to multiuser joint source-channel coding
- Conclusions

Motivation
- Expanding applications of MIMO source and channel coding
- Explore a unifying thread through these diverse problems
- We consider SCSI and CCSI as functional duals
- We consider four problems: 1. distributed source coding, 2. broadcast channel coding, 3. multiple description source coding, 4. multiple access channel coding

It all starts with Shannon “There is a curious and provocative duality between the properties of a source with a distortion measure and those of a channel. This duality is enhanced if we consider channels in which there is a “cost” associated with the different input letters, and it is desired to find the capacity subject to the constraint that the expected cost not exceed a certain quantity…..”

Related work (incomplete list)
Duality between source coding and channel coding:
- Shannon (1959)
- Csiszar and Korner (textbook, 1981)
- Cover and Thomas (textbook, 1991): covering vs. packing
- Eyuboglu and Forney (1993): quantization vs. modulation, boundary/granular gains vs. shaping/coding gains
- Laroia, Farvardin and Tretter (1994): SVQ versus shell mapping
Duality between source coding with side information (SCSI) and channel coding with side information (CCSI):
- Chou, Pradhan and Ramchandran (1999)
- Barron, Wornell and Chen (2000)
- Su, Eggers and Girod (2000)
- Cover and Chiang (2001)

Notation: source coding
Source alphabet X with distribution p(x); reconstruction alphabet X̂; distortion measure d(x, x̂); distortion constraint D.
Encoder: maps a source block to an index. Decoder: maps the index to a reconstruction block.
Rate-distortion function R(D) = minimum rate of representing X with distortion D:
R(D) = min I(X; X̂) over test channels p(x̂|x) with E[d(X, X̂)] <= D.

Channel coding
Input and output alphabets X, Y; conditional distribution p(y|x); cost measure w(x); cost constraint W.
Encoder: maps a message m to a channel input block. Decoder: maps the channel output to a message estimate m̂.
Capacity-cost function C(W) = maximum rate of reliable communication with cost W:
C(W) = max I(X; Y) over input distributions p(x) with E[w(X)] <= W.
Note: the source encoder and the channel decoder are mappings with the same domain and range; likewise the channel encoder and the source decoder.
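Shannon's duality above can be made concrete in the simplest symmetric case: a uniform binary source under Hamming distortion has R(D) = 1 - h(D), while a binary symmetric channel with crossover probability eps has C = 1 - h(eps), so the two curves are literally the same function. A minimal sketch (the function names are mine, not from the talk):

```python
import math

def h(p):
    """Binary entropy in bits."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def rate_distortion_binary(D):
    """R(D) = 1 - h(D) for a uniform binary source with Hamming distortion."""
    return max(1.0 - h(D), 0.0)

def capacity_bsc(eps):
    """C = 1 - h(eps) for a binary symmetric channel."""
    return 1.0 - h(eps)

# The duality: for the same parameter, the two curves coincide.
for p in (0.05, 0.11, 0.25):
    assert abs(rate_distortion_binary(p) - capacity_bsc(p)) < 1e-12
```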

Gastpar, Rimoldi and Vetterli '00: To code or not to code?
Setup: source p(s), channel p(y|x), encoder f(.), decoder g(.).
For a given pair p(s) and p(y|x), there exist a distortion measure and a cost measure such that uncoded mappings at the encoder and decoder are optimal in terms of end-to-end achievable performance.
Bottom line: any source can be "matched" optimally to any channel if you are allowed to pick the distortion and cost measures for the source and channel. This observation inspires the cost-measure/distortion-measure analysis that follows.

Role of distortion measures (Fact 1)
Given a source p(x), let q(.) be an arbitrary quantizer. Then there exists a distortion measure d(x, x̂) under which q is the optimal quantizer for that source.
Bottom line: any given quantizer is the optimal quantizer for any source, provided you are allowed to pick the distortion measure.

Role of cost measures (Fact 2)
Given a channel p(y|x), let p(x) be an arbitrary input distribution. Then there exists a cost measure w(x) under which p(x) is the capacity-achieving input distribution.
Bottom line: any given input distribution is the optimal input for any channel, provided you are allowed to pick the cost measure.
Now we are ready to characterize duality.

Duality between classical source and channel coding
Theorem 1a: For a given source coding problem with source p(x), distortion measure d(x, x̂) and distortion constraint D, let the optimal quantizer induce (using Bayes' rule) the joint distribution p(x, x̂), the marginal p(x̂), and the backward test channel p(x|x̂).

Then there exists a unique dual channel coding problem with channel input alphabet X̂, output alphabet X, cost measure w(x̂) and cost constraint W such that: (i) R(D) = C(W); (ii) the two problems share the same optimal joint distribution, with the order of the mappings reversed: the dual channel is the test channel p(x|x̂), read from reconstruction back to source.

Interpretation of functional duality
For a given source coding problem, we can associate a specific channel coding problem such that:
- both problems induce the same optimal joint distribution;
- the optimal encoder for one is functionally identical to the optimal decoder for the other, in the limit of large block length;
- an appropriate channel cost measure is associated.
Source coding: the distortion measure is as important as the source distribution. Channel coding: the cost measure is as important as the channel conditional distribution.

Source coding with side information
The encoder needs to compress the source X; the decoder has access to correlated side information S.
Studied by Slepian-Wolf '73, Wyner-Ziv '76, Berger '77.
Applications: sensor networks, digital upgrade, diversity coding for packet networks.

Channel coding with side information
The encoder has access to information S about the statistical nature of the channel, and wishes to communicate over this cost-constrained channel.
Studied by Gelfand-Pinsker '81, Costa '83, Heegard-El Gamal '85.
Applications: watermarking, data hiding, precoding for known interference, multiantenna broadcast channels.

Duality (loose sense)
SCSI: side information at the decoder only; the source code is "partitioned" into a bank of channel codes.
CCSI: side information at the encoder only; the channel code is "partitioned" into a bank of source codes.

Source coding with side information at the decoder (SCSI, Wyner-Ziv '76)
Ingredients: conditional source p(x|s), side information S, context-dependent distortion measure d(x, x̂, s).
Rate-distortion function: R(D) = min [I(X; U) - I(S; U)] over auxiliary variables U and reconstruction functions X̂ = f(U, S) such that E[d] <= D.
Intuition (natural Markov chains): U - X - S (the side information S is not present at the encoder) and X - (U, S) - X̂ (the source X is not present at the decoder). These chains completely determine the optimal joint distribution.

SCSI Gaussian example (reconstruction of X - S)
Conditional source: X = S + V with p(v) ~ N(0, N); side information: p(s) ~ N(0, Q); distortion measure: mean squared error on the reconstruction of (x - s).
Test channel: U = X + q; the decoder forms the MMSE estimate of (X - S) from (U, S).
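For this Gaussian example there is no rate loss: with S at the decoder only, encoding the innovation V = X - S ~ N(0, N) to MSE distortion D costs the same 0.5*log2(N/D) bits as if S were also available at the encoder. A hedged sketch (names are mine, not from the slides):

```python
import math

def conditional_rd_gaussian(N, D):
    """R_{X|S}(D) = 0.5*log2(N/D): the rate when S is known at both ends,
    so only the Gaussian innovation V ~ N(0, N) must be quantized."""
    return 0.5 * math.log2(N / D) if D < N else 0.0

def wyner_ziv_rate_gaussian(N, D):
    """Wyner-Ziv rate with S at the decoder only; in the Gaussian-innovation
    case there is no rate loss, so it equals the conditional RD function."""
    return conditional_rd_gaussian(N, D)

# Halving the distortion costs exactly half a bit per sample.
r_coarse = wyner_ziv_rate_gaussian(1.0, 0.25)
r_fine = wyner_ziv_rate_gaussian(1.0, 0.125)
```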

Channel coding with side information at the encoder (CCSI, Gelfand-Pinsker '81)
Ingredients: conditional channel p(y|x, s), side information S at the encoder, cost measure w(x, s).
Capacity-cost function: C(W) = max [I(U; Y) - I(U; S)] over auxiliary variables U and input maps X = f(U, S) such that E[w] <= W.
Intuition (natural Markov chains): the channel does not care about U, and the encoder does not have access to the channel output. These constraints completely determine the optimal joint distribution.

CCSI Gaussian example (known interference, Costa '83)
Conditional channel: Y = X + S + Z with additive noise Z; side information: the interference S, known at the encoder; cost measure: power constraint on X.
The encoder uses an MMSE precoder built on the auxiliary variable U; the decoder recovers the message as if the interference were absent.
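Costa's result is the mirror image on the channel side: with the interference known at the encoder, capacity equals the interference-free AWGN capacity 0.5*log2(1 + P/N), independent of the interference power, and the MMSE precoder scaling is alpha = P/(P + N). A sketch under these standard assumptions (variable names are mine):

```python
import math

def awgn_capacity(P, N):
    """Interference-free AWGN capacity, bits per channel use."""
    return 0.5 * math.log2(1.0 + P / N)

def dirty_paper_capacity(P, N, Q):
    """Costa '83: with interference of power Q known non-causally at the
    encoder, capacity is unchanged; Q does not appear in the answer."""
    _ = Q  # the interference power is irrelevant
    return awgn_capacity(P, N)

def costa_alpha(P, N):
    """MMSE inflation factor in the auxiliary variable U = X + alpha*S."""
    return P / (P + N)
```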

[Side-by-side block diagrams: the SCSI test channel (encoder, test channel U = X + q, decoder with side information S) and the CCSI channel (encoder with side information S, channel adding S and Z, decoder) are mirror images of each other.]

Theorem 2a: Given an SCSI problem (conditional source, side information, distortion measure, distortion constraint D), find the optimal auxiliary distribution and reconstruction function that minimize the Wyner-Ziv rate; using Bayes' rule, this induces a joint distribution on (S, X, U, X̂) that satisfies the natural CCSI Markov constraint.

Then there is a dual CCSI problem, with channel, side information, and cost measure read off from the induced distributions and with cost constraint W, such that: (i) the rate-distortion bound equals the capacity-cost bound, R(D) = C(W); (ii) the induced distributions achieve capacity-cost optimality; (iii) the optimal encoder of one problem is functionally identical to the optimal decoder of the other.

Markov chains and duality
SCSI and CCSI share the same joint distribution p(s, x, u, x̂): the Markov chains that define the SCSI optimum are the mirror images of those that define the CCSI optimum, with the roles of encoder and decoder swapped. DUALITY.

Duality implication: generalization of the Wyner-Ziv no-rate-loss case
CCSI (Cohen-Lapidoth 2000, Erez-Shamai-Zamir 2000): extension of Costa's result to arbitrary S with no rate loss.
New result: Wyner-Ziv's no-rate-loss result can be extended to arbitrary source and side information as long as X = S + V, where V is Gaussian, for the MSE distortion measure.

Functional duality in MIMO source and channel coding with one-sided collaboration
For ease of illustration, we consider a 2-input, 2-output system, only the sum rate, and a single distortion/cost measure. We consider functional duality in the distributional sense; duality in the coding sense is future and ongoing work.

MIMO source coding with one-sided collaboration: Encoder-1 and Encoder-2, a test channel, then Decoder-1 and Decoder-2; either the encoders or the decoders (but not both) collaborate.
MIMO channel coding with one-sided collaboration: Encoder-1 and Encoder-2, a channel, then Decoder-1 and Decoder-2; either the encoders or the decoders (but not both) collaborate.

Distributed source coding
Two correlated sources with a given joint distribution and a joint distortion measure. The encoders DO NOT collaborate; the decoders DO collaborate.
Problem: for a given joint distortion D, find the minimum sum rate R. Achievable rate region: Berger '77.

Distributed source coding: achievable sum-rate region
Minimize the sum rate such that E[d] < D, where:
1. the two encoders cannot see each other's source;
2. the decoder cannot see the sources directly.
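In the lossless limit (Hamming distortion, D -> 0) the sum-rate constraint reduces to the Slepian-Wolf bound R1 + R2 >= H(X, Y), even though the encoders never see each other. For a doubly symmetric binary source (X ~ Bern(1/2), Y = X xor Bern(p)) this is 1 + h(p) bits per source pair, rather than the 2 bits that separate lossless coding would need. A small sketch (my naming, not from the slides):

```python
import math

def h(p):
    """Binary entropy in bits."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def slepian_wolf_sum_rate(p):
    """H(X, Y) = H(X) + H(Y|X) = 1 + h(p) for the doubly symmetric binary
    source: X ~ Bern(1/2), Y = X xor Bern(p)."""
    return 1.0 + h(p)

# With p = 0.11, distributed coding needs roughly 1.5 bits/pair, not 2.
```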

Broadcast channel coding
A broadcast channel with a given conditional distribution and a joint cost measure. The encoders DO collaborate; the decoders DO NOT collaborate.
Problem: for a given joint cost W, find the maximum sum rate R. Achievable rate region: Marton '79.

Broadcast channel coding: achievable sum-rate region
Maximize the sum rate such that E[w] < W, where:
1. the channel cares only about its input;
2. the encoder does not have access to the channel output.

Duality (loose sense) between distributed source coding and broadcast channel coding
Distributed source coding: collaboration at the decoder only; uses Wyner-Ziv coding: the source code is "partitioned" into a bank of channel codes.
Broadcast channel coding: collaboration at the encoder only; uses Gelfand-Pinsker coding: the channel code is "partitioned" into a bank of source codes.

Theorem 3a: distributed source coding and broadcast channel coding are duals in the above sense.

Example: 2-input, 2-output Gaussian linear channel (Caire, Shamai, Yu, Cioffi, Viswanath, Tse)
Marton's sum rate is shown to be tight. By Sato's bound, the capacity of the broadcast channel depends only on the marginal distributions. For the optimal input distribution, if we keep the noise variances fixed and vary the noise correlation, at one point we obtain the worst-case noise. At this point we have duality!

Multiple access channel coding with independent message sets
A multiple access channel with a given conditional distribution and a joint cost measure. The encoders DO NOT collaborate; the decoders DO collaborate.
Problem: for a given joint cost W, find the maximum sum rate R.
Capacity-cost function (Ahlswede '71): maximize the sum rate over independent inputs X1, X2 such that E[w] <= W.
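For the Gaussian MAC with independent inputs of powers P1 and P2 and noise power N, the sum capacity is 0.5*log2(1 + (P1 + P2)/N), achievable by successive cancellation: decode user 1 treating user 2 as noise, subtract it, then decode user 2 cleanly. A hedged sketch (names are mine):

```python
import math

def c(snr):
    """AWGN capacity 0.5*log2(1 + snr), bits per channel use."""
    return 0.5 * math.log2(1.0 + snr)

def mac_sum_capacity(P1, P2, N):
    """Gaussian MAC sum capacity with independent message sets."""
    return c((P1 + P2) / N)

def successive_cancellation_rates(P1, P2, N):
    """Corner point: user 1 decoded first (user 2 treated as noise),
    then user 2 decoded interference-free. The rates sum to capacity."""
    r1 = c(P1 / (P2 + N))
    r2 = c(P2 / N)
    return r1, r2

# The two corner-point rates add up to the sum capacity.
P1, P2, N = 2.0, 1.0, 1.0
r1, r2 = successive_cancellation_rates(P1, P2, N)
assert abs((r1 + r2) - mac_sum_capacity(P1, P2, N)) < 1e-12
```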

Multiple description source coding problem: one encoder feeding Decoder-1, Decoder-2, and a central Decoder-0. There is another version, with essentially the same coding techniques, that is more "amenable" to duality:

Multiple description source coding with no excess sum rate
Two correlated sources with a given joint distribution and a joint distortion measure. The encoders DO collaborate; the decoders DO NOT collaborate.
Problem: for a given joint distortion D, find the minimum sum rate R.
Rate-distortion region (Ahlswede '85): minimize the sum rate over independent descriptions such that E[d] <= D.

Duality (loose sense) between multiple description coding and multiple access channel coding
MD coding with no excess sum rate: collaboration at the encoder only; uses a successive refinement strategy.
MAC with independent message sets: collaboration at the decoder only; uses a successive cancellation strategy.

Theorem 4a: For multiple description coding with no excess sum rate, given the source alphabets, the reconstruction alphabets, and the joint distortion measure, find the optimal conditional distribution; it induces a joint distribution. Then there exists a dual multiple access channel whose channel distribution, input alphabets, output alphabets, and joint cost measure are read off from the induced distributions.

Such that: (1) the sum-rate-distortion bound equals the sum capacity-cost bound; (2) the induced distributions achieve optimality for this MA channel coding problem; (3) the joint cost measure is determined by the induced distributions. Similarly, for a given MA channel coding problem with independent message sets there is a dual MD source coding problem with no excess sum rate.

Example: given a Gaussian MA channel (linear channel H plus additive noise at each output), the sum-capacity optimization yields the optimal inputs and an equivalent channel-and-decoder structure through a linear transform A.

Dual MD coding problem: the corresponding test channel reuses the same linear structure (H and A), with the roles of encoder and decoder swapped.

What is addressed in this work:
- duality of empirical per-letter distributions
- extension of the Wyner-Ziv no-rate-loss result to more general cases
- the underlying connection between four multiuser communication problems
What is left to be addressed:
- duality of optimal source codes and channel codes
- rate loss in dual problems
- joint source-channel coding in dual problems

Conclusions
- Distributional relationship between MIMO source and channel coding
- Functional characterization: swappable encoder and decoder codebooks
- Highlighted the importance of source distortion and channel cost measures
- Cross-leveraging of advances in the applications of these fields