Zero-error source-channel coding with source side information at the decoder J. Nayak, E. Tuncel and K. Rose University of California, Santa Barbara.

Outline
- The problem
- Asymptotically vanishing error case
- Zero error: unrestricted input; restricted input
- How large are the gains?
- Conclusions

The Problem
Is separate source and channel coding optimal?
[Figure: joint scheme. A source-channel encoder maps U^n to X^n, the channel p(y|x) outputs Y^n, and a source-channel decoder with side information V^n produces Û^n]
[Figure: separate scheme. A source encoder/decoder pair is cascaded with a channel encoder/decoder pair over the same channel]
Does an encoder-decoder pair exist?

Asymptotically Vanishing Probability of Error
- Source coding: R > H(U|V) (Slepian-Wolf code)
- Channel coding: R < C
- Source-channel code (Shamai et al.)
- Communication is not possible if H(U|V) > C
- Separate source and channel coding is asymptotically optimal

Channel
- Channel transition probability p(y|x), y ∈ Y, x ∈ X
- Characteristic graph of the channel, G_X: vertex set X; two inputs are connected iff they are confusable, i.e., some output y has positive probability under both
- Examples: noiseless channel: edge-free graph; conventional noisy channel: complete graph

Channel Code
- Code = symbols from an independent set of G_X; 1-use capacity = log2 α(G_X)
- n uses of the channel: graph = G_X^n, the n-fold AND product of G_X
- Zero-error capacity: C(G_X) = lim_n (1/n) log2 α(G_X^n); depends only on the characteristic graph G_X
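The AND-product construction can be checked numerically. The sketch below (illustrative, not from the slides) brute-forces the independence number of the 2-fold AND product of the pentagon C5, recovering the size-5 independent set behind the classical bound C(C5) ≥ (1/2) log2 5:

```python
from itertools import combinations

# Pentagon (C5): vertices 0..4, i connected to i+1 mod 5
n = 5
adj = {(i, (i + 1) % n) for i in range(n)}
adj |= {(b, a) for a, b in adj}

def and_adjacent(u, v):
    """AND-product adjacency: distinct vertex tuples are connected
    iff every coordinate pair is equal or adjacent in C5."""
    if u == v:
        return False
    return all(a == b or (a, b) in adj for a, b in zip(u, v))

vertices = [(i, j) for i in range(n) for j in range(n)]

def is_independent(S):
    return all(not and_adjacent(u, v) for u, v in combinations(S, 2))

# Independence number of the 2-fold AND product, by exhaustive search
alpha2 = max(k for k in range(1, 7)
             if any(is_independent(S) for S in combinations(vertices, k)))
print(alpha2)  # 5, so C(C5) >= (1/2) log2 5
```

One maximum independent set is {(i, 2i mod 5)}; Lovász '79 showed this lower bound on C(C5) is tight.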

Source With Side Information
- (U,V) ∈ U x V ~ p(u,v)
- Support set S_UV = {(u,v) ∈ U x V : p(u,v) > 0}
- Confusability graph on U: G_U = (U, E_U); u and u' are connected iff some side information value v is jointly possible with both, i.e., (u,v) ∈ S_UV and (u',v) ∈ S_UV
- Examples: U = V: edge-free graph; U, V independent: complete graph
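As a small illustration of building G_U from the support set, the sketch below uses a hypothetical joint support (invented for the example): two source symbols are connected exactly when they share a possible side-information value.

```python
from itertools import combinations

# Hypothetical support S_UV: the decoder observes V and must recover U.
support = {(0, 'a'), (1, 'a'), (1, 'b'), (2, 'b'), (3, 'c')}

U = {u for u, _ in support}

def confusable(u1, u2):
    """u1 and u2 are connected in G_U iff some side information value v
    is consistent with both (the decoder could not tell them apart)."""
    v1 = {v for u, v in support if u == u1}
    v2 = {v for u, v in support if u == u2}
    return u1 != u2 and bool(v1 & v2)

edges = {frozenset((u1, u2)) for u1, u2 in combinations(U, 2)
         if confusable(u1, u2)}
print(sorted(tuple(sorted(e)) for e in edges))  # [(0, 1), (1, 2)]
```

Here 0 and 1 share side information 'a', 1 and 2 share 'b', and 3 is isolated, so 3 can always be decoded from its codeword class alone.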

Source Code
- Rate depends only on G_U
- Connected nodes cannot receive the same codeword ⇒ encoding = coloring of G_U
- Rate = log2 χ(G_U)
- Two cases: unrestricted inputs and restricted inputs
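Since a valid encoder is exactly a proper coloring of G_U, the 1-instance rate can be computed by brute force. A minimal sketch, assuming the pentagon as the confusability graph for illustration:

```python
from itertools import product

# Pentagon confusability graph on 5 source symbols
n = 5
edges = [(i, (i + 1) % n) for i in range(n)]

def chromatic_number(n_vertices, edge_list):
    """Smallest k for which a proper k-coloring exists (brute force)."""
    for k in range(1, n_vertices + 1):
        for col in product(range(k), repeat=n_vertices):
            if all(col[u] != col[v] for u, v in edge_list):
                return k
    return n_vertices

chi = chromatic_number(n, edges)
print(chi)  # 3: the 1-instance rate is log2 3 bits per source symbol
```

An odd cycle is not 2-colorable, so the scalar source code needs log2 3 bits even though H(U|V) may be much smaller.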

Unrestricted Input (UI)
- (u,v) not necessarily in S_UV; decode correctly whenever (u,v) ∈ S_UV
- 1-instance rate: log2 χ(G_U)
- n-instance graph: G_U^(n), the n-fold OR product of G_U
- Asymptotic rate for a UI code: R_UI(G_U) = lim_n (1/n) log2 χ(G_U^(n)) (= log2 of the fractional chromatic number of G_U)

Restricted Input (RI)
- (u,v) ∈ S_UV
- 1-instance rate: log2 χ(G_U)
- n-instance graph: G_U^n, the n-fold AND product of G_U
- Asymptotic rate = Witsenhausen rate of the source graph: R_W(G_U) = lim_n (1/n) log2 χ(G_U^n)
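The two product graphs differ only in the quantifier over coordinates: every coordinate equal-or-adjacent for the AND product, some coordinate adjacent for the OR product. A small sketch contrasting them on the pentagon (the edge counts are illustrative, not from the slides):

```python
from itertools import combinations

# Pentagon: i connected to i+1 mod 5
n = 5
adj = {(i, (i + 1) % n) for i in range(n)}
adj |= {(b, a) for a, b in adj}
V = [(i, j) for i in range(n) for j in range(n)]

def eq_or_adj(a, b):
    return a == b or (a, b) in adj

# AND product (restricted input): connected iff
# every coordinate is equal or adjacent.
and_edges = sum(1 for u, v in combinations(V, 2)
                if all(eq_or_adj(a, b) for a, b in zip(u, v)))

# OR product (unrestricted input): connected iff
# some coordinate is adjacent.
or_edges = sum(1 for u, v in combinations(V, 2)
               if any((a, b) in adj for a, b in zip(u, v)))

print(and_edges, or_edges)  # 100 200
```

Every AND-product edge is also an OR-product edge, so χ(G_U^(n)) ≥ χ(G_U^n) and hence R_UI(G_U) ≥ R_W(G_U).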

Source-Channel Coding
- 1 source instance - 1 channel use code: encoder φ_sc^1, decoder ψ_sc^1
- u1 and u2 not distinguishable given the side information ⇒ φ_sc^1(u1) and φ_sc^1(u2) must not be able to produce the same output y
- Equivalently: u1 and u2 connected in G_U ⇒ φ_sc^1(u1) and φ_sc^1(u2) are not connected in G_X and φ_sc^1(u1) ≠ φ_sc^1(u2)
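The condition above says a scalar code is a map sending every edge of G_U to a distinct, non-confusable pair of channel inputs. A minimal checker, assuming the pentagon pair for illustration (G_U = C5, and G_X also modeled as a pentagon, since C5 is isomorphic to its own complement):

```python
# G_U = C5 (source confusability); G_X = C5 (channel characteristic graph)
n = 5
cyc = {(i, (i + 1) % n) for i in range(n)}
cyc |= {(b, a) for a, b in cyc}
gu_edges = cyc
gx_edges = cyc

def is_scalar_code(phi):
    """phi is a valid 1-1 zero-error code iff for every edge (u1, u2) of G_U,
    phi(u1) != phi(u2) and the images are NOT confusable in G_X."""
    return all(phi[u1] != phi[u2] and (phi[u1], phi[u2]) not in gx_edges
               for (u1, u2) in gu_edges)

# Doubling map sends each C5 edge to a distance-2 (non-confusable) pair
phi = {u: (2 * u) % n for u in range(n)}
print(is_scalar_code(phi))  # True
print(is_scalar_code({u: u for u in range(n)}))  # False: identity maps edges to edges
```

This is exactly a graph homomorphism from G_U into the complement of G_X, which is how the complexity result later in the talk is phrased.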

Source-Channel Coding
- If an n-n UI (RI) code exists for some n, (G_U, G_X) is a UI (RI) compatible pair
- (G, Ḡ) is always both a UI and an RI compatible pair

Unrestricted Input: Example
[Figure: source graph G_U5 = pentagon on vertices a-e; channel graph G_X5 = its complement on vertices A-E]
- C(G_X5) = log2 √5
- R_UI(G_U5) = log2 (5/2) > C(G_X5)
- Source = complement of channel, so the pair is compatible and a 1-1 joint code exists, yet separate coding needs a strictly higher rate

Restricted Input
- The previous example is not useful here: R_W(G_U5) = log2 √5 = C(G_X5)
- Setup: source graph G_U = complement of channel graph G_X
- Approach: find f(G) such that C(G) ≤ f(G) ≤ R_W(Ḡ)
- If either inequality is strict, done!

Lovász theta function: ϑ(G)
- Lovász '79: C(G) ≤ log2 ϑ(G)
- Key result: log2 ϑ(G) ≤ R_W(Ḡ)

Restricted Input
- G_U = Schläfli graph (27-vertex graph); G_X = Ḡ_U
- Haemers' bound: C(G_X) < log2 ϑ(G_X) for this graph, so the first inequality is strict
- A zero-error code exists since G_U = Ḡ_X

How Large Are the Gains?
- Measured in channel uses per source symbol
- Alon '90: there exist graphs for which the gap between C(G) and the corresponding Witsenhausen rate grows without bound
- Given l, there exist G such that separate coding requires at least l times as many channel uses per source symbol as joint coding

Conclusions
- Under a zero-error constraint, separate source and channel coding is asymptotically suboptimal; this is not so in the asymptotically vanishing error case.
- In the zero-error case, the gains from joint coding can be arbitrarily large.

Scalar Code Design Complexity
- Instance: source graph G. Question: does a scalar source-channel code exist from G to channel H?
- Equivalent to the graph homomorphism problem from G into H
- NP-complete whenever H is non-bipartite (Hell & Nešetřil '90)

Future Work
- Do UI compatible pairs (G_U, G_X) exist with R_W(G_U) < C(G_X) < R_UI(G_U)?
- For what classes of graphs is separate coding optimal?