Compression with Side Information using Turbo Codes Anne Aaron and Bernd Girod Information Systems Laboratory Stanford University Data Compression Conference.

Presentation transcript:

Compression with Side Information using Turbo Codes. Anne Aaron and Bernd Girod, Information Systems Laboratory, Stanford University. Data Compression Conference, April 3, 2002.

Overview: Introduction; Turbo Coder and Decoder; Compression of Binary Sequences; Extension to Continuous-Valued Sequences; Joint Source-Channel Coding; Conclusion.

Distributed Source Coding. [Figure: two statistically dependent sources, each compressed by a separate encoder and reconstructed by a joint decoder; Slepian-Wolf theorem.]

Research Problem. Motivation: the Slepian-Wolf theorem shows that statistically dependent signals can be compressed in a distributed manner to the same rate as a system in which the signals are compressed jointly. Objective: design practical codes that achieve compression close to the Slepian-Wolf bound.

Asymmetric Scenario: Compression with Side Information. [Figure: X is encoded without access to the statistically dependent Y, which is available at the decoder.] Compression techniques for sending Y at a rate close to H(Y) are well known; some form of switching between the roles of the two sources can be used to obtain more symmetric rates.

Our Approach: Turbo Codes. Turbo codes were developed for channel coding and perform close to the Shannon channel capacity limit (Berrou et al., 1993). Similar work: Garcia-Frias and Zhao, 2001 (Univ. of Delaware); Bajcsy and Mitran, 2001 (McGill Univ.).

System Set-up. X and Y are i.i.d. binary sequences X_1 X_2 … X_L and Y_1 Y_2 … Y_L with equally probable ones and zeros. X_i is independent of Y_j for i ≠ j, but dependent on Y_i; the dependency between X and Y is described by the pmf P(x|y). Y is sent at rate R_Y ≥ H(Y) and is available as side information at the decoder. [Figure: X is encoded and decoded with the statistically dependent Y as side information.]
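
As an illustration of this correlation model, the following sketch (with hypothetical parameters, not taken from the paper) generates a pair of sequences in which Y is i.i.d. equiprobable and X differs from Y independently in each position with probability p:

```python
import numpy as np

def correlated_binary_pair(L, p, seed=None):
    """Y is i.i.d. with equiprobable ones and zeros; X differs from Y
    independently in each position with probability p."""
    rng = np.random.default_rng(seed)
    y = rng.integers(0, 2, size=L)         # side-information sequence Y
    flips = rng.random(L) < p              # positions where X_i != Y_i
    x = y ^ flips.astype(int)              # X_i depends only on Y_i
    return x, y

x, y = correlated_binary_pair(10_000, p=0.05, seed=1)
print("empirical P(X != Y):", (x != y).mean())
```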

Turbo Coder. [Figure: the L input bits enter one systematic convolutional encoder directly and a second one through a length-L interleaver; the systematic outputs are discarded and only the parity bits of the two constituent encoders are transmitted.]
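
To make the structure concrete, here is a minimal sketch of a parallel concatenation in Python. It uses a toy memory-2, rate-1/2 recursive systematic convolutional code (generators 1 and 5/7 in octal) rather than the 16-state, rate-4/5 constituent codes of the actual system, and it omits puncturing and trellis termination:

```python
import numpy as np

def rsc_parity(bits):
    """Parity output of a toy recursive systematic convolutional encoder:
    feedback polynomial 1+D+D^2 (7 octal), feedforward 1+D^2 (5 octal)."""
    s1 = s2 = 0
    parity = []
    for u in bits:
        fb = u ^ s1 ^ s2          # feedback (recursive) bit
        parity.append(fb ^ s2)    # feedforward taps applied to the state
        s2, s1 = s1, fb           # shift-register update
    return np.array(parity)

def turbo_encode(x, interleaver):
    """Parallel concatenation: keep only the two parity streams,
    discard the systematic bits (they are recovered via the side information)."""
    p1 = rsc_parity(x)                 # first constituent encoder
    p2 = rsc_parity(x[interleaver])    # second encoder sees interleaved input
    return p1, p2

rng = np.random.default_rng(0)
L = 8
x = rng.integers(0, 2, size=L)
pi = rng.permutation(L)                # length-L interleaver
p1, p2 = turbo_encode(x, pi)
print(p1, p2)  # with this toy code: 2 parity bits per source bit (the paper's
               # rate-4/5 constituents give 0.5 bit per source bit instead)
```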

Turbo Decoder. [Figure: channel-probability calculations convert the received parity bits and the side information into P_channel for two SISO decoders; the decoders exchange extrinsic information (P_extrinsic serving as P_a-priori) through the length-L interleaver and deinterleaver, and a final decision on P_a-posteriori produces the L output bits.]
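
The main difference from channel-coding turbo decoding is that the channel probabilities for the systematic bits come from the side information through P(x|y). A small sketch for the binary model P(X_i ≠ Y_i) = p (the model and parameter values are assumptions for illustration, not the paper's decoder code):

```python
import numpy as np

def side_info_llrs(y, p):
    """Log-likelihood ratios log P(X_i=0 | Y_i) / P(X_i=1 | Y_i) for the model
    P(X_i != Y_i) = p; these play the role of P_channel for the systematic bits."""
    llr_if_y0 = np.log((1 - p) / p)            # Y_i = 0 favours X_i = 0
    return np.where(y == 0, llr_if_y0, -llr_if_y0)

y = np.array([0, 1, 1, 0])
print(side_info_llrs(y, p=0.05))
```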

Simulation: Binary Sequences. X-Y relationship: P(X_i = Y_i) = 1 - p and P(X_i ≠ Y_i) = p. System: 16-state, rate-4/5 constituent convolutional codes; R_X = 0.5 bit per input bit with no puncturing. Theoretically, it must be possible to send X without error when H(X|Y) ≤ 0.5.
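
For this model H(X|Y) equals the binary entropy H_b(p), so the condition H(X|Y) ≤ 0.5 can be checked numerically. The sketch below (an illustration, not the paper's code) finds the largest crossover probability that the rate R_X = 0.5 can theoretically support:

```python
import numpy as np
from scipy.optimize import brentq

def hb(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

# For this correlation model, H(X|Y) = H_b(p).
for p in (0.01, 0.05, 0.11, 0.20):
    print(f"p = {p:4.2f}  H(X|Y) = {hb(p):.3f} bit")

# Largest p for which H(X|Y) <= 0.5 bit (Slepian-Wolf limit at R_X = 0.5):
p_star = brentq(lambda p: hb(p) - 0.5, 1e-6, 0.5)
print(f"H_b(p) = 0.5 at p ~ {p_star:.3f}")
```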

Results: Compression of Binary Sequences. [Plot: performance at R_X = 0.5 bit.]

Results for Different Rates. The parity bits were punctured to achieve lower rates. [Plot.]

Extension to Continuous-Valued Sequences. X and Y are sequences of i.i.d. continuous-valued random variables X_1 X_2 … X_L and Y_1 Y_2 … Y_L. X_i is independent of Y_j for i ≠ j, but dependent on Y_i; the dependency is described by the pdf f(x|y). Y is known as side information at the decoder. [Figure: the L values of X are quantized to 2^M levels, the L symbols are converted to ML bits, and the bits are passed through the turbo coder to the decoder.]
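
A minimal sketch of this encoder front end (the quantizer thresholds and parameters below are placeholders, not the ones used in the paper):

```python
import numpy as np

def encode_front_end(x, thresholds):
    """Quantize L continuous values to 2^M levels and unpack to M bits per symbol."""
    symbols = np.searchsorted(thresholds, x)          # quantizer indices in 0 .. 2^M - 1
    m = int(np.ceil(np.log2(len(thresholds) + 1)))    # M = bits per symbol
    bits = ((symbols[:, None] >> np.arange(m)[::-1]) & 1).astype(int)
    return bits.reshape(-1)                           # M*L bits for the turbo coder

x = np.random.default_rng(0).normal(size=6)
print(encode_front_end(x, thresholds=np.array([-1.0, 0.0, 1.0])))  # 2^M = 4 levels
```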

Simulation: Gaussian Sequences. X-Y relationship: X is a sequence of i.i.d. Gaussian random variables; Y_i = X_i + Z_i, where Z is also a sequence of i.i.d. Gaussian random variables, independent of X, so f(x|y) is a Gaussian probability density function. System: 4-level Lloyd-Max scalar quantizer; 16-state, rate-4/5 constituent convolutional codes; no puncturing, so the rate is 1 bit/source sample.
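
The following sketch illustrates the source model and a Lloyd-Max quantizer designed by Lloyd's iteration on training samples; the noise variance and sample sizes are arbitrary choices for illustration, not the paper's settings:

```python
import numpy as np

def lloyd_max(samples, levels=4, iters=100):
    """Design a Lloyd-Max scalar quantizer from training samples (1-D Lloyd iteration)."""
    codebook = np.quantile(samples, (np.arange(levels) + 0.5) / levels)  # initial levels
    for _ in range(iters):
        thresholds = (codebook[:-1] + codebook[1:]) / 2    # nearest-neighbour boundaries
        idx = np.searchsorted(thresholds, samples)
        codebook = np.array([samples[idx == k].mean() for k in range(levels)])  # centroids
    return thresholds, codebook

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)                  # unit-variance Gaussian source X
z = 0.5 * rng.normal(size=x.size)             # independent Gaussian "correlation noise" Z
y = x + z                                     # side information Y_i = X_i + Z_i
csnr_db = 10 * np.log10(x.var() / z.var())    # CSNR = var(X)/var(Z), here about 6 dB
t, c = lloyd_max(x, levels=4)
print("CSNR (dB):", round(csnr_db, 2), "thresholds:", t.round(3), "levels:", c.round(3))
```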

Results: Compression of Gaussian Sequences. CSNR is the ratio of the variances of X and Z; R_X = 1 bit/sample. [Plot; annotated gap: 2.8 dB.]

Joint Source-Channel Coding. Assume that the parity bits pass through a memoryless channel with capacity C. We can include the channel statistics in the decoder calculations for P_channel. From the Slepian-Wolf theorem and the definition of channel capacity, the transmitted rate must satisfy R_X ≥ H(X|Y)/C.
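
For example, with the binary correlation model and a binary symmetric channel, C = 1 - H_b(q), so the bound can be evaluated directly (the p and q values below are illustrative, not the paper's):

```python
import numpy as np

def hb(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def min_rate(p, q):
    """Minimum transmitted rate R_X >= H(X|Y)/C for the binary correlation model
    (H(X|Y) = H_b(p)), with parity bits sent over a BSC of crossover probability q
    (C = 1 - H_b(q))."""
    return hb(p) / (1.0 - hb(q))

for p, q in [(0.05, 0.0), (0.05, 0.05), (0.11, 0.05)]:
    print(f"p={p}, q={q}: R_X >= {min_rate(p, q):.3f} bit per source bit")
```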

Results: Joint Source-Channel Coding. [Plot: R_X = 0.5, BSC with crossover probability q = …; annotated gaps of … bit and 0.15 bit.]

Conclusion. Turbo codes can be used for compression of binary sequences and perform close to the Slepian-Wolf bound for lossless distributed source coding. The system can also be applied to compression of distributed continuous-valued sequences, where it performs better than previous techniques. It extends easily to joint source-channel coding.