Compression with Side Information Using Turbo Codes
Anne Aaron and Bernd Girod
Information Systems Laboratory, Stanford University
Data Compression Conference, April 3, 2002
Overview
– Introduction
– Turbo Coder and Decoder
– Compression of Binary Sequences
– Extension to Continuous-Valued Sequences
– Joint Source-Channel Coding
– Conclusion
Distributed Source Coding
[Diagram: two statistically dependent sources are compressed by separate encoders and reconstructed by a joint decoder; the Slepian-Wolf theorem gives the achievable rate region.]
Research Problem
Motivation
– Slepian-Wolf theorem: It is possible to compress statistically dependent signals in a distributed manner to the same total rate as a system where the signals are compressed jointly.
Objective
– Design practical codes that achieve compression close to the Slepian-Wolf bound.
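For reference, the Slepian-Wolf rate region in its standard form (the inequalities are standard, not reproduced from the slide): distributed lossless coding of X and Y succeeds whenever

    R_X ≥ H(X|Y)
    R_Y ≥ H(Y|X)
    R_X + R_Y ≥ H(X,Y)

so the achievable sum rate matches what joint compression of the two signals would need.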
Asymmetric Scenario – Compression with Side Information
[Diagram: Y is conventionally compressed and available at the decoder; X is encoded separately and decoded using the statistically dependent Y as side information.]
– Compression techniques to send Y at a rate close to H(Y) are well known.
– Some type of switching can be performed to obtain more symmetric rates.
Our Approach: Turbo Codes
Turbo codes
– Developed for channel coding
– Perform close to the Shannon channel capacity limit (Berrou et al., 1993)
Similar work
– Garcia-Frias and Zhao, 2001 (Univ. of Delaware)
– Bajcsy and Mitran, 2001 (McGill Univ.)
System Set-up
[Diagram: X is encoded and decoded with the statistically dependent sequence Y available at the decoder.]
– X and Y are i.i.d. binary sequences X_1 X_2 … X_L and Y_1 Y_2 … Y_L with equally probable ones and zeros.
– X_i is independent of Y_j for i ≠ j, but dependent on Y_i. The dependency between X and Y is described by the pmf P(x|y).
– Y is sent at rate R_Y ≥ H(Y) and is available as side information at the decoder.
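A minimal sketch of this source model, assuming the binary symmetric correlation P(X_i ≠ Y_i) = p used later in the simulations; the sequence length and the value of p are illustrative:

    import numpy as np

    rng = np.random.default_rng(0)
    L, p = 10_000, 0.05                         # illustrative parameters
    y = rng.integers(0, 2, L, dtype=np.uint8)   # equiprobable side information
    flips = rng.random(L) < p                   # X_i differs from Y_i w.p. p
    x = y ^ flips.astype(np.uint8)              # statistically dependent source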
Turbo Coder
[Block diagram: the L input bits enter one systematic convolutional encoder directly and a second identical encoder through a length-L interleaver; each encoder's systematic bits are discarded, so only the parity bits are transmitted.]
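A minimal sketch of this structure, using a 4-state, rate-1/2 recursive systematic convolutional code (generators 1 and 5/7 octal) as a simpler stand-in for the 16-state, rate-4/5 constituent codes of the actual system; all names are illustrative:

    import numpy as np

    def rsc_parity(bits):
        # parity stream of a 4-state rate-1/2 RSC code; stand-in for
        # the paper's 16-state rate-4/5 constituent codes
        m1 = m2 = 0
        out = []
        for u in bits:
            a = u ^ m1 ^ m2          # recursive feedback (octal 7)
            out.append(a ^ m2)       # feedforward parity taps (octal 5)
            m1, m2 = a, m1
        return np.array(out, dtype=np.uint8)

    def turbo_encode(x, perm):
        # two parity streams, the second on the interleaved input;
        # the systematic bits are discarded, as in the diagram
        return rsc_parity(x), rsc_parity(x[perm])

Here perm would be a fixed pseudo-random length-L permutation (e.g. rng.permutation(L)) shared by encoder and decoder.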
Turbo Decoder
[Block diagram: the received parity streams pass through channel-probability calculations that feed two SISO decoders; each decoder's extrinsic output P_extrinsic is passed through the length-L interleaver or deinterleaver to become the other decoder's a priori input P_a priori alongside P_channel, and the final decision is made on the a posteriori probabilities P_a posteriori.]
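At the decoder, the side information Y plays the role of the missing systematic bits. A minimal sketch of that channel-probability calculation under the binary correlation model P(X_i ≠ Y_i) = p; the SISO (BCJR) recursions and the iterative extrinsic exchange between the two decoders are omitted:

    import numpy as np

    def side_info_llrs(y, p):
        # LLR_i = log P(X_i = 0 | Y_i) / P(X_i = 1 | Y_i) when P(X_i != Y_i) = p;
        # these feed the SISO decoders as P_channel for the systematic part
        mag = np.log((1 - p) / p)
        return np.where(y == 0, mag, -mag)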
Simulation: Binary Sequences
X–Y relationship
– P(X_i = Y_i) = 1 − p and P(X_i ≠ Y_i) = p
System
– 16-state, rate-4/5 constituent convolutional codes
– R_X = 0.5 bit per input bit with no puncturing
– Theoretically, it must be possible to send X without error when H(X|Y) ≤ 0.5
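A quick check of this bound (a sketch; the crossover value comes from the binary entropy function, not from the slide):

    import numpy as np

    def h_binary(p):
        # binary entropy in bits; equals H(X|Y) under P(X_i != Y_i) = p
        return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

    print(h_binary(0.11))   # ~0.50, so R_X = 0.5 bit suffices up to p of about 0.11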
Results: Compression of Binary Sequences
[Plot: residual error rate of the decoded X versus H(X|Y) at R_X = 0.5 bit per input bit.]
Results for Different Rates
– Punctured the parity bits to achieve lower rates
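The slide does not give the puncturing pattern; a periodic pattern applied to each parity stream is one common choice (a sketch, names illustrative):

    def puncture(parity, period):
        # keep every `period`-th parity bit (hypothetical periodic pattern),
        # lowering R_X from 0.5 to 0.5 / period bit per input bit
        return parity[::period]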
Extension to Continuous-Valued Sequences
– X and Y are sequences of i.i.d. continuous-valued random variables X_1 X_2 … X_L and Y_1 Y_2 … Y_L.
– X_i is independent of Y_j for i ≠ j, but dependent on Y_i. The dependency between X and Y is described by the pdf f(x|y).
– Y is known as side information at the decoder.
[Block diagram: the L source values are quantized to 2^M levels, the L quantizer symbols are converted to ML bits, and the bits enter the turbo coder (with its length-L interleaver) for transmission to the decoder.]
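A minimal sketch of the quantize-and-binarize front end; uniform quantizer levels stand in for the Lloyd-Max levels of the next slide, and all names are illustrative:

    import numpy as np

    def quantize_to_bits(x, levels):
        # nearest-level scalar quantization, then symbol -> M bits (MSB first)
        M = int(np.log2(len(levels)))
        idx = np.argmin(np.abs(x[:, None] - levels[None, :]), axis=1)
        bits = (idx[:, None] >> np.arange(M - 1, -1, -1)) & 1
        return bits.astype(np.uint8).ravel()   # L values -> M*L bits

    # e.g. 4 levels (M = 2); uniform levels stand in for Lloyd-Max here
    bits = quantize_to_bits(np.random.default_rng(0).normal(size=8),
                            np.array([-1.5, -0.5, 0.5, 1.5]))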
Simulation: Gaussian Sequences
X–Y relationship
– X is a sequence of i.i.d. Gaussian random variables
– Y_i = X_i + Z_i, where Z is also a sequence of i.i.d. Gaussian random variables, independent of X; f(x|y) is then a Gaussian probability density function
System
– 4-level Lloyd-Max scalar quantizer
– 16-state, rate-4/5 constituent convolutional codes
– No puncturing, so the rate is 1 bit per source sample
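A minimal sketch of this source model; the CSNR value and the sequence length are illustrative, not taken from the slide:

    import numpy as np

    rng = np.random.default_rng(0)
    L, csnr_db = 10_000, 12.0                       # illustrative parameters
    x = rng.normal(0.0, 1.0, L)                     # unit-variance source (assumed)
    z = rng.normal(0.0, 10 ** (-csnr_db / 20), L)   # so var(X)/var(Z) = CSNR
    y = x + z                                       # side information at the decoder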
Results: Compression of Gaussian Sequences
– CSNR = ratio of the variances of X and Z
[Plot: performance at R_X = 1 bit/sample; annotated gap of 2.8 dB.]
Joint Source-Channel Coding
– Assume that the parity bits pass through a memoryless channel with capacity C.
– We can include the channel statistics in the decoder's calculation of P_channel.
– From the Slepian-Wolf theorem and the definition of channel capacity, lossless transmission requires R_X ≥ H(X|Y) / C.
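A small numeric check of this combined bound (a sketch; the parameter values are illustrative):

    import math

    def hb(p):
        # binary entropy in bits
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    q, p = 0.05, 0.05          # illustrative BSC crossover and source correlation
    C = 1 - hb(q)              # BSC capacity
    print(hb(p) / C)           # minimum R_X in bits per input bit, ~0.40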
Results: Joint Source-Channel Coding
[Plot: results for R_X = 0.5 bit over a BSC with crossover probability q (value not preserved); annotated gap of 0.15 bit.]
Conclusion
– We can use turbo codes for the compression of binary sequences and perform close to the Slepian-Wolf bound for lossless distributed source coding.
– We can apply the system to the compression of distributed continuous-valued sequences, where it performs better than previous techniques.
– The system extends easily to joint source-channel coding.