
1 Distributed Source Coding. Trial Lecture, Fredrik Hekland, 1 June 2007

2 Outline
●Concept of DSC
●Slepian-Wolf coding (lossless)
●Wyner-Ziv coding (lossy)
●Application areas

3 Distributed Source Coding - Sensor Networks

4 Correlated Sources X and Y: entropy H(X), conditional entropy H(Y|X), joint entropy H(X,Y), mutual information I(X;Y)
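These quantities can be computed directly from a joint distribution. A minimal sketch, using a hypothetical joint pmf for two binary sources that agree 90% of the time (the numbers are illustrative, not from the lecture):

```python
from math import log2

# Hypothetical joint pmf of two correlated binary sources (illustrative values).
# p[(x, y)] = P(X=x, Y=y); X and Y agree with probability 0.9.
p = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}

def H(dist):
    """Entropy in bits of a distribution given as a dict of probabilities."""
    return -sum(q * log2(q) for q in dist.values() if q > 0)

# Marginal distributions of X and Y
px = {x: sum(q for (a, b), q in p.items() if a == x) for x in (0, 1)}
py = {y: sum(q for (a, b), q in p.items() if b == y) for y in (0, 1)}

H_xy = H(p)                  # joint entropy H(X,Y)
H_x, H_y = H(px), H(py)      # marginal entropies
H_x_given_y = H_xy - H_y     # chain rule: H(X|Y) = H(X,Y) - H(Y)
I_xy = H_x + H_y - H_xy      # mutual information I(X;Y)

print(f"H(X)={H_x:.3f}  H(X|Y)={H_x_given_y:.3f}  "
      f"H(X,Y)={H_xy:.3f}  I(X;Y)={I_xy:.3f}")
```

Note that H(X,Y) < H(X) + H(Y) whenever the sources are correlated, which is exactly the gap the coding schemes below exploit.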

5 Co-located, Correlated Observations
●X and Y correlated
●Both encoder and decoder know the correlation
●Rate: R = H(X,Y) = H(Y) + H(X|Y) < H(X) + H(Y)

6 Distributed, but Correlated Observations
●X and Y spatially separated, but still correlated
●Informed encoders → Rate: R = R_X + R_Y = H(X,Y) = H(Y) + H(X|Y)
●Uninformed, naive encoders → Rate: R = R_X + R_Y = H(X) + H(Y) > H(X,Y)

7 Slepian-Wolf Coding (SWC)
●X and Y spatially separated, but still correlated
●Encoder/decoder designed w.r.t. p(X,Y)
●No communication between encoders!
●R = R_X + R_Y = H(X,Y) = H(Y) + H(X|Y) still possible!

8 Achievable Rate Region - SWC
[Figure: the Slepian-Wolf rate region. Corner points: code X with Y as side-information, or code Y with X as side-information; intermediate points via time-sharing / source splitting / code partitioning. "Lossless" here means vanishing error probability for long sequences, not zero errors.]
Slepian & Wolf, "Noiseless Coding of Correlated Information Sources," IEEE Trans. Inf. Theory, Jul. 1973

9 Principle - SWC
[Figure: random binning. Code Y^n directly with 2^{nH(Y)} codewords (R_Y = nH(Y)); apply 2^{nH(X|Y)} colors randomly to the X^n sequences and send only the color (R_X = nH(X|Y)).]

10 Toy Example - Binary Source
●X and Y are 3 bits each
●X and Y differ in at most one bit
1.Partition the X's into cosets with Hamming distance 3: {000,111}, {100,011}, {010,101}, {001,110}
2.Send the index of X's coset (requires 2 bits)
3.Send Y (requires 3 bits)
4.Decode X as the element of the indexed coset closest to Y
5.Declare an error if no element has d_H ≤ 1
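The toy example above can be run end to end. A minimal sketch: since the two members of each coset are Hamming distance 3 apart, side-information within distance 1 of X always singles out the right member.

```python
# Cosets of 3-bit words with pairwise Hamming distance 3, as in the toy example.
COSETS = [{0b000, 0b111}, {0b100, 0b011}, {0b010, 0b101}, {0b001, 0b110}]

def hamming(a, b):
    """Hamming distance between two integers viewed as bit strings."""
    return bin(a ^ b).count("1")

def encode(x):
    """Encoder sends only the 2-bit index of the coset containing x."""
    return next(i for i, c in enumerate(COSETS) if x in c)

def decode(index, y):
    """Decoder picks the coset member closest to the side-information y."""
    best = min(COSETS[index], key=lambda x: hamming(x, y))
    if hamming(best, y) > 1:
        raise ValueError("no coset member within Hamming distance 1 of y")
    return best

# Exhaustive check: every (x, y) pair with d_H(x, y) <= 1 decodes exactly.
for x in range(8):
    for y in range(8):
        if hamming(x, y) <= 1:
            assert decode(encode(x), y) == x
print("all pairs with d_H <= 1 decoded correctly")
```

X thus costs only 2 bits instead of 3, matching the rate H(X|Y) intuition of the previous slide.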

11 SWC Design
●The proof in Slepian & Wolf's article is non-constructive
●Important realization: SWC is a channel coding problem
●"Virtual" correlation channel between X and Y
●A good channel code for this channel can provide a good SW code by using coset codes as bins

12 Wyner's Scheme
●Use a linear block code, send the syndrome
●(n,k) block code: 2^(n-k) syndromes, each corresponding to a set of 2^k words of length n
●Each set is a coset code
●Compression ratio of n:(n-k)
A. Wyner, "Recent Results in the Shannon Theory," IEEE Trans. Inf. Theory, Jan. 1974

13 Practical SWC Design
●Use more powerful channel codes: LDPC / Turbo codes
●Send parity bits: Zhao & Garcia-Frias, "Data compression of correlated non-binary sources using punctured turbo codes," DCC '02
●Or send the syndrome: Liveris et al., "Compression of binary sources with side information at the decoder using LDPC codes," IEEE Commun. Lett., vol. 6, no. 10, 2002

14 SWC using LDPC codes
Xiong et al., "Distributed Source Coding for Sensor Networks," IEEE Sig. Proc. Mag., Sept. 2004

15 Continuous Sources - Wyner-Ziv Coding (WZC)
●Generalizes SWC by introducing a fidelity criterion
●A joint source-channel coding problem
●We need:
 - A good source coder to achieve the source coding gains (e.g. TCQ)
 - A good channel code which approaches the Slepian-Wolf limit (LDPC)
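The quantize-then-bin idea can be sketched with nested scalar quantization instead of the TCQ + LDPC construction the slide names: quantize finely, send the index only modulo N, and let the decoder resolve the ambiguity with its side-information. All parameter values here are illustrative assumptions.

```python
# Minimal Wyner-Ziv sketch with nested scalar quantization (a simplification
# of the TCQ + LDPC scheme, for illustration only).
Q = 0.5   # fine quantizer step (assumed)
N = 8     # number of cosets; rate = log2(N) = 3 bits per sample (assumed)

def encode(x):
    """Quantize x with step Q, then send only the index modulo N (the bin)."""
    return round(x / Q) % N

def decode(bin_index, y):
    """Pick the reconstruction level in the received bin closest to y."""
    k = round(y / Q)  # quantizer index nearest the side-information
    # candidate indices around y that fall in the received bin
    cands = [i for i in range(k - N, k + N + 1) if i % N == bin_index]
    return Q * min(cands, key=lambda i: abs(Q * i - y))

# Correlated pair: y = x + small noise. As long as |x - y| is small relative
# to N*Q/2, the reconstruction error stays within one quantizer cell (Q/2).
x, y = 3.7, 3.9
x_hat = decode(encode(x), y)
assert abs(x_hat - x) <= Q / 2
```

If the correlation noise exceeds the bin spacing N*Q/2, the decoder picks the wrong coset member, which is the analogue of a Slepian-Wolf decoding error.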

16 Wyner-Ziv Rate-Distortion Function
R_WZ(D) = min [ I(X;Z) - I(Y;Z) ], where the minimum is over auxiliary variables Z with Z - X - Y a Markov chain and decoders X̂ = f(Y,Z) satisfying E[d(X,X̂)] ≤ D.
In general R_WZ(D) ≥ R_X|Y(D); equality holds e.g. for quadratic Gaussian sources (no rate loss).

17 Distributed Source Coding Using Syndromes (DISCUS)
●First constructive design approach for WZC
●Trellis-based quantization and coset construction: 2-5 dB away from the WZ bound
●[Yang et al. '03]: SWC-TCVQ with irregular LDPC (n = 10^6) and 2-D TCVQ; quadratic Gaussian: 0.47 dB away at 3.3 bit/symbol
Pradhan & Ramchandran, "Distributed Source Coding Using Syndromes (DISCUS): Design and Construction," Data Compression Conf. (DCC), 1999

18 Other Approaches to Lossy DSC
●Distributed Karhunen-Loève transform: suffers from local minima
●Distributed scalar quantizers optimized for noisy channels: simpler encoder, but local minima

19 Application Areas
●Sensor networks
●Multimedia transmission
●Robust coding for co-located sources
 - Digitally enhanced analog TV
 - Multiple description coding
●Data hiding / watermarking
●Coding for multiple access channels
●MIMO broadcast channel
●Searchable compression (…)

20 Sensor Networks
●Possible rate savings with WZC
●Hard to find the correlation model
 - Can be determined through training
 - But what about time-varying correlation?

21 Wyner-Ziv for Video Compression (1/3)
●MPEG: high encoder complexity
●Portables: less powerful hardware
●Solution: Wyner-Ziv video coding
 - Shifts complexity to the decoder
 - Transcoding to MPEG provides a simple decoder for the receiver

22 Wyner-Ziv for Video Compression (2/3)

23 Wyner-Ziv for Video Compression (3/3)
Girod et al., "Distributed Video Coding," Proc. IEEE, Jan. 2005

24 Digitally Enhanced Analog TV

25 Watermarking
●"Hide" a message W inside a host X
●A dual problem to DSC: channel coding with side-information at the encoder
●Attacker tries to remove/destroy the watermark W; the source X must be preserved
●For an AWGN attack, knowledge of X only at the encoder is as good as knowing X at both encoder and decoder
Costa, "Writing on Dirty Paper," IEEE Trans. Inf. Theory, May 1983

26 MIMO Broadcast Channel
●Non-degraded broadcast channel
 - Cannot use superposition coding with successive decoding
 - Related to watermarking: dirty paper coding!
●Costa's "writing on dirty paper" scheme
 - Adapt to the interference, don't try to cancel it
 - User 1's signal is the host; insert the "watermark" as the message to User 2
●Trade-off: complexity moves from the receiver to the transmitter

27 Summary
●Distributed Source Coding
 - Enables compression of correlated, spatially separated sources
 - Slepian-Wolf coding: lossless
 - Wyner-Ziv coding: lossy
●Other uses
 - Multimedia
 - Watermarking
 - Multiple access / broadcast channels / MIMO

28 Further Reading
Slepian & Wolf, "Noiseless Coding of Correlated Information Sources," IEEE Trans. Inf. Theory, Jul. 1973
Wyner & Ziv, "The Rate-Distortion Function for Source Coding with Side Information at the Decoder," IEEE Trans. Inf. Theory, Jan. 1976
Pradhan & Ramchandran, "Distributed Source Coding Using Syndromes (DISCUS): Design and Construction," IEEE Trans. Inf. Theory, Mar. 2003
Pradhan et al., "Distributed Compression in a Dense Microsensor Network," IEEE Sig. Proc. Mag., Mar. 2002
Xiong et al., "Distributed Source Coding for Sensor Networks," IEEE Sig. Proc. Mag., Sept. 2004
Yang et al., "Wyner-Ziv Coding Based on TCQ and LDPC Codes," 37th Asilomar Conference on Signals, Systems and Computers, 2003
Girod et al., "Distributed Video Coding," Proc. IEEE, Jan. 2005
Cox et al., "Watermarking as Communications with Side Information," Proc. IEEE, Jul. 1999

29 Wyner's Scheme - Toy Example
●X and Y each 3 bits; they differ in at most one bit
●Compute the syndrome s = xH^T of the input vector x with the parity check matrix H

30 Wyner's Scheme - Toy Example - Cosets (s = xH^T)
x    s
000  00
111  00
010  01
101  01
001  10
110  10
011  11
100  11
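A runnable sketch of the syndrome scheme in this toy example. The parity check matrix H below is one choice consistent with the coset table above (the slide's own matrix image is not in the transcript, so treat H as an assumption); syndrome decoding recovers x from its 2-bit syndrome plus the side-information y.

```python
import numpy as np

# Assumed parity check matrix, consistent with the coset table:
# s1 = x1 + x3, s2 = x1 + x2 (mod 2).
H = np.array([[1, 0, 1],
              [1, 1, 0]])

def syndrome(x):
    """2-bit syndrome s = x H^T over GF(2)."""
    return tuple(H @ x % 2)

def decode(s, y):
    """Find the x with syndrome s at Hamming distance <= 1 from y."""
    # try error patterns of weight 0 and 1 applied to the side-information
    errors = [np.zeros(3, dtype=int)] + [np.eye(3, dtype=int)[i] for i in range(3)]
    for e in errors:
        x = (y + e) % 2
        if syndrome(x) == s:
            return x
    raise ValueError("no word with that syndrome within distance 1 of y")

x = np.array([1, 0, 1])
y = np.array([1, 1, 1])  # differs from x in one bit
assert (decode(syndrome(x), y) == x).all()
```

Only the 2-bit syndrome of x is transmitted, giving the 3:2 compression ratio of the (n, k) = (3, 1) code.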

31 Wyner's Scheme - Toy Example: send only the 2-bit syndromes

