1
Distributed Compression For Binary Symmetric Channels
Kivanc Ozonat
2
Outline
- Introduction
- Description of the Problem
- Slepian-Wolf Theorem
- Prior Work
- Basic Encoder-Decoder Scheme
- Methodology
- Results
3
Problem Description
Given two correlated data sets, with a noisy version X available at the decoder and the original Y at the encoder, how can Y be transmitted with the best coding efficiency?
The encoder has no access to X (no communication between X and Y at the encoder).
[Diagram: the encoder compresses Y; the decoder reconstructs it from the compressed stream together with the side information X]
4
Slepian-Wolf Theorem
Slepian-Wolf: given the following scheme,
[Diagram: X is encoded at rate R1 and Y is encoded at rate R2 by separate encoders; a joint decoder recovers (X,Y)]
5
Slepian-Wolf Theorem
X and Y can both be transmitted if:
- R1 > H(X|Y), R2 > H(Y|X), and
- R1 + R2 > H(X,Y).
[Figure: the achievable (R1, R2) rate region, bounded by H(X|Y) and H(X) on the R1 axis and by H(Y|X) and H(Y) on the R2 axis]
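To make the bounds concrete, here is a small Python sketch (my own illustration, not from the slides) that evaluates H(X|Y), H(Y|X), and H(X,Y) for a binary source pair in which Y is X passed through a binary symmetric channel with crossover probability p; the Bernoulli(q)/BSC(p) model and the function names are assumptions.

import math

def binary_entropy(p):
    # Binary entropy in bits; returns 0 at p = 0 or p = 1.
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def slepian_wolf_bounds(q, p):
    # X ~ Bernoulli(q); Y = X xor N, with N ~ Bernoulli(p) independent of X.
    # Returns (H(X|Y), H(Y|X), H(X,Y)) in bits per symbol.
    joint = {
        (0, 0): (1 - q) * (1 - p),
        (0, 1): (1 - q) * p,
        (1, 0): q * p,
        (1, 1): q * (1 - p),
    }
    h_joint = -sum(v * math.log2(v) for v in joint.values() if v > 0)
    h_x = binary_entropy(q)
    h_y = binary_entropy(joint[(0, 1)] + joint[(1, 1)])  # P(Y = 1)
    return h_joint - h_y, h_joint - h_x, h_joint

# Example: equiprobable inputs and crossover probability p = 0.06
print(slepian_wolf_bounds(0.5, 0.06))   # roughly (0.327, 0.327, 1.327)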
6
Slepian-Wolf Theorem
Our problem is a special case of this: the side information is already available at the decoder, so the encoder only needs to operate at a corner point of the rate region, sending Y at a rate approaching H(Y|X).
[Figure: the same rate region with the corner point (R1, R2) = (H(X), H(Y|X)) highlighted]
7
Prior Work
Bin 1: [0 0 0], [1 1 1]
Bin 2: [0 0 1], [1 1 0]
Bin 3: [0 1 0], [1 0 1]
Bin 4: [0 1 1], [1 0 0]
[Diagram: the encoder maps Y = [0 1 1] to its bin index (Bin 4); the decoder recovers Y from the bin index and the correlated channel output]
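The four bins above are exactly the cosets of the length-3 repetition code {000, 111}, so the bin index can be computed as a syndrome. Below is a minimal Python sketch of this (my own illustration; the parity-check matrix H is an assumption consistent with those bins, not something given in the slides).

from collections import defaultdict

# Parity-check matrix of the (3,1) repetition code {000, 111}.
H = [[1, 1, 0],
     [0, 1, 1]]

def syndrome(word):
    # Bin label of a length-3 binary tuple: s = H * word (mod 2).
    return tuple(sum(h * b for h, b in zip(row, word)) % 2 for row in H)

# Group all 3-bit words by syndrome; each bin holds two words at Hamming
# distance 3 from each other, i.e. maximally separated.
bins = defaultdict(list)
for w in range(8):
    word = ((w >> 2) & 1, (w >> 1) & 1, w & 1)
    bins[syndrome(word)].append(word)
print(dict(bins))
# The syndrome (1, 0) collects (0, 1, 1) and (1, 0, 0), matching Bin 4 above.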
8
Prior Work
How to maximally separate “very long” input sequences? Use error-correcting codes.
9
Prior Work
[Diagram: binary symmetric correlation channel; a bit is received correctly with probability 1-p and flipped with probability p]
The scheme by Pradhan and Ramchandran assumes EQUAL input probabilities of 0 and 1.
10
Prior Work
What if the input probabilities are NOT EQUAL?
11
Methodology
[Diagram: the input sequence X is Huffman coded; the codeword bits are split into bit planes 1 through N; each bit plane is passed through error correcting codes to form the bins; the decoder combines the received bin indices with the side information sequence Y]
12
Encoder
Inputs: 0 (with probability 0.7) and 1 (with probability 0.3).
Huffman code for length-2 sequences:
- 00 (probability 0.49) → codeword 0
- 01 (probability 0.21) → codeword 10
- 10 (probability 0.21) → codeword 110
- 11 (probability 0.09) → codeword 111
Bit plane 1: 0, 1, 1, 1
Bit plane 2: -, 0, 1, 1
Bit plane 3: -, -, 0, 1
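A short Python sketch of the bit-plane construction described on this slide (my own illustration; the codeword table 00→0, 01→10, 10→110, 11→111 is the one implied by the listed bit planes):

CODEBOOK = {"00": "0", "01": "10", "10": "110", "11": "111"}

def bit_planes(blocks):
    # Split the Huffman codewords of the given length-2 blocks into bit planes.
    # Plane k holds the k-th bit of every codeword; '-' marks codewords that
    # are shorter than k bits.
    codewords = [CODEBOOK[b] for b in blocks]
    depth = max(len(c) for c in codewords)
    return [[c[k] if k < len(c) else "-" for c in codewords]
            for k in range(depth)]

for i, plane in enumerate(bit_planes(["00", "01", "10", "11"]), start=1):
    print("Bit plane", i, ":", ", ".join(plane))
# Bit plane 1 : 0, 1, 1, 1
# Bit plane 2 : -, 0, 1, 1
# Bit plane 3 : -, -, 0, 1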
13
Encoder (example)
Input sequence: [0 0 1 0 0 1], split into pairs: [00], [10], [01]
Huffman codewords: [0], [110], [10]
Bit plane 1: 0 1 1
Bit plane 2: - 1 0
Bit plane 3: - 0 -
Each bit plane then goes through error control coding to form the bins.
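The same construction can be checked on this example with a self-contained Python sketch (again my own illustration, using the codeword table inferred above):

CODEBOOK = {"00": "0", "01": "10", "10": "110", "11": "111"}
blocks = ["00", "10", "01"]                    # pairs of the input 001001
codewords = [CODEBOOK[b] for b in blocks]      # ['0', '110', '10']
depth = max(len(c) for c in codewords)
for k in range(depth):
    plane = "".join(c[k] if k < len(c) else "-" for c in codewords)
    print("Bit plane", k + 1, ":", plane)
# Bit plane 1 : 011
# Bit plane 2 : -10
# Bit plane 3 : -0-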
14
Decoder
The decoder receives a BIN NUMBER, which corresponds to MULTIPLE CODEWORDS.
How to select the “correct codeword” out of these multiple codewords? Use MAXIMUM LIKELIHOOD detection.
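A minimal Python sketch of the maximum likelihood selection step (my own illustration, assuming a memoryless binary symmetric channel with crossover probability p between the candidate bits and the side-information bits):

def bsc_likelihood(candidate, side_info, p):
    # P(side_info | candidate) on a memoryless BSC with crossover probability p.
    d = sum(a != b for a, b in zip(candidate, side_info))
    return (p ** d) * ((1 - p) ** (len(candidate) - d))

def ml_select(bin_members, side_info, p):
    # Pick the codeword in the bin that is most likely given the side information.
    # For p < 0.5 this is the member at minimum Hamming distance from it.
    return max(bin_members, key=lambda c: bsc_likelihood(c, side_info, p))

# Example with the bins of the prior-work slide: only "Bin 4" = {011, 100} is
# transmitted; the decoder holds the correlated word 010 and picks the closer member.
print(ml_select([(0, 1, 1), (1, 0, 0)], (0, 1, 0), 0.06))   # -> (0, 1, 1)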
15
Decoder (example)
The decoder receives [0 1 1]: this indexes Bin 4, which contains the candidates [0 1 1] and [1 1 0].
The candidates correspond, through the Huffman codes for length-2 sequences, to symbol sequences [z1 z2 z3].
Assume Y = [01, 11, 10].
Compute the probability of [z1 z2 z3] given 01, 11, 10, using the channel error probability.
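Written out, the likelihood on this slide follows the memoryless channel model (my notation, as I read the slide; d_H is the Hamming distance between two length-2 symbols and the z_i are the candidate's Huffman-decoded pairs):

P(z_1 z_2 z_3 \mid 01, 11, 10) \;=\; \prod_{i=1}^{3} p^{\,d_H(z_i, y_i)} (1-p)^{\,2 - d_H(z_i, y_i)}, \qquad (y_1, y_2, y_3) = (01, 11, 10)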
16
Parameters
[Same pipeline as on the Methodology slide: the input sequence X is Huffman coded, split into bit planes 1 through N, and each plane is binned with error correcting codes; the decoder uses the side information sequence Y]
Parameters: sequences of length 4 at the Huffman coder; BCH(15, k) codes to form the bins.
17
Bit Rate vs. Probability of Occurrence of 0s (at a fixed error rate p = 0.06)
[plot]
18
Difference between the Actual Bit Rate and the Slepian-Wolf Bound vs. Error Probability (p)
[plot]
19
Conclusions
- The Huffman code is not a very good choice.
- Better error correcting codes can be selected.
- The scheme gives good results for low error probability (p) and for cases in which the Huffman code gives a nearly equal distribution of 0s and 1s.