Noiseless Coding
Introduction
Noiseless coding is compression without distortion.
Basic concept: symbols with lower probabilities are represented by binary codewords of longer length.
Methods: Huffman codes, Lempel-Ziv codes, arithmetic codes, and Golomb codes.
Entropy
Consider a set of symbols S = {S_1, ..., S_N}. The entropy of the symbols is defined as
H(S) = −Σ_{i=1}^{N} P(S_i) log₂ P(S_i),
where P(S_i) is the probability of S_i.
Example: Consider a set of symbols {a, b, c} with P(a) = 1/4, P(b) = 1/4 and P(c) = 1/2. The entropy of the symbols is then given by
H = −(1/4)log₂(1/4) − (1/4)log₂(1/4) − (1/2)log₂(1/2) = 0.5 + 0.5 + 0.5 = 1.5 bits/symbol.
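As a quick check of this arithmetic, here is a minimal Python sketch (the function name entropy and its dict interface are illustrative, not from the slides):

```python
import math

def entropy(probs):
    """H = -sum over symbols of P(s) * log2(P(s)), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs.values())

# The example above: P(a) = 1/4, P(b) = 1/4, P(c) = 1/2
print(entropy({'a': 0.25, 'b': 0.25, 'c': 0.5}))  # 1.5
```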
Consider a message containing symbols in S. Define the rate of a source coding technique as the average number of bits used to represent each symbol after compression.
Example: Suppose the following message is to be compressed: a a a b c a. Suppose an encoding technique uses 7 bits to represent the message. The rate of the encoding technique is therefore 7/6, since there are 6 symbols.
Shannon's source coding theorem: the lowest rate for encoding a message without distortion is the entropy of the symbols in the message, i.e., Rate ≥ H(S).
Therefore, in an optimal noiseless source encoder, the average number of bits used to represent each symbol S_i is −log₂ P(S_i). A symbol with a smaller probability takes a larger number of bits to represent.
Because the entropy is the limit of the noiseless encoder, the noiseless encoder is usually called an entropy encoder.
Huffman Codes
We start with a set of symbols, where each symbol is associated with a probability. Merge the two symbols having the lowest probabilities into a new symbol.
Repeat the merging process until all the symbols are merged into a single symbol. Following the merging path, we can form the Huffman codes.
Example: Consider the following three symbols:
a (with prob. 0.5)
b (with prob. 0.3)
c (with prob. 0.2)
Merging process: b (0.3) and c (0.2) have the lowest probabilities and are merged first, giving a node of probability 0.5; this node is then merged with a. Labeling the two branches of each merge with 1 and 0 yields the Huffman codes:
a → 1, b → 01, c → 00
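The merging process maps naturally onto a priority queue. Below is a minimal Python sketch (the function name huffman_codes and the heap-based representation are illustrative, not from the slides). Since Huffman codes are not unique, the 0/1 labels it produces differ from the tree above, but the codeword lengths, and hence the compressed size, are the same:

```python
import heapq

def huffman_codes(probs):
    """Repeatedly merge the two least probable nodes; prefixing the
    codewords of the merged subtrees with 0 and 1 builds the code."""
    # Heap entries: (probability, tiebreaker, {symbol: partial codeword})
    heap = [(p, i, {s: ''}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        p1, _, lo = heapq.heappop(heap)   # lowest probability
        p2, _, hi = heapq.heappop(heap)   # second lowest
        merged = {s: '0' + c for s, c in lo.items()}
        merged.update({s: '1' + c for s, c in hi.items()})
        heapq.heappush(heap, (p1 + p2, count, merged))
        count += 1                        # tiebreaker keeps heap tuples comparable
    return heap[0][2]

codes = huffman_codes({'a': 0.5, 'b': 0.3, 'c': 0.2})
print(codes)  # {'a': '0', 'c': '10', 'b': '11'}: lengths 1, 2, 2
print(''.join(codes[s] for s in 'aaabca'))  # '00011100': 8 bits, rate 8/6
```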
Example: Suppose the following message is to be compressed: a a a b c a. The result of the Huffman coding is: 1 1 1 01 00 1. Total number of bits used to represent the message: 8 (rate = 8/6 = 4/3).
If the message is not compressed by the Huffman codes, each symbol must be represented by 2 bits, so 12 bits are needed in total. We have saved 4 bits using the Huffman codes.
Discussions: It does not matter how the symbols are arranged, and it does not matter how the final code tree is labeled (with 0s and 1s): the Huffman code is not unique.
Lempel-Ziv Codes
Parse the input sequence into non-overlapping blocks of different lengths. Construct a dictionary based on the blocks. Use the dictionary for both encoding and decoding.
It is NOT necessary to pre-specify the probability associated with each symbol.
Dictionary generation process:
1. Initialize the dictionary to contain all blocks of length one.
2. Search for the longest block W which has appeared in the dictionary.
3. Encode W by its index in the dictionary.
4. Add W followed by the first symbol of the next block to the dictionary.
5. Go to Step 2.
A code sketch of this procedure is given below.
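A compact Python sketch of the procedure (the name lzw_encode is illustrative; Step 2's search for the longest known block is realized by extending W one symbol at a time):

```python
def lzw_encode(message, alphabet):
    """Emit dictionary indices for the longest known blocks,
    growing the dictionary as encoding proceeds."""
    dictionary = {s: i for i, s in enumerate(alphabet)}   # Step 1
    output, w = [], ''
    for symbol in message:
        if w + symbol in dictionary:
            w += symbol                                   # keep extending the match
        else:
            output.append(dictionary[w])                  # Step 3: encode W
            dictionary[w + symbol] = len(dictionary)      # Step 4: add W + next symbol
            w = symbol
    output.append(dictionary[w])                          # flush the last block
    return output, dictionary

out, d = lzw_encode('abaaba', ['a', 'b'])
print(out)  # [0, 1, 0, 2, 0]
print(d)    # {'a': 0, 'b': 1, 'ab': 2, 'ba': 3, 'aa': 4, 'aba': 5}
```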
Example: Consider the following input message: a b a a b a.
Initial dictionary: 0 → a, 1 → b.
Encoder side: W = a, output 0. Add ab to the dictionary.
Dictionary: 0 → a, 1 → b, 2 → ab.
Encoder side: W = b, output 1. Add ba to the dictionary.
Dictionary: 0 → a, 1 → b, 2 → ab, 3 → ba.
Encoder side: W = a, output 0. Add aa to the dictionary.
Dictionary: 0 → a, 1 → b, 2 → ab, 3 → ba, 4 → aa.
Encoder side: W = ab, output 2. Add aba to the dictionary.
Dictionary: 0 → a, 1 → b, 2 → ab, 3 → ba, 4 → aa, 5 → aba.
Encoder side: W = a, output 0. Stop.
Final dictionary: 0 → a, 1 → b, 2 → ab, 3 → ba, 4 → aa, 5 → aba.
The complete encoder output is therefore 0 1 0 2 0.
Decoder side: Initial dictionary: 0 → a, 1 → b. Receive 0, generate a. Decoded so far: a.
Decoder side: Receive 1, generate b. Add ab to the dictionary.
Dictionary: 0 → a, 1 → b, 2 → ab. Decoded so far: a b.
Decoder side: Receive 0, generate a. Add ba to the dictionary.
Dictionary: 0 → a, 1 → b, 2 → ab, 3 → ba. Decoded so far: a b a.
Decoder side: Receive 2, generate ab. Add aa to the dictionary.
Dictionary: 0 → a, 1 → b, 2 → ab, 3 → ba, 4 → aa. Decoded so far: a b a ab.
Decoder side: Receive 0, generate a. Add aba to the dictionary.
Dictionary: 0 → a, 1 → b, 2 → ab, 3 → ba, 4 → aa, 5 → aba.
Decoded so far: a b a ab a, i.e., the original message a b a a b a.
Example
Consider again the following message: a a a b c a.
The initial dictionary is given by: 0 → a, 1 → b, 2 → c.
After the encoding process, the output of the encoder is 0 3 1 2 0.
The final dictionary is given by: 0 → a, 1 → b, 2 → c, 3 → aa, 4 → aab, 5 → bc, 6 → ca.
Decoder side: Initial dictionary: 0 → a, 1 → b, 2 → c. Receive 0, generate a. Decoded so far: a.
Decoder side: Receive 3, generate ? There is no entry 3 in the dictionary yet, so the decoder gets stuck! We need the Welch correction to solve this problem.
Welch correction: It turns out that this behavior can arise whenever the input contains a pattern of the form xwxwx, where x is a single symbol and w is either empty or a sequence of symbols such that xw already appears in the encoder and decoder tables, but xwx does not.
In this case the encoder will send the index for xw and add xwx to the table with a new index i. Next it will parse xwx and send the new index i. The decoder will receive the index i but will not yet have the corresponding word in its dictionary.
Therefore, when the decoder cannot find the corresponding word for an index i, the word must be xwx, where xw can be found from the last decoded symbols.
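A Python sketch of the decoder including this correction (identifiers are illustrative, not from the slides):

```python
def lzw_decode(indices, alphabet):
    """LZW decoding with the Welch correction: when an index is not yet
    in the dictionary, the word must be xwx = previous word + its first
    symbol."""
    dictionary = {i: s for i, s in enumerate(alphabet)}
    previous = dictionary[indices[0]]
    output = [previous]
    for i in indices[1:]:
        if i in dictionary:
            word = dictionary[i]
        else:
            word = previous + previous[0]          # Welch correction: xwx
        output.append(word)
        # Mirror the encoder: add previous word + first symbol of this word
        dictionary[len(dictionary)] = previous + word[0]
        previous = word
    return ''.join(output)

print(lzw_decode([0, 3, 1, 2, 0], ['a', 'b', 'c']))  # 'aaabca'
print(lzw_decode([0, 1, 2, 4], ['a', 'b']))          # 'abababa'
```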
Decoder side: Here, the last decoded symbol is a. Therefore x = a and w is empty; hence xwx = aa. Receive 3, generate aa, and add aa to the dictionary.
Dictionary: 0 → a, 1 → b, 2 → c, 3 → aa. Decoded so far: a aa.
Decoder side: Receive 1, generate b. Add aab to the dictionary.
Dictionary: 0 → a, 1 → b, 2 → c, 3 → aa, 4 → aab. Decoded so far: a aa b.
Decoder side: Receive 2 and 0, generate c and a.
Final dictionary: 0 → a, 1 → b, 2 → c, 3 → aa, 4 → aab, 5 → bc, 6 → ca.
Decoded message: a aa b c a = a a a b c a.
Example: Consider the following message: a b a b a b a.
After the encoding process, the output of the encoder is 0 1 2 4.
The final dictionary is given by: 0 → a, 1 → b, 2 → ab, 3 → ba, 4 → aba.
Decoder side: Receive 0, 1 and 2; generate a, b and ab.
Current dictionary: 0 → a, 1 → b, 2 → ab, 3 → ba. Decoded so far: a b ab.
Decoder side: Receive 4, generate ? Entry 4 is not yet in the dictionary.
Current dictionary: 0 → a, 1 → b, 2 → ab, 3 → ba.
Here, the last decoded word is ab. Therefore x = a and w = b; hence xwx = aba. Receive 4, generate aba, and add aba to the dictionary: 0 → a, 1 → b, 2 → ab, 3 → ba, 4 → aba.
Decoded message: a b ab aba = a b a b a b a.
Discussions: Theoretically, the size of the dictionary can grow infinitely large. In practice, the dictionary size is limited; once the limit is reached, no more entries are added. Welch recommended a dictionary of size 4096, which corresponds to 12 bits per index.
Discussions: The length of the indices may vary. When the number of entries n in the dictionary satisfies 2^(m−1) < n ≤ 2^m, the indices can be m bits long.
Discussions: Using the message a a a b c a as an example, the encoded indices 0, 3, 1, 2, 0 are transmitted as 00, 11, 001, 010, 000; the index width grows from 2 to 3 bits as the dictionary grows. This needs 13 bits (rate = 13/6). A sketch of the width rule follows.
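A small Python sketch of this width rule, assuming (as in the example) that the dictionary grows by one entry after each emitted index:

```python
def index_bits(indices, initial_size):
    """Encode each index with m bits, where 2**(m-1) < n <= 2**m for the
    current dictionary size n."""
    bits, n = [], initial_size
    for idx in indices:
        m = (n - 1).bit_length()            # m = ceil(log2(n))
        bits.append(format(idx, f'0{m}b'))
        n += 1                              # one new entry per emitted index
    return bits

print(index_bits([0, 3, 1, 2, 0], 3))
# ['00', '11', '001', '010', '000'] -> 2+2+3+3+3 = 13 bits
```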
Discussions: The above example, like most other illustrative examples in the literature, does not result in real compression; more bits are used to represent the indices than the original data. This is because the input data in the example is too short. In practice, the Lempel-Ziv algorithm works well (leads to actual compression) only when the input data is sufficiently long and there is sufficient redundancy in the data.
Discussions: Many popular programs (e.g., Unix compress and uncompress, gzip and gunzip, the GIF format, and Windows WinZip) are based on the Lempel-Ziv algorithm.
Arithmetic Codes
A message is represented by an interval of real numbers between 0 and 1. As the message becomes longer, the interval needed to represent it becomes smaller, so the number of bits needed to specify that interval grows.
Successive symbols of the message reduce the size of the interval according to the symbol probabilities.
Example: Again we consider the same three symbols, a (prob. 0.5), b (prob. 0.3) and c (prob. 0.2), and encode the same message as in the previous examples: a a a b c a.
The encoding process: [0, 1) is partitioned in proportion to the probabilities, a → [0, 0.5), b → [0.5, 0.8), c → [0.8, 1.0), and the current interval is subdivided the same way for each incoming symbol:
a: [0, 0.5)
a: [0, 0.25)
a: [0, 0.125)
b: [0.0625, 0.1)
c: [0.0925, 0.1)
a: [0.0925, 0.09625) ← final interval
The final interval therefore is [0.0925, 0.09625). Any number in this final interval can be used for the decoding process. For instance, we pick 0.09375 ∈ [0.0925, 0.09625).
The decoding process:
0.09375 ∈ [0, 0.5) → output a
0.09375 ∈ [0, 0.25) → output a
0.09375 ∈ [0, 0.125) → output a
0.09375 ∈ [0.0625, 0.1) → output b
0.09375 ∈ [0.0925, 0.1) → output c
0.09375 ∈ [0.0925, 0.09625) → output a
Decoder: Therefore, the decoder successfully identifies the source sequence a a a b c a. Note that 0.09375 can be represented by the binary sequence 0 0 0 1 1, since 0.09375 = 0·(0.5) + 0·(0.25) + 0·(0.125) + 1·(0.0625) + 1·(0.03125). We only need 5 bits to represent the message (rate = 5/6).
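A floating-point Python sketch of both the encoder and the decoder (illustrative only; the interval layout follows the example above, and a practical implementation would use integer arithmetic and incremental output to avoid the precision and underflow issues discussed below):

```python
def arith_encode(message, intervals):
    """Shrink [low, high) according to each symbol's sub-interval."""
    low, high = 0.0, 1.0
    for s in message:
        span = high - low
        s_low, s_high = intervals[s]
        low, high = low + span * s_low, low + span * s_high
    return low, high

def arith_decode(x, intervals, n):
    """Recover n symbols by locating x in successive sub-intervals."""
    low, high = 0.0, 1.0
    out = []
    for _ in range(n):
        span = high - low
        for s, (s_low, s_high) in intervals.items():
            if low + span * s_low <= x < low + span * s_high:
                out.append(s)
                low, high = low + span * s_low, low + span * s_high
                break
    return ''.join(out)

# Sub-intervals of [0, 1) in proportion to the probabilities 0.5, 0.3, 0.2
intervals = {'a': (0.0, 0.5), 'b': (0.5, 0.8), 'c': (0.8, 1.0)}
print(arith_encode('aaabca', intervals))   # approximately (0.0925, 0.09625)
print(arith_decode(0.09375, intervals, 6)) # 'aaabca'
```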
Discussions: Given the same message a a a b c a:
No compression: 12 bits
Huffman codes: 8 bits
Lempel-Ziv: 13 bits
Arithmetic codes: 5 bits
Discussions: The length of the interval may become very small for a long message, causing an underflow problem.
Discussions: The encoder does not transmit anything until the entire message has been encoded. In most applications an incremental mode is necessary.
Discussions: The symbol frequencies (i.e., probabilities) might vary with time. It is therefore desirable to use an adaptive symbol-frequency model for encoding and decoding.
Golomb Codes
Well suited for messages containing lots of 0's and not too many 1's. Example: fax documents.
First step of the Golomb code: convert the input sequence into integers, namely the lengths of the 0-runs preceding each 1.
Example:
00100000010000000001000000000010000000000000000000000000001
→ 2, 6, 9, 10, 27
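A one-function Python sketch of this first step (illustrative):

```python
def zero_runs(bits):
    """Lengths of the 0-runs preceding each 1."""
    runs, count = [], 0
    for b in bits:
        if b == '1':
            runs.append(count)
            count = 0
        else:
            count += 1
    return runs

# The example sequence: runs of 2, 6, 9, 10 and 27 zeros, each ended by a 1
bits = '001' + '0' * 6 + '1' + '0' * 9 + '1' + '0' * 10 + '1' + '0' * 27 + '1'
print(zero_runs(bits))  # [2, 6, 9, 10, 27]
```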
Second step of the Golomb code: convert the integers into the encoded bitstream.
Select an integer m. For each integer n obtained from the first step, compute q and r, where n = qm + r (that is, q = ⌊n/m⌋ and r = n mod m).
The binary code for n has two parts:
1. q is coded in unary.
2. r can be coded in a fixed-length or a variable-length code.
Example: for m = 5, r can take the values 0, 1, 2, 3, 4. Their variable-length codewords (here the standard truncated-binary assignment) are:
r = 0 → 00, r = 1 → 01, r = 2 → 10, r = 3 → 110, r = 4 → 111
68
–Example: The binary code for n has the following form: Therefore, the encoded bistream is given by
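A Python sketch of this second step, using the unary convention assumed above (q 1's followed by a 0) and the standard truncated-binary code for r:

```python
import math

def truncated_binary(r, m):
    """Truncated binary code for r in [0, m)."""
    b = math.ceil(math.log2(m))
    cutoff = (1 << b) - m                 # number of shorter codewords
    if r < cutoff:
        return format(r, f'0{b - 1}b')    # b-1 bits for the first values
    return format(r + cutoff, f'0{b}b')   # b bits for the rest

def golomb_encode(n, m=5):
    """Golomb code: unary(q) then truncated binary for r, n = q*m + r."""
    q, r = divmod(n, m)
    return '1' * q + '0' + truncated_binary(r, m)

for n in [2, 6, 9, 10, 27]:
    print(n, golomb_encode(n))
# 2 010, 6 1001, 9 10111, 10 11000, 27 11111010
```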