Huffman Code and Data Decomposition Pranav Shah CS157B.


1 Huffman Code and Data Decomposition Pranav Shah CS157B

2 Why Data Compression?
- Fixed-length data is inefficient for transfer and storage.

3 Types of Compression
- Lossless compression: the exact original data is reconstructed from the compressed data; nothing is lost.
- Examples: ZIP archives, bank account records

4 Types of Compression
- Lossy compression: only an approximation of the original data is reconstructed from the compressed data.
- Example: JPEG, which loses image quality after repeated compressions (original file size: 87 KB; compressed file size: 26 KB).

5 Variable-Length Bit Coding
- Maps source symbols to a variable number of bits.
- Allows sources to be compressed and decompressed with zero error.
- Examples: Huffman coding, Lempel-Ziv coding, and arithmetic coding

6 Variable-Length Coding Rules
- Use the minimum number of bits.
- This speeds up transfers and saves storage space.

7 Variable-Length Coding Rules
- No code word may be a prefix of another code word.
- Example: assume A has the code 01. Then B cannot have the code 010, because 01 (the code for A) is a prefix of 010.
- This enables unambiguous left-to-right decoding.
- Example: once you have read 01, you know it is A and not the start of any other character's code (not B!).
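The prefix rule above can be checked mechanically. A minimal sketch (the function name `is_prefix_free` is my own, not from the slides):

```python
def is_prefix_free(codes):
    """Return True if no code word in the set is a prefix of another."""
    for a in codes:
        for b in codes:
            if a != b and b.startswith(a):
                return False  # a is a prefix of b: decoding would be ambiguous
    return True

# The slide's example: A = 01 may not coexist with B = 010.
print(is_prefix_free({"01", "010"}))       # False
print(is_prefix_free({"01", "10", "11"}))  # True
```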

8 Huffman Code
- An entropy-encoding algorithm used for lossless data compression.
- A variable-length code whose average length is L = l1*p1 + l2*p2 + ... + lM*pM, where l1, l2, ..., lM are the code-word lengths and p1, p2, ..., pM are the probabilities of the source symbols A1, A2, ..., AM being generated.
- Uses a binary tree.
- The Huffman code is generated using the binary Huffman code construction method.
- When all symbols are equally probable, it reduces to simple binary block encoding (example: ASCII).
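The average-length formula on this slide is a straight weighted sum; a small sketch evaluating it for the code this deck derives later (A=00, B=10, C=011, D=11, E=010):

```python
# L = l1*p1 + l2*p2 + ... + lM*pM: average bits per source symbol.
def average_length(lengths, probs):
    return sum(l * p for l, p in zip(lengths, probs))

# Code lengths and probabilities from the worked example in this deck.
lengths = [2, 2, 3, 2, 3]                  # A, B, C, D, E
probs = [0.19, 0.28, 0.13, 0.30, 0.10]
print(average_length(lengths, probs))      # approx. 2.23 bits per symbol
```

So this code averages about 2.23 bits per symbol, versus 3 bits for a fixed-length code over five symbols.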

9 Algorithm
1. Make a leaf node for each source symbol.
2. Attach each symbol's generation probability to its leaf node.
3. Take the two nodes with the smallest probabilities and connect them under a new node.
4. Label the two branches 0 and 1.
5. The probability of the new node is the sum of the probabilities of the two connected nodes.
6. If only one node is left, the code construction is complete; otherwise, go back to step 3.
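The steps above can be sketched with a min-heap (a common way to find the two smallest probabilities quickly; the heap is my implementation choice, not from the slides). Tie-breaking order, and hence the exact 0/1 code words, may differ from the worked example that follows, but the code lengths match:

```python
import heapq

def huffman_codes(freqs):
    """Build a Huffman code from a {symbol: probability} map."""
    # Heap entries are (probability, tie_breaker, node); a node is either
    # a symbol string (leaf) or a (left, right) pair (internal node).
    heap = [(p, i, s) for i, (s, p) in enumerate(freqs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        p1, _, n1 = heapq.heappop(heap)   # smallest probability
        p2, _, n2 = heapq.heappop(heap)   # second smallest
        heapq.heappush(heap, (p1 + p2, count, (n1, n2)))  # merged node
        count += 1
    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):       # internal node: label branches 0 and 1
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:                             # leaf: record the accumulated code word
            codes[node] = prefix
    walk(heap[0][2], "")
    return codes

codes = huffman_codes({"A": 0.19, "B": 0.28, "C": 0.13, "D": 0.30, "E": 0.10})
print(codes)  # A and E/C merge first, so A, B, D get 2 bits and C, E get 3 bits
```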

10 Example

Character  Frequency
A          19% (0.19)
B          28% (0.28)
C          13% (0.13)
D          30% (0.30)
E          10% (0.10)

11 Step 1: Take the two lowest frequencies (E and C) and merge them into a node EC with probability 0.10 + 0.13 = 0.23.

Character  Frequency
A          19% (0.19)
B          28% (0.28)
C          13% (0.13)
D          30% (0.30)
E          10% (0.10)
EC         23% (0.23)

12 Step 2: Take the next two lowest (A and EC) and connect them into a node AEC with probability 0.19 + 0.23 = 0.42.

Character  Frequency
A          19% (0.19)
B          28% (0.28)
C          13% (0.13)
D          30% (0.30)
E          10% (0.10)
EC         23% (0.23)
AEC        42% (0.42)

13 Step 3: Continue by merging B and D into a node BD with probability 0.28 + 0.30 = 0.58.

Character  Frequency
A          19% (0.19)
B          28% (0.28)
C          13% (0.13)
D          30% (0.30)
E          10% (0.10)
EC         23% (0.23)
AEC        42% (0.42)
BD         58% (0.58)

14 Completed Tree: merging AEC (0.42) and BD (0.58) gives the root with probability 1.0.
[Tree diagram: root 1.0; left subtree 0.42 with leaf A (0.19) and node EC (0.23) over leaves E (0.10) and C (0.13); right subtree 0.58 with leaves B (0.28) and D (0.30).]

15 Add 0 or 1 to each branch.
[Tree diagram: the completed tree with each left branch labeled 0 and each right branch labeled 1.]

16 Generated Code

Character  Frequency    Code
A          19% (0.19)   00
B          28% (0.28)   10
C          13% (0.13)   011
D          30% (0.30)   11
E          10% (0.10)   010
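A short sketch using the generated table to encode and decode (the helper names are my own). Decoding reads bits left to right, which is unambiguous precisely because the code is prefix-free:

```python
# The code table generated in the worked example.
CODE = {"A": "00", "B": "10", "C": "011", "D": "11", "E": "010"}

def encode(text):
    return "".join(CODE[ch] for ch in text)

def decode(bits):
    inverse = {v: k for k, v in CODE.items()}
    out, buf = [], ""
    for bit in bits:
        buf += bit
        if buf in inverse:        # a complete code word has been read
            out.append(inverse[buf])
            buf = ""
    return "".join(out)

print(encode("ABCDE"))            # 001001111010
print(decode("001001111010"))     # ABCDE
```

Five symbols in 12 bits, instead of the 15 bits a fixed 3-bit code would need.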


18 References
- http://gadgethobby.com/wp-content/plugins/blog/images/data-compression.jpg
- http://www.steves-digicams.com/knowledge-center/jpeg-images-counting-your-losses.html
- http://en.wikipedia.org/wiki/Variable-length_code
- http://en.wikipedia.org/wiki/Huffman_coding
- http://www.aykew.com/aboutwork/speed.html
- http://www.000studio.com/kobe_biennale2007/main/gallery.php?id=1

