Design of Novel Two-Level Quantizer with Extended Huffman Coding for Laplacian Source Lazar Velimirović, Miomir Stanković, Zoran Perić, Jelena Nikolić,


Abstract  We propose a novel model of a two-level scalar quantizer with extended Huffman coding.  The two-level scalar quantizer is designed so that its bit rate approaches the source entropy as closely as possible, under the constraint that the SQNR value does not deviate by more than 1 dB from the SQNR of the optimal Lloyd-Max quantizer.  In our model, asymmetry of the representation levels is assumed, providing unequal probabilities of the representation levels for the symmetric Laplacian probability density function; this in turn provides a proper basis for the further application of lossless compression techniques.

Two-level quantizer with variable decision threshold  Asymmetry of representation levels: as for the Lloyd-Max quantizer, the representation levels are determined from the centroid condition, $y_1 = \int_{-\infty}^{t_1} x\,p(x)\,dx \big/ \int_{-\infty}^{t_1} p(x)\,dx$ and $y_2 = \int_{t_1}^{\infty} x\,p(x)\,dx \big/ \int_{t_1}^{\infty} p(x)\,dx$  Distortion: $D = \int_{-\infty}^{t_1} (x - y_1)^2\,p(x)\,dx + \int_{t_1}^{\infty} (x - y_2)^2\,p(x)\,dx$  Signal to quantization noise ratio: $\mathrm{SQNR} = 10 \log_{10}\left(\sigma^2 / D\right)$ [dB]  The variable decision threshold $t_1$ is determined by the SQNR value that has to be achieved
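To make the centroid condition and the SQNR computation concrete, here is a minimal numerical sketch (not the authors' implementation) for a zero-mean, unit-variance Laplacian source; the function names and the use of scipy quadrature are my choices.

```python
# A minimal numerical sketch (not the authors' code) of the two-level
# quantizer with variable decision threshold t1, for a zero-mean,
# unit-variance Laplacian source.
import numpy as np
from scipy.integrate import quad

SIGMA = 1.0  # source standard deviation (unit variance assumed)

def laplacian_pdf(x, sigma=SIGMA):
    """Zero-mean Laplacian pdf with standard deviation sigma."""
    return np.exp(-np.sqrt(2.0) * np.abs(x) / sigma) / (np.sqrt(2.0) * sigma)

def two_level_quantizer(t1):
    """Return (y1, y2, D, SQNR_dB) for decision threshold t1."""
    # Cell probabilities
    P1, _ = quad(laplacian_pdf, -np.inf, t1)
    P2 = 1.0 - P1
    # Centroid condition: y_i is the conditional mean of X in cell i
    m1, _ = quad(lambda x: x * laplacian_pdf(x), -np.inf, t1)
    m2, _ = quad(lambda x: x * laplacian_pdf(x), t1, np.inf)
    y1, y2 = m1 / P1, m2 / P2
    # With centroid levels, D = sigma^2 - (P1*y1^2 + P2*y2^2)
    D = SIGMA**2 - (P1 * y1**2 + P2 * y2**2)
    sqnr = 10.0 * np.log10(SIGMA**2 / D)
    return y1, y2, D, sqnr

# t1 = 0 recovers the symmetric Lloyd-Max quantizer (SQNR of about 3.01 dB);
# t1 > 0 makes the levels asymmetric and the cell probabilities unequal.
for t1 in (0.0, 0.5, 1.0):
    y1, y2, D, sqnr = two_level_quantizer(t1)
    print(f"t1={t1:.1f}: y1={y1:+.4f}, y2={y2:+.4f}, D={D:.4f}, SQNR={sqnr:.2f} dB")
```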

Two-level quantizer with variable decision threshold  Extended Huffman coding is a procedure for determining the optimal codeword lengths for blocks of two or more symbols  Probabilities of the two representation levels: $P_1 = \int_{-\infty}^{t_1} p(x)\,dx$, $P_2 = 1 - P_1$  Since the extended Huffman coding procedure blocks more than one symbol together, and the source is memoryless, the probabilities of two and three symbol blocks are defined as products of the per-symbol probabilities: $P(s_i s_j) = P_i P_j$ and $P(s_i s_j s_k) = P_i P_j P_k$  Source entropy: $H = -\sum_{i=1}^{2} P_i \log_2 P_i$  The average bit rate for blocks of $n$ symbols: $R = \frac{1}{n} \sum_{b} P(b)\, l(b)$, where $l(b)$ is the length of the codeword assigned to block $b$
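A small sketch of these quantities, assuming (as on the slide) a memoryless source so that block probabilities factor into products of the per-symbol probabilities; the function names are mine.

```python
# A sketch of the quantities above, assuming a memoryless source so
# that block probabilities factor into products of the per-symbol
# probabilities P1 and P2 (function names are mine).
import itertools
import math

def block_probabilities(P1, n):
    """Probabilities of all 2^n blocks of n quantizer output symbols."""
    P = (P1, 1.0 - P1)
    return {block: math.prod(P[s] for s in block)
            for block in itertools.product((0, 1), repeat=n)}

def source_entropy(P1):
    """Per-symbol entropy H = -sum_i P_i log2 P_i, in bits."""
    P2 = 1.0 - P1
    return -(P1 * math.log2(P1) + P2 * math.log2(P2))

# Example with an asymmetric quantizer for which P1 = 0.8:
probs3 = block_probabilities(0.8, n=3)
print(f"H = {source_entropy(0.8):.4f} bits/symbol")
print(f"sum of 3-symbol block probabilities = {sum(probs3.values()):.4f}")
```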

Extended Huffman coding  The procedure for determining the codeword lengths with extended Huffman coding and constructing the code book consists of the following steps (a code sketch implementing them follows below):  Determine the symbol block probabilities, sort them in descending order, and assign the probabilities to the initial nodes of the graph.  Apply an iterative process in which, at each iteration, the two nodes with the smallest probabilities are joined and the sum of their probabilities is assigned to a new node. The process continues until the probabilities of the nodes joined in the last step sum to one  Construct the code words. The code word for each symbol is determined by starting from the root of the tree (the node with probability 1) and assigning the value 0 to the upper branch and 1 to the lower branch of each node. The assignment proceeds until all branches are covered. The code word is formed from the zeros and ones on the path from the root to the node corresponding to that symbol
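A compact implementation of the three steps above (a sketch, not the authors' code): a priority queue replaces the explicit sorting, each iteration joins the two least probable nodes, and codewords are read off the root-to-leaf paths. It reuses block_probabilities from the previous sketch.

```python
# Huffman code construction following the steps on the slide.
import heapq

def huffman_code(probabilities):
    """Map each symbol (or symbol block) to its Huffman codeword."""
    # Initial nodes: (probability, tiebreak id, {symbol: partial codeword})
    heap = [(p, i, {sym: ""}) for i, (sym, p) in enumerate(probabilities.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        # Join the two nodes with the smallest probabilities
        p0, _, c0 = heapq.heappop(heap)
        p1, _, c1 = heapq.heappop(heap)
        # 0 on the more probable branch, 1 on the less probable one
        # (either labeling yields an optimal prefix code)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c0.items()})
        heapq.heappush(heap, (p0 + p1, counter, merged))
        counter += 1
    # The last remaining node has probability 1 and holds the code book
    return heap[0][2]

def average_bit_rate(probabilities, code, n):
    """R = (1/n) * sum_b P(b) * l(b), in bits per source symbol."""
    return sum(p * len(code[b]) for b, p in probabilities.items()) / n

# Example: code book for three-symbol blocks when P1 = 0.8
bp = block_probabilities(0.8, n=3)  # from the previous sketch
code = huffman_code(bp)
print(f"R = {average_bit_rate(bp, code, 3):.4f} bits/symbol")
```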

Extended Huffman coding  [Figure: example of extended Huffman code construction, forming the tree and assigning the code words]

Numerical results  The numerical results presented in this work for the proposed two-level quantizer with extended Huffman coding are obtained for the cases where the SQNR value does not deviate by more than 1 dB from the SQNR of the optimal quantizer with the same number of quantization levels  The optimal SQNR of the Lloyd-Max quantizer with two quantization levels is 3 dB, which means that the SQNR range in which we consider the performance of the proposed quantizer is [2 dB, 3 dB]  The calculated performance of the proposed quantizer in the case of two and three symbol blocks is shown in the figure and table below

Numerical results  [Figure: the dependence of the bit rate and the entropy on the distortion for the proposed quantizer]

Numerical results  [Table: performance of the proposed quantizer in the case of two and three symbol blocks]
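A sketch of how such a performance table can be regenerated, reusing the helpers from the sketches above: sweep the decision threshold t1, keep operating points whose SQNR stays within 1 dB of the optimal 3.01 dB, and compare the average bit rate R for two- and three-symbol blocks with the entropy H. The threshold grid is my choice, and the printed values are illustrative rather than the paper's table.

```python
# Sweep the decision threshold t1 over the SQNR range [2 dB, 3 dB].
# Reuses two_level_quantizer, laplacian_pdf, quad, np, source_entropy,
# block_probabilities, huffman_code and average_bit_rate defined above.
print(f"{'t1':>4} {'SQNR[dB]':>9} {'H':>8} {'R(n=2)':>8} {'R(n=3)':>8}")
for k in range(13):
    t1 = 0.1 * k
    y1, y2, D, sqnr = two_level_quantizer(t1)
    if not 2.0 <= sqnr <= 3.02:
        continue  # outside the considered SQNR range [2 dB, 3 dB]
    P1, _ = quad(laplacian_pdf, -np.inf, t1)
    H = source_entropy(P1)
    row = [f"{t1:4.1f}", f"{sqnr:9.2f}", f"{H:8.4f}"]
    for n in (2, 3):
        bp = block_probabilities(P1, n)
        row.append(f"{average_bit_rate(bp, huffman_code(bp), n):8.4f}")
    print(" ".join(row))
```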

Numerical results  One can notice that the average bit rate R of the proposed quantizer approaches the source entropy H, and that this convergence is stronger in the case of three symbol blocks than in the case of two symbol blocks  From the results given in the table and figure above, one can observe that when the SQNR value deviates by up to 0.5 dB from the optimal SQNR value, there is little deviation of R from H in the case of three symbol blocks.  It is important to notice that for the proposed quantizer in the case of three symbol blocks, an average bit rate reduction of 0.35 bits is achieved at an SQNR reduction of only 0.5 dB. This is about 0.9 dB smaller SQNR reduction, for the same amount of compression, than the one ascertained in the considered range of average bit rates

Conclusion  A novel class of asymmetrical quantizers with variable decision threshold and extended Huffman coding is proposed  Based on the analysis of the proposed quantizer, it is shown that by using the extended Huffman coding technique and the set of quantizers with variable decision threshold, the average bit rate can be made to approach the source entropy.

THANKS A LOT! Contact