An Algorithm for Construction of Error-Correcting Symmetrical Reversible Variable Length Codes
Chia-Wei Lin, Ja-Ling Wu, Jun-Cheng Chen
Presented by Jun-Cheng Chen, 2004/09/24

Outline
Introduction
Notations and preliminaries
The proposed algorithm
Experimental results
Conclusion

Introduction to RVLC (1/3)
Variable length codes (VLCs) are of prime importance for the efficient transmission of digital signals.
– High compression efficiency
Other criteria may also be important in the application environment.
– Channel bit-error resilience, maximum codeword length limitation
Reversibility of variable length codes makes instantaneous decoding possible in both the forward and backward directions.

Introduction to RVLC (2/3)
A reversible variable length code (RVLC) must satisfy both the prefix-free and the suffix-free condition for instantaneous forward and backward decoding.
[Table 0: Huffman code (C1) and symmetrical RVLC (C2) for the symbols A–E, with the symbol probabilities and the average code length of each code; the table values did not survive the transcript.]
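
The two conditions can be checked directly on a candidate codeword set. The Python sketch below is an illustration of mine, not from the paper; the example set (1, 00, 010, 0110) is the initial list used later by the proposed algorithm, and for a symmetrical code the suffix-free check coincides with the prefix-free one because every codeword is a palindrome.

```python
def is_prefix_free(codewords):
    """True if no codeword is a proper prefix of another
    (the condition for instantaneous forward decoding)."""
    return not any(a != b and b.startswith(a)
                   for a in codewords for b in codewords)

def is_rvlc(codewords):
    """An RVLC must be prefix-free and suffix-free; the suffix-free
    test is a prefix-free test on the reversed codewords."""
    return (is_prefix_free(codewords)
            and is_prefix_free([c[::-1] for c in codewords]))

print(is_rvlc(["1", "00", "010", "0110"]))  # True
```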

Introduction to RVLC (3/3)
If a bit error occurs, a VLC propagates the error, and the data after the bit error become useless. An RVLC, however, can recover the data after the bit error by decoding backward from the end of the stream.
[Figure: a VLC decodes forward only up to the bit error, while an RVLC also decodes backward up to the error.]
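
A sketch of the two decoding passes (my illustration; the codebook below is hypothetical): forward decoding consumes bits left to right, and backward decoding reverses the stream and the codewords, which is valid because a suffix-free code is prefix-free when reversed.

```python
def decode_forward(bits, codebook):
    """Instantaneous left-to-right decoding: emit a symbol as soon as
    the buffer matches a codeword (valid for a prefix-free code)."""
    out, buf = [], ""
    for b in bits:
        buf += b
        if buf in codebook:
            out.append(codebook[buf])
            buf = ""
    return out

def decode_backward(bits, codebook):
    """Decode right-to-left: reverse the stream and the codewords; the
    suffix-free property makes the reversed code prefix-free."""
    rev = {cw[::-1]: sym for cw, sym in codebook.items()}
    return decode_forward(bits[::-1], rev)[::-1]

# Hypothetical symmetrical RVLC codebook for symbols a-d.
book = {"1": "a", "00": "b", "010": "c", "0110": "d"}
bits = "1" + "010" + "00" + "0110"
print(decode_forward(bits, book))   # ['a', 'c', 'b', 'd']
print(decode_backward(bits, book))  # ['a', 'c', 'b', 'd']
```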

Notations and preliminaries (1/3)
n source symbols: s_1, s_2, ..., s_n
probabilities of the source symbols: p_1, p_2, ..., p_n
n codewords: c_1, c_2, ..., c_n
length of codeword c_i: l_i
– codeword sequences: concatenations of codewords
– H(x, y): the Hamming distance between two equal-length binary strings x and y

Notations and preliminaries (2/3) (minimum block distance) In the case of RVLCs Note: if a VLC does not have equal-length codewords, then its minimum block distance is undefined., if is undefined, if is defined

Minimum Block Distance: example
Find the minimum Hamming distance among the equal-length codewords at each level of the code tree. In the example, the block distances at the three populated levels are 2, 3, and 4, so the minimum block distance d_b is 2.
[Figure: code tree with the example codewords at each level; the codewords themselves did not survive the transcript.]
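
Computing d_b directly from the definition is straightforward; a small Python sketch (mine, not the paper's):

```python
from itertools import combinations

def hamming(x, y):
    """Hamming distance between two equal-length binary strings."""
    return sum(a != b for a, b in zip(x, y))

def minimum_block_distance(codewords):
    """Minimum Hamming distance over all pairs of equal-length codewords;
    returns None (undefined) when no two codewords share a length."""
    by_len = {}
    for c in codewords:
        by_len.setdefault(len(c), []).append(c)
    dists = [hamming(a, b)
             for group in by_len.values()
             for a, b in combinations(group, 2)]
    return min(dists) if dists else None

print(minimum_block_distance(["1", "00", "010", "0110"]))  # None: all lengths differ
print(minimum_block_distance(["00", "11", "010", "101"]))  # 2
```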

Notations and preliminaries (3/3)
R(c, CL_d, d) is the replacement of a codeword c in the codeword list CL_d.
– children(c) are all of the first symmetrical codewords on the paths from the codeword c down to the leaf codewords.
– CL_d is a codeword list in which the constraint d_b ≥ d holds for all of its codewords.
– children(c, d) is the subset of the symmetrical children of a symmetrical codeword c in which the constraint d_b ≥ d holds for all the children in the subset.
– children(c, CL_d, d) is the subset of the symmetrical children of c whose union with CL_d is still a codeword list in which the constraint d_b ≥ d holds for all of its codewords.

Example: for the codeword '00', children('00') = {000} and children('00', 2) = {000}. With T-List = (1, 00, 010, 0110), children('00', T-List, 2) = {}, since 000 is at Hamming distance 1 from the equal-length codeword 010. Finally, R('00', T-List, 2) = {0000}.
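
A sketch of the children(c) operation (my reading of the definition above, not the paper's code): walk down the binary code tree from c and keep, on each path, the first symmetrical codeword encountered.

```python
def symmetrical_children(c, max_len=8):
    """First symmetrical (palindromic) codewords on each path below c
    in the binary code tree, explored down to max_len bits."""
    found, frontier = [], [c + b for b in "01"]
    while frontier:
        nxt = []
        for w in frontier:
            if w == w[::-1]:
                found.append(w)        # first symmetrical codeword on this path
            elif len(w) < max_len:
                nxt += [w + b for b in "01"]
        frontier = nxt
    return found

print(symmetrical_children("00", max_len=5))  # ['000', '00100']
```

The distance-constrained variants children(c, d) and children(c, CL_d, d) would additionally filter this set with a d_b ≥ d test against the other codewords.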

The Proposed Algorithm (1/4)
Goal: the free distance of the proposed RVLC is always greater than one, which can yield a certain improvement in symbol error rate relative to VLCs with free distance one.

The Proposed Algorithm (2/4)
The algorithm (a skeleton of the loop appears after this list):
– Step 1: Assign the initial codeword list ("1", "00", "010", "0110", ...) to the target list, T-List.
– Step 2: If any codeword c in T-List satisfies the condition of replacement, replace the codeword c with the R(c, T-List, 2) that results in the smallest average codeword length. If the number of codewords in T-List exceeds n, the number of source symbols, keep the first n codewords of T-List and discard the others. (Note that the block distance is still greater than one after the codeword replacement.)
– Step 3: Repeat Step 2 until no codeword in T-List satisfies the condition of replacement.
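
A Python skeleton of this loop (a sketch of mine; replace_fn stands in for R(c, T-List, 2) together with the condition of replacement, whose exact test the slides do not spell out):

```python
def average_length(codewords, probs):
    """Average codeword length: shorter codewords are assigned to more
    probable symbols, then sum p_i * l_i."""
    lengths = sorted(len(c) for c in codewords)
    p = sorted(probs, reverse=True)
    return sum(pi * li for pi, li in zip(p, lengths))

def construct_rvlc(probs, replace_fn,
                   initial=("1", "00", "010", "0110", "01110", "011110")):
    """Steps 1-3 as a greedy loop. replace_fn(c, t_list) returns the
    replacement set R(c, T-List, 2), or None when c does not satisfy
    the condition of replacement (a hypothetical interface)."""
    n = len(probs)
    t_list = sorted(initial, key=len)            # Step 1
    while True:                                  # Step 3: iterate
        best = None                              # (avg length, new list)
        for c in t_list:                         # Step 2: try each codeword
            r = replace_fn(c, t_list)
            if r is None:
                continue
            new_list = sorted(set(t_list) - {c} | set(r), key=len)[:n]
            if len(new_list) < n:
                continue
            avg = average_length(new_list, probs)
            if best is None or avg < best[0]:
                best = (avg, new_list)
        # Stop when no replacement exists or none improves the average
        # length (the improvement check is a safeguard of this sketch).
        if best is None or best[0] >= average_length(t_list[:n], probs):
            return t_list[:n]
        t_list = best[1]
```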

The Proposed Algorithm (3/4)
Six symbols with probabilities (0.286, 0.214, 0.143, 0.143, 0.143, 0.071) are given.
Table 1: An example of the proposed algorithm, where P_U(u) denotes the probability of the source symbol u.
[Table 1 lists, for each of the six symbols a1–a6, its probability and its codeword after Step 1 and after Step 2, together with the average codeword length of each list; the codeword entries did not survive the transcript.]
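
The average codeword length of the Step 1 list can be checked directly; the snippet below (a verification of mine) reproduces the value 2.856 that reappears in Table 2:

```python
probs = [0.286, 0.214, 0.143, 0.143, 0.143, 0.071]     # decreasing order
step1 = ["1", "00", "010", "0110", "01110", "011110"]  # initial T-List

# Average codeword length = sum of p_i * l_i, with shorter codewords
# assigned to more probable symbols.
avg = sum(p * len(c) for p, c in zip(probs, step1))
print(round(avg, 3))  # 2.856
```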

The Proposed Algorithm (4/4)
Table 2: Some temporary results of the proposed algorithm, starting from T-List = (1, 00, 010, 0110, 01110, ...).
Iteration #1, candidate c = 1: R(c, T-List, 2) = {11, 101, 1001, 10001, ...}, and the merged result is (..., 11, 010, 101, 0110, 1001, 01110, 10001, ...). Another candidate row involves the codeword 00100. The average code lengths after the remaining candidate replacements are 3.285, 2.856, and 2.856.
[Several entries of the table did not survive the transcript.]

Experimental Results (1/3)
The test corpora are from the Canterbury Corpus file set.
Table 3 lists the symmetrical RVLCs constructed for the English alphabet set by other algorithms and by the proposed algorithm.
Table 4 lists the results of compressing the Canterbury Corpus file set with the other algorithms and with the proposed algorithm.

Experimental Results (2/3)
Table 3: Symmetrical RVLCs for the English alphabet set. [Table not included in the transcript.]

Experimental Results (3/3)
Table 4: The average codeword lengths of various symmetrical RVLCs for the Canterbury Corpus file set. [Table not included in the transcript.]

Conclusion (1/2)
The optimal design of RVLCs with good distance properties and high compression efficiency is still an open problem.
The proposed algorithm is suitable for any VLC-involved application, to enhance the error-correcting capability of critical time-constrained applications.
It should be pointed out that symmetrical RVLCs with … cannot surely be obtained by the proposed algorithm.

Conclusion (2/2)
The major contribution of the proposed algorithm is that, for the same free distance, it yields more efficient symmetrical RVLCs than other known algorithms.

Appendix: MSSL (Maximum Symmetrical Suffix Length)
[Table: candidate codewords at level 4 with their MSSL values (e.g. MSSL = 3, or "not symmetrical"), and the number of available candidate codewords at levels 5, 6, and 7; most entries did not survive the transcript.]
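
The transcript names MSSL but does not define it; reading it as the length of the longest symmetrical (palindromic) suffix of a codeword, a minimal sketch would be:

```python
def mssl(codeword: str) -> int:
    """Maximum Symmetrical Suffix Length, read here as the length of the
    longest palindromic suffix of the codeword. This interpretation is an
    assumption; the slides only name the term."""
    for k in range(len(codeword), 0, -1):
        suffix = codeword[-k:]
        if suffix == suffix[::-1]:
            return k
    return 0

print(mssl("0110"))  # 4: the whole codeword is symmetrical
print(mssl("0100"))  # 2: "00"
```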