
CS654: Digital Image Analysis Lecture 34: Different Coding Techniques

Recap of Lecture 33
- Morphological algorithms
- Introduction to image compression
- Data vs. information; measures of information
- Lossless and lossy compression

Outline of Lecture 34
- Lossless compression
- Different coding techniques: RLE, Huffman, arithmetic, LZW

Lossless Compression: Types of Coding
- Repetitive sequence encoding: RLE
- Statistical encoding: Huffman, arithmetic, LZW
- Predictive coding: DPCM
- Bitplane coding

Run-Length Encoder: Algorithm
1. Start at the first element of the input.
2. Examine the next value.
   - If it is the same as the previous value, keep a counter of consecutive values; keep examining the next value until a different value or the end of the input is reached, then output the value followed by the counter. Repeat.
   - If it is not the same as the previous value, output the previous value followed by a count of 1 (the run length). Repeat.
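A minimal Python sketch of the encoder just described (the function name and the (symbol, count) output format are illustrative choices, not from the slides):

```python
def rle_encode(data):
    """Run-length encode a sequence into (symbol, count) pairs."""
    if not data:
        return []
    runs = []
    prev, count = data[0], 1
    for value in data[1:]:
        if value == prev:
            count += 1                      # extend the current run
        else:
            runs.append((prev, count))      # emit the finished run
            prev, count = value, 1
    runs.append((prev, count))              # flush the final run
    return runs

print(rle_encode("AAABBCDDDD"))  # [('A', 3), ('B', 2), ('C', 1), ('D', 4)]
```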

Run-Length Coding (RLC) (inter-pixel redundancy)
- Used to reduce the size of a repeating string of symbols (i.e., runs); e.g., the bit string 111110000001 encodes as (1,5) (0,6) (1,1).
- Encodes each run of symbols as a (symbol, count) pair.
- Can compress any type of data, but cannot achieve the high compression ratios of other compression methods.
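For completeness, a matching decoder sketch (it assumes the symbols are one-character strings):

```python
def rle_decode(runs):
    """Expand (symbol, count) pairs back into the original string."""
    return "".join(symbol * count for symbol, count in runs)

assert rle_decode([("1", 5), ("0", 6), ("1", 1)]) == "111110000001"
```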

2D RLE

Differential Pulse Code Modulation (DPCM)
- Encodes the changes between consecutive samples rather than the samples themselves.
- The values of the differences between samples are much smaller than those of the original samples.
- Fewer bits are therefore needed to encode the signal (e.g., 7 bits instead of 8).
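A minimal sketch of lossless DPCM with a previous-sample predictor (the entropy coding stage from the block diagram on the next slide is omitted; all names are illustrative):

```python
def dpcm_encode(samples):
    """Replace each sample by its difference from the previous sample."""
    diffs = [samples[0]]                      # first sample is sent as-is
    for i in range(1, len(samples)):
        diffs.append(samples[i] - samples[i - 1])
    return diffs

def dpcm_decode(diffs):
    """Reconstruct the samples by accumulating the differences."""
    samples = [diffs[0]]
    for d in diffs[1:]:
        samples.append(samples[-1] + d)
    return samples

x = [100, 102, 101, 105, 107]
print(dpcm_encode(x))                         # [100, 2, -1, 4, 2]
assert dpcm_decode(dpcm_encode(x)) == x
```

Note how the differences span a much smaller range than the raw samples, which is what allows them to be coded with fewer bits.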

DPCM block diagram (figure): Input → subtract predictor output → Error → Entropy encoder → Channel → Entropy decoder → add predictor output → Output; a Predictor closes the feedback loop at both encoder and decoder.

DPCM Example
- Change the reference symbol if the delta becomes too large.
- Works better than RLE for many digital images (roughly 1.5-to-1).
- Example input: AAABBCDDDD with reference symbol A.

Huffman Coding (coding redundancy)
- A variable-length coding technique; symbols are encoded one at a time.
- There is a one-to-one correspondence between source symbols and code words.
- Optimal code: it minimizes the average code-word length per source symbol.

Huffman Code
Approach:
- Variable-length encoding of symbols
- Exploits the statistical frequency of symbols
- Efficient when symbol probabilities vary widely
Principle:
- Use fewer bits to represent frequent symbols
- Use more bits to represent infrequent symbols

Huffman Code Example

Symbol              A       B       C       D
Frequency           13%     25%     50%     12%
Original encoding   2 bits  2 bits  2 bits  2 bits
Huffman encoding    3 bits  2 bits  1 bit   3 bits

Expected size (frequencies treated as 1/8, 1/4, 1/2, 1/8):
- Original: 1/8 × 2 + 1/4 × 2 + 1/2 × 2 + 1/8 × 2 = 2 bits/symbol
- Huffman: 1/8 × 3 + 1/4 × 2 + 1/2 × 1 + 1/8 × 3 = 1.75 bits/symbol
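The expected-size arithmetic can be checked directly (a quick sketch, with probabilities rounded to the powers of two used on the slide):

```python
# Expected code length = sum over symbols of probability * code length.
probs = {"A": 1/8, "B": 1/4, "C": 1/2, "D": 1/8}
bits  = {"A": 3,   "B": 2,   "C": 1,   "D": 3}
print(sum(probs[s] * bits[s] for s in probs))  # 1.75 bits/symbol
```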

Huffman Code Data Structures
- Binary (Huffman) tree: represents the Huffman code. Each edge carries a code bit (0 or 1), each leaf holds a symbol, and the path from root to leaf gives the symbol's encoding. Example: A = "110", B = "10", C = "0".
- Priority queue: used to build the binary tree efficiently.

Huffman Code Algorithm Overview: Encoding
1. Calculate the frequency of each symbol in the file.
2. Create a binary tree representing the "best" encoding.
3. Use the binary tree to encode the compressed file: for each symbol, output the path from root to leaf; the size of the encoding equals the length of the path.
4. Save the binary tree (so the decoder can rebuild it).

Huffman Code: Creating the Tree
1. Place each symbol in a leaf; the weight of the leaf is the symbol frequency.
2. Select two trees L and R (initially leaves) such that L and R have the lowest frequencies among all trees.
3. Create a new internal node with left child L and right child R; its frequency is frequency(L) + frequency(R).
4. Repeat until all nodes are merged into one tree. A sketch of this procedure appears below.
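A sketch of this construction using a priority queue (Python's heapq). The frequencies are made-up illustrative values, since the slides' figures are not in the transcript, and the exact bit assignments depend on how ties are broken:

```python
import heapq
from itertools import count

def build_huffman_codes(freqs):
    """Build a {symbol: bit-string} Huffman code table from {symbol: frequency}."""
    tiebreak = count()  # keeps heap comparisons away from uncomparable trees
    # Heap entries: (frequency, tiebreak, tree); a tree is either a symbol
    # (leaf) or a (left, right) pair (internal node).
    heap = [(f, next(tiebreak), sym) for sym, f in freqs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)    # two lowest-frequency trees
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, next(tiebreak), (left, right)))

    codes = {}
    def walk(tree, prefix):
        if isinstance(tree, tuple):          # internal node
            walk(tree[0], prefix + "0")      # left edge  = 0
            walk(tree[1], prefix + "1")      # right edge = 1
        else:
            codes[tree] = prefix or "0"      # leaf: record root-to-leaf path
    walk(heap[0][2], "")
    return codes

codes = build_huffman_codes({"A": 1, "H": 1, "C": 2, "I": 2, "E": 2})
print(codes)   # code lengths match the slides; exact bits depend on ties
```

Encoding a message is then a table lookup, e.g. `"".join(codes[s] for s in "ACE")`.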

Huffman Tree Construction (step-by-step figures omitted): starting from the leaves A, C, E, H, I, the two lowest-frequency trees are merged repeatedly until a single tree remains.

Huffman Tree Construction (final tree, figure omitted). Resulting codes: E = 01, I = 00, C = 10, A = 111, H = 110.

Huffman Coding Example. Using the code above (E = 01, I = 00, C = 10, A = 111, H = 110): input ACE, output (111)(10)(01) = 1111001.

Huffman Code Algorithm Overview: Decoding
1. Read the compressed file and the binary tree.
2. Use the binary tree to decode the file, following the path from root to leaf for each code word. A decoding sketch follows.
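A decoding sketch; for a prefix-free code, matching accumulated bits against the code table is equivalent to following root-to-leaf paths in the tree:

```python
def huffman_decode(bits, codes):
    """Decode a bit string using a prefix-free {symbol: code} table."""
    inverse = {code: sym for sym, code in codes.items()}
    out, prefix = [], ""
    for bit in bits:
        prefix += bit                  # walk one edge deeper
        if prefix in inverse:          # reached a leaf
            out.append(inverse[prefix])
            prefix = ""                # restart at the root
    return "".join(out)

codes = {"E": "01", "I": "00", "C": "10", "A": "111", "H": "110"}
print(huffman_decode("1111001", codes))  # ACE
```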

Huffman Decoding (step-by-step figures omitted): the bit stream 1111001 is traced through the tree from the root, emitting A, then C, then E, yielding ACE.

Limitation of Huffman Code: every symbol is coded with an integer number of bits, so the average code length can exceed the source entropy by up to 1 bit per symbol; the code is especially wasteful when one symbol has probability close to 1. Coding sequences of symbols together avoids this.

Arithmetic (or Range) Coding
- Instead of encoding source symbols one at a time, sequences of source symbols are encoded together.
- There is no one-to-one correspondence between source symbols and code words.
- Slower than Huffman coding, but typically achieves better compression.

Arithmetic Coding (cont'd)
- A sequence of source symbols is assigned to a sub-interval of [0, 1), which corresponds to an arithmetic code; e.g., the sequence α1 α2 α3 α3 α4 maps to the sub-interval [0.06752, 0.0688).
- Start with the interval [0, 1).
- As the number of symbols in the message increases, the interval used to represent the message becomes smaller.

Arithmetic Coding
- We need a way to assign a code word to a particular sequence without having to generate codes for all possible sequences (Huffman requires keeping track of code words for all possible blocks).
- Each possible sequence gets mapped to a unique number in [0, 1); the mapping depends on the probabilities of the symbols.

Arithmetic Coding Example

Symbol   Probability   Sub-interval
α1       0.2           [0.0, 0.2)
α2       0.2           [0.2, 0.4)
α3       0.4           [0.4, 0.8)
α4       0.2           [0.8, 1.0)

Encode the message α1 α2 α3 α3 α4. Narrowing the interval symbol by symbol yields the final interval [0.06752, 0.0688); any number inside it, e.g. 0.068, can serve as the code (it must be inside the interval).
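A sketch of the interval-narrowing computation (floating-point is used for clarity; practical arithmetic coders use renormalized integer arithmetic to avoid precision loss):

```python
def arithmetic_encode(message, intervals):
    """Narrow [0, 1) symbol by symbol; return the final (low, high) interval.

    `intervals` maps each symbol to its cumulative sub-interval of [0, 1).
    """
    low, high = 0.0, 1.0
    for sym in message:
        width = high - low
        s_low, s_high = intervals[sym]
        low, high = low + width * s_low, low + width * s_high
    return low, high

intervals = {"a1": (0.0, 0.2), "a2": (0.2, 0.4),
             "a3": (0.4, 0.8), "a4": (0.8, 1.0)}
print(arithmetic_encode(["a1", "a2", "a3", "a3", "a4"], intervals))
# (0.06752, 0.0688), up to floating-point rounding
```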

Decoding Example. Using the same model, decoding reverses the process: at each step, find the sub-interval containing the current value, emit that symbol, and rescale the value. For example, any value in [0.57152, 0.5728), such as 0.572, decodes to the sequence α3 α3 α1 α2 α4.
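A matching decoder sketch; note that a real decoder also needs a stopping rule (a known message length, as here, or a dedicated end-of-message symbol):

```python
def arithmetic_decode(value, intervals, n_symbols):
    """Decode n_symbols symbols from a code value in [0, 1)."""
    out = []
    for _ in range(n_symbols):
        for sym, (s_low, s_high) in intervals.items():
            if s_low <= value < s_high:   # sub-interval containing the value
                out.append(sym)
                # rescale the value back to [0, 1) for the next symbol
                value = (value - s_low) / (s_high - s_low)
                break
    return out

intervals = {"a1": (0.0, 0.2), "a2": (0.2, 0.4),
             "a3": (0.4, 0.8), "a4": (0.8, 1.0)}
print(arithmetic_decode(0.572, intervals, 5))  # ['a3', 'a3', 'a1', 'a2', 'a4']
```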

Arithmetic Encoding: Expression. Formula for dividing the interval, where [l(k), u(k)) is the interval after encoding the k-th symbol x_k and F is the cumulative probability of the source symbols:
l(k) = l(k−1) + (u(k−1) − l(k−1)) · F(x_k − 1)
u(k) = l(k−1) + (u(k−1) − l(k−1)) · F(x_k)

Arithmetic Decoding: Expression
1. Initial values: l(0) = 0, u(0) = 1.
2. Calculate t = (value − l(k−1)) / (u(k−1) − l(k−1)).
3. Find the symbol x_k whose cumulative interval contains t, i.e. F(x_k − 1) ≤ t < F(x_k), and output it.
4. Update the limits l(k), u(k) using the encoding expressions above.
5. Repeat until the entire sequence is decoded.