
Distributed Source Coding — Instructor: 楊士萱; Student: 李桐照

Talk Outline: Introduction of DSC; Introduction of SWCQ; Conclusion

Introduction of DSC. Distributed Source Coding: compression of two or more correlated sources. The sources do not communicate with each other (hence "distributed" coding). Decoding is done jointly (say, at the base station).

Introduction of DSC

Introduction of SWCQ. Review of Information Theory: Information and Entropy. Definition (DMS): the self-information of an outcome x is I(P(x)) = log 1/P(x) = -log P(x). If we use base-2 logarithms, the resulting unit of information is called a bit. Definition (DMS): the entropy H(X) of a discrete random variable X is defined by H(X) = -sum_x P(x) log P(x).
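As a small illustration (not from the slides), self-information and entropy can be computed directly from a source's probabilities; a minimal Python sketch, assuming numpy:

import numpy as np

def self_information(p):
    # I(p) = -log2 p, measured in bits
    return -np.log2(p)

def entropy(pmf):
    # H(X) = -sum_x p(x) log2 p(x), skipping zero-probability symbols
    pmf = np.asarray(pmf, dtype=float)
    pmf = pmf[pmf > 0]
    return float(-(pmf * np.log2(pmf)).sum())

print(self_information(0.25))       # 2.0 bits
print(entropy([0.5, 0.25, 0.25]))   # 1.5 bits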

Introduction of SWCQ. Review of Information Theory: Joint Entropy and Conditional Entropy. Definition (DMS): the joint entropy of two RVs X, Y is the amount of information needed on average to specify both of their values: H(X,Y) = -sum_{x,y} P(x,y) log P(x,y). Definition (DMS): the conditional entropy of a RV Y given another RV X expresses how much extra information one still needs to supply on average to communicate Y, given that the other party already knows X: H(Y|X) = -sum_{x,y} P(x,y) log P(y|x) = H(X,Y) - H(X).
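A minimal sketch of these two quantities for a toy joint pmf (the pmf values are assumed for illustration, not taken from the slides):

import numpy as np

def entropy(p):
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Toy joint pmf P(X, Y): rows index x, columns index y.
P = np.array([[0.25, 0.25],
              [0.00, 0.50]])

H_XY = entropy(P)                 # joint entropy H(X,Y)
H_X  = entropy(P.sum(axis=1))     # marginal entropy H(X)
H_Y_given_X = H_XY - H_X          # chain rule: H(Y|X) = H(X,Y) - H(X)
print(H_XY, H_X, H_Y_given_X)     # 1.5, 1.0, 0.5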

Introduction of SWCQ. Review of Information Theory: Mutual Information. Definition (DMS): I(X;Y) is the mutual information between X and Y. It is the reduction in uncertainty of one RV due to knowing the other, or the amount of information one RV contains about the other: I(X;Y) = H(X) - H(X|Y) = H(X) + H(Y) - H(X,Y).
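A minimal sketch of mutual information computed directly from the same toy joint pmf used above (assumed for illustration), via I(X;Y) = sum_{x,y} p(x,y) log2( p(x,y) / (p(x) p(y)) ):

import numpy as np

P = np.array([[0.25, 0.25],
              [0.00, 0.50]])       # toy joint pmf: rows index x, columns index y
px = P.sum(axis=1, keepdims=True)  # marginal P(x)
py = P.sum(axis=0, keepdims=True)  # marginal P(y)

mask = P > 0
I_XY = float((P[mask] * np.log2(P[mask] / (px * py)[mask])).sum())
print(I_XY)                        # about 0.311 bits; equals H(X) + H(Y) - H(X,Y)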

Introduction of SWCQ Review of Information Theory Mutual Information

Introduction of SWCQ. Review of Data Compression. Transform Coding: take a sequence of inputs and transform it into another sequence in which most of the information is contained in only a few elements. By then discarding the elements of the sequence that do not contain much information, we can get a large amount of compression. Nested quantization: quantization with side information. Slepian-Wolf coding: entropy coding with side information.
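An illustrative sketch (not from the slides) of the transform-coding idea: transform a block with an orthonormal DCT, keep only the few largest-magnitude coefficients, and invert. Only numpy is assumed; the DCT-II matrix is built by hand.

import numpy as np

def dct_matrix(N):
    # Orthonormal DCT-II matrix: C[k, n] = c_k * cos(pi * (n + 0.5) * k / N)
    n = np.arange(N)
    C = np.cos(np.pi * np.outer(np.arange(N), n + 0.5) / N) * np.sqrt(2.0 / N)
    C[0, :] *= 1.0 / np.sqrt(2.0)
    return C

N, keep = 8, 3
rng = np.random.default_rng(0)
C = dct_matrix(N)
x = np.sin(np.linspace(0, np.pi, N)) + 0.05 * rng.standard_normal(N)  # smooth block
coeffs = C @ x                               # forward transform
small = np.argsort(np.abs(coeffs))[:-keep]   # indices of the N - keep smallest coefficients
coeffs[small] = 0.0                          # discard low-information coefficients
x_hat = C.T @ coeffs                         # inverse transform (C is orthonormal)
print(np.max(np.abs(x - x_hat)))             # small reconstruction error from only 3 coefficients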

Introduction of SWCQ Classic Source Coding

Introduction of SWCQ Classic Source Coding

Introduction of SWCQ (figure: Classic Source Coding vs. SWCQ / DSC)

Introduction of SWCQ: A Case of SWC

Introduction of SWCQ: A Case of SWC. Joint Encoding (Y is available when coding X): code Y at Ry ≥ H(Y); use Y to predict X and then code the difference at Rx ≥ H(X|Y). All together, Rx + Ry ≥ H(X|Y) + H(Y) = H(X,Y).
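A small numerical sketch of this rate bookkeeping (the joint pmf is a toy example, assumed for illustration): coding Y at H(Y) and X given Y at H(X|Y) gives a total of exactly H(X,Y), which is less than H(X) + H(Y) whenever X and Y are correlated.

import numpy as np

def entropy(p):
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Toy joint pmf P(X, Y): rows index x, columns index y.
P = np.array([[3/8, 1/8],
              [1/8, 3/8]])

H_X  = entropy(P.sum(axis=1))
H_Y  = entropy(P.sum(axis=0))
H_XY = entropy(P)
H_X_given_Y = H_XY - H_Y            # chain rule

print(H_Y + H_X_given_Y, H_XY)      # equal (about 1.811 bits): joint coding achieves H(X,Y)
print(H_X + H_Y)                    # 2.0 bits: ignoring the correlation wastes H(X)+H(Y)-H(X,Y)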

Introduction of SWCQ: A Case of SWC. Distributed Encoding (Y is not available when coding X): what is the minimum rate to code X in this case? Slepian-Wolf Theorem: still H(X|Y). Separate encoding is as efficient as joint encoding.
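A minimal sketch of how this can work in practice via binning (syndrome coding). The (7,4) Hamming code and the one-bit-error correlation model below are illustrative choices, not taken from the slides; numpy is assumed. The encoder never sees Y, yet sends only the 3 syndrome bits of X; the decoder recovers X exactly from those 3 bits plus its side information Y.

import numpy as np

# Parity-check matrix of the (7,4) Hamming code; column i is the binary
# representation of i + 1, so a single-bit error is located by its syndrome.
H = np.array([[0, 0, 0, 1, 1, 1, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [1, 0, 1, 0, 1, 0, 1]])

def syndrome(v):
    return H.dot(v) % 2

rng = np.random.default_rng(0)
x = rng.integers(0, 2, size=7)        # source word X (7 bits)
e = np.zeros(7, dtype=int)            # correlation: Y differs from X in at most one position
pos_err = rng.integers(8)             # 8 equally likely patterns: no error, or one of 7 single-bit flips
if pos_err < 7:
    e[pos_err] = 1
y = (x + e) % 2                       # side information Y, available only at the decoder

s = syndrome(x)                       # encoder transmits only 3 syndrome bits (rate 3/7)
s_e = (s + syndrome(y)) % 2           # decoder: syndrome of the error pattern e
pos = int("".join(map(str, s_e)), 2)  # nonzero syndrome gives the 1-based error position
x_hat = y.copy()
if pos:
    x_hat[pos - 1] ^= 1               # flip the erroneous bit
assert np.array_equal(x_hat, x)       # X recovered without the encoder ever seeing Y

With this toy correlation model H(X|Y) = 3 bits per 7-bit block, so the 3 transmitted syndrome bits meet the Slepian-Wolf bound exactly.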

Introduction of SWCQ: A Case of SWC. Our focus: R_CSC,min = H(X) + H(Y); R_DSC,min = H(X,Y); R_CSC,min ≥ R_DSC,min.

Introduction of SWCQ. The SW Rate Region (for two sources). (Figure: the Slepian-Wolf rate region in the (R_X, R_Y) plane, bounded by R_X ≥ H(X|Y), R_Y ≥ H(Y|X), and R_X + R_Y ≥ H(X,Y), with corner points (H(X), H(Y|X)) and (H(X|Y), H(Y)).)
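A minimal sketch of checking whether a rate pair lies in this region; the numbers reuse the toy joint pmf from the sketch above (assumed for illustration, where H(X|Y) = H(Y|X) ≈ 0.811 and H(X,Y) ≈ 1.811 bits):

def in_sw_region(Rx, Ry, H_x_given_y, H_y_given_x, H_xy):
    # Slepian-Wolf achievability: both conditional-entropy bounds and the sum bound must hold
    return Rx >= H_x_given_y and Ry >= H_y_given_x and Rx + Ry >= H_xy

# Corner point: code Y at H(Y) = 1.0 and X at H(X|Y) -- achievable.
print(in_sw_region(0.811, 1.0, 0.811, 0.811, 1.811))   # True
# Both rates below the conditional entropies -- not achievable.
print(in_sw_region(0.5, 0.5, 0.811, 0.811, 1.811))     # False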

Conclusion: for the compression of two or more correlated sources, DSC achieves better rates than CSC, since separate encoding with joint decoding can be as efficient as joint encoding.