Mutual Information and Channel Capacity (Multimedia Security)

Presentation transcript:

Mutual Information and Channel Capacity (Multimedia Security)

2 Information Source: source symbols (alphabet A), source symbol probabilities P_A, and entropy H(A).
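The entropy formula itself is not reproduced in the transcript (it was presumably a graphic on the slide); the standard definition it refers to, in LaTeX form, is

\[
H(A) = -\sum_{a \in A} P_A(a)\,\log_2 P_A(a) \quad \text{bits per source symbol},
\]

so, for example, a source with two equally likely symbols has H(A) = 1 bit.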

3 Source Encoder E(.)

4 Mutual Information: I(B;A) = I(A;B) = H(B) - H(B|A) = H(A) - H(A|B). [Slide figure: Information Source A → Source Encoder E(.), with observer A watching the source symbols and observer B watching the code symbols.]

5 Mutual Information Suppose we represent the information source and the encoder as “black boxes” and station two perfect observers at the scene to watch what happens. The first observer observes the symbols output from the source A, while the second observer watches the code symbols output from the encoder “E”.

6 We assume that the first observer has perfect knowledge of source A and symbol probabilities P_A, and the second observer has equally perfect knowledge of code alphabet B and codeword probabilities P_B. Neither observer, however, has any knowledge whatsoever of the other observer's black box.

7 Now suppose that each time observer B observes a codeword, he asks observer A what symbol had been sent by the information source. How much information does observer B obtain from observer A? If the answer is “None”, then all of the information presented to the encoder passed through it to reach observer B, and the encoder was information lossless.

8 On the other hand, if observer A's report occasionally surprises observer B, then some information was lost in the encoding process. A's report then serves to decrease the uncertainty observer B has concerning the symbols being emitted by black box “E”. The reduction in uncertainty about B conveyed by the observation of A is called the mutual information, I(B;A).

9 The information presented to observer B by his observation is merely the entropy H(B). If observer B observes symbol b (∈ B) and then learns from his partner that the source symbol was a, observer A's report conveys information log2( 1 / P(b|a) )

10 and, averaged over the ensemble of all observations, the average information conveyed by A's report will be the conditional entropy H(B|A) = Σ_{a,b} P(a,b) log2( 1 / P(b|a) ). The amount by which B's uncertainty is therefore reduced is I(B;A) = H(B) - H(B|A).

11 Since I(B;A) = H(B) - H(B|A) and H(B|A) ≧ 0, it follows that I(B;A) ≦ H(B). That is, the mutual information is upper bounded by the entropy of the encoder output B.
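As a quick numerical illustration (not part of the original slides), the sketch below uses a made-up source distribution and a made-up stochastic encoder, given as a transition matrix P(b|a), to compute H(B), H(B|A), and I(B;A) and to confirm the bound I(B;A) ≦ H(B):

import numpy as np

def entropy(p):
    """Entropy in bits of a probability vector (zero entries are skipped)."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Hypothetical source distribution P(a) and encoder transition matrix P(b|a)
P_A = np.array([0.7, 0.3])
P_B_given_A = np.array([[0.9, 0.1],    # row a = 0
                        [0.2, 0.8]])   # row a = 1

P_AB = P_A[:, None] * P_B_given_A      # joint distribution P(a, b)
P_B = P_AB.sum(axis=0)                 # output (codeword) distribution P(b)

H_B = entropy(P_B)
mask = P_AB > 0
H_B_given_A = -np.sum(P_AB[mask] * np.log2(P_B_given_A[mask]))
I_BA = H_B - H_B_given_A

print(f"H(B)   = {H_B:.4f} bits")
print(f"H(B|A) = {H_B_given_A:.4f} bits")
print(f"I(B;A) = {I_BA:.4f} bits")
assert I_BA <= H_B + 1e-12             # mutual information never exceeds H(B)

With a deterministic encoder the transition matrix would contain only 0s and 1s, H(B|A) would be zero, and I(B;A) would equal H(B), matching the lossless case on the next slide.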

12 I(B;A) = H(B) iff H(B|A) = 0. The conditional entropy is a measure of how much information loss occurs in the encoding process, and if it is equal to zero, then the encoder is information lossless. Without loss of generality, the encoder can be viewed as a channel in which the source alphabet is the same as the codeword alphabet, and the encoding function behaves like the symbol transition map.

13 Transition probabilities of the channel, with input and output alphabets {0, 1} (a binary symmetric channel): P(0|0) = P(1|1) = 1 - p and P(1|0) = P(0|1) = p, where p is the bit-error probability.
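Written as a matrix (the symbol p for the bit-error probability is an assumption here, since the transcript drops the slide's original notation), the channel is

\[
\big[P(b \mid a)\big] \;=\;
\begin{pmatrix} 1-p & p \\ p & 1-p \end{pmatrix},
\]

with rows indexed by the input a ∈ {0, 1} and columns by the output b ∈ {0, 1}.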

14 Each time the source (transmitter) sends a symbol, it is said to use the channel. The channel capacity is the maximum average information that can be sent per channel use. Notice that the mutual information is a function of the probability distribution of A: by changing P_A, we get a different I(A;B).

15 For a fixed transition probability matrix, a change in P_A also results in a different output symbol distribution P_B. The maximum mutual information achieved over all input distributions for a given transition probability matrix (i.e., for fixed channel characteristics) is the channel capacity: C = max_{P_A} I(A;B).
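To make the maximization concrete, here is a small sketch (not from the slides) that sweeps the input distribution P_A = (q, 1-q) for a binary symmetric channel with an assumed bit-error probability p and locates the maximum of I(A;B); for the BSC the maximum is attained at the uniform input and equals 1 - H2(p):

import numpy as np

def H2(x):
    """Binary entropy function in bits."""
    if x in (0.0, 1.0):
        return 0.0
    return -x * np.log2(x) - (1 - x) * np.log2(1 - x)

def mutual_information(P_A, channel):
    """I(A;B) in bits for input distribution P_A and transition matrix channel[a, b] = P(b|a)."""
    P_AB = P_A[:, None] * channel            # joint distribution P(a, b)
    P_B = P_AB.sum(axis=0)                   # output distribution P(b)
    mask = P_AB > 0
    prod = (P_A[:, None] * P_B[None, :])[mask]
    return np.sum(P_AB[mask] * np.log2(P_AB[mask] / prod))

p = 0.1                                       # assumed bit-error probability
bsc = np.array([[1 - p, p],
                [p, 1 - p]])

qs = np.linspace(0.001, 0.999, 999)           # candidate input distributions (q, 1-q)
rates = [mutual_information(np.array([q, 1 - q]), bsc) for q in qs]
best = int(np.argmax(rates))

print(f"numerical capacity ~ {rates[best]:.4f} bits per channel use at q = {qs[best]:.3f}")
print(f"closed form 1 - H2(p) = {1 - H2(p):.4f} bits per channel use")

Sweeping q makes the point of slides 14 and 15 visible: the channel (the transition matrix) stays fixed, and only the input distribution is varied to find the capacity.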

16 The relative entropy (or Kullback-Leibler distance) between two probability mass functions p(x) and q(x) is defined as D(p||q) = Σ_x p(x) log( p(x) / q(x) ). The mutual information I(X;Y) is the relative entropy between the joint distribution and the product distribution: I(X;Y) = D( p(x,y) || p(x)p(y) ) = Σ_{x,y} p(x,y) log( p(x,y) / (p(x) p(y)) ).
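A final sanity check (again not from the slides, using an assumed joint distribution): for any small joint distribution, the KL form of the mutual information agrees with the H(X) - H(X|Y) form used on the earlier slides.

import numpy as np

P_XY = np.array([[0.30, 0.10],     # assumed joint distribution P(x, y)
                 [0.15, 0.45]])
P_X = P_XY.sum(axis=1)
P_Y = P_XY.sum(axis=0)

# KL form: I(X;Y) = D( p(x,y) || p(x)p(y) )
I_kl = np.sum(P_XY * np.log2(P_XY / np.outer(P_X, P_Y)))

# Entropy form: I(X;Y) = H(X) - H(X|Y)
H_X = -np.sum(P_X * np.log2(P_X))
H_X_given_Y = -np.sum(P_XY * np.log2(P_XY / P_Y[None, :]))
I_entropy = H_X - H_X_given_Y

print(f"KL form:      {I_kl:.6f} bits")
print(f"entropy form: {I_entropy:.6f} bits")
assert abs(I_kl - I_entropy) < 1e-12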