§2 Discrete memoryless channels and their capacity function


§2 Discrete memoryless channels and their capacity function
§2.1 Channel capacity
§2.2 The channel coding theorem

§2.1 Channel capacity
1. Definition
Review: the information rate is R = I(X;Y) = H(X) - H(X|Y) bits/symbol, or Rt = I(X;Y)/t bits/second when each symbol takes t seconds to transmit.
Definition: for a discrete channel with input X and output Y, the channel capacity is C = max over p(x) of I(X;Y), in bits/symbol.
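As a numerical companion (an added sketch, not part of the original slides; the function name mutual_information is ours), R = I(X;Y) can be computed from any input distribution p_x and transition matrix P, where P[j][k] = P(Y = yk | X = xj):

    import numpy as np

    def mutual_information(p_x, P):
        """Return I(X;Y) in bits for input distribution p_x and channel matrix P."""
        p_x = np.asarray(p_x, dtype=float)
        P = np.asarray(P, dtype=float)
        p_xy = p_x[:, None] * P          # joint distribution P(x, y)
        p_y = p_xy.sum(axis=0)           # output distribution P(y)
        mask = p_xy > 0                  # skip zero cells: 0*log 0 = 0
        ratio = p_xy[mask] / (p_x[:, None] * p_y[None, :])[mask]
        return float((p_xy[mask] * np.log2(ratio)).sum())

    # Uniform input into a BSC with crossover 0.1 -> about 0.5310 bits
    print(mutual_information([0.5, 0.5], [[0.9, 0.1], [0.1, 0.9]]))

Maximizing this quantity over p_x gives the capacity C; the examples that follow treat channels with enough structure that the maximum is known in closed form.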

§2.1 Channel capacity
1. Definition
Example 2.1.1: Compute the capacity of the BSC.
Solution: C = 1 - H(p) bits/symbol, where p is the crossover probability, H(p) = -p log2 p - (1-p) log2(1-p), and the maximum is achieved by the uniform input distribution.
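A quick check of this formula (illustrative only; binary_entropy is our helper, not from the lecture):

    from math import log2

    def binary_entropy(p):
        # H(p) = -p*log2(p) - (1-p)*log2(1-p), with H(0) = H(1) = 0
        if p in (0.0, 1.0):
            return 0.0
        return -p * log2(p) - (1 - p) * log2(1 - p)

    p = 0.1
    print(f"BSC(p={p}): C = {1 - binary_entropy(p):.4f} bits/symbol")  # ~0.5310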

§2.1 Channel capacity
2. Simple discrete channel
One-to-one channel: each input is mapped to a distinct output, so H(X|Y) = H(Y|X) = 0 and C = log2 r.

§2.1 Channel capacity
2. Simple discrete channel
Lossless channel: H(X|Y) = 0, so C = max H(X) = log2 r.

§2.1 Channel capacity
2. Simple discrete channel
Noiseless (deterministic) channel: H(Y|X) = 0, so C = max H(Y) = log2 s.

§2.1 Channel capacity
3. Symmetric channel
Definition: a channel with transition matrix P is symmetric if
1) every row of P is a permutation of the first row, whose entries are {p1', p2', ..., ps'};
2) every column of P is a permutation of the first column, whose entries are {q1', q2', ..., qr'}.

§2.1 Channel capacity
3. Symmetric channel
Properties:
1) H(Y|X) = H(Y|X = ai) = H(p1', p2', ..., ps') for every i = 1, 2, ..., r;
2) if the input X is uniformly distributed, then the output Y is uniformly distributed.
Together these give C = log2 s - H(p1', p2', ..., ps'), achieved by the uniform input.
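Property 2) is what makes the capacity easy to evaluate: with a uniform input, I(X;Y) = H(Y) - H(Y|X) = log2 s - H(p1', ..., ps'). A minimal sketch under that assumption (ours, not from the slides):

    import numpy as np

    def symmetric_capacity(P):
        """C in bits for a symmetric DMC: log2(s) - H(first row)."""
        P = np.asarray(P, dtype=float)
        row = P[0][P[0] > 0]                 # drop zero entries: 0*log 0 = 0
        return float(np.log2(P.shape[1]) + (row * np.log2(row)).sum())

    # The BSC(0.1), viewed as a 2x2 symmetric channel -> about 0.5310 bits
    print(symmetric_capacity([[0.9, 0.1], [0.1, 0.9]]))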

§2.1 Channel capacity
3. Symmetric channel
Example 2.1.2: Strongly symmetric channel.

§2.1 Channel capacity
3. Symmetric channel
Example 2.1.3: Weakly symmetric channel.

§2.1 Channel capacity
3. Symmetric channel
Characteristic of a weakly symmetric channel: the columns of its transition matrix P can be partitioned into subsets Ci such that, for each i, in the sub-matrix Pi formed by the columns in Ci, each row is a permutation of every other row, and the same is true of the columns. (Theorem 2.2, p. 68 in the textbook.)

§2.1 Channel capacity
3. Symmetric channel
Example 2.1.4: Compute the capacity of the BEC, a weakly symmetric channel.
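For the BEC with erasure probability eps, the transition matrix has rows [1-eps, eps, 0] and [0, eps, 1-eps], and the uniform input achieves I(X;Y) = 1 - eps, which is the capacity. A numerical check (our addition, not from the slides):

    import numpy as np

    def mi_uniform(P):
        """I(X;Y) in bits under the uniform input distribution."""
        P = np.asarray(P, dtype=float)
        p_x = np.full(P.shape[0], 1.0 / P.shape[0])
        p_xy = p_x[:, None] * P
        p_y = p_xy.sum(axis=0)
        mask = p_xy > 0
        ratio = p_xy[mask] / (p_x[:, None] * p_y[None, :])[mask]
        return float((p_xy[mask] * np.log2(ratio)).sum())

    eps = 0.2
    P = [[1 - eps, eps, 0.0], [0.0, eps, 1 - eps]]  # outputs: 0, erasure, 1
    print(mi_uniform(P))   # 0.8 = 1 - eps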

§2.1 Channel capacity
3. Symmetric channel
Problem: C = ?
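When a channel has no usable symmetry, C can still be computed numerically. Below is a minimal sketch of the standard Blahut-Arimoto algorithm (an editorial addition; the lecture does not cover it):

    import numpy as np

    def blahut_arimoto(P, tol=1e-10, max_iter=10_000):
        """Return (C in bits, optimal p_x) for channel matrix P[j][k] = P(yk | xj)."""
        P = np.asarray(P, dtype=float)
        p = np.full(P.shape[0], 1.0 / P.shape[0])   # start from the uniform input
        for _ in range(max_iter):
            q = p @ P                               # current output distribution
            # d[j] = D( P(.|xj) || q ) in bits
            with np.errstate(divide="ignore", invalid="ignore"):
                d = np.where(P > 0, P * np.log2(P / q), 0.0).sum(axis=1)
            new_p = p * np.exp2(d)
            new_p /= new_p.sum()
            if np.max(np.abs(new_p - p)) < tol:
                p = new_p
                break
            p = new_p
        q = p @ P
        with np.errstate(divide="ignore", invalid="ignore"):
            d = np.where(P > 0, P * np.log2(P / q), 0.0).sum(axis=1)
        return float((p * d).sum()), p

    C, p_star = blahut_arimoto([[0.9, 0.1], [0.1, 0.9]])
    print(C, p_star)   # about 0.5310, [0.5, 0.5]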

§2.1 Channel capacity
4. Discrete memoryless extended channel
(For a DMC used n times, the n-th extended channel has capacity Cn = nC.)

Review
Keywords: channel capacity; simple DMC; symmetric channel (strongly symmetric channel, weakly symmetric channel).

Homework
1. P71: T2.19(a).
2. Calculate the channel capacity of the channel given on the slide, and find the maximizing input probability distribution.

Homework
3. Channel capacity. Consider the discrete memoryless channel Y = X + Z (mod 11), where Z is uniform on {1, 2, 3} and X ∈ {0, 1, ..., 10}. Assume that Z is independent of X. Find the capacity. What is the maximizing p*(x)?
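One way to sanity-check an answer numerically is to reuse the blahut_arimoto sketch from §2.1 above (this assumes, as in the standard version of this exercise, that Z is uniform on {1, 2, 3}):

    import numpy as np

    P = np.zeros((11, 11))
    for x in range(11):
        for z in (1, 2, 3):
            P[x, (x + z) % 11] = 1.0 / 3.0   # Y = X + Z (mod 11)

    C, p_star = blahut_arimoto(P)            # defined in the sketch above
    print(C, np.log2(11 / 3))                # both about 1.8745; p* is uniform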

Homework
4. Find the capacity of the noisy typewriter channel (each channel input is either received unchanged at the output with probability 1/2, or transformed into the next letter with probability 1/2).
(2009.03.04, session 8: reached this point. The 4th discussion problem was presented by Liu Tao, class 3.)

§2 Discrete memoryless channels and their capacity function
§2.1 The capacity function
§2.2 The channel coding theorem

§2.2 The channel coding theorem
1. The concept of channel coding
[Figure: a general digital communication system. Source -> M -> Encoder -> C -> Channel -> R -> Decoder -> M' -> Sink]

§2.2 The channel coding theorem
1. The concept of channel coding
Example 2.2.1: the (3,1) repetition code over the extended channel, with MLD (maximum-likelihood decoding). Received words 000, 001, 010, 100 decode to codeword 000; received words 111, 110, 101, 011 decode to 111.
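The MLD regions above are exactly majority vote. A tiny enumeration (illustrative; valid for a BSC with p < 1/2):

    from itertools import product

    def mld_decode(r):
        # Majority vote = MLD for the (3,1) repetition code over a BSC, p < 1/2
        return "000" if r.count("0") >= 2 else "111"

    for r in ("".join(bits) for bits in product("01", repeat=3)):
        print(r, "->", mld_decode(r))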

§2.2 The channel coding theorem
1. The concept of channel coding
Example 2.2.2: Block code (5,2).

§2.2 The channel coding theorem
2. The channel coding theorem
Theorem 2.1 (the channel coding theorem for DMCs). For any R < C and any ε > 0, for all sufficiently large n there exists a code [C] = {x1, ..., xM} of length n and a decoding rule such that:
1) (1/n) log2 M ≥ R (equivalently, M ≥ 2^(nR));
2) PE < ε.
(Corollary, p. 62 in the textbook.)

§2.2 The channel coding theorem
2. The channel coding theorem
Statement 2 (the channel coding theorem): all rates below capacity C are achievable. Specifically, for every rate R < C, there exists a sequence of (2^(nR), n) codes with maximum probability of error λ^(n) → 0. Conversely, any sequence of (2^(nR), n) codes with λ^(n) → 0 must have R ≤ C.
(2009.3.9, session 9: reached this point. The students brought flowers today for Women's Day; very touching! There was time to spare this session, and even after a few extra closing remarks, class still ended 3 to 5 minutes early, leaving time for student questions.)

Homework (thinking)
The (2n+1, 1) repetition code with an MLD decoder. Show that its average error probability is
PE = sum over k from n+1 to 2n+1 of C(2n+1, k) p^k (1-p)^(2n+1-k),
where p is the crossover probability of the BSC. Compute PE for p = 0.01 and n = 1, 2, 3, 4.
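A short sketch for checking the requested values (the formula is the standard majority-vote error probability for this code; the helper name pe_repetition is ours):

    from math import comb

    def pe_repetition(p, n):
        """PE = sum_{k=n+1}^{2n+1} C(2n+1, k) p^k (1-p)^(2n+1-k)."""
        N = 2 * n + 1
        return sum(comb(N, k) * p**k * (1 - p)**(N - k) for k in range(n + 1, N + 1))

    for n in (1, 2, 3, 4):
        print(n, pe_repetition(0.01, n))
    # ~2.98e-4, ~9.85e-6, ~3.42e-7, ~1.22e-8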