Lecture 7: System Models. Attributes of a man-made system; concerns in the design of a distributed system; communication channels; entropy and mutual information.

1 Lecture 7: System Models. Attributes of a man-made system. Concerns in the design of a distributed system. Communication channels. Entropy and mutual information.

2 Student presentations next week. Up to 5-minute presentations, followed by discussion. All presentations in PowerPoint. Format: title of the project/research; motivation (why the problem is important); background (who did what); specific objectives (what you plan to do); literature. Each student will provide feedback on each presentation (grades A, B, C, or F, and comments).

3 Distributed system models

4 System models Functional models Performance models Reliability models Security models The effect of the technology substrate.

5 Attributes of a man-made system. A. Functionality. B. Performance and dependability: reliability, availability, maintainability, safety. C. Cost.

6 Major concerns. Unreliable communication. Independent failures of communication links and computing nodes. The discrepancy between communication and computing in bandwidth and in latency.

7 Information transmission and communication channel models Physical signals Digital/analog channels Modulation/demodulation Sampling and quantization Channel latency and bandwidth

8

9 Entropy. Input and output channel alphabets. The output of a communication channel depends statistically on its input; the output gives an idea of what was sent. Entropy is a measure of the uncertainty of a random variable. Examples: a binary random variable, H(X) = -p log(p) - (1-p) log(1-p); the horse race.
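As a quick check on the formula, here is a minimal Python sketch (the function name binary_entropy is illustrative, not from the lecture) that evaluates H(X) = -p log2(p) - (1-p) log2(1-p) for a few values of p:

```python
import math

def binary_entropy(p):
    """Entropy (in bits) of a Bernoulli(p) random variable: -p log2 p - (1-p) log2 (1-p)."""
    if p in (0.0, 1.0):
        return 0.0  # by convention 0 log 0 = 0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

for p in (0.1, 0.25, 0.5, 0.9):
    print(f"p = {p:.2f}  H = {binary_entropy(p):.3f} bits")
```

The maximum, 1 bit, is reached at p = 0.5, which is the peak of the curve on the next slide.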

10 Entropy of a binary random variable

11 Joint entropy, conditional entropy, mutual information. H(X,Y) is the joint entropy of X and Y; H(X/Y) is the conditional entropy of X given Y. H(X,Y) = H(X) + H(Y/X) = H(Y) + H(X/Y). I(X;Y) = H(X) - H(X/Y) is the mutual information; it is a measure of the dependency between the random variables X and Y. H(X) = H(X/Y) + I(X;Y). H(Y) = H(Y/X) + I(X;Y). H(X,Y) = H(X) + H(Y) - I(X;Y).
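To make the identities concrete, a small sketch that computes these quantities from an example joint distribution p(x, y) (the numbers are made up for illustration) and checks H(X,Y) = H(X) + H(Y) - I(X;Y):

```python
import math

def H(probs):
    """Shannon entropy (bits) of a probability vector."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# An arbitrary joint distribution p(x, y) over X = {0, 1}, Y = {0, 1}.
pxy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}

px = {x: sum(p for (xx, _), p in pxy.items() if xx == x) for x in (0, 1)}
py = {y: sum(p for (_, yy), p in pxy.items() if yy == y) for y in (0, 1)}

Hxy = H(pxy.values())
Hx, Hy = H(px.values()), H(py.values())
Hx_given_y = Hxy - Hy      # H(X/Y) = H(X,Y) - H(Y)
Hy_given_x = Hxy - Hx      # H(Y/X) = H(X,Y) - H(X)
Ixy = Hx - Hx_given_y      # I(X;Y) = H(X) - H(X/Y)

print(f"H(X,Y) = {Hxy:.3f}, H(X) = {Hx:.3f}, H(Y) = {Hy:.3f}")
print(f"H(X/Y) = {Hx_given_y:.3f}, H(Y/X) = {Hy_given_x:.3f}, I(X;Y) = {Ixy:.3f}")
assert abs(Hxy - (Hx + Hy - Ixy)) < 1e-9   # the last identity on the slide
```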

12

13 Noiseless and noisy binary symmetric channels

14 Noisy binary symmetric channel. Each of the two input symbols, 0 and 1, is altered with probability p and received as 1 or 0, respectively. Then I(X;Y) = H(Y) - H(Y/X) = H(Y) + p log(p) + (1-p) log(1-p). I(X;Y) is maximized when H(Y) = 1, i.e., for equiprobable inputs, which gives the channel capacity C = 1 + p log(p) + (1-p) log(1-p).
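A short sketch of this computation, assuming a crossover probability p and an input distribution P(X = 1) = q (the names h2 and bsc_mutual_information are illustrative). At q = 0.5 the mutual information equals the capacity 1 - H(p):

```python
import math

def h2(p):
    """Binary entropy function, in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_mutual_information(q, p):
    """I(X;Y) for a binary symmetric channel with crossover probability p and P(X=1) = q."""
    r = q * (1 - p) + (1 - q) * p   # P(Y = 1)
    return h2(r) - h2(p)            # H(Y) - H(Y/X), and H(Y/X) = H(p) for the BSC

p = 0.1
print("I(X;Y) at q = 0.5:", bsc_mutual_information(0.5, p))
print("Capacity 1 - H(p):", 1 - h2(p))   # the two values agree
```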

15 Encoding. Encoding is used to: make transmission resilient to errors (error detection and error correction); reduce the amount of information transmitted through a communication channel (compression); ensure information confidentiality (encryption). Source encoding. Channel encoding.

16

17 Channel capacity and Shannon's theorem. Given a channel with input X and output Y, the channel capacity is the highest rate at which information can be transmitted reliably through the channel: C = max I(X;Y), where the maximum is taken over the input distribution (Shannon's channel coding theorem). The effect of the signal-to-noise ratio (S/N) for a channel of bandwidth B: C = B log2(1 + S/N).
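As a rough illustration of the Shannon-Hartley formula, a sketch with assumed numbers (not from the lecture): a 3 kHz telephone-grade channel at 30 dB SNR:

```python
import math

def shannon_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley capacity in bits/s: C = B * log2(1 + S/N)."""
    return bandwidth_hz * math.log2(1 + snr_linear)

snr_db = 30
snr = 10 ** (snr_db / 10)                       # 30 dB -> S/N = 1000
print(f"C = {shannon_capacity(3000, snr):.0f} bits/s")   # roughly 30 kbit/s
```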

18 Error detection and error correction. Error detection: a parity bit is used to detect any odd number of errors. Error correction: a code is a set of code words. Block codes: m information symbols, k parity-check symbols, n = m + k.
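A minimal even-parity sketch of the detection idea (the helper names are illustrative): one flipped bit changes the parity and is detected, although its position remains unknown:

```python
def add_parity_bit(bits):
    """Append an even-parity bit so the total number of 1s is even."""
    return bits + [sum(bits) % 2]

def parity_check(word):
    """True if the received word has even parity (no error, or an even number of errors)."""
    return sum(word) % 2 == 0

sent = add_parity_bit([1, 0, 1, 1, 0, 0, 1])
received = sent.copy()
received[3] ^= 1                    # flip one bit to simulate a channel error
print(parity_check(sent))           # True  - consistent
print(parity_check(received))       # False - the single error is detected, not located
```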

19

20 Hamming distance. The number of positions in which two binary code words differ. The Hamming distance is a metric: non-negative, symmetric, and it satisfies the triangle inequality. Example. The distance of a code (the minimum distance between any two distinct code words). Nearest-neighbor decoding.
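A small sketch of the definition, using an assumed toy code (the 3-bit repetition code) to compute pairwise distances and the minimum distance of the code:

```python
from itertools import combinations

def hamming_distance(u, v):
    """Number of positions in which two equal-length binary words differ."""
    return sum(a != b for a, b in zip(u, v))

code = ["000", "111"]   # 3-bit repetition code, used here only as an illustration

print(hamming_distance("0110", "1100"))                          # 2
d_min = min(hamming_distance(u, v) for u, v in combinations(code, 2))
print("minimum distance of the code:", d_min)                    # 3
```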

21 Error correction and error detection capabilities of a code. If C is an [n, M] code with odd minimum distance d = 2e + 1, then C can correct up to e errors or, when used for detection only, detect up to 2e errors.
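A sketch of nearest-neighbor decoding with the 5-bit repetition code (d = 5, so e = 2), chosen purely as an illustration: up to two errors are corrected, while three errors push the received word closer to the wrong code word:

```python
def hamming_distance(u, v):
    return sum(a != b for a, b in zip(u, v))

def nearest_neighbor_decode(received, code):
    """Return the code word closest to the received word in Hamming distance."""
    return min(code, key=lambda c: hamming_distance(received, c))

code = ["00000", "11111"]   # 5-bit repetition code: d = 5 = 2*2 + 1, so e = 2

print(nearest_neighbor_decode("01001", code))   # two errors corrected -> "00000"
print(nearest_neighbor_decode("01101", code))   # three errors -> decoded (wrongly) as "11111"
```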

22

23 The Hamming bound What is the minimum number of parity check symbols necessary to correct one error?
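One way to see the bound: the k parity-check symbols must distinguish n + 1 = m + k + 1 cases (no error, or a single error in any one of the n positions), so we need 2^k >= m + k + 1. A minimal sketch, with an illustrative function name, that finds the smallest such k for a given number of information symbols m:

```python
def min_parity_bits(m):
    """Smallest k with 2**k >= m + k + 1 (Hamming bound for single-error correction)."""
    k = 1
    while 2 ** k < m + k + 1:
        k += 1
    return k

for m in (4, 11, 26):
    print(f"m = {m:2d} information bits -> k = {min_parity_bits(m)} parity bits")
# m = 4 -> k = 3 (the [7,4] Hamming code), m = 11 -> k = 4, m = 26 -> k = 5
```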