
1 Lecture 7: System Models
Attributes of a man-made system
Concerns in the design of a distributed system
Communication channels
Entropy and mutual information

2 Student presentations next week
Up to 5-minute presentations followed by discussion. All presentations in PowerPoint.
Format:
Title of project/research
Motivation (why is the problem important)
Background (who did what)
Specific objectives (what do you plan to do)
Literature
Each student will provide feedback on every presentation (grades A, B, C, F, plus comments).

3 Distributed system models

4 System models
Functional models
Performance models
Reliability models
Security models
The effect of the technology substrate.

5 Attributes of a man-made system
A. Functionality
B. Performance and dependability
Reliability
Availability
Maintainability
Safety
C. Cost

6 Major concerns
Unreliable communication.
Independent failures of communication links and computing nodes.
Discrepancy between communication and computing in terms of:
Bandwidth
Latency

7 Information transmission and communication channel models
Physical signals
Digital/analog channels
Modulation/demodulation
Sampling and quantization
Channel latency and bandwidth


9 Entropy
Input and output channel alphabets. The output of a communication channel depends statistically upon its input; the output gives an idea of what was sent.
Entropy is a measure of the uncertainty of a random variable.
Examples:
Binary random variable: H(X) = -p log(p) - (1-p) log(1-p)
Horse race

10 Entropy of a binary random variable
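The binary entropy function on this slide can be sketched in a few lines of Python (a minimal illustration, not part of the lecture materials; logs are base 2, so entropy is in bits):

```python
import math

def binary_entropy(p):
    """Entropy H(p) of a Bernoulli(p) random variable, in bits."""
    if p in (0.0, 1.0):
        return 0.0  # a certain outcome carries no uncertainty
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)
```

The function peaks at 1 bit for p = 0.5 and is symmetric about it, which matches the familiar inverted-U plot usually shown on this slide.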

11 Joint entropy, conditional entropy, mutual information
H(X,Y) - joint entropy of X and Y
H(X|Y) - conditional entropy of X given Y
H(X,Y) = H(X) + H(Y|X) = H(Y) + H(X|Y)
I(X;Y) = H(X) - H(X|Y) - the mutual information
I(X;Y) is a measure of the dependency between the random variables X and Y.
H(X) = H(X|Y) + I(X;Y)
H(Y) = H(Y|X) + I(X;Y)
H(X,Y) = H(X) + H(Y) - I(X;Y)
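These identities can be checked numerically. The sketch below uses a hypothetical joint distribution over two binary variables (the numbers are illustrative, not from the lecture):

```python
import math

def entropy(dist):
    """Shannon entropy in bits of a list of probabilities."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

# A hypothetical joint distribution p(x, y) over binary X and Y.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginal distributions of X and Y.
px = [sum(p for (x, _), p in joint.items() if x == v) for v in (0, 1)]
py = [sum(p for (_, y), p in joint.items() if y == v) for v in (0, 1)]

H_X = entropy(px)
H_Y = entropy(py)
H_XY = entropy(list(joint.values()))

I = H_X + H_Y - H_XY        # mutual information, from H(X,Y) = H(X) + H(Y) - I(X;Y)
H_X_given_Y = H_XY - H_Y    # chain rule: H(X,Y) = H(Y) + H(X|Y)

# The identity H(X) = H(X|Y) + I(X;Y) holds numerically:
assert abs(H_X - (H_X_given_Y + I)) < 1e-12
```

Because the chosen X and Y are correlated, I comes out strictly positive; for independent variables it would be zero.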


13 Noiseless and noisy binary symmetric channels

14 Noisy binary symmetric channel
Each of the two input symbols, 0 and 1, is altered with probability p and received as 1 or 0 respectively. Then
I(X;Y) = H(Y) - H(Y|X) = H(Y) + p log(p) + (1-p) log(1-p)
I(X;Y) is maximized when H(Y) = 1, which a uniform input distribution achieves.
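With H(Y) = 1 the maximized mutual information, i.e. the capacity of the binary symmetric channel, is 1 - H(p). A minimal sketch (illustrative, not from the lecture):

```python
import math

def h(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p:
    C = max I(X;Y) = 1 - H(p), achieved by a uniform input."""
    return 1.0 - h(p)
```

A noiseless channel (p = 0) has capacity 1 bit per use; at p = 0.5 the output is independent of the input and the capacity drops to 0.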

15 Encoding
Encoding is used to:
Make transmission resilient to errors (error detection and error correction)
Reduce the amount of information transmitted through a communication channel (compression)
Ensure information confidentiality (encryption)
Source encoding
Channel encoding


17 Channel capacity and Shannon's theorem
Given a channel with input X and output Y, the channel capacity is the highest rate at which information can be transmitted through the channel:
C = max I(X;Y)
where the maximum is taken over all input distributions.
Shannon's theorem: the effect of the signal-to-noise ratio (S/N) on a channel of bandwidth B:
C = B log2(1 + S/N)
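The bandwidth-limited formula can be sketched directly. The telephone-channel numbers below are a standard illustrative example, not taken from the lecture:

```python
import math

def shannon_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley capacity C = B * log2(1 + S/N), in bits per second.
    snr_linear is the ratio S/N, not decibels."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# e.g. a 3 kHz channel at 30 dB SNR (S/N = 1000) supports roughly 30 kbit/s.
c = shannon_capacity(3000, 1000)
```

Note that capacity grows linearly in bandwidth but only logarithmically in signal power, which is why widening the band is usually cheaper than boosting the signal.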

18 Error detection and error correction
Error detection: a parity bit is used to detect any odd number of errors.
Error correction
Code: a set of code words
Block codes:
m - information symbols
k - parity check symbols
n = m + k
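The parity-bit scheme above can be sketched in a few lines (an illustration, not the lecture's code):

```python
def add_parity(bits):
    """Append an even-parity bit so the total number of 1s is even."""
    return bits + [sum(bits) % 2]

def parity_ok(word):
    """Even-parity check: detects any odd number of bit errors."""
    return sum(word) % 2 == 0
```

Flipping any single bit of a protected word makes the check fail, but flipping two bits restores even parity and goes undetected, which is exactly the "odd number of errors" limitation on the slide.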


20 Hamming distance
The number of positions in which two binary code words differ.
Hamming distance is a metric:
Non-negative
Symmetric
Triangle inequality
Example
The distance of a code
Nearest neighbor decoding
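Hamming distance and the distance of a code (the minimum pairwise distance) can be sketched as follows (illustrative code, not from the lecture):

```python
def hamming(a, b):
    """Number of positions in which two equal-length words differ."""
    assert len(a) == len(b)
    return sum(x != y for x, y in zip(a, b))

def code_distance(code):
    """Distance of a code: the minimum Hamming distance over all
    pairs of distinct code words."""
    words = list(code)
    return min(hamming(u, v)
               for i, u in enumerate(words)
               for v in words[i + 1:])
```

For example, hamming("10110", "11101") is 3, and the two-word repetition code {"00000", "11111"} has distance 5.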

21 Error correction and error detection capabilities of a code
If C is an [n,M] code with an odd minimum distance d = 2e + 1, then C can:
correct e errors, and
detect 2e errors.
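Nearest neighbor decoding makes the correction claim concrete: with the length-5 repetition code (d = 5 = 2e + 1, so e = 2), up to 2 bit errors still decode to the transmitted word. A minimal sketch under those assumptions:

```python
def hamming(a, b):
    """Number of positions in which two equal-length words differ."""
    return sum(x != y for x, y in zip(a, b))

def nearest_neighbor_decode(received, code):
    """Decode to the code word at minimum Hamming distance."""
    return min(code, key=lambda w: hamming(w, received))

# Length-5 repetition code: distance d = 5, so e = 2 errors are correctable.
code = ["00000", "11111"]
```

Here "11011" (one error) and "11010" (two errors) both decode back to "11111"; a third error would tip the word closer to "00000" and decode incorrectly.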


23 The Hamming bound
What is the minimum number of parity check symbols necessary to correct one error?
For a block of n = m + k bits, the k parity check symbols must distinguish the error-free case from an error in any one of the n positions, so we need 2^k >= n + 1 = m + k + 1.
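The inequality 2^k >= m + k + 1 can be solved by direct search (an illustrative sketch, not from the lecture):

```python
def min_parity_bits(m):
    """Smallest k with 2**k >= m + k + 1: the Hamming bound for
    correcting a single error in an n = m + k bit block."""
    k = 1
    while 2 ** k < m + k + 1:
        k += 1
    return k
```

For m = 4 information bits this gives k = 3, the classic [7,4] Hamming code; for m = 11 it gives k = 4, the [15,11] code.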

