Presentation on theme: "This file contains figures from the book: Information Theory A Tutorial Introduction by Dr James V Stone 2015 Sebtel Press. Copyright JV Stone. These."— Presentation transcript:

1

2 This file contains figures from the book: Information Theory A Tutorial Introduction by Dr James V Stone 2015 Sebtel Press. Copyright JV Stone. These figures are released for use under the Creative Commons License specified on next slide. A copy of this file can be obtained from http://jim-stone.staff.shef.ac.uk/BookInfoTheory2013/InfoTheoryBookFigures.html

3

4 Chapter 1

5 [Figure: panels A–D — 3-bit binary codewords and their decimal values (000 = 0, 001 = 1, 010 = 2, 011 = 3, 100 = 4, 101 = 5, 110 = 6, 111 = 7), built from a sequence of left/right binary choices]

6 [Figure: decision tree — three yes/no questions (Q1–Q3) distinguish eight items (Fish, Bird, Dog, Cat, Car, Van, Truck, Bus); each path of answers is a 3-bit codeword, 000 = 0 through 111 = 7]
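The figure's point — three yes/no answers pick out one of eight items, i.e. 3 bits resolve 8 possibilities — can be sketched in a few lines. The assignment of items to codewords below is illustrative, not the figure's exact ordering:

```python
# Each of the 8 items gets a unique 3-bit codeword: three yes/no
# answers (1/0) narrow 2**3 = 8 possibilities down to 1.
items = ["Fish", "Bird", "Dog", "Cat", "Car", "Van", "Truck", "Bus"]

def encode(item):
    """Return the 3-bit codeword (the item's index, in binary)."""
    return format(items.index(item), "03b")

def decode(bits):
    """Recover the item from its 3-bit codeword."""
    return items[int(bits, 2)]

print(encode("Cat"))   # '011'
print(decode("011"))   # 'Cat'
```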

7

8

9

10

11 Chapter 2

12 Random variable X: an experiment (coin flip) with outcome head is mapped to X(head) = 1; with outcome tail, to X(tail) = 0.

13

14

15 [Figure: communication system — data s → encoder, x = g(s) → channel (+ noise η) → output y → decoder]

16

17

18

19

20

21 Chapter 3

22 [Figure: communication system — data → encoder → input X → channel (+ noise) → output → decoder]

23

24

25 [Figure: Huffman coding tree — symbols A–E with probabilities 0.87, 0.04, 0.04, 0.03, 0.02; merged node probabilities 0.05, 0.08, 0.13, 1.00; branches labeled 0 and 1]
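This slide's tree is a Huffman code: repeatedly merge the two least-probable nodes until one tree remains. A minimal sketch using the slide's probabilities — which branch receives 0 versus 1 may differ from the figure, but the codeword lengths agree:

```python
import heapq

def huffman(probs):
    """Build a Huffman code by repeatedly merging the two least-probable nodes."""
    # Heap entries: (probability, tie-breaker, {symbol: codeword-so-far}).
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)
        p2, _, c2 = heapq.heappop(heap)
        # Prefix 0 onto one subtree's codewords and 1 onto the other's.
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, count, merged))
        count += 1
    return heap[0][2]

code = huffman({"A": 0.87, "B": 0.04, "C": 0.04, "D": 0.03, "E": 0.02})
# The merges reproduce the slide's node probabilities (0.05, 0.08, 0.13, 1.00):
# A gets a 1-bit codeword; the four rare symbols get 3 bits each.
```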

26 from xkcd.com

27

28

29

30

31 From xkcd.com

32 Chapter 4

33 [Figure: communication system — data s → encoder, x = g(s) → channel (+ noise η) → output y → decoder]

34

35 [Figure: three panels, a–c]

36 [Figure: Venn diagram relating H(X), H(Y), the mutual information I(X,Y), the conditional entropies H(X|Y) and H(Y|X), and the joint entropy H(X,Y)]

37 [Figure: Venn diagram relating H(X), H(Y), I(X,Y), H(X|Y), H(Y|X), and H(X,Y)]

38

39 Inputs X and outputs Y: number of (typical) inputs = 2^H(X); number of (typical) outputs = 2^H(Y); number of possible outputs for one input = 2^H(Y|X); number of possible inputs for one output = 2^H(X|Y).
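The counting argument on this slide rests on entropy measured in bits, so that 2^H recovers an effective number of equally likely outcomes. A minimal sketch of that computation, with an illustrative uniform distribution:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A uniform distribution over 8 outcomes has H = 3 bits,
# and 2**H = 8 recovers the number of outcomes.
H = entropy([1 / 8] * 8)
print(H, 2 ** H)   # 3.0 8.0
```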

40

41 (a) Output entropy H(Y) = mutual information I(X,Y) + output noise entropy H(Y|X). (b) Input entropy H(X) = mutual information I(X,Y) + input noise entropy H(X|Y).
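The two decompositions in panels (a) and (b) — I(X,Y) = H(Y) − H(Y|X) = H(X) − H(X|Y) — can be checked numerically from a joint distribution. A sketch with an illustrative joint table (a binary channel that flips its input 10% of the time, uniform input; the numbers are this example's, not the slide's):

```python
import math

def H(probs):
    """Shannon entropy in bits of a probability list."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Illustrative joint distribution p(x, y): uniform binary input,
# channel flips the bit with probability 0.1.
joint = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}

px = [sum(p for (x, _), p in joint.items() if x == v) for v in (0, 1)]
py = [sum(p for (_, y), p in joint.items() if y == v) for v in (0, 1)]

Hx, Hy = H(px), H(py)
Hxy = H(list(joint.values()))   # joint entropy H(X,Y)
I = Hx + Hy - Hxy               # mutual information I(X,Y)
# Both slide decompositions agree with this value:
#   I = H(Y) - H(Y|X), where H(Y|X) = H(X,Y) - H(X)
#   I = H(X) - H(X|Y), where H(X|Y) = H(X,Y) - H(Y)
```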

42

43

44 [Figure: letters A–I along two axes; B, E, and H highlighted]

45

46 Chapter 5

47

48

49

50

51

52 Residual uncertainty: (a) after receiving 2 bits, U = 25%; (b) after receiving 1 bit, U = 50%; (c) after receiving 1/2 a bit, U = 71%.
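The percentages on this slide follow from each received bit halving the remaining uncertainty: after n bits, the residual uncertainty is U = 0.5^n of the original. A one-line check (the function name is illustrative):

```python
# Residual uncertainty after receiving n bits: each bit halves
# what remains, so U = 0.5 ** n of the original uncertainty.
def residual_uncertainty(bits):
    return 0.5 ** bits

print(round(residual_uncertainty(2) * 100))    # 25  (2 bits)
print(round(residual_uncertainty(1) * 100))    # 50  (1 bit)
print(round(residual_uncertainty(0.5) * 100))  # 71  (half a bit)
```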

53

54 [Figure: communication system — data → encoder → input → channel (+ noise) → output → decoder]

55 Chapter 6

56 [Figure: communication system — data s → encoder, x = g(s) → channel (+ noise η) → output y → decoder]

57

58

59

60

61

62

63 [Figure: Venn diagram relating H(X), H(Y), I(X,Y), H(X|Y), H(Y|X), and H(X,Y)]

64 Chapter 7

65

66

67

68

69

70

71 Chapter 8

72

73

74

75

76 [Figure: two chambers, A and B, both warm]

77 [Figure: two chambers — A cold, B hot]

78 Chapter 9

79

80

81 [Figure: five panels, a–e]

82

83 [Figure: S-, M-, and L-cone outputs combining into the Luminance, Red-Green, and Blue-Yellow channels]

84

85 THE END.

