Theory of Information, Lecture 7: Information Sources; Average Codeword Length (Section 2.1)
Definitions

By McMillan's Theorem, we must allow reasonably long codewords. But we also want efficiency. The way out: use short codewords for frequent symbols and longer codewords for infrequent ones. Central to this approach is the concept of an information source.

DEFINITION An information source is S = (S, P), where S = {s1, …, sq} is a source alphabet and P is a probability law for it. Such a P can be written as the probability distribution P = {p1, …, pq}.

DEFINITION Let (S, P) be an information source, and let (C, f) be an encoding scheme for S = {s1, …, sq}. The average codeword length of (C, f) is len(f(s1))P(s1) + … + len(f(sq))P(sq).
Example

Source alphabet: {a, b, c, d}
Probability law: P(a) = 0.5, P(b) = 0.2, P(c) = 0.2, P(d) = 0.1
Encoding scheme f: f(a) = 0, f(b) = 11, f(c) = 100, f(d) = 101
Encoding scheme g: g(a) = 101, g(b) = 100, g(c) = 11, g(d) = 0

What is the average codeword length of (C, f)? What is the average codeword length of (C, g)?
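The definition above can be checked mechanically: weight each codeword's length by its symbol's probability and sum. Here is a minimal sketch in Python using the example data; the function name and dictionary layout are illustrative, not from the text.

```python
def average_codeword_length(probabilities, encoding):
    """Sum of len(codeword) * P(symbol) over all source symbols."""
    return sum(len(encoding[s]) * p for s, p in probabilities.items())

# Probability law and the two encoding schemes from the example.
P = {'a': 0.5, 'b': 0.2, 'c': 0.2, 'd': 0.1}
f = {'a': '0', 'b': '11', 'c': '100', 'd': '101'}
g = {'a': '101', 'b': '100', 'c': '11', 'd': '0'}

print(round(average_codeword_length(P, f), 4))  # 1.8
print(round(average_codeword_length(P, g), 4))  # 2.6
```

Note how f, which gives the shortest codeword to the most frequent symbol a, achieves a smaller average length than g, which does the opposite; this is exactly the "short codewords for frequent symbols" idea.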
Homework

Exercises 1 and 2 of Section 2.1.