Entropy
Presented by Minkoo Seo, March 2006
Self-information (Surprisal)
Suppose that almost all children like ice cream. You ask a child, "Do you like ice cream?" The answer is always yes. There is no surprisal and no information.
Self-information (Surprisal) (cont)
If a child answers "No", then you are surprised. You expect more information from surprising answers. In other words, the less probable the answer, the more information it carries: self-information must grow as the probability of the message shrinks.
Self-information (Surprisal) (cont)
Now, suppose that there are two independent events and we want the 'self-information' of observing both. Their probabilities multiply, so the result would be
$$P(x, y) = P(x)\,P(y).$$
Self-information (Surprisal) (cont)
Multiplication is neither easy nor intuitive; we want information to add like other measures. So we take the logarithm of the probability of a message and define its self-information as
$$I(x) = -\log_2 P(x) = \log_2 \frac{1}{P(x)}.$$
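As a minimal sketch (not from the slides; the 0.99/0.01 probabilities and the helper name are made up for illustration), self-information in bits can be computed directly, and it adds across independent events even though their probabilities multiply:

```python
import math

def self_information(p: float) -> float:
    """Self-information (surprisal) of an event with probability p, in bits."""
    return -math.log2(p)

# Assumed probabilities for illustration: almost all children like ice cream.
p_yes, p_no = 0.99, 0.01
print(self_information(p_yes))  # ~0.014 bits: the expected "yes" is barely informative
print(self_information(p_no))   # ~6.64 bits: the surprising "no" carries much more information

# For two independent answers, probabilities multiply but information adds:
print(self_information(p_yes * p_no))                    # ~6.66 bits
print(self_information(p_yes) + self_information(p_no))  # same value
```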
Entropy
Suppose that you are tossing a coin and your observation is TTHTHTTT. Then, how can we measure the average information per symbol?
Entropy (cont)
We define Shannon's entropy, or the decrease in uncertainty, as the expected self-information over all messages:
$$H(X) = -\sum_{x} P(x) \log_2 P(x).$$
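A small sketch, assuming we estimate the coin's distribution from the empirical frequencies in TTHTHTTT (the helper below is mine, not from the slides):

```python
import math
from collections import Counter

def entropy(probs) -> float:
    """Shannon entropy H = -sum p log2 p, in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

observation = "TTHTHTTT"
counts = Counter(observation)                      # {'T': 6, 'H': 2}
probs = [c / len(observation) for c in counts.values()]
print(entropy(probs))   # ~0.811 bits per symbol (a fair coin would give 1.0)
```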
Joint Entropy
The joint entropy of two discrete random variables is defined as
$$H(X, Y) = -\sum_{x}\sum_{y} P(x, y) \log_2 P(x, y).$$
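A hedged sketch with a made-up joint probability table, just to show the definition in code:

```python
import math

# Hypothetical joint distribution P(x, y) over two binary variables (illustration only).
p_xy = {
    (0, 0): 0.4, (0, 1): 0.1,
    (1, 0): 0.2, (1, 1): 0.3,
}

def joint_entropy(p_xy):
    """H(X, Y) = -sum over (x, y) of P(x, y) * log2 P(x, y), in bits."""
    return -sum(p * math.log2(p) for p in p_xy.values() if p > 0)

print(joint_entropy(p_xy))   # ~1.85 bits
```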
Conditional Entropy
The conditional entropy of a discrete random variable X given Y = y is defined as
$$H(X \mid Y = y) = -\sum_{x} P(x \mid y) \log_2 P(x \mid y),$$
and averaging over Y gives $H(X \mid Y) = \sum_{y} P(y)\, H(X \mid Y = y)$.
Basic property of the conditional entropy: the information of X when we know Y is the information of (X, Y) minus the information given by Y,
$$H(X \mid Y) = H(X, Y) - H(Y).$$
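Reusing the same made-up joint table (an illustration, not an example from the slides), the identity H(X|Y) = H(X, Y) - H(Y) can be checked numerically:

```python
import math

# Hypothetical joint distribution P(x, y), reused for illustration only.
p_xy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}

def H(probs):
    """Entropy in bits of a collection of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

xs = sorted({x for x, _ in p_xy})
ys = sorted({y for _, y in p_xy})

# Marginal P(y)
p_y = {y: sum(p_xy.get((x, y), 0.0) for x in xs) for y in ys}

# H(X|Y) from the definition: sum over y of P(y) * H(X | Y = y)
h_x_given_y = sum(
    p_y[y] * H([p_xy.get((x, y), 0.0) / p_y[y] for x in xs])
    for y in ys
)

print(h_x_given_y)                          # ~0.876 bits
print(H(p_xy.values()) - H(p_y.values()))   # H(X,Y) - H(Y): same value
```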
Mutual Information
A quantity that measures the mutual dependence of two variables: how much information about one can be obtained by observing the other.
Mutual Information (cont)
Definition: the amount of uncertainty about B (or A) that is reduced when A (or B) is known. The information of B gained by observing A equals the information of B minus the information of B that remains when we know A:
$$I(A; B) = H(B) - H(B \mid A) = H(A) + H(B) - H(A, B).$$
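A short sketch over the same hypothetical table, checking that the equivalent forms of mutual information agree:

```python
import math

# Same hypothetical joint distribution P(a, b) as before (illustration only).
p_ab = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}

def H(probs):
    """Entropy in bits of a collection of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

a_vals = sorted({a for a, _ in p_ab})
b_vals = sorted({b for _, b in p_ab})
p_a = {a: sum(p_ab.get((a, b), 0.0) for b in b_vals) for a in a_vals}
p_b = {b: sum(p_ab.get((a, b), 0.0) for a in a_vals) for b in b_vals}

# I(A; B) = H(A) + H(B) - H(A, B); equivalently H(B) - H(B | A)
i_ab = H(p_a.values()) + H(p_b.values()) - H(p_ab.values())
print(i_ab)   # ~0.125 bits of shared information
```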
Applications
Coding theory: the bits needed for describing data.
Clustering: make a cluster from two variables whose mutual information is large; large mutual information implies that the two variables convey almost the same information (see the sketch below).
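A rough sketch of the clustering idea (the data and variable names are invented for illustration): estimate pairwise mutual information from observations and merge the pair that shares the most information first.

```python
import math
from collections import Counter
from itertools import combinations

def mutual_information(xs, ys):
    """Empirical I(X; Y) in bits, estimated from two equal-length observation sequences."""
    n = len(xs)
    p_xy = Counter(zip(xs, ys))
    p_x, p_y = Counter(xs), Counter(ys)
    return sum(
        (c / n) * math.log2((c / n) / ((p_x[x] / n) * (p_y[y] / n)))
        for (x, y), c in p_xy.items()
    )

# Invented observations of three binary variables; A and B are near copies of each other.
data = {
    "A": [0, 0, 1, 1, 0, 1, 0, 1],
    "B": [0, 0, 1, 1, 0, 1, 1, 1],
    "C": [1, 0, 0, 1, 1, 0, 0, 1],
}

# Greedy clustering step: merge the pair of variables with the largest mutual information.
best = max(
    combinations(data, 2),
    key=lambda pair: mutual_information(data[pair[0]], data[pair[1]]),
)
print(best)   # ('A', 'B'): the pair that conveys almost the same information
```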