Entropy
CSCI284/162 Spring 2009 GWU
Measurement of Uncertainty
Flip a fair coin. Is it reasonable to say that the outcome has one bit of uncertainty?
Flip the coin n times: how much uncertainty is in the outcome?
Is it reasonable to say that an event with probability 2^-n has uncertainty n? That is, that uncertainty is -log(probability)?
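A quick numeric check of this idea (an illustrative sketch, not from the original slides; the function name uncertainty_bits is ours):

import math

def uncertainty_bits(p):
    # Self-information of an event with probability p, in bits: -log2(p).
    return -math.log2(p)

print(uncertainty_bits(0.5))        # 1.0 -> one fair coin flip carries 1 bit
n = 8
print(uncertainty_bits(2.0 ** -n))  # 8.0 -> a specific sequence of n fair flips carries n bits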
What if
The coin is biased so that it always shows heads? How much uncertainty is there then?
What if the coin is biased so that it shows heads with probability p and tails with probability 1-p?
If uncertainty is -log(probability), take its average value: -p log p - (1-p) log(1-p)
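As an illustrative sketch (not part of the original slides), this average, the binary entropy function, can be computed directly; the p = 0 and p = 1 cases are handled separately since p log p is taken to be 0 there:

import math

def binary_entropy(p):
    # -p*log2(p) - (1-p)*log2(1-p), with 0*log(0) taken as 0.
    if p == 0.0 or p == 1.0:
        return 0.0  # the outcome is certain, so there is no uncertainty
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy(1.0))  # 0.0   -> coin that always shows heads
print(binary_entropy(0.5))  # 1.0   -> fair coin: maximal uncertainty
print(binary_entropy(0.9))  # ~0.47 -> biased coin: less than one bit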
Shannon Entropy
If a random variable X takes on values x_i with probability p_i, the entropy of X is defined as:
H(X) = -Σ_i p_i log p_i
Loosely speaking, it is the average number of bits required to represent the variable.
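To make the definition concrete, here is a minimal sketch (ours, not the course's code) that computes H(X) in bits from a list of outcome probabilities, skipping zero-probability outcomes since p log p tends to 0 as p tends to 0:

import math

def shannon_entropy(probs):
    # H(X) = -sum over i of p_i * log2(p_i), in bits; terms with p_i = 0 contribute 0.
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # 1.0 -> fair coin
print(shannon_entropy([1/8] * 8))   # 3.0 -> uniform over 8 outcomes needs 3 bits
print(shannon_entropy([0.9, 0.1]))  # ~0.47 -> matches binary_entropy(0.9) above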