Slide 1
Unbounded Knowledge Acquisition Based Upon Mutual Information in Dependent Questions
Tony C. Smith & Chris van de Molen
Department of Computer Science, Waikato University
tcs@cs.waikato.ac.nz
AI 2010
Slide 2
Outline
- Motivation/background
- Representations of knowledge
Slide 4
Truth table (attribute vs. entity)

                Fruitbat   Eagle   Tiger   Rock   ...
  Is alive?        T         T       T      F
  Flies?           T         T       F      F
  Lays eggs?       F         T       F      F
  ...
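A minimal sketch (in Python, not from the paper) of how an attribute-vs-entity truth table like the one above could be stored; the dictionary layout and the helper name answers_for are illustrative assumptions.

```python
# Attribute-vs-entity truth table as a dict keyed by attribute question,
# then by entity name (values mirror the slide's example cells).
truth_table = {
    "Is alive?":  {"Fruitbat": True,  "Eagle": True,  "Tiger": True,  "Rock": False},
    "Flies?":     {"Fruitbat": True,  "Eagle": True,  "Tiger": False, "Rock": False},
    "Lays eggs?": {"Fruitbat": False, "Eagle": True,  "Tiger": False, "Rock": False},
}

def answers_for(entity: str) -> dict:
    """Collect every stored attribute value for one entity."""
    return {attr: row[entity] for attr, row in truth_table.items()}

print(answers_for("Eagle"))
# {'Is alive?': True, 'Flies?': True, 'Lays eggs?': True}
```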
Slide 5
Multivalued truth table (attribute vs. entity)

                Fruitbat   Eagle   Tiger   Rock   ...
  Is alive?       0.90      0.96    0.88   0.01
  Flies?          0.85      0.97    0.04   0.22
  Lays eggs?      0.20      0.91    0.00   0.01
  ...

Cell values come from a graded answer set: YES, NO, SOMETIMES, USUALLY, MAYBE, SELDOM, RARELY, etc.
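The multivalued table replaces booleans with probabilities. The sketch below assumes a mapping from the graded verbal answers to numeric values; that mapping (ANSWER_VALUE) and the helper record_answer are illustrative only and are not specified on the slide.

```python
# Illustrative mapping from graded verbal answers to probabilities.
ANSWER_VALUE = {
    "YES": 0.95, "USUALLY": 0.8, "SOMETIMES": 0.5, "MAYBE": 0.5,
    "SELDOM": 0.2, "RARELY": 0.1, "NO": 0.05,
}

# Same attribute-vs-entity layout as before, but cells hold probabilities.
prob_table = {
    "Is alive?":  {"Fruitbat": 0.90, "Eagle": 0.96, "Tiger": 0.88, "Rock": 0.01},
    "Flies?":     {"Fruitbat": 0.85, "Eagle": 0.97, "Tiger": 0.04, "Rock": 0.22},
    "Lays eggs?": {"Fruitbat": 0.20, "Eagle": 0.91, "Tiger": 0.00, "Rock": 0.01},
}

def record_answer(attribute: str, entity: str, verbal_answer: str) -> None:
    """Update one cell of the table from a verbal answer."""
    prob_table.setdefault(attribute, {})[entity] = ANSWER_VALUE[verbal_answer.upper()]

record_answer("Flies?", "Penguin", "NO")
```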
Slide 6
Mutual Information

The information (i.e. uncertainty) in an event ω whose probability is p(ω), expressed in bits:
  I(ω) = -log2 p(ω)

Entropy is the average information content; each event ω contributes:
  -p(ω) log2 p(ω)

Mutual information is the amount of information two events share:
  MI(X, Y) = I(x) + I(y) - [I(x) + I(y|x)]
where I(x) + I(y|x) = I(x, y), the information of the joint event.
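A short Python sketch of the slide's formulas; the function names are assumptions made for illustration.

```python
import math

def info(p: float) -> float:
    """Self-information of an event with probability p, in bits: -log2 p."""
    return -math.log2(p)

def entropy_term(p: float) -> float:
    """One event's contribution to the entropy: -p * log2 p."""
    return -p * math.log2(p)

def mutual_info(p_x: float, p_y: float, p_y_given_x: float) -> float:
    """MI(X, Y) = I(x) + I(y) - [I(x) + I(y|x)], as defined on the slide."""
    return info(p_x) + info(p_y) - (info(p_x) + info(p_y_given_x))
```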
Slide 7
Mutual information example

Given:
  P(A) = 1/32,  P(B) = 1/64,  P(B|A) = 1/2,  P(A|B) = 1/4

Then:
  I(A) = 5 bits
  I(B) = 6 bits
  I(B|A) = 1 bit
  MI(A, B) = 5 + 6 - (5 + 1) = 5 bits
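A quick, self-contained check of the example's arithmetic using the slide's numbers.

```python
import math

# The slide's example probabilities; I(p) = -log2 p.
p_a, p_b, p_b_given_a = 1/32, 1/64, 1/2
i_a, i_b, i_b_given_a = (-math.log2(p) for p in (p_a, p_b, p_b_given_a))

print(i_a, i_b, i_b_given_a)             # 5.0 6.0 1.0
print(i_a + i_b - (i_a + i_b_given_a))   # 5.0  ->  MI(A, B) = 5 + 6 - (5 + 1)
```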