Unbounded Knowledge Acquisition Based Upon Mutual Information in Dependent Questions
Tony C. Smith & Chris van de Molen
Department of Computer Science, Waikato University
AI 2010
Outline
- Motivation/background
- Representations of knowledge
Truth table (attribute vs. entity)

              Fruitbat   Eagle   Tiger   Rock   ...
Is alive?        T         T       T      F
Flies?           T         T       F      F
Lays eggs?       F         T       F      F
...
Multivalued truth table (attribute vs. entity)

              Fruitbat   Eagle   Tiger   Rock   ...
Is alive?
Flies?
Lays eggs?
...

Cells take graded values: YES, NO, SOMETIMES, USUALLY, MAYBE, SELDOM, RARELY, etc.
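The two tables above can be sketched as a nested dictionary. This is a minimal illustration, not the authors' implementation: the entity and attribute names come from the slides, the stored answers are the T/F values from the boolean table rendered as YES/NO (a multivalued table would store graded responses such as SOMETIMES or USUALLY in the same cells), and the `answer` helper and its MAYBE default are assumptions.

```python
# Hypothetical sketch of the attribute-vs-entity table as a nested dict.
# YES/NO mirror the T/F cells of the boolean truth table; a multivalued
# table would store graded answers (SOMETIMES, USUALLY, ...) instead.
table = {
    "Is alive?":  {"Fruitbat": "YES", "Eagle": "YES", "Tiger": "YES", "Rock": "NO"},
    "Flies?":     {"Fruitbat": "YES", "Eagle": "YES", "Tiger": "NO",  "Rock": "NO"},
    "Lays eggs?": {"Fruitbat": "NO",  "Eagle": "YES", "Tiger": "NO",  "Rock": "NO"},
}

def answer(attribute, entity):
    """Look up the stored response; default to MAYBE for unknown pairs
    (an assumed convention, not stated on the slides)."""
    return table.get(attribute, {}).get(entity, "MAYBE")
```

For example, `answer("Flies?", "Tiger")` returns `"NO"`, while an unseen pair falls back to `"MAYBE"`.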
Mutual Information

The information (i.e. uncertainty, or surprisal) of an event ω with probability p(ω) can be expressed in bits as

  I(ω) = -log2 p(ω)

Entropy is the average information content:

  H = -Σ_ω p(ω) log2 p(ω)

Mutual information is the amount of information two events share:

  MI(x, y) = I(x) + I(y) - I(x, y)
           = I(x) + I(y) - [I(x) + I(y|x)]
           = I(y) - I(y|x)
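The three definitions above can be written directly as small functions. This is a sketch for illustration; the function names are my own, and `mutual_info` computes the pointwise mutual information of two events from their marginal and joint probabilities, exactly as MI(x, y) = I(x) + I(y) - I(x, y).

```python
import math

def info(p):
    """Self-information of an event with probability p, in bits:
    I(w) = -log2 p(w)."""
    return -math.log2(p)

def entropy(probs):
    """Average information content of a distribution, in bits:
    H = -sum_w p(w) log2 p(w). Zero-probability outcomes contribute 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def mutual_info(p_x, p_y, p_xy):
    """Pointwise mutual information of events x and y:
    MI(x, y) = I(x) + I(y) - I(x, y)."""
    return info(p_x) + info(p_y) - info(p_xy)
```

As a sanity check, `entropy([0.5, 0.5])` gives 1 bit for a fair coin, and `info(1/32)` gives 5 bits.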
Mutual information example

Given:
  P(A)   = 1/32
  P(B)   = 1/64
  P(B|A) = 1/2

Then:
  I(A)    = -log2(1/32) = 5 bits
  I(B)    = -log2(1/64) = 6 bits
  I(B|A)  = -log2(1/2)  = 1 bit
  I(A,B)  = I(A) + I(B|A) = 6 bits

  MI(A,B) = I(A) + I(B) - [I(A) + I(B|A)]
          = I(B) - I(B|A)
          = 6 - 1 = 5 bits
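The worked example can be checked mechanically. The snippet below recomputes each quantity from the given probabilities (P(A) = 1/32, P(B) = 1/64, P(B|A) = 1/2); the variable names are mine.

```python
import math

# Givens from the slide's example.
p_a, p_b, p_b_given_a = 1/32, 1/64, 1/2

# Joint probability via the chain rule: P(A,B) = P(A) * P(B|A) = 1/64.
p_ab = p_a * p_b_given_a

i_a  = -math.log2(p_a)    # I(A)   = 5 bits
i_b  = -math.log2(p_b)    # I(B)   = 6 bits
i_ab = -math.log2(p_ab)   # I(A,B) = 6 bits

# MI(A,B) = I(A) + I(B) - I(A,B) = 5 + 6 - 6 = 5 bits
mi = i_a + i_b - i_ab
print(mi)  # → 5.0
```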