Information Theory and Games (Ch. 16)
Information Theory

Information theory studies information flow. In this context, information has no intrinsic meaning:
– Information may be partial (e.g., a sound)
– Information measures the degree of uncertainty

Basic model: a sender (1) passes information to a receiver (2). The information gained by the receiver is measured in bits:
– 0 bits: gained no information
– 1 bit: gained the most information a single yes/no answer can carry

Two questions about this model: How much information did the receiver gain? Was there any distortion ("noise") while the information was passed?
Recall: Probability Distribution

The events E1, E2, …, Ek must meet the following conditions:
– Exactly one of them always occurs
– No two can occur at the same time

The probabilities p1, …, pk are numbers associated with these events, such that 0 ≤ pi ≤ 1 and p1 + … + pk = 1.

A probability distribution assigns probabilities to events so that the two properties above hold.
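As a quick aside (not from the slides), the two conditions on the probabilities are easy to check in code. A minimal Python sketch, where the function name and the tolerance are my own illustrative choices:

```python
# Minimal sketch: verify that p1, ..., pk form a probability distribution.
def is_distribution(ps, tol=1e-9):
    in_range = all(0.0 <= p <= 1.0 for p in ps)  # 0 <= pi <= 1
    sums_to_one = abs(sum(ps) - 1.0) <= tol      # p1 + ... + pk = 1
    return in_range and sums_to_one

print(is_distribution([0.5, 0.5]))    # True: a fair coin
print(is_distribution([0.99, 0.01]))  # True: a very unfair coin
print(is_distribution([0.7, 0.7]))    # False: sums to 1.4
```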
Information Gain versus Probability

Suppose that I flip a fair coin:
– What is the probability that it will come up heads? 0.5
– How much information do you gain when it falls? 1 bit

Suppose that I flip a totally unfair coin (it always comes up heads):
– What is the probability that it will come up heads? 1
– How much information do you gain when it falls? 0 bits
Information Gain versus Probability (2)

Suppose that I flip a very unfair coin (99% of the time it comes up heads):
– What is the probability that it will come up heads? 0.99
– How much information do you gain when it falls? A fraction of a bit
Information Gain versus Probability (3)

Imagine a stranger, "JL". Which of the following questions, once answered, will provide more information about JL?
– Did you have breakfast this morning?
– What is your favorite color?

Hints: What are your chances of guessing the answer correctly? What if you knew JL and his preferences?
Information Gain versus Probability (4)

If the probability that an event occurs is high, I gain little information when the event actually occurs. If the probability is small, I gain more information when it occurs. In general, the information provided by an event decreases as the probability of that event increases.

Information gain of an event e (Shannon and Weaver, 1949):

I(e) = log2(1 / p(e))
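To make the formula concrete, here is a minimal Python sketch (my own, standard library only) that evaluates I(e) for the coin examples from the previous slides:

```python
import math

# Information gain of an event with probability p, in bits:
# I(e) = log2(1 / p(e))   (Shannon and Weaver, 1949)
def information(p):
    return math.log2(1.0 / p)

print(information(0.5))   # 1.0 bit      -- fair coin
print(information(1.0))   # 0.0 bits     -- totally unfair coin: no surprise
print(information(0.99))  # ~0.0145 bits -- very unfair coin: a fraction of a bit
```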
Information, Uncertainty, and Meaningful Play

Recall the discussion of the relation between uncertainty and games:
– What happens if there is no uncertainty at all in a game (both at the macro-level and the micro-level)?

What is the relation between uncertainty and information gain? If there is no uncertainty, then information gain is 0. As a result, the player's actions are not meaningful!
Let's Play Twenty Questions

I am thinking of an animal:
– You can ask yes/no questions only

Winning condition:
– You guess the animal correctly after asking 20 questions or fewer, and
– you take no more than 3 attempts at guessing the right animal.
What is happening? (Constitutive Rules)

We are building a binary decision tree: each node is a question with a "yes" child and a "no" child.

# levels                0     1     2     3
# potential questions   2^0   2^1   2^2   2^3

# questions made = log2(# potential questions)
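Put differently, n equally likely possibilities can be distinguished with ceil(log2 n) yes/no questions. A small illustrative sketch (my own):

```python
import math

# Each yes/no question at most doubles the number of distinguishable answers,
# so n possibilities need ceil(log2(n)) questions in the best case.
def questions_needed(n):
    return math.ceil(math.log2(n))

print(questions_needed(8))      # 3
print(questions_needed(2**20))  # 20: twenty questions cover ~1,048,576 animals
```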
The Same Principle Operates for the Online Version

Game: http://www.20q.net/

So how can this be done? It uses information gain. The system stores a table of example entries (such as x1, x4, x5, … in the table below) and builds a decision tree from it.

[Figure: a table of examples stored in the system, and the decision tree induced from it. Internal nodes test attributes such as Patrons?, WaitEstimate?, Alternate?, Bar?, Fri/Sat?, Hungry?, Reservation?, and Raining?; leaves answer yes/no.]

Nice: the resulting tree is optimal.
Example

Entry  Bar  Fri  Hungry  Patrons  Alt  Type     Wait
x1     no   no   yes     some     yes  French   yes
x4     no   yes  yes     full     yes  Thai     yes
x5     no   yes  no      full     yes  French   no
x6     yes  no   yes     some     no   Italian  yes
x7     yes  no   no      none     no   Burger   no
x8     no   no   yes     some     no   Thai     yes
x9     yes  yes  no      full     no   Burger   no
x10    yes  yes  yes     full     yes  Italian  no
x11    no   no   no      none     no   Thai     no
Expected Information Gain

We are given a probability distribution:
– The events E1, E2, …, Ek
– The probabilities p1, …, pk associated with these events

We also have the information gain for each of those events: I(E1), I(E2), …, I(Ek).

The Expected Information Gain (EIG):

EIG = p1 * I(E1) + … + pk * I(Ek)
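A minimal Python sketch of this formula (my own, standard library only). For a probability distribution, the EIG is exactly Shannon's entropy:

```python
import math
from collections import Counter

# EIG = p1*I(E1) + ... + pk*I(Ek), with I(E) = log2(1/p(E)).
# Terms with p = 0 contribute nothing, so they are skipped.
def expected_information_gain(ps):
    return sum(p * math.log2(1.0 / p) for p in ps if p > 0)

# Convenience wrapper: EIG of a list of observed outcomes.
def eig_of(outcomes):
    counts = Counter(outcomes)
    return expected_information_gain([c / len(outcomes) for c in counts.values()])

print(expected_information_gain([0.5, 0.5]))    # 1.0 bit: fair coin
print(expected_information_gain([0.99, 0.01]))  # ~0.081 bits: very unfair coin
print(eig_of(["yes", "yes", "no", "no"]))       # 1.0 bit: evenly split answers
```

The tree builder asks, at each node, the question whose answers are expected to reduce this uncertainty the most.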
Decision Tree

Obtained using expected information gain. In this example it has the minimum height, which is nice (why?).

Patrons?
├─ none → No
├─ some → Yes
└─ full → Hungry?
    ├─ no  → No
    └─ yes → Type?
        ├─ French  → Yes
        ├─ Italian → No
        ├─ Thai    → Fri/Sat?
        │   ├─ no  → No
        │   └─ yes → Yes
        └─ Burger  → Yes
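As an illustration (my own encoding, not from the slides), the tree above reads directly as nested conditionals:

```python
# The induced decision tree as a plain function (attribute values as strings).
def will_wait(patrons, hungry, food_type, fri_sat):
    if patrons == "none":
        return False
    if patrons == "some":
        return True
    # patrons == "full": fall through to the Hungry? test
    if hungry == "no":
        return False
    if food_type == "french":
        return True
    if food_type == "italian":
        return False
    if food_type == "thai":
        return fri_sat == "yes"
    return True  # burger

print(will_wait("full", "yes", "thai", "yes"))   # True
print(will_wait("none", "yes", "french", "no"))  # False
```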
Noise and Redundancy

Noise: distortion in the communication from component to component.
– Example in a game?

Redundancy: a counterbalance to noise. Passing the same information through two or more different channels helps make sure it is communicated properly.
– Example in a game?

Balancing act: noise versus redundancy.
– Too much noise: the signal might be lost
– Too little redundancy: the signal might be lost

Charades: playing with noise. Crossword puzzles: playing with redundancy. Other examples?
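To make redundancy concrete, here is a toy illustration (my own, not from the chapter): a repetition code sends each bit three times, and a majority vote on the receiving end corrects any single flipped copy.

```python
from collections import Counter

# Redundancy in miniature: repeat each bit 3 times; the receiver takes a
# majority vote per chunk, so one flipped copy ("noise") is corrected.
def encode(bits, copies=3):
    return [b for b in bits for _ in range(copies)]

def decode(received, copies=3):
    chunks = [received[i:i + copies] for i in range(0, len(received), copies)]
    return [Counter(chunk).most_common(1)[0][0] for chunk in chunks]

sent = encode([1, 0, 1])  # [1, 1, 1, 0, 0, 0, 1, 1, 1]
sent[4] = 1               # noise flips one copy of the middle bit
print(decode(sent))       # [1, 0, 1]: the message still gets through
```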