Information Theory and Games (Ch. 16)

Information Theory
Information theory studies the flow of information between the components of a system. In this context, information carries no meaning:
–as opposed to everyday usage, and
–as opposed to semiotics.
When component 1 passes information to component 2, we ask:
–How much information did component 2 gain?
–Was there any distortion ("noise") while the information was passed?
The measure of information gained from a single binary event is a number in the [0, 1] range:
–0 bits: gained no information
–1 bit: gained the most information

Recall: Probability Distribution
The events E1, E2, …, Ek must meet the following conditions:
–One of them always occurs
–No two can occur at the same time
The probabilities p1, …, pk are numbers associated with these events, such that 0 ≤ pi ≤ 1 and p1 + … + pk = 1.
A probability distribution assigns probabilities to events such that the two properties above hold.
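The two conditions above can be checked mechanically. A minimal sketch in Python (the function name and tolerance are illustrative choices, not from the slides):

```python
import math

def is_probability_distribution(probs, tol=1e-9):
    """Return True if every p is in [0, 1] and the probabilities sum to 1."""
    if any(p < 0 or p > 1 for p in probs):
        return False
    return math.isclose(sum(probs), 1.0, abs_tol=tol)

print(is_probability_distribution([0.5, 0.5]))    # fair coin -> True
print(is_probability_distribution([0.99, 0.01]))  # very unfair coin -> True
print(is_probability_distribution([0.7, 0.7]))    # sums to 1.4 -> False
```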

Information Gain versus Probability
Suppose that I flip a "fair" coin:
–What is the probability that it will come up heads? 0.5
–How much information do you gain when it falls? 1 bit
Suppose that I flip a "totally unfair" coin (it always comes up heads):
–What is the probability that it will come up heads? 1
–How much information do you gain when it falls? 0 bits

Information Gain versus Probability (2)
Suppose that I flip a "very unfair" coin (99% of the time it comes up heads):
–What is the probability that it will come up heads? 0.99
–How much information do you gain when it falls? A fraction of a bit
Imagine a stranger, "JL". Which of the following questions, once answered, will provide more information about JL?
–Did you have breakfast this morning?
–What is your favorite color?

Information Gain versus Probability (3)
If the probability that an event occurs is high, I gain little information when the event actually occurs.
If the probability that an event occurs is low, I gain more information when the event actually occurs.
In general, the information provided by an event decreases as the probability of that event increases. The information gain of an event e (Shannon and Weaver, 1949):
I(e) = log2(1/p(e))
Don't be scared. We'll come back to this soon.
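The formula I(e) = log2(1/p(e)) can be applied directly to the three coins above. A small sketch (the helper name is an illustrative choice):

```python
import math

def information_gain(p):
    """Bits of information gained when an event of probability p occurs."""
    return math.log2(1.0 / p)

print(information_gain(0.5))   # fair coin: 1.0 bit
print(information_gain(1.0))   # totally unfair coin: 0.0 bits
print(information_gain(0.99))  # very unfair coin: ~0.0145 bits, a fraction of a bit
```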

Information, Uncertainty, and Meaningful Play
Recall the discussion of the relation between uncertainty and games:
–What happens if there is no uncertainty at all in a game (both at the macro-level and the micro-level)?
What is the relation between uncertainty and information gain?
–If there is no uncertainty, the information gained is 0.
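The claim "no uncertainty means no information gained" can be made concrete with Shannon entropy, the expected information gain of a distribution, H = Σ pi · log2(1/pi). This is an assumption-labeled sketch; entropy is not yet introduced in the slides themselves:

```python
import math

def entropy(probs):
    """Expected information gain (in bits) over a probability distribution."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

print(entropy([1.0]))       # a certain outcome: 0.0 bits gained
print(entropy([0.5, 0.5]))  # fair coin: 1.0 bit, maximal for a binary event
```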

Let's Play Twenty Questions
I am thinking of an animal:
–You can ask "yes/no" questions only
Winning condition:
–You guess the animal correctly after asking 20 questions or fewer, and
–you don't make more than 3 attempts to guess the right animal

What is Happening? (Constitutive Rules)
We are building a binary decision tree: each question is a node with two children, "yes" and "no".
A tree with a given number of levels has up to #nodes leaves at the bottom, and the number of questions asked equals the number of levels, which is log2(#nodes).
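The log2 relation above can be sketched numerically: k yes/no questions distinguish up to 2^k animals, so n animals need ceil(log2(n)) questions. The animal list is an illustrative placeholder:

```python
import math

animals = ["ant", "bat", "cat", "dog", "eel", "fox", "gnu", "owl"]

# Each yes/no question halves the candidates, so 8 animals need log2(8) = 3 questions.
questions_needed = math.ceil(math.log2(len(animals)))
print(questions_needed)  # 3

# Twenty questions can therefore distinguish up to 2**20 animals:
print(2 ** 20)  # 1048576
```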

Noise and Redundancy
Noise: distortion in the communication between one component and another.
–Example in a game?
Redundancy: a counterbalance to noise.
–Passing the same information through two or more different channels, to make sure it is communicated properly.
–Example in a game?
A balancing act:
–Too much structure, and the game becomes overdetermined.
–Too little structure, and the game becomes chaos.
Charades: playing with noise. Crossword puzzles: playing with redundancy.
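The noise/redundancy trade-off can be sketched as a toy channel: each bit is flipped with some probability, and redundancy (sending each bit several times, then taking a majority vote) recovers the message. All names here are illustrative, not from the slides:

```python
import random

def noisy_channel(bits, flip_prob, rng):
    """Flip each bit independently with probability flip_prob."""
    return [b ^ 1 if rng.random() < flip_prob else b for b in bits]

def send_with_redundancy(bits, flip_prob, rng, copies=3):
    """Send each bit `copies` times and decode by majority vote."""
    decoded = []
    for b in bits:
        received = noisy_channel([b] * copies, flip_prob, rng)
        decoded.append(1 if sum(received) > copies // 2 else 0)
    return decoded

rng = random.Random(0)
message = [1, 0, 1, 1, 0, 0, 1, 0]
print(send_with_redundancy(message, 0.1, rng))
```

With three copies per bit, a single flipped copy is outvoted, so moderate noise rarely corrupts the decoded message.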

Open Discussion
The course so far:
–This is the first time the course has been taught.
–What would you change? What would you keep?
Remember: I never remember your names!