Unbounded Knowledge Acquisition Based Upon Mutual Information in Dependent Questions
Tony C. Smith & Chris van de Molen
Department of Computer Science, Waikato University


Unbounded Knowledge Acquisition Based Upon Mutual Information in Dependent Questions
Tony C. Smith & Chris van de Molen
Department of Computer Science, Waikato University
AI 2010

Outline
 Motivation/background
 Representations of knowledge

Truth table: attribute vs. entity

              Fruitbat  Eagle  Tiger  Rock  …
Is alive?        T        T      T     F
Flies?           T        T      F     F
Lays eggs?       F        T      F     F
…
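As a rough sketch (the entity and question names follow the slide's example; the query helper is illustrative, not part of the authors' system), the truth-table representation can be held as a nested dict and queried to narrow down candidate entities:

```python
# Attribute-vs-entity truth table from the slide's example.
ENTITIES = ["fruitbat", "eagle", "tiger", "rock"]
TRUTH_TABLE = {
    "is alive?":  {"fruitbat": True,  "eagle": True, "tiger": True,  "rock": False},
    "flies?":     {"fruitbat": True,  "eagle": True, "tiger": False, "rock": False},
    "lays eggs?": {"fruitbat": False, "eagle": True, "tiger": False, "rock": False},
}

def matches(answers):
    """Return the entities consistent with a dict of question -> answer."""
    return [e for e in ENTITIES
            if all(TRUTH_TABLE[q][e] == a for q, a in answers.items())]

print(matches({"is alive?": True, "flies?": False}))  # → ['tiger']
```

Each answered question eliminates the entities whose column disagrees with the answer, which is the sense in which dependent questions carry information about one another.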

Multivalued truth table: attribute vs. entity
Same attributes (Is alive?, Flies?, Lays eggs?, …) and entities (Fruitbat, Eagle, Tiger, Rock, …), but each cell holds a graded answer rather than a boolean: YES, NO, SOMETIMES, USUALLY, MAYBE, SELDOM, RARELY, etc.
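Replacing the booleans with graded answer strings gives the multivalued variant. A minimal sketch follows; the particular cell values are made up for illustration, only the answer vocabulary comes from the slide:

```python
# Hypothetical graded answers (cell values are illustrative only).
MULTI_TABLE = {
    "is alive?":  {"fruitbat": "YES",     "eagle": "YES", "tiger": "YES", "rock": "NO"},
    "flies?":     {"fruitbat": "USUALLY", "eagle": "YES", "tiger": "NO",  "rock": "NO"},
    "lays eggs?": {"fruitbat": "NO",      "eagle": "YES", "tiger": "NO",  "rock": "NO"},
}
ANSWER_VALUES = ("YES", "NO", "SOMETIMES", "USUALLY", "MAYBE", "SELDOM", "RARELY")

# Every cell must hold one of the allowed graded values.
assert all(v in ANSWER_VALUES
           for row in MULTI_TABLE.values() for v in row.values())
```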

Mutual Information
 A measure of the information (i.e. uncertainty) in an event ω whose probability is p(ω) can be expressed in bits as
   I(ω) = −log₂ p(ω)
 Entropy is the average information content over all outcomes:
   H = −Σ_ω p(ω) log₂ p(ω)
 Mutual information is the amount of information two events share:
   MI(X, Y) = I(x) + I(y) − I(x, y)
   where the joint information I(x, y) = I(x) + I(y|x)
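These three definitions can be sketched directly in code (a minimal illustration, not the authors' implementation; the function names are ours):

```python
import math

def info(p):
    """Self-information in bits of an event with probability p."""
    return -math.log2(p)

def entropy(dist):
    """Average information content (bits) of a probability distribution."""
    return sum(p * info(p) for p in dist if p > 0)

def mutual_info(p_x, p_y, p_y_given_x):
    """MI(X,Y) = I(x) + I(y) - I(x,y), with joint I(x,y) = I(x) + I(y|x)."""
    i_joint = info(p_x) + info(p_y_given_x)
    return info(p_x) + info(p_y) - i_joint

print(entropy([0.5, 0.5]))  # fair coin → 1.0 bit
```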

Mutual information example
Given:  P(A) = 1/32   P(B) = 1/64   P(B|A) = 1/2   P(A|B) = 1/4
Then:   I(A) = 5 bits   I(B) = 6 bits   I(B|A) = 1 bit
Joint:  I(A, B) = I(A) + I(B|A) = 5 + 1 = 6 bits
So:     MI(A, B) = I(A) + I(B) − I(A, B) = 5 + 6 − 6 = 5 bits
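The arithmetic in this example can be checked directly; a sketch, computing the joint via the P(B|A) chain, since that is what the joint decomposition uses:

```python
import math

p_a, p_b, p_b_given_a = 1/32, 1/64, 1/2

i_a = -math.log2(p_a)                  # I(A) = 5 bits
i_b = -math.log2(p_b)                  # I(B) = 6 bits
i_b_given_a = -math.log2(p_b_given_a)  # I(B|A) = 1 bit
i_ab = i_a + i_b_given_a               # joint information I(A,B) = 6 bits
mi = i_a + i_b - i_ab                  # MI(A,B) = 5 bits
print(i_a, i_b, i_ab, mi)  # → 5.0 6.0 6.0 5.0
```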