CS 8520: Artificial Intelligence. Resolution and Bayes' Examples. Paula Matuszek, Spring 2010.

Presentation transcript:

1. CS 8520: Artificial Intelligence. Resolution and Bayes' Examples. Paula Matuszek, Spring 2010

2. To resolve a pair of clauses which are entirely disjuncts (ORs) of terms, you find a term which appears as a positive term in one and as a negative term in the other, and produce a new clause which contains all of the terms of both except that pair, which cancel each other. That new clause is the resolvent, and showing all possible resolutions involves finding each resolvent. The idea is that if we assume that both of the following statements are true:
dog ∨ cat
¬dog ∨ horse
then (dog ∨ cat) ∧ (¬dog ∨ horse) ⇒ (cat ∨ horse).
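A minimal sketch of this resolution step, assuming a representation of clauses as Python sets of literal strings (the resolve and negate helpers are illustrative names of my own, not from the slides):

```python
# One propositional resolution step on clauses represented as sets of literals.
# A literal is a string; its negation is written with a leading "~".

def negate(literal):
    """Return the complementary literal, e.g. 'dog' <-> '~dog'."""
    return literal[1:] if literal.startswith("~") else "~" + literal

def resolve(clause1, clause2, literal):
    """Resolve on `literal`, which appears positively in clause1 and
    negatively in clause2; keep all the remaining literals of both."""
    assert literal in clause1 and negate(literal) in clause2
    return (clause1 - {literal}) | (clause2 - {negate(literal)})

# The slide's example: (dog ∨ cat) and (¬dog ∨ horse) resolve to (cat ∨ horse).
print(resolve({"dog", "cat"}, {"~dog", "horse"}, "dog"))  # {'cat', 'horse'}
```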

3. Truth table for the example, where A = (dog ∨ cat) ∧ (¬dog ∨ horse) and B = cat ∨ horse. The final column shows that A ⇒ B is true in every row:

dog | ¬dog | cat | horse | dog∨cat | ¬dog∨horse | A | B | A⇒B
 T  |  F   |  T  |   T   |    T    |     T      | T | T |  T
 T  |  F   |  T  |   F   |    T    |     F      | F | T |  T
 T  |  F   |  F  |   T   |    T    |     T      | T | T |  T
 T  |  F   |  F  |   F   |    T    |     F      | F | F |  T
 F  |  T   |  T  |   T   |    T    |     T      | T | T |  T
 F  |  T   |  T  |   F   |    T    |     T      | T | T |  T
 F  |  T   |  F  |   T   |    F    |     T      | F | T |  T
 F  |  T   |  F  |   F   |    F    |     T      | F | F |  T
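The same entailment can be checked mechanically by enumerating all eight truth assignments, as the table does; a small sketch (my addition, using Python's standard itertools):

```python
from itertools import product

# Verify that (dog ∨ cat) ∧ (¬dog ∨ horse) implies (cat ∨ horse)
# in every one of the eight rows of the truth table above.
for dog, cat, horse in product([True, False], repeat=3):
    premises = (dog or cat) and ((not dog) or horse)
    conclusion = cat or horse
    assert (not premises) or conclusion  # the implication holds in this row
print("((dog ∨ cat) ∧ (¬dog ∨ horse)) ⇒ (cat ∨ horse) holds in all 8 rows")
```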

4. Midterm question: Show all the possible resolutions for the following pairs of clauses:

delicious
¬delicious ∨ anchovies
The only term we can resolve on is delicious, so the resulting clause is anchovies.

delicious ∨ anchovies
¬delicious ∨ ¬anchovies
We can resolve on either delicious or anchovies (but not both at once), so we get either delicious ∨ ¬delicious (which is trivially true) or anchovies ∨ ¬anchovies (ditto).

¬X ∨ Y
X ∨ ¬Y ∨ Z
We can resolve on X or on Y, giving us either Y ∨ ¬Y ∨ Z or X ∨ ¬X ∨ Z (also both trivially true).
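The same answers can be produced programmatically. This sketch enumerates every resolvent of a clause pair and flags the trivially true ones; the helper names and the set-of-strings clause representation are my own illustrative choices, not from the course:

```python
# Enumerate all possible resolvents of each midterm clause pair.
# Clauses are sets of literal strings; "~" marks negation.

def negate(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def all_resolvents(c1, c2):
    """Yield one resolvent for each literal of c1 whose complement is in c2."""
    for lit in c1:
        if negate(lit) in c2:
            yield (c1 - {lit}) | (c2 - {negate(lit)})

def is_tautology(clause):
    """A clause containing a literal and its negation is trivially true."""
    return any(negate(lit) in clause for lit in clause)

pairs = [
    ({"delicious"}, {"~delicious", "anchovies"}),
    ({"delicious", "anchovies"}, {"~delicious", "~anchovies"}),
    ({"~X", "Y"}, {"X", "~Y", "Z"}),
]
for c1, c2 in pairs:
    for r in all_resolvents(c1, c2):
        note = " (trivially true)" if is_tautology(r) else ""
        print(sorted(r), note, sep="")
```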

5. Bayes' Theorem
Bayes' Theorem, as Eric discussed, is a way of determining the conditional probability of A given B (written A|B). In other words, if we know B has happened, how likely is A? In order to compute the conditional probability of A|B we need to know three other probabilities, called the priors:
– The probability of A (without any other information)
– The probability of B (without any other information)
– The conditional probability of B given A

6. Bayes' Theorem
[Figure: P(A|B) = P(B|A) × P(A) / P(B)]

7. Bayes Example
Your child has a rash. How likely is it that he has chicken pox? In other words, what is the probability of chicken pox | rash?
– Chicken pox is going around your child's class: p(chicken pox) = .3
– Overall in your child's class, about half of the children have a rash from something: p(rash) = .5
– Almost all kids with chicken pox get a rash: p(rash | chicken pox) = .99
So p(chicken pox | rash) = (.99 × .3) / .5 = .594. In other words, there's about a 60% chance that if your child has a rash, it's chicken pox.
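The arithmetic can be double-checked with a short sketch; the bayes helper below is my own wrapper around the formula on the previous slide, not part of the course materials:

```python
# Check the chicken pox example: P(A|B) = P(B|A) * P(A) / P(B).

def bayes(p_b_given_a, p_a, p_b):
    """Posterior probability of A given B from the three known quantities."""
    return p_b_given_a * p_a / p_b

p_pox_given_rash = bayes(p_b_given_a=0.99,  # p(rash | chicken pox)
                         p_a=0.3,           # p(chicken pox)
                         p_b=0.5)           # p(rash)
print(round(p_pox_given_rash, 3))  # 0.594, i.e. roughly a 60% chance
```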

8. Thinking about Bayes'
What this is basically saying is:
– Two things make the probability of A|B higher:
  – A high overall (or prior) probability of A (there is a lot of chicken pox around)
  – A high prior probability of B|A (usually children with chicken pox get a rash)
– One thing makes the probability of A|B lower:
  – A high prior probability of B (there are a lot of rashes around)
In other words, p(B) is giving us a way to normalize the other probabilities, so that we don't decide that chicken pox is likely based on a rash if everyone has a rash. Consider what the values would be if instead of "rash" we looked at "chicken pox | drinks milk". Such a high proportion of children drink milk that drinking milk really has no predictive value for determining whether a child has chicken pox.
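To make the milk comparison concrete, here is a sketch with assumed numbers (p(drinks milk) = 0.95 and p(drinks milk | chicken pox) = 0.95 are illustrative values of my own, not from the slides):

```python
# Evidence that is common to everyone does not move the posterior:
# with the assumed milk probabilities, p(pox | milk) equals the prior.

def bayes(p_b_given_a, p_a, p_b):
    return p_b_given_a * p_a / p_b

p_pox = 0.3                       # prior from the chicken pox example
rash = bayes(0.99, p_pox, 0.5)    # rash: evidence with real predictive value
milk = bayes(0.95, p_pox, 0.95)   # assumed: nearly all children drink milk,
                                  # with or without chicken pox
print(round(rash, 3))   # 0.594 -- a rash roughly doubles the probability
print(round(milk, 3))   # 0.3   -- drinking milk leaves the prior unchanged
```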