CSCI 5582 Artificial Intelligence, Fall 2006. Lecture 12. Jim Martin.

Today 10/12
– Review
– Basic probability
– Break
– Belief Networks

Review
Where we are…
– Agents can use search to find useful actions based on looking into the future
– Agents can use logic to complement search, to represent and reason about:
  – Unseen parts of the current environment
  – Past environments
  – Future environments
– And they can play a mean game of chess

Where we aren't
Agents can't:
– Deal well with uncertain situations (not clear people are all that great at this)
– Learn
– See, speak, hear, move, or feel

Exercise
You go to the doctor, and for insurance reasons they perform a test for a horrible disease.
You test positive.
The doctor says the test is 99% accurate.
Do you worry?

An Exercise
It depends; let's say…
– The disease occurs in 1 in 10,000 folks
– The 99% means that 99 times out of 100, the test returns negative when given to someone without the disease
– When you have the disease, the test always says you are positive
– Do you worry?

An Exercise
The test's false positive rate is 1/100.
Only 1/10000 people have the disease.
If you gave the test to 10,000 random people you would have (roughly):
– 100 false positives
– 1 true positive
Do you worry?

An Exercise
Do you worry?
– Yes, I always worry
– Yes, my chances of having the disease are 100x what they were before I went to the doctor (went from 1/10000 to roughly 1/100)
– No, I live with a lot of other 1/100 bad things without worrying
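
For concreteness, here is a small Python sketch of that calculation, using only the numbers given on the slides (prevalence 1/10000, false positive rate 1/100, and a test that never misses a true case):

```python
# Posterior probability of disease given a positive test, via Bayes' rule,
# using the numbers from the exercise above.
prior = 1 / 10_000             # P(disease): 1 in 10,000 people have it
sensitivity = 1.0              # P(positive | disease): the test never misses
false_positive_rate = 1 / 100  # P(positive | no disease)

# P(positive): the two ways to test positive (with or without the disease)
p_positive = sensitivity * prior + false_positive_rate * (1 - prior)

# P(disease | positive) = P(positive | disease) * P(disease) / P(positive)
posterior = sensitivity * prior / p_positive
print(f"P(disease | positive) = {posterior:.4f}")  # about 0.0099, roughly 1 in 100
```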

Another Example
You hear on the news…
– People who attend grad school to get a master's degree have a 10x increased chance of contracting schistosomiasis
Do you worry?
– Depends on where you go to grad school

Back to Basics
Prior (or unconditional) probability
– Written as P(A)
– For now, think of A as a proposition that can turn out to be True or False
– P(A) is your belief that A is true given that you know nothing else relevant to A

Also
Just as with logic, we can create complex sentences with a partially compositional semantics (sort of)…

Basics
Conditional (or posterior) probabilities
– Written as P(A|B)
– Pronounced as "the probability of A given B"
– Think of it as your belief in A given that you know absolutely that B is true

And
P(A|B)… your belief in A given that you know B is true AND B is all you know that is relevant to A

Conditionals Defined
Conditionals: P(A|B) = P(A^B) / P(B)
Rearranging: P(A^B) = P(A|B) P(B)
And also: P(A^B) = P(B|A) P(A)
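
A tiny numerical check of these identities; the values of P(A^B), P(A), and P(B) below are made up for illustration, not taken from the lecture:

```python
# Made-up values, just to exercise the identities above.
p_a_and_b = 0.06   # P(A ^ B)
p_a = 0.10         # P(A)
p_b = 0.20         # P(B)

p_a_given_b = p_a_and_b / p_b   # P(A|B) = P(A^B) / P(B)  -> 0.3
p_b_given_a = p_a_and_b / p_a   # P(B|A) = P(A^B) / P(A)  -> 0.6

# Rearranging recovers the joint either way (the product rule).
assert abs(p_a_given_b * p_b - p_a_and_b) < 1e-12
assert abs(p_b_given_a * p_a - p_a_and_b) < 1e-12
```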

Inference
Inference means updating your beliefs as evidence comes in
– P(A)… belief in A given that you know nothing else of relevance
– P(A|B)… belief in A once you know B and nothing else relevant
– P(A|B^C)… belief in A once you know B and C and nothing else relevant

Also
What you'd expect… we can have P(A|B^C), or P(A^D|E), or P(A^B|C^D), etc.

Joint Semantics
Joint probability distribution… the equivalent of truth tables in logic
– Given a complete truth table, you can answer any question you want
– Given the joint probability distribution over N variables, you can answer any question you might want to ask that involves those variables

Joint Semantics
With logic you don't always need the whole truth table; you can use inference methods and compositional semantics
– I.e., if I know the truth values for A and B, I can retrieve the value of A^B
With probability, you need the joint to do inference unless you're willing to make some assumptions

Joint
                   Toothache = True   Toothache = False
Cavity = True
Cavity = False
What's the probability of having a cavity and a toothache?
What's the probability of having a toothache?
What's the probability of not having a cavity?
What's the probability of having a toothache or a cavity?
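
The table's numeric entries did not survive in this transcript, so the sketch below fills in assumed illustrative values (not the slide's numbers) just to show how each question above is answered by summing entries of the joint:

```python
# Illustrative joint distribution over (Cavity, Toothache).
# NOTE: these numbers are assumed for the example, not taken from the slide.
joint = {
    (True,  True):  0.04,   # P(Cavity ^ Toothache)
    (True,  False): 0.06,   # P(Cavity ^ ¬Toothache)
    (False, True):  0.01,   # P(¬Cavity ^ Toothache)
    (False, False): 0.89,   # P(¬Cavity ^ ¬Toothache)
}

def p(event):
    """Sum the joint entries for which the event holds."""
    return sum(pr for (cavity, toothache), pr in joint.items()
               if event(cavity, toothache))

print(p(lambda c, t: c and t))   # cavity and toothache  -> 0.04
print(p(lambda c, t: t))         # toothache             -> 0.05
print(p(lambda c, t: not c))     # not cavity            -> 0.90
print(p(lambda c, t: c or t))    # toothache or cavity   -> 0.11
```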

Note
Adding up across a row is really a form of reasoning by cases…
Consider calculating P(Cavity)…
– We know that in this world you either have a toothache or you don't, i.e., toothaches partition the world.
– So…

Partitioning
P(Cavity) = P(Cavity ^ Toothache) + P(Cavity ^ ¬Toothache)

Combining Evidence
Suppose you know the values for
– P(A|B) = 0.2
– P(A|C) = 0.05
– Then you learn B is true. What's your belief in A?
– Then you learn C is true. What's your belief in A?

Combining Evidence

Details…
Where do all the numbers come from?
– Mostly counting
– Sometimes theory
– Sometimes guessing
– Sometimes all of the above

Numbers
– P(A)
– P(A^B)
– P(A|B)
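
Per the previous slide, these are usually estimated by counting; a minimal sketch with a made-up dataset (the relative-frequency estimators are the standard ones, not taken verbatim from the slide):

```python
# Estimate P(A), P(A^B), and P(A|B) by counting in a (made-up) dataset.
data = [
    {"A": True,  "B": True},
    {"A": True,  "B": False},
    {"A": False, "B": True},
    {"A": False, "B": False},
    {"A": True,  "B": True},
]

n = len(data)
count_a = sum(1 for x in data if x["A"])
count_b = sum(1 for x in data if x["B"])
count_a_and_b = sum(1 for x in data if x["A"] and x["B"])

p_a = count_a / n                      # P(A)   ~ Count(A) / N
p_a_and_b = count_a_and_b / n          # P(A^B) ~ Count(A^B) / N
p_a_given_b = count_a_and_b / count_b  # P(A|B) ~ Count(A^B) / Count(B)

print(p_a, p_a_and_b, p_a_given_b)     # 0.6 0.4 0.666...
```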

Break
HW questions?

Bayes
We know… P(A^B) = P(A|B) P(B) = P(B|A) P(A)
So, rearranging things: P(A|B) = P(B|A) P(A) / P(B)

Bayes
Memorize this:
P(A|B) = P(B|A) P(A) / P(B)
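
As a one-line function, applied to the disease exercise from earlier in the lecture (the function name and structure are mine, not the lecture's):

```python
def bayes(p_b_given_a: float, p_a: float, p_b: float) -> float:
    """Bayes' rule: P(A|B) = P(B|A) * P(A) / P(B)."""
    return p_b_given_a * p_a / p_b

# A = disease, B = positive test, with the numbers from the earlier exercise.
p_positive = 1.0 * (1 / 10_000) + (1 / 100) * (1 - 1 / 10_000)
print(bayes(1.0, 1 / 10_000, p_positive))  # roughly 0.0099
```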

Bayesian Diagnosis
Given a set of symptoms, choose the best disease (the disease most likely to give rise to those symptoms)
– I.e., choose the disease that gives the highest P(Disease|Symptoms) over all possible diseases
But you probably can't assess that directly…
So maximize this: P(Symptoms|Disease) P(Disease) / P(Symptoms)

Meningitis

Differential Diagnosis
Why on earth would anyone know P(S)? And do you need to know it?
– Asking for the most probable disease given some symptoms doesn't entail knowing the probability of the symptoms:
  Argmax_D P(S|D) P(D) / P(S) picks out the same disease as Argmax_D P(S|D) P(D), since P(S) doesn't depend on D
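
A sketch of this argmax with made-up diseases, priors, and likelihoods (none of these numbers come from the lecture):

```python
# Hypothetical priors P(D) and likelihoods P(S|D) for one fixed set of symptoms S.
priors = {"flu": 0.10, "cold": 0.30, "meningitis": 0.0001}
likelihood_of_symptoms = {"flu": 0.50, "cold": 0.20, "meningitis": 0.90}

# Differential diagnosis: argmax over D of P(S|D) * P(D).
# P(S) is not needed: it is the same constant for every candidate disease.
best = max(priors, key=lambda d: likelihood_of_symptoms[d] * priors[d])
print(best)  # "cold": 0.20 * 0.30 = 0.06 beats 0.05 (flu) and 0.00009 (meningitis)
```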

Well
What if you needed the exact probability?
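
Presumably the answer involves the normalizer P(S): assuming the candidate diseases are exhaustive and mutually exclusive, P(S) is the sum over D of P(S|D) P(D). A sketch, reusing the made-up numbers from the previous example:

```python
# Same made-up numbers as the previous sketch.
priors = {"flu": 0.10, "cold": 0.30, "meningitis": 0.0001}
likelihood_of_symptoms = {"flu": 0.50, "cold": 0.20, "meningitis": 0.90}

# P(S) = sum over D of P(S|D) * P(D), then normalize to get exact posteriors.
unnormalized = {d: likelihood_of_symptoms[d] * priors[d] for d in priors}
p_s = sum(unnormalized.values())
posterior = {d: v / p_s for d, v in unnormalized.items()}
print(posterior)  # cold ~ 0.545, flu ~ 0.454, meningitis ~ 0.0008
```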

Next Time
Graphical models, or Belief Nets
– Chapter 14
Quiz is postponed one week
– Now on 10/26
– Covers Chapters 7, 8, 9, 13, and 14