Computing probabilities using Expect problem-solving trees: A worked example
Jim Blythe, USC/ISI

Overview
- Assume that some of the responses from the knowledge server or user may be uncertain.
- Use the structure of Expect's problem-solving tree (PS tree) to determine the dependencies among the facts on which the final answer is based.
- Use formal rules to produce a Bayesian belief net based on the PS tree. (This approach is similar to knowledge-based model construction; Wellman et al. 92.)
- Use standard procedures for each special form or primitive method to fill in the conditional probability tables.

Example from the anthrax domain
- Goal: (determine-whether (obj production) (of anthrax) (supports maximum-damage-in-battlefield))
- Two subgoals: determine the ease of dispensing and the strength in the environment. Both depend on whether or not the agent is a dry agent.
- Assume it is uncertain whether the agent is dry, and that both subgoals are also uncertain.

Sketch of problem-solving tree
Expect builds a PS tree by examining every possible case. A sketch of the tree for this example:
- Production supports the objective if it is both easy to dispense and strong in the environment.
  - To decide if it's easy to dispense, decide if it's a dry or wet agent:
    - if it's dry, use procedure A;
    - if it's wet, use procedure B.
  - To decide if it's strong in the environment, decide if it's a dry or wet agent:
    - ...
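For concreteness, the sketch below shows one way such a PS tree could be written down in code. This is a minimal illustrative sketch only: PSNode and all of its fields are assumed names, not Expect's actual data structures. Note that the "is agent dry?" subgoal is shared by both branches, which is exactly the shared dependency the belief net will capture.

```python
# Minimal sketch of the example PS tree. PSNode and its fields are
# illustrative stand-ins, not Expect's actual data structures.
from dataclasses import dataclass, field

@dataclass
class PSNode:
    name: str                       # enode label, e.g. "PS-en4"
    kind: str                       # "and", "is-equal", "if", "estimate", ...
    children: list = field(default_factory=list)
    method_expansion: bool = False  # thick line: node expands its parent's method
    uncertain: bool = False         # red border: value may be uncertain

is_dry = PSNode("PS-en9", "estimate", uncertain=True)   # shared subgoal

ease = PSNode("PS-en4", "is-equal", [                   # ease == easy?
    PSNode("PS-en6", "estimate", [                      # estimate ease
        PSNode("PS-en8", "if", [                        # dry agent or wet agent?
            is_dry,
            PSNode("PS-en11", "estimate", uncertain=True),  # ease if dry
            PSNode("PS-en10", "estimate", uncertain=True),  # ease if wet
        ], method_expansion=True)])])

strength = PSNode("PS-en5", "is-equal", [               # strength == strong?
    PSNode("PS-en15", "estimate", [
        PSNode("PS-en16", "if", [
            is_dry,
            PSNode("PS-en18", "estimate", uncertain=True),  # strength if dry
            PSNode("PS-en19", "estimate", uncertain=True),  # strength if wet
        ], method_expansion=True)])])

root = PSNode("PS-en1", "determine-whether",
              [PSNode("PS-en3", "and", [ease, strength], method_expansion=True)])
```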

Start with the Expect PS tree (only showing enodes)
[Figure: the PS tree for the example. Thick lines denote method expansion; red borders mean the value can be uncertain.]

PS-en1 Determine-whether
└─ PS-en3 AND .. (method expansion)
   ├─ PS-en4 Is-equal ease to easy
   │  └─ PS-en6 Estimate ease
   │     └─ PS-en8 If agent is dry, then .. else .. (method expansion)
   │        ├─ PS-en9 Is agent dry?
   │        ├─ PS-en11 Estimate for dry
   │        └─ PS-en10 Estimate for wet
   └─ PS-en5 Is-equal strength to strong
      └─ PS-en15 Estimate strength
         └─ PS-en16 If agent is dry, then .. else .. (method expansion)
            ├─ PS-en9 Is agent dry? (shared)
            ├─ PS-en18 Estimate for dry
            └─ PS-en19 Estimate for wet

Bayes net creation, step 1: merge enodes that are method expansions
Method-expansion nodes have the same value as their parents, so they can be excluded from the final belief net. This leaves:

PS-en1/3 AND ..
├─ PS-en4 Is-equal ease to easy
│  └─ PS-en6/8 If agent is dry, then .. else ..
│     ├─ PS-en9 Is agent dry?
│     ├─ PS-en11 Estimate for dry
│     └─ PS-en10 Estimate for wet
└─ PS-en5 Is-equal strength to strong
   └─ PS-en15/16 If agent is dry, then .. else ..
      ├─ PS-en9 Is agent dry? (shared)
      ├─ PS-en18 Estimate for dry
      └─ PS-en19 Estimate for wet
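A minimal sketch of this merge step, assuming a PSNode record like the one shown earlier. The function name merge_method_expansions and the "/"-joined naming are illustrative, chosen to match the PS-en1/3 labels in the figure:

```python
from dataclasses import dataclass, field

@dataclass
class PSNode:
    name: str
    kind: str
    children: list = field(default_factory=list)
    method_expansion: bool = False

def merge_method_expansions(node):
    """Splice out method-expansion children: they carry the same value as
    their parent, so the parent absorbs their method and children."""
    merged = []
    for child in node.children:
        merge_method_expansions(child)
        if child.method_expansion:
            node.name += "/" + child.name.removeprefix("PS-en")  # e.g. PS-en1/3
            node.kind = child.kind        # parent takes over the expanded method
            merged.extend(child.children)
        else:
            merged.append(child)
    node.children = merged

root = PSNode("PS-en1", "determine-whether",
              [PSNode("PS-en3", "and", method_expansion=True)])
merge_method_expansions(root)
print(root.name, root.kind)   # PS-en1/3 and
```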

Step 2: fill possible values from child result types and method definitions
For this example, each node has a small number of possible values:
- PS-en1/3 (AND ..): true, false
- PS-en4 (Is-equal ease to easy): true, false
- PS-en5 (Is-equal strength to strong): true, false
- PS-en6/8 (If agent is dry ..): easy, hard
- PS-en15/16 (If agent is dry ..): strong, weak
- PS-en9 (Is agent dry?): true, false
- PS-en11, PS-en10 (Estimate for dry / for wet): easy, hard
- PS-en18, PS-en19 (Estimate for dry / for wet): strong, weak
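Read as code, step 2 amounts to attaching a value domain to each merged node. The dictionary below simply restates the table above; the variable name `domains` is illustrative:

```python
# Step 2 (sketch): possible values per node, read off the child result
# types and method definitions. Node names follow the merged net above.
domains = {
    "PS-en1/3":   ["true", "false"],    # AND
    "PS-en4":     ["true", "false"],    # ease == easy?
    "PS-en5":     ["true", "false"],    # strength == strong?
    "PS-en6/8":   ["easy", "hard"],     # estimated ease
    "PS-en15/16": ["strong", "weak"],   # estimated strength
    "PS-en9":     ["true", "false"],    # is agent dry?
    "PS-en11":    ["easy", "hard"],     # ease if dry
    "PS-en10":    ["easy", "hard"],     # ease if wet
    "PS-en18":    ["strong", "weak"],   # strength if dry
    "PS-en19":    ["strong", "weak"],   # strength if wet
}
```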

Step 3: fill conditional probability tables from templates for special forms and primitives
[Figure: the net from step 1, with each node's CPT filled in from the template for its special form or primitive.] For example:
- (and) template for PS-en1/3: P(true | En4, En5) = 1 if En4 = true and En5 = true, and 0 otherwise.
- (is-equal) template for PS-en5: P(true | En15 = strong) = 1, P(true | En15 = weak) = 0.
- (if) templates for PS-en6/8 and PS-en15/16: copy the value of the dry-branch estimate when the agent is dry, and the wet-branch estimate otherwise.
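A sketch of what such template CPTs could look like, assuming values are strings and a CPT maps a tuple of parent values to a distribution over the node's values. The function names (cpt_and, cpt_is_equal, cpt_if) are illustrative, not Expect's:

```python
from itertools import product

def cpt_and(n_parents=2):
    """Template CPT for (and): true iff every parent is true."""
    return {vals: {"true": 1.0 if all(v == "true" for v in vals) else 0.0,
                   "false": 0.0 if all(v == "true" for v in vals) else 1.0}
            for vals in product(["true", "false"], repeat=n_parents)}

def cpt_is_equal(domain, target):
    """Template CPT for (is-equal): true iff the parent equals `target`."""
    return {(v,): {"true": 1.0 if v == target else 0.0,
                   "false": 0.0 if v == target else 1.0} for v in domain}

def cpt_if(branch_domain):
    """Template CPT for (if): copy the then-branch value when the test is
    true, else the else-branch value. Parent order: (test, then, else)."""
    table = {}
    for test, then_v, else_v in product(["true", "false"],
                                        branch_domain, branch_domain):
        picked = then_v if test == "true" else else_v
        table[(test, then_v, else_v)] = {v: 1.0 if v == picked else 0.0
                                         for v in branch_domain}
    return table

# e.g. PS-en4's CPT entry: P(true | PS-en6/8 = easy) = 1
print(cpt_is_equal(["easy", "hard"], "easy")[("easy",)])
# {'true': 1.0, 'false': 0.0}
```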

Step 4: further collapse nodes to give the final belief net
Collapse each parent-child pair where the parent has one child and the child has one parent, multiplying their CPTs. For example, each is-equal node absorbs the if node below it (if * is-equal), giving a CPT for PS-en4 conditioned directly on n9, n10 and n11. The final net:

PS-en1/3 AND .. (true, false)
├─ PS-en4 Is-equal/if (true, false)
│  ├─ PS-en9 Is agent dry? (true, false; shared)
│  ├─ PS-en11 Estimate for dry (easy, hard)
│  └─ PS-en10 Estimate for wet (easy, hard)
└─ PS-en5 Is-equal/if (true, false)
   ├─ PS-en9 Is agent dry? (shared)
   ├─ PS-en18 Estimate for dry (strong, weak)
   └─ PS-en19 Estimate for wet (strong, weak)
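The CPT multiplication itself is just summing out the middle node of the chain. A minimal standalone sketch, assuming CPTs are dicts from parent-value tuples to distributions, as in the earlier sketches:

```python
def collapse_chain(parent_cpt, child_cpt, mid_domain):
    """Splice out a middle node M between parents U... and child C by
    summing it out: P(c | u) = sum over m of P(c | m) * P(m | u)."""
    out = {}
    for u_vals, mid_dist in parent_cpt.items():       # P(M | U...)
        dist = {}
        for m in mid_domain:
            for c, p in child_cpt[(m,)].items():      # P(C | M)
                dist[c] = dist.get(c, 0.0) + p * mid_dist[m]
        out[u_vals] = dist
    return out

# Tiny example: M = estimated ease (the if node), C = the is-equal node.
p_m_given_u = {("true",):  {"easy": 1.0, "hard": 0.0},   # dry: always easy
               ("false",): {"easy": 0.5, "hard": 0.5}}   # wet: 50/50
p_c_given_m = {("easy",): {"true": 1.0, "false": 0.0},
               ("hard",): {"true": 0.0, "false": 1.0}}
print(collapse_chain(p_m_given_u, p_c_given_m, ["easy", "hard"]))
# {('true',): {'true': 1.0, 'false': 0.0},
#  ('false',): {'true': 0.5, 'false': 0.5}}
```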

Using the belief net
- If probability distributions are given for any of the leaf procedures instead of certain values, we can use the belief net to compute a probability distribution over the final answer.
- The result is sound, assuming that the PS tree is correct in the case of certainty.
- The net can also be used in other directions, e.g. to compute the probability that the agent was dry based on observations of its effectiveness.
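A small illustration of both directions of inference on the final net. The numbers (P(dry) = 0.5 and the conditionals) are assumed for the example, not taken from the slides:

```python
# Illustrative leaf distributions for the final net.
p_dry = 0.5
p_easy   = {"dry": 1.0, "wet": 0.5}   # P(easy to dispense | agent type)
p_strong = {"dry": 1.0, "wet": 0.5}   # P(strong in environment | agent type)

def p_top_given(agent):
    """P(top AND node = true | agent type): both subgoals must hold."""
    return p_easy[agent] * p_strong[agent]

# Causal direction: distribution over the final answer.
p_top = p_dry * p_top_given("dry") + (1 - p_dry) * p_top_given("wet")
print(p_top)                                  # 0.625

# Diagnostic direction, by Bayes' rule: P(dry | objective supported).
print(p_dry * p_top_given("dry") / p_top)     # 0.8
```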

Advantage of probabilistic representation
All cases are handled correctly. Here the leaf nodes (P(dry) and the conditionals P(easy|dry), P(easy|wet), P(strong|dry), P(strong|wet)) are given probabilities and the probability of the top (AND) node is computed.
[Figure: three example nets with different leaf probabilities, e.g. P(dry) = 1/2, and the resulting top-node probabilities.]
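To see why this matters, compare the belief-net answer with a naive calculation that multiplies the two subgoal marginals as if they were independent. Both subgoals depend on the shared "is the agent dry?" node, so the naive product is wrong (numbers assumed, as in the previous sketch):

```python
p_dry = 0.5
p_easy   = {"dry": 1.0, "wet": 0.5}
p_strong = {"dry": 1.0, "wet": 0.5}

# Marginals of each subgoal taken separately.
marg_easy   = p_dry * p_easy["dry"]   + (1 - p_dry) * p_easy["wet"]    # 0.75
marg_strong = p_dry * p_strong["dry"] + (1 - p_dry) * p_strong["wet"]  # 0.75
print(marg_easy * marg_strong)   # 0.5625: naive, pretends independence

# Belief-net answer: sum over the shared cause before multiplying.
correct = sum(p * p_easy[a] * p_strong[a]
              for a, p in [("dry", p_dry), ("wet", 1 - p_dry)])
print(correct)                   # 0.625
```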

Other advantages
- Uncertainty handling does not have to be mentioned explicitly in methods.
- Uses standard belief-net evaluation software.
- A widely accepted approach, also used in HPKB (e.g. Koller's work on Spook; Pfeffer et al. 99).

What is required for this approach?
- Input probability distributions for nodes, from the knowledge server or user, e.g. P(ease of dispensing | dry), P(dry), ..
- Template CPTs for special forms and primitive forms, e.g. if, and, or, is-equal, ..
- The PS tree must evaluate all possible cases (it usually does).
- A way to represent methods with uncertain results (not shown here).

Extensions to the approach
- Handling numeric values, and other cases with a large number of possible values. (Should quantise or use abstractions; see the sketch below.)
- Handling concept-membership uncertainty:
  - make sure the PS tree includes all cases (e.g. with a covering);
  - use a template CPT that builds a distribution over the covering.
- Representing uncertain methods:
  - try to avoid having to extend Expect;
  - perhaps represent uncertain methods via a reformulation?
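As a small illustration of the first extension, a numeric result could be quantised into a small value set before being used as a belief-net node. The bin edges and labels below are arbitrary examples:

```python
# Sketch: quantise a numeric result into a small, discrete value domain.
import bisect

def quantise(x, edges=(10.0, 100.0), labels=("low", "medium", "high")):
    """Map a numeric value to the label of the bin it falls into."""
    return labels[bisect.bisect_right(edges, x)]

print(quantise(3.5))     # low
print(quantise(42.0))    # medium
print(quantise(500.0))   # high
```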