Dealing with Uncertainty

Dealing with Uncertainty

Topics we will cover:
- Bayes' Formula
- Dempster-Shafer theory of evidence
- Stanford certainty factors

Other topics we will not cover in class:
- Bayesian belief networks
- Causal networks
- Nonmonotonic logic
- Truth maintenance systems
- Set cover and logic-based abduction
- Reasoning with fuzzy sets

You are only responsible for the topics we cover.

Some Definitions

- For independent events A and B: p(A and B) = p(A) * p(B)
- Conditional probability: p(A | B) = p(A and B) / p(B), provided p(B) > 0

A First Example - 1

Given two urns with blue and white balls:
- Urn 1 has 3 blue balls and 2 white balls
- Urn 2 has 1 blue ball and 3 white balls
- The probability of choosing urn 1 is 1/3 and the probability of choosing urn 2 is 2/3

If a blue ball is drawn, what is the probability it came from urn 1?

Using the formula for conditional probability, we need P(U1 | B). We can only get a blue ball B in two ways (from urn 1 or from urn 2), so

P(B) = P(B | U1) P(U1) + P(B | U2) P(U2)

A First Example - 2

But we know P(U1 | B) = P(B | U1) P(U1) / P(B). Substituting P(U1) = 1/3, P(B | U1) = 3/5, P(U2) = 2/3, and P(B | U2) = 1/4:

P(U1 | B) = (3/5)(1/3) / [ (3/5)(1/3) + (1/4)(2/3) ] = (1/5) / (11/30) = 6/11 ≈ 0.545
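As a quick numerical check (a Python sketch, not part of the original slides; the variable names are mine), the urn posterior can be computed directly:

```python
# Bayes' rule for the urn example: P(U1 | B).
# Priors and likelihoods come straight from the slide.
p_u1, p_u2 = 1/3, 2/3   # probability of choosing each urn
p_b_u1 = 3/5            # P(blue | urn 1): 3 blue of 5 balls
p_b_u2 = 1/4            # P(blue | urn 2): 1 blue of 4 balls

# Total probability of drawing a blue ball (the two ways to get B)
p_b = p_b_u1 * p_u1 + p_b_u2 * p_u2

# Posterior probability the blue ball came from urn 1
p_u1_b = (p_b_u1 * p_u1) / p_b
print(p_u1_b)  # 6/11 ≈ 0.545
```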

Bayes' Formula

For mutually exclusive, exhaustive hypotheses H1, H2, ..., Hn and evidence E:

P(Hi | E) = P(E | Hi) P(Hi) / [ P(E | H1) P(H1) + P(E | H2) P(H2) + ... + P(E | Hn) P(Hn) ]

A Medical Example - 1

Suppose there is a test for HIV positivity.
- Let TP be the event that the test is positive
- Let TN be the event that the test is negative
- Let HP be the event that the person tested is HIV positive
- Let HN be the event that the person tested is HIV negative

A false positive occurs when the test returns a positive result for a person who does not have HIV. A false negative occurs when the test returns a negative result for a person who, in fact, does have HIV.

Some data:
- Assume the false positive rate, P(TP | HN), is 0.02
- Assume the false negative rate, P(TN | HP), is 0.01
- Assume the rate of HIV positivity in the population is 0.10

If a person is tested and the test comes back positive, what is the probability the person really has HIV?

A Medical Example - 2

According to Bayes' Formula,

P(HP | TP) = P(TP | HP) P(HP) / [ P(TP | HP) P(HP) + P(TP | HN) P(HN) ]

but P(TP | HP) = 1 - P(TN | HP) = 0.99, so

P(HP | TP) = (0.99)(0.10) / [ (0.99)(0.10) + (0.02)(0.90) ] = 0.099 / 0.117 ≈ 0.846

Here is the danger of testing for rare diseases: if the rate of HIV in the population is only 0.01, then P(HP | TP) drops to about 1/3! Considering that some people might do some strange things after being told they tested positive, this rate of false positives is clearly unacceptable.
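The same calculation can be sketched as a small Python function (illustrative only; the function and variable names are mine, not from the slides):

```python
def posterior_positive(prevalence, false_pos, false_neg):
    """P(HP | TP) by Bayes' rule, from prevalence and error rates."""
    p_tp_hp = 1 - false_neg   # sensitivity: P(TP | HP)
    p_tp_hn = false_pos       # P(TP | HN)
    p_tp = p_tp_hp * prevalence + p_tp_hn * (1 - prevalence)
    return p_tp_hp * prevalence / p_tp

print(posterior_positive(0.10, 0.02, 0.01))  # ≈ 0.846
print(posterior_positive(0.01, 0.02, 0.01))  # ≈ 0.333 (the rare-disease case)
```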

Another Problem

Some data:
- A company rates 75% of its employees satisfactory
- 25% of the employees are rated unsatisfactory
- Of the satisfactory employees, 80% had prior experience
- Of the unsatisfactory employees, only 40% had prior experience

If a person with prior experience is hired, what is the probability she will be satisfactory?
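For checking your answer, the slide's numbers plug into Bayes' rule like this (a minimal Python sketch; the variable names are mine):

```python
# P(satisfactory | experience) by Bayes' rule, using the rates above
p_s, p_u = 0.75, 0.25        # prior: satisfactory / unsatisfactory
p_e_s, p_e_u = 0.80, 0.40    # P(experience | rating)

p_s_e = (p_e_s * p_s) / (p_e_s * p_s + p_e_u * p_u)
print(p_s_e)  # 6/7 ≈ 0.857
```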

Using a Bayesian Approach

Requirements:
- All the P(E | Hi) must be known
- All the P(Hi) must be known
- The pieces of evidence must be conditionally independent given each hypothesis (rarely the case in medicine)

Difficulties:
- A large database of information must be maintained
- Any new evidence causes the whole system to change
- Data collection and verification may not be possible

Therefore, less rigorous ways of dealing with uncertainty have been developed.

Dempster-Shafer Theory of Evidence

Each proposition is assigned an interval [belief, plausibility] within which the degree of belief of the proposition must lie. The belief bl ranges from 0 to 1; the plausibility of a proposition p is pl(p) = 1 - bl(not p).

Example: Melissa says my computer was broken into.
- Melissa is reliable with probability 0.9 and unreliable with probability 0.1
- Thus bl(computer broken into) = 0.9
- Melissa says nothing about the belief that the computer was not broken into, so that value is assigned 0.0
- pl(broken into) = 1 - bl(not broken into) = 1 - 0.0 = 1.0
- So the belief interval based on Melissa's testimony is [0.9, 1.0]

Combining Evidence - 1

Bill also says the computer was broken into.
- Bill is reliable with probability 0.8 and unreliable with probability 0.2
- Assume Bill and Melissa are acting independently
- Probability both Bill and Melissa are reliable: 0.8 * 0.9 = 0.72
- Probability both Bill and Melissa are unreliable: 0.2 * 0.1 = 0.02
- Probability at least one of Bill and Melissa is reliable: 1 - 0.02 = 0.98
- The resulting belief interval is [0.98, 1.0]

Now suppose instead that Bill says the computer was not broken into.
- They cannot both be correct, so they cannot both be reliable
- The remaining scenarios are: Melissa is reliable and Bill is not; Bill is reliable and Melissa is not; or both are unreliable
- These calculations are on the next page

Combining Evidence - 2

Calculations:
- The prior probability that only Melissa is reliable is 0.9 * (1 - 0.8) = 0.18; that only Bill is reliable is 0.8 * (1 - 0.9) = 0.08; and that both are unreliable is 0.2 * 0.1 = 0.02
- The probability that at least one is unreliable is 0.18 + 0.08 + 0.02 = 0.28
- The posterior probability that only Melissa is reliable is 0.18 / 0.28 ≈ 0.643 (supporting a break-in); the posterior probability that only Bill is reliable is 0.08 / 0.28 ≈ 0.286 (supporting no break-in)
- pl(break-in) = 1 - bl(no break-in) = 1 - 0.286 = 0.714
- The belief interval for a break-in is therefore [0.643, 0.714]
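The conflicting-witness calculation above can be reproduced in a few lines of Python (a sketch for checking the arithmetic; the variable names are mine):

```python
# Melissa (reliable 0.9) says break-in; Bill (reliable 0.8) says no break-in.
m_rel, b_rel = 0.9, 0.8

# Since the two witnesses contradict, "both reliable" is impossible.
only_m = m_rel * (1 - b_rel)          # only Melissa reliable: 0.18
only_b = b_rel * (1 - m_rel)          # only Bill reliable: 0.08
neither = (1 - m_rel) * (1 - b_rel)   # both unreliable: 0.02
norm = only_m + only_b + neither      # 0.28: renormalization constant

bel_break_in = only_m / norm          # ≈ 0.643
bel_no_break_in = only_b / norm       # ≈ 0.286
pl_break_in = 1 - bel_no_break_in     # ≈ 0.714
print(round(bel_break_in, 3), round(pl_break_in, 3))
```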

Another Example - 1

Dempster's rule of combination, in general: given two belief functions m1 and m2, the combined belief in a hypothesis set Z is

m3(Z) = [ sum over X ∩ Y = Z of m1(X) m2(Y) ] / [ 1 - sum over X ∩ Y = ∅ of m1(X) m2(Y) ]

Medical diagnosis:
- The patient has a cold (C), flu (F), headaches (H), or meningitis (M)
- Evidence supports hypothesis sets, such as the combination {C, F, M}
- Suppose that, given the patient has a fever, the first belief is m1({C, F, M}) = 0.6; at this point the remaining measure, 0.4, is assigned to the entire hypothesis set Θ = {C, F, H, M}

Another Example - 2

Suppose new evidence is that the patient has extreme nausea, with m2({C, F, H}) = 0.7 (and m2(Θ) = 0.3). Given that no intersections X ∩ Y are empty, the denominator in our equation is 1. The combined belief m3 is calculated as:

- m3({C, F}) = 0.6 * 0.7 = 0.42
- m3({C, F, M}) = 0.6 * 0.3 = 0.18
- m3({C, F, H}) = 0.4 * 0.7 = 0.28
- m3(Θ) = 0.4 * 0.3 = 0.12

Another Example - 3

Suppose new lab evidence indicates meningitis, with m4({M}) = 0.8 (and m4(Θ) = 0.2). Now some intersections with m3 are empty, producing conflict:

- {C, F} ∩ {M} = ∅: 0.42 * 0.8 = 0.336
- {C, F, H} ∩ {M} = ∅: 0.28 * 0.8 = 0.224

Another Example - 4

The denominator of the equation is 1 - 0.336 - 0.224 = 0.44. The final combined beliefs for m5 are:

- m5({M}) = (0.18 * 0.8 + 0.12 * 0.8) / 0.44 = 0.24 / 0.44 ≈ 0.545
- m5({C, F}) = (0.42 * 0.2) / 0.44 ≈ 0.191
- m5({C, F, H}) = (0.28 * 0.2) / 0.44 ≈ 0.127
- m5({C, F, M}) = (0.18 * 0.2) / 0.44 ≈ 0.082
- m5(Θ) = (0.12 * 0.2) / 0.44 ≈ 0.055
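The whole chain from m1 to m5 can be verified with a small implementation of Dempster's rule (a Python sketch using frozensets; the code and names are mine, not from the slides):

```python
# Dempster's rule of combination for the diagnosis example.
THETA = frozenset("CFHM")

def combine(m1, m2):
    """Combine two mass functions; return (combined masses, conflict mass)."""
    raw = {}
    conflict = 0.0
    for x, mx in m1.items():
        for y, my in m2.items():
            z = x & y
            if z:
                raw[z] = raw.get(z, 0.0) + mx * my
            else:
                conflict += mx * my   # mass assigned to the empty set
    # Normalize by 1 minus the total conflict
    return {z: v / (1 - conflict) for z, v in raw.items()}, conflict

m1 = {frozenset("CFM"): 0.6, THETA: 0.4}   # fever
m2 = {frozenset("CFH"): 0.7, THETA: 0.3}   # extreme nausea
m3, _ = combine(m1, m2)                    # no conflict at this stage
m4 = {frozenset("M"): 0.8, THETA: 0.2}     # lab evidence for meningitis
m5, conflict = combine(m3, m4)

print(round(conflict, 3))            # 0.56, so the denominator is 0.44
print(round(m5[frozenset("M")], 3))  # ≈ 0.545
```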

Some Comments

- When there is conflicting evidence, the mass assigned to the empty set may be high (0.56 in this example)
- If the hypothesis set has n elements, then there are 2^n subsets, so the calculations can become cumbersome
- Dempster-Shafer reduces to Bayesian statistics if all the evidence is known precisely
- However, often there is only incomplete evidence, and this is where Dempster-Shafer is most useful

Another Problem

Joe and Mary are brother and sister who are now in their mid-twenties. They met at lunch to discuss what appears to be the breakup of their parents' thirty-year marriage. Joe believes they will break up, and he can predict his parents' behavior 70% of the time. Mary also believes they will break up, and she is correct in predicting her parents' behavior 80% of the time. Neither expresses an opinion on the probability that their parents will not break up. Using the techniques developed by Dempster and Shafer, what is the belief interval that the parents will actually get divorced? Show all your work and explain how you carried out the calculations.

Stanford Certainty Factor Algebra

- Although Bayesian techniques are firmly based in probability theory, the calculations are complex and the data must be continually updated
- A simpler measure of confidence was needed
- The medical diagnosis system MYCIN, developed at Stanford, introduced certainty factors
- Although the CF system is "ad hoc" in nature, it requires little calculation and allows confidences to be combined as the system progresses towards a goal
- The actual confidence values are much less important than the quality of the rules in the system

Defining the Certainty Factor

Given measures of belief MB(H, E) and disbelief MD(H, E), each in [0, 1], the certainty factor is

CF(H, E) = MB(H, E) - MD(H, E)

- A value near +1 represents strong confidence in the hypothesis
- A value near zero represents a lack of evidence for or against the hypothesis
- A value near -1 represents strong confidence that the hypothesis is false

Manipulating Confidence Factors

- An expert assigns a confidence value to each rule in the rule base
- These values are used to calculate new values as the system progresses towards a solution
- The content of the rule itself is much more significant than the CF assigned

Combining CFs for premises joined by AND and OR:
- CF(P1 AND P2) = min( CF(P1), CF(P2) )
- CF(P1 OR P2) = max( CF(P1), CF(P2) )

An Example Calculation

Suppose a rule in the knowledge base has a premise built from P1, P2, and P3 using AND and OR, and suppose P1, P2, and P3 have CFs of 0.6, 0.4, and 0.2, respectively.
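Since the rule itself does not survive in the transcript, here is a hypothetical version consistent with the slide's CFs (the premise structure and the rule CF of 0.7 are my own assumptions):

```python
# Hypothetical rule (assumed, not from the slide):
#   IF (P1 AND P2) OR P3 THEN R, with rule CF 0.7 (also assumed)
cf = {"P1": 0.6, "P2": 0.4, "P3": 0.2}

# AND takes the min of its premises' CFs; OR takes the max
premise_cf = max(min(cf["P1"], cf["P2"]), cf["P3"])
conclusion_cf = premise_cf * 0.7   # conclusion CF = premise CF * rule CF
print(premise_cf, round(conclusion_cf, 2))  # 0.4 0.28
```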

Combining Rules

Suppose rules R1 and R2 both bear on the same result R. Given the CFs for R1 and R2, we calculate the CF for R as follows:

- If both are positive: CF(R) = CF(R1) + CF(R2) (1 - CF(R1))
- If both are negative: CF(R) = CF(R1) + CF(R2) (1 + CF(R1))
- Otherwise: CF(R) = ( CF(R1) + CF(R2) ) / ( 1 - min( |CF(R1)|, |CF(R2)| ) )

Our Lisp expert system shell will use certainty factors to search for the most likely solution.
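These combination rules can be sketched as a small Python function (our shell is in Lisp, so this is just for checking hand calculations; the function name is mine):

```python
def combine_cf(cf1, cf2):
    """Combine two certainty factors for the same conclusion (MYCIN-style)."""
    if cf1 >= 0 and cf2 >= 0:
        return cf1 + cf2 * (1 - cf1)          # both supporting: reinforce upward
    if cf1 < 0 and cf2 < 0:
        return cf1 + cf2 * (1 + cf1)          # both opposing: reinforce downward
    return (cf1 + cf2) / (1 - min(abs(cf1), abs(cf2)))  # mixed: attenuate

print(round(combine_cf(0.8, 0.6), 2))    # 0.92: support accumulates
print(round(combine_cf(0.8, -0.6), 2))   # 0.5: conflicting evidence pulls toward 0
print(round(combine_cf(-0.8, -0.6), 2))  # -0.92
```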