Chapter 12 Certainty Theory (Evidential Reasoning)


Probability Theory

Advantages:
- The oldest and best-established technique for dealing with inexact knowledge and random data.
- Works well in areas where statistical data is usually available and accurate probabilities can be obtained, e.g. forecasting and planning.

Disadvantages:
- Reliable statistical information is not always available.
- The conditional independence of evidence cannot be assumed in all real-world situations.
- Bayesian belief propagation has exponential complexity, which makes it impractical for large knowledge bases.

Probability Theory

Disadvantages (continued):
- The conditional probabilities supplied by an expert may be inconsistent with the prior probabilities, because the expert makes different assumptions when assessing the conditional and prior probabilities (proved in Durkin; an example and further explanation are given in pages 72-74, Negnevitsky).
- Most people feel uncomfortable estimating probabilities but are more willing to express their certainty (a psychologically confirmed observation). Compare:
  "The probability of my catching a cold is 90%."
  "I really believe that I'm catching a cold."

Bias of the Bayesian Method

Consider, for example, a car that does not start and makes odd noises when you press the starter. The conditional probability of the starter being faulty if the car makes odd noises may be expressed as:

IF the symptom is "odd noises"
THEN the starter is bad {with probability 0.7}

p(starter is not bad | odd noises) = p(starter is good | odd noises) = 1 - 0.7 = 0.3

Bias of the Bayesian Method

Therefore, we can obtain a companion rule that states:

IF the symptom is "odd noises"
THEN the starter is good {with probability 0.3}

Domain experts do not deal with conditional probabilities and often deny the very existence of this hidden implicit probability (0.3 in our example).

We would also use available statistical information and empirical studies to derive the following rules:

IF the starter is bad
THEN the symptom is "odd noises" {probability 0.85}

IF the starter is bad
THEN the symptom is not "odd noises" {probability 0.15}

Bias of the Bayesian Method

To use the Bayesian rule, we still need the prior probability: the probability that the starter is bad if the car does not start. Suppose the expert supplies a value of 5 per cent. Applying the Bayesian rule (a sketch of the calculation is shown below) gives a value significantly lower than the expert's estimate of 0.7 quoted at the beginning of this section. The reason for the inconsistency is that the expert made different assumptions when assessing the conditional and prior probabilities.
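A sketch of the calculation, assuming (as in Negnevitsky's treatment of this example) that the 0.15 figure is read as p(odd noises | starter is good):

p(starter is bad | odd noises)
  = p(odd noises | starter is bad) x p(starter is bad) /
    [p(odd noises | starter is bad) x p(starter is bad) + p(odd noises | starter is good) x p(starter is good)]
  = (0.85 x 0.05) / (0.85 x 0.05 + 0.15 x 0.95)
  = 0.0425 / 0.185
  ≈ 0.23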

Certainty Factors

- A simple approach and a popular alternative to Bayesian reasoning.
- Used in cases where probabilities are not known or are too difficult or expensive to obtain.
- Provides better explanations of the control flow through a rule-based expert system.
- Lacks the mathematical correctness of probability theory (it has no formal foundation). Certainty theory is not "mathematically pure", but it does mimic the thinking process of a human expert.

Certainty Factors and Probability Theory

- Both share a common problem: finding an expert able to quantify personal, subjective and qualitative information.
- A combined approach is used in PROSPECTOR, a geological expert system: its knowledge base is probability-based, but it asks the user for certainty measures. An example program using the PROSPECTOR approach is given in Durkin, Ch. 11.

Uncertain Terms

Certainty Factors

The syntax of rules in the knowledge base is:

IF <evidence E>
THEN <hypothesis H> {cf}

where cf represents belief in hypothesis H given that evidence E has occurred.

Certainty Factors

Certainty theory introduces two measures: MB (measure of belief) and MD (measure of disbelief). The values of MB(H, E) and MD(H, E) range between 0 and 1. The total strength of belief or disbelief in a hypothesis is expressed by the certainty factor shown below.
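The standard (MYCIN-style) combination of the two measures into a single certainty factor is:

cf = (MB(H, E) - MD(H, E)) / (1 - min[MB(H, E), MD(H, E)])

so cf ranges from -1 (certainly false) to +1 (certainly true), with values near 0 indicating little evidence either way.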

Certainty Factors

MB and MD can be calculated from probabilities, where:
- p(H) is the prior probability of hypothesis H being true;
- p(H|E) is the probability that hypothesis H is true given evidence E.

Q. Explain how MB and MD change when p(H|E) > p(H) and when p(H|E) < p(H).
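The defining equations (the standard MYCIN forms, as presented in Durkin) are:

MB(H, E) = 1                                          if p(H) = 1
         = (max[p(H|E), p(H)] - p(H)) / (1 - p(H))    otherwise

MD(H, E) = 1                                          if p(H) = 0
         = (min[p(H|E), p(H)] - p(H)) / (0 - p(H))    otherwise

From these, evidence that raises the probability of H (p(H|E) > p(H)) increases MB and leaves MD at 0, while evidence that lowers it (p(H|E) < p(H)) increases MD and leaves MB at 0.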

Evidential Reasoning

The certainty factor assigned by a rule is propagated through the reasoning chain. This requires handling:
- uncertain evidence,
- conjunctive rules,
- disjunctive rules,
- hypotheses established with different degrees of belief by multiple rules.

Evidential Reasoning: Uncertain Evidence

IF E {cf(E)}
THEN H {cf}

cf(H, E) = cf(E) x cf

For example:

IF sky is clear {cf 0.5}
THEN the forecast is sunny {cf 0.8}

cf(H, E) = 0.5 x 0.8 = 0.4

This result can be interpreted as "It may be sunny".

Evidential Reasoning: Conjunctive Rules

cf(H, E1 ∧ E2 ∧ ... ∧ En) = min[cf(E1), cf(E2), ..., cf(En)] x cf

For example:

IF sky is clear {cf 0.9}
AND the forecast is sunny {cf 0.7}
THEN the action is 'wear sunglasses' {cf 0.8}

cf(H, E1 ∧ E2) = min[0.9, 0.7] x 0.8 = 0.7 x 0.8 = 0.56

Evidential Reasoning: Disjunctive Rules

cf(H, E1 ∨ E2 ∨ ... ∨ En) = max[cf(E1), cf(E2), ..., cf(En)] x cf

For example:

IF sky is overcast {cf 0.6}
OR the forecast is rain {cf 0.8}
THEN the action is 'take an umbrella' {cf 0.9}

cf(H, E1 ∨ E2) = max[0.6, 0.8] x 0.9 = 0.8 x 0.9 = 0.72
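As a concrete illustration, here is a minimal Python sketch of the three propagation rules above (the function names are illustrative, not part of the original material):

# Minimal sketch of certainty-factor propagation through rules.
# The functions mirror the formulas on the preceding slides.

def cf_single(cf_evidence, cf_rule):
    # Single uncertain premise: cf(H, E) = cf(E) x cf
    return cf_evidence * cf_rule

def cf_conjunction(cf_evidences, cf_rule):
    # Conjunctive rule: the weakest piece of evidence limits the conclusion.
    return min(cf_evidences) * cf_rule

def cf_disjunction(cf_evidences, cf_rule):
    # Disjunctive rule: the strongest piece of evidence drives the conclusion.
    return max(cf_evidences) * cf_rule

# The worked examples from the slides:
print(round(cf_single(0.5, 0.8), 2))              # 0.4  - "It may be sunny"
print(round(cf_conjunction([0.9, 0.7], 0.8), 2))  # 0.56 - 'wear sunglasses'
print(round(cf_disjunction([0.6, 0.8], 0.9), 2))  # 0.72 - 'take an umbrella'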

Evidential Reasoning: Hypotheses with Different Degrees of Belief

Example:

Rule 1: IF A is X THEN C is Z {cf 0.8}
Rule 2: IF B is Y THEN C is Z {cf 0.6}

If both Rule 1 and Rule 2 are fired, what certainty should be assigned to object C having value Z? The certainty factors established for the same hypothesis by different rules must be combined, as shown below.
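The combination formula (the standard form, as given in Negnevitsky and Durkin) is:

cf(cf1, cf2) = cf1 + cf2 x (1 - cf1)                    if cf1 >= 0 and cf2 >= 0
             = (cf1 + cf2) / (1 - min[|cf1|, |cf2|])    if cf1 and cf2 have opposite signs
             = cf1 + cf2 x (1 + cf1)                    if cf1 < 0 and cf2 < 0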

where:
- cf1 is the confidence in hypothesis H established by Rule 1;
- cf2 is the confidence in hypothesis H established by Rule 2;
- |cf1| and |cf2| are the absolute magnitudes of cf1 and cf2, respectively.
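A minimal Python sketch of this combination rule, applied to the Rule 1 / Rule 2 example above (assuming both rules fire with fully certain evidence, so cf1 = 0.8 and cf2 = 0.6; the function name is illustrative):

def cf_combine(cf1, cf2):
    # Combine two certainty factors established for the same hypothesis.
    if cf1 >= 0 and cf2 >= 0:
        return cf1 + cf2 * (1 - cf1)
    if cf1 < 0 and cf2 < 0:
        return cf1 + cf2 * (1 + cf1)
    # Opposite signs: the conflicting belief discounts the result.
    return (cf1 + cf2) / (1 - min(abs(cf1), abs(cf2)))

# Both rules support "C is Z", so the combined belief is stronger than either alone:
print(round(cf_combine(0.8, 0.6), 2))   # 0.92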

FORECAST: an application of certainty factors
