Resolving Rule Conflicts
- Assign explicit priorities to rules
- Specificity of a rule's antecedents (the more specific rule wins):
  R1: IF car will not start THEN battery is dead
  R2: IF car will not start AND headlights will not work THEN battery is dead
- Order in which rules were entered in the knowledge base (timestamp: the rule entered first is given priority)
- Order of rules in the knowledge base
- Recency of facts entered (used in real-time expert systems)
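The strategies above can be folded into a single selection function. The sketch below is illustrative only (the `Rule` class and `select_rule` are hypothetical, not from any particular shell); it ranks matching rules by explicit priority, then antecedent specificity, then entry order:

```python
from dataclasses import dataclass

@dataclass
class Rule:
    name: str
    conditions: list       # antecedents that must all hold in working memory
    conclusion: str
    priority: int = 0      # explicit priority: higher fires first
    entry_order: int = 0   # lower = entered into the knowledge base earlier

def select_rule(rules, working_memory):
    """Return the matching rule that wins conflict resolution, or None."""
    matching = [r for r in rules if all(c in working_memory for c in r.conditions)]
    if not matching:
        return None
    # Rank by: priority, then specificity (number of antecedents), then earliest entry.
    return max(matching, key=lambda r: (r.priority, len(r.conditions), -r.entry_order))

r1 = Rule("R1", ["car will not start"], "battery is dead", entry_order=1)
r2 = Rule("R2", ["car will not start", "headlights will not work"],
          "battery is dead", entry_order=2)
wm = {"car will not start", "headlights will not work"}
print(select_rule([r1, r2], wm).name)  # R2 -- more specific antecedents win
```

Real shells expose similar knobs: CLIPS, for instance, combines rule salience (explicit priority) with selectable strategies such as complexity (specificity) and depth (recency).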

Certainty factors: the rule with the higher CF fires first, i.e. the rule that gives higher confidence to the goal being pursued.

Certainty Theory
- Representing uncertain evidence
- Representing uncertain rules
- Combining evidence from multiple sources
e.g.  IF A AND B THEN X  CF 0.8
      IF C THEN X        CF 0.7
What is the certainty of X?
CF values lie in the range -1 <= CF <= 1 (some systems use -100 <= CF <= 100).
Used in rule conclusions, values of variables in premises, and answers to user queries.

Certainty theory
Commutative property: if more than one rule gathers information, the combined CF value cannot depend on the order in which the rules are processed.
Asymptotic property: the certainty model should incrementally add belief to a hypothesis as new positive evidence is obtained; however, unless we encounter evidence that absolutely confirms the hypothesis, we cannot be totally certain. Thus, confirming evidence increases our belief: the CF approaches 1 but never equals 1 unless absolute certainty is found.

Rule: IF E THEN H  CF(Rule)
CF(H, E) = CF(E) * CF(Rule)
Example (0-100 scale):
IF econ-two-years = strong THEN likelihood-of-inflation = strong  CF 40
Given econ-two-years = strong with CF = 70:
CF(likelihood-of-inflation = strong) = (40 * 70) / 100 = 28
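On the 0-100 scale used in this example, the propagation step is a one-line computation (a minimal sketch; the function name is illustrative):

```python
def propagate_cf(cf_evidence, cf_rule):
    """CF(H, E) = CF(E) * CF(Rule), on the 0-100 scale."""
    return cf_evidence * cf_rule / 100

# econ-two-years = strong with CF 70, rule CF 40:
print(propagate_cf(70, 40))  # 28.0
```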

Conjunctive: IF E1 AND E2 AND ... AND En THEN H  CF(Rule)
CF(H, E1 AND E2 AND ... AND En) = min{CF(Ei)} * CF(Rule)
Disjunctive: IF E1 OR E2 OR ... OR En THEN H  CF(Rule)
CF(H, E1 OR E2 OR ... OR En) = max{CF(Ei)} * CF(Rule)
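The two premise-combination rules can be sketched directly (illustrative helper names; CFs on the -1..1 scale here):

```python
def cf_conjunction(cf_evidence, cf_rule):
    """Conjunctive premise: the weakest antecedent limits the conclusion."""
    return min(cf_evidence) * cf_rule

def cf_disjunction(cf_evidence, cf_rule):
    """Disjunctive premise: the strongest antecedent carries the conclusion."""
    return max(cf_evidence) * cf_rule

print(cf_conjunction([0.9, 0.6, 0.8], 0.5))  # 0.3  (min is 0.6)
print(cf_disjunction([0.9, 0.6, 0.8], 0.5))  # 0.45 (max is 0.9)
```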

Certainty propagation in similarly concluded rules
R1: IF E1 THEN H  CF1
R2: IF E2 THEN H  CF2
(supporting evidence increases our belief)
CFcombine(CF1, CF2) =
  CF1 + CF2 * (1 - CF1),                   when both > 0
  (CF1 + CF2) / (1 - min(|CF1|, |CF2|)),   when they have opposite signs
  CF1 + CF2 * (1 + CF1),                   when both < 0
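A sketch of the three-case combination function on the -1..1 scale (`cf_combine` is an illustrative name):

```python
def cf_combine(cf1, cf2):
    """Combine two CFs for the same hypothesis from independent rules."""
    if cf1 >= 0 and cf2 >= 0:
        return cf1 + cf2 * (1 - cf1)          # both confirming
    if cf1 < 0 and cf2 < 0:
        return cf1 + cf2 * (1 + cf1)          # both disconfirming
    # opposite signs
    return (cf1 + cf2) / (1 - min(abs(cf1), abs(cf2)))

print(round(cf_combine(0.8, 0.8), 2))    # 0.96  -- supporting evidence adds belief
print(round(cf_combine(0.8, -0.8), 2))   # 0.0   -- conflicting evidence cancels
print(round(cf_combine(-0.8, -0.6), 2))  # -0.92 -- disconfirming evidence accumulates
```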

Premise with AND (conjunctive)
Example:
IF economy-two-years = strong AND availability-of-investment-capital = low
THEN likelihood-of-inflation = strong
CF of condition = min(cf1, cf2)
Premise with OR (disjunctive)
Example:
IF economy-two-years = poor OR unemployment-outlook = poor
THEN economic-outlook = poor
CF of condition = max(cf1, cf2)

Premise with both AND and OR
Example:
IF has-credit-card = yes (cf = 80)
   OR cash = ok (cf = 90)
   AND payments = ok (cf = 85)
THEN approval = ok
Distribute the AND over the OR:
(has-credit-card = yes AND payments = ok): min(80, 85) = 80
(cash = ok AND payments = ok): min(90, 85) = 85
Then resolve the OR between the two branches:
CF = 80 + 85 - (80 * 85) / 100 = 97  (probabilistic sum; some systems use this)
CF = max(80, 85) = 85                (some systems use this)
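Both conventions for the OR step can be checked numerically (a sketch on the 0-100 scale; the helper names are illustrative):

```python
def or_max(a, b):
    """Conservative OR: take the stronger branch."""
    return max(a, b)

def or_prob_sum(a, b):
    """Probabilistic-sum OR on the 0-100 scale."""
    return a + b - a * b / 100

branch1 = min(80, 85)  # has-credit-card AND payments
branch2 = min(90, 85)  # cash AND payments
print(or_max(branch1, branch2))       # 85
print(or_prob_sum(branch1, branch2))  # 97.0
```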

Example:
R1: IF weatherman says it will rain THEN it will rain  CF 0.8
R2: IF farmer says it will rain THEN it will rain      CF 0.8
Case (a): Weatherman and farmer are both certain it will rain
CF(E1) = CF(E2) = 1.0
CF(H, E1) = CF(E1) * CF(Rule 1) = 1.0 * 0.8 = 0.8
CF(H, E2) = CF(E2) * CF(Rule 2) = 1.0 * 0.8 = 0.8
CFcombine(CF1, CF2) = CF1 + CF2 * (1 - CF1) = 0.8 + 0.8 * (1 - 0.8) = 0.96
The CF of a hypothesis supported by more than one rule can be incrementally increased by supporting evidence from both rules.

Case (b): Weatherman certain it will rain, farmer certain it will not
CF(E1) = 1.0, CF(E2) = -1.0
CF1 = 0.8, CF2 = -0.8
CFcombine(CF1, CF2) = (0.8 + (-0.8)) / (1 - min(0.8, 0.8)) = 0 (unknown)
Case (c): Two pieces of disconfirming evidence
CF1 = -0.8, CF2 = -0.6
CFcombine(CF1, CF2) = -0.8 + (-0.6) * (1 + (-0.8)) = -0.92
Incremental decrease in certainty from more than one source of disconfirming evidence.
Case (d): Many pieces of confirming evidence, combined into CFold (close to 1), followed by a single piece of disconfirming evidence CFnew = -0.8:
CFcombine = (CFold + (-0.8)) / (1 - 0.8) = (CFold - 0.8) / 0.2
which stays strongly positive when CFold is near 1. A single piece of disconfirming evidence does not have a major impact on many pieces of confirming evidence.
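The asymptotic property and the disconfirming-evidence case can be demonstrated numerically. This is a sketch: the scenario of five confirming rules, each contributing CF 0.8, is an assumed illustration.

```python
def cf_combine(cf1, cf2):
    if cf1 >= 0 and cf2 >= 0:
        return cf1 + cf2 * (1 - cf1)
    if cf1 < 0 and cf2 < 0:
        return cf1 + cf2 * (1 + cf1)
    return (cf1 + cf2) / (1 - min(abs(cf1), abs(cf2)))

cf = 0.8
for _ in range(4):          # four more confirming rules, CF 0.8 each
    cf = cf_combine(cf, 0.8)
print(round(cf, 4))         # 0.9997 -- approaches 1 but never reaches it

# One piece of disconfirming evidence (CF -0.8) only dents the belief:
print(round(cf_combine(cf, -0.8), 4))  # 0.9984
```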

Certainty Threshold
A rule's condition fails, and the rule does not fire, if CF(premise) < threshold.
But if the rule fires and afterwards CF(conclusion) < threshold, the conclusion will still be asserted.
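A sketch of the threshold check (the threshold value 0.2 and the function name are assumed for illustration):

```python
THRESHOLD = 0.2  # assumed value for illustration

def fire_rule(cf_premise, cf_rule, threshold=THRESHOLD):
    """Return the conclusion CF, or None if the rule fails to fire."""
    if cf_premise < threshold:
        return None                 # condition fails: rule never fires
    return cf_premise * cf_rule     # fired: conclusion asserted regardless of its CF

print(fire_rule(0.1, 0.9))   # None  -- premise below threshold
print(fire_rule(0.25, 0.3))  # 0.075 -- fired; conclusion asserted even though 0.075 < 0.2
```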

Interpreting CF values
  Definitely not        -1.0
  Almost certainly not  -0.8
  Probably not          -0.6
  Maybe not             -0.4
  Unknown               -0.2 to 0.2
  Maybe                  0.4
  Probably               0.6
  Almost certainly       0.8
  Definitely             1.0
CF values can be acquired from experts, or by prompting users for CF values.
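The table can be turned into a lookup that maps a numeric CF back to a linguistic term (a sketch; `interpret_cf` is an illustrative name, and ties are resolved by nearest anchor value):

```python
def interpret_cf(cf):
    """Map a CF in [-1, 1] to the nearest linguistic term from the table."""
    if -0.2 <= cf <= 0.2:
        return "unknown"
    scale = [(-1.0, "definitely not"), (-0.8, "almost certainly not"),
             (-0.6, "probably not"), (-0.4, "maybe not"),
             (0.4, "maybe"), (0.6, "probably"),
             (0.8, "almost certainly"), (1.0, "definitely")]
    return min(scale, key=lambda t: abs(t[0] - cf))[1]

print(interpret_cf(0.87))   # almost certainly
print(interpret_cf(-0.55))  # probably not
```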

Example
R1: IF the weather looks lousy (E1)
    OR I am in a lousy mood (E2)
    THEN I shouldn't go to the ball game (H1)  CF 0.9
R2: IF I believe it is going to rain (E3)
    THEN the weather looks lousy (E1)  CF 0.8
R3: IF I believe it is going to rain (E3)
    AND the weatherman says it is going to rain (E4)
    THEN I am in a lousy mood (E2)  CF 0.9
R4: IF the weatherman says it is going to rain (E4)
    THEN the weather looks lousy (E1)  CF 0.7
R5: IF the weather looks lousy (E1)
    THEN I am in a lousy mood (E2)  CF 0.95

Assume the user enters the following facts:
- I believe it is going to rain, CF(E3) = 0.95
- Weatherman says it is going to rain, CF(E4) = 0.85
Goal: I shouldn't go to the ball game (H1)
Step 1: Pursue R1. Pursue premise 1 of R1, "weather looks lousy" (E1): R2 and R4
Step 2: Pursue R2
CF(E1, E3) = CF(E3) * CF(Rule R2) = 0.95 * 0.8 = 0.76
Step 3: Pursue R4
CF(E1, E4) = CF(E4) * CF(Rule R4) = 0.85 * 0.7 = 0.60
Step 4: Combine evidence for E1
CF(E1) = CF(E1,E3) + CF(E1,E4) * (1 - CF(E1,E3)) = 0.76 + 0.60 * (1 - 0.76) = 0.90

Step 5: Pursue premise 2 of Rule R1 (E2): R3 and R5
Step 6: Pursue R5
CF(E2, E1) = CF(E1) * CF(Rule R5) = 0.90 * 0.95 = 0.86
Step 7: Pursue R3
CF(E2, E3 and E4) = min{CF(E3), CF(E4)} * CF(Rule R3) = min{0.95, 0.85} * 0.9 = 0.77
Step 8: Combine evidence for E2, "I am in a lousy mood"
CF(E2) = CF(E2, E1) + CF(E2, E3 and E4) * (1 - CF(E2, E1)) = 0.86 + 0.77 * (1 - 0.86) = 0.97
Step 9: Return to Rule R1. From steps 4 and 8:
CF(H1, E1 or E2) = max{CF(E1), CF(E2)} * CF(Rule R1) = max{0.90, 0.97} * 0.9 = 0.87
= CF(I shouldn't go to the ball game)
Conclusion: I almost certainly shouldn't go to the ball game.
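The nine steps above can be reproduced end to end without intermediate rounding (a sketch; exact arithmetic still rounds to the same 0.87):

```python
def cf_combine(cf1, cf2):
    if cf1 >= 0 and cf2 >= 0:
        return cf1 + cf2 * (1 - cf1)
    if cf1 < 0 and cf2 < 0:
        return cf1 + cf2 * (1 + cf1)
    return (cf1 + cf2) / (1 - min(abs(cf1), abs(cf2)))

cf_e3, cf_e4 = 0.95, 0.85                      # user-entered evidence
cf_r1, cf_r2, cf_r3, cf_r4, cf_r5 = 0.9, 0.8, 0.9, 0.7, 0.95

# E1 "the weather looks lousy": supported by R2 and R4 (steps 2-4)
cf_e1 = cf_combine(cf_e3 * cf_r2, cf_e4 * cf_r4)
# E2 "I am in a lousy mood": supported by R5 and R3 (steps 6-8)
cf_e2 = cf_combine(cf_e1 * cf_r5, min(cf_e3, cf_e4) * cf_r3)
# H1: R1 has a disjunctive premise, so take the max (step 9)
cf_h1 = max(cf_e1, cf_e2) * cf_r1
print(round(cf_h1, 2))  # 0.87
```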

Controlling search with CF: meta-rules
Example:
IF CF(problem is in electrical system) < 0.5
THEN PURSUE problem is in fuel system
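A meta-rule like this is ordinary conditional logic over CF values. A minimal sketch with illustrative names:

```python
def next_goal(cf_electrical, threshold=0.5):
    """Meta-rule: if the electrical hypothesis is weakly supported, switch goals."""
    if cf_electrical < threshold:
        return "problem is in fuel system"
    return "problem is in electrical system"

print(next_goal(0.3))  # problem is in fuel system
print(next_goal(0.7))  # problem is in electrical system
```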