CCSB354 ARTIFICIAL INTELLIGENCE
Chapter 8.2 Certainty Factors
Instructor: Alicia Tang Y. C.
Uncertainty Handling Overview
In Expert Systems, we must often attempt to draw correct conclusions from poorly formed and uncertain evidence using unsound inference rules. This is not an impossible task; we do it successfully in almost every aspect of our daily survival. Doctors deliver correct medical treatment for ambiguous symptoms; we understand natural language statements that are incomplete or ambiguous; and so on. There are many approaches to representing uncertainty in AI.
AI methods to handle uncertainty
Abductive reasoning
Property inheritance
Fuzzy logic
Certainty factors
Probabilistic inference (e.g. Bayes' theorem)
Dempster-Shafer theory
Non-monotonic reasoning
e.g. Suppose: "If x is a bird then x flies." Abductive reasoning would conclude that "all flying things are birds"; by property inheritance, "all birds can fly"; but remember that a penguin cannot fly.
Evaluation Criteria for uncertainty handling methods
Expressive power
Logical correctness
Computational efficiency of inference
Schemes used by expert systems in handling uncertainty
MYCIN: Certainty Factors
REVEAL: Fuzzy logic
PROSPECTOR: Bayes' theorem
The CF can be used to rank hypotheses in order of importance. For example, if a patient has certain symptoms that suggest several possible diseases, then the disease with the highest CF would be the one investigated first.
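As a rough sketch (not part of the original slides), this MYCIN-style ranking idea can be expressed in Python; the disease names and CF values below are invented purely for illustration:

```python
# Illustrative only: rank candidate diseases by certainty factor so that the
# disease with the highest CF is investigated first (MYCIN-style triage).
# The disease names and CF values are hypothetical, not from the slides.
hypotheses = {"influenza": 0.6, "dengue": 0.35, "common_cold": 0.8}

ranked = sorted(hypotheses.items(), key=lambda item: item[1], reverse=True)
for disease, cf in ranked:
    print(f"{disease}: CF = {cf}")
# common_cold (CF 0.8) would be investigated first.
```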
Bayesian Approach (I)
The Bayesian approach (Bayes' theorem) is based on formal probability theory. It provides a way of computing the probability of a hypothesis (without sampling) following from a particular piece of evidence, given only the probabilities with which the evidence follows from actual causes.
It is the best defined technique for managing uncertainty and is important for quantitative analysis.
Bayesian Approach (II)
p(Hi | E) = [ p(E | Hi) * p(Hi) ] / [ Σ (k = 1 to n) p(E | Hk) * p(Hk) ]
Here, as you can see, the formula relies on a number of assumptions (e.g. independence of evidence) which cannot be made for many applications (such as medical cases).
Bayes' theorem (III)
where:
p(Hi | E) is the probability that Hi is true given evidence E.
p(Hi) is the probability that Hi is true overall.
p(E | Hi) is the probability of observing evidence E when Hi is true.
n is the number of possible hypotheses.
Example: take the hypothesis "You will get an 'A' if you study till late at night for a week before the exam", and the evidence that those who obtained an 'A' did indeed study every night before the exam. If there are not many success cases of people who obtained an 'A' by studying hard, then your chance of getting an 'A' as a result of hard work is also lower!
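A minimal Python sketch of this computation follows; the prior and likelihood numbers for the "study hard to get an 'A'" example are assumed values chosen only to show the arithmetic:

```python
def posterior(priors, likelihoods, i):
    """Bayes' theorem: p(Hi | E) = p(E | Hi) * p(Hi) / sum_k p(E | Hk) * p(Hk)."""
    numerator = likelihoods[i] * priors[i]
    evidence = sum(likelihoods[k] * priors[k] for k in range(len(priors)))
    return numerator / evidence

# Hypothetical numbers for the "study hard to get an A" example:
# H1 = "studied hard every night", H2 = "did not study hard".
priors = [0.3, 0.7]        # p(Hi): how common each cause is overall
likelihoods = [0.8, 0.2]   # p(E | Hi): chance of observing an 'A' under each cause

print(round(posterior(priors, likelihoods, 0), 3))  # 0.632
```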
Bayes' Rule and knowledge-based systems
Advantages:
- Most significant is their sound theoretical foundation in probability theory.
- The most mature of the uncertainty reasoning methods.
- Well-defined semantics for decision making.
Main disadvantage:
- They require a significant amount of probability data to construct a knowledge base.
Certainty Factors (CF)
Certainty factors measure the confidence that is placed on a conclusion based on the evidence known so far. A certainty factor is the difference between the following two components:
CF = MB[h:e] - MD[h:e]
A positive CF means the evidence supports the hypothesis, since MB > MD.
CF[h:e] = MB[h:e] - MD[h:e] ............ (I)
CF[h:e] is the certainty of hypothesis h given the evidence e.
MB[h:e] is the measure of belief in h given e.
MD[h:e] is the measure of disbelief in h given e.
CFs can range from -1 (completely false) to +1 (completely true), with fractional values in between and zero representing ignorance. MBs and MDs can range between 0 and 1 only.
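A minimal helper for equation (I), with range checks taken from the bounds just stated (the example values are the ones that will appear in the worked example later):

```python
def certainty_factor(mb, md):
    """Equation (I): CF[h:e] = MB[h:e] - MD[h:e].
    MB and MD each lie in [0, 1]; the resulting CF lies in [-1, +1]."""
    if not (0.0 <= mb <= 1.0 and 0.0 <= md <= 1.0):
        raise ValueError("MB and MD must both be between 0 and 1")
    return mb - md

print(round(certainty_factor(0.82, 0.42), 2))  # 0.4, as in the worked example later
```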
MB(P1 AND P2) = MIN(MB(P1), MB(P2)) ............ (II)
MB(P1 OR P2) = MAX(MB(P1), MB(P2)) ............ (III)
The MB in the negation of a fact can be derived as:
MB(NOT P1) = 1 - MB(P1) ............ (IV)
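These three combination rules translate directly into code; a small sketch in Python, using the same example values as the worked example later in the chapter:

```python
def mb_and(mb_p1, mb_p2):
    """Equation (II): MB(P1 AND P2) = MIN(MB(P1), MB(P2))."""
    return min(mb_p1, mb_p2)

def mb_or(mb_p1, mb_p2):
    """Equation (III): MB(P1 OR P2) = MAX(MB(P1), MB(P2))."""
    return max(mb_p1, mb_p2)

def mb_not(mb_p1):
    """Equation (IV): MB(NOT P1) = 1 - MB(P1)."""
    return 1.0 - mb_p1

print(mb_and(0.9, 0.7))  # 0.7
print(mb_or(0.8, 0.6))   # 0.8
print(mb_not(0.6))       # 0.4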
Each rule can have a credibility (attenuation): a number from 0 to 1 which indicates its reliability. For each rule, the credibility is multiplied by the MB of the conditions to give the MB for the conclusion of the rule:
MB(conclusion) = MB(conditions) * credibility ............ (V)
The MBs for the same hypothesis obtained from different pieces of evidence are combined as:
MB[h:e1,e2] = MB[h:e1] + MB[h:e2] * (1 - MB[h:e1]) ............ (VI)
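Equations (V) and (VI) can be sketched the same way; the numbers below are the ones used in the worked example (0.49 and 0.64 combining to roughly 0.82):

```python
def mb_conclusion(mb_conditions, credibility):
    """Equation (V): attenuate the MB of the conditions by the rule's credibility."""
    return mb_conditions * credibility

def mb_combine(mb_e1, mb_e2):
    """Equation (VI): MB[h:e1,e2] = MB[h:e1] + MB[h:e2] * (1 - MB[h:e1])."""
    return mb_e1 + mb_e2 * (1.0 - mb_e1)

print(round(mb_conclusion(0.7, 0.7), 2))   # 0.49
print(round(mb_combine(0.49, 0.64), 2))    # 0.82
```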
CERTAINTY FACTOR
This scheme does not permit a distinction between conflicting evidence (MB and MD both high) and lack of evidence (MB and MD both low), which could sometimes be important.
CERTAINTY FACTOR
When experts put together the rule base, they must agree on a CF to go with each rule. This CF reflects their confidence in the rule's reliability. Certainty measures may be adjusted to tune the system's overall performance, although slight variations in this confidence measure tend to have little effect on the overall running of the system.
CERTAINTY FACTOR
The conditions of each rule are formed from the AND and OR of a number of facts. When a production rule is used, the certainty factors associated with each condition are combined to produce an overall measure for the premise, in the following manner.
A Worked Example
Rule 1: IF X drives a Gen2 AND X reads the Berita Harian THEN X will vote Barisan Nasional
Rule 2: IF X loves the Setia song OR X supports Vision 2020 THEN X will vote Barisan Nasional
Rule 3: IF X uses unleaded petrol OR X does not support Vision 2020 THEN X will not vote Barisan Nasional
Let us assume that the individual MBs for the conditions are as follows:
X drives a Gen2: 0.9
X reads the Berita Harian: 0.7
X loves the Setia song: 0.8
X supports Vision 2020: 0.6
X uses unleaded petrol: 0.7
The credibilities of the rules are as follows:
Rule 1: 0.7
Rule 2: 0.8
Rule 3: 0.6
To determine: CF[X votes BN: Rule 1, Rule 2, Rule 3] (the hypothesis, our assumed goal)
The three rules give the following measures for the proposition "X votes BN":
MB[X votes BN: Rule 1] = MIN(0.9, 0.7) * 0.7 = 0.49 -- using (II) and (V)
MB[X votes BN: Rule 2] = MAX(0.8, 0.6) * 0.8 = 0.64 -- using (III) and (V)
MB[X will not vote BN: Rule 3] = MAX(0.7, 1 - 0.6) * 0.6 = 0.42, i.e. MD[X votes BN: Rule 3] = 0.42 -- using (III), (IV) and (V)
Combining Rule 1 and Rule 2:
MB[X votes BN: Rule 1, Rule 2]
= MB[X votes BN: Rule 1] + MB[X votes BN: Rule 2] * (1 - MB[X votes BN: Rule 1]) ---- using (VI)
= 0.49 + 0.64 * (1 - 0.49)
= 0.82
Combining all three rules:
CF[X votes BN: Rule 1, Rule 2, Rule 3]
= MB[X votes BN: Rule 1, Rule 2] - MD[X votes BN: Rule 3]
= 0.82 - 0.42
= 0.4
So, what do you think is the answer to the question: "Will someone in KL vote for the BN party?"
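Putting the earlier snippets together, the whole worked example can be reproduced in a few lines of Python. The dictionary keys are made-up identifiers for the slide's conditions; the slides round intermediate values, so the results below agree to two decimal places:

```python
# Measures of belief for the individual conditions, and rule credibilities.
mb = {"gen2": 0.9, "berita_harian": 0.7, "setia_song": 0.8,
      "vision_2020": 0.6, "unleaded": 0.7}
cred = {"rule1": 0.7, "rule2": 0.8, "rule3": 0.6}

# Rule 1 (AND): equations (II) and (V)
mb_rule1 = min(mb["gen2"], mb["berita_harian"]) * cred["rule1"]        # 0.49
# Rule 2 (OR): equations (III) and (V)
mb_rule2 = max(mb["setia_song"], mb["vision_2020"]) * cred["rule2"]    # 0.64
# Rule 3 (OR with a negated condition): equations (III), (IV) and (V);
# its conclusion is "X will NOT vote BN", so it contributes disbelief (MD).
md_rule3 = max(mb["unleaded"], 1 - mb["vision_2020"]) * cred["rule3"]  # 0.42

# Combine the two belief sources with equation (VI), then apply equation (I).
mb_vote = mb_rule1 + mb_rule2 * (1 - mb_rule1)   # ~0.82
cf_vote = mb_vote - md_rule3                     # ~0.40

print(round(mb_rule1, 2), round(mb_rule2, 2), round(md_rule3, 2))  # 0.49 0.64 0.42
print(round(mb_vote, 2), round(cf_vote, 2))                        # 0.82 0.4
```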
In an expert system that implements uncertainty handling, the answer is "maybe" (and not a "yes" or a "no"). Isn't that exactly the way you and I would say it?
Certainty Factors have been criticised as being excessively ad hoc. The semantics of the certainty value can be subjective and relative. But the human expert's confidence in his reasoning is also approximate, heuristic and informal.
Advantages:
- A simple computational model that permits experts to estimate their confidence in a conclusion.
- It permits the expression of belief and disbelief in each hypothesis (the combination of multiple sources of evidence is thus allowed).
- Gathering CF values is easier than gathering the values required by other methods.
Dempster-Shafer theory (Dempster 1967, Shafer 1976)
This theory was designed as a mathematical theory of evidence, where a value between 0 and 1 is assigned to some fact as its degree of support. It is similar to the Bayesian method but more general, as the belief in a fact and its negation need not sum to one. Both values can be zero (reflecting that no information is available to make a judgment).
Read the text if you want to find out more about this scheme.
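A tiny numeric sketch of that last point, with a hypothetical mass assignment invented for illustration (the leftover mass sits on the whole frame, representing ignorance):

```python
# Hypothetical basic mass assignment over the frame {A, not-A}.  The leftover
# mass is assigned to the whole frame Theta, representing "no information".
mass = {"A": 0.3, "not_A": 0.2, "Theta": 0.5}

belief_A = mass["A"]          # 0.3
belief_not_A = mass["not_A"]  # 0.2
print(belief_A + belief_not_A)  # 0.5: belief in A and NOT A need not sum to 1
```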