UNCERTAINTY (Ketidakpastian)
Yeni Herdiyeni – http://ilkom.fmipa.ipb.ac.id/yeni
"Uncertainty is defined as the lack of the exact knowledge that would enable us to reach a perfectly reliable conclusion."
Topics
- Definition of uncertainty
- Basic concepts of probability
- Certainty factors
- Bayesian reasoning
- Fuzzy logic
- Neural networks
- Genetic algorithms
Uncertainty
Will it rain or not?
Expert Systems in Nursing
For example, a nursing diagnosis may require knowing whether a patient has suffered from a certain allergy. The patient may not know or remember the answer to this question, so the information needed to ensure a correct diagnosis is unavailable. This means that decisions will often be based on incomplete or uncertain data, and clearly this may result in uncertain conclusions.
Ideas
- Basic concepts of probability
- Certainty factors
- Bayesian reasoning
- Fuzzy logic
- Neural networks
- Genetic algorithms
Applications
- Financial markets: the stock market
- Games: gambling
- Weather forecasting
- Risk management
- etc.
Case Study
PASS: An Expert System with Certainty Factors for Predicting Student Success
Variables: gender, specialisation, grade
Grade bands:
- Moderate: >=10 and <12.5
- Good: >=12.5 and <15.5
- Very Good: >=15.5 and <18.5
- Excellent: >=18.5 and <=20
Uncertainty
Information is often incomplete, inconsistent, or uncertain. Sources include:
- Weak implications
- Imprecise language (ambiguous, vague)
- Unknown data
- Combining the differing views of experts
Sources of Uncertain Knowledge
Imprecise language
Uncertainty
- In evidence: 'if pressure gauge is high then liquid is boiling'. Electrical components are notoriously faulty; the pressure gauge may actually be stuck.
- In inferring a conclusion: 'if patient has a sore throat then patient has tonsillitis'. A doctor may infer such a conclusion, but it would not be an absolute, binary one.
- In vagueness of language: 'if the car is a Porsche then it is fast'. What do we mean when we use the term fast?
Representing Uncertainty
- Certainty theory
- Probability
- Theory of evidence (Bayes' theorem)
- Fuzzy logic
- Neural networks
- Genetic algorithms
Certainty Theory
A certainty factor (cf) is a number that measures the expert's belief. The maximum value of a certainty factor is +1.0 (definitely true) and the minimum is -1.0 (definitely false). There are two aspects: certainty in the evidence (the evidence itself can have a certainty factor attached) and certainty in the rule. Note that cf values are not probabilities, but informal measures of confidence.
Certainty Factors
Uncertain terms and their interpretation:

Term                   Certainty Factor
Definitely not         -1.0
Almost certainly not   -0.8
Probably not           -0.6
Maybe not              -0.4
Unknown                -0.2 to +0.2
Maybe                  +0.4
Probably               +0.6
Almost certainly       +0.8
Definitely             +1.0
Expert Systems with Certainty Theory
In expert systems with certainty factors, the knowledge base consists of a set of rules with the following syntax:
IF evidence E THEN hypothesis H {cf}
where cf represents belief in hypothesis H given that evidence E has occurred.
Example
The certainty factor assigned by a rule is propagated through the reasoning chain. The belief in hypothesis H, given that evidence E has occurred, is:
cf(H, E) = cf(E) * cf(Rule)
For example:
IF sky is clear THEN the forecast is sunny {cf 0.8}
If the current certainty factor of "sky is clear" is 0.5, then
cf(H, E) = 0.5 * 0.8 = 0.4
so the certainty that the forecast is sunny is 0.4.
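As a minimal sketch (function and variable names are illustrative, not from any particular expert-system shell), single-rule propagation can be written as:

```python
# Sketch of single-rule certainty-factor propagation:
# cf(H, E) = cf(E) * cf(Rule).
def propagate_cf(cf_evidence: float, cf_rule: float) -> float:
    """Belief in hypothesis H given evidence E held with cf_evidence."""
    return cf_evidence * cf_rule

# "sky is clear" is known with cf 0.5, and the rule has cf 0.8:
cf_sunny = propagate_cf(0.5, 0.8)
print(cf_sunny)  # 0.4
```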
Multiple Antecedents
If a certainty is attached to the evidence, what happens when a rule has more than one piece of evidence?
- With conjunctions (i.e. AND), use the minimum cf of the evidence.
- With disjunctions (i.e. OR), use the maximum cf of the evidence.
Conjunctive Antecedents - Example
Conjunctive rules:
cf(H, E1 AND E2 AND ... AND En) = min{cf(E1), cf(E2), ..., cf(En)} * cf(Rule)
IF there are dark clouds (E1) AND the wind is strong (E2) THEN it will rain {cf 0.8}
Assume cf(E1) = 0.5 and cf(E2) = 0.9; then
cf(H, E) = min{0.5, 0.9} * 0.8 = 0.4
Disjunctive Antecedents - Example
Disjunctive rules:
cf(H, E1 OR E2 OR ... OR En) = max{cf(E1), cf(E2), ..., cf(En)} * cf(Rule)
IF there are dark clouds (E1) OR the wind is strong (E2) THEN it will rain {cf 0.8}
Again assume cf(E1) = 0.5 and cf(E2) = 0.9; then
cf(H, E) = max{0.5, 0.9} * 0.8 = 0.72
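The two antecedent-combination rules can be sketched together (illustrative names, not from any particular shell):

```python
# Sketch of multiple-antecedent combination for certainty factors:
# AND takes the minimum evidence cf, OR the maximum, and both are
# then multiplied by the rule's own cf.
def cf_and(cf_rule: float, *cf_evidence: float) -> float:
    return min(cf_evidence) * cf_rule

def cf_or(cf_rule: float, *cf_evidence: float) -> float:
    return max(cf_evidence) * cf_rule

print(cf_and(0.8, 0.5, 0.9))  # min{0.5, 0.9} * 0.8 = 0.4
print(cf_or(0.8, 0.5, 0.9))   # max{0.5, 0.9} * 0.8 = 0.72
```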
Similarly Concluded Rules
What if two pieces of evidence lead to the same conclusion? Common sense suggests that if two pieces of evidence from different sources support the same hypothesis, then confidence in that hypothesis should increase more than if only one piece of evidence had been obtained.
Rule 1: IF A is X THEN C is Z {cf 0.8}
Rule 2: IF B is Y THEN C is Z {cf 0.6}
Similarly Concluded Rules - Example
Two rules can lead to the same conclusion:
IF the weatherperson predicts rain (E1) THEN it will rain {cf1 0.7}
IF the farmer predicts rain (E2) THEN it will rain {cf2 0.9}
Assume cf(E1) = 1.0 and cf(E2) = 1.0. Then:
cf1(H, E1) = cf(E1) * cf(Rule 1) = 1.0 * 0.7 = 0.7
cf2(H, E2) = cf(E2) * cf(Rule 2) = 1.0 * 0.9 = 0.9
Similarly Concluded Rules
If we obtain supporting evidence for a hypothesis from two sources, we should feel more confident in the conclusion. So:
cf_combine(cf1, cf2) = cf1 + cf2 * (1 - cf1) = 0.7 + 0.9 * (1 - 0.7) = 0.97
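A sketch of this combination step for the case shown here, where both certainty factors are positive (the full MYCIN-style rule also handles negative and mixed signs, which is not covered on this slide):

```python
# Combining two similarly concluded rules, both cfs positive:
# cf_combine(cf1, cf2) = cf1 + cf2 * (1 - cf1)
def cf_combine(cf1: float, cf2: float) -> float:
    return cf1 + cf2 * (1 - cf1)

result = cf_combine(0.7, 0.9)
print(round(result, 2))  # 0.97
```

Note that the formula is symmetric for positive inputs: combining 0.7 with 0.9 gives the same result as combining 0.9 with 0.7, so the order in which the rules fire does not matter.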
Certainty Theory
So certainty theory allows us to reason with uncertainty in the evidence and in inferring the conclusion, i.e. the hypothesis. But what about vagueness of language: 'if the car is a Porsche then it is fast'? For this we use fuzzy logic.
Probability
21st September 2006, Bogdan L. Vrusias © 2006
Basic Probability Theory
The concept of probability has a long history that goes back thousands of years, when words like "probably", "likely", "maybe", "perhaps" and "possibly" were introduced into spoken languages. However, the mathematical theory of probability was formulated only in the 17th century. The probability of an event is the proportion of cases in which the event occurs. Probability can also be defined as a scientific measure of chance.
Basic Probability Theory
Probability can be expressed mathematically as a numerical index ranging from zero (an absolute impossibility) to unity (an absolute certainty). Most events have a probability index strictly between 0 and 1, which means that each event has at least two possible outcomes: a favourable outcome (success) and an unfavourable outcome (failure).
Basic Probability Theory
If s is the number of times success can occur, and f is the number of times failure can occur, then
p = P(success) = s / (s + f)
q = P(failure) = f / (s + f)
and p + q = 1.
If we throw a coin, the probability of getting a head will be equal to the probability of getting a tail. In a single throw, s = f = 1, and therefore the probability of getting a head (or a tail) is 0.5.
The Axioms of Probability
1. 0 <= P(A) <= 1
2. P(True) = 1
3. P(False) = 0
4. P(A or B) = P(A) + P(B) - P(A and B)
Theorems from the Axioms
From the axioms (0 <= P(A) <= 1, P(True) = 1, P(False) = 0, and P(A or B) = P(A) + P(B) - P(A and B)) we can prove:
P(not A) = P(~A) = 1 - P(A)
Copyright © Andrew W. Moore
Another Important Theorem
From the same axioms we can prove:
P(A) = P(A ^ B) + P(A ^ ~B)
How?
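A quick numerical sanity check of this theorem, using a small made-up joint distribution of two binary events (the numbers are illustrative, not from the slides):

```python
# Brute-force check of P(A) = P(A ^ B) + P(A ^ ~B) on a tiny
# hypothetical joint distribution over two binary events A and B.
joint = {
    (True, True): 0.05,   # P(A ^ B)
    (True, False): 0.05,  # P(A ^ ~B)
    (False, True): 0.20,
    (False, False): 0.70,
}
# Marginal of A: sum over all worlds where A is true.
p_a = sum(p for (a, _b), p in joint.items() if a)
assert abs(p_a - (joint[(True, True)] + joint[(True, False)])) < 1e-12
print(p_a)  # 0.1
```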
Conditional Probability
Let A be an event in the world and B be another event. Suppose that events A and B are not mutually exclusive, but occur conditionally on the occurrence of the other. The probability that event A will occur if event B occurs is called the conditional probability. It is denoted mathematically as p(A|B), in which the vertical bar represents "given"; the complete expression is read as "the conditional probability of event A occurring given that event B has occurred".
Conditional Probability
P(A|B) = the fraction of worlds in which B is true that also have A true.
H = "Have a headache", F = "Coming down with flu"
P(H) = 1/10, P(F) = 1/40, P(H|F) = 1/2
"Headaches are rare and flu is rarer, but if you're coming down with flu there's a 50-50 chance you'll have a headache."
Conditional Probability
H = "Have a headache", F = "Coming down with flu"
P(H) = 1/10, P(F) = 1/40, P(H|F) = 1/2
P(H|F) = fraction of flu-afflicted worlds in which you have a headache
       = (#worlds with flu and headache) / (#worlds with flu)
       = (area of the "H and F" region) / (area of the "F" region)
       = P(H ^ F) / P(F)
Definition of Conditional Probability
P(A|B) = P(A ^ B) / P(B)
Corollary, the chain rule:
P(A ^ B) = P(A|B) * P(B)
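A small sketch using the headache/flu numbers from these slides: the chain rule and the definition are two directions of the same identity.

```python
# Chain rule sketch using the slides' headache/flu numbers.
p_f = 1 / 40         # P(F): probability of coming down with flu
p_h_given_f = 1 / 2  # P(H|F): probability of a headache given flu
# Chain rule: P(H ^ F) = P(H|F) * P(F)
p_h_and_f = p_h_given_f * p_f
print(p_h_and_f)  # 0.0125
# The definition recovers the conditional from the joint:
assert abs(p_h_and_f / p_f - p_h_given_f) < 1e-12
```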
Probabilistic Inference
H = "Have a headache", F = "Coming down with flu"
P(H) = 1/10, P(F) = 1/40, P(H|F) = 1/2
One day you wake up with a headache. You think: "Drat! 50% of flus are associated with headaches, so I must have a 50-50 chance of coming down with flu." Is this reasoning good?
Probabilistic Inference
H = "Have a headache", F = "Coming down with flu"
P(H) = 1/10, P(F) = 1/40, P(H|F) = 1/2
P(F ^ H) = …
P(F|H) = …
Conditional Probability
The number of times A and B can occur, or the probability that both A and B will occur, is called the joint probability of A and B. It is represented mathematically as p(A ∩ B). The number of ways B can occur is the probability of B, p(B), and thus
p(A|B) = p(A ∩ B) / p(B)
Similarly, the conditional probability of event B occurring given that event A has occurred equals
p(B|A) = p(A ∩ B) / p(A)
Conditional Probability
Hence
p(A ∩ B) = p(A|B) * p(B)
and
p(A ∩ B) = p(B|A) * p(A)
Substituting the last equation into the first yields the Bayesian rule:
p(A|B) = p(B|A) * p(A) / p(B)
What we just did…
P(B|A) = P(A ^ B) / P(A) = P(A|B) * P(B) / P(A)
This is Bayes' rule.
Bayes, Thomas (1763). An essay towards solving a problem in the doctrine of chances. Philosophical Transactions of the Royal Society of London, 53:370-418.
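Applied to the earlier headache/flu question, Bayes' rule shows why the 50-50 reasoning was wrong (a small Python sketch):

```python
# Bayes' rule on the headache/flu numbers:
# P(F|H) = P(H|F) * P(F) / P(H)
p_h = 1 / 10          # P(headache)
p_f = 1 / 40          # P(flu)
p_h_given_f = 1 / 2   # P(headache | flu)
p_f_given_h = p_h_given_f * p_f / p_h
print(p_f_given_h)  # 0.125: a 1-in-8 chance, not 50-50
```

Waking up with a headache only raises the chance of flu from 1/40 to 1/8, because headaches are far more common than flu.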
Another Way to Understand the Intuition
Thanks to Jahanzeb Sherwani for contributing this explanation. [Figure not reproduced.]
Papers
- Uncertainty Management in Expert Systems
- PASS: An Expert System with Certainty Factors for Predicting Student Success
- FuzzyCLIPS
Next Week
- Bayesian inference
- Fuzzy logic