Artificial Intelligence
CIS 342, The College of Saint Rose
David Goldschmidt, Ph.D.
Uncertainty
Uncertainty is the lack of exact knowledge that would enable us to reach a fully reliable solution
– Classical logic assumes perfect knowledge exists: IF A is true THEN B is true
Describing uncertainty:
– If A is true, then B is true with probability p
Probability Theory
The probability of an event is the proportion of cases in which the event occurs
– Numerically, probability ranges from zero to unity (i.e., 0 to 1)
p(success) + p(failure) = 1
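The proportion-of-cases definition above can be sketched in a few lines of Python; the trial counts below are made-up numbers for illustration only.

```python
def probability(event_count, total_count):
    """Probability of an event as the proportion of cases in which it occurs."""
    return event_count / total_count

# Suppose 60 successes were observed in 100 trials (illustrative numbers).
p_success = probability(60, 100)
p_failure = 1.0 - p_success   # complement rule: p(success) + p(failure) = 1

print(p_success)              # 0.6
print(p_success + p_failure)  # 1.0
```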
Conditional Probability
Suppose events A and B are not mutuallyexclusive, but each may occur conditionally on the occurrence of the other
– The probability that event A will occur given that event B has occurred is called the conditional probability of A given B, written p(A|B)
Conditional Probability
The probability that both A and B occur is called the joint probability of A and B, written p(A ∩ B)
– The conditional probability is then:
p(A|B) = p(A ∩ B) / p(B)
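The definition of conditional probability in terms of the joint probability can be sketched directly; the probability values below are illustrative assumptions, not data from the slides.

```python
def conditional(p_joint, p_condition):
    """p(A|B) = p(A ∩ B) / p(B)."""
    return p_joint / p_condition

p_B = 0.5          # p(B), an assumed value for illustration
p_A_and_B = 0.2    # p(A ∩ B), an assumed value for illustration
p_A_given_B = conditional(p_A_and_B, p_B)
print(p_A_given_B)  # 0.4
```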
Conditional Probability
Similarly, the conditional probability that event B will occur given that event A has occurred can be written as:
p(B|A) = p(B ∩ A) / p(A)
Conditional Probability
Since p(A ∩ B) = p(B ∩ A), the two definitions give:
p(A|B) × p(B) = p(B|A) × p(A)
The Bayesian Rule
The Bayesian rule (named after Thomas Bayes, an 18th-century British mathematician):
p(A|B) = p(B|A) × p(A) / p(B)
The Bayesian Rule
If event A depends on exactly two mutually exclusive events, B and ¬B, we obtain:
p(A) = p(A|B) × p(B) + p(A|¬B) × p(¬B)
Similarly, if event B depends on exactly two mutually exclusive events, A and ¬A, we obtain:
p(B) = p(B|A) × p(A) + p(B|¬A) × p(¬A)
The Bayesian Rule
Substituting this expression for p(B) into the Bayesian rule yields:
p(A|B) = p(B|A) × p(A) / [ p(B|A) × p(A) + p(B|¬A) × p(¬A) ]
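The single-evidence form of the Bayesian rule, with p(B) expanded over the two mutually exclusive cases A and ¬A, can be sketched as a small function; the input probabilities in the example call are assumed values for illustration.

```python
def bayes(p_A, p_B_given_A, p_B_given_not_A):
    """p(A|B) = p(B|A)p(A) / [p(B|A)p(A) + p(B|¬A)p(¬A)]."""
    p_not_A = 1.0 - p_A
    numerator = p_B_given_A * p_A
    denominator = numerator + p_B_given_not_A * p_not_A
    return numerator / denominator

# e.g. p(A) = 0.1, p(B|A) = 0.9, p(B|¬A) = 0.2  (illustrative values)
print(bayes(0.1, 0.9, 0.2))
```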
The Bayesian Rule
Expert systems use the Bayesian rule to rank potentially true hypotheses based on observed evidence
The Bayesian Rule
If event E occurs, then the probability that event H will occur is p(H|E):
IF E (evidence) is true
THEN H (hypothesis) is true with probability p
The Bayesian Rule
The expert identifies prior probabilities for hypotheses p(H) and p(¬H)
The expert also identifies conditional probabilities for:
– p(E|H): observing evidence E if hypothesis H is true
– p(E|¬H): observing evidence E if hypothesis H is false
This can be burdensome to the expert....
The Bayesian Rule
Experts provide p(H), p(¬H), p(E|H), and p(E|¬H)
Users describe observed evidence E
– The expert system calculates p(H|E) using the Bayesian rule
– p(H|E) is the posterior probability that hypothesis H is true given observed evidence E
What about multiple hypotheses and multiple pieces of evidence?
The Bayesian Rule
Expand the Bayesian rule to work with multiple hypotheses (H1 ... Hm) and multiple evidences (E1 ... En), assuming conditional independence among evidences E1 ... En:
p(Hk | E1 E2 ... En) = p(E1|Hk) × p(E2|Hk) × ... × p(En|Hk) × p(Hk) / Σi [ p(E1|Hi) × p(E2|Hi) × ... × p(En|Hi) × p(Hi) ]
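The generalized rule above can be sketched as a product of likelihoods followed by normalization; all numbers in the example call are illustrative assumptions, not values from the slides.

```python
def posterior(priors, likelihoods):
    """
    priors: list of p(H_k) for the mutually exclusive, exhaustive hypotheses.
    likelihoods: list of lists; likelihoods[i][k] = p(E_i | H_k)
                 for each observed evidence E_i (assumed conditionally independent).
    Returns the normalized posteriors p(H_k | E_1 ... E_n).
    """
    scores = list(priors)
    for evidence in likelihoods:
        scores = [s * l for s, l in zip(scores, evidence)]
    total = sum(scores)
    return [s / total for s in scores]

# Two hypotheses, two observed evidences (made-up numbers):
print(posterior([0.6, 0.4], [[0.7, 0.2], [0.5, 0.9]]))
```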
Bayesian Rule Example
The expert is given three conditionally independent evidences E1, E2, and E3
– The expert creates three mutually exclusive and exhaustive hypotheses H1, H2, and H3
– The expert provides prior probabilities p(H1), p(H2), p(H3)
– The expert identifies conditional probabilities for observing each evidence Ei under each possible hypothesis Hk
Bayesian Rule Example
Expert data: a table of the prior probabilities p(Hk) and the conditional probabilities p(Ei|Hk) (table not reproduced in this text)
Bayesian Rule Example
The user observes E3; the expert system computes posterior probabilities p(Hk|E3)
Bayesian Rule Example
The user then observes E1; the expert system computes updated posterior probabilities
Bayesian Rule Example
The user then observes E2; the expert system computes updated posterior probabilities
Bayesian Rule Example
Initial expert-based ranking:
– p(H1) = 0.40; p(H2) = 0.35; p(H3) = 0.25
Expert system ranking after observing E1, E2, and E3:
– p(H1) = 0.45; p(H2) = 0.0; p(H3) = 0.55
Success hinges on the expert defining all probabilities!
– It also depends on the knowledge engineer correctly interpreting and programming the expert's data
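The example can be reproduced numerically. The priors and the final ranking come from the slides; the likelihood table p(Ei|Hk) below is an assumption, since the original data table is not preserved in this text (the values chosen are consistent with the quoted results). Because multiplication is commutative, the order in which E1, E2, E3 are observed does not affect the final posteriors.

```python
priors = [0.40, 0.35, 0.25]        # p(H1), p(H2), p(H3), from the slide
likelihoods = {                    # assumed p(Ei|Hk) values, one row per evidence
    "E1": [0.3, 0.8, 0.5],
    "E2": [0.9, 0.0, 0.7],
    "E3": [0.6, 0.7, 0.9],
}

scores = list(priors)
for evidence in ("E3", "E1", "E2"):   # observation order from the slides
    scores = [s * l for s, l in zip(scores, likelihoods[evidence])]

total = sum(scores)
posteriors = [round(s / total, 2) for s in scores]
print(posteriors)  # [0.45, 0.0, 0.55] — H3 overtakes H1, and H2 is ruled out
```

Note how a single zero likelihood (p(E2|H2) = 0.0) eliminates H2 entirely: once any observed evidence is impossible under a hypothesis, its posterior is zero regardless of its prior.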