CSCI 121 Special Topics: Bayesian Networks
Lecture #1: Reasoning Under Uncertainty
Uncertainty

Traditional models of reasoning (human or computer) use “all-or-nothing” (discrete) variables and rules:

Hungry(Fido)
Toothache(Simon)
Toothache(X) → Cavity(X)

Reality is usually more complicated:

Toothache(X) → Cavity(X) 70% of the time
Toothache(X) → Gingivitis(X) 20% of the time
Uncertainty

In general, all-or-nothing rules fail for three reasons:

1. Laziness – we don't have enough time or resources to list all such rules for a given domain.
2. Theoretical ignorance – we don't have a complete theory of the domain.
3. Practical ignorance – even with all the rules and a perfect theory, we can't make all the necessary observations.
Uncertainty and Rational Decisions

Utility – how useful is a particular outcome to the agent?
Probability – how likely is a particular outcome?

Utility + Probability = Decision theory

E.g., a lottery ticket: high utility ($$$) × extremely low probability → bad decision!
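To make the lottery point concrete, here is a minimal expected-utility sketch in Python; the ticket price, jackpot, and odds are made-up illustrative numbers, not from the slides.

```python
# Expected-utility sketch for a lottery ticket.
# All numbers below are assumed, purely for illustration.
ticket_price = 2.0             # cost of playing (always paid)
jackpot = 100_000_000.0        # utility of winning
p_win = 1 / 300_000_000        # probability of winning

# Expected utility = sum over outcomes of (utility * probability),
# minus the certain cost of the ticket.
expected_utility = p_win * jackpot + (1 - p_win) * 0.0 - ticket_price
print(expected_utility)        # about -1.67: a bad decision on average
```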
Basic Probability

Prior probability – how likely is something, without any other knowledge?
P(cavity) = 0.05

Conditional (posterior) probability – how likely is something, once you know something else?
P(toothache | cavity) = 0.7

Product Rule:
P(A|B) = P(A & B) / P(B)
P(A & B) = P(A|B) * P(B) = P(B|A) * P(A)
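A quick sketch of the Product Rule in Python, using the slide's numbers:

```python
# Product Rule: P(A & B) = P(A|B) * P(B)
p_cavity = 0.05                    # prior, from the slide
p_toothache_given_cavity = 0.7     # conditional, from the slide

p_toothache_and_cavity = p_toothache_given_cavity * p_cavity
print(p_toothache_and_cavity)      # 0.035
```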
Basic Probability

Probability distribution: all possible values of a given variable, together with their probabilities (sum = 1):
cavity = 0.8; gingivitis = 0.1; abscess = 0.05; other = 0.05

Joint probability: how likely is it that two things occur (are observed) together?
rainy & cloudy = 0.3; cloudy & cool = 0.4
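A distribution is easy to represent and sanity-check in code. A minimal sketch with the slide's numbers, where "other" stands in for the unnamed remainder category:

```python
# A probability distribution over possible causes; the values must sum to 1.
dist = {"cavity": 0.8, "gingivitis": 0.1, "abscess": 0.05, "other": 0.05}
assert abs(sum(dist.values()) - 1.0) < 1e-9   # sanity check: sum = 1
```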
Axioms of Probability

1) All probabilities are between 0 and 1.
2) Necessarily true propositions (A V ~A) have probability 1; necessarily false propositions (A & ~A) have probability 0.
3) P(A V B) = P(A) + P(B) – P(A & B)
Axioms of Probability

P(A V B) = P(A) + P(B) – P(A & B)

[Venn diagram: circles A and B, overlapping in the region A & B]

E.g., in Los Angeles, maybe P(sunny) = 0.8 and P(warm) = 0.7. Since no probability can exceed 1, we can't just add 0.8 + 0.7 to get P(sunny V warm). We need to subtract P(sunny & warm) = 0.6, giving P(sunny V warm) = 0.9.
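Checking the Los Angeles example against axiom 3:

```python
# Inclusion-exclusion: P(A V B) = P(A) + P(B) - P(A & B)
p_sunny, p_warm = 0.8, 0.7
p_sunny_and_warm = 0.6

p_sunny_or_warm = p_sunny + p_warm - p_sunny_and_warm
print(p_sunny_or_warm)   # 0.9 (up to float rounding) -- a legal probability
```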
Bayes’ Rule

Rev. Thomas Bayes (1702–1761)

From the Product Rule:
P(A|B) = P(A & B) / P(B)
P(A & B) = P(A|B) * P(B) = P(B|A) * P(A)

We derive Bayes’ Rule by substitution:
P(A|B) = P(A & B) / P(B) = P(B|A) * P(A) / P(B)
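Applying Bayes' Rule to the earlier dental numbers; note that the marginal P(toothache) = 0.1 is an assumed value for illustration, not given on the slides:

```python
# Bayes' Rule: P(cavity|toothache) = P(toothache|cavity) * P(cavity) / P(toothache)
p_cavity = 0.05                    # prior, from an earlier slide
p_toothache_given_cavity = 0.7     # from an earlier slide
p_toothache = 0.1                  # assumed marginal, illustration only

p_cavity_given_toothache = p_toothache_given_cavity * p_cavity / p_toothache
print(p_cavity_given_toothache)    # ~0.35
```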
Bayesian (“Belief”) Nets

[Network diagram: Burglary and Earthquake are parents of Alarm; Alarm is the parent of JohnCalls and MaryCalls.]

P(B) = .001    P(E) = .002

P(A | B, E):
B  E  P(A)
T  T  .95
T  F  .94
F  T  .29
F  F  .001

P(J | A):
A  P(J)
T  .90
F  .05

P(M | A):
A  P(M)
T  .70
F  .01
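A minimal sketch of how this network might be encoded in Python; the variable and function names are ours, but the numbers are the slide's:

```python
# CPTs for the burglary network, straight from the slide.
P_B = 0.001                                        # P(Burglary = true)
P_E = 0.002                                        # P(Earthquake = true)
P_A = {(True, True): 0.95, (True, False): 0.94,    # P(Alarm = true | B, E)
       (False, True): 0.29, (False, False): 0.001}
P_J = {True: 0.90, False: 0.05}                    # P(JohnCalls = true | Alarm)
P_M = {True: 0.70, False: 0.01}                    # P(MaryCalls = true | Alarm)

def joint(b, e, a, j, m):
    """Full joint probability via the factorization the net encodes:
    P(b,e,a,j,m) = P(b) * P(e) * P(a|b,e) * P(j|a) * P(m|a)."""
    p = (P_B if b else 1 - P_B) * (P_E if e else 1 - P_E)
    p *= P_A[(b, e)] if a else 1 - P_A[(b, e)]
    p *= P_J[a] if j else 1 - P_J[a]
    p *= P_M[a] if m else 1 - P_M[a]
    return p

# e.g., burglary, no earthquake, alarm sounds, both neighbors call:
print(joint(True, False, True, True, True))  # ~0.00059
```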
Bayesian Nets

Using inference techniques developed for these networks (Pearl 1982), we can ask, e.g., “How likely is a burglary, given that John has called?” A sketch of this query by enumeration follows below.

We can also learn the relationships themselves – creating “hidden” variables and probability tables – from observations.

Bayesian nets are a current “hot topic” in AI.
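One way to answer that query is brute-force enumeration, reusing the joint() function from the sketch above; this is only feasible here because the network is tiny:

```python
from itertools import product

def p_burglary_given_john_calls():
    """P(Burglary = true | JohnCalls = true), summing the full joint
    over all assignments of the unobserved variables."""
    num = den = 0.0
    for b, e, a, m in product([True, False], repeat=4):
        p = joint(b, e, a, True, m)   # clamp the evidence: JohnCalls = true
        den += p                      # accumulates P(JohnCalls = true)
        if b:
            num += p                  # accumulates P(Burglary & JohnCalls)
    return num / den

print(p_burglary_given_john_calls())  # ~0.016: one call barely raises suspicion
```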