
1 CS 188: Artificial Intelligence Fall 2009 Lecture 14: Bayes’ Nets 10/13/2009 Dan Klein – UC Berkeley

2 Announcements
  - Assignments
    - P3 due yesterday
    - W2 due Thursday
    - W1 returned in front (after lecture)
  - Midterm
    - 10/22, 6-9pm, 155 Dwinelle
    - Prep page / details will be posted on the web soon
    - One sheet of notes allowed
    - Basic calculators allowed, not required
    - Review sessions TBA

3 Probabilistic Models
  - Models describe how (a portion of) the world works
  - Models are always simplifications
    - May not account for every variable
    - May not account for all interactions between variables
    - “All models are wrong; but some are useful.” – George E. P. Box
  - What do we do with probabilistic models?
    - We (or our agents) need to reason about unknown variables, given evidence
    - Example: explanation (diagnostic reasoning)
    - Example: prediction (causal reasoning)
    - Example: value of information

4 Probabilistic Models
  - A probabilistic model is a joint distribution over a set of variables
  - Given a joint distribution, we can reason about unobserved variables given observations (evidence)
  - General form of a query: P(Q | e1, ..., ek), where Q is the stuff you care about and e1, ..., ek is the stuff you already know
  - This kind of posterior distribution is also called the belief function of an agent which uses this model

5 Model for Ghostbusters
  - Reminder: ghost is hidden, sensors are noisy
  - T: Top sensor is red   B: Bottom sensor is red   G: Ghost is in the top
  - Queries: P(+g) = ??   P(+g | +t) = ??   P(+g | +t, ¬b) = ??
  - Problem: joint distribution too large / complex

  Joint Distribution P(T, B, G):
    +t  +b  +g   0.16
    +t  +b  ¬g   0.16
    +t  ¬b  +g   0.24
    +t  ¬b  ¬g   0.04
    ¬t  +b  +g   0.04
    ¬t  +b  ¬g   0.24
    ¬t  ¬b  +g   0.06
    ¬t  ¬b  ¬g   0.06
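
A minimal sketch of answering those queries by brute-force summation over the joint table above (plain Python written for this transcript; it is not CS 188 starter code):

    # Joint from slide 5, keyed by (t, b, g); True means '+', False means negated
    P = {
        (True,  True,  True):  0.16, (True,  True,  False): 0.16,
        (True,  False, True):  0.24, (True,  False, False): 0.04,
        (False, True,  True):  0.04, (False, True,  False): 0.24,
        (False, False, True):  0.06, (False, False, False): 0.06,
    }

    def prob(pred):
        """Sum the joint over all assignments (t, b, g) satisfying pred."""
        return sum(p for tbg, p in P.items() if pred(*tbg))

    print(prob(lambda t, b, g: g))                                   # P(+g) ≈ 0.5
    print(prob(lambda t, b, g: g and t) / prob(lambda t, b, g: t))   # P(+g | +t) ≈ 0.667
    print(prob(lambda t, b, g: g and t and not b)
          / prob(lambda t, b, g: t and not b))                       # P(+g | +t, ¬b) ≈ 0.857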

6 Independence
  - Two variables are independent if: ∀x, y: P(x, y) = P(x) P(y)
  - This says that their joint distribution factors into a product of two simpler distributions
  - Another form: ∀x, y: P(x | y) = P(x)
  - We write: X ⊥ Y
  - Independence is a simplifying modeling assumption
    - Empirical joint distributions: at best “close” to independent
    - What could we assume for {Weather, Traffic, Cavity, Toothache}?

7 Example: Independence
  - N fair, independent coin flips:
    P(X1): h 0.5, t 0.5    P(X2): h 0.5, t 0.5    ...    P(Xn): h 0.5, t 0.5

8 Example: Independence?
  P1(T, W):
    warm  sun   0.4
    warm  rain  0.1
    cold  sun   0.2
    cold  rain  0.3
  P2(T, W) (this one does factor as P(T) P(W)):
    warm  sun   0.3
    warm  rain  0.2
    cold  sun   0.3
    cold  rain  0.2
  Marginals:
    P(T): warm 0.5, cold 0.5
    P(W): sun 0.6, rain 0.4
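
A quick numerical check (my own sketch, not part of the lecture) that P1 above is not the product of its marginals, while P2 is:

    # Joint tables from slide 8
    P1 = {('warm', 'sun'): 0.4, ('warm', 'rain'): 0.1,
          ('cold', 'sun'): 0.2, ('cold', 'rain'): 0.3}
    P2 = {('warm', 'sun'): 0.3, ('warm', 'rain'): 0.2,
          ('cold', 'sun'): 0.3, ('cold', 'rain'): 0.2}

    def is_independent(joint, tol=1e-9):
        """Check whether joint(t, w) == P(t) * P(w) for every cell."""
        P_T = {t: sum(p for (t2, w), p in joint.items() if t2 == t) for t in ('warm', 'cold')}
        P_W = {w: sum(p for (t, w2), p in joint.items() if w2 == w) for w in ('sun', 'rain')}
        return all(abs(p - P_T[t] * P_W[w]) < tol for (t, w), p in joint.items())

    print(is_independent(P1), is_independent(P2))   # False True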

9 Conditional Independence
  - P(Toothache, Cavity, Catch)
  - If I have a cavity, the probability that the probe catches in it doesn't depend on whether I have a toothache:
    - P(+catch | +toothache, +cavity) = P(+catch | +cavity)
  - The same independence holds if I don’t have a cavity:
    - P(+catch | +toothache, ¬cavity) = P(+catch | ¬cavity)
  - Catch is conditionally independent of Toothache given Cavity:
    - P(Catch | Toothache, Cavity) = P(Catch | Cavity)
  - Equivalent statements:
    - P(Toothache | Catch, Cavity) = P(Toothache | Cavity)
    - P(Toothache, Catch | Cavity) = P(Toothache | Cavity) P(Catch | Cavity)
    - One can be derived from the other easily
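
The dental numbers are not given on this slide, so as an illustration here is the same kind of check run on the Ghostbusters joint from slide 5, where the top sensor turns out to be conditionally independent of the bottom sensor given the ghost's position (my own sketch):

    # Ghostbusters joint from slide 5: (t, b, g) -> probability
    P = {(True, True, True): 0.16, (True, True, False): 0.16,
         (True, False, True): 0.24, (True, False, False): 0.04,
         (False, True, True): 0.04, (False, True, False): 0.24,
         (False, False, True): 0.06, (False, False, False): 0.06}

    def prob(pred):
        return sum(p for tbg, p in P.items() if pred(*tbg))

    # If T is conditionally independent of B given G, then P(+t | +b, +g) = P(+t | +g)
    p_t_given_bg = prob(lambda t, b, g: t and b and g) / prob(lambda t, b, g: b and g)
    p_t_given_g  = prob(lambda t, b, g: t and g) / prob(lambda t, b, g: g)
    print(p_t_given_bg, p_t_given_g)   # both ≈ 0.8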

10 Conditional Independence
  - Unconditional (absolute) independence is very rare (why?)
  - Conditional independence is our most basic and robust form of knowledge about uncertain environments
  - What about this domain:
    - Traffic
    - Umbrella
    - Raining
  - What about fire, smoke, alarm?

11 The Chain Rule
  - Trivial decomposition:
    P(Rain, Traffic, Umbrella) = P(Rain) P(Traffic | Rain) P(Umbrella | Rain, Traffic)
  - With assumption of conditional independence:
    P(Rain, Traffic, Umbrella) = P(Rain) P(Traffic | Rain) P(Umbrella | Rain)
  - Bayes’ nets / graphical models help us express conditional independence assumptions

12 Ghostbusters Chain Rule
  - Each sensor depends only on where the ghost is
  - That means, the two sensors are conditionally independent, given the ghost position
  - T: Top square is red   B: Bottom square is red   G: Ghost is in the top
  - Givens:
    P(+g) = 0.5
    P(+t | +g) = 0.8    P(+t | ¬g) = 0.4
    P(+b | +g) = 0.4    P(+b | ¬g) = 0.8
  - P(T, B, G) = P(G) P(T | G) P(B | G)

  Resulting joint P(T, B, G):
    +t  +b  +g   0.16
    +t  +b  ¬g   0.16
    +t  ¬b  +g   0.24
    +t  ¬b  ¬g   0.04
    ¬t  +b  +g   0.04
    ¬t  +b  ¬g   0.24
    ¬t  ¬b  +g   0.06
    ¬t  ¬b  ¬g   0.06
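
A small sketch (my own code, not from the course) that rebuilds the joint from the givens via P(T, B, G) = P(G) P(T | G) P(B | G); it reproduces the table on slide 5 exactly:

    # Givens from slide 12 (True = '+', False = negated)
    P_g      = {True: 0.5, False: 0.5}
    P_t_plus = {True: 0.8, False: 0.4}   # P(+t | g); P(¬t | g) = 1 - P(+t | g)
    P_b_plus = {True: 0.4, False: 0.8}   # P(+b | g)

    def joint(t, b, g):
        pt = P_t_plus[g] if t else 1 - P_t_plus[g]
        pb = P_b_plus[g] if b else 1 - P_b_plus[g]
        return P_g[g] * pt * pb

    for t in (True, False):
        for b in (True, False):
            for g in (True, False):
                print(t, b, g, joint(t, b, g))   # e.g. (+t, +b, +g) -> 0.16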

13 Bayes’ Nets: Big Picture
  - Two problems with using full joint distribution tables as our probabilistic models:
    - Unless there are only a few variables, the joint is WAY too big to represent explicitly
    - Hard to learn (estimate) anything empirically about more than a few variables at a time
  - Bayes’ nets: a technique for describing complex joint distributions (models) using simple, local distributions (conditional probabilities)
    - More properly called graphical models
    - We describe how variables locally interact
    - Local interactions chain together to give global, indirect interactions
    - For about 10 min, we’ll be vague about how these interactions are specified

14 Example Bayes’ Net: Insurance

15 Example Bayes’ Net: Car

16 Graphical Model Notation
  - Nodes: variables (with domains)
    - Can be assigned (observed) or unassigned (unobserved)
  - Arcs: interactions
    - Similar to CSP constraints
    - Indicate “direct influence” between variables
    - Formally: encode conditional independence (more later)
  - For now: imagine that arrows mean direct causation (in general, they don’t!)

17 Example: Coin Flips
  - N independent coin flips: nodes X1, X2, ..., Xn with no arcs between them
  - No interactions between variables: absolute independence

18 Example: Traffic
  - Variables:
    - R: It rains
    - T: There is traffic
  - Model 1: independence
  - Model 2: rain causes traffic (R → T)
  - Why is an agent using model 2 better?

19 Example: Traffic II
  - Let’s build a causal graphical model
  - Variables:
    - T: Traffic
    - R: It rains
    - L: Low pressure
    - D: Roof drips
    - B: Ballgame
    - C: Cavity

20 Example: Alarm Network
  - Variables:
    - B: Burglary
    - A: Alarm goes off
    - M: Mary calls
    - J: John calls
    - E: Earthquake!

21 Bayes’ Net Semantics
  - Let’s formalize the semantics of a Bayes’ net
  - A set of nodes, one per variable X
  - A directed, acyclic graph
  - A conditional distribution for each node given its parents A1, ..., An: P(X | A1, ..., An)
    - A collection of distributions over X, one for each combination of parents’ values
    - CPT: conditional probability table
    - Description of a noisy “causal” process
  - A Bayes net = Topology (graph) + Local Conditional Probabilities

22 Probabilities in BNs
  - Bayes’ nets implicitly encode joint distributions
    - As a product of local conditional distributions
  - To see what probability a BN gives to a full assignment, multiply all the relevant conditionals together:
    P(x1, x2, ..., xn) = ∏i P(xi | parents(Xi))
  - Example: see the sketch below
  - This lets us reconstruct any entry of the full joint
  - Not every BN can represent every joint distribution
    - The topology enforces certain conditional independencies
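
One possible encoding of this product, sketched in plain Python (the dict-based representation and names are my own choices, not the course's):

    def joint_probability(assignment, parents, cpt):
        """P(x1..xn) = product over variables of P(xi | parents(Xi))."""
        total = 1.0
        for var, value in assignment.items():
            parent_values = tuple(assignment[p] for p in parents[var])
            total *= cpt[var][(value, parent_values)]
        return total

    # Example: the two-node rain/traffic net from slide 24 below
    parents = {'R': [], 'T': ['R']}
    cpt = {
        'R': {('+r', ()): 0.25, ('-r', ()): 0.75},
        'T': {('+t', ('+r',)): 0.75, ('-t', ('+r',)): 0.25,
              ('+t', ('-r',)): 0.50, ('-t', ('-r',)): 0.50},
    }
    print(joint_probability({'R': '+r', 'T': '+t'}, parents, cpt))   # 3/16 = 0.1875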

23 Example: Coin Flips
  - Nodes X1, X2, ..., Xn, no arcs; each P(Xi): h 0.5, t 0.5
  - Only distributions whose variables are absolutely independent can be represented by a Bayes’ net with no arcs.

24 Example: Traffic
  R → T
  P(R):      +r 1/4    ¬r 3/4
  P(T | R):  +r: +t 3/4, ¬t 1/4    ¬r: +t 1/2, ¬t 1/2

25 Example: Alarm Network
  Burglary → Alarm ← Earthquake;  Alarm → John calls;  Alarm → Mary calls

  P(B):  +b 0.001   ¬b 0.999
  P(E):  +e 0.002   ¬e 0.998

  P(A | B, E):
    +b  +e   +a 0.95    ¬a 0.05
    +b  ¬e   +a 0.94    ¬a 0.06
    ¬b  +e   +a 0.29    ¬a 0.71
    ¬b  ¬e   +a 0.001   ¬a 0.999

  P(J | A):  +a: +j 0.9, ¬j 0.1    ¬a: +j 0.05, ¬j 0.95
  P(M | A):  +a: +m 0.7, ¬m 0.3    ¬a: +m 0.01, ¬m 0.99
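
To get the probability of one full assignment, multiply one entry from each of the five tables above; for example the classic "both neighbors call, alarm went off, but no burglary and no earthquake" case (a worked sketch, my own code):

    # CPT entries from slide 25 needed for the assignment (+j, +m, +a, ¬b, ¬e)
    p_not_b, p_not_e         = 0.999, 0.998
    p_a_given_not_b_not_e    = 0.001
    p_j_given_a, p_m_given_a = 0.9, 0.7

    # P(+j, +m, +a, ¬b, ¬e) = P(¬b) P(¬e) P(+a | ¬b, ¬e) P(+j | +a) P(+m | +a)
    p = p_not_b * p_not_e * p_a_given_not_b_not_e * p_j_given_a * p_m_given_a
    print(p)   # ≈ 0.000628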

26 Bayes’ Nets
  - So far: how a Bayes’ net encodes a joint distribution
  - Next: how to answer queries about that distribution
    - Key idea: conditional independence
    - Last class: assembled BNs using an intuitive notion of conditional independence as causality
    - Today: formalize these ideas
    - Main goal: answer queries about conditional independence and influence
  - After that: how to answer numerical queries (inference)

27 Example: Traffic
  - Causal direction: R → T
  P(R):      +r 1/4    ¬r 3/4
  P(T | R):  +r: +t 3/4, ¬t 1/4    ¬r: +t 1/2, ¬t 1/2
  Joint P(R, T):  +r +t 3/16   +r ¬t 1/16   ¬r +t 6/16   ¬r ¬t 6/16

28 Example: Reverse Traffic
  - Reverse causality? T → R
  P(T):      +t 9/16   ¬t 7/16
  P(R | T):  +t: +r 1/3, ¬r 2/3    ¬t: +r 1/7, ¬r 6/7
  Joint P(R, T):  +r +t 3/16   +r ¬t 1/16   ¬r +t 6/16   ¬r ¬t 6/16
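
The two factorizations on slides 27 and 28 describe the same joint over R and T; a quick check (my own sketch, using exact fractions to avoid rounding):

    from fractions import Fraction as F

    # Causal direction (slide 27): joint = P(R) * P(T | R)
    P_R = {'+r': F(1, 4), '-r': F(3, 4)}
    P_T_given_R = {'+r': {'+t': F(3, 4), '-t': F(1, 4)},
                   '-r': {'+t': F(1, 2), '-t': F(1, 2)}}
    causal = {(r, t): P_R[r] * P_T_given_R[r][t] for r in P_R for t in ('+t', '-t')}

    # Reverse direction (slide 28): joint = P(T) * P(R | T)
    P_T = {'+t': F(9, 16), '-t': F(7, 16)}
    P_R_given_T = {'+t': {'+r': F(1, 3), '-r': F(2, 3)},
                   '-t': {'+r': F(1, 7), '-r': F(6, 7)}}
    reverse = {(r, t): P_T[t] * P_R_given_T[t][r] for t in P_T for r in ('+r', '-r')}

    print(causal == reverse)   # True: same joint, e.g. ('+r', '+t') -> 3/16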

29 Causality?
  - When Bayes’ nets reflect the true causal patterns:
    - Often simpler (nodes have fewer parents)
    - Often easier to think about
    - Often easier to elicit from experts
  - BNs need not actually be causal
    - Sometimes no causal net exists over the domain (especially if variables are missing)
    - E.g. consider the variables Traffic and Drips
    - End up with arrows that reflect correlation, not causation
  - What do the arrows really mean?
    - Topology may happen to encode causal structure
    - Topology really encodes conditional independence

30 Example: Naïve Bayes
  - Imagine we have one cause y and several effects x1, ..., xn:
    P(y, x1, ..., xn) = P(y) P(x1 | y) ... P(xn | y)
  - This is a naïve Bayes model
  - We’ll use these for classification later
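
A minimal classification sketch built on this factorization: score each class y by P(y) ∏i P(xi | y) and pick the largest. The spam-flavored feature names and all numbers below are invented for illustration only; they are not from the lecture:

    # Hypothetical CPTs for a two-class naive Bayes model (numbers made up)
    P_y = {'spam': 0.3, 'ham': 0.7}
    P_x_given_y = {                       # P(feature present | class)
        'spam': {'money': 0.6,  'meeting': 0.1},
        'ham':  {'money': 0.05, 'meeting': 0.4},
    }

    def score(y, features):
        """Joint P(y, x1..xn) = P(y) * prod_i P(xi | y) for binary features."""
        p = P_y[y]
        for f, present in features.items():
            p *= P_x_given_y[y][f] if present else 1 - P_x_given_y[y][f]
        return p

    features = {'money': True, 'meeting': False}
    print(max(P_y, key=lambda y: score(y, features)))   # 'spam' for this input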

31 Example: Alarm Network

32 The Chain Rule
  - Can always factor any joint distribution as an incremental product of conditional distributions:
    P(x1, x2, ..., xn) = P(x1) P(x2 | x1) P(x3 | x1, x2) ... P(xn | x1, ..., xn-1)
  - Why is the chain rule true?
  - This actually claims nothing (it holds for any distribution, with no independence assumptions)
  - What are the sizes of the tables we supply?
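
A back-of-the-envelope sketch of the table-size question (my own illustration, not on the slide; the specific n and k are arbitrary): with n binary variables, the plain chain-rule tables total about 2^n entries, while a Bayes’ net whose nodes have at most k parents needs only about n · 2^(k+1):

    n, k = 20, 3   # 20 binary variables, at most 3 parents per node (assumed numbers)

    # Plain chain rule: the i-th factor P(x_i | x_1..x_{i-1}) has 2**i entries
    chain_rule_entries = sum(2 ** i for i in range(1, n + 1))

    # Bayes' net: each CPT has at most 2**k parent rows times 2 values
    bn_entries = n * 2 ** (k + 1)

    print(chain_rule_entries, bn_entries)   # 2097150 vs 320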

33 Example: Alarm Network
  Burglary → Alarm ← Earthquake;  Alarm → John calls;  Alarm → Mary calls

