Slide 1: Bayes Nets (Rong Jin)
Slide 2: Hidden Markov Model
Inferring the hidden variables (q_i) from the observations (o_i). This is a general framework for representing and reasoning about uncertainty:
Uncertain information is represented by random variables (nodes).
Relationships between pieces of information are represented by conditional probability distributions (directed arcs).
We infer from the observations (shaded nodes O_0, ..., O_4) back to the hidden variables (circled nodes q_0, ..., q_4).
Slide 3: An Example of a Bayes Network
S: it is sunny
L: Ali arrives slightly late
O: slides are put on the web late
Slide 4: Bayes Network Example
Network structure: S → L ← O.
Absence of an arrow between S and O: the random variables S and O are independent; knowing S will not help predict O.
Two arrows into L: L depends on S and O; knowing S and O will help predict L.
Slide 5: Inference in a Bayes Network (S → L ← O)
Given S = 1, O = 0: P(L) = ?
Given S = 1: P(O) = ?, P(L) = ?
Given L = 1: P(S) = ?, P(O) = ?
Given L = 1, S = 1: P(O) = ?
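The queries above can all be answered by enumerating the tiny joint P(S) P(O) P(L | S, O). A minimal sketch follows; every probability value is an illustrative assumption (the slides give none), and the helper names are hypothetical.

```python
# Enumeration inference in the tiny S -> L <- O network.
# All probability values are illustrative assumptions, not from the slides.
P_S = {1: 0.3, 0: 0.7}                 # P(S): it is sunny
P_O = {1: 0.2, 0: 0.8}                 # P(O): slides put on the web late
P_L = {(1, 1): 0.9, (1, 0): 0.6,       # P(L=1 | S, O): Ali arrives late
       (0, 1): 0.5, (0, 0): 0.1}

def joint(s, o, l):
    """P(S=s, O=o, L=l) = P(s) P(o) P(l | s, o)."""
    p_l1 = P_L[(s, o)]
    return P_S[s] * P_O[o] * (p_l1 if l == 1 else 1 - p_l1)

def p_L_given(s, o):
    """P(L=1 | S=s, O=o): a direct CPT lookup."""
    return P_L[(s, o)]

def p_S_given_L(l):
    """P(S=1 | L=l): infer 'upstream' by summing the joint over O."""
    num = sum(joint(1, o, l) for o in (0, 1))
    den = sum(joint(s, o, l) for s in (0, 1) for o in (0, 1))
    return num / den
```

Here `p_L_given(1, 0)` answers the first query and `p_S_given_L(1)` the third; the downstream query is a table lookup, while the upstream one needs Bayes' rule via the joint.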
Slide 6: Conditional Independence
Formal definition: A and B are conditionally independent given C iff
P(A, B | C) = P(A | C) P(B | C).
This is different from (unconditional) independence.
Example (A ← C → B): A: shoe size, B: glove size, C: height. Shoe size is not independent of glove size, but the two become independent once height is given.
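The shoe-size example can be checked numerically. The sketch below builds a common-cause joint P(C) P(A | C) P(B | C) from made-up (assumed) numbers and verifies both halves of the claim: conditional independence given C, but marginal dependence.

```python
# Common-cause structure A <- C -> B with made-up (assumed) numbers.
P_C = {1: 0.5, 0: 0.5}             # C: tall vs. short
P_A_given_C = {1: 0.9, 0: 0.2}     # P(A=1 | C): large shoe size
P_B_given_C = {1: 0.8, 0: 0.3}     # P(B=1 | C): large glove size

def p(a, b, c):
    """Joint P(A=a, B=b, C=c) = P(c) P(a|c) P(b|c)."""
    pa = P_A_given_C[c] if a else 1 - P_A_given_C[c]
    pb = P_B_given_C[c] if b else 1 - P_B_given_C[c]
    return P_C[c] * pa * pb

# Conditionally independent given C: P(A,B | C) == P(A | C) P(B | C).
p_ab_given_c1 = p(1, 1, 1) / P_C[1]
assert abs(p_ab_given_c1 - P_A_given_C[1] * P_B_given_C[1]) < 1e-12

# But marginally dependent: P(A=1, B=1) != P(A=1) P(B=1).
p_ab = sum(p(1, 1, c) for c in (0, 1))
p_a = sum(p(1, b, c) for b in (0, 1) for c in (0, 1))
p_b = sum(p(a, 1, c) for a in (0, 1) for c in (0, 1))
assert abs(p_ab - p_a * p_b) > 1e-3
```

Knowing someone has big shoes raises the probability they are tall, which in turn raises the probability of big gloves; once height is known, shoe size adds nothing.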
Slide 7: Distinguishing the Two Cases
Common cause, A ← C → B (A: shoe size, B: glove size, C: height):
Given C, A and B are independent; without C, A and B can be dependent.
Common effect, S → L ← O (S: it is sunny, L: Ali arrives slightly late, O: slides are put on the web late):
Without L, S and O are independent; given L, S and O can be dependent.
Slide 8: Another Example of a Bayes Net
Network structure: Cloudy → Sprinkler, Cloudy → Rain, Sprinkler → WetGrass, Rain → WetGrass.
Inference questions:
Given W = 1: P(R) = ?
Given W = 1: P(C) = ?
Given W = 1, C = 1: P(S) = ?, P(R) = ?, P(S, R) = ?
Slide 9: Bayes Nets Formalized
A Bayes net (also called a belief network) is an augmented directed acyclic graph, represented by the pair (V, E), where:
V is a set of vertices.
E is a set of directed edges joining vertices. No directed cycles of any length are allowed.
Each vertex in V carries the following information:
the name of a random variable;
a conditional probability table indicating how the probability of this variable's values depends on each possible combination of parental values.
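The vertex description above maps directly onto a small data structure. A minimal sketch, using the Sprinkler net from the nearby slides with assumed, illustrative CPT numbers:

```python
from dataclasses import dataclass

@dataclass
class Node:
    """One vertex: a variable name, its parent list, and a CPT mapping
    each tuple of parent values to P(X=1 | parents)."""
    name: str
    parents: list
    cpt: dict

# The Cloudy/Sprinkler/Rain/WetGrass net; all CPT numbers are assumed.
cloudy    = Node("Cloudy",    [],         {(): 0.5})
sprinkler = Node("Sprinkler", ["Cloudy"], {(1,): 0.1, (0,): 0.5})
rain      = Node("Rain",      ["Cloudy"], {(1,): 0.8, (0,): 0.2})
wetgrass  = Node("WetGrass",  ["Sprinkler", "Rain"],
                 {(1, 1): 0.99, (1, 0): 0.9, (0, 1): 0.9, (0, 0): 0.0})
```

A root node has an empty parent list and a single-entry CPT; a node with k binary parents needs 2^k CPT rows, one per combination of parental values.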
Slide 10: Building a Bayes Net
1. Choose a set of relevant variables.
2. Choose an ordering for them.
3. Call them X_1, ..., X_m (where X_1 is the first in the ordering, X_2 is the second, etc.).
4. For i = 1 to m:
4.1. Add the X_i node to the network.
4.2. Set Parents(X_i) to be a minimal subset of {X_1, ..., X_{i-1}} such that X_i is conditionally independent of all other members of {X_1, ..., X_{i-1}} given Parents(X_i).
4.3. Define the probability table P(X_i = k | assignments of Parents(X_i)).
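The loop above can be sketched on the Cloudy/Sprinkler/Rain/WetGrass net from the nearby slides. Here the true joint (with assumed, illustrative CPT numbers) stands in for the conditional-independence test that step 4.2 presupposes; in practice that judgment would come from data or a domain expert.

```python
from itertools import combinations, product

NAMES = ('C', 'S', 'R', 'W')
P_S = {1: 0.1, 0: 0.5}                                        # P(S=1 | C)
P_R = {1: 0.8, 0: 0.2}                                        # P(R=1 | C)
P_W = {(1, 1): 0.99, (1, 0): 0.9, (0, 1): 0.9, (0, 0): 0.0}   # P(W=1 | S, R)

def bern(p, x):
    return p if x == 1 else 1 - p

def joint(c, s, r, w):
    return 0.5 * bern(P_S[c], s) * bern(P_R[c], r) * bern(P_W[(s, r)], w)

def marg(fixed):
    """P(fixed assignment) by summing out the remaining variables."""
    return sum(joint(*v) for v in product((0, 1), repeat=4)
               if all(v[NAMES.index(n)] == x for n, x in fixed.items()))

def ci(name, prev, pars):
    """Is `name` independent of the rest of `prev` given parents `pars`?"""
    rest = [n for n in prev if n not in pars]
    for pa in product((0, 1), repeat=len(pars)):
        pa_d = dict(zip(pars, pa))
        if marg(pa_d) < 1e-12:
            continue
        base = marg({**pa_d, name: 1}) / marg(pa_d)
        for rv in product((0, 1), repeat=len(rest)):
            d = {**pa_d, **dict(zip(rest, rv))}
            if marg(d) > 1e-12 and abs(marg({**d, name: 1}) / marg(d) - base) > 1e-9:
                return False
    return True

# Step 4: walk the ordering, giving each node the smallest parent set
# that screens it off from all earlier variables.
parents = {}
for i, name in enumerate(NAMES):
    prev = list(NAMES[:i])
    parents[name] = next(set(c) for size in range(len(prev) + 1)
                         for c in combinations(prev, size)
                         if ci(name, prev, list(c)))
```

Running this recovers parents C: {}, S: {C}, R: {C}, W: {S, R}, i.e. exactly the arrows drawn on the Sprinkler slides.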
Slide 11: Example of Building a Bayes Net
Suppose we're building a nuclear power station. There are the following random variables:
GRL: gauge reads low.
CTL: core temperature is low.
FG: gauge is faulty.
FA: alarm is faulty.
AS: alarm sounds.
If the alarm is working properly, it is meant to sound when the gauge stops reading a low temperature. If the gauge is working properly, it is meant to read the temperature of the core.
Slide 12: Bayes Net for the Power Station
Network structure: CTL → GRL ← FG; GRL → AS ← FA.
GRL: gauge reads low. CTL: core temperature is low. FG: gauge is faulty. FA: alarm is faulty. AS: alarm sounds.
Slide 13: Inference with Bayes Nets
Key issue: computing the joint probability
P(X_1 = x_1 ∧ X_2 = x_2 ∧ ... ∧ X_{n-1} = x_{n-1} ∧ X_n = x_n).
Use the conditional independence relations to simplify the computation: the joint factors into ∏_i P(X_i = x_i | Parents(X_i)).
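For the Cloudy/Sprinkler/Rain/WetGrass net of the neighbouring slides, that simplification looks like this; the CPT numbers are illustrative assumptions, not from the slides.

```python
# The Sprinkler net's joint, factored by its conditional independences:
#   P(C, S, R, W) = P(C) P(S | C) P(R | C) P(W | S, R)
P_C = 0.5
P_S = {1: 0.1, 0: 0.5}                                        # P(S=1 | C)
P_R = {1: 0.8, 0: 0.2}                                        # P(R=1 | C)
P_W = {(1, 1): 0.99, (1, 0): 0.9, (0, 1): 0.9, (0, 0): 0.0}   # P(W=1 | S, R)

def bern(p, x):
    """P(X=x) for a binary variable with P(X=1) = p."""
    return p if x == 1 else 1 - p

def joint(c, s, r, w):
    """Four small factors instead of one 2**4-entry joint table."""
    return (bern(P_C, c) * bern(P_S[c], s)
            * bern(P_R[c], r) * bern(P_W[(s, r)], w))
```

Each factor conditions only on the node's parents, so the tables stay small even as the number of variables grows.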
Slide 14: Example for Inference (Cloudy → Sprinkler, Cloudy → Rain, Sprinkler → WetGrass, Rain → WetGrass)
Inference questions:
Given W = 1: P(R) = ?
Given W = 1: P(C) = ?
Given W = 1, C = 1: P(S) = ?, P(R) = ?, P(S, R) = ?
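These queries can be answered by brute-force enumeration of the factored joint. A self-contained sketch, again with assumed illustrative CPT numbers and a hypothetical `query` helper:

```python
from itertools import product

NAMES = ('C', 'S', 'R', 'W')
P_S = {1: 0.1, 0: 0.5}                                        # P(S=1 | C)
P_R = {1: 0.8, 0: 0.2}                                        # P(R=1 | C)
P_W = {(1, 1): 0.99, (1, 0): 0.9, (0, 1): 0.9, (0, 0): 0.0}   # P(W=1 | S, R)

def bern(p, x):
    return p if x == 1 else 1 - p

def joint(c, s, r, w):
    """P(c, s, r, w) = P(c) P(s|c) P(r|c) P(w|s,r)."""
    return (bern(0.5, c) * bern(P_S[c], s)
            * bern(P_R[c], r) * bern(P_W[(s, r)], w))

def query(target, evidence):
    """P(target | evidence), both given as dicts like {'R': 1}."""
    def total(fixed):
        return sum(joint(*vals) for vals in product((0, 1), repeat=4)
                   if all(vals[NAMES.index(k)] == v for k, v in fixed.items()))
    return total({**evidence, **target}) / total(evidence)
```

`query({'R': 1}, {'W': 1})` answers the first question. Comparing `query({'S': 1}, {'W': 1})` with `query({'S': 1}, {'W': 1, 'R': 1})` shows the "explaining away" effect from the common-effect case: learning it rained lowers the probability that the sprinkler was on.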
Slide 15: The Problem with Inference Using Bayes Nets
Inference: infer from the observed variables E_o to the unknown variables E_u.
Suppose you have m binary-valued variables in your Bayes net and the evidence expression E_o mentions k variables. How much work is the above computation? Brute-force marginalization sums over all 2^(m−k) assignments of the unobserved variables.
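The cost can be made concrete with a small counting sketch; the evidence assignment below is hypothetical.

```python
from itertools import product

def brute_force_terms(m, k):
    """Entries of the joint summed when k of m binary variables are
    fixed by evidence: one per assignment of the m - k free variables."""
    return 2 ** (m - k)

# Sanity check by explicit enumeration for small m (hypothetical evidence):
m, k = 5, 2
evidence = {0: 1, 1: 0}            # variables 0 and 1 observed
count = sum(1 for vals in product((0, 1), repeat=m)
            if all(vals[i] == v for i, v in evidence.items()))
assert count == brute_force_terms(m, k) == 8
```

With m = 30 variables and k = 3 observed, this is already 2^27 terms, over 134 million, which is why the naive approach does not scale.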
Slide 16: The Problem with Inference Using Bayes Nets (cont.)
General querying of Bayes nets is NP-complete. Some solutions:
Belief propagation: takes advantage of the structure of the Bayes net.
Stochastic simulation: similar to the sampling approaches used for Bayesian averaging.
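One form of stochastic simulation can be sketched as rejection sampling: draw complete worlds from the net (each node sampled given its parents), discard worlds that contradict the evidence, and average the rest. The CPT numbers below are assumed illustrative values for the Sprinkler net.

```python
import random

P_S = {1: 0.1, 0: 0.5}                                        # P(S=1 | C)
P_R = {1: 0.8, 0: 0.2}                                        # P(R=1 | C)
P_W = {(1, 1): 0.99, (1, 0): 0.9, (0, 1): 0.9, (0, 0): 0.0}   # P(W=1 | S, R)

def sample(rng):
    """Draw one complete world, sampling each node given its parents."""
    c = int(rng.random() < 0.5)
    s = int(rng.random() < P_S[c])
    r = int(rng.random() < P_R[c])
    w = int(rng.random() < P_W[(s, r)])
    return c, s, r, w

def estimate_p_rain_given_wet(n=200_000, seed=0):
    """Estimate P(R=1 | W=1) by rejecting samples with W != 1."""
    rng = random.Random(seed)
    kept = rained = 0
    for _ in range(n):
        c, s, r, w = sample(rng)
        if w == 1:                 # keep only worlds consistent with W=1
            kept += 1
            rained += r
    return rained / kept
```

No joint table is ever built; accuracy improves with the number of samples, though rejection becomes wasteful when the evidence is unlikely.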
Slide 17: More Interesting Questions
Learning Bayes nets: given the topological structure of a Bayes net, learn all the conditional probability tables from examples (example: hierarchical mixture model).
Learning the topological structure of a Bayes net: a very, very hard problem. Unfortunately, the lecturer does not have enough knowledge to teach it even if he wanted to!
Slide 18: Learning Conditional Probabilities in Bayes Nets (Cloudy → Sprinkler, Cloudy → Rain, Sprinkler → WetGrass, Rain → WetGrass)
Three types of training examples:
1. Fully observed: (C, S, R, W)
2. S unobserved: (C, R, W)
3. R unobserved: (S, C, W)
Use the maximum likelihood approach to estimate the conditional probabilities, with the EM algorithm for optimization when some variables are unobserved.
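For the fully observed case (type 1), maximum likelihood reduces to counting. The data tuples below are made up purely for illustration; for types 2 and 3, the EM algorithm replaces these raw counts with expected counts under the current parameters.

```python
# Maximum-likelihood CPT estimation from fully observed (C, S, R, W)
# examples. The data rows are made-up illustrative values.
data = [
    (1, 0, 1, 1), (1, 0, 1, 1), (1, 0, 0, 0), (0, 1, 0, 1),
    (0, 1, 0, 1), (0, 0, 0, 0), (1, 1, 1, 1), (0, 0, 1, 1),
]

def mle_p_r_given_c(data, c):
    """P(R=1 | C=c) = count(R=1, C=c) / count(C=c)."""
    matches = [row for row in data if row[0] == c]
    return sum(row[2] for row in matches) / len(matches)
```

Each CPT entry is estimated from only the rows matching its parent assignment, so every table in the net can be learned by one pass of counting.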