Bayesian Network By DengKe Dong
Key Points Today Intro to Graphical Model Conditional Independence Intro to Bayesian Network Reasoning BN: D-Separation Inference In Bayesian Network Belief Propagation Learning in Bayesian Network
Intro to Graphical Model Two types of graphical models: -- Directed graphs (aka Bayesian Networks) -- Undirected graphs (aka Markov Random Fields) Graphical structure plus associated parameters define a joint probability distribution over a set of variables / nodes
Key Points Today Intro to Graphical Model Conditional Independence Intro to Bayesian Network Reasoning BN: D-Separation Inference In Bayesian Network Belief Propagation Learning in Bayesian Network
Conditional Independence Definition: X is conditionally independent of Y given Z if the probability distribution governing X is independent of the value of Y, given the value of Z. We denote this as (X ⊥ Y | Z), i.e. P(X = x | Y = y, Z = z) = P(X = x | Z = z) for all values x, y, z.
Conditional Independence Condition on its parents: a conditional probability distribution (CPD) is associated with each node N, defined as P(N | Parents(N)), where the function Parents(N) returns the set of N's immediate parents.
Key Points Today Intro to Graphical Model Conditional Independence Intro to Bayesian Network Reasoning BN: D-Separation Inference In Bayesian Network Belief Propagation Learning in Bayesian Network
Bayesian Network Definition: a directed acyclic graph defining a joint probability distribution over a set of variables, where each node denotes a random variable and each edge denotes a direct dependence between the connected nodes. For example, the network over S, L, R, T, W used on the following slides.
Bayesian Network Definition Conditional independencies in a Bayesian network: each node is conditionally independent of its non-descendants, given only its immediate parents. So the joint distribution over all variables in the network is defined in terms of these CPDs, plus the graph.
Bayesian Network Definition Example. Chain rule for probability: P(S,L,R,T,W) = P(S)P(L|S)P(R|S,L)P(T|S,L,R)P(W|S,L,R,T). With a CPD for each node Xi, described as P(Xi | Pa(Xi)), this simplifies to P(S,L,R,T,W) = P(S)P(L|S)P(R|S)P(T|L)P(W|L,R). So, in a Bayes net: P(X1, ..., Xn) = ∏i P(Xi | Pa(Xi)).
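As a concrete illustration of this factorization, here is a minimal sketch that evaluates the joint for the S, L, R, T, W network above. Only the factorization is taken from the slide; the CPT values below are made up for illustration.

```python
# Evaluating P(S,L,R,T,W) = P(S)P(L|S)P(R|S)P(T|L)P(W|L,R) for binary variables.
# The CPT numbers are made up for illustration only.

P_S = 0.3                                   # P(S=1)
P_L = {0: 0.1, 1: 0.6}                      # P(L=1 | S)
P_R = {0: 0.2, 1: 0.7}                      # P(R=1 | S)
P_T = {0: 0.05, 1: 0.8}                     # P(T=1 | L)
P_W = {(0, 0): 0.1, (0, 1): 0.5,            # P(W=1 | L, R)
       (1, 0): 0.6, (1, 1): 0.9}

def bern(p, v):
    """Probability that a Bernoulli(p) variable takes value v (0 or 1)."""
    return p if v == 1 else 1.0 - p

def joint(s, l, r, t, w):
    """Joint probability of one full assignment, using the Bayes net factorization."""
    return (bern(P_S, s) *
            bern(P_L[s], l) *
            bern(P_R[s], r) *
            bern(P_T[l], t) *
            bern(P_W[(l, r)], w))

print(joint(1, 0, 1, 0, 1))   # P(S=1, L=0, R=1, T=0, W=1)
```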
Bayesian Network Definition Construction: choose an ordering over variables, e.g. X1, X2, ..., Xn. For i = 1 to n: add Xi to the network and select parents Pa(Xi) as a minimal subset of X1...Xi-1 such that P(Xi | Pa(Xi)) = P(Xi | X1, ..., Xi-1). Notice this choice of parents assures P(X1, ..., Xn) = ∏i P(Xi | X1, ..., Xi-1) (by the chain rule) = ∏i P(Xi | Pa(Xi)) (by construction).
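A sketch of this construction loop, assuming a hypothetical conditional-independence oracle is_cond_independent(x, others, given), supplied in practice by domain knowledge or a statistical test; the oracle's name and interface are invented here for illustration.

```python
from itertools import combinations

def build_parent_sets(variables, is_cond_independent):
    """Ordering-based construction sketch.

    `variables` is an ordered list X1..Xn; `is_cond_independent(x, others, given)`
    is a hypothetical oracle answering whether x is independent of `others` given
    `given`. Returns a dict mapping each variable to its chosen parent set.
    """
    parents = {}
    for i, x in enumerate(variables):
        predecessors = variables[:i]
        chosen = predecessors            # fall back to all predecessors
        # Try subsets from smallest to largest and keep the first one that
        # renders x independent of the remaining predecessors.
        for k in range(len(predecessors) + 1):
            candidates = [list(s) for s in combinations(predecessors, k)
                          if is_cond_independent(x,
                                                 [v for v in predecessors if v not in s],
                                                 list(s))]
            if candidates:
                chosen = candidates[0]
                break
        parents[x] = chosen
    return parents
```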
Key Points Today Intro to Graphical Model Conditional Independence Intro to Bayesian Network Reasoning BN: D-Separation Inference In Bayesian Network Belief Propagation Learning in Bayesian Network
Reasoning BN: D-Separation Conditional independence, revisited. We said: each node is conditionally independent of its non-descendants, given its immediate parents. Does this rule give us all of the conditional independence relations implied by the Bayes network? No. E.g., X1 and X4 are conditionally independent given {X2, X3}, but X1 and X4 are not conditionally independent given X3 alone. For this, we need to understand D-separation.
Reasoning BN: D-Separation Three examples to understand D-separation: head-to-tail, tail-to-tail, and head-to-head.
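To make the head-to-head case concrete, here is a minimal numeric check, with made-up CPT values, that in the collider X -> Z <- Y the parents are marginally independent but become dependent once Z is observed (head-to-tail and tail-to-tail chains behave the opposite way).

```python
from itertools import product

# Head-to-head (collider) structure X -> Z <- Y with binary variables.
# CPT values are made up for illustration only.
P_X = 0.4                                   # P(X=1)
P_Y = 0.7                                   # P(Y=1)
P_Z = {(0, 0): 0.1, (0, 1): 0.6,            # P(Z=1 | X, Y)
       (1, 0): 0.5, (1, 1): 0.95}

def bern(p, v):
    return p if v == 1 else 1.0 - p

def joint(x, y, z):
    return bern(P_X, x) * bern(P_Y, y) * bern(P_Z[(x, y)], z)

def prob(event):
    """Sum the joint over all assignments satisfying event(x, y, z)."""
    return sum(joint(x, y, z) for x, y, z in product([0, 1], repeat=3) if event(x, y, z))

# Marginally, X and Y are independent: P(X=1, Y=1) equals P(X=1) * P(Y=1).
print(prob(lambda x, y, z: x == 1 and y == 1), P_X * P_Y)

# Conditioned on Z=1, they are not: P(X=1 | Y=1, Z=1) differs from P(X=1 | Z=1).
p_x_given_yz = (prob(lambda x, y, z: x == 1 and y == 1 and z == 1)
                / prob(lambda x, y, z: y == 1 and z == 1))
p_x_given_z = (prob(lambda x, y, z: x == 1 and z == 1)
               / prob(lambda x, y, z: z == 1))
print(p_x_given_yz, p_x_given_z)
```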
X and Y are conditionally independent given Z, if and only if X and Y are D-separated by Z. Suppose we have three sets of random variables: X, Y and Z. X and Y are D-separated by Z (and therefore conditionally independent, given Z) iff every path from any variable in X to any variable in Y is blocked. A path from variable A to variable B is blocked if it includes a node such that either -- the arrows on the path meet head-to-tail or tail-to-tail at the node, and this node is in Z, or -- the arrows meet head-to-head at the node, and neither the node nor any of its descendants is in Z.
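Below is a small checker that follows this path-blocking definition directly; it enumerates all simple paths, so it is only practical for small graphs. The example DAG at the bottom is an assumed structure chosen to be consistent with the X1...X4 claim a few slides back, not necessarily the figure from the original deck.

```python
from itertools import product

def descendants(graph, node):
    """All descendants of `node` in a DAG given as {node: [children]}."""
    seen, stack = set(), list(graph.get(node, []))
    while stack:
        c = stack.pop()
        if c not in seen:
            seen.add(c)
            stack.extend(graph.get(c, []))
    return seen

def undirected_simple_paths(graph, start, goal):
    """All simple paths between start and goal, ignoring edge direction."""
    nbrs = {}
    for u, children in graph.items():
        for v in children:
            nbrs.setdefault(u, set()).add(v)
            nbrs.setdefault(v, set()).add(u)
    paths, stack = [], [[start]]
    while stack:
        path = stack.pop()
        if path[-1] == goal:
            paths.append(path)
            continue
        for v in nbrs.get(path[-1], ()):
            if v not in path:
                stack.append(path + [v])
    return paths

def is_blocked(graph, path, Z):
    """Check the blocking condition for one path, node by node."""
    for i in range(1, len(path) - 1):
        prev, node, nxt = path[i - 1], path[i], path[i + 1]
        collider = (prev in graph and node in graph[prev]
                    and nxt in graph and node in graph[nxt])
        if collider:
            # head-to-head: blocked unless the node or one of its descendants is in Z
            if node not in Z and not (descendants(graph, node) & Z):
                return True
        else:
            # head-to-tail or tail-to-tail: blocked if the node is in Z
            if node in Z:
                return True
    return False

def d_separated(graph, X, Y, Z):
    """X and Y are d-separated by Z iff every undirected path is blocked."""
    return all(is_blocked(graph, p, set(Z))
               for x, y in product(X, Y)
               for p in undirected_simple_paths(graph, x, y))

# Hypothetical DAG X1 -> X2 -> X4 and X1 -> X3 -> X4 (structure assumed for illustration).
g = {"X1": ["X2", "X3"], "X2": ["X4"], "X3": ["X4"]}
print(d_separated(g, {"X1"}, {"X4"}, {"X2", "X3"}))   # True
print(d_separated(g, {"X1"}, {"X4"}, {"X3"}))         # False: path through X2 is unblocked
```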
Key Points Today Intro to Graphical Model Conditional Independence Intro to Bayesian Network Reasoning BN: D-Separation Inference In Bayesian Network Belief Propagation Learning in Bayesian Network
Inference In Bayesian Network In general, inference is intractable (NP-complete). For certain cases it is tractable: assigning a probability to a fully observed set of variables, or if just one variable is unobserved, or for singly connected graphs (i.e., no undirected loops), via variable elimination or belief propagation. For multiply connected graphs: junction tree. Sometimes Monte Carlo methods are used: generate many samples according to the Bayes net distribution, then count up the results. Variational methods give tractable approximate solutions.
Inference In Bayesian Network Prob. of joint assignment: easy. Suppose we are interested in the joint assignment: what is P(f, a, s, h, n)?
Inference In Bayesian Network Prob. of marginals: not so easy. How do we calculate P(N = n)?
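A brute-force way to see the contrast: the marginal requires summing the joint over every assignment of the remaining variables. The network structure (F -> S <- A, S -> H, S -> N) and the CPT numbers below are assumptions chosen for illustration, since the original figure is not reproduced in this transcript.

```python
from itertools import product

# Brute-force marginal P(N = n) by summing the joint over all other variables.
# Hypothetical network F -> S <- A, S -> H, S -> N with made-up CPT values.
P_F = 0.1
P_A = 0.2
P_S = {(0, 0): 0.05, (0, 1): 0.4, (1, 0): 0.6, (1, 1): 0.9}   # P(S=1 | F, A)
P_H = {0: 0.1, 1: 0.7}                                         # P(H=1 | S)
P_N = {0: 0.2, 1: 0.8}                                         # P(N=1 | S)

def bern(p, v):
    return p if v == 1 else 1.0 - p

def joint(f, a, s, h, n):
    return (bern(P_F, f) * bern(P_A, a) * bern(P_S[(f, a)], s) *
            bern(P_H[s], h) * bern(P_N[s], n))

def marginal_N(n):
    # Only 2**4 terms here, but in general the sum has exponentially many terms,
    # which is why marginals are "not so easy".
    return sum(joint(f, a, s, h, n) for f, a, s, h in product([0, 1], repeat=4))

print(marginal_N(1), marginal_N(0))   # the two values sum to 1
```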
Inference On a Chain Converting Directed to Undirected Graphs
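The figures for this step are not captured in the transcript; as a minimal sketch, for a chain the conversion simply absorbs each CPD into a pairwise potential:

```latex
% Directed chain and its undirected form (the potentials absorb the CPDs, so Z = 1):
p(x) = p(x_1)\prod_{n=2}^{N} p(x_n \mid x_{n-1})
     = \frac{1}{Z}\prod_{n=2}^{N}\psi_{n-1,n}(x_{n-1}, x_n),
\qquad
\psi_{1,2}(x_1, x_2) = p(x_1)\,p(x_2 \mid x_1),\quad
\psi_{n-1,n}(x_{n-1}, x_n) = p(x_n \mid x_{n-1})\ \ (n > 2).
```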
Inference On a Chain Compute the marginals
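A minimal sketch of the standard two-pass (forward/backward) marginal computation on an undirected chain. The pairwise potentials are randomly generated stand-ins and unary potentials are omitted for brevity; only the message recursions are the point here.

```python
import numpy as np

# Two-pass message passing on an undirected chain x_1 - x_2 - ... - x_N with
# pairwise potentials psi[n][x_n, x_{n+1}]. Marginal:
#   p(x_n) = (1/Z) * mu_alpha(x_n) * mu_beta(x_n)
#   mu_alpha(x_n) = sum_{x_{n-1}} psi_{n-1,n}(x_{n-1}, x_n) mu_alpha(x_{n-1})
#   mu_beta(x_n)  = sum_{x_{n+1}} psi_{n,n+1}(x_n, x_{n+1}) mu_beta(x_{n+1})

K, N = 2, 4                                   # K states per node, N nodes
rng = np.random.default_rng(0)
psi = [rng.uniform(0.5, 1.5, size=(K, K)) for _ in range(N - 1)]   # made-up potentials

alpha = [np.ones(K)]                          # mu_alpha at the first node
for n in range(N - 1):
    alpha.append(psi[n].T @ alpha[-1])        # forward pass

beta = [np.ones(K)]                           # mu_beta at the last node
for n in reversed(range(N - 1)):
    beta.append(psi[n] @ beta[-1])            # backward pass
beta.reverse()                                # beta[n] now belongs to node n

Z = float(alpha[-1] @ beta[-1])               # same normalizer at every node
for n in range(N):
    marginal = alpha[n] * beta[n] / Z
    print(n, marginal, marginal.sum())        # each marginal sums to 1
```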
Key Points Today Intro to Graphical Model Conditional Independence Intro to Bayesian Network Reasoning BN: D-Separation Inference In Bayesian Network Belief Propagation Learning in Bayesian Network
Belief Propagation Belief propagation algorithms have been developed separately for Bayesian networks, MRFs, and factor graphs, and the versions based on these different models are mathematically equivalent. For brevity, the standard belief propagation algorithm for MRFs is presented here.
In practical computation, beliefs are computed starting from the nodes at the edge of the graph, and a message is only updated once all of the messages it depends on are known. Using Equation 1 and Equation 2 (the message-update and belief equations), the belief of each node is computed in turn. For acyclic MRFs, each message typically needs to be computed only once, which greatly improves computational efficiency.
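Equation 1 and Equation 2 referenced above are not reproduced in this transcript; the sketch below assumes the standard sum-product updates for a pairwise MRF, message m_ij(x_j) = sum_{x_i} phi_i(x_i) psi_ij(x_i, x_j) prod_{k in N(i)\{j}} m_ki(x_i) and belief b_i(x_i) proportional to phi_i(x_i) prod_{k in N(i)} m_ki(x_i), which match what the text describes. The graph and potential values are made up.

```python
import numpy as np

# Sum-product belief propagation on a tree-structured pairwise MRF.
# Hypothetical 4-node tree: 0 - 1, 1 - 2, 1 - 3, with binary states.
edges = [(0, 1), (1, 2), (1, 3)]
nbrs = {0: [1], 1: [0, 2, 3], 2: [1], 3: [1]}
phi = {i: np.array([0.6, 0.4]) for i in nbrs}                 # unary potentials (made up)
psi = {e: np.array([[1.0, 0.5], [0.5, 1.0]]) for e in edges}  # pairwise potentials (made up)
psi.update({(j, i): psi[(i, j)].T for (i, j) in edges})       # allow access in both directions

def send_message(i, j, messages):
    """Compute m_{i -> j} from phi_i, psi_ij and the messages into i, except the one from j."""
    incoming = np.ones(2)
    for k in nbrs[i]:
        if k != j:
            incoming *= messages[(k, i)]
    m = psi[(i, j)].T @ (phi[i] * incoming)    # sum over x_i
    return m / m.sum()                          # normalize for numerical stability

# Leaf-to-root then root-to-leaf scheduling: on a tree, two sweeps suffice, so
# each message is computed exactly once (as the text above notes).
order = [(2, 1), (3, 1), (0, 1), (1, 0), (1, 2), (1, 3)]
messages = {}
for (i, j) in order:
    messages[(i, j)] = send_message(i, j, messages)

def belief(i):
    """Node belief: unary potential times all incoming messages, normalized."""
    b = phi[i].copy()
    for k in nbrs[i]:
        b *= messages[(k, i)]
    return b / b.sum()

for i in nbrs:
    print(i, belief(i))
```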
Key Points Today Intro to Graphical Model Conditional Independence Intro to Bayesian Network Reasoning BN: D-Separation Inference In Bayesian Network Belief Propagation Learning in Bayesian Network
Thanks
Learning in Bayesian Network What you should know
Learning CPTs from Fully Observed Data
MLE estimates of the CPT parameters from fully observed data
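With fully observed data, the MLE of each CPT entry is a ratio of counts, count(child value, parent values) / count(parent values). A minimal sketch, where the record format and variable names are illustrative assumptions:

```python
from collections import Counter

def mle_cpt(records, child, parents):
    """MLE of P(child | parents) from fully observed records (a list of dicts).

    Returns a dict mapping (parent_values, child_value) to the estimated probability,
    i.e. count(child, parents) / count(parents).
    """
    joint_counts = Counter()
    parent_counts = Counter()
    for r in records:
        pa = tuple(r[p] for p in parents)
        joint_counts[(pa, r[child])] += 1
        parent_counts[pa] += 1
    return {(pa, x): c / parent_counts[pa] for (pa, x), c in joint_counts.items()}

# Tiny made-up dataset over the earlier (hypothetical) F, A, S variables.
data = [
    {"F": 1, "A": 0, "S": 1},
    {"F": 1, "A": 0, "S": 0},
    {"F": 0, "A": 1, "S": 1},
    {"F": 0, "A": 0, "S": 0},
    {"F": 1, "A": 0, "S": 1},
]
print(mle_cpt(data, "S", ["F", "A"]))   # e.g. P(S=1 | F=1, A=0) = 2/3
```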
EM Algorithm
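The derivation on the original slide is not captured here; for reference, the standard EM iteration for a Bayes net with observed variables X, hidden variables Z and parameters theta is:

```latex
\text{E-step:}\quad Q(\theta \mid \theta^{(t)}) =
    \mathbb{E}_{Z \sim P(Z \mid X,\, \theta^{(t)})}\!\left[\log P(X, Z \mid \theta)\right]
\qquad
\text{M-step:}\quad \theta^{(t+1)} = \arg\max_{\theta}\; Q(\theta \mid \theta^{(t)})
```

For CPTs, the M-step reduces to the counting estimate sketched above, with hard counts replaced by expected counts under P(Z | X, theta^(t)).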
Using Unlabeled Data to Help Train Naïve Bayes Classifier
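A compact sketch of the idea, assuming a Bernoulli naive Bayes model over binary features: treat the missing labels as hidden variables and run EM, keeping the given labels fixed. The smoothing constant, the data, and all numbers below are made up; this follows the general recipe rather than the exact algorithm on the original slides.

```python
import numpy as np

def fit_semi_supervised_nb(X_lab, y_lab, X_unl, n_iter=20, eps=1e-2):
    """EM for Bernoulli naive Bayes using labeled and unlabeled binary data.

    X_lab, X_unl: {0,1} feature matrices; y_lab: {0,1} labels. Returns (prior, theta),
    where theta[c, j] estimates P(x_j = 1 | class c).
    """
    X_all = np.vstack([X_lab, X_unl])
    n_lab = len(y_lab)
    # Responsibilities: fixed one-hot for labeled rows, soft (0.5/0.5) for unlabeled rows.
    resp = np.zeros((len(X_all), 2))
    resp[np.arange(n_lab), y_lab] = 1.0
    resp[n_lab:] = 0.5

    for _ in range(n_iter):
        # M-step: class priors and per-class Bernoulli feature parameters
        # from (expected) counts, with small smoothing eps.
        class_mass = resp.sum(axis=0)
        prior = (class_mass + eps) / (class_mass.sum() + 2 * eps)
        theta = (resp.T @ X_all + eps) / (class_mass[:, None] + 2 * eps)

        # E-step: posterior class responsibilities, updated only for unlabeled rows.
        log_lik = (X_all @ np.log(theta).T
                   + (1 - X_all) @ np.log(1 - theta).T
                   + np.log(prior))
        post = np.exp(log_lik - log_lik.max(axis=1, keepdims=True))
        post /= post.sum(axis=1, keepdims=True)
        resp[n_lab:] = post[n_lab:]

    return prior, theta

# Tiny made-up example: 2 labeled points, 3 unlabeled, 3 binary features.
X_lab = np.array([[1, 0, 1], [0, 1, 0]])
y_lab = np.array([0, 1])
X_unl = np.array([[1, 0, 0], [0, 1, 1], [1, 1, 0]])
print(fit_semi_supervised_nb(X_lab, y_lab, X_unl))
```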
Summary Intro to Graphical Model Conditional Independence Intro to Bayesian Network Reasoning BN: D-Separation Inference In Bayesian Network Belief Propagation Learning in Bayesian Network
Thanks