Section 2 Appendix
Tensor Notation for BP
In Section 2, BP was introduced with a notation that defined messages and beliefs as functions. This appendix presents an alternative (and very concise) notation for the Belief Propagation algorithm using tensors.
Tensor Notation

Two operations are used throughout this appendix:
Tensor multiplication: multiply tensors entry by entry, matching axes that share a name (shared axes combine pointwise; distinct axes combine as an outer product).
Tensor marginalization: sum a tensor over one of its axes, removing that axis.
Tensor Notation

A rank-r tensor is…
== a real function with r keyword arguments
== an axis-labeled array with arbitrary indices
== a database with column headers

Tensor multiplication (vector outer product):

Operands:
X | value          Y    | value
1 | 3              red  | 4
2 | 5              blue | 6

Product:
X | Y    | value
1 | red  | 12
2 | red  | 20
1 | blue | 18
2 | blue | 30
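The outer-product table above can be reproduced with a short NumPy sketch (NumPy and the axis ordering X-then-Y are choices made here, not part of the slides' notation):

```python
import numpy as np

# Rank-1 tensors over the axes X (values 1, 2) and Y (values red, blue).
X = np.array([3.0, 5.0])   # X=1 -> 3,   X=2 -> 5
Y = np.array([4.0, 6.0])   # Y=red -> 4, Y=blue -> 6

# Tensor multiplication with no shared axes is an outer product.
XY = np.einsum('x,y->xy', X, Y)
print(XY)  # [[12. 18.]
           #  [20. 30.]]   rows: X=1, 2; columns: Y=red, blue
```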
Tensor Notation

Tensor multiplication (vector pointwise product):

Operands (both over the same axis X):
X | value          X | value
a | 3              a | 4
b | 5              b | 6

Product:
X | value
a | 12
b | 30
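The pointwise case is the same operation when the two operands share an axis; a minimal NumPy check of the table above:

```python
import numpy as np

# Two rank-1 tensors over the same axis X (values a, b).
A = np.array([3.0, 5.0])   # X=a -> 3, X=b -> 5
B = np.array([4.0, 6.0])   # X=a -> 4, X=b -> 6

# Tensor multiplication over a shared axis is a pointwise product.
print(A * B)  # [12. 30.]   X=a -> 12, X=b -> 30
```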
Tensor Notation

Tensor multiplication (matrix-vector product):

Operands (a vector over X and a matrix over X, Y):
X | value          X | Y    | value
1 | 7              1 | red  | 3
2 | 8              2 | red  | 5
                   1 | blue | 4
                   2 | blue | 6

Product (the shared axis X is matched; Y is kept):
X | Y    | value
1 | red  | 21
2 | red  | 40
1 | blue | 28
2 | blue | 48
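In NumPy terms, the shared axis X is matched and the unshared axis Y is broadcast; summing the result over X afterwards would give the familiar matrix-vector product over Y. A small sketch checking the table above:

```python
import numpy as np

# A rank-1 tensor over X and a rank-2 tensor over (X, Y).
vX  = np.array([7.0, 8.0])           # X=1 -> 7, X=2 -> 8
mXY = np.array([[3.0, 4.0],          # (1, red) -> 3, (1, blue) -> 4
                [5.0, 6.0]])         # (2, red) -> 5, (2, blue) -> 6

# Shared axis X matched pointwise, axis Y kept: the result is still rank 2.
out = np.einsum('x,xy->xy', vX, mXY)
print(out)  # [[21. 28.]
            #  [40. 48.]]
```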
Tensor Notation

Tensor marginalization:

Operand:
X | Y    | value
1 | red  | 3
2 | red  | 5
1 | blue | 4
2 | blue | 6

Result (X summed out):
Y    | value
red  | 8
blue | 10
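Marginalization is just an axis sum; a minimal NumPy check of the table above:

```python
import numpy as np

# Rank-2 tensor over (X, Y); rows are X=1, 2 and columns are Y=red, blue.
mXY = np.array([[3.0, 4.0],
                [5.0, 6.0]])

# Tensor marginalization: sum out the X axis, leaving a rank-1 tensor over Y.
print(mXY.sum(axis=0))  # [ 8. 10.]   Y=red -> 8, Y=blue -> 10
```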
Sum-Product Belief Propagation

Input: a factor graph with no cycles
Output: exact marginals for each variable and factor
Algorithm:
1. Initialize the messages to the uniform distribution.
2. Choose a root node.
3. Send messages from the leaves to the root. Send messages from the root to the leaves.
4. Compute the beliefs (unnormalized marginals).
5. Normalize beliefs and return the exact marginals.
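As a concrete companion to these steps, here is a minimal NumPy sketch of sum-product BP on a tree. The data layout (a dict of variable domain sizes plus a dict mapping factor names to a variable tuple and a potential table) and all function names are choices made for this sketch; the message and belief updates follow the standard sum-product formulas that the following slides cover. Instead of explicitly picking a root, it computes each directed message as soon as the messages it depends on are available, which reproduces the leaves-to-root and root-to-leaves passes on an acyclic graph:

```python
import numpy as np

def sum_product_tree(domains, factors):
    """Exact sum-product BP on an acyclic factor graph.

    domains: {variable_name: domain_size}
    factors: {factor_name: (tuple_of_variable_names, potential_table)}
             where the table's axes follow the order of the variable tuple.
    Variable names and factor names must be distinct strings.
    Returns (variable_beliefs, factor_beliefs), both normalized.
    """
    var_nbrs = {v: [] for v in domains}
    for f, (vs, _) in factors.items():
        for v in vs:
            var_nbrs[v].append(f)

    msgs = {}  # (sender, receiver) -> vector over the variable's domain
    pending = [(v, f) for v in domains for f in var_nbrs[v]]
    pending += [(f, v) for f, (vs, _) in factors.items() for v in vs]

    # A directed message is computed as soon as everything it depends on is
    # known; leaf edges are the base case, so on a tree this reproduces the
    # leaves-to-root pass followed by the root-to-leaves pass.
    while pending:
        progressed = False
        for u, w in list(pending):
            if u in domains:                          # variable -> factor
                others = [g for g in var_nbrs[u] if g != w]
                if all((g, u) in msgs for g in others):
                    m = np.ones(domains[u])           # empty product = uniform
                    for g in others:
                        m = m * msgs[(g, u)]          # pointwise tensor product
                    msgs[(u, w)] = m
                    pending.remove((u, w))
                    progressed = True
            else:                                     # factor -> variable
                vs, table = factors[u]
                others = [x for x in vs if x != w]
                if all((x, u) in msgs for x in others):
                    t = table.copy()
                    for axis, x in enumerate(vs):     # multiply in messages
                        if x != w:
                            shape = [1] * t.ndim
                            shape[axis] = domains[x]
                            t = t * msgs[(x, u)].reshape(shape)
                    keep = vs.index(w)                # marginalize out the rest
                    msgs[(u, w)] = t.sum(axis=tuple(a for a in range(t.ndim)
                                                    if a != keep))
                    pending.remove((u, w))
                    progressed = True
        assert progressed, "factor graph has a cycle; use loopy BP instead"

    # Beliefs: products of incoming messages (times the potential, for factors).
    var_beliefs = {}
    for v in domains:
        b = np.ones(domains[v])
        for g in var_nbrs[v]:
            b = b * msgs[(g, v)]
        var_beliefs[v] = b / b.sum()
    fac_beliefs = {}
    for f, (vs, table) in factors.items():
        b = table.copy()
        for axis, v in enumerate(vs):
            shape = [1] * b.ndim
            shape[axis] = domains[v]
            b = b * msgs[(v, f)].reshape(shape)
        fac_beliefs[f] = b / b.sum()
    return var_beliefs, fac_beliefs
```

A toy call might look like sum_product_tree({'X1': 2, 'X2': 2}, {'psi1': (('X1', 'X2'), np.array([[1., 2.], [3., 4.]]))}); the variable names and potential values here are illustrative only.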
Sum-Product Belief Propagation

Overview: beliefs and messages are defined for both variables and factors.
(Figure: the four definitions, variable belief, factor belief, variable message, and factor message, each shown on a small factor graph: variable X1 with neighboring factors ψ1, ψ2, ψ3, and factor ψ1 with neighboring variables X1, X2, X3.)
Sum-Product Belief Propagation: Variable Belief
(Figure: variable X1 with neighboring factors ψ1, ψ2, ψ3, and an example belief table over X1's values.)
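For reference, the variable belief in the tensor notation of this appendix is the standard sum-product one: a pointwise tensor product of the incoming factor-to-variable messages. The symbols below (μ for messages, 𝒩(i) for the factors adjacent to X_i) are conventional choices assumed here:

```latex
b_i = \prod_{\alpha \in \mathcal{N}(i)} \mu_{\alpha \to i}
\qquad \text{e.g. } b_{X_1} = \mu_{\psi_1 \to X_1} \, \mu_{\psi_2 \to X_1} \, \mu_{\psi_3 \to X_1}
```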
Sum-Product Belief Propagation: Variable Message
(Figure: variable X1 with neighboring factors ψ1, ψ2, ψ3.)
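The variable-to-factor message has the same form, except that the message from the receiving factor is left out (same assumed symbols as above):

```latex
\mu_{i \to \alpha} = \prod_{\beta \in \mathcal{N}(i) \setminus \{\alpha\}} \mu_{\beta \to i}
\qquad \text{e.g. } \mu_{X_1 \to \psi_1} = \mu_{\psi_2 \to X_1} \, \mu_{\psi_3 \to X_1}
```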
Sum-Product Belief Propagation: Factor Belief
(Figure: factor ψ1 with neighboring variables X1 and X3, and an example belief table over the joint values of X1 and X3.)
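The factor belief multiplies the factor's potential tensor by all incoming variable-to-factor messages, each message matching its own axis of ψ_α (again the standard sum-product form, with assumed symbols):

```latex
b_\alpha = \psi_\alpha \prod_{i \in \mathcal{N}(\alpha)} \mu_{i \to \alpha}
\qquad \text{e.g. } b_{\psi_1} = \psi_1 \, \mu_{X_1 \to \psi_1} \, \mu_{X_3 \to \psi_1}
```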
Sum-Product Belief Propagation: Factor Message
(Figure: factor ψ1 with neighboring variables X1 and X3.)
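The factor-to-variable message combines both tensor operations: multiply the potential by the other incoming messages, then marginalize out every axis except the recipient's (standard sum-product form, assumed symbols):

```latex
\mu_{\alpha \to i} = \sum_{X_\alpha \setminus X_i} \Big( \psi_\alpha \prod_{j \in \mathcal{N}(\alpha) \setminus \{i\}} \mu_{j \to \alpha} \Big)
\qquad \text{e.g. } \mu_{\psi_1 \to X_1} = \sum_{X_3} \psi_1 \, \mu_{X_3 \to \psi_1}
```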
Loopy Belief Propagation

Input: a factor graph with cycles
Output: approximate marginals for each variable and factor
Algorithm:
1. Initialize the messages to the uniform distribution.
2. Send messages until convergence. Normalize them when they grow too large.
3. Compute the beliefs (unnormalized marginals).
4. Normalize beliefs and return the approximate marginals.
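A minimal sketch of this loopy variant, assuming the same data layout as the tree sketch above; the fixed sweep count and the sequential update order are choices made here, since the slide only says to send messages until convergence:

```python
import numpy as np

def loopy_bp(domains, factors, sweeps=50):
    """Loopy BP with the same data layout as sum_product_tree above.

    The updates are the standard sum-product ones; the only changes are that
    messages are sent repeatedly for a fixed number of sweeps and are
    normalized after every update so they stay bounded.
    Returns approximate (normalized) variable beliefs.
    """
    var_nbrs = {v: [] for v in domains}
    for f, (vs, _) in factors.items():
        for v in vs:
            var_nbrs[v].append(f)
    # Initialize every message to the uniform distribution.
    msg_vf = {(v, f): np.full(domains[v], 1.0 / domains[v])
              for v in domains for f in var_nbrs[v]}
    msg_fv = {(f, v): np.full(domains[v], 1.0 / domains[v])
              for f, (vs, _) in factors.items() for v in vs}
    for _ in range(sweeps):
        for (v, f) in msg_vf:                     # variable -> factor updates
            m = np.ones(domains[v])
            for g in var_nbrs[v]:
                if g != f:
                    m = m * msg_fv[(g, v)]
            msg_vf[(v, f)] = m / m.sum()          # normalize
        for (f, v) in msg_fv:                     # factor -> variable updates
            vs, table = factors[f]
            t = table.copy()
            for axis, u in enumerate(vs):
                if u != v:
                    shape = [1] * t.ndim
                    shape[axis] = domains[u]
                    t = t * msg_vf[(u, f)].reshape(shape)
            keep = vs.index(v)
            m = t.sum(axis=tuple(a for a in range(t.ndim) if a != keep))
            msg_fv[(f, v)] = m / m.sum()          # normalize
    beliefs = {}                                  # approximate marginals
    for v in domains:
        b = np.ones(domains[v])
        for g in var_nbrs[v]:
            b = b * msg_fv[(g, v)]
        beliefs[v] = b / b.sum()
    return beliefs
```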