The Factor Graph Approach to Model-Based Signal Processing Hans-Andrea Loeliger
2 Outline: Introduction; Factor graphs; Gaussian message passing in linear models; Beyond Gaussians; Conclusion
3 Outline: Introduction; Factor graphs; Gaussian message passing in linear models; Beyond Gaussians; Conclusion
4 Introduction: Engineers like graphical notation. It allows them to compose a wealth of nontrivial algorithms from tabulated “local” computational primitives.
5 Outline: Introduction; Factor graphs; Gaussian message passing in linear models; Beyond Gaussians; Conclusion
6 Factor Graphs: A factor graph represents the factorization of a function of several variables. Here we use Forney-style factor graphs (FFGs), in which every factor is a node and every variable is an edge.
7 Factor Graphs cont’d Example:
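As an illustration of the kind of factorization an FFG represents (the factor names fA, fB, fC and the variables x1, …, x5 here are chosen for illustration, not taken from the slide):

f(x1, x2, x3, x4, x5) = fA(x1, x2, x3) fB(x3, x4) fC(x4, x5)

In the corresponding FFG, fA, fB, fC are nodes, x1, …, x5 are edges, and an edge is attached to a node exactly when that variable is an argument of the factor.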
8 Factor Graphs cont’d: (a) Forney-style factor graph (FFG); (b) factor graph as in [3]; (c) Bayesian network; (d) Markov random field (MRF)
9 Factor Graphs cont’d: Advantages of FFGs: suited for hierarchical modeling; compatible with standard block diagrams; simplest formulation of the summary-product message update rule; natural setting for Forney’s results on Fourier transforms and duality
10 Auxiliary Variables: Let Y1 and Y2 be two independent observations of X, so the joint density factors as p(x, y1, y2) = p(x) p(y1 | x) p(y2 | x).
11 Modularity and Special Symbols: Let Y1 = X + Z1 and Y2 = X + Z2 with Z1, Z2, and X independent. The “+”-nodes represent the factors δ(y1 - x - z1) and δ(y2 - x - z2).
12 Outline: Introduction; Factor graphs; Gaussian message passing in linear models; Beyond Gaussians; Conclusion
13 Computing Marginals: Assume we wish to compute the marginal of xk, i.e. the sum of f(x1, …, xn) over all variables except xk. For example, assume that f can be written as a product of local factors.
14 Computing Marginals cont’d
15 Message Passing View cont’d
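As a small worked illustration of how a marginal computation turns into message passing (the chain factorization below is an assumed example): for f(x1, x2, x3) = fA(x1, x2) fB(x2, x3), the marginal of x2 is

Σ_{x1} Σ_{x3} fA(x1, x2) fB(x2, x3) = ( Σ_{x1} fA(x1, x2) ) · ( Σ_{x3} fB(x2, x3) ) = →μ(x2) ←μ(x2),

i.e. the two partial sums are exactly the messages arriving at the edge x2 from its two sides.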
16 Sum-Product Rule: The message out of a node/factor g along an edge X is formed as the product of g and all incoming messages along all edges except X, summed over all involved variables except X.
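A concrete sketch of the sum-product rule for discrete variables (the factor g, its values, and the message values below are arbitrary illustrations; numpy is assumed to be available):

import numpy as np

# Factor g(x, y, z) over three binary variables, one array axis per variable.
g = np.array([[[0.9, 0.1],
               [0.2, 0.8]],
              [[0.5, 0.5],
               [0.3, 0.7]]])          # shape (2, 2, 2), illustrative values

mu_in_x = np.array([0.6, 0.4])        # incoming message along edge X
mu_in_y = np.array([0.7, 0.3])        # incoming message along edge Y

# Sum-product rule: product of g and all incoming messages except the one on Z,
# summed over all involved variables except Z.
mu_out_z = np.einsum('xyz,x,y->z', g, mu_in_x, mu_in_y)
print(mu_out_z)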
17 Arrows and Notation for Messages: →μ(x) denotes the message flowing in the direction of the edge’s arrow, and ←μ(x) denotes the message in the opposite direction.
18 Marginals and Output Edges: the marginal of X is the product of the two messages on its edge, →μ(x) ←μ(x).
19 Max-Product Rule: The message out of a node/factor g along an edge X is formed as the product of g and all incoming messages along all edges except X, maximized over all involved variables except X.
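The max-product rule has the same structure with maximization in place of summation; a minimal sketch under the same illustrative assumptions as above:

import numpy as np

g = np.array([[[0.9, 0.1], [0.2, 0.8]],
              [[0.5, 0.5], [0.3, 0.7]]])   # same illustrative factor g(x, y, z)
mu_in_x = np.array([0.6, 0.4])
mu_in_y = np.array([0.7, 0.3])

# Multiply g by the incoming messages (broadcast over the X and Y axes),
# then maximize over all variables except Z instead of summing.
prod = g * mu_in_x[:, None, None] * mu_in_y[None, :, None]
mu_out_z = prod.max(axis=(0, 1))
print(mu_out_z)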
20 Scalar Gaussian Messages: messages of the form μ(x) ∝ exp(-(x - m)² / (2σ²)). Arrow notation: →μ / ←μ is parameterized by mean →m / ←m and variance →σ² / ←σ².
21 Scalar Gaussian Computation Rules
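As a reference point, the standard scalar rules for the two most common nodes (the scalar specialization of the Gaussian tables in [1]) are:

Addition node enforcing Z = X + Y: →mZ = →mX + →mY and →σ²Z = →σ²X + →σ²Y.
Equality node: incoming messages combine by adding precisions, 1/σ² = 1/σ²1 + 1/σ²2 and m/σ² = m1/σ²1 + m2/σ²2.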
22 Vector Gaussian Messages: messages of the form μ(x) ∝ exp(-(1/2) (x - m)^T W (x - m)). A message is parameterized either by the mean vector m and covariance matrix V = W^{-1} or by W and Wm.
23 Vector Gaussian Messages cont’d: Arrow notation: →μ is parameterized by →m and →V or by →W and →W→m (and ←μ analogously). Marginal: →μ(x) ←μ(x) is the Gaussian with covariance matrix V = (→W + ←W)^{-1} and mean m = V (→W →m + ←W ←m).
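A runnable sketch of the marginal computation just described (the two-dimensional example numbers are arbitrary; numpy assumed):

import numpy as np

# Forward message (m_f, V_f) and backward message (m_b, V_b) on the same edge.
m_f = np.array([1.0, 0.0]); V_f = np.array([[2.0, 0.3], [0.3, 1.0]])
m_b = np.array([0.5, 1.5]); V_b = np.array([[1.0, 0.0], [0.0, 4.0]])

# Switch to the (W, Wm) parameterization, where W = V^{-1}.
W_f = np.linalg.inv(V_f); W_b = np.linalg.inv(V_b)

# Marginal: Gaussian with V = (W_f + W_b)^{-1} and m = V (W_f m_f + W_b m_b).
V = np.linalg.inv(W_f + W_b)
m = V @ (W_f @ m_f + W_b @ m_b)
print(m, V)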
24 Single Edge Quantities
25 Elementary Nodes
26 Matrix Multiplication Node
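The standard Gaussian rules for such a node enforcing Y = A X (cf. the corresponding table in [1]) have the form:

Forward (mean/covariance form): →mY = A →mX and →VY = A →VX A^T.
Backward (W/Wm form): ←WX = A^T ←WY A and ←WX ←mX = A^T ←WY ←mY.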
27 Composite Blocks
28 Reversing a Matrix Multiplication
29 Combinations
30 General Linear State Space Model
31 General Linear State Space Model cont’d: If … is nonsingular: … forward and … backward. If … is singular: … forward and … backward.
32 General Linear State Space Model cont’d: By combining the forward recursion with the backward recursion, we obtain the marginals of the state variables.
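A minimal numerical sketch of the forward Gaussian message pass through such a model, assuming the common form Xk = A Xk-1 + B Uk, Yk = C Xk + Zk with Gaussian input Uk and observation noise Zk (all matrices and numbers below are illustrative; numpy assumed):

import numpy as np

# Illustrative linear state space model: X_k = A X_{k-1} + B U_k, Y_k = C X_k + Z_k
A = np.array([[1.0, 1.0], [0.0, 1.0]])    # state transition matrix
B = np.array([[0.5], [1.0]])              # input matrix
C = np.array([[1.0, 0.0]])                # observation matrix
Q = 0.1 * (B @ B.T)                       # covariance contributed by B U_k
R = np.array([[1.0]])                     # observation-noise covariance

observations = [np.array([0.9]), np.array([2.1]), np.array([2.9])]  # made-up data

# Forward pass: propagate the forward Gaussian message (m, V) along the chain.
m = np.zeros(2)
V = 10.0 * np.eye(2)                      # broad prior on the initial state
for y_k in observations:
    # Matrix-multiplication node and "+"-node: m -> A m, V -> A V A^T + Q.
    m = A @ m
    V = A @ V @ A.T + Q
    # Equality node with the message from the observed branch Y_k = C X_k + Z_k,
    # using the (W, Wm) parameterization of slide 23.
    W = np.linalg.inv(V)
    Wm = W @ m
    W = W + C.T @ np.linalg.inv(R) @ C
    Wm = Wm + C.T @ np.linalg.inv(R) @ y_k
    V = np.linalg.inv(W)
    m = V @ Wm
    print("filtered state estimate:", m)

# An analogous backward pass, combined with the forward messages as on slide 23,
# yields the smoothed marginals of all states.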
33 Gaussian to Binary
34 Outline: Introduction; Factor graphs; Gaussian message passing in linear models; Beyond Gaussians; Conclusion
35 Message Types: A key issue with all message passing algorithms is the representation of messages for continuous variables. The following message types are widely applicable: quantization of continuous variables; function value and gradient; list of samples.
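A small sketch of the “list of samples” representation (the nonlinear node g and all numbers are illustrative assumptions):

import numpy as np

rng = np.random.default_rng(0)

# A message for a continuous variable X represented as a list of (weighted) samples,
# here drawn from an assumed incoming Gaussian.
samples_x = rng.normal(loc=1.0, scale=0.5, size=1000)
weights = np.full(1000, 1.0 / 1000)

def g(x):
    return np.tanh(x)          # illustrative deterministic nonlinearity Y = g(X)

# Passing the message through the node simply maps the samples;
# the weights are carried along unchanged.
samples_y = g(samples_x)

# Moments of the outgoing message are then estimated from the samples.
print("E[Y] approx:", np.sum(weights * samples_y))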
36 Message Types cont’d: All these message types, and many different message computation rules, can coexist in large system models. Steepest descent (SD) and expectation maximization (EM) are two examples of message computation rules beyond the sum-product and max-product rules.
37 LSSM with Unknown Vector C
38 Steepest Descent as Message Passing: Suppose we wish to find the value of θ that maximizes f(θ).
39 Steepest Descent as Message Passing cont’d: Steepest descent: the estimate of θ is moved a step along the gradient of f, where s is a positive step-size parameter.
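A minimal numerical sketch of such a steepest-descent update (the objective f, its gradient, the step size, and the starting point are illustrative assumptions; since f is being maximized, the step is taken uphill):

def f(theta):                        # illustrative objective to be maximized
    return -(theta - 2.0) ** 2

def grad_f(theta):                   # its derivative, playing the role of the
    return -2.0 * (theta - 2.0)      # gradient message arriving at theta

s = 0.1                              # positive step-size parameter
theta = 0.0                          # initial estimate
for _ in range(50):
    theta = theta + s * grad_f(theta)    # steepest-descent/ascent update
print(theta)                         # approaches the maximizer 2.0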
40 Steepest Descent as Message Passing cont’d: Gradient messages:
41 Steepest Descent as Message Passing cont’d
42 Outline: Introduction; Factor graphs; Gaussian message passing in linear models; Beyond Gaussians; Conclusion
43 Conclusion: The factor graph approach to signal processing involves the following steps: 1) choose a factor graph to represent the system model; 2) choose the message types and suitable message computation rules; 3) choose a message update schedule.
44 References
[1] H.-A. Loeliger et al., “The factor graph approach to model-based signal processing,” Proceedings of the IEEE, vol. 95, no. 6, pp. 1295-1322, June 2007.
[2] H.-A. Loeliger, “An introduction to factor graphs,” IEEE Signal Processing Magazine, vol. 21, no. 1, pp. 28-41, Jan. 2004.
[3] F. R. Kschischang, B. J. Frey, and H.-A. Loeliger, “Factor graphs and the sum-product algorithm,” IEEE Transactions on Information Theory, vol. 47, no. 2, pp. 498-519, Feb. 2001.