The Factor Graph Approach to Model-Based Signal Processing. Hans-Andrea Loeliger.


2 Outline: Introduction; Factor graphs; Gaussian message passing in linear models; Beyond Gaussians; Conclusion

3 Outline: Introduction; Factor graphs; Gaussian message passing in linear models; Beyond Gaussians; Conclusion

4 Introduction: Engineers like graphical notation. It allows them to compose a wealth of nontrivial algorithms from tabulated "local" computational primitives.

5 Outline: Introduction; Factor graphs; Gaussian message passing in linear models; Beyond Gaussians; Conclusion

6 Factor Graphs: A factor graph represents the factorization of a function of several variables. We use Forney-style factor graphs (FFGs).

7 Factor Graphs cont’d Example:
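The example on the original slide is a figure that is not reproduced in the transcript. A representative factorization of the kind used in [2] (the specific function below is an illustrative assumption, not taken from the slide) is

    \[ f(u, w, x, y, z) \;=\; f_1(u, w, x)\, f_2(x, y, z)\, f_3(z), \]

drawn as an FFG with one node per factor and one edge (or half-edge) per variable, an edge being connected to a node exactly when the variable appears in that factor.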

8 Factor Graphs cont'd: (a) Forney-style factor graph (FFG); (b) factor graph as in [3]; (c) Bayesian network; (d) Markov random field (MRF)

9 Factor Graphs cont'd: Advantages of FFGs: suited for hierarchical modeling; compatible with standard block diagrams; simplest formulation of the summary-product message update rule; natural setting for Forney's results on Fourier transforms and duality.

10 Auxiliary Variables: Let Y1 and Y2 be two independent observations of X, so that p(x, y1, y2) = p(x) p(y1 | x) p(y2 | x).

11 Modularity and Special Symbols: Let Y1 = X + Z1 and Y2 = X + Z2, with Z1, Z2, and X independent. The "+"-nodes represent the factors δ(x + z1 - y1) and δ(x + z2 - y2).

12 Outline: Introduction; Factor graphs; Gaussian message passing in linear models; Beyond Gaussians; Conclusion

13 Computing Marginals: Assume we wish to compute the marginal of f with respect to one variable, i.e., the sum of f over all of its other variables. For example, assume that f can be written as a product of local factors; the global sum can then be broken into local sums.
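As a small illustration of why the factorization helps (this example is mine, not from the slide): if f factors as f(x, y, z) = f_1(x, y) f_2(y, z), then

    \[ \sum_{y} \sum_{z} f_1(x, y)\, f_2(y, z) \;=\; \sum_{y} f_1(x, y) \Bigl( \sum_{z} f_2(y, z) \Bigr), \]

so the marginal of x is obtained from local sums (messages) rather than one global sum over all variables.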

14 Computing Marginals cont’d

15 Message Passing View cont’d

16 Sum-Product Rule: The message out of some node/factor f along some edge x is formed as the product of f and all incoming messages along all edges except x, summed over all involved variables except x.
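A minimal numerical sketch of this rule for discrete variables (the factor, the variable names, and the shapes below are illustrative assumptions, not from the slides):

    import numpy as np

    # Factor f(x, y, z) stored as an array indexed by (x, y, z).
    # Message out of f along edge x: multiply f by the incoming messages on the
    # other edges (y and z) and sum those variables out.
    def sum_product_message(f, mu_y_in, mu_z_in):
        weighted = f * mu_y_in[None, :, None] * mu_z_in[None, None, :]
        return weighted.sum(axis=(1, 2))

    f = np.random.rand(2, 3, 4)            # factor table over (x, y, z)
    mu_y_in = np.array([0.2, 0.5, 0.3])    # incoming message along y
    mu_z_in = np.ones(4)                   # incoming message along z (uninformative)
    mu_x_out = sum_product_message(f, mu_y_in, mu_z_in)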

17 Arrows and Notation for Messages: the forward message (written with a right arrow over μ) denotes the message in the direction of the edge's arrow; the backward message (written with a left arrow over μ) denotes the message in the opposite direction.

18 Marginals and Output Edges

19 Max-Product Rule: The message out of some node/factor f along some edge x is formed as the product of f and all incoming messages along all edges except x, maximized over all involved variables except x.
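Under the same illustrative setup as the sum-product sketch above, the max-product message simply replaces the sum with a maximization:

    def max_product_message(f, mu_y_in, mu_z_in):
        # Same factor table and incoming messages as before; the sum becomes a max.
        weighted = f * mu_y_in[None, :, None] * mu_z_in[None, None, :]
        return weighted.max(axis=(1, 2))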

20 Scalar Gaussian Messages: Messages are of the form of a (scaled) scalar Gaussian. In arrow notation, the forward and backward messages are each parameterized by a mean and a variance.

21 Scalar Gaussian Computation Rules
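The slide's table of rules is not reproduced in the transcript. The following sketch restates two standard scalar rules that appear in [1], in the usual mean/variance parameterization (the function names are mine): the "+" node, where means and variances add, and the "=" node, where precisions and precision-weighted means add.

    def plus_node(m_x, v_x, m_y, v_y):
        # Z = X + Y: the outgoing message has mean m_x + m_y and variance v_x + v_y.
        return m_x + m_y, v_x + v_y

    def equality_node(m1, v1, m2, v2):
        # X1 = X2 = X3: combine two incoming Gaussian messages into the outgoing one.
        w = 1.0 / v1 + 1.0 / v2            # precisions add
        m = (m1 / v1 + m2 / v2) / w        # precision-weighted means add
        return m, 1.0 / w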

22 Vector Gaussian Messages: Messages are of the form of a (scaled) multivariate Gaussian. A message is parameterized either by its mean vector m and covariance matrix V = W⁻¹, or by W and Wm.

23 Vector Gaussian Messages cont'd: In arrow notation, the forward and backward messages are each parameterized by a mean vector and a covariance matrix, or by W and Wm. Marginal: the product of the forward and backward messages on an edge is a Gaussian whose mean and covariance matrix are obtained by combining the two messages (see below).
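Written out in the W, Wm parameterization (the arrow notation follows the forward/backward convention of [1]; the formula itself is the standard product of two Gaussians), the marginal combines the two opposing messages as

    \[ W = \overrightarrow{W} + \overleftarrow{W}, \qquad W m = \overrightarrow{W}\,\overrightarrow{m} + \overleftarrow{W}\,\overleftarrow{m}, \]

i.e., the marginal is Gaussian with covariance matrix V = W⁻¹ and mean m.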

24 Single Edge Quantities

25 Elementary Nodes

26 Matrix Multiplication Node
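The node's computation rules are given as a table on the original slide, which is not reproduced here. A minimal sketch of the standard Gaussian rules for a node enforcing Y = AX (function and parameter names are mine): forward, the mean/covariance message maps as m ↦ Am, V ↦ AVAᵀ; backward, the (W, Wm) message maps as W ↦ AᵀWA, Wm ↦ Aᵀ(Wm). Note that neither direction requires a matrix inversion, which is why the two parameterizations are paired this way.

    import numpy as np

    def matrix_node_forward(A, m_x, V_x):
        # Y = A X: forward message in mean/covariance parameterization.
        return A @ m_x, A @ V_x @ A.T

    def matrix_node_backward(A, W_y, Wm_y):
        # Y = A X: backward message in (W, W m) parameterization.
        return A.T @ W_y @ A, A.T @ Wm_y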

27 Composite Blocks

28 Reversing a Matrix Multiplication

29 Combinations

30 General Linear State Space Model
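The model itself appears only as a figure on the original slide. The standard form used in [1] (reproduced here from memory, so treat the exact notation as an assumption) is

    \[ X_k = A\,X_{k-1} + B\,U_k, \qquad Y_k = C\,X_k + W_k, \]

with independent Gaussian inputs U_k and observation noise W_k (the matrices may be time-varying).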

31 General Linear State Space Model cont'd: The forward and backward recursions use different message parameterizations (mean/covariance versus W and Wm), depending on whether the matrices involved are nonsingular or singular.

32 General Linear State Space Model cont'd: By combining the forward recursion with the backward recursion, we obtain the marginals of the states (the smoothed state estimates).
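As a concrete instance (not stated on the slides in this form): with Gaussian messages, the forward recursion on this model is the Kalman filter, and combining it with the backward recursion yields the smoothed state marginals. A minimal forward-pass sketch, writing Q for the covariance of the process noise B U_k and R for the covariance of W_k (all names below are mine):

    import numpy as np

    def forward_gaussian_pass(A, C, Q, R, m0, V0, observations):
        # Forward sum-product pass with Gaussian messages (Kalman filter).
        # m, V: mean and covariance of the current forward state message.
        m, V = m0, V0
        filtered = []
        for y in observations:
            # Predict: push the message through X_k = A X_{k-1} + (noise with covariance Q).
            m, V = A @ m, A @ V @ A.T + Q
            # Update: combine with the message from the observation Y_k = C X_k + W_k.
            S = C @ V @ C.T + R              # innovation covariance
            K = V @ C.T @ np.linalg.inv(S)   # Kalman gain
            m = m + K @ (y - C @ m)
            V = V - K @ C @ V
            filtered.append((m, V))
        return filtered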

33 Gaussian to Binary

34 Outline: Introduction; Factor graphs; Gaussian message passing in linear models; Beyond Gaussians; Conclusion

35 Message Types: A key issue with all message-passing algorithms is the representation of messages for continuous variables. The following message types are widely applicable: quantization of continuous variables; function value and gradient; list of samples.
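A small illustration of the last type (the function and numbers are mine, not from the slides): a list-of-samples message can be pushed through a deterministic node simply by mapping every sample.

    import numpy as np

    rng = np.random.default_rng(0)
    x_samples = rng.normal(1.0, 0.5, size=1000)   # message for X, represented by samples
    y_samples = np.tanh(x_samples)                # resulting message for Y through the node Y = tanh(X)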

36 Message Types cont'd: All these message types, and many different message computation rules, can coexist in large system models. Steepest descent (SD) and expectation maximization (EM) are two examples of message computation rules beyond the sum-product and max-product rules.

37 LSSM with Unknown Vector C

38 Steepest Descent as Message Passing: Suppose we wish to find the value of a parameter θ that maximizes some function f(θ).

39 Steepest Descent as Message Passing cont'd: Steepest descent: the estimate of θ is updated by a step along the gradient of the objective, where s is a positive step-size parameter.

40 Steepest Descent as Message Passing cont'd: Gradient messages.

41 Steepest Descent as Message Passing cont'd
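A toy numerical sketch of the idea (the two factors and all numbers are illustrative assumptions, not from the slides): each factor contributes a gradient message d/dθ log f_i(θ), the node adds the incoming gradient messages, and the estimate takes a step of size s in that direction.

    # Maximize f(theta) = f_a(theta) * f_b(theta) by gradient steps on log f.
    def grad_log_fa(theta):
        return -(theta - 1.0)          # f_a(theta) proportional to exp(-(theta - 1)^2 / 2)

    def grad_log_fb(theta):
        return -2.0 * (theta - 3.0)    # f_b(theta) proportional to exp(-(theta - 3)^2)

    theta, s = 0.0, 0.1                # initial estimate and positive step-size parameter
    for _ in range(200):
        theta += s * (grad_log_fa(theta) + grad_log_fb(theta))
    # theta converges to 7/3, the maximizer of the product of the two factors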

42 Outline: Introduction; Factor graphs; Gaussian message passing in linear models; Beyond Gaussians; Conclusion

43 Conclusion: The factor graph approach to signal processing involves the following steps:
1) Choose a factor graph to represent the system model.
2) Choose the message types and suitable message computation rules.
3) Choose a message update schedule.

44 References
[1] H.-A. Loeliger et al., "The factor graph approach to model-based signal processing," Proceedings of the IEEE, vol. 95, no. 6, pp. 1295-1322, June 2007.
[2] H.-A. Loeliger, "An introduction to factor graphs," IEEE Signal Processing Magazine, vol. 21, no. 1, pp. 28-41, Jan. 2004.
[3] F. R. Kschischang, B. J. Frey, and H.-A. Loeliger, "Factor graphs and the sum-product algorithm," IEEE Transactions on Information Theory, vol. 47, no. 2, pp. 498-519, Feb. 2001.