1
Probabilistic image processing and Bayesian network
Kazuyuki Tanaka, Graduate School of Information Sciences, Tohoku University. References: K. Tanaka: Statistical-mechanical approach to image processing (Topical Review), J. Phys. A, vol.35, pp.R81-R150 (2002); K. Tanaka, H. Shouno, M. Okada and D. M. Titterington: Accuracy of the Bethe approximation for hyperparameter estimation in probabilistic image processing, J. Phys. A, vol.37 (2004). 17-18 October, 2005, Tokyo Institute of Technology
2
Bayesian Network and Belief Propagation
Bayes Formula; Probabilistic Model; Probabilistic Information Processing; Belief Propagation. J. Pearl: Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference (Morgan Kaufmann, 1988). C. Berrou and A. Glavieux: Near optimum error correcting coding and decoding: Turbo-codes, IEEE Trans. Comm., 44 (1996).
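As a reminder of the starting point, Bayes' formula for inferring an unobserved quantity x from observed data y reads, in standard form:

```latex
P(x \mid y) = \frac{P(y \mid x)\, P(x)}{P(y)},
\qquad
P(y) = \sum_{x} P(y \mid x)\, P(x).
```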
3
Formulation of Belief Propagation
Link between belief propagation and statistical mechanics: Y. Kabashima and D. Saad: Belief propagation vs. TAP for decoding corrupted messages, Europhys. Lett., 44 (1998); M. Opper and D. Saad (eds): Advanced Mean Field Methods: Theory and Practice (MIT Press, 2001). Generalized belief propagation: J. S. Yedidia, W. T. Freeman and Y. Weiss: Constructing free-energy approximations and generalized belief propagation algorithms, IEEE Transactions on Information Theory, 51 (2005). Information geometrical interpretation of belief propagation: S. Ikeda, T. Tanaka and S. Amari: Stochastic reasoning, free energy, and information geometry, Neural Computation, 16 (2004).
4
Extension of Belief Propagation
Generalized Belief Propagation: J. S. Yedidia, W. T. Freeman and Y. Weiss: Constructing free-energy approximations and generalized belief propagation algorithms, IEEE Transactions on Information Theory, 51 (2005). Generalized belief propagation is equivalent to the cluster variation method in statistical mechanics: R. Kikuchi: A theory of cooperative phenomena, Phys. Rev., 81 (1951); T. Morita: Cluster variation method of cooperative phenomena and its generalization I, J. Phys. Soc. Jpn, 12 (1957).
5
Application of Belief Propagation
Image Processing: K. Tanaka: Statistical-mechanical approach to image processing (Topical Review), J. Phys. A, 35 (2002); A. S. Willsky: Multiresolution Markov Models for Signal and Image Processing, Proceedings of the IEEE, 90 (2002). Low-Density Parity-Check Codes: Y. Kabashima and D. Saad: Statistical mechanics of low-density parity-check codes (Topical Review), J. Phys. A, 37 (2004); S. Ikeda, T. Tanaka and S. Amari: Information geometry of turbo and low-density parity-check codes, IEEE Transactions on Information Theory, 50 (2004). CDMA Multiuser Detection Algorithm: Y. Kabashima: A CDMA multiuser detection algorithm on the basis of belief propagation, J. Phys. A, 36 (2003); T. Tanaka and M. Okada: Approximate belief propagation, density evolution, and statistical neurodynamics for CDMA multiuser detection, IEEE Transactions on Information Theory, 51 (2005). Satisfiability Problem: O. C. Martin, R. Monasson and R. Zecchina: Statistical mechanics methods and phase transitions in optimization problems, Theoretical Computer Science, 265 (2001); M. Mezard, G. Parisi and R. Zecchina: Analytic and algorithmic solution of random satisfiability problems, Science, 297 (2002).
6
Contents: Introduction; Belief Propagation; Belief Propagation and Cluster Variation Method; Belief Propagation for Gaussian Graphical Model; Cluster Variation Method for Gaussian Graphical Model; Bayesian Image Analysis and Gaussian Graphical Model; Image Segmentation; Concluding Remarks
7
Belief Propagation. How should we treat the calculation of the summation over 2^N configurations? It is very hard to calculate exactly except in some special cases. Formulation of the approximate algorithm; accuracy of the approximate algorithm.
8
Tractable Model. Probabilistic models with no loops are tractable (factorizable). Probabilistic models with loops are not tractable (not factorizable).
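To make the factorization concrete, here is a minimal sketch, not from the slides: the chain length and potentials are arbitrary illustrative values. It compares a brute-force summation over all 2^N configurations with sequential elimination on a loop-free chain, which costs only O(N q^2):

```python
import numpy as np

rng = np.random.default_rng(0)
N, q = 8, 2                       # 8 binary nodes on a chain
psi = rng.random((N - 1, q, q))   # pairwise potentials psi[i](x_i, x_{i+1})

# Brute force: sum over all q**N configurations (exponential cost).
def marginal_brute(node):
    p = np.zeros(q)
    for conf in np.ndindex(*(q,) * N):
        w = np.prod([psi[i][conf[i], conf[i + 1]] for i in range(N - 1)])
        p[conf[node]] += w
    return p / p.sum()

# Chain elimination: pass a message left-to-right, then right-to-left.
def marginal_chain(node):
    left = np.ones(q)
    for i in range(node):                 # sum out nodes 0..node-1
        left = left @ psi[i]
    right = np.ones(q)
    for i in range(N - 2, node - 1, -1):  # sum out nodes N-1..node+1
        right = psi[i] @ right
    p = left * right
    return p / p.sum()

print(marginal_brute(3))   # identical results,
print(marginal_chain(3))   # exponential vs. linear cost
```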
9
Probabilistic model on a graph with no loop
(Figure: a tree-structured probabilistic model with nodes 1-6.) Marginal probability of node 2.
10
Probabilistic model on a graph with no loop
11
Probabilistic model on a graph with no loop
12
Probabilistic model on a graph with no loop
(Figure: the tree with nodes 1-6.) The marginal probability can be expressed as the product of the messages from all the neighbouring nodes of node 2. The message from node 1 to node 2 can be expressed as the product of the messages from all the neighbouring nodes of node 1 except the one from node 2.
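In standard notation, writing ψ_i for node potentials, ψ_{ij} for edge potentials, and ∂i for the neighbours of node i, the two statements above become:

```latex
P_2(x_2) \propto \psi_2(x_2) \prod_{k \in \partial 2} M_{k \to 2}(x_2),
\qquad
M_{1 \to 2}(x_2) \propto \sum_{x_1} \psi_1(x_1)\, \psi_{12}(x_1, x_2)
\prod_{k \in \partial 1 \setminus \{2\}} M_{k \to 1}(x_1).
```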
13
Belief Propagation on a graph with no loop
14
Probabilistic Model on a Graph with Loops
Marginal Probability
15
Message Passing Rule of Belief Propagation
(Figure: a node and its neighbours on the graph.) The reducibility conditions can be rewritten as the following fixed-point equations. These fixed-point equations correspond to the extremum condition of the Bethe free energy, and they can be solved numerically by natural iteration; the resulting algorithm corresponds to loopy belief propagation. Fixed-Point Equations for Messages.
16
Approximate Representation of Marginal Probability
(Figure: a pixel and its four neighbours on the square lattice.) In the Bethe approximation, the marginal probabilities are assumed to take the following form in terms of the messages from the neighbouring pixels to the pixel. These marginal probabilities satisfy the reducibility conditions at each pixel and at each nearest-neighbour pair of pixels, and the messages are determined so as to satisfy these conditions. Fixed-Point Equations for Messages.
17
Fixed Point Equation and Iterative Method
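A minimal sketch of such an iterative method for a generic pairwise model with discrete states; the small loopy graph, the random potentials, and the convergence threshold are illustrative assumptions, not taken from the slides. Messages are swept repeatedly until the fixed-point equations are satisfied:

```python
import numpy as np

rng = np.random.default_rng(1)
q = 3
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]   # small graph with loops
n_nodes = 4
psi_pair = {e: rng.random((q, q)) + 0.1 for e in edges}
psi_node = rng.random((n_nodes, q)) + 0.1

# Directed message m[(i, j)] is a distribution over x_j.
m = {(i, j): np.ones(q) / q for (a, b) in edges for (i, j) in [(a, b), (b, a)]}
neigh = {v: [u for e in edges for u in e if v in e and u != v]
         for v in range(n_nodes)}

def pair_pot(i, j):
    # psi_pair is stored once per undirected edge; transpose if needed.
    return psi_pair[(i, j)] if (i, j) in psi_pair else psi_pair[(j, i)].T

for sweep in range(100):
    delta = 0.0
    for (i, j) in list(m):
        # Product of all messages into i except the one from j.
        prod = psi_node[i].copy()
        for k in neigh[i]:
            if k != j:
                prod *= m[(k, i)]
        new = pair_pot(i, j).T @ prod          # sum over x_i
        new /= new.sum()
        delta = max(delta, np.abs(new - m[(i, j)]).max())
        m[(i, j)] = new
    if delta < 1e-10:                          # reached a fixed point
        break

beliefs = psi_node.copy()
for (k, i) in m:
    beliefs[i] *= m[(k, i)]
beliefs /= beliefs.sum(axis=1, keepdims=True)  # approximate marginals
print(beliefs)
```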
18
Contents: Introduction; Belief Propagation; Belief Propagation and Cluster Variation Method; Belief Propagation for Gaussian Graphical Model; Cluster Variation Method for Gaussian Graphical Model; Bayesian Image Analysis and Gaussian Graphical Model; Image Segmentation; Concluding Remarks
19
Kullback-Leibler divergence and Free Energy
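In the standard setting behind this slide, for a Gibbs distribution P and a trial distribution Q:

```latex
P(x) = \frac{e^{-E(x)}}{Z}, \qquad F = -\ln Z,
\qquad
D(Q \,\|\, P) = \sum_x Q(x) \ln \frac{Q(x)}{P(x)} = \mathcal{F}[Q] - F \;\ge\; 0,
```

where the variational free energy is the energy plus negative entropy, $\mathcal{F}[Q] = \sum_x Q(x) E(x) + \sum_x Q(x) \ln Q(x)$; minimizing the KL divergence over Q is therefore the same as minimizing $\mathcal{F}[Q]$.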
20
Free Energy and Cluster Variation Method
KL Divergence
21
Free Energy and Cluster Variation Method
KL Divergence; Free Energy; Bethe Free Energy
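Restricting the variational free energy to node and edge marginals $\{b_i, b_{ij}\}$ gives the Bethe free energy; in the standard (Yedidia-Freeman-Weiss) form, with $d_i$ the number of neighbours of node $i$:

```latex
F_{\mathrm{Bethe}}
= \sum_{(i,j)} \sum_{x_i, x_j} b_{ij}(x_i, x_j)
  \ln \frac{b_{ij}(x_i, x_j)}{\psi_{ij}(x_i, x_j)\,\psi_i(x_i)\,\psi_j(x_j)}
\;-\; \sum_i (d_i - 1) \sum_{x_i} b_i(x_i) \ln \frac{b_i(x_i)}{\psi_i(x_i)}.
```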
22
Basic Framework of Cluster Variation Method
23
Basic Framework of Cluster Variation Method
Lagrange Multipliers to ensure the constraints
24
Basic Framework of Cluster Variation Method
Extremum Condition
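Schematically, the extremum is taken with Lagrange multipliers enforcing normalization of each marginal and consistency (reducibility) between edge and node marginals:

```latex
\mathcal{L} = F_{\mathrm{Bethe}}
+ \sum_{(i,j)} \sum_{x_j} \lambda_{ij}(x_j)
  \Big( b_j(x_j) - \sum_{x_i} b_{ij}(x_i, x_j) \Big)
+ (\text{normalization terms}),
```

and setting the derivatives with respect to $b_i$ and $b_{ij}$ to zero expresses the marginals in terms of the multipliers, which play the role of the messages.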
25
Approximate Marginal Probability in Cluster Variation Method
Extremum Condition. (Figure: the basic clusters, single pixels and nearest-neighbour pairs, on the square lattice.) In the Bethe approximation, the marginal probabilities are assumed to take the following form in terms of the messages from the neighbouring pixels to the pixel. These marginal probabilities satisfy the reducibility conditions at each pixel and at each nearest-neighbour pair of pixels, and the messages are determined so as to satisfy these conditions.
26
Cluster Variation Method and Belief Propagation
Message Update Rule. (Figure: pixel clusters on the square lattice.) The messages are updated so that the marginal probabilities satisfy the reducibility conditions at each pixel and at each nearest-neighbour pair of pixels.
27
Contents: Introduction; Belief Propagation; Belief Propagation and Cluster Variation Method; Belief Propagation for Gaussian Graphical Model; Cluster Variation Method for Gaussian Graphical Model; Bayesian Image Analysis and Gaussian Graphical Model; Image Segmentation; Concluding Remarks
28
Gaussian Graphical Model
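A Gaussian graphical model is a multivariate Gaussian whose precision matrix A is sparse, with $A_{ij} \ne 0$ only on the edges of the graph:

```latex
P(x) = \frac{1}{Z(A, b)} \exp\Big( -\tfrac{1}{2}\, x^{\mathsf T} A x + b^{\mathsf T} x \Big),
\qquad x \in \mathbb{R}^N,
```

so all marginals are Gaussian, and belief propagation reduces to updates of the means and variances of the messages.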
29
Message Passing Rule of Belief Propagation
(Figure: a pixel and its four neighbours on the square lattice.) The reducibility conditions can be rewritten as fixed-point equations for the messages. These fixed-point equations correspond to the extremum condition of the Bethe free energy and can be solved numerically by natural iteration; the resulting algorithm corresponds to loopy belief propagation.
30
Message Passing Rule of Belief Propagation
Fixed-Point Equations and Natural Iteration. (Figure: messages between a pixel and its neighbours.) The reducibility conditions can be rewritten as fixed-point equations corresponding to the extremum condition of the Bethe free energy; solving them by natural iteration yields the loopy belief propagation algorithm.
31
Message Passing Rule of Belief Propagation
(Figure: the message-passing updates on the square lattice.)
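A minimal sketch of the Gaussian case, assuming the usual parameterization of each message by a precision and a shift; the tridiagonal test matrix is an illustrative choice, not from the slides. For loop-free graphs the fixed point reproduces the exact means and variances; on loopy graphs the means remain exact when the iteration converges, while the variances are approximate.

```python
import numpy as np

def gabp(A, b, sweeps=200, tol=1e-12):
    """Gaussian BP for P(x) propto exp(-x^T A x / 2 + b^T x).
    Message k->i is exp(-P[k,i]*x_i**2/2 + h[k,i]*x_i)."""
    n = len(b)
    neigh = [[k for k in range(n) if k != i and A[i, k] != 0] for i in range(n)]
    P = np.zeros((n, n))   # message precisions
    h = np.zeros((n, n))   # message shifts
    for _ in range(sweeps):
        P_old = P.copy()
        for i in range(n):
            for j in neigh[i]:
                # Cavity precision/shift at i, excluding the message from j.
                Pi = A[i, i] + sum(P[k, i] for k in neigh[i] if k != j)
                hi = b[i] + sum(h[k, i] for k in neigh[i] if k != j)
                P[i, j] = -A[i, j] ** 2 / Pi
                h[i, j] = -A[i, j] * hi / Pi
        if np.abs(P - P_old).max() < tol:
            break
    prec = np.array([A[i, i] + sum(P[k, i] for k in neigh[i]) for i in range(n)])
    mean = np.array([(b[i] + sum(h[k, i] for k in neigh[i])) / prec[i]
                     for i in range(n)])
    return mean, 1.0 / prec   # posterior means and variances

# Tridiagonal (chain) example: no loops, so BP is exact here.
n = 6
A = 2.0 * np.eye(n) - 0.5 * (np.eye(n, k=1) + np.eye(n, k=-1))
b = np.ones(n)
mean, var = gabp(A, b)
print(np.allclose(mean, np.linalg.solve(A, b)))       # True
print(np.allclose(var, np.diag(np.linalg.inv(A))))    # True (no loops)
```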
32
Contents: Introduction; Belief Propagation; Belief Propagation and Cluster Variation Method; Belief Propagation for Gaussian Graphical Model; Cluster Variation Method for Gaussian Graphical Model; Bayesian Image Analysis and Gaussian Graphical Model; Image Segmentation; Concluding Remarks
33
Kullback-Leibler Divergence of Gaussian Graphical Model
Entropy Term
34
Cluster Variation Method
Trial Function; Tractable Form
35
Cluster Variation Method for Gaussian Graphical Model
Trial Function: the marginal distribution of a GGM is also a GGM.
36
Cluster Variation Method for Gaussian Graphical Model
Bethe Free Energy in GGM
37
Cluster Variation Method for Gaussian Graphical Model
38
Iteration Procedure: the fixed-point equations are solved by iteration.
39
Cluster Variation Method and TAP Free Energy
Loopy Belief Propagation; TAP Free Energy; Mean-Field Free Energy
40
Contents: Introduction; Belief Propagation; Belief Propagation and Cluster Variation Method; Belief Propagation for Gaussian Graphical Model; Cluster Variation Method for Gaussian Graphical Model; Bayesian Image Analysis and Gaussian Graphical Model; Image Segmentation; Concluding Remarks
41
Bayesian Image Analysis
(Figure: original image, transmitted with noise, becomes the degraded image.)
42
Bayesian Image Analysis
Degradation Process: additive white Gaussian noise. (Figure: original image, transmitted with additive noise, becomes the degraded image.)
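For additive white Gaussian noise of variance σ², each degraded pixel is the original pixel plus independent noise, so the degradation process is:

```latex
y_i = x_i + n_i, \quad n_i \sim \mathcal{N}(0, \sigma^2),
\qquad
P(y \mid x) = \prod_i \frac{1}{\sqrt{2\pi\sigma^2}}
\exp\Big( -\frac{(y_i - x_i)^2}{2\sigma^2} \Big).
```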
43
Bayesian Image Analysis
A Priori Probability: do images generated from the prior look similar to standard images?
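A standard choice in the references above is a Gaussian smoothness prior that penalizes differences between nearest-neighbour pixel values, with the hyperparameter α controlling the strength:

```latex
P(x) \propto \exp\Big( -\frac{\alpha}{2} \sum_{\langle i,j \rangle} (x_i - x_j)^2 \Big).
```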
44
Bayesian Image Analysis
A Posteriori Probability: Gaussian Graphical Model
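Multiplying the smoothness prior by the Gaussian likelihood gives a posterior that is quadratic in x, i.e. again a Gaussian graphical model:

```latex
P(x \mid y) \propto \exp\Big(
-\frac{\alpha}{2} \sum_{\langle i,j \rangle} (x_i - x_j)^2
-\frac{1}{2\sigma^2} \sum_i (x_i - y_i)^2 \Big),
```

so the restored image can be taken as the posterior mean, computed exactly via the Gauss integral formula below or approximately by belief propagation.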
46
Bayesian Image Analysis
(Figure: the a priori probability of the original image is combined with the degraded image to give the a posteriori probability over the pixels.)
47
Hyperparameter Determination by Maximization of Marginal Likelihood
In image restoration we usually have to estimate the hyperparameters alpha and p. In statistics, maximum likelihood estimation is often employed: from this standpoint, the hyperparameters are determined so as to maximize the marginal likelihood, defined by marginalizing the joint probability of the original and degraded images with respect to the original image. The marginal likelihood is expressed in terms of the partition functions of the a priori and a posteriori probabilistic models, and these partition functions can be calculated approximately by the Bethe approximation. (Figure: marginalizing the original image out of its joint distribution with the degraded image yields the marginal likelihood.)
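In symbols (writing σ for the noise hyperparameter), the marginal likelihood and its expression through partition functions are, schematically:

```latex
P(y \mid \alpha, \sigma) = \int P(y \mid x, \sigma)\, P(x \mid \alpha)\, dx
= \frac{Z_{\mathrm{posterior}}(\alpha, \sigma; y)}
       {(2\pi\sigma^2)^{N/2}\, Z_{\mathrm{prior}}(\alpha)},
```

where $Z_{\mathrm{prior}}$ and $Z_{\mathrm{posterior}}$ are the partition functions of the a priori and a posteriori models; both can be approximated via the Bethe free energy.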
48
Maximization of Marginal Likelihood by EM (Expectation Maximization) Algorithm. Marginal Likelihood and Q-Function. The original image is treated as incomplete (missing) data, and maximizing the marginal likelihood is equivalent to iterating the maximization of the Q-function in the EM algorithm.
49
Maximization of Marginal Likelihood by EM (Expectation Maximization) Algorithm. Marginal Likelihood and Q-Function. EM Algorithm: iterate the following E- and M-steps until convergence. A. P. Dempster, N. M. Laird and D. B. Rubin: Maximum likelihood from incomplete data via the EM algorithm, J. Roy. Stat. Soc. B, 39 (1977).
50
One-Dimensional Signal
(Figure: original, degraded, and estimated signals for a one-dimensional example; hyperparameters estimated by the EM algorithm.)
51
Image Restoration by Gaussian Graphical Model
EM Algorithm with Belief Propagation. (Figure: two original images and the corresponding degraded images, with MSE 1512 and MSE 1529.)
52
Exact Results of Gaussian Graphical Model
Multi-dimensional Gauss integral formula
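The formula referred to here, for a positive-definite N x N matrix A:

```latex
\int_{\mathbb{R}^N} \exp\Big( -\tfrac{1}{2}\, x^{\mathsf T} A x + b^{\mathsf T} x \Big)\, dx
= \sqrt{\frac{(2\pi)^N}{\det A}}\;
  \exp\Big( \tfrac{1}{2}\, b^{\mathsf T} A^{-1} b \Big).
```

Because the posterior is a GGM, this yields its partition function and exact posterior statistics (mean $A^{-1}b$, covariance $A^{-1}$), providing the exact results against which belief propagation is compared on the following slides.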
53
Comparison of Belief Propagation with Exact Results in Gaussian Graphical Model.
Image 1: Belief Propagation, MSE 327, 36.302; Exact, MSE 315, 37.919.
Image 2: Belief Propagation, MSE 260, 33.998; Exact, MSE 236, 34.975.
Finally, we show the results for gray-level image restoration. In each numerical experiment, loopy belief propagation gives better results than the conventional filters.
54
Image Restoration by Gaussian Graphical Model
(Figure: Original Image; Degraded Image, MSE 1512; Belief Propagation, MSE 325; Exact, MSE 315; Lowpass Filter, MSE 411; Wiener Filter, MSE 545; Median Filter, MSE 447.) In each numerical experiment, loopy belief propagation gives better results than the conventional filters.
55
Image Restoration by Gaussian Graphical Model
(Figure: Original Image; Degraded Image, MSE 1529; Belief Propagation, MSE 260; Exact, MSE 236; Lowpass Filter, MSE 224; Wiener Filter, MSE 372; Median Filter, MSE 244.)
56
Extension of Belief Propagation
Generalized Belief Propagation: J. S. Yedidia, W. T. Freeman and Y. Weiss: Constructing free-energy approximations and generalized belief propagation algorithms, IEEE Transactions on Information Theory, 51 (2005). Generalized belief propagation is equivalent to the cluster variation method in statistical mechanics: R. Kikuchi: A theory of cooperative phenomena, Phys. Rev., 81 (1951); T. Morita: Cluster variation method of cooperative phenomena and its generalization I, J. Phys. Soc. Jpn, 12 (1957).
57
Image Restoration by Gaussian Graphical Model
Image 1: Belief Propagation, MSE 327, 36.302; Generalized Belief Propagation, MSE 315, 37.909; Exact, 37.919.
Image 2: Belief Propagation, MSE 260, 33.998; Generalized Belief Propagation, MSE 236, 34.971; Exact, 34.975.
58
Image Restoration by Gaussian Graphical Model and Conventional Filters
MSE comparison: Belief Propagation, 327; Generalized Belief Propagation, 315; Lowpass Filter, (3x3) 388 and (5x5) 413; Median Filter, 486 and 445; Wiener Filter, 864 and 548. (Figure panels: Exact; GBP; (3x3) Lowpass; (5x5) Median; (5x5) Wiener.)
59
Image Restoration by Gaussian Graphical Model and Conventional Filters
MSE comparison: Belief Propagation, 260; Generalized Belief Propagation, 236; Lowpass Filter, (3x3) 241 and (5x5) 224; Median Filter, 331 and 244; Wiener Filter, 703 and 372. (Figure panels: Exact; GBP; (5x5) Lowpass; (5x5) Median; (5x5) Wiener.)
60
Contents: Introduction; Belief Propagation; Belief Propagation and Cluster Variation Method; Belief Propagation for Gaussian Graphical Model; Cluster Variation Method for Gaussian Graphical Model; Bayesian Image Analysis and Gaussian Graphical Model; Image Segmentation; Concluding Remarks
61
Image Segmentation by Gauss Mixture Model
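In a Gauss mixture model, each pixel value is drawn independently from one of K Gaussian components, and segmentation assigns each pixel to its most probable component; in standard notation:

```latex
P(y_i) = \sum_{k=1}^{K} \lambda_k\, \mathcal{N}\big(y_i;\, \mu_k, \sigma_k^2\big),
\qquad
\hat z_i = \arg\max_{k}\; \lambda_k\, \mathcal{N}\big(y_i;\, \mu_k, \sigma_k^2\big),
```

with mixing weights λ_k estimated from the image histogram. This classifies pixels by their gray levels alone and ignores spatial context.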
62
Image Segmentation by Combining Gauss Mixture Model with Potts Model
Belief Propagation; Potts Model
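Coupling the mixture labels with a Potts prior favours spatially homogeneous regions; schematically, the combined model is:

```latex
P(z \mid y) \propto
\prod_i \lambda_{z_i}\, \mathcal{N}\big(y_i;\, \mu_{z_i}, \sigma_{z_i}^2\big)
\times \exp\Big( \beta \sum_{\langle i,j \rangle} \delta_{z_i, z_j} \Big),
```

with β > 0; belief propagation then provides the approximate label marginals used for the segmentation.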
63
Image Segmentation. (Figure: original image; segmentation by the Gauss mixture model estimated from the histogram; segmentation by the Gauss mixture model combined with the Potts model, computed by belief propagation.)
64
Motion Detection. (Figure: segmentations of two frames, (a) and (c); their AND gives the detection (b).) Gauss mixture model and Potts model with belief propagation.
65
Contents: Introduction; Belief Propagation; Belief Propagation and Cluster Variation Method; Belief Propagation for Gaussian Graphical Model; Cluster Variation Method for Gaussian Graphical Model; Bayesian Image Analysis and Gaussian Graphical Model; Image Segmentation; Concluding Remarks
66
Summary: formulation of belief propagation; accuracy of belief propagation in Bayesian image analysis by means of the Gaussian graphical model (comparison between belief propagation and the exact calculation); some applications of Bayesian image analysis and belief propagation.
67
Related Problem: statistical performance and spin glass theory. H. Nishimori: Statistical Physics of Spin Glasses and Information Processing: An Introduction, Oxford University Press, Oxford, 2001.