
Generalized Belief Propagation for Gaussian Graphical Model in Probabilistic Image Processing
Kazuyuki Tanaka
Graduate School of Information Sciences, Tohoku University, Japan
http://www.smapip.is.tohoku.ac.jp/~kazu/
Reference: K. Tanaka, H. Shouno, M. Okada and D. M. Titterington: Accuracy of the Bethe approximation for hyperparameter estimation in probabilistic image processing, J. Phys. A: Math. Gen., 37, 8675 (2004).
6 September, 2005, SPDSA2005 (Roma)

Contents
Introduction
Loopy Belief Propagation
Generalized Belief Propagation
Probabilistic Image Processing
Concluding Remarks

Bayesian Image Analysis
[Diagram: original image → transmission with noise → degraded image]
Graphical model with loops = spin system on a square lattice.
Bayesian image analysis + belief propagation → probabilistic image processing.

Belief Propagation
Probabilistic model with no loops: belief propagation = transfer matrix (Lauritzen; Pearl).
Probabilistic model with loops: approximation → loopy belief propagation and generalized belief propagation (Yedidia, Freeman, Weiss).
Loopy belief propagation (LBP) = Bethe approximation.
Generalized belief propagation (GBP) = cluster variation method.
How accurate are LBP and GBP?

Gaussian Graphical Model
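The slide's defining equation did not survive the transcript. As a rough sketch (with illustrative parameter names `alpha` and `beta`, not the talk's hyperparameters), a Gaussian graphical model on a square lattice can be specified by a sparse precision matrix:

```python
import numpy as np

def grid_precision(n, alpha, beta):
    """Precision matrix of a Gaussian graphical model on an n x n lattice.

    Illustrative parameterization (alpha and beta are NOT the talk's
    hyperparameters): alpha couples nearest neighbours through a
    graph-Laplacian term and beta > 0 keeps the matrix positive
    definite, so p(f) ~ exp(-f^T A f / 2) is a proper Gaussian.
    """
    N = n * n
    A = np.zeros((N, N))
    for i in range(n):
        for j in range(n):
            k = i * n + j
            A[k, k] += beta
            # couple each node to its right and lower neighbour
            for di, dj in ((0, 1), (1, 0)):
                ii, jj = i + di, j + dj
                if ii < n and jj < n:
                    kk = ii * n + jj
                    A[k, k] += alpha
                    A[kk, kk] += alpha
                    A[k, kk] -= alpha
                    A[kk, k] -= alpha
    return A

A = grid_precision(4, alpha=1.0, beta=0.1)
cov = np.linalg.inv(A)  # exact covariance: every marginal is Gaussian
```

The sparsity pattern of `A` is exactly the lattice graph, which is what makes belief propagation applicable.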

Probabilistic Image Processing by Gaussian Graphical Model and Generalized Belief Propagation
How can we construct a probabilistic image processing algorithm using loopy belief propagation and generalized belief propagation?
How accurate are loopy belief propagation and generalized belief propagation?
To answer both questions, we adopt the Gaussian graphical model as the posterior probabilistic model.

Contents
Introduction
Loopy Belief Propagation
Generalized Belief Propagation
Probabilistic Image Processing
Concluding Remarks

Kullback-Leibler Divergence of the Gaussian Graphical Model
(Entropy term)

Loopy Belief Propagation
Trial function in a tractable form.

Loopy Belief Propagation
Trial function: a marginal distribution of a GGM is also a GGM.
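The fact used here, that a marginal distribution of a Gaussian graphical model is again Gaussian, is the standard Gaussian marginalization identity:

```latex
x \;=\; \begin{pmatrix} x_A \\ x_B \end{pmatrix}
\sim \mathcal{N}\!\left(
  \begin{pmatrix} \mu_A \\ \mu_B \end{pmatrix},
  \begin{pmatrix} \Sigma_{AA} & \Sigma_{AB} \\ \Sigma_{BA} & \Sigma_{BB} \end{pmatrix}
\right)
\quad\Longrightarrow\quad
x_A \sim \mathcal{N}(\mu_A,\, \Sigma_{AA}).
```

This is why the trial marginals in the Bethe free energy can be restricted to Gaussian form without loss.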

Loopy Belief Propagation
Bethe free energy in the GGM.
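The slide's formula is not preserved in the transcript. For reference, the generic Bethe free energy for a pairwise model $p(x) \propto \prod_{(i,j)} \psi_{ij} \prod_i \psi_i$ with trial marginals $b_i$, $b_{ij}$ and node degrees $d_i$ reads, in the continuous case:

```latex
F_{\mathrm{Bethe}}\bigl[\{b_i\},\{b_{ij}\}\bigr]
= \sum_{(i,j)} \int b_{ij}(x_i,x_j)\,
    \ln \frac{b_{ij}(x_i,x_j)}{\psi_{ij}(x_i,x_j)\,\psi_i(x_i)\,\psi_j(x_j)}
    \,dx_i\,dx_j
\;+\; \sum_i (1-d_i) \int b_i(x_i)\,
    \ln \frac{b_i(x_i)}{\psi_i(x_i)}\,dx_i .
```

It is minimized subject to the consistency constraints $\int b_{ij}(x_i,x_j)\,dx_j = b_i(x_i)$; for the GGM all trial marginals are Gaussian, so the integrals are available in closed form.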

Loopy Belief Propagation
At the fixed point, the mean m is exact.

Iteration Procedure
The fixed-point equation is solved by iteration.
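The fixed-point equation itself is not legible in the transcript, but the iteration it describes can be sketched for the Gaussian case. The following is a minimal Gaussian loopy BP solver in a standard parameterization (not the talk's exact notation); at a fixed point on a loopy graph the means are exact, while the variances carry the Bethe-approximation error:

```python
import numpy as np

def gaussian_lbp(A, b, iters=200):
    """Loopy belief propagation for p(x) ~ exp(-x^T A x / 2 + b^T x).

    Messages are Gaussian, parameterized by a precision part P[i, j]
    and a potential part h[i, j].  Returns approximate marginal means
    and variances; at convergence the means are exact.
    """
    N = len(b)
    nbrs = [[j for j in range(N) if j != i and A[i, j] != 0] for i in range(N)]
    P = {(i, j): 0.0 for i in range(N) for j in range(N) if j in nbrs[i]}
    h = {(i, j): 0.0 for i in range(N) for j in range(N) if j in nbrs[i]}
    for _ in range(iters):
        for i in range(N):
            for j in nbrs[i]:
                # cavity precision/potential at i, excluding j's message
                a = A[i, i] + sum(P[k, i] for k in nbrs[i] if k != j)
                c = b[i] + sum(h[k, i] for k in nbrs[i] if k != j)
                P[i, j] = -A[i, j] ** 2 / a
                h[i, j] = -A[i, j] * c / a
    prec = np.array([A[i, i] + sum(P[k, i] for k in nbrs[i]) for i in range(N)])
    mean = np.array([(b[i] + sum(h[k, i] for k in nbrs[i])) / prec[i]
                     for i in range(N)])
    return mean, 1.0 / prec

# A 4-cycle (the smallest loopy lattice): LBP means match the exact solution
A = np.array([[4., -1, -1, 0], [-1, 4, 0, -1], [-1, 0, 4, -1], [0, -1, -1, 4]])
b = np.array([1., 2., 3., 4.])
mean, var = gaussian_lbp(A, b)
```

Convergence here relies on the precision matrix being diagonally dominant; the slide's "m is exact" claim is exactly what the final assertion below checks.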

Loopy Belief Propagation and TAP Free Energy
(Mean-field free energy)

Contents
Introduction
Loopy Belief Propagation
Generalized Belief Propagation
Probabilistic Image Processing
Concluding Remarks

Generalized Belief Propagation
Cluster: a set of nodes. The set of basic clusters B is chosen so that no subcluster of an element of B belongs to B.
Example: a system consisting of 4 nodes.

Selection of B in LBP and GBP
LBP (Bethe approximation) and GBP (square approximation in the CVM), illustrated on a 3 × 3 lattice.

Selection of B and C in Loopy Belief Propagation
LBP (Bethe approximation): B is the set of basic clusters; C is the set of basic clusters and their subclusters.

Selection of B and C in Generalized Belief Propagation
GBP (square approximation in the CVM): B is the set of basic clusters; C is the set of basic clusters and their subclusters.
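A concrete way to read these two choices of basic clusters, sketched on an n × n lattice (the helper names are hypothetical, introduced only for illustration):

```python
def lbp_clusters(n):
    """LBP (Bethe approximation) on an n x n lattice: the basic
    clusters B are the nearest-neighbour pairs; C adds their
    subclusters, i.e. the single nodes."""
    B = [((i, j), (i, j + 1)) for i in range(n) for j in range(n - 1)]
    B += [((i, j), (i + 1, j)) for i in range(n - 1) for j in range(n)]
    return B

def gbp_clusters(n):
    """GBP (square approximation in the CVM): the basic clusters B
    are the 2 x 2 plaquettes; C adds their subclusters, the
    nearest-neighbour pairs and the single nodes."""
    return [((i, j), (i, j + 1), (i + 1, j), (i + 1, j + 1))
            for i in range(n - 1) for j in range(n - 1)]

# On the 3 x 3 lattice of the slides: 12 pair clusters, 4 square clusters
```

Because each 2 × 2 square contains a short loop, GBP treats those loops exactly and only approximates the longer ones, which is the source of its extra accuracy over LBP.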

Generalized Belief Propagation
Trial function: a marginal distribution of a GGM is also a GGM.

Generalized Belief Propagation
At the fixed point, the mean m is exact.

Contents
Introduction
Loopy Belief Propagation
Generalized Belief Propagation
Probabilistic Image Processing
Concluding Remarks

Bayesian Image Analysis
[Diagram: original image → transmission with noise → degraded image]

Bayesian Image Analysis: Degradation Process
The degraded image is obtained from the original image by transmission through additive white Gaussian noise.
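The degradation model can be sketched in a few lines (the array size and noise level `sigma` are illustrative, not the talk's values):

```python
import numpy as np

rng = np.random.default_rng(0)
f = rng.uniform(0.0, 255.0, size=(32, 32))    # stand-in "original image"
sigma = 40.0                                  # noise standard deviation (assumed)
g = f + rng.normal(0.0, sigma, size=f.shape)  # degraded image: g = f + n
```

Because the noise is white, the likelihood p(g | f) factorizes over pixels, which keeps the posterior a Gaussian graphical model on the lattice.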

Bayesian Image Analysis: A Priori Probability
Generate samples from the prior: are they similar to standard images?

Bayesian Image Analysis
Original image f, degraded image g. The a posteriori probability is a Gaussian graphical model.

Bayesian Image Analysis
[Diagram: a priori probability over the original image pixels, the degraded image, and the resulting a posteriori probability]

Hyperparameter Determination by Maximization of Marginal Likelihood
In image restoration we usually have to estimate the hyperparameters alpha and p. In statistics, maximum likelihood estimation is often employed: the hyperparameters are determined so as to maximize the marginal likelihood, defined by marginalizing the joint probability of the original and degraded images with respect to the original image. The marginal likelihood is expressed in terms of the partition functions of the a priori and a posteriori probabilistic models, and these partition functions can be calculated approximately by the Bethe approximation.

Maximization of Marginal Likelihood by the EM (Expectation-Maximization) Algorithm
Q-function for the incomplete-data problem: iteratively maximizing the Q-function is equivalent to maximizing the marginal likelihood.

Maximization of Marginal Likelihood by the EM (Expectation-Maximization) Algorithm
Iterate the E- and M-steps until convergence.
EM algorithm: A. P. Dempster, N. M. Laird and D. B. Rubin, "Maximum likelihood from incomplete data via the EM algorithm," J. Roy. Stat. Soc. B, 39 (1977).
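A minimal EM sketch, under two simplifying assumptions not made in the talk: the image is small enough that the E-step can be done by exact linear algebra instead of belief propagation, and only the noise variance is estimated while the smoothing hyperparameter is held fixed (the talk estimates both hyperparameters):

```python
import numpy as np

def em_sigma(g, L, alpha, sigma2, iters=20):
    """EM sketch for the Gaussian graphical model posterior
    p(f | g) ~ exp(-alpha/2 f^T L f - |f - g|^2 / (2 sigma2)).

    g: degraded image as a flat vector; L: graph Laplacian of the
    pixel lattice; alpha: fixed smoothing hyperparameter (an
    assumption made here for simplicity); sigma2: initial noise
    variance, updated by the M-step.
    """
    N = len(g)
    I = np.eye(N)
    for _ in range(iters):
        A = alpha * L + I / sigma2          # posterior precision
        cov = np.linalg.inv(A)              # E-step (exact, small N)
        m = cov @ (g / sigma2)              # posterior mean
        # M-step: sigma2 <- E[|g - f|^2] / N under the posterior
        sigma2 = (np.sum((g - m) ** 2) + np.trace(cov)) / N
    return sigma2, m

# Tiny example: 4 pixels arranged in a cycle
L = np.array([[2., -1, -1, 0], [-1, 2, 0, -1], [-1, 0, 2, -1], [0, -1, -1, 2]])
g = np.array([1., 2., 3., 4.])
sigma2, m = em_sigma(g, L, alpha=0.5, sigma2=1.0)
```

In the talk, the E-step expectations are instead computed approximately by loopy or generalized belief propagation, which is what makes the procedure scale to real images.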

Image Restoration
The original image is generated from the prior probability; the hyperparameters are estimated by maximization of the marginal likelihood.
[Images: original, degraded, and restorations by loopy belief propagation, the mean-field method, and the exact result]

Numerical Experiments on the Logarithm of the Marginal Likelihood
The original image is generated from the prior probability; the hyperparameters are estimated by maximization of the marginal likelihood.
[Plots of the log marginal likelihood (values around -5.0 to -6.0) for the mean-field approximation (MFA), loopy belief propagation (LBP), and the exact result]

Numerical Experiments on the Logarithm of the Marginal Likelihood
EM algorithm with belief propagation.
[Plots of the EM trajectories for the mean-field approximation (MFA), loopy belief propagation (LBP), and the exact result]

Image Restoration by Gaussian Graphical Model
EM algorithm with belief propagation.
[Images: original images and degraded images, MSE 1512 and 1529]
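For reference, the MSE quoted throughout these slides is the plain per-pixel mean squared error:

```python
import numpy as np

def mse(f, fhat):
    """Per-pixel mean squared error between an original image f
    and a restored (or degraded) image fhat."""
    f = np.asarray(f, dtype=float)
    fhat = np.asarray(fhat, dtype=float)
    return float(np.mean((f - fhat) ** 2))
```

Lower is better; the slides report it against the known original, which is available here because the test images are generated synthetically.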

Image Restoration by Gaussian Graphical Model
In each numerical experiment, loopy belief propagation gives better results than the conventional filters.
Degraded image MSE: 1512. Restorations: mean-field method 611, LBP 327, TAP 320, GBP 315, exact solution 315.

Image Restoration by Gaussian Graphical Model
Degraded image MSE: 1529. Restorations: mean-field method 565, LBP 260, TAP 248, GBP 236, exact solution 236.

Image Restoration by Gaussian Graphical Model

First image:
Method  MSE
MF      611   0.000263   26.918   -5.13083
LBP     327   0.000611   36.302   -5.19201
TAP     320   0.000674   37.170   -5.20265
GBP     315   0.000758   37.909   -5.21172
Exact         0.000759   37.919   -5.21444

Second image:
Method  MSE
MF      565   0.000293   26.353   -5.09121
LBP     260   0.000574   33.998   -5.15241
TAP     248   0.000610   34.475   -5.16297
GBP     236   0.000652   34.971   -5.17256
Exact                    34.975   -5.17528

Image Restoration by Gaussian Graphical Model and Conventional Filters
MSE: MF 611, LBP 327, TAP 320, GBP 315; lowpass filter (3×3) 388, (5×5) 413; median filter 486 and 445; Wiener filter 864 and 548.
[Images: restoration by GBP; (3×3) lowpass; (5×5) median; (5×5) Wiener]

Image Restoration by Gaussian Graphical Model and Conventional Filters
MSE: MF 565, LBP 260, TAP 248, GBP 236; lowpass filter (3×3) 241, (5×5) 224; median filter 331 and 244; Wiener filter 703 and 372.
[Images: restoration by GBP; (5×5) lowpass; (5×5) median; (5×5) Wiener]

Contents
Introduction
Loopy Belief Propagation
Generalized Belief Propagation
Probabilistic Image Processing
Concluding Remarks

Summary
Generalized belief propagation for the Gaussian graphical model.
Accuracy of generalized belief propagation.
Derivation of the TAP free energy for the Gaussian graphical model by a perturbation expansion of the Bethe approximation.

Future Problems
Hyperparameter estimation by the TAP free energy is better than by loopy belief propagation.
Effectiveness of higher-order terms of the TAP free energy for hyperparameter estimation via the marginal likelihood in Bayesian image analysis.