Probabilistic image processing and Bayesian network


Probabilistic image processing and Bayesian network
Kazuyuki Tanaka
Graduate School of Information Sciences, Tohoku University
kazu@smapip.is.tohoku.ac.jp
http://www.smapip.is.tohoku.ac.jp/~kazu/

References
K. Tanaka: Statistical-mechanical approach to image processing (Topical Review), J. Phys. A, vol. 35, pp. R81-R150 (2002).
K. Tanaka, H. Shouno, M. Okada and D. M. Titterington: Accuracy of the Bethe approximation for hyperparameter estimation in probabilistic image processing, J. Phys. A, vol. 37, pp. 8675-8695 (2004).

8 November, 2005, CISJ2005

Bayesian Network and Belief Propagation
Bayes formula → probabilistic model → probabilistic information processing → belief propagation
J. Pearl: Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference (Morgan Kaufmann, 1988).
C. Berrou and A. Glavieux: Near optimum error correcting coding and decoding: Turbo-codes, IEEE Trans. Comm., vol. 44 (1996).

Contents
Introduction
Belief Propagation
Bayesian Image Analysis and Gaussian Graphical Model
Concluding Remarks

Belief Propagation
How should we treat the calculation of the summation over the 2^N configurations? It is very hard to calculate exactly except in some special cases.
Formulation of the approximate algorithm
Accuracy of the approximate algorithm

Tractable Model
Probabilistic models with no loops are tractable: they are factorizable. Probabilistic models with loops are not tractable: they are not factorizable.

Probabilistic model on a graph with no loops
[Figure: chain of nodes 1-6]
Marginal probability of node 2

Probabilistic model on a graph with no loops
[Figure: chain of nodes 1-6]
The marginal probability of node 2 can be expressed as the product of the messages from all the neighbouring nodes of node 2. The message from node 1 to node 2 can in turn be expressed as the product of the messages from all the neighbouring nodes of node 1 except node 2.
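As a toy illustration (not part of the original slides), the message computation for the marginal of node 2 on a small chain can be checked against brute-force summation. The three-node chain, the pairwise potential matrix W, and the use of Python/NumPy are all invented for this sketch:

```python
import numpy as np
from itertools import product

# Toy chain 1 - 2 - 3 of binary variables; W[x_i, x_j] is the pairwise
# potential on each edge (values are arbitrary, for illustration only).
W = np.array([[1.2, 0.5],
              [0.8, 1.5]])

# Leaf nodes send messages built only from the pairwise potential.
m_1to2 = W.sum(axis=0)   # message from node 1 to node 2: sum over x_1
m_3to2 = W.sum(axis=1)   # message from node 3 to node 2: sum over x_3

# Marginal of node 2 = normalised product of all incoming messages.
p2 = m_1to2 * m_3to2
p2 /= p2.sum()

# Brute-force check: enumerate all 2^3 configurations of the chain.
joint = np.zeros(2)
for x1, x2, x3 in product([0, 1], repeat=3):
    joint[x2] += W[x1, x2] * W[x2, x3]
joint /= joint.sum()

print(p2, joint)   # on a tree the two marginals agree exactly
```

On a graph with no loops this agreement is exact, which is the content of the slide; the brute-force sum over all configurations is what becomes infeasible for 2^N states.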

Probabilistic Model on a Graph with Loops
Marginal Probability

Belief Propagation: Message Update Rule
[Figure: a pixel and its neighbouring pixels on the square lattice]
In the Bethe approximation, the marginal probabilities are assumed to take the following form in terms of the messages from the neighbouring pixels to the pixel. These marginal probabilities satisfy the reducibility conditions at each pixel and at each nearest-neighbour pair of pixels. The messages are determined so as to satisfy the reducibility conditions.

Message Passing Rule of Belief Propagation
[Figure: a node and its neighbouring nodes]
The reducibility conditions can be rewritten as the following fixed-point equations for the messages. These fixed-point equations correspond to the extremum condition of the Bethe free energy, and they can be solved numerically by natural iteration. The resulting algorithm corresponds to loopy belief propagation.

Fixed Point Equation and Iterative Method
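A minimal sketch of the iterative solution of the fixed-point equations (loopy belief propagation) on the smallest loopy graph, a 2x2 grid. The potentials, the convergence threshold, and the sequential update order are assumptions of this illustration, not taken from the slides:

```python
import numpy as np

# Loopy BP on a 2x2 grid (a single loop), binary states.
edges = [(0, 1), (1, 3), (3, 2), (2, 0)]
psi = np.array([[2.0, 1.0],
                [1.0, 2.0]])   # symmetric pairwise potential (illustrative)
phi = np.ones((4, 2))          # uniform node potentials

# One message per directed edge, initialised uniformly.
msgs = {(i, j): np.ones(2) for a, b in edges for i, j in [(a, b), (b, a)]}
neighbours = {i: [j for a, b in edges
                  for i2, j in [(a, b), (b, a)] if i2 == i]
              for i in range(4)}

for _ in range(100):
    diff = 0.0
    for (i, j) in list(msgs):
        # Product of incoming messages to i, excluding the one from j.
        incoming = np.ones(2)
        for k in neighbours[i]:
            if k != j:
                incoming *= msgs[(k, i)]
        new = psi.T @ (phi[i] * incoming)   # sum over the states of node i
        new /= new.sum()
        diff = max(diff, np.abs(new - msgs[(i, j)]).max())
        msgs[(i, j)] = new
    if diff < 1e-10:   # fixed point of the message equations reached
        break

# Approximate marginal of node 0 from the converged messages.
b0 = phi[0] * np.prod([msgs[(k, 0)] for k in neighbours[0]], axis=0)
b0 /= b0.sum()
print(b0)
```

On graphs with loops the iteration is not guaranteed to converge in general; here the fully symmetric potentials make it settle immediately to the uniform marginal.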

Contents
Introduction
Belief Propagation
Bayesian Image Analysis and Gaussian Graphical Model
Concluding Remarks

Bayesian Image Analysis
[Figure: original image → transmission with noise → degraded image]

Bayesian Image Analysis: Degradation Process
Additive white Gaussian noise.
[Figure: original image → transmission → degraded image]
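The degradation process on this slide is additive white Gaussian noise applied independently to each pixel. A short sketch, with an invented random "image" and an assumed noise level sigma:

```python
import numpy as np

rng = np.random.default_rng(0)

sigma = 40.0                                            # assumed noise level
f = rng.integers(0, 256, size=(64, 64)).astype(float)   # stand-in original image
g = f + rng.normal(0.0, sigma, size=f.shape)            # degraded image

# The empirical mean squared error of the degradation is close to sigma**2.
mse = np.mean((g - f) ** 2)
print(mse)
```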

Bayesian Image Analysis: A Priori Probability
[Figure: images generated from the prior, compared with standard images - similar?]

Bayesian Image Analysis: A Posteriori Probability
Gaussian Graphical Model

Bayesian Image Analysis
[Figure: a priori probability over the original image pixels, the degraded image, and the resulting a posteriori probability]

Hyperparameter Determination by Maximization of the Marginal Likelihood
In image restoration we usually have to estimate the hyperparameters alpha and p. In statistics, maximum likelihood estimation is often employed. From the standpoint of maximum likelihood estimation, the hyperparameters are determined so as to maximize the marginal likelihood, defined by marginalizing the joint probability of the original and degraded images with respect to the original image. The marginal likelihood is expressed in terms of the partition functions of the a priori probabilistic model and the a posteriori probabilistic model. These partition functions can be calculated approximately by using the Bethe approximation.
[Figure: marginalization over the original image turns the joint probability into the marginal likelihood of the degraded image]

Maximization of the Marginal Likelihood by the EM (Expectation-Maximization) Algorithm
Q-Function
Maximizing the marginal likelihood of the incomplete data is equivalent to iteratively maximizing the Q-function.

Maximization of the Marginal Likelihood by the EM (Expectation-Maximization) Algorithm
Marginal Likelihood and Q-Function
EM Algorithm: iterate the following E- and M-steps until convergence.
A. P. Dempster, N. M. Laird and D. B. Rubin: Maximum likelihood from incomplete data via the EM algorithm, J. Roy. Stat. Soc. B, vol. 39 (1977).
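A runnable sketch of the EM iteration for a one-dimensional Gaussian case (anticipating the one-dimensional signal example that follows). The chain smoothness prior with hyperparameter alpha, the noise variance sigma2, the synthetic signal, and the closed-form E-step are all assumptions of this illustration, since the slide's own equations are not reproduced in the transcript:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 200

# Chain-graph Laplacian: f @ L @ f = sum_i (f[i] - f[i+1])**2 (rank N-1).
L = 2.0 * np.eye(N) - np.eye(N, k=1) - np.eye(N, k=-1)
L[0, 0] = L[-1, -1] = 1.0

# Synthetic data: a smooth signal degraded by additive Gaussian noise.
f_true = np.cumsum(rng.normal(0.0, 1.0, N))
g = f_true + rng.normal(0.0, 2.0, N)

alpha, sigma2 = 1.0, 1.0   # initial hyperparameter guesses
for _ in range(200):
    # E-step: the posterior of f is Gaussian with precision A and mean m.
    A = alpha * L + np.eye(N) / sigma2
    Ainv = np.linalg.inv(A)
    m = Ainv @ (g / sigma2)
    # M-step: re-estimate the hyperparameters from posterior expectations.
    sigma2 = (np.sum((g - m) ** 2) + np.trace(Ainv)) / N
    alpha = (N - 1) / (m @ L @ m + np.trace(Ainv @ L))

f_hat = m   # restored signal (posterior mean at the final hyperparameters)
print(alpha, sigma2)
```

On the slides the expectations needed in the E-step are computed approximately by loopy belief propagation; in this small Gaussian model the exact computation is feasible with a direct matrix inverse.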

One-Dimensional Signal
[Figure: original, degraded, and estimated signals obtained by the EM algorithm]

Image Restoration by the Gaussian Graphical Model
EM algorithm with belief propagation.
[Figure: two original images and their degraded versions, MSE 1512 and MSE 1529]

Exact Results of Gaussian Graphical Model
Multi-dimensional Gauss integral formula
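The multi-dimensional Gauss integral formula referred to on this slide is, for a positive-definite N x N matrix A (written here from the standard identity, since the slide's own equation is not reproduced in the transcript):

```latex
\int_{\mathbb{R}^N} \exp\!\left(-\tfrac{1}{2}\,\mathbf{x}^{\mathsf T}\mathbf{A}\,\mathbf{x}
  + \mathbf{b}^{\mathsf T}\mathbf{x}\right) d\mathbf{x}
  = \sqrt{\frac{(2\pi)^N}{\det\mathbf{A}}}\;
    \exp\!\left(\tfrac{1}{2}\,\mathbf{b}^{\mathsf T}\mathbf{A}^{-1}\mathbf{b}\right)
```

Applied to the Gaussian graphical model, this gives the posterior mean A^{-1} b and covariance A^{-1} in closed form, which is what makes the exact comparison with belief propagation possible.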

Comparison of Belief Propagation with Exact Results in Gaussian Graphical Model

Image 1               MSE
Belief Propagation    327   0.000611   36.302   -5.19201
Exact                 315   0.000759   37.919   -5.21444

Image 2               MSE
Belief Propagation    260   0.000574   33.998   -5.15241
Exact                 236   0.000652   34.975   -5.17528

Finally, we show only the results for gray-level image restoration. In each numerical experiment, loopy belief propagation gives better results than the conventional filters.

Image Restoration by the Gaussian Graphical Model
Original Image / Degraded Image (MSE: 1512) / Belief Propagation (MSE: 325) / Exact (MSE: 315)
Lowpass Filter (MSE: 411) / Wiener Filter (MSE: 545) / Median Filter (MSE: 447)

Image Restoration by the Gaussian Graphical Model
Original Image / Degraded Image (MSE: 1529) / Belief Propagation (MSE: 260) / Exact (MSE: 236)
Lowpass Filter (MSE: 224) / Wiener Filter (MSE: 372) / Median Filter (MSE: 244)

Extension of Belief Propagation: Generalized Belief Propagation
J. S. Yedidia, W. T. Freeman and Y. Weiss: Constructing free-energy approximations and generalized belief propagation algorithms, IEEE Transactions on Information Theory, vol. 51 (2005).
Generalized belief propagation is equivalent to the cluster variation method in statistical mechanics:
R. Kikuchi: A theory of cooperative phenomena, Phys. Rev., vol. 81 (1951).
T. Morita: Cluster variation method of cooperative phenomena and its generalization I, J. Phys. Soc. Jpn., vol. 12 (1957).

Image Restoration by the Gaussian Graphical Model

Image 1                           MSE
Belief Propagation                327   0.000611   36.302   -5.19201
Generalized Belief Propagation    315   0.000758   37.909   -5.21172
Exact                             315   0.000759   37.919   -5.21444

Image 2                           MSE
Belief Propagation                260   0.000574   33.998   -5.15241
Generalized Belief Propagation    236   0.000652   34.971   -5.17256
Exact                             236   0.000652   34.975   -5.17528

Image Restoration by the Gaussian Graphical Model and Conventional Filters

Method                            MSE
Belief Propagation                327
Generalized Belief Propagation    315
Exact                             315
Lowpass Filter (3x3)              388
Lowpass Filter (5x5)              413
Median Filter (3x3)               486
Median Filter (5x5)               445
Wiener Filter (3x3)               864
Wiener Filter (5x5)               548

[Figure: restored images by GBP, the (3x3) lowpass filter, the (5x5) median filter, and the (5x5) Wiener filter]

Image Restoration by the Gaussian Graphical Model and Conventional Filters

Method                            MSE
Belief Propagation                260
Generalized Belief Propagation    236
Exact                             236
Lowpass Filter (3x3)              241
Lowpass Filter (5x5)              224
Median Filter (3x3)               331
Median Filter (5x5)               244
Wiener Filter (3x3)               703
Wiener Filter (5x5)               372

[Figure: restored images by GBP, the (5x5) lowpass filter, the (5x5) median filter, and the (5x5) Wiener filter]

Contents
Introduction
Belief Propagation
Bayesian Image Analysis and Gaussian Graphical Model
Concluding Remarks

Summary
Formulation of belief propagation.
Accuracy of belief propagation in Bayesian image analysis by means of the Gaussian graphical model (comparison between belief propagation and the exact calculation).