Probabilistic image processing and Bayesian network


Probabilistic image processing and Bayesian network Kazuyuki Tanaka Graduate School of Information Sciences, Tohoku University kazu@smapip.is.tohoku.ac.jp http://www.smapip.is.tohoku.ac.jp/~kazu/ References K. Tanaka: Statistical-mechanical approach to image processing (Topical Review), J. Phys. A, vol.35, pp.R81-R150 (2002). K. Tanaka, H. Shouno, M. Okada and D. M. Titterington: Accuracy of the Bethe approximation for hyperparameter estimation in probabilistic image processing, J. Phys. A, vol.37, pp.8675-8695 (2004). RC2005 (19 July, 2005, Sendai)

Bayesian Network and Belief Propagation Bayes Formula Probabilistic Model Probabilistic Information Processing Belief Propagation J. Pearl: Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference (Morgan Kaufmann, 1988). C. Berrou and A. Glavieux: Near optimum error correcting coding and decoding: Turbo-codes, IEEE Trans. Comm., 44 (1996).

Formulation of Belief Propagation Link between belief propagation and statistical mechanics. Y. Kabashima and D. Saad, Belief propagation vs. TAP for decoding corrupted messages, Europhys. Lett. 44 (1998). M. Opper and D. Saad (eds), Advanced Mean Field Methods: Theory and Practice (MIT Press, 2001). Generalized belief propagation J. S. Yedidia, W. T. Freeman and Y. Weiss: Constructing free-energy approximations and generalized belief propagation algorithms, IEEE Transactions on Information Theory, 51 (2005). Information geometrical interpretation of belief propagation S. Ikeda, T. Tanaka and S. Amari: Stochastic reasoning, free energy, and information geometry, Neural Computation, 16 (2004).

Application of Belief Propagation Image Processing K. Tanaka: Statistical-mechanical approach to image processing (Topical Review), J. Phys. A, 35 (2002). A. S. Willsky: Multiresolution Markov Models for Signal and Image Processing, Proceedings of IEEE, 90 (2002). Low Density Parity Check Codes Y. Kabashima and D. Saad: Statistical mechanics of low-density parity-check codes (Topical Review), J. Phys. A, 37 (2004). S. Ikeda, T. Tanaka and S. Amari: Information geometry of turbo and low-density parity-check codes, IEEE Transactions on Information Theory, 50 (2004). CDMA Multiuser Detection Algorithm Y. Kabashima: A CDMA multiuser detection algorithm on the basis of belief propagation, J. Phys. A, 36 (2003). T. Tanaka and M. Okada: Approximate belief propagation, density evolution, and statistical neurodynamics for CDMA multiuser detection, IEEE Transactions on Information Theory, 51 (2005). Satisfiability Problem O. C. Martin, R. Monasson, R. Zecchina: Statistical mechanics methods and phase transitions in optimization problems, Theoretical Computer Science, 265 (2001). M. Mezard, G. Parisi, R. Zecchina: Analytic and algorithmic solution of random satisfiability problems, Science, 297 (2002).

Contents Introduction Belief Propagation Bayesian Image Analysis and Gaussian Graphical Model Image Segmentation Concluding Remarks

Belief Propagation How should we treat the calculation of the summation over 2^N configurations? It is very hard to calculate exactly except in some special cases. Formulation of the approximate algorithm Accuracy of the approximate algorithm

Tractable Model Probabilistic models with no loops are tractable (factorizable). Probabilistic models with loops are not tractable (not factorizable).
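The factorization property can be made concrete with a small example (my own illustration, not from the slides; the chain length N and coupling J are arbitrary choices): on a loop-free binary chain with pairwise weights exp(J x_i x_{i+1}), the marginal obtained by passing two-component messages along the chain coincides with the brute-force summation over all 2^N configurations.

```python
import itertools
import math

# Hypothetical binary chain model (no loops): p(x) proportional to exp(J * sum x_i x_{i+1})
N, J = 6, 0.5

def weight(x):
    # unnormalized probability of one configuration
    return math.exp(J * sum(x[i] * x[i + 1] for i in range(N - 1)))

# Brute force: sum over all 2^N configurations (intractable for large N)
configs = list(itertools.product([-1, 1], repeat=N))
Z = sum(weight(x) for x in configs)
p_brute = sum(weight(x) for x in configs if x[N // 2] == 1) / Z

# Because the chain has no loop, the sum factorizes: eliminate spins one by
# one, passing a two-component "message" along the chain, O(N) instead of O(2^N)
def message(n_steps):
    m = {-1: 1.0, 1: 1.0}
    for _ in range(n_steps):
        m = {xi: sum(m[xj] * math.exp(J * xj * xi) for xj in (-1, 1))
             for xi in (-1, 1)}
    return m

left = message(N // 2)           # sums out spins to the left of the middle
right = message(N - 1 - N // 2)  # sums out spins to the right (same J everywhere)
num = {xi: left[xi] * right[xi] for xi in (-1, 1)}
p_chain = num[1] / (num[-1] + num[1])
```

On a graph with loops this elimination no longer factorizes, which is where the approximate treatment discussed on the next slides comes in.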

Probabilistic Model on a Graph with Loops Marginal Probability

Message Passing Rule of Belief Propagation The reducibility conditions can be rewritten as the following fixed point equations. These fixed point equations correspond to the extremum condition of the Bethe free energy, and they can be solved numerically by natural iteration. The resulting algorithm corresponds to loopy belief propagation. Fixed Point Equations for Messages

Approximate Representation of Marginal Probability In the Bethe approximation, the marginal probabilities are assumed to take the following form in terms of the messages from the neighboring pixels to each pixel. These marginal probabilities satisfy the reducibility conditions at each pixel and each nearest-neighbor pair of pixels. The messages are determined so as to satisfy the reducibility conditions. Fixed Point Equations for Messages

Fixed Point Equation and Iterative Method
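The iterative solution of the fixed point equations can be sketched as follows (the slide's equations are images, so the pairwise potential exp(J x_i x_j), the single local field h at the center pixel, and the 3x3 grid size are illustrative assumptions): messages are swept until they stop changing, and the resulting belief is compared with the exact marginal obtained by brute force.

```python
import itertools
import math

# Hypothetical 3x3 binary MRF: pairwise psi(x_i, x_j) = exp(J x_i x_j),
# one local field h at the center pixel so the marginals are nontrivial.
J, size = 0.3, 3
nodes = [(r, c) for r in range(size) for c in range(size)]
h = {u: (0.2 if u == (1, 1) else 0.0) for u in nodes}

def neighbors(u):
    r, c = u
    return [(r + dr, c + dc) for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1))
            if 0 <= r + dr < size and 0 <= c + dc < size]

# Messages m[(u, v)][x_v], initialized uniformly, updated until a fixed point.
m = {(u, v): {-1: 1.0, 1: 1.0} for u in nodes for v in neighbors(u)}
for sweep in range(500):
    delta = 0.0
    for (u, v) in m:
        new = {}
        for xv in (-1, 1):
            new[xv] = sum(math.exp(h[u] * xu + J * xu * xv)
                          * math.prod(m[(w, u)][xu]
                                      for w in neighbors(u) if w != v)
                          for xu in (-1, 1))
        s = new[-1] + new[1]          # normalize for numerical stability
        new = {x: val / s for x, val in new.items()}
        delta = max(delta, abs(new[1] - m[(u, v)][1]))
        m[(u, v)] = new
    if delta < 1e-12:                 # fixed point of the message equations
        break

def belief(u):                        # marginal estimate: product of incoming messages
    b = {x: math.exp(h[u] * x) * math.prod(m[(w, u)][x] for w in neighbors(u))
         for x in (-1, 1)}
    s = b[-1] + b[1]
    return {x: v / s for x, v in b.items()}

# Exact marginal by brute force (2^9 configurations) for comparison.
Z, p_num = 0.0, 0.0
for x in itertools.product([-1, 1], repeat=len(nodes)):
    cfg = dict(zip(nodes, x))
    w = math.exp(sum(h[u] * cfg[u] for u in nodes)
                 + 0.5 * J * sum(cfg[u] * cfg[v]   # each edge counted twice, halved
                                 for u in nodes for v in neighbors(u)))
    Z += w
    if cfg[(1, 1)] == 1:
        p_num += w
p_exact = p_num / Z
p_bp = belief((1, 1))[1]
```

On this small loopy grid the belief is close to, but not exactly equal to, the exact marginal; that accuracy question is exactly what the later slides study.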

Contents Introduction Belief Propagation Bayesian Image Analysis and Gaussian Graphical Model Image Segmentation Concluding Remarks

Bayesian Image Analysis Noise Transmission Original Image Degraded Image

Bayesian Image Analysis Degradation Process Additive White Gaussian Noise Transmission Original Image Degraded Image
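The degradation process on this slide, additive white Gaussian noise, can be written as a short sketch (plain Python lists stand in for a real image format; the 2x2 image and sigma are illustrative values only):

```python
import random

def degrade(image, sigma, seed=0):
    """Additive white Gaussian noise: y_i = x_i + n_i with n_i ~ N(0, sigma^2),
    applied independently to every pixel."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    return [[pixel + rng.gauss(0.0, sigma) for pixel in row] for row in image]

original = [[100.0, 120.0], [130.0, 110.0]]   # hypothetical 2x2 gray-level image
degraded = degrade(original, sigma=40.0)
```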

Bayesian Image Analysis A Priori Probability Generate Standard Images Similar?

Bayesian Image Analysis A Posteriori Probability Gaussian Graphical Model

Bayesian Image Analysis A Priori Probability Degraded Image Degraded Image Original Image Pixels A Posteriori Probability
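Since the equations on this slide are images, the following reconstruction of the standard Gaussian graphical model for Bayesian image restoration may help (the notation, alpha for the smoothness hyperparameter and sigma for the noise level, is assumed):

```latex
% A priori probability (smoothness prior over nearest-neighbor pixel pairs):
P(x) \propto \exp\Bigl(-\frac{\alpha}{2}\sum_{\langle i,j\rangle}(x_i - x_j)^2\Bigr)
% Degradation process (additive white Gaussian noise with variance \sigma^2):
P(y \mid x) \propto \exp\Bigl(-\frac{1}{2\sigma^2}\sum_{i}(y_i - x_i)^2\Bigr)
% A posteriori probability via the Bayes formula:
P(x \mid y) = \frac{P(y \mid x)\,P(x)}{P(y)}
\propto \exp\Bigl(-\frac{1}{2\sigma^2}\sum_i (y_i - x_i)^2
                  - \frac{\alpha}{2}\sum_{\langle i,j\rangle}(x_i - x_j)^2\Bigr)
```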

Hyperparameter Determination by Maximization of Marginal Likelihood In image restoration, we usually have to estimate the hyperparameters alpha and p. In statistics, maximum likelihood estimation is often employed: the hyperparameters are determined so as to maximize the marginal likelihood, defined by marginalizing the joint probability of the original and degraded images with respect to the original image. The marginal likelihood is expressed in terms of the partition functions of the a priori and a posteriori probabilistic models, and these partition functions can be calculated approximately by the Bethe approximation. Marginalization Degraded Image Original Image Marginal Likelihood
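The marginal likelihood described above can be reconstructed as follows (notation assumed; the slide's own equations are images, and the statement that it reduces to a ratio of partition functions matches this form):

```latex
% Marginalize the joint probability over the original image x:
P(y \mid \alpha, \sigma) = \sum_{x} P(y \mid x, \sigma)\, P(x \mid \alpha)
% Since prior and likelihood are each normalized by a partition function,
% the marginal likelihood is a ratio of partition functions:
P(y \mid \alpha, \sigma)
  = \frac{Z_{\text{post}}(\alpha, \sigma \mid y)}{Z_{\text{prior}}(\alpha)\, Z_{\text{noise}}(\sigma)}
```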

Maximization of Marginal Likelihood by EM (Expectation Maximization) Algorithm Q-Function Incomplete Data Equivalent

Maximization of Marginal Likelihood by EM (Expectation Maximization) Algorithm Marginal Likelihood Q-Function EM Algorithm: iterate the following EM steps until convergence. A. P. Dempster, N. M. Laird and D. B. Rubin, "Maximum likelihood from incomplete data via the EM algorithm," J. Roy. Stat. Soc. B, 39 (1977).
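The talk applies EM to the hyperparameters of the image model; as a runnable stand-in with the same E-step/M-step structure (the two-component mixture, the synthetic data, and the initialization are my own illustrative choices, not the slides' model), here is a minimal EM for a 1D Gaussian mixture:

```python
import math
import random

def em_gmm2(data, iters=200):
    """Minimal EM for a two-component 1D Gaussian mixture (illustrative stand-in)."""
    mu = [min(data), max(data)]        # crude but well-separated initialization
    var = [1.0, 1.0]
    pi = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibilities (posterior probability of each component)
        resp = []
        for x in data:
            w = [pi[k] / math.sqrt(2.0 * math.pi * var[k])
                 * math.exp(-(x - mu[k]) ** 2 / (2.0 * var[k])) for k in range(2)]
            s = w[0] + w[1]
            if s == 0.0:               # numerical underflow guard
                w, s = [0.5, 0.5], 1.0
            resp.append([w[0] / s, w[1] / s])
        # M-step: re-estimate parameters from the expected sufficient statistics
        for k in range(2):
            nk = sum(r[k] for r in resp)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = max(sum(r[k] * (x - mu[k]) ** 2
                             for r, x in zip(resp, data)) / nk, 1e-6)
            pi[k] = nk / len(data)
    return mu, var, pi

rng = random.Random(1)
data = ([rng.gauss(0.0, 1.0) for _ in range(300)]
        + [rng.gauss(8.0, 1.0) for _ in range(300)])
mu, var, pi = em_gmm2(data)
```

Each iteration provably does not decrease the marginal likelihood, which is the property the slides rely on when maximizing it over the hyperparameters.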

One-Dimensional Signal (plot: original signal, degraded signal, and signal estimated by the EM algorithm; gray levels in the range 0 to 255)

Image Restoration by Gaussian Graphical Model EM Algorithm with Belief Propagation Original Image Degraded Image MSE: 1512 MSE: 1529

Exact Results of Gaussian Graphical Model Multi-dimensional Gauss integral formula
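The multi-dimensional Gauss integral formula referred to on this slide is the standard identity for a symmetric positive-definite N x N matrix A, which is what makes the Gaussian graphical model exactly solvable:

```latex
\int_{\mathbb{R}^N} \exp\Bigl(-\tfrac{1}{2}\, x^{\mathsf T} A x + b^{\mathsf T} x\Bigr)\, dx
  = \sqrt{\frac{(2\pi)^N}{\det A}}\; \exp\Bigl(\tfrac{1}{2}\, b^{\mathsf T} A^{-1} b\Bigr)
% Hence the mean and covariance of the corresponding Gaussian distribution are
\langle x \rangle = A^{-1} b, \qquad
\bigl\langle (x - \langle x\rangle)(x - \langle x\rangle)^{\mathsf T} \bigr\rangle = A^{-1}
```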

Comparison of Belief Propagation with Exact Results in Gaussian Graphical Model MSE Belief Propagation 327 0.000611 36.302 -5.19201 Exact 315 0.000759 37.919 -5.21444 Finally, we show only the results for gray-level image restoration. For each numerical experiment, loopy belief propagation can give better results than the conventional filters. MSE Belief Propagation 260 0.000574 33.998 -5.15241 Exact 236 0.000652 34.975 -5.17528

Image Restoration by Gaussian Graphical Model Original Image Degraded Image Belief Propagation Exact MSE: 1512 MSE: 325 MSE: 315 Lowpass Filter Wiener Filter Median Filter MSE: 411 MSE: 545 MSE: 447

Image Restoration by Gaussian Graphical Model Original Image Degraded Image Belief Propagation Exact MSE: 1529 MSE: 260 MSE: 236 Lowpass Filter Wiener Filter Median Filter MSE: 224 MSE: 372 MSE: 244

Extension of Belief Propagation Generalized Belief Propagation J. S. Yedidia, W. T. Freeman and Y. Weiss: Constructing free-energy approximations and generalized belief propagation algorithms, IEEE Transactions on Information Theory, 51 (2005). Generalized belief propagation is equivalent to the cluster variation method in statistical mechanics. R. Kikuchi: A theory of cooperative phenomena, Phys. Rev., 81 (1951). T. Morita: Cluster variation method of cooperative phenomena and its generalization I, J. Phys. Soc. Jpn, 12 (1957).

Image Restoration by Gaussian Graphical Model MSE Belief Propagation 327 0.000611 36.302 -5.19201 Generalized Belief Propagation 315 0.000758 37.909 -5.21172 Exact 0.000759 37.919 -5.21444 MSE Belief Propagation 260 0.000574 33.998 -5.15241 Generalized Belief Propagation 236 0.000652 34.971 -5.17256 Exact 34.975 -5.17528

Image Restoration by Gaussian Graphical Model and Conventional Filters MSE Belief Propagation 327 Lowpass Filter (3x3) 388 (5x5) 413 Generalized Belief Propagation 315 Median Filter 486 445 Exact Wiener Filter 864 548 GBP (3x3) Lowpass (5x5) Median (5x5) Wiener

Image Restoration by Gaussian Graphical Model and Conventional Filters MSE Belief Propagation 260 Lowpass Filter (3x3) 241 (5x5) 224 Generalized Belief Propagation 236 Median Filter 331 244 Exact Wiener Filter 703 372 GBP (5x5) Lowpass (5x5) Median (5x5) Wiener

Contents Introduction Belief Propagation Bayesian Image Analysis and Gaussian Graphical Model Image Segmentation Concluding Remarks

Image Segmentation by Gauss Mixture Model

Image Segmentation by Combining Gauss Mixture Model with Potts Model Belief Propagation Potts Model
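A minimal sketch of the combination (this is not the slides' algorithm: belief propagation on the Potts model is replaced here by the simpler greedy ICM update, and the image, class means, sigma, and beta are all illustrative assumptions): each pixel trades off a Gaussian likelihood term against a Potts smoothness term over its four neighbors.

```python
def segment(image, means, sigma, beta, sweeps=5):
    """Label pixels by a Gaussian data term plus a Potts smoothness term,
    minimized greedily (ICM) instead of by belief propagation."""
    H, W = len(image), len(image[0])
    K = len(means)
    # Initial labels: nearest class mean (pure Gauss mixture, no Potts term).
    labels = [[min(range(K), key=lambda k: (image[r][c] - means[k]) ** 2)
               for c in range(W)] for r in range(H)]
    for _ in range(sweeps):
        for r in range(H):
            for c in range(W):
                def energy(k):
                    data = (image[r][c] - means[k]) ** 2 / (2.0 * sigma ** 2)
                    nbrs = [(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)]
                    smooth = sum(labels[rr][cc] != k
                                 for rr, cc in nbrs if 0 <= rr < H and 0 <= cc < W)
                    return data + beta * smooth   # Potts penalty per disagreeing neighbor
                labels[r][c] = min(range(K), key=energy)
    return labels

# Hypothetical 4x4 image: two flat regions plus one noisy pixel at (1, 0).
image = [[0.0, 0.0, 100.0, 100.0],
         [60.0, 0.0, 100.0, 100.0],
         [0.0, 0.0, 100.0, 100.0],
         [0.0, 0.0, 100.0, 100.0]]
labels = segment(image, means=[0.0, 100.0], sigma=30.0, beta=1.0)
```

With beta = 0 this reduces to plain Gauss-mixture classification; the Potts term is what removes isolated mislabeled pixels, which is the effect the slide's combination aims at.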

Image Segmentation Belief Propagation Original Image Histogram Gauss Mixture Model Gauss Mixture Model and Potts Model Histogram Belief Propagation

Motion Detection (a) Segmentation (b) Detection (AND) (c) Segmentation Gauss Mixture Model and Potts Model with Belief Propagation

Contents Introduction Belief Propagation Bayesian Image Analysis and Gaussian Graphical Model Image Segmentation Concluding Remarks

Summary Formulation of belief propagation Accuracy of belief propagation in Bayesian image analysis by means of Gaussian graphical model (Comparison between the belief propagation and exact calculation) Some applications of Bayesian image analysis and belief propagation

Related Problem Statistical Performance Spin Glass Theory H. Nishimori: Statistical Physics of Spin Glasses and Information Processing: An Introduction, Oxford University Press, Oxford, 2001.

Trends in Probabilistic Information Processing K. Tanaka and Y. Kabashima (eds.), "Mini Special Issue: Bayesian Statistics, Statistical Mechanics and Information Processing," Keisoku to Seigyo (Journal of the Society of Instrument and Control Engineers), August 2003 issue. K. Tanaka, T. Tanaka, O. Watanabe et al., "Series: Probabilistic Information Processing and Statistical Mechanics: Various Approaches and Their Tutorials," Suri Kagaku (Mathematical Sciences), starting from the November 2004 issue. K. Tanaka, M. Okada, T. Horiguchi et al., "Special Section: Secret Computational Techniques for Taming Probability: Old and New Paradigms of Probabilistic and Statistical Models," Journal of the IEICE, September 2005 issue.