EM Algorithm with Markov Chain Monte Carlo Method for Bayesian Image Analysis. Kazuyuki Tanaka, Graduate School of Information Sciences, Tohoku University. University of Glasgow, 10 October 2007.

Presentation transcript:

Slide 1: EM Algorithm with Markov Chain Monte Carlo Method for Bayesian Image Analysis. Kazuyuki Tanaka, Graduate School of Information Sciences, Tohoku University. Collaborator: D. M. Titterington (Department of Statistics, University of Glasgow).

Slide 2: Contents. 1. Introduction. 2. Gaussian Graphical Model and EM Algorithm. 3. Markov Chain Monte Carlo Method. 4. Concluding Remarks.

Slide 3: Contents (section divider for 1. Introduction).

Slide 4: MRF and Statistical Inference. Geman and Geman (1984), IEEE Transactions on PAMI: image processing with Markov random fields (MRF) (simulated annealing, line fields). How can we estimate the hyperparameters of the degradation process and of the prior model from the observed data alone? By the EM algorithm. In the EM algorithm, we have to calculate certain statistical quantities in the posterior and the prior models; the candidate tools are belief propagation and the Markov chain Monte Carlo method.

Slide 5: Statistical Analysis of the EM Algorithm. J. Inoue and K. Tanaka, Phys. Rev. E 2002 and J. Phys. A 2003: statistical behaviour of the EM algorithm for MRFs (graphical models on the complete graph). K. Tanaka, H. Shouno, M. Okada and D. M. Titterington, J. Phys. A 2004: hyperparameter estimation using belief propagation (BP) for the Gaussian graphical model in image processing; the statistical behaviour of the EM algorithm with belief propagation can be estimated analytically. K. Tanaka and D. M. Titterington, J. Phys. A 2007: statistical trajectory of the approximate EM algorithm for probabilistic image processing.

Slide 6: Contents (section divider for 2. Gaussian Graphical Model and EM Algorithm).

Slide 7: Bayesian Image Restoration. The original image is corrupted by noise during transmission, producing the degraded image.

Slide 8: Bayes Formula and Probabilistic Image Processing. The posterior probability of the original image f given the degraded image g is obtained pixel-wise from the prior probability and the degradation process via the Bayes formula, P(f|g) = P(g|f) P(f) / P(g).

Slide 9: Prior Probability in Probabilistic Image Processing. Samples from the prior are generated by the Markov chain Monte Carlo method. B: set of all nearest-neighbour pairs of pixels; Ω: set of all pixels (nodes).

Slide 10: Degradation Process. The degradation is additive white Gaussian noise; a histogram of Gaussian random numbers illustrates the noise distribution.
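The additive white Gaussian noise degradation can be sketched as follows (a minimal illustration; the flattened image, the noise level sigma = 2.0, and the function name `degrade` are assumptions, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)

def degrade(f, sigma):
    """Additive white Gaussian noise: g_i = f_i + n_i with n_i ~ N(0, sigma^2)."""
    return f + rng.normal(0.0, sigma, size=f.shape)

f = np.zeros(100_000)        # stand-in "original image" (flattened to 1-D)
g = degrade(f, sigma=2.0)    # degraded image
```

The histogram of g - f reproduces the Gaussian noise histogram shown on the slide.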

Slide 11: Degradation Process and Prior. Combining the degradation process with the prior probability density function yields the posterior probability density function; since all three are Gaussian, the required marginalizations are carried out with the multi-dimensional Gaussian integral formula.
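Because prior and noise are both Gaussian, the posterior is Gaussian too, and the multi-dimensional Gaussian integral formula reduces the posterior mean to one linear solve. A sketch on a 1-D chain of pixels (the chain topology and the hyperparameter names alpha and sigma2 are illustrative assumptions):

```python
import numpy as np

def posterior_mean(g, alpha, sigma2):
    """Posterior mean m = A^{-1} g / sigma^2 of the Gaussian graphical model,
    with posterior precision A = I/sigma^2 + alpha * C, where C is the graph
    Laplacian built from the nearest-neighbour pairs B of a 1-D chain."""
    n = len(g)
    C = np.zeros((n, n))
    for i in range(n - 1):            # each nearest-neighbour pair (i, i+1)
        C[i, i] += 1.0
        C[i + 1, i + 1] += 1.0
        C[i, i + 1] -= 1.0
        C[i + 1, i] -= 1.0
    A = np.eye(n) / sigma2 + alpha * C
    return np.linalg.solve(A, g / sigma2)
```

With alpha = 0 the prior imposes no smoothing and the posterior mean is just the observed image g; increasing alpha pulls isolated spikes towards their neighbours.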

Slide 12: Maximization of Marginal Likelihood by EM Algorithm. Form the marginal likelihood and iterate the E- and M-steps, defined through the Q-function, until convergence. A. P. Dempster, N. M. Laird and D. B. Rubin, "Maximum likelihood from incomplete data via the EM algorithm," J. Roy. Statist. Soc. B, 39 (1977).
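For this Gaussian model one EM iteration can be written in closed form, since the extremum conditions of the Q-function are solvable exactly. The sketch below follows the standard Gaussian-MRF derivation on a 1-D chain and may differ in detail from the slide's own update formulas (the helper names are assumptions):

```python
import numpy as np

def chain_laplacian(n):
    """Graph Laplacian of a 1-D chain (set B of nearest-neighbour pairs)."""
    C = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
    C[0, 0] = C[-1, -1] = 1.0          # endpoints have a single neighbour
    return C

def em_step(g, alpha, sigma2, C):
    """One EM iteration for the hyperparameters (alpha, sigma^2).
    E-step: posterior mean m and covariance S of the Gaussian posterior.
    M-step: closed-form maximizers of the Q-function."""
    n = len(g)
    A = np.eye(n) / sigma2 + alpha * C     # posterior precision matrix
    S = np.linalg.inv(A)                   # posterior covariance
    m = S @ (g / sigma2)                   # posterior mean
    sigma2_new = (np.sum((g - m) ** 2) + np.trace(S)) / n
    e_prior = m @ C @ m + np.trace(C @ S)  # E[ f^T C f | g ]
    alpha_new = (n - 1) / e_prior          # rank(C) = n - 1 on a chain
    return alpha_new, sigma2_new, m
```

Iterating `em_step` until (alpha, sigma2) stop changing realizes the marginal-likelihood maximization described on the slide.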

Slide 13: Maximization of Marginal Likelihood by EM (Expectation-Maximization) Algorithm. The update rule is equivalent to the extremum conditions of Q(α, σ | α(t), σ(t), g) with respect to α and σ.

Slide 14: Maximization of Marginal Likelihood by EM (Expectation-Maximization) Algorithm (continued).

Slide 15: Statistical Behaviour of EM (Expectation-Maximization) Algorithm. Numerical experiments for a standard image.

Slide 16: Contents (section divider for 3. Markov Chain Monte Carlo Method).

Slide 17: Maximization of Marginal Likelihood by EM (Expectation-Maximization) Algorithm. The posterior expectations in the update rule are now evaluated by the Markov chain Monte Carlo method.

Slide 18: Markov Chain Monte Carlo Method. Basic step: the value f_i(t) at pixel i is updated to f_i(t+1) according to the transition probability w_i(f(t+1)|f(t)), which depends only on pixel i and its neighbourhood c_i.
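For the Gaussian posterior the basic step can be realized as a single-site Gibbs update: each pixel is redrawn from its Gaussian conditional given the current values of its neighbours, which plays the role of the transition probability w_i(f(t+1)|f(t)). A sketch on a 1-D chain (the chain topology and function name are assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)

def gibbs_sweep(f, g, alpha, sigma2):
    """One sequential (non-synchronized) sweep of single-site Gibbs updates.
    Pixel i is redrawn from its Gaussian conditional given its current
    neighbours c_i and the observed pixel g_i."""
    n = len(f)
    for i in range(n):
        nbrs = [j for j in (i - 1, i + 1) if 0 <= j < n]
        prec = 1.0 / sigma2 + alpha * len(nbrs)   # conditional precision
        mean = (g[i] / sigma2 + alpha * sum(f[j] for j in nbrs)) / prec
        f[i] = rng.normal(mean, np.sqrt(1.0 / prec))
    return f
```

Repeating the sweep generates a Markov chain whose stationary distribution is the posterior.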

Slide 19: Markov Chain Monte Carlo Method. The marginal probability of each pixel value f_i can be estimated from a histogram (frequency versus f_i) of the MCMC samples.
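Collecting the value of a single pixel over many MCMC steps gives a histogram that approximates its marginal distribution, and the marginal mean and variance follow from the same samples. A small illustration, with synthetic Gaussian draws standing in for the MCMC samples of one pixel:

```python
import numpy as np

rng = np.random.default_rng(2)

# Stand-in for MCMC draws of a single pixel value f_i.
samples = rng.normal(loc=0.5, scale=1.0, size=5000)

# Histogram approximation of the marginal density of f_i.
hist, edges = np.histogram(samples, bins=30, density=True)

# Marginal mean and variance estimated from the same samples.
mean_i = samples.mean()
var_i = samples.var()
```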

Slide 20: Markov Chain Monte Carlo Method. The EM update rule with its posterior expectations evaluated by MCMC: an EM outer loop around an MCMC inner loop.
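Putting the two together, the posterior expectations in the EM update rule can be replaced by averages over a fixed number of MCMC samples (the experiments on the following slides use 20 samples). A self-contained sketch on a 1-D chain; the sample and sweep counts and the update formulas are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)

def mcmc_em_step(g, alpha, sigma2, n_samples=20, n_sweeps=50):
    """One EM iteration in which the E-step expectations are Monte Carlo
    averages; each of the n_samples chains runs n_sweeps Gibbs sweeps."""
    n = len(g)
    sq_err = 0.0   # estimate of E[ sum_i (g_i - f_i)^2 | g ]
    e_prior = 0.0  # estimate of E[ sum_{(i,j) in B} (f_i - f_j)^2 | g ]
    for _ in range(n_samples):
        f = g.astype(float).copy()
        for _ in range(n_sweeps):
            for i in range(n):                    # non-synchronized update
                nbrs = [j for j in (i - 1, i + 1) if 0 <= j < n]
                prec = 1.0 / sigma2 + alpha * len(nbrs)
                mean = (g[i] / sigma2 + alpha * sum(f[j] for j in nbrs)) / prec
                f[i] = rng.normal(mean, np.sqrt(1.0 / prec))
        sq_err += np.sum((g - f) ** 2) / n_samples
        e_prior += np.sum((f[1:] - f[:-1]) ** 2) / n_samples
    return (n - 1) / e_prior, sq_err / n          # new (alpha, sigma^2)
```

Because the expectations are sampled rather than exact, the resulting hyperparameter trajectory fluctuates around the trajectory of the exact EM algorithm.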

Slide 21: Markov Chain Monte Carlo Method. Non-synchronized (sequential) update; numerical experiments for a standard image with 20 samples. Input/output images compare the MCMC-based EM algorithm with 50 MCMC steps per iteration, with a single MCMC step per iteration, and the exact EM algorithm.


Slide 23: Contents (section divider for 4. Concluding Remarks).

Slide 24: Summary. We constructed EM algorithms by means of the Markov chain Monte Carlo method and compared them with exact calculations (input/output images for the exact EM algorithm and for the MCMC-based EM algorithm).

Slide 25: New Project 1. Basic step: f_i(t) is updated to f_i(t+1) according to the transition probability w_i(f(t+1)|f(t)) on pixel i and its neighbourhood c_i. Can we derive the trajectory of the EM algorithm by solving the master equations for any step t?

Slide 26: New Project 1. Transition probability on pixel i and its neighbourhood c_i. From the solution of the master equation we calculate the expectations that appear in the EM update rules.

Slide 27: New Project 2. Can we replace the calculation of the statistical quantities in the prior probability by MCMC as well?

Slide 28: New Project 3. Our previous works on the EM algorithm and loopy belief propagation: K. Tanaka, H. Shouno, M. Okada and D. M. Titterington, J. Phys. A 2004 (hyperparameter estimation using belief propagation for the Gaussian graphical model in image processing); K. Tanaka and D. M. Titterington, J. Phys. A 2007 (statistical trajectory of the approximate EM algorithm for probabilistic image processing).

Slide 29: New Project 3. Restored images by loopy belief propagation and by the exact calculation (MSE: 327 and 315).

Slide 30: New Project 3. Statistical behaviour of the EM algorithm: numerical experiments for a standard image, comparing loopy BP with the exact calculation.

Slide 31: New Project 3. Can we update both the messages and the hyperparameters in the same step? Can we calculate the statistical trajectory? Towards a more practical algorithm combining BP and EM.