EM Algorithm with Markov Chain Monte Carlo Method for Bayesian Image Analysis
Kazuyuki Tanaka
Graduate School of Information Sciences, Tohoku University
http://www.smapip.is.tohoku.ac.jp/~kazu/
Collaborator: D. M. Titterington (Department of Statistics, University of Glasgow)
University of Glasgow, 10 October 2007
Contents
1. Introduction
2. Gaussian Graphical Model and EM Algorithm
3. Markov Chain Monte Carlo Method
4. Concluding Remarks
MRF and Statistical Inference
Geman and Geman (1984), IEEE Transactions on PAMI: image processing with Markov random fields (MRF), using simulated annealing and line fields.
How can we estimate the hyperparameters in the degradation process and in the prior model from the observed data alone? By the EM algorithm. In the EM algorithm we have to calculate statistical quantities of the posterior and the prior models, for example by belief propagation or by the Markov chain Monte Carlo method.
Statistical Analysis of the EM Algorithm
J. Inoue and K. Tanaka, Phys. Rev. E (2002) and J. Phys. A (2003): statistical behaviour of the EM algorithm for MRFs (graphical models on the complete graph).
K. Tanaka, H. Shouno, M. Okada and D. M. Titterington, J. Phys. A (2004): hyperparameter estimation using belief propagation (BP) for the Gaussian graphical model in image processing. The statistical behaviour of the EM algorithm with belief propagation can be estimated analytically.
K. Tanaka and D. M. Titterington, J. Phys. A (2007): statistical trajectory of the approximate EM algorithm for probabilistic image processing.
Bayesian Image Restoration
[Figure: the original image is transmitted through a noisy channel, producing the degraded image.]
Bayes Formula and Probabilistic Image Processing
The posterior probability of the original image given the degraded image is obtained from the prior probability and the degradation process by the Bayes formula; each pixel is a node of the graphical model.
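The Bayes formula invoked on this slide relates the posterior for the original image to the prior and the degradation process; writing f for the original image and g for the degraded image (symbols assumed here, since the slide's equations were not preserved):

```latex
P(f \mid g) = \frac{P(g \mid f)\, P(f)}{P(g)}
```

The degradation process supplies the likelihood P(g | f), and the prior P(f) encodes smoothness of natural images.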
Prior Probability in Probabilistic Image Processing
Samples from the prior are generated by the Markov chain Monte Carlo method over the set of all nodes; B denotes the set of all nearest-neighbour pairs of pixels.
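A Gaussian nearest-neighbour prior of the kind used in this talk can be sketched as follows; the hyperparameter symbol \alpha is an assumption of this reconstruction, since the slide's equations were images:

```latex
P(f) \propto \exp\Bigl(-\frac{\alpha}{2}\sum_{(i,j)\in B}(f_i - f_j)^2\Bigr)
```

Larger \alpha penalizes differences between neighbouring pixels more strongly, so MCMC samples from this prior look smoother.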
Degradation Process
The degradation process is additive white Gaussian noise; the slide illustrates it with a histogram of Gaussian random numbers.
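The additive white Gaussian noise degradation can be sketched in a few lines; the function name and seed handling are illustrative, not from the talk:

```python
import numpy as np

def degrade(f, sigma, seed=0):
    """Additive white Gaussian noise degradation: g = f + n, n ~ N(0, sigma^2).
    Each pixel is corrupted independently, as in the talk's noise model."""
    rng = np.random.default_rng(seed)
    return f + rng.normal(0.0, sigma, size=f.shape)
```

Applying this to any image array yields a degraded image whose per-pixel noise follows the histogram shown on the slide.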
Degradation Process and Prior
The degradation process and the prior probability density function determine the posterior probability density function; the required expectations follow from the multi-dimensional Gaussian integral formula.
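Under the additive white Gaussian noise model and a nearest-neighbour Gaussian prior, the densities on this slide take the following form (a reconstruction with assumed symbols \alpha and \sigma, since the original equations were images); the last line is the multi-dimensional Gaussian integral formula mentioned on the slide:

```latex
p(g \mid f) = \prod_i \frac{1}{\sqrt{2\pi\sigma^2}}
  \exp\Bigl(-\frac{(g_i - f_i)^2}{2\sigma^2}\Bigr),
\qquad
p(f \mid g) \propto \exp\Bigl(-\frac{\alpha}{2}\sum_{(i,j)\in B}(f_i-f_j)^2
  - \frac{1}{2\sigma^2}\sum_i (g_i-f_i)^2\Bigr),
\qquad
\int \exp\Bigl(-\tfrac{1}{2}\mathbf{f}^{\mathsf T} A \mathbf{f}
  + \mathbf{b}^{\mathsf T}\mathbf{f}\Bigr)\, d\mathbf{f}
  = \sqrt{\frac{(2\pi)^N}{\det A}}
    \exp\Bigl(\tfrac{1}{2}\mathbf{b}^{\mathsf T} A^{-1} \mathbf{b}\Bigr)
```

Because both the prior and the likelihood are Gaussian in f, the posterior is a multi-dimensional Gaussian, and its moments are available in closed form via the integral formula.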
Maximization of Marginal Likelihood by the EM Algorithm
The marginal likelihood is maximized by iterating the E-step and the M-step (through the Q-function) until convergence.
A. P. Dempster, N. M. Laird and D. B. Rubin, "Maximum likelihood from incomplete data via the EM algorithm," J. Roy. Stat. Soc. B, 39, 1-38 (1977).
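The EM iteration for this Gaussian model can be sketched with exact E-step moments on a small 1-D chain; this is a minimal illustration under assumed conventions (prior exponent \alpha/2 f^T L f with graph Laplacian L, noise variance sigma2, a small ridge to make the prior proper, and the closed-form M-step updates alpha = n / E[f^T L f] and sigma2 = E[||g - f||^2]/n), not the talk's exact formulas:

```python
import numpy as np

def chain_laplacian(n):
    # Graph Laplacian of a 1-D chain: f^T L f = sum_i (f_{i+1} - f_i)^2
    L = np.zeros((n, n))
    for i in range(n - 1):
        L[i, i] += 1; L[i + 1, i + 1] += 1
        L[i, i + 1] -= 1; L[i + 1, i] -= 1
    return L

def em_gaussian(g, n_iter=50, alpha=1.0, sigma2=1.0, ridge=1e-6):
    """EM updates for hyperparameters (alpha, sigma2) of the Gaussian model
    p(f) ~ exp(-alpha/2 f^T L f), g = f + N(0, sigma2 I)."""
    n = len(g)
    L = chain_laplacian(n) + ridge * np.eye(n)   # ridge makes the prior proper
    for _ in range(n_iter):
        # E-step: the posterior of f given g is Gaussian with these moments
        prec = alpha * L + np.eye(n) / sigma2    # posterior precision matrix
        cov = np.linalg.inv(prec)
        mean = cov @ (g / sigma2)
        # M-step: closed-form updates from the posterior moments
        e_fLf = mean @ L @ mean + np.trace(cov @ L)      # E[f^T L f | g]
        e_res = np.sum((g - mean) ** 2) + np.trace(cov)  # E[||g - f||^2 | g]
        alpha = n / e_fLf
        sigma2 = e_res / n
    return alpha, sigma2
```

Each iteration computes exact posterior expectations (E-step) and then re-solves the extremum conditions of the Q-function for the hyperparameters (M-step).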
Maximization of Marginal Likelihood by the EM (Expectation Maximization) Algorithm
The update rules follow from the extremum conditions of Q(α, σ | α(t), σ(t), g) with respect to α and σ.
Maximization of Marginal Likelihood by the EM (Expectation Maximization) Algorithm
[The hyperparameter update equations; not reproduced in the transcript.]
Statistical Behaviour of the EM (Expectation Maximization) Algorithm
[Figure: numerical experiments for a standard image showing the statistical behaviour of the EM algorithm.]
Maximization of Marginal Likelihood by the EM (Expectation Maximization) Algorithm
The posterior expectations required in the E-step are evaluated by the Markov chain Monte Carlo method.
Markov Chain Monte Carlo Method
Basic step: the current configuration f(t) is updated to f(t+1) by resampling a single pixel f_i according to the transition probability w_i(f(t+1) | f(t)).
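A single-site update of this kind can be sketched as a sequential Gibbs sweep for the Gaussian model; the chain topology, symbols alpha and sigma2, and the function name are assumptions of this illustration:

```python
import numpy as np

def gibbs_sweep(f, g, alpha, sigma2, rng):
    """One sequential (non-synchronized) Gibbs sweep for the Gaussian model
    p(f | g) ~ exp(-alpha/2 sum_{(i,j)} (f_i - f_j)^2
                   - 1/(2 sigma2) sum_i (g_i - f_i)^2)
    on a 1-D chain. Each pixel is resampled from its exact Gaussian
    conditional given its neighbours and the observed pixel g_i."""
    n = len(f)
    for i in range(n):
        nbrs = [j for j in (i - 1, i + 1) if 0 <= j < n]
        prec = alpha * len(nbrs) + 1.0 / sigma2    # conditional precision
        mean = (alpha * sum(f[j] for j in nbrs) + g[i] / sigma2) / prec
        f[i] = rng.normal(mean, 1.0 / np.sqrt(prec))
    return f
```

Averaging the samples produced by repeated sweeps approximates the posterior mean, which for this Gaussian model can be checked against the exact linear-system solution.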
Markov Chain Monte Carlo Method
The marginal probability of each pixel f_i can be estimated from a histogram of its MCMC samples.
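Estimating a pixel's marginal from samples is a one-liner with a normalized histogram; here the samples are synthetic stand-ins for MCMC output, and the function name is hypothetical:

```python
import numpy as np

def marginal_from_samples(samples, bins=30):
    """Estimate the marginal density of one pixel from its MCMC samples
    via a normalized histogram, together with its first two moments."""
    density, edges = np.histogram(samples, bins=bins, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    return centers, density, samples.mean(), samples.var()
```

The returned bin centers and densities approximate the marginal probability density, and the sample mean and variance are the moments the EM updates need.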
Markov Chain Monte Carlo Method
The EM algorithm is combined with the MCMC method: the posterior expectations in the EM update rules are replaced by MCMC sample averages.
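The combination can be sketched as MCMC-within-EM: each E-step runs Gibbs sweeps and accumulates sample averages in place of the exact expectations. The 1-D chain model, the symbols alpha and sigma2, and the particular M-step formulas are assumptions of this illustration, not the talk's exact algorithm:

```python
import numpy as np

def mcmc_em(g, n_em=10, n_sweeps=200, burn=50, alpha=1.0, sigma2=1.0, seed=0):
    """EM with an MCMC E-step on a 1-D chain: the expectations
    E[f^T L f | g] and E[||g - f||^2 | g] are replaced by Gibbs-sample
    averages; the M-step uses alpha = n / E[f^T L f], sigma2 = E[||g-f||^2]/n."""
    rng = np.random.default_rng(seed)
    n = len(g)
    f = g.copy()
    for _ in range(n_em):
        e_fLf, e_res, count = 0.0, 0.0, 0
        for sweep in range(n_sweeps):
            for i in range(n):                      # sequential Gibbs sweep
                nbrs = [j for j in (i - 1, i + 1) if 0 <= j < n]
                prec = alpha * len(nbrs) + 1.0 / sigma2
                mean = (alpha * sum(f[j] for j in nbrs) + g[i] / sigma2) / prec
                f[i] = rng.normal(mean, 1.0 / np.sqrt(prec))
            if sweep >= burn:                       # discard burn-in sweeps
                e_fLf += np.sum(np.diff(f) ** 2)    # f^T L f on a chain
                e_res += np.sum((g - f) ** 2)
                count += 1
        alpha = n / (e_fLf / count + 1e-12)
        sigma2 = e_res / count / n
    return alpha, sigma2
```

With more sweeps per E-step the sample averages approach the exact expectations, and the hyperparameter trajectory approaches that of the exact EM algorithm.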
Markov Chain Monte Carlo Method (Non-Synchronized Update)
[Figure: numerical experiments for a standard image, averaged over 20 samples; input and output images comparing the exact EM algorithm with the MCMC-based EM algorithm for two MCMC settings (parameter values 50 and 1).]
Summary
We constructed EM algorithms by means of the Markov chain Monte Carlo method and compared them with exact calculations.
[Figure: input and output images for the exact EM algorithm and for the MCMC-based EM algorithm.]
New Project 1
Can we derive the trajectory of the EM algorithm by solving the master equations for any step t in the case of …?
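The master equation referred to here can be sketched in discrete time, with a global transition kernel assembled from the single-site transition probabilities w_i of the earlier slide (the kernel symbol W is an assumption of this reconstruction):

```latex
P_{t+1}(\mathbf{f}) = \sum_{\mathbf{f}'} W(\mathbf{f} \mid \mathbf{f}')\, P_t(\mathbf{f}'),
\qquad
W(\mathbf{f} \mid \mathbf{f}') \ \text{built from the single-site}\ w_i(\mathbf{f} \mid \mathbf{f}')
```

Solving this for P_t would give the distribution of the MCMC state at every step t, and hence the statistical trajectory of the MCMC-based EM algorithm.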
New Project 1
From the solution of the master equation we calculate the transition probability and the expectations that appear in the EM update rules.
New Project 2
Can we replace the calculation of statistical quantities in the prior probability by the MCMC method?
New Project 3
Our previous work on the EM algorithm and loopy belief propagation:
K. Tanaka, H. Shouno, M. Okada and D. M. Titterington, J. Phys. A (2004): hyperparameter estimation using belief propagation (BP) for the Gaussian graphical model in image processing.
K. Tanaka and D. M. Titterington, J. Phys. A (2007): statistical trajectory of the approximate EM algorithm for probabilistic image processing.
New Project 3
[Figure: restored images obtained by loopy belief propagation and by the exact calculation, with MSE values 327 and 315.]
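Loopy belief propagation for a Gaussian graphical model can be sketched in information form; this is standard Gaussian BP on the model p(f) ~ exp(-1/2 f^T J f + h^T f), a generic illustration rather than the talk's exact scheme (it is exact on tree graphs and approximate on loopy ones):

```python
import numpy as np

def gabp(J, h, n_iter=100):
    """Loopy Gaussian belief propagation for p(f) ~ exp(-1/2 f^T J f + h^T f).
    Messages are kept in information form: P[i, j] is the precision and
    M[i, j] the precision-weighted mean of the message from node i to j.
    Returns approximate posterior means and variances."""
    n = len(h)
    P = np.zeros((n, n))
    M = np.zeros((n, n))
    edges = [(i, j) for i in range(n) for j in range(n)
             if i != j and J[i, j] != 0]
    for _ in range(n_iter):
        for i, j in edges:
            # cavity precision and mean at i, excluding the message from j
            p_ij = J[i, i] + P[:, i].sum() - P[j, i]
            m_ij = h[i] + M[:, i].sum() - M[j, i]
            P[i, j] = -J[i, j] ** 2 / p_ij
            M[i, j] = -J[i, j] * m_ij / p_ij
    prec = np.diag(J).copy() + P.sum(axis=0)    # belief precisions
    mean = (h + M.sum(axis=0)) / prec           # belief means
    return mean, 1.0 / prec
```

On a chain (a tree) the beliefs coincide with the exact posterior means, which gives a direct correctness check.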
New Project 3
[Figure: statistical behaviour of the EM algorithm in numerical experiments for a standard image, comparing loopy BP with the exact calculation.]
New Project 3
Can we update both the messages and the hyperparameters in the same step? Can we calculate the statistical trajectory of such a more practical algorithm?