6.8 Maximizer of the Posterior Marginals
6.9 Iterated Conditional Modes of the Posterior Distribution
Adaptive Cooperative Systems (Martin Beckerman), summarized by Jang, HaYoung

Optimal Bayesian Estimation
Posterior marginal distribution: the marginal p_i(q | g) at site i is obtained by summing the full posterior distribution p(f | g) over all configurations whose value at site i equals q.
Energy function: the quality of an estimate f* is scored by a cost (energy) function, and the optimal Bayesian estimator minimizes the posterior expectation of that energy.

Optimal Bayesian Estimation
The optimal Bayesian estimator f_a* with respect to the energy U_a minimizes the expected posterior cost. Site by site, the optimal Bayesian estimator f_a* assigns at site i the value q that satisfies p_i(q | g) ≥ p_i(s | g) for all s ≠ q over the finite set of possible values Q_i.
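A hedged reconstruction of the equations this slide refers to, written in the standard maximizer-of-posterior-marginals form. The notation p_i(q | g) and Q_i follows the slide; taking U_a to be the misclassification count is an assumption consistent with the error definition on the next slide.

```latex
U_a(f, f^*) = \sum_i \bigl[\, 1 - \delta(f_i, f_i^*) \,\bigr]
\qquad \text{(number of misclassified sites)}

\langle U_a \rangle = \sum_f U_a(f, f^*)\, p(f \mid g)
                    = \sum_i \bigl[\, 1 - p_i(f_i^* \mid g) \,\bigr]

f_{a,i}^* = \arg\max_{q \in Q_i} p_i(q \mid g),
\quad \text{i.e.} \quad p_i(q \mid g) \ge p_i(s \mid g)
\ \ \forall\, s \ne q,\ s \in Q_i
```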

Segmentation and Reconstruction
For segmentation, the error is the number of elements that are classified incorrectly. The optimal segmentation estimate is therefore the maximizer of the posterior marginals: at each site choose the label q that maximizes p_i(q | g).

Segmentation and Reconstruction
For reconstruction (restoration), smoothness is enforced through a quadratic-form energy, so the optimal estimate at each site is the posterior mean. The optimal restoration estimate is this mean rounded to the nearest allowed pixel value (the thresholded posterior mean).
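As a small illustration of the difference between the two estimates, the sketch below computes both from a set of Monte Carlo configurations. The names samples (an array of shape (n_samples, H, W)) and levels (the finite set of allowed pixel values, assumed identical at every site) are assumptions for this example, not taken from the slides.

```python
import numpy as np

def mpm_and_restoration(samples, levels):
    """samples: (n_samples, H, W) Monte Carlo configurations drawn from the posterior.
    levels: 1-D array of allowed pixel values Q_i (assumed the same at every site)."""
    levels = np.asarray(levels)

    # Empirical posterior marginals: fraction of samples taking each level at each site.
    marginals = np.stack([(samples == q).mean(axis=0) for q in levels])  # (n_levels, H, W)

    # Segmentation estimate: maximizer of the posterior marginals at each site.
    segmentation = levels[np.argmax(marginals, axis=0)]

    # Restoration estimate: posterior mean, thresholded to the nearest allowed value.
    posterior_mean = samples.mean(axis=0)
    nearest = np.argmin(np.abs(posterior_mean[None, ...] - levels[:, None, None]), axis=0)
    restoration = levels[nearest]
    return segmentation, restoration
```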

The MPM Algorithm
The posterior marginals are estimated by counting the number of times each label occurs at a site over the sampled configurations; this sample average serves as the (mean-field) estimate of p_i(q | g). Metropolis-based MPM algorithm (see the sketch after this list):
1. Initialize the lattice system by choosing the pixel value at each site that maximizes the likelihood.
2. For each site in the lattice, choose a pixel value at random, subject to the symmetry requirement on the proposal.
3. Calculate the energy change ΔU.
4. If ΔU < 0, accept the move.
5. Otherwise accept the move with probability exp{-ΔU}.
6. Repeat steps 2 through 5 for n sweeps, saving configurations f^(k+1) through f^(n) (the first k sweeps are discarded as burn-in).
7. Form the estimates p_i(q | g) from the saved configurations (the empirical posterior marginals).
8. For each site in the lattice, choose the pixel value q satisfying p_i(q | g) ≥ p_i(s | g) for all pixel values s.
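A minimal sketch of the Metropolis-based MPM procedure above, assuming a Gaussian likelihood and a Potts-style smoothness prior with local energy (g_i - f_i)^2 / (2σ^2) - α Σ_{j∈∂i} δ(f_i, f_j). The function names and the parameters sigma, alpha, n_sweeps, and burn_in are illustrative assumptions, not taken from the slides.

```python
import numpy as np

def local_energy(f, g, i, j, value, sigma, alpha):
    """Energy contribution of site (i, j) if it takes `value`:
    Gaussian data term plus Potts smoothness term over the 4-neighbourhood."""
    data = (g[i, j] - value) ** 2 / (2.0 * sigma ** 2)
    h, w = f.shape
    neighbours = [(i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)]
    smooth = -alpha * sum(f[a, b] == value
                          for a, b in neighbours if 0 <= a < h and 0 <= b < w)
    return data + smooth

def mpm(g, levels, sigma=1.0, alpha=1.0, n_sweeps=200, burn_in=50, rng=None):
    rng = np.random.default_rng(rng)
    levels = np.asarray(levels)
    # Step 1: initialize with the maximum-likelihood label (nearest level to the data).
    f = levels[np.argmin(np.abs(g[..., None] - levels), axis=-1)]
    h, w = f.shape
    counts = np.zeros((len(levels),) + f.shape)   # label counts per site
    for sweep in range(n_sweeps):
        for i in range(h):
            for j in range(w):
                # Step 2: propose a new value uniformly (symmetric proposal).
                new = rng.choice(levels)
                # Step 3: energy change if the site switched to the proposed value.
                dU = (local_energy(f, g, i, j, new, sigma, alpha)
                      - local_energy(f, g, i, j, f[i, j], sigma, alpha))
                # Steps 4-5: Metropolis acceptance rule.
                if dU < 0 or rng.random() < np.exp(-dU):
                    f[i, j] = new
        # Step 6: after burn-in, save configurations by accumulating label counts.
        if sweep >= burn_in:
            counts += (f[None, ...] == levels[:, None, None])
    # Steps 7-8: empirical marginals and their per-site maximizer.
    return levels[np.argmax(counts, axis=0)]
```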

Quenching and ICM
Iterated conditional modes of the posterior distribution: choose pixel values that maximize the posterior distribution, rather than selecting pixel values with the Gibbs sampler or the Metropolis algorithm. The update at a site depends on the prior distribution over the pixel values in the neighborhood of the site of interest: the prior probabilities p(f_i) are replaced with the conditional probabilities p(f_i | f_∂i).

Quenching and ICM
At each visit to site i, ICM replaces the current value with the mode of the local conditional posterior, f_i ← argmax_q p(q | g, f_{S∖i}) ∝ p(g_i | q) p(q | f_∂i). This greedy, zero-temperature (quenched) update converges to a local maximum of the posterior distribution.

The ICM Algorithm
Mode of the posterior distribution: for Gaussian observations the local update balances the data term against the smoothness prior through the constant κ = 2ασ². ICM algorithm (a sketch follows the list):
1. Initialize the lattice by choosing the value at each lattice site that maximizes the likelihood.
2. For each site in the lattice, update the pixel value by selecting the value that maximizes the local posterior probability: find the minimum-energy value according to the mode of the posterior distribution, use an equivalent expression appropriate for a different potential, or proceed in some other manner.
3. Iterate by repeating step 2 until finished (no pixel values change, or a fixed number of passes is reached).
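A minimal sketch of ICM under the same assumed Gaussian-plus-Potts model used in the MPM sketch above; the local update minimizes (g_i - q)² - κ Σ_{j∈∂i} δ(q, f_j) with κ = 2ασ², matching the constant on this slide. Names and default parameter values are illustrative assumptions.

```python
import numpy as np

def icm(g, levels, sigma=1.0, alpha=1.0, max_iters=20):
    """Iterated conditional modes for a Gaussian likelihood and Potts smoothness prior."""
    levels = np.asarray(levels)
    kappa = 2.0 * alpha * sigma ** 2          # kappa = 2*alpha*sigma^2, as on the slide
    # Step 1: maximum-likelihood initialization (nearest allowed value to the data).
    f = levels[np.argmin(np.abs(g[..., None] - levels), axis=-1)]
    h, w = f.shape
    for _ in range(max_iters):
        changed = False
        for i in range(h):
            for j in range(w):
                neighbours = [(a, b) for a, b in
                              ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                              if 0 <= a < h and 0 <= b < w]
                # Step 2: choose the value minimizing the local posterior energy,
                # (g_ij - q)^2 - kappa * (number of neighbours already equal to q).
                energies = [(g[i, j] - q) ** 2
                            - kappa * sum(f[a, b] == q for a, b in neighbours)
                            for q in levels]
                best = levels[int(np.argmin(energies))]
                if best != f[i, j]:
                    f[i, j] = best
                    changed = True
        # Step 3: iterate until no site changes (or max_iters passes).
        if not changed:
            break
    return f
```

Both sketches would be called in the same way, e.g. labels = icm(noisy_image, levels=np.arange(4)); the MPM version usually needs many more sweeps because it averages over samples, whereas ICM descends greedily and typically converges in a few passes.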