Ch 6. Markov Random Fields, 6.6 ~ 6.7
Adaptive Cooperative Systems, Martin Beckerman, 1997
Summarized by M.-O. Heo
Biointelligence Laboratory, Seoul National University
http://bi.snu.ac.kr/

Contents
6.6 Simultaneous Autoregressive Models
  6.6.1 Simultaneous Autoregressive Random Fields
  6.6.2 Image Reconstruction
  6.6.3 Fourier Computation and Block Circulant Matrices
6.7 The Method of Geman and Geman
  6.7.1 Maximum A Posteriori (MAP) Estimation
  6.7.2 Clique Potentials
  6.7.3 The Posterior Distribution
  6.7.4 The Gibbs Sampler

Simultaneous Autoregressive Random Fields
Simultaneous autoregressive (SAR) models extend one-dimensional AR representations to random fields with finite support on a lattice.
A SAR random field is defined through its noise random variables:
- e(i,j): zero-mean, independent random variables
- theta(k,l): real-valued coefficients over the lattice
- N(i,j): the support, i.e., the set of acquaintances (neighbors) of site (i,j)
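The defining equation was an image on the original slide and did not survive the transcript; a standard form consistent with the terms above (a sketch, not necessarily Beckerman's exact notation) is

    x(i,j) = \sum_{(k,l) \in N(i,j)} \theta(k,l)\, x(i+k,\, j+l) + e(i,j),
    \qquad E[e(i,j)] = 0,

with the e(i,j) independent across sites and the sum ranging over the finite support N(i,j).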

Simultaneous Autoregressive Random Fields
Is a SAR random field an MRF? Writing out the joint probability distribution of the field and reading off the conditional distribution at each site reveals the Markov structure: the conditional at a site depends only on a finite set of neighbors.
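As a sketch of the argument, assuming Gaussian noise: stacking the SAR equations as (I - \Theta)x = e with e ~ N(0, \sigma^2 I) gives the joint density

    p(x) \propto \exp\!\left( -\frac{1}{2\sigma^2}\, \|(I - \Theta)\, x\|^2 \right),

a Gaussian whose precision matrix (I - \Theta)^T (I - \Theta)/\sigma^2 is sparse, so the conditional at each site depends only on a finite neighborhood. Note that this Markov neighborhood is generally wider than the SAR support itself.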

Simultaneous Autoregressive Random Fields
Toroidal lattice SAR field: the field is defined by a pair of equations of the same form, one for the interior region and one for the boundary, where the boundary equation wraps the lattice around into a torus.
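On an M x N torus the interior and boundary cases can be collapsed into a single equation with indices taken modulo the lattice dimensions (a sketch of the wrap-around convention, not the book's two-equation form):

    x(i,j) = \sum_{(k,l) \in N} \theta(k,l)\, x\big((i+k) \bmod M,\ (j+l) \bmod N\big) + e(i,j)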

Image Reconstruction
Problem formulation and notation:
- f(x, y): ideal image
- g(x, y): observed image
- D{f(x, y)}: degradation operator with a linear, space-invariant point-spread function
- n(x, y): noise process, independent of the imaging process
The degradation process is written first as a one-dimensional process and then extended to the two-dimensional case.
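In symbols, the standard degradation model these notations describe is

    g(x,y) = D\{f(x,y)\} + n(x,y),

or, in discrete matrix-vector form with the linear, space-invariant point-spread function collected into a matrix H (block circulant under periodic boundary conditions), g = Hf + n.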

Image Reconstruction Methods: 1. Constrained Least-Squares Estimation (LSE)
Minimize an objective function measuring the roughness of the estimate, subject to the constraint that the residual energy equals the noise energy. The smoothing operator Q must be selected appropriately; this choice involves the correlation matrix R_f of the image f and the correlation matrix R_n of the noise n.
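The missing formulas are standard for constrained least squares (a sketch in the usual matrix notation; gamma is a Lagrange multiplier):

    \min_{\hat f} \|Q \hat f\|^2 \quad \text{subject to} \quad \|g - H \hat f\|^2 = \|n\|^2,
    \qquad \hat f = \big(H^T H + \gamma\, Q^T Q\big)^{-1} H^T g,

where gamma is adjusted until the constraint is satisfied. Choosing Q^T Q = R_f^{-1} R_n recovers a Wiener-type filter, which is how the two correlation matrices enter.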

Image Reconstruction Methods: 2. Linear Minimum Mean-Square Error (MMSE) Estimation
Define the error e between the ideal image and its estimate, and minimize the positive quantity E[||e||^2].
Assumptions: f is transformed into g under a linear operator L, and the noise is signal-independent.
Further assumptions: the image f is stationary, R_f is approximated by the block circulant covariance matrix Q_f, and the noise is a white process with common variance.
Under these assumptions the solution can be computed efficiently; SAR and GRF image models can be used to supply the image covariance.
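The linear MMSE (Wiener) solution these steps lead to, in a sketch using the correlation matrices named above and assuming zero-mean image and noise:

    e = f - \hat f, \qquad \min_{\hat f}\, E[\|e\|^2],
    \qquad \hat f = R_f L^T \big(L R_f L^T + R_n\big)^{-1} g,

with R_n = \sigma^2 I under the white-noise assumption; approximating R_f by the block circulant Q_f lets the matrix inverse be computed in the Fourier domain.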

Fourier Computation and Block Circulant Matrices
The degradation model can be diagonalized with the discrete Fourier transform: a (block) circulant matrix R is diagonalized by the DFT matrix, and the diagonal elements T_kk of the resulting matrix are the eigenvalues of R.
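The diagonalization the slide refers to has the standard form (a sketch; W is the DFT matrix):

    R = W\, T\, W^{-1}, \qquad T_{kk} = \lambda_k(R),

so multiplying by W^{-1} (taking the DFT) turns products with R into element-wise products with its eigenvalues, which is what makes block circulant computations fast.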

Applying the one-dimensional convolution form of the degradation model and taking the discrete Fourier transform (multiplying by W^{-1}) yields the discrete convolution theorem: convolution of a pair of arrays in the spatial domain corresponds to element-by-element multiplication in the frequency domain, and vice versa. The same construction carries over to the two-dimensional image model.
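A minimal numpy check of the discrete convolution theorem in the one-dimensional periodic case (the signal, kernel, and sizes are illustrative, not from the book):

    import numpy as np

    # Circular (periodic) convolution of f with kernel h, directly in the spatial domain.
    def circular_convolve(f, h):
        n = len(f)
        return np.array([sum(h[k] * f[(i - k) % n] for k in range(n)) for i in range(n)])

    rng = np.random.default_rng(0)
    f = rng.standard_normal(8)                 # signal (e.g., one image row)
    h = np.zeros(8)
    h[:3] = [0.5, 0.3, 0.2]                    # point-spread function, zero-padded

    # Convolution theorem: the DFT turns circular convolution into element-wise product.
    g_spatial = circular_convolve(f, h)
    g_fourier = np.fft.ifft(np.fft.fft(f) * np.fft.fft(h)).real

    assert np.allclose(g_spatial, g_fourier)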

The Method of Geman and Geman
MAP estimation for image reconstruction rests on three main points:
- Use of the Bayesian formalism to incorporate constraints
- The Gibbs-Markov equivalence
- A second, dual lattice system (a line process) carrying discontinuity-preserving information

Maximum A Posteriori (MAP) Estimation
The reconstruction maximizes the posterior distribution of the image given the observed data.
Degradation process: additive noise described by a multivariate Gaussian, assumed uncorrelated and stationary with zero mean.
Likelihood: under the assumption of no blurring (the point-spread function accounts for other effects), the likelihood factorizes over sites.
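In symbols, with additive Gaussian noise g = f + n, n ~ N(0, \sigma^2 I), the quantities just listed are

    P(f \mid g) \propto P(g \mid f)\, P(f),
    \qquad P(g \mid f) \propto \exp\!\left( -\frac{1}{2\sigma^2} \sum_s (g_s - f_s)^2 \right)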

Model the Image as an MRF
The prior distribution is a Gibbs distribution whose energy is composed of a sum of local contributions:
- Partition function Z (the normalizing constant)
- Site potentials
- A strength parameter playing the role of an inverse temperature
- Clique potentials, defined over the cliques associated with the neighborhood system of each site
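Collecting these ingredients, the prior takes the standard Gibbs form (beta plays the role of an inverse temperature):

    P(f) = \frac{1}{Z} \exp\big(-\beta\, U(f)\big),
    \qquad U(f) = \sum_{C \in \mathcal{C}} V_C(f),
    \qquad Z = \sum_{f} \exp\big(-\beta\, U(f)\big)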

Clique Potentials
The general property of lattice systems is piecewise smoothness: neighboring pixels tend to have similar grey values.
Example: a four-level model for texture realization, built from four types of clique elements (a)-(d) [the figure of clique types was not preserved in the transcript].
One family of clique potentials encourages all elements within a given clique to take the same grey level; another controls the percentage of pixels of each region, which is useful for textured image regions.
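A plausible instance of such a potential is a Potts-like form (an assumption for illustration; the book's four-level model may differ in detail):

    V_C(f) = \begin{cases} -\zeta & \text{if all sites in } C \text{ share the same grey level} \\ +\zeta & \text{otherwise} \end{cases}

together with single-site potentials that bias the relative frequencies of the four levels.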

The Posterior Distribution
Combining the likelihood and the prior yields an energy for the posterior; the posterior is itself a Gibbs distribution with this energy as its potential.
If quadratic difference potentials are used as clique potentials, the prior term favors maximally smooth restorations, penalizing large pairwise contrasts in pixel values within a local neighborhood, while the likelihood term encourages restorations that are not too different from the data.
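With quadratic difference potentials and the Gaussian likelihood above, the posterior energy is (a sketch under those assumptions)

    U(f \mid g) = \beta \sum_{\langle s,t \rangle} (f_s - f_t)^2 + \frac{1}{2\sigma^2} \sum_s (g_s - f_s)^2,

where the first sum ranges over neighboring pairs: the first term penalizes large pairwise contrasts, the second keeps the restoration close to the data.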

The Gibbs Sampler
Desired low-energy states are found with simulated annealing (SA) driven by the Gibbs sampler, a method for sampling from the Gibbs distribution.
An annealing temperature T appears in the exponential of the posterior Gibbs distribution; it controls the relative hardness of the constraints, and lowering it gradually concentrates the samples on low-energy configurations.
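A compact sketch of the annealed Gibbs sampler for the quadratic-potential posterior above (the grey levels, beta, sigma, cooling schedule, and test image are illustrative assumptions, not values from the book):

    import numpy as np

    def gibbs_restore(g, levels, beta=1.0, sigma=1.0, T0=4.0, sweeps=50, seed=0):
        """Annealed Gibbs sampling for restoration with quadratic clique potentials.

        Energy: U(f|g) = beta * sum_<s,t> (f_s - f_t)^2 + sum_s (g_s - f_s)^2 / (2 sigma^2)
        """
        rng = np.random.default_rng(seed)
        f = g.copy()
        H, W = g.shape
        for sweep in range(sweeps):
            T = T0 / np.log(2.0 + sweep)        # logarithmic cooling schedule
            for i in range(H):
                for j in range(W):
                    nbrs = [f[i2, j2]
                            for i2, j2 in ((i-1, j), (i+1, j), (i, j-1), (i, j+1))
                            if 0 <= i2 < H and 0 <= j2 < W]
                    # Local (conditional) energy for each candidate grey level
                    U = np.array([beta * sum((x - n) ** 2 for n in nbrs)
                                  + (g[i, j] - x) ** 2 / (2 * sigma ** 2)
                                  for x in levels])
                    p = np.exp(-(U - U.min()) / T)   # subtract min for stability
                    p /= p.sum()
                    f[i, j] = levels[rng.choice(len(levels), p=p)]
        return f

    # Hypothetical usage: restore a noisy two-level image
    levels = np.array([0.0, 1.0])
    clean = np.zeros((16, 16))
    clean[:, 8:] = 1.0
    noisy = clean + np.random.default_rng(1).normal(0, 0.4, clean.shape)
    restored = gibbs_restore(noisy, levels, beta=1.0, sigma=0.4)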

The Gibbs Sampler: The General Form
Each site is visited in turn, and its value is replaced by a sample drawn from the conditional distribution of that site given the current values of all other sites, which by the Markov property reduces to its neighbors.
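The general form shown on the original slide is, in standard notation (a sketch; U_s collects the energy terms that involve site s):

    P\big(X_s = x \mid X_r = x_r,\ r \neq s\big)
    = \frac{\exp\big(-U_s(x \mid x_{N(s)}) / T\big)}
           {\sum_{x'} \exp\big(-U_s(x' \mid x_{N(s)}) / T\big)},

which depends only on the neighbors N(s) of the site being updated.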