Binarization of Low Quality Text Using a Markov Random Field Model


Christian Wolf and David Doermann

Most existing binarization techniques have been conceived for high-resolution, good quality document images. Binarization of low quality, low resolution and lossily compressed multimedia documents is a non-trivial problem. We present a method that uses prior information about the spatial configuration of the binary pixels. Binarization is performed as a Bayesian estimation problem in a MAP framework using a Markov random field (MRF) model.

Markov random field models

The MRF models the prior information on the spatial configuration of the binary pixels in the image. Energy potentials are assigned to cliques, i.e. possible black and white labellings of pixel neighborhoods, where a high energy means a low probability for a clique according to the model. The joint probability distribution of the sites (pixels) of the MRF is a Gibbs distribution containing the sum of the clique potentials over all cliques:

P(z) = (1/Z) exp( -(1/T) * sum_{c in C} Vc(z) )

C ... set of cliques
Vc ... clique potential
z ... estimated image
T ... temperature (for the simulated annealing)

Optimization is done by simulated annealing.

The prior distribution

The MRF is defined on a large neighborhood (4x4 pixel cliques). The clique potentials are learned from training data by converting the estimated absolute probabilities of the clique labellings into potentials (B denotes the clique size). To compensate for the large imbalance between text and background pixels, each potential is normalized by dividing it by the probability Pi of the clique labelling being drawn from a stationary but biased source, which generates white and black pixels with fixed probabilities, respectively, estimated from the frequencies of white and black pixels in the training set.

w ... number of white pixels in the clique
b ... number of black pixels in the clique

The observation model (likelihood)

Most MRF-based estimation methods use simple observation models, e.g. Gaussian noise with zero mean. If the prior is uniform, this corresponds to fixed thresholding with a threshold of 127.5.

z ... estimated gray value
f ... observed gray value

We use standard binarization methods (Niblack and derived techniques) to "model" the likelihood. With a uniform prior, the same result as with the classic techniques is obtained. The desired effect: improving the performance of the classic methods with prior knowledge of the spatial configuration of the image. The noise variance is estimated by maximizing the between-class variance of the text and background pixels using Otsu's method.

(Figure: the clique labellings of a repaired pixel before and after flipping it; all 16 cliques favor the change of the pixel.)

Experimental results

Document images from the Pink Panther database and from the University of Washington database were downsampled by a factor of 2, coded in JPEG at 75% quality, binarized, and passed to the commercial OCR program FineReader.

(Figure: example results comparing Sauvola et al. and the MRF method.)

Christian Wolf: wolf@rfv.insa-lyon.fr, http://rfv.insa-lyon.fr/~wolf
David Doermann: doermann@umiacs.umd.edu, http://lamp.cfar.umd.edu/~doermann
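The MAP estimation by simulated annealing described above can be sketched as follows. This is a minimal illustration, not the poster's implementation: it assumes a simple pairwise (Ising-style) smoothness prior in place of the learned 4x4 clique potentials and a zero-mean Gaussian observation model; `sigma`, `beta`, the cooling schedule and all parameter values are illustrative assumptions.

```python
import numpy as np

def gibbs_energy(z, f, sigma=20.0, beta=1.0):
    """Sum of clique potentials plus a Gaussian data term.
    z: binary estimate in {0, 255}; f: observed gray-level image."""
    data = np.sum((f - z) ** 2) / (2.0 * sigma ** 2)
    # Pairwise "clique potentials": each disagreeing neighbor pair costs beta,
    # standing in for the learned 4x4 clique potentials of the poster.
    prior = beta * (np.sum(z[:, 1:] != z[:, :-1]) + np.sum(z[1:, :] != z[:-1, :]))
    return data + prior

def anneal_binarize(f, steps=10, t0=10.0, cooling=0.8, seed=0):
    """MAP estimation by simulated annealing over single-pixel flips."""
    rng = np.random.default_rng(seed)
    z = np.where(f > 127.5, 255, 0)              # start from a fixed threshold
    t = t0
    for _ in range(steps):
        for _ in range(f.size):
            i = int(rng.integers(f.shape[0]))
            j = int(rng.integers(f.shape[1]))
            e_old = gibbs_energy(z, f)
            z[i, j] = 255 - z[i, j]              # propose flipping one pixel
            e_new = gibbs_energy(z, f)
            # Metropolis rule: always keep improvements; keep worse states
            # with probability exp(-(e_new - e_old) / t), else undo the flip.
            if e_new > e_old and rng.random() >= np.exp((e_old - e_new) / t):
                z[i, j] = 255 - z[i, j]
        t *= cooling                             # lower the temperature
    return z
```

Recomputing the full energy for every proposal is O(N) per flip; a practical implementation would evaluate only the local energy change around the flipped pixel.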
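The Niblack thresholding used to "model" the likelihood can be sketched as follows; this is a plain implementation of the classic formula T = m + k * s over a local window, with the window size and k chosen for illustration rather than taken from the poster.

```python
import numpy as np

def niblack_binarize(img, w=15, k=-0.2):
    """Classic Niblack thresholding: T(x, y) = m(x, y) + k * s(x, y),
    where m and s are the mean and std. deviation of a w-by-w window."""
    img = img.astype(np.float64)
    pad = w // 2
    p = np.pad(img, pad, mode="reflect")         # mirror borders for the window
    out = np.zeros(img.shape, dtype=np.uint8)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            win = p[i:i + w, j:j + w]            # local window centered on (i, j)
            t = win.mean() + k * win.std()       # Niblack's local threshold
            out[i, j] = 255 if img[i, j] > t else 0
    return out
```

Plain Niblack is known to produce noise in uniform background regions (where s is near 0); Sauvola et al.'s variant, the baseline in the poster's comparison, instead uses T = m * (1 + k * (s / R - 1)) with a dynamic-range constant R (e.g. 128) to suppress this.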