Fields of Experts: A Framework for Learning Image Priors
2006. 7. 10 (Mon), Young Ki Baik, Computer Vision Lab.

2 Fields of Experts References
On the Spatial Statistics of Optical Flow. Stefan Roth and Michael J. Black (ICCV 2005)
Fields of Experts: A Framework for Learning Image Priors. Stefan Roth and Michael J. Black (CVPR 2005)
Products of Experts. G. Hinton (ICANN 1999)
Training Products of Experts by Minimizing Contrastive Divergence. G. Hinton (Neural Computation 2002)
Sparse Coding with an Overcomplete Basis Set: A Strategy Employed by V1? B. Olshausen and D. Field (Vision Research 1997)

3 Fields of Experts Contents
Introduction
Products of Experts
Fields of Experts
Application: Image denoising
Summary

4 Fields of Experts Introduction (Image denoising)
Classical spatial filters: Gaussian, mean, median, and so on.

5 Fields of Experts Introduction (Image denoising)

6 Fields of Experts Introduction
Target: develop a framework for learning rich, generic image priors (potential functions) that capture the statistics of natural scenes.
Special features:
Builds on sparse coding methods and Products of Experts (PoE).
An extended version of PoE: an MRF (Markov Random Field) model whose potential functions are learned, in order to solve the problems of the conventional PoE.

7 Fields of Experts Sparse Coding
Sparse coding represents an image patch as a linear combination of a small number of learned filters (or bases), so that the image probability can be expressed with few parameters. (Slide figure: an example of a mixture model.)
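To make the slide's idea concrete, here is a minimal sketch of sparse coding via iterative soft-thresholding (ISTA). This is not the authors' code; the dictionary D, the sparsity weight lam, and the iteration count are illustrative assumptions.

```python
# Minimal sparse-coding sketch (assumed, not from the paper): approximate a
# patch x as D @ a, where the coefficient vector a is kept sparse by an
# L1 penalty, solved with ISTA (iterative soft-thresholding).
import numpy as np

def sparse_code(x, D, lam=0.1, n_iters=100):
    """min_a 0.5*||x - D a||^2 + lam*||a||_1 via ISTA."""
    L = np.linalg.norm(D, 2) ** 2      # Lipschitz constant of the gradient
    a = np.zeros(D.shape[1])
    for _ in range(n_iters):
        z = a - D.T @ (D @ a - x) / L  # gradient step on the quadratic term
        a = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-threshold
    return a

rng = np.random.default_rng(0)
D = rng.standard_normal((64, 128))     # 128 atoms for an 8x8 patch
D /= np.linalg.norm(D, axis=0)         # unit-norm atoms
patch = rng.standard_normal(64)
a = sparse_code(patch, D)
print("nonzero coefficients:", np.count_nonzero(a))  # only a few are active
```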

8 Fields of Experts Products of Experts
Mixture model: build a model of a complicated data distribution by combining several simple models. Mixture models take a weighted sum of the distributions: each distribution is scaled down and the results are added together.

9 Fields of Experts Products of Experts
Mixture model: mixture models are very inefficient in high-dimensional spaces.

10 Fields of Experts Products of Experts
PoE model: build a model of a complicated data distribution by combining several simple models, but multiply the distributions together and renormalize instead of summing. The product is much sharper than the individual distributions: the densities are multiplied together at every point and then renormalized, as the small numerical example below illustrates.
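A small numerical illustration of this contrast (assumed, not from the slides): mixing two Gaussians yields a broad density, while multiplying and renormalizing them yields one sharper than either component.

```python
# Mixture vs. product of two 1-D Gaussians on a grid.
import numpy as np

x = np.linspace(-5, 5, 1001)
dx = x[1] - x[0]

def gauss(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

p1, p2 = gauss(x, -1.0, 1.5), gauss(x, 1.0, 1.5)

mixture = 0.5 * p1 + 0.5 * p2    # weighted sum: scale down and add
product = p1 * p2
product /= product.sum() * dx    # renormalize so it integrates to 1

def std(p):
    mean = (x * p).sum() * dx
    return np.sqrt(((x - mean) ** 2 * p).sum() * dx)

# The product is visibly sharper: ~1.06 vs. ~1.80 for the mixture.
print(f"mixture std = {std(mixture):.2f}, product std = {std(product):.2f}")
```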

11 Fields of Experts Products of Experts
PoE models work well on high-dimensional distributions. A normalization term is needed to convert the product of the individual densities into a combined density.

12 Fields of Experts Products of Experts
Geoffrey E. Hinton, Products of Experts: most perceptual systems produce a sharp posterior distribution on a high-dimensional manifold, so the PoE model is well suited to vision problems.

13 Fields of Experts Products of Experts
PoE framework for vision problems: Learning Sparse Topographic Representations with Products of Student-t Distributions, M. Welling, G. Hinton, and S. Osindero (NIPS 2003).

14 Fields of Experts Products of Experts
Experts: Student-t distributions. The responses of linear filters applied to natural images typically resemble Student-t experts (Welling, Hinton, and Osindero, NIPS 2003).
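The Student-t expert for a linear filter J_i with learned shape parameter alpha_i takes the heavy-tailed form

```latex
\phi\!\left(J_i^{\mathsf T}\mathbf{x};\,\alpha_i\right)
  = \left(1 + \tfrac{1}{2}\left(J_i^{\mathsf T}\mathbf{x}\right)^{2}\right)^{-\alpha_i},
  \qquad \alpha_i > 0,
```

which matches the kurtotic filter-response histograms of natural images.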

15 Fields of Experts Products of Experts
PoE framework for vision problems: the probability density written in Gibbs form (see below).
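Written out, the PoE density over a patch x in Gibbs form is

```latex
p(\mathbf{x};\Theta) = \frac{1}{Z(\Theta)}\,
  e^{-E_{\mathrm{PoE}}(\mathbf{x},\Theta)},
\qquad
E_{\mathrm{PoE}}(\mathbf{x},\Theta)
  = -\sum_{i=1}^{N} \log \phi\!\left(J_i^{\mathsf T}\mathbf{x};\,\alpha_i\right),
```

where Z(Theta) is the partition function that normalizes the product.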

16 Fields of Experts Products of Experts
Problems: PoE is a patch-based method. To cover the whole image region, the patch must either be set to the entire image or be a collection of patches at specific locations.

17 Fields of Experts Products of Experts
Problems with this approach:
The number of parameters to learn would be too large.
The model would only work for one specific image size and would not generalize to other sizes.
The model would not be translation invariant, which is a desirable property for generic image priors.

18 Fields of Experts Key idea
Combine the PoE with MRF models.

19 Fields of Experts Key idea
Define a neighborhood system that connects all nodes in an m x m rectangular region; each such region defines a maximal clique in the graph.

20 Fields of Experts The Hammersley-Clifford theorem
By the Hammersley-Clifford theorem, the probability density of the graphical model can be written as a Gibbs distribution. Translation invariance of the MRF model means the potential function is assumed to be the same for all cliques.

21 Fields of Experts Potential function
The potential function is learned from training images. The probability density of a full image under the FoE is given below.
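Under the FoE, the density of a full image x is

```latex
p(\mathbf{x}) = \frac{1}{Z(\Theta)} \prod_{k} \prod_{i=1}^{N}
  \phi\!\left(J_i^{\mathsf T}\mathbf{x}_{(k)};\,\alpha_i\right),
```

where k ranges over all overlapping m x m cliques, x_(k) denotes the pixels of clique k, and the same filters J_i and parameters alpha_i are shared by every clique. This sharing makes the model translation invariant and applicable to images of any size.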

22 Fields of Experts Learning
The parameters and filters can be learned from a set of training images by maximizing the likelihood. Maximizing the likelihood for the PoE and the FoE model is equivalent, and is done by gradient ascent on the log-likelihood.
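For a parameter theta_i (a filter coefficient or an alpha_i), the update is

```latex
\delta\theta_i = \eta \left[
  \left\langle \frac{\partial E_{\mathrm{FoE}}}{\partial \theta_i} \right\rangle_{p}
  -
  \left\langle \frac{\partial E_{\mathrm{FoE}}}{\partial \theta_i} \right\rangle_{X}
\right],
```

where the second angle bracket averages over the training data X and the first is the (intractable) expectation under the model, approximated by contrastive divergence: run only a few MCMC steps starting from the data and use the resulting samples.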

23 Fields of Experts Application: Image denoising
Given an observed noisy image y, find the true image x that maximizes the posterior probability. Assumption: the true image has been corrupted by additive, i.i.d. Gaussian noise with zero mean and known standard deviation.
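Under this noise model the posterior factors as

```latex
p(\mathbf{x}\mid\mathbf{y}) \;\propto\; p(\mathbf{y}\mid\mathbf{x})\, p(\mathbf{x}),
\qquad
p(\mathbf{y}\mid\mathbf{x}) \;\propto\; \prod_{j}
  \exp\!\left(-\frac{(y_j - x_j)^2}{2\sigma^2}\right),
```

where p(x) is the FoE prior and j indexes the pixels.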

24 Fields of Experts Application: Image denoising
To maximize the posterior probability, gradient ascent is performed on the logarithm of the posterior, which requires the gradient of the log-likelihood and the gradient of the log-prior (given below).
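These two gradients are

```latex
\nabla_{\mathbf{x}} \log p(\mathbf{y}\mid\mathbf{x})
  = \frac{1}{\sigma^{2}}\,(\mathbf{y}-\mathbf{x}),
\qquad
\nabla_{\mathbf{x}} \log p(\mathbf{x})
  = \sum_{i=1}^{N} J_i^{-} * \psi_i\!\left(J_i * \mathbf{x}\right),
```

where * denotes convolution, J_i^- is the filter J_i mirrored around its center, and psi_i(y) is the derivative of log phi(y; alpha_i) with respect to y.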

25 Fields of Experts Application: Image denoising
The gradient ascent denoising algorithm (a sketch follows).
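Here is a minimal sketch of that algorithm, assuming Student-t experts; the filters J and parameters alpha would come from training, and the random filters, step size eta, and weight lam below are placeholders.

```python
# Hedged sketch of FoE gradient-ascent denoising (not the authors' code).
import numpy as np
from scipy.signal import convolve2d

def psi(v, alpha):
    """d/dv log(1 + v^2/2)^(-alpha) = -alpha * v / (1 + v^2/2)."""
    return -alpha * v / (1.0 + 0.5 * v ** 2)

def denoise(y, J, alpha, sigma, lam=1.0, eta=0.02, n_iters=200):
    x = y.copy()
    for _ in range(n_iters):
        grad_prior = np.zeros_like(x)
        for Ji, ai in zip(J, alpha):
            resp = convolve2d(x, Ji, mode="same", boundary="symm")
            grad_prior += convolve2d(psi(resp, ai), Ji[::-1, ::-1],  # mirrored filter
                                     mode="same", boundary="symm")
        grad_lik = (y - x) / sigma ** 2           # gradient of the log-likelihood
        x += eta * (grad_prior + lam * grad_lik)  # ascend the log-posterior
    return x

# Usage with placeholder filters (the paper learns 24 filters of size 5x5).
rng = np.random.default_rng(0)
J = [0.1 * rng.standard_normal((5, 5)) for _ in range(8)]
alpha = np.full(8, 1.0)
clean = np.zeros((64, 64))
clean[16:48, 16:48] = 1.0
noisy = clean + rng.normal(0.0, 0.1, clean.shape)
denoised = denoise(noisy, J, alpha, sigma=0.1)
print("residual error std:", (denoised - clean).std())
```

The optional weight lam trades off the prior against the likelihood, as in the paper's denoising experiments.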

26 Fields of Experts Application: Image denoising
Results (slide figure): a) original image, b) noisy image, c) denoised image.

27 Fields of Experts Summary
Contributions:
Points out a limitation of the conventional Products of Experts: PoE focuses on modeling small image patches rather than defining a prior model over an entire image.
Proposes the FoE, which models the prior probability of an entire image in terms of a random field with overlapping cliques, whose potentials are represented as a PoE.