A Variational Approach to Blind Image Deconvolution
Rob Fergus, Courant Institute of Mathematical Sciences, New York University

Presentation transcript:

Rob Fergus Courant Institute of Mathematical Sciences New York University A Variational Approach to Blind Image Deconvolution

Overview Much recent interest in blind deconvolution: Levin '06, Fergus et al. '06, Jia et al. '07, Joshi et al. '08, Shan et al. '08, Whyte et al. '10. This talk: discuss the Fergus '06 algorithm and try to give some insight into the variational methods used.

Removing Camera Shake from a Single Photograph Rob Fergus, Barun Singh, Aaron Hertzmann, Sam T. Roweis and William T. Freeman Massachusetts Institute of Technology and University of Toronto

Image formation process: Blurry image = Sharp image ⊗ Blur kernel, where ⊗ is the convolution operator. The blurry image is the input to the algorithm; the sharp image is the desired output. The model is an approximation: assume a static scene & constant blur.
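As a toy illustration of this forward model, here is a small numpy sketch (my own illustration, not the authors' code) that blurs a single bright point with a 1×3 kernel; real implementations use FFTs and careful boundary handling:

```python
import numpy as np

def convolve2d_valid(image, kernel):
    """Valid-mode 2-D convolution: each output pixel is a kernel-weighted
    sum of sharp-image pixels (kernel flipped for true convolution)."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    flipped = kernel[::-1, ::-1]
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * flipped)
    return out

sharp = np.zeros((7, 7))
sharp[3, 3] = 1.0                          # a single bright point
kernel = np.array([[0.25, 0.5, 0.25]])     # horizontal blur, sums to 1
blurry = convolve2d_valid(sharp, kernel)
# blurring a point source reproduces the (flipped) kernel around it
```

Because the kernel sums to 1, total image intensity is preserved, which is why camera-shake blur loses detail but not brightness.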

Why such a hard problem? Blurry image = Sharp image ⊗ Blur kernel, but many different sharp image / blur kernel pairs can explain the same blurry image equally well.

Statistics of gradients in natural images: the histogram of image gradients (plotted as log # pixels) shows a characteristic distribution with heavy tails. An image gradient is the difference between spatially adjacent pixel intensities.

Blurry images have different statistics: their histogram of image gradients (log # pixels) differs from the sharp-image distribution.

Parametric distribution: use a parametric model of sharp image statistics, fit to the histogram of image gradients (log # pixels).

Three sources of information. 1. Reconstruction constraint: input blurry image = estimated sharp image ⊗ estimated blur kernel. 2. Image prior: distribution of gradients. 3. Blur prior: positive & sparse.

Three sources of information. y = observed image, b = blur kernel, x = sharp image.

Posterior. Three sources of information: y = observed image, b = blur kernel, x = sharp image.

Posterior: p(x, b | y) ∝ p(y | x, b) p(x) p(b). 1. Likelihood (reconstruction constraint). 2. Image prior. 3. Blur prior. y = observed image, b = blur kernel, x = sharp image.

Overview of model: y = observed image, b = blur kernel, x = sharp image. 1. Likelihood (reconstruction constraint), with i the pixel index. 2. Image prior: mixture of Gaussians fit to the empirical distribution of image gradients. 3. Blur prior: exponential distribution to keep the kernel positive & sparse.
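Written out, the three terms are roughly as follows (a sketch: the symbols follow the slide, but the mixture parameters $\pi_c, v_c$ and the exponential rate $\lambda$ are fit or chosen by the authors and are not given here):

```latex
\underbrace{p(y \mid x, b)}_{\text{likelihood}} = \prod_i \mathcal{N}\!\big(y_i \mid (x \otimes b)_i,\ \sigma^2\big),
\qquad
\underbrace{p(x)}_{\text{image prior}} = \prod_i \sum_{c=1}^{C} \pi_c\, \mathcal{N}\!\big(\nabla x_i \mid 0,\ v_c\big),
\qquad
\underbrace{p(b)}_{\text{blur prior}} = \prod_j \lambda e^{-\lambda b_j},\quad b_j \ge 0
```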

The obvious thing to do with the posterior (1. likelihood / reconstruction constraint, 2. image prior, 3. blur prior): combine the 3 terms into an objective function and run conjugate gradient descent. This is Maximum a-Posteriori (MAP) estimation. No success!

Variational Independent Component Analysis (Miskin and MacKay, 2000): binary images, priors on intensities, small synthetic blurs. Not applicable to natural images.

Variational Bayes. The variational Bayesian approach keeps track of uncertainty in the estimates of image and blur by using a distribution instead of a single estimate. Toy illustration: the optimization surface (score vs pixel intensity) for a single variable, with the Maximum a-Posteriori (MAP) point marked.

Simple 1-D blind deconvolution example: y = x b + n, where y = observed image, b = blur, x = sharp image, and n = noise ~ N(0, σ²). Let y = 2.
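The 1-D toy can be worked through numerically. The sketch below is my own construction, not the talk's code: it assumes a prior x ~ N(0, 2) (matching the sampling slide later), a flat prior on b, and σ² = 0.1, grids the joint posterior, then compares the joint MAP with the marginal p(b|y):

```python
import numpy as np

# Toy 1-D blind deconvolution: y = x*b + n, n ~ N(0, sigma2).
# Assumed here (not stated on this slide): x ~ N(0, 2), flat prior on b.
y, sigma2, var_x = 2.0, 0.1, 2.0

bs = np.linspace(0.01, 6, 600)
xs = np.linspace(-6, 6, 1201)
B, X = np.meshgrid(bs, xs)

# Joint posterior (unnormalized): likelihood times prior on x
post = np.exp(-(y - X * B) ** 2 / (2 * sigma2)) * np.exp(-X ** 2 / (2 * var_x))

# MAP: highest point on the (x, b) surface
i, j = np.unravel_index(post.argmax(), post.shape)
b_map = bs[j]

# Marginal p(b | y): numerically integrate x out
marg = post.sum(axis=0)
b_marginal_mode = bs[marg.argmax()]
# MAP drifts toward large b (the ridge x*b = y with a shrinking x penalty),
# while the marginal p(b|y) has a sensible interior mode
```

This is the talk's point in miniature: the highest point of the joint surface and the mode of the marginal over b disagree badly.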

σ² = 0.1

Gaussian distribution: the likelihood is p(y | x, b) = N(y; x b, σ²).

Marginal distribution p(b | y) = ∫ p(x, b | y) dx: integrate out the sharp image x.

MAP solution: the highest point on the posterior surface.

Variational Bayes: the true Bayesian approach is not tractable, so approximate the posterior with a simple distribution.

Fitting the posterior with a Gaussian: the approximating distribution q is Gaussian; minimize the KL divergence KL(q || p).
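This fit can be mimicked numerically for the 1-D toy's marginal p(b|y) (my sketch, not the talk's code; assumptions as before: y = 2, σ² = 0.1, x ~ N(0, 2)): center a Gaussian q at the mode of p and sweep its width w, computing KL(q || p) on a grid:

```python
import numpy as np

y, sigma2, var_x = 2.0, 0.1, 2.0
bs = np.linspace(0.01, 6, 2000)
db = bs[1] - bs[0]

# p(b|y) up to a constant: x integrates out analytically,
# giving y | b ~ N(0, var_x * b^2 + sigma2)
s = var_x * bs ** 2 + sigma2
p = np.exp(-y ** 2 / (2 * s)) / np.sqrt(s)
p /= p.sum() * db

mu = bs[p.argmax()]                      # center q at the mode of p
widths = np.linspace(0.05, 2.0, 200)
kls = []
for w in widths:
    q = np.exp(-(bs - mu) ** 2 / (2 * w ** 2))
    q /= q.sum() * db                    # renormalize on the truncated grid
    mask = q > 1e-12
    kls.append(np.sum(q[mask] * np.log(q[mask] / p[mask])) * db)
best_w = widths[int(np.argmin(kls))]
# very narrow q pays a large entropy penalty; the chosen width trades
# entropy against how well q covers the mass of p
```

Plotting kls against widths reproduces the shape of the "KL-distance vs Gaussian width" curve on the next slide.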

KL-Distance vs Gaussian width

Variational approximation of the marginal p(b | y): the variational estimate compared with the true marginal and the MAP solution.

Try sampling from the model. Let the true b = 2. Repeat: sample x ~ N(0, 2); sample n ~ N(0, σ²); set y = x b + n; compute p_MAP(b|y), p_Bayes(b|y) & p_Variational(b|y); multiply with the existing density estimates (assuming iid observations).
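A sketch of this experiment for the Bayes (marginal) density only, my own construction with σ² = 0.1 and x ~ N(0, 2) as before: each draw contributes a marginal likelihood over b, and the log-densities are accumulated (the iid "multiply with existing estimates" step):

```python
import numpy as np

rng = np.random.default_rng(0)
true_b, sigma2, var_x = 2.0, 0.1, 2.0
bs = np.linspace(0.01, 6, 600)

log_bayes = np.zeros_like(bs)    # running log-density over b
for _ in range(200):
    x = rng.normal(0.0, np.sqrt(var_x))
    n = rng.normal(0.0, np.sqrt(sigma2))
    y = x * true_b + n
    # marginal likelihood with x integrated out: y | b ~ N(0, var_x*b^2 + sigma2)
    s = var_x * bs ** 2 + sigma2
    log_bayes += -0.5 * np.log(s) - y ** 2 / (2 * s)   # multiply densities (iid)

b_est = bs[log_bayes.argmax()]
# after many observations the accumulated marginal peaks near the true b = 2
```

The analogous accumulation of per-sample MAP estimates does not concentrate on the true b, which is the contrast the slide is after.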

Actual setup of the variational approach. Work in the gradient domain. Approximate the posterior with a factored distribution q(x) q(b): q(x) is Gaussian on each pixel, and q(b) is a rectified Gaussian on each blur kernel element. Assume the posterior factorizes. Cost function: the KL divergence between q and the true posterior.

Close-up: original vs. output.

Original photograph

Our output, with the estimated blur kernel.

Close-up: our output vs. the original image.

Recent comparison paper (Levin et al., IEEE CVPR 2009): percent success (higher is better).

Related problems: Bayesian color constancy (Brainard & Freeman, JOSA 1997). Given color pixels, deduce the reflectance spectrum of the surface and the illuminant spectrum. Uses a Laplace approximation, similar to a Gaussian q(·). Figure from Foundations of Vision by Brian Wandell, Sinauer Associates, 1995.

Conclusions. Variational methods seem to do the right thing for blind deconvolution. Interesting from an inference point of view: a rare case where Bayesian methods are needed. Can potentially be applied to other ill-posed problems in image processing.