SOS Boosting of Image Denoising Algorithms

SOS Boosting of Image Denoising Algorithms

Joint work with Yaniv Romano
Michael Elad, The Computer Science Department, The Technion – Israel Institute of Technology, Haifa 32000, Israel
January 2015, Switzerland

The research leading to these results has received funding from the European Research Council under the European Union's Seventh Framework Program, ERC Grant agreement no. 320649, and from the Intel Collaborative Research Institute for Computational Intelligence.

Leading Image Denoising Methods

The leading methods are built upon powerful patch-based (local) image models:
- Non-Local Means (NLM): self-similarity within natural images
- K-SVD: sparse representation modeling of image patches
- BM3D: combines a sparsity prior and non-local self-similarity
- Kernel regression: offers a local directional filter
- EPLL: exploits a GMM model of the image patches
- ...

Today we present a way to improve various such state-of-the-art image denoising methods, simply by applying the original algorithm as a "black-box" several times.

Background

Boosting Methods for Denoising

Improved results can be achieved by processing the residual (method-noise) image: given the noisy image $\mathbf{y}$ and the denoised image $\hat{\mathbf{x}} = f(\mathbf{y})$, the method noise is $\mathbf{y} - \hat{\mathbf{x}}$.

Processing the Residual Image

Twicing [Tukey ('77), Charest et al. ('06)]:
$$\hat{\mathbf{x}}^{k+1} = \hat{\mathbf{x}}^{k} + f\left(\mathbf{y} - \hat{\mathbf{x}}^{k}\right)$$

Method-noise whitening [Romano & Elad ('13)]: recovering the "stolen" content from the method noise, using the same basis elements that were chosen to represent the initially denoised patches.

TV denoising using the Bregman distance [Bregman ('67), Osher et al. ('05)]:
$$\hat{\mathbf{x}}^{k+1} = f\left(\hat{\mathbf{x}}^{k} + \sum_{i=1}^{k}\left(\mathbf{y} - \hat{\mathbf{x}}^{i}\right)\right)$$
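As a concrete illustration, here is a minimal sketch of the twicing iteration in Python. The name `denoise` is a placeholder for any black-box denoiser $f(\mathbf{y})$, not a function from the paper's code:

```python
def twicing(y, denoise, n_iters=3):
    """x^{k+1} = x^k + f(y - x^k): add back signal recovered from the residual."""
    x = denoise(y)                 # initial denoising pass
    for _ in range(n_iters):
        x = x + denoise(y - x)     # denoise the method noise and add it back
    return x
```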

Boosting Methods

Diffusion [Perona-Malik ('90), Coifman et al. ('06), Milanfar ('12)] removes the noise leftovers that are found in the denoised image:
$$\hat{\mathbf{x}}^{k+1} = f\left(\hat{\mathbf{x}}^{k}\right)$$

SAIF [Talebi et al. ('12)] automatically chooses the better local improvement mechanism per patch: diffusion or twicing.

Reducing the Local/Global Gap

EPLL [Zoran & Weiss ('09), Sulam & Elad ('14)] treats a major shortcoming of patch-based methods: the gap between the local patch processing and the global need for a whole restored image. It encourages the patches of the final image (i.e., after patch aggregation) to comply with the local prior. In practice this amounts to iterated denoising with a diminishing variance: denoise the patches of $\hat{\mathbf{x}}^{k}$, then obtain $\hat{\mathbf{x}}^{k+1}$ by averaging the overlapping patches.

SOS Boosting

Strengthen - Operate - Subtract Boosting

Given any denoiser, how can we improve its performance?
- Strengthen the signal, by adding the previous result to the noisy input
- Operate the denoiser on the strengthened image
- Subtract the previous estimation from the outcome

SOS formulation:
$$\hat{\mathbf{x}}^{k+1} = f\left(\mathbf{y} + \hat{\mathbf{x}}^{k}\right) - \hat{\mathbf{x}}^{k}$$
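The whole scheme fits in a few lines. A minimal sketch, again with `denoise` standing in for any black-box denoiser (e.g., a wrapper around K-SVD, NLM, BM3D, or EPLL):

```python
def sos_boost(y, denoise, n_iters=10):
    """SOS boosting: Strengthen (y + x), Operate f(.), Subtract x."""
    x = denoise(y)                  # plain first pass as initialization
    for _ in range(n_iters):
        x = denoise(y + x) - x      # x^{k+1} = f(y + x^k) - x^k
    return x
```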

Strengthen - Operate - Subtract Boosting

An improvement is obtained since $\mathrm{SNR}\left(\mathbf{y}+\hat{\mathbf{x}}\right) > \mathrm{SNR}\left(\mathbf{y}\right)$. In the ideal case, where $\hat{\mathbf{x}} = \mathbf{x}$, we get $\mathrm{SNR}\left(\mathbf{y}+\mathbf{x}\right) = 2\cdot\mathrm{SNR}\left(\mathbf{y}\right)$.

We suggest strengthening the underlying signal, rather than:
- Adding the residual back to the noisy image: twicing converges to the noisy image
- Filtering the previous estimate over and over again: diffusion can lead to over-smoothing, converging to a piecewise-constant image

Image Denoising – A Matrix Formulation

In order to study the convergence of the SOS function, we represent the denoiser in its matrix form:
$$\hat{\mathbf{x}} = f\left(\mathbf{y}\right) = \mathbf{W}\mathbf{y}$$

Properties of $\mathbf{W}$: kernel-based methods (e.g., the Bilateral filter, NLM, Kernel Regression) can be approximated as row-stochastic positive-definite matrices [Milanfar ('13)], with eigenvalues in the range $[0, 1]$. What about sparsity-based methods?

Sparsity Model – The Basics

We assume the existence of a dictionary $\mathbf{D}\in\mathbb{R}^{d\times n}$ whose columns are the atom signals. Signals are modeled as sparse linear combinations of the dictionary atoms:
$$\mathbf{x} = \mathbf{D}\alpha,$$
where $\alpha$ is sparse, meaning that it is assumed to contain mostly zeros. The computation of $\alpha$ from $\mathbf{x}$ (or its noisy version) is called sparse coding. The OMP is a popular sparse-coding technique, especially for low-dimensional signals.
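For reference, a textbook sketch of OMP in Python (not the Batch-OMP implementation used in the K-SVD toolbox; in practice the loop would also stop once the residual falls below a noise-proportional threshold):

```python
import numpy as np

def omp(D, x, k):
    """Greedily pick k atoms of D (columns assumed unit-norm) to approximate x."""
    residual = x.astype(float).copy()
    support = []
    for _ in range(k):
        support.append(int(np.argmax(np.abs(D.T @ residual))))  # best-matching atom
        Ds = D[:, support]
        coef, *_ = np.linalg.lstsq(Ds, x, rcond=None)            # re-fit on the support
        residual = x - Ds @ coef
    alpha = np.zeros(D.shape[1])
    alpha[support] = coef
    return alpha
```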

K-SVD Image Denoising [Elad & Aharon ('06)]

Starting from the noisy image and an initial dictionary, the algorithm alternates between denoising each patch using OMP and updating the dictionary using K-SVD. Each denoised patch is a linear combination of few atoms:
$$\hat{\mathbf{p}}_{i} = \mathbf{D}_{S_i}\hat{\alpha}_{i} = \mathbf{D}_{S_i}\left(\mathbf{D}_{S_i}^{T}\mathbf{D}_{S_i}\right)^{-1}\mathbf{D}_{S_i}^{T}\mathbf{R}_{i}\mathbf{y},\qquad \hat{\alpha}_{i} = \arg\min_{z}\left\|\mathbf{D}_{S_i}z - \mathbf{R}_{i}\mathbf{y}\right\|_{2}^{2},$$
where $S_i$ is the support chosen by OMP and $\mathbf{R}_{i}$ extracts the $i$-th patch from $\mathbf{y}$.

K-SVD Image Denoising [Elad & Aharon ('06)]

The reconstructed image averages the overlapping denoised patches together with the noisy image:
$$\hat{\mathbf{x}} = \arg\min_{\mathbf{x}}\ \mu\left\|\mathbf{x}-\mathbf{y}\right\|_{2}^{2} + \sum_{i}\left\|\hat{\mathbf{p}}_{i} - \mathbf{R}_{i}\mathbf{x}\right\|_{2}^{2} = \left(\mu\mathbf{I} + \sum_{i}\mathbf{R}_{i}^{T}\mathbf{R}_{i}\right)^{-1}\left(\mu\mathbf{I} + \sum_{i}\mathbf{R}_{i}^{T}\mathbf{D}_{S_i}\left(\mathbf{D}_{S_i}^{T}\mathbf{D}_{S_i}\right)^{-1}\mathbf{D}_{S_i}^{T}\mathbf{R}_{i}\right)\mathbf{y} = \mathbf{W}\mathbf{y}$$
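Since $\mu\mathbf{I} + \sum_{i}\mathbf{R}_{i}^{T}\mathbf{R}_{i}$ is diagonal (a per-pixel count of covering patches, plus $\mu$), the global solve reduces to a pixel-wise division. A minimal sketch, with illustrative names (`patches`, `positions` are not from the paper's code):

```python
import numpy as np

def aggregate(patches, positions, image_shape, y, mu=1.0):
    """Solve (mu*I + sum_i R_i^T R_i) x = mu*y + sum_i R_i^T p_i pixel-wise."""
    num = mu * y.astype(float).copy()       # accumulates mu*y + sum R_i^T p_i
    den = mu * np.ones(image_shape)         # accumulates mu + patch counts
    for patch, (r, c) in zip(patches, positions):
        h, w = patch.shape
        num[r:r + h, c:c + w] += patch
        den[r:r + h, c:c + w] += 1.0
    return num / den                        # the system matrix is diagonal
```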

K-SVD: A Matrix Formulation

$\mathbf{W}$ is a sum of projection matrices and has the following properties:
- Symmetric: $\mathbf{W}^{T} = \mathbf{W}$ (by assuming a periodic boundary condition)
- Positive definite: $\mathbf{W}\succ 0$, with minimal eigenvalue $\lambda_{min}\left(\mathbf{W}\right) \geq c$, where $c = \frac{\mu}{\mu+n} > 0$ ($\mu$ is the global averaging constant and $n$ is the patch size)
- $\mathbf{W}\mathbf{1} = \mathbf{1}$, thus $\lambda = 1$ is an eigenvalue corresponding to the eigenvector $\mathbf{1}$
- The spectral radius $\left\|\mathbf{W}\right\|_{2} = 1$, thus $\lambda_{max}\left(\mathbf{W}\right) = 1$
- $\mathbf{W}$ may have negative entries, which violates the classic definition of row-stochasticity
- Consequently, the spectral radius $\left\|\mathbf{W}-\mathbf{I}\right\|_{2} \leq 1-c < 1$

Convergence Study

SOS formulation: $\hat{\mathbf{x}}^{k} = \mathbf{W}_{k}\left(\mathbf{y} + \hat{\mathbf{x}}^{k-1}\right) - \hat{\mathbf{x}}^{k-1}$.

Given $\hat{\mathbf{x}}^{*}$, the denoised image obtained for a large $k$, and a series of filter matrices $\mathbf{W}_{k},\ k=1,2,\dots$, the error $\mathbf{e}^{k} = \hat{\mathbf{x}}^{k} - \hat{\mathbf{x}}^{*}$ of the recursive function is expressed by
$$\mathbf{e}^{k} = \left(\mathbf{W}_{k} - \mathbf{I}\right)\mathbf{e}^{k-1} + \left(\mathbf{W}_{k} - \mathbf{W}^{*}\right)\left(\mathbf{y} + \hat{\mathbf{x}}^{*}\right).$$
By assuming $\mathbf{W}_{k} = \mathbf{W}^{*}$ from a certain $k$ (which comes up in practice), this reduces to
$$\mathbf{e}^{k} = \left(\mathbf{W} - \mathbf{I}\right)\mathbf{e}^{k-1}.$$

Convergence Study

The SOS recursive function converges if $\left\|\mathbf{W}-\mathbf{I}\right\|_{2} < 1$. This holds both for kernel-based methods (Bilateral filter, NLM, Kernel Regression) and for sparsity-based methods (K-SVD). Hence, for most denoising algorithms the SOS boosting is "guaranteed" to converge to the fixed point of $\hat{\mathbf{x}}^{*} = \mathbf{W}\left(\mathbf{y}+\hat{\mathbf{x}}^{*}\right) - \hat{\mathbf{x}}^{*}$, namely
$$\hat{\mathbf{x}}^{*} = \left(2\mathbf{I} - \mathbf{W}\right)^{-1}\mathbf{W}\mathbf{y}.$$
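A self-contained numerical check of this fixed point, using a synthetic symmetric filter matrix with eigenvalues in $(0,1]$ as established above (an illustration only, not tied to any particular denoiser):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
W = Q @ np.diag(rng.uniform(0.1, 1.0, size=n)) @ Q.T  # symmetric, eigenvalues in (0, 1]
y = rng.standard_normal(n)

x = W @ y
for _ in range(300):
    x = W @ (y + x) - x                    # SOS iteration with a fixed linear filter

x_star = np.linalg.solve(2 * np.eye(n) - W, W @ y)
print(np.linalg.norm(x - x_star))          # ~0: iterates reach (2I - W)^{-1} W y
```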

Generalization

We introduce two parameters that modify:
- The steady-state outcome
- The requirements for convergence (the eigenvalue range)
- The rate of convergence

The parameter $\rho$ controls the signal emphasis and affects the steady-state outcome:
$$\hat{\mathbf{x}}^{k+1} = f\left(\mathbf{y} + \rho\hat{\mathbf{x}}^{k}\right) - \rho\hat{\mathbf{x}}^{k}$$

Generalization

The second parameter, $\tau$, controls the rate of convergence. The steady-state outcome is given by
$$\hat{\mathbf{x}}^{*} = f\left(\mathbf{y} + \rho\hat{\mathbf{x}}^{*}\right) - \rho\hat{\mathbf{x}}^{*}.$$
Multiplying both sides by $\tau$ and adding $\hat{\mathbf{x}}^{*} - \hat{\mathbf{x}}^{*}$ to the RHS (which does not affect the steady-state result) yields
$$\tau\hat{\mathbf{x}}^{*} = \tau f\left(\mathbf{y} + \rho\hat{\mathbf{x}}^{*}\right) - \tau\rho\hat{\mathbf{x}}^{*} + \hat{\mathbf{x}}^{*} - \hat{\mathbf{x}}^{*}.$$
Rearranging, and replacing $\hat{\mathbf{x}}^{*}$ with $\hat{\mathbf{x}}^{k}$, leads to the generalized SOS recursive function:
$$\hat{\mathbf{x}}^{k+1} = \tau f\left(\mathbf{y} + \rho\hat{\mathbf{x}}^{k}\right) - \left(\tau\rho + \tau - 1\right)\hat{\mathbf{x}}^{k}$$
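The generalized recursion is an equally small change in code. A minimal sketch (setting $\rho = \tau = 1$ recovers the plain SOS iteration):

```python
def sos_boost_general(y, denoise, rho=1.0, tau=1.0, n_iters=10):
    """Generalized SOS: x^{k+1} = tau*f(y + rho*x^k) - (tau*rho + tau - 1)*x^k."""
    x = denoise(y)
    for _ in range(n_iters):
        x = tau * denoise(y + rho * x) - (tau * rho + tau - 1) * x
    return x
```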

Generalization

By assuming $\mathbf{W}_{k} = \mathbf{W}^{*}$, the error $\mathbf{e}^{k} = \hat{\mathbf{x}}^{k} - \hat{\mathbf{x}}^{*}$ is given by
$$\mathbf{e}^{k} = \left(\tau\rho\mathbf{W} - \left(\tau\rho + \tau - 1\right)\mathbf{I}\right)\mathbf{e}^{k-1}.$$
As a result, $\rho$ and $\tau$ control the rate of convergence. We suggest a closed-form expression for the optimal $(\rho,\tau)$ setting: given $\rho$, what is the best $\tau$, leading to the fastest convergence? [Figure: the largest eigenvalue of the error's transition matrix as a function of $\rho$ and $\tau$.]
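The closed-form expression itself is not reproduced in this transcript; the following is only a sketch of how such an expression can be derived, assuming the eigenvalues of $\mathbf{W}$ span the whole interval $[c,1]$. The eigenvalues of the transition matrix are $m(\lambda) = \tau\rho\lambda - (\tau\rho+\tau-1)$, linear and increasing in $\lambda$, so the spectral radius is $\max\left(|m(c)|, |m(1)|\right)$ with $m(1) = 1-\tau$. Balancing the two endpoint magnitudes, $m(c) = -m(1)$, gives

$$1 - \tau\left(1+\rho(1-c)\right) = \tau - 1 \quad\Longrightarrow\quad \tau^{*} = \frac{2}{2+\rho(1-c)},\qquad \gamma = 1-\tau^{*} = \frac{\rho(1-c)}{2+\rho(1-c)} < 1.$$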

Experiments

Verifying the Convergence Study

We apply the K-SVD to the noisy House image ($\sigma = 25$) with a fixed $\mathbf{W}$. Here $\tau^{*}$ is the outcome of our closed-form expression, achieving the fastest convergence, and $\gamma$ is our analytic bound on the error (the largest eigenvalue of the error's transition matrix). [Figure: $\log_{10}\left\|\mathbf{e}^{k}\right\|$ vs. iteration number.]

Convergence

We apply the K-SVD to the noisy House image ($\sigma = 25$) with a fixed $\mathbf{W}$. The generalized SOS converges faster and improves the PSNR. [Figures: PSNR of $\hat{\mathbf{x}}^{k}$ [dB] and $\log_{10}\left\|\mathbf{e}^{k}\right\|$ vs. iteration number.]

Results

We successfully boost several state-of-the-art denoising algorithms (K-SVD, NLM, BM3D, and EPLL) without any modifications, simply by applying the original software as a "black-box". We manually tuned two parameters (see the sketch below):
- $\rho$ – the signal-emphasis factor
- $\sigma$ – the noise level given as input to the denoiser, since the noise level of $\mathbf{y} + \rho\hat{\mathbf{x}}^{k}$ is higher than that of $\mathbf{y}$
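A hedged usage sketch of this tuning. Here `denoise_fn(z, sigma)` stands for any denoiser that accepts an explicit noise level (e.g., a wrapper around the BM3D or K-SVD software); the name and the separate `sigma_in` knob are illustrative, not the paper's exact recipe:

```python
def sos_boost_tuned(y, denoise_fn, sigma, rho=1.0, sigma_in=None, n_iters=8):
    """SOS with a manually tuned rho and inner noise level sigma_in (> sigma)."""
    sigma_in = sigma if sigma_in is None else sigma_in
    x = denoise_fn(y, sigma)                         # plain first pass
    for _ in range(n_iters):
        x = denoise_fn(y + rho * x, sigma_in) - rho * x
    return x
```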

Quantitative Comparison: K-SVD

Average PSNR* over 5 images (higher is better):

| σ   | Avg. PSNR original | Avg. PSNR with SOS | Avg. boosting |
|-----|--------------------|--------------------|---------------|
| 10  | 35.12              | 35.25              | 0.13          |
| 20  | 32.04              | 32.26              | 0.22          |
| 25  | 31.02              | 31.28              | 0.26          |
| 50  | 27.20              | 27.97              | 0.77          |
| 75  | 24.57              | 25.83              | 1.26          |
| 100 | 23.11              | 23.92              | 0.81          |

*PSNR $= 20\log_{10}\left(255/\sqrt{\mathrm{MSE}}\right)$
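For completeness, the PSNR figure of merit in code (the standard definition for 8-bit images, matching the footnote above):

```python
import numpy as np

def psnr(x, ref):
    """PSNR in dB for 8-bit images: 20*log10(255 / sqrt(MSE))."""
    mse = np.mean((np.asarray(x, dtype=float) - np.asarray(ref, dtype=float)) ** 2)
    return 20.0 * np.log10(255.0 / np.sqrt(mse))
```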

Quantitative Comparison: NLM

Average PSNR* over 5 images (higher is better):

| σ   | Avg. PSNR original | Avg. PSNR with SOS | Avg. boosting |
|-----|--------------------|--------------------|---------------|
| 10  | 33.97              | 34.41              | 0.44          |
| 20  | 31.10              | 31.44              | 0.34          |
| 25  | 29.93              | 30.34              | 0.41          |
| 50  | 26.43              | 26.73              | 0.30          |
| 75  | 24.11              | 24.67              | 0.56          |
| 100 | 22.67              | 23.03              | 0.36          |

*PSNR $= 20\log_{10}\left(255/\sqrt{\mathrm{MSE}}\right)$

Quantitative Comparison: BM3D

Average PSNR* over 5 images (higher is better):

| σ   | Avg. PSNR original | Avg. PSNR with SOS | Avg. boosting |
|-----|--------------------|--------------------|---------------|
| 10  | 35.40              | 35.41              | 0.01          |
| 20  | 32.56              | 32.58              | 0.02          |
| 25  | 31.58              | 31.61              | 0.03          |
| 50  | 28.51              | 28.58              | 0.07          |
| 75  | 26.58              | 26.69              | 0.11          |
| 100 | 24.99              | 25.13              | 0.14          |

*PSNR $= 20\log_{10}\left(255/\sqrt{\mathrm{MSE}}\right)$

Quantitative Comparison: EPLL

Average PSNR* over 5 images (higher is better):

| σ   | Avg. PSNR original | Avg. PSNR with SOS | Avg. boosting |
|-----|--------------------|--------------------|---------------|
| 10  | 35.02              | 35.15              | 0.13          |
| 20  | 32.01              | 32.26              | 0.25          |
| 25  | 30.97              | 31.23              | 0.26          |
| 50  | 27.74              | 28.04              | 0.30          |
| 75  | 25.74              | 26.06              | 0.32          |
| 100 | 24.20              | 24.47              | 0.27          |

*PSNR $= 20\log_{10}\left(255/\sqrt{\mathrm{MSE}}\right)$

Quantitative Comparison: Summary

Average boosting in PSNR* over 5 images (higher is better), per noise level σ and improved method:

| σ   | K-SVD | NLM  | BM3D | EPLL |
|-----|-------|------|------|------|
| 10  | 0.13  | 0.44 | 0.01 | 0.13 |
| 20  | 0.22  | 0.34 | 0.02 | 0.25 |
| 25  | 0.26  | 0.41 | 0.03 | 0.26 |
| 50  | 0.77  | 0.30 | 0.07 | 0.30 |
| 75  | 1.26  | 0.56 | 0.11 | 0.32 |
| 100 | 0.81  | 0.36 | 0.14 | 0.27 |

*PSNR $= 20\log_{10}\left(255/\sqrt{\mathrm{MSE}}\right)$

Visual Comparison: K-SVD

[Image: original K-SVD result, $\sigma = 25$, 29.06 dB]

Visual Comparison: K-SVD

[Image: SOS K-SVD result, $\sigma = 25$, 29.41 dB]

Visual Comparison: EPLL

[Images: original EPLL results, $\sigma = 25$ – Foreman 32.44 dB, House 32.07 dB]

Visual Comparison: EPLL

[Images: SOS EPLL results, $\sigma = 25$ – Foreman 32.78 dB, House 32.38 dB]

Reducing the “Local-Global” Gap

Reaching a Consensus

It turns out that SOS boosting treats a major shortcoming of many patch-based methods: the local processing of patches versus the global need for a whole denoised result. We define the local disagreement as the difference between each local, independently denoised patch and the corresponding globally averaged patch. The disagreements naturally exist, since each noisy patch is denoised independently, and they are based on the intermediate results.
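A minimal sketch of this per-patch disagreement, with illustrative names (`local_patches`, `positions` are placeholders, not from the paper's code):

```python
def disagreements(local_patches, x_global, positions):
    """q_i = p_hat_i - R_i x_hat: locally denoised patch minus the same patch
    re-extracted from the globally averaged image."""
    out = []
    for patch, (r, c) in zip(local_patches, positions):
        h, w = patch.shape
        out.append(patch - x_global[r:r + h, c:c + w])
    return out
```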

"Sharing the Disagreement"

Inspired by the "consensus and sharing" problem from game theory: several agents each aim to minimize an individual cost (representing the noisy patch sparsely), while together affecting a shared objective term that describes the overall goal (obtaining the globally denoised image). Imitating this concept, we suggest sharing the disagreements: the noisy image is broken into patches, each patch is denoised independently, the results are averaged into an estimated image, and the per-patch disagreement is fed back into the next denoising round.

Connection to SOS Boosting

Interestingly, for a fixed filter matrix $\mathbf{W}$, "sharing the disagreement" and SOS boosting are equivalent:
$$\hat{\mathbf{x}}^{k+1} = \mathbf{W}\left(\mathbf{y} + \hat{\mathbf{x}}^{k}\right) - \hat{\mathbf{x}}^{k}.$$
The connection to the SOS is far from trivial: the SOS is blind to the intermediate results (the independently denoised patches, before patch averaging), whereas those intermediate results are crucial for the "sharing the disagreement" approach. In this sense, the SOS boosting reduces the local/global gap.

Conclusions

The SOS boosting algorithm:
- Is easy to use: in practice, we treat the denoiser $f(\cdot)$ as a "black-box"
- Is applicable to a wide range of denoising algorithms
- Is guaranteed to converge for most denoising algorithms, and thus has a straightforward stopping criterion
- Reduces the local-global gap
- Improves the state-of-the-art methods

Thank You for Your Attention

Thanks to the organizer of BASP Frontiers, Yves Wiaux.