Machine Learning for Adaptive Bilateral Filtering
I. Frosio 1, K. Egiazarian 1,2, and K. Pulli 1,3
1 NVIDIA, USA   2 Tampere University of Technology, Finland   3 Light, USA
Motivation (denoising)

void denoise(float *img){
  …
  for (int y = 0; y < ys; y++){
    for (int x = 0; x < xs; x++){
      img[y*xs + x] = …
    }
  }
  …
}
Motivation (Gaussian filter)

Gaussian smoothing weighs each neighbor of x only by its spatial distance:
  h(x) = (1/k(x)) Σ_ξ t(ξ) d(ξ, x),   d(ξ, x) = exp(−‖ξ − x‖² / (2σ_d²)),
with k(x) = Σ_ξ d(ξ, x) the normalization.
Motivation (bilateral filter)

C. Tomasi and R. Manduchi, "Bilateral filtering for gray and color images," ICCV, 1998.

The bilateral filter adds a range kernel r that also weighs neighbors by intensity similarity:
  h(x) = (1/k(x)) Σ_ξ t(ξ) d(ξ, x) r(t(ξ), t(x)),
  d(ξ, x) = exp(−‖ξ − x‖² / (2σ_d²)),   r(t(ξ), t(x)) = exp(−(t(ξ) − t(x))² / (2σ_r²)).
Motivation (choice of the parameters)

The filter quality hinges on the two parameters of d and r: the spatial std σ_d and the range std σ_r.
Motivation (use intuition?)

σ_d scales with the resolution; σ_r scales with the grey-level dynamics → automatic design of the parameter values is possible.
- [BF, Tomasi and Manduchi, ICCV, 1998]
- [BF, Zhang, Gunturk, TIP, 2008]: given the image noise std σ_n, σ_d ∈ [1.5, 2.1] independently of σ_n, and σ_r = k·σ_n.
- [ABF, Qi et al., AMR, 2013]: given the local signal variance σ_s²(x,y), σ_d ∈ [1.5, 2.1] independently of σ_n, and σ_r(x,y) = σ_n² / σ_s(x,y).
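These intuition-based rules can be written down directly. A minimal sketch, with function names of our choosing; k = 2 for the Zhang rule follows the 2σ_n setting used in the results later in the talk.

```c
/* Hand-designed parameter rules from the literature (sketch). */

/* Zhang & Gunturk, TIP 2008: sigma_d fixed in [1.5, 2.1];
 * sigma_r proportional to the noise std sigma_n (here k = 2). */
float sigma_r_zhang(float sigma_n)
{
    return 2.0f * sigma_n;
}

/* Qi et al., AMR 2013 (ABF): sigma_r modulated per pixel by the
 * local signal std sigma_s(x,y). */
float sigma_r_qi(float sigma_n, float sigma_s)
{
    return sigma_n * sigma_n / sigma_s;
}
```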
Motivation (use machine learning!): learn how to set σ_d and σ_r from data.
Framework (adaptive denoising): 3 features, 6 unknowns; per-pixel parameters σ_d(x,y) and σ_r(x,y).
Framework (learning)
- Training images {t_j}, j = 1..N
- Noise model (AWGN)
- Local image features f_{x,y}
- Image quality model (PSNR)
- Adaptive bilateral filter
Entropy features

Two local features: e_i, the Shannon entropy of the intensities i(x,y), and e_g, the Shannon entropy of the gradient magnitudes ‖grad[i(x,y)]‖. Together they discriminate between flat, gradient, texture, and edge regions. [Figure: flat, gradient, texture, and edge patches with their measured entropies, ranging from 0.0 bit (flat) up to about 6.0 bits.]
Framework (complete)
- Training images {t_j}, j = 1..N
- Noise model (AWGN)
- Local image features f_{x,y}
- Image quality model (PSNR)
- Adaptive bilateral filter: logistic functions map the local features f_{x,y} to σ_d(x,y) and σ_r(x,y)
- At run time: noisy image → EABF → filtered image
Results – PSNR

            BF        BF [Zhang]   ABF [Qi]         EABF
σ_d(x,y)    optimal   1.8          1.8              ours (learned)
σ_r(x,y)    optimal   2σ_n         σ_n²/(0.3σ_s)    ours (learned)

[Table: PSNR of each method at several noise levels σ_n.]
Results – PSNR

Averaged over the noise levels (σ_n = 5, …), EABF yields an average PSNR gain of +0.51 dB. [Table: per-level PSNR values in dB.]
Results – image quality

[Figure: ground truth, noisy input, BF [Zhang], ABF [Qi], and EABF outputs, each with its PSNR in dB.]
Machine learning vs. intuition: σ_d(x,y), σ_r(x,y) at σ_n = 20

            BF [Zhang]   ABF [Qi]     EABF
σ_d(x,y)    1.8          1.8          [0.6, 2.6]
σ_r(x,y)    2σ_n         [71, 85]     [20, 110]
Machine learning vs. intuition: σ_d = σ_d(σ_n)

[Figure: learned σ_d as a function of σ_n, in a flat area and in an edge area.]
Machine learning vs. intuition: σ_r = σ_r(σ_n)

[Figure: learned σ_r as a function of σ_n, in a flat area and in an edge area.]
Conclusion

Learning optimal parameter modulation strategies through machine learning is feasible. The learned modulation strategies are complicated, but effective in terms of PSNR.
Conclusion

The framework generalizes: every component can be swapped out. Replace the training images {t_j}, j = 1..N with your training images, the AWGN noise model with your noise model, the entropy features with your local features f_{x,y}, the PSNR quality model with your image quality model Q, and the adaptive bilateral filter with a different adaptive filter.
Conclusion A general FRAMEWORK based on MACHINE LEARNING for the development of ADAPTIVE FILTERS