Learned Convolutional Sparse Coding

Learned Convolutional Sparse Coding Hillel Sreter & Raja Giryes ICASSP 2018

Outline
- Motivation and applications
- Brief sparse coding background
- Approximate sparse coding (ASC)
- Convolutional sparse coding (CSC)
- Approximate convolutional sparse coding (ACSC)

ACSC - real-time image processing
Sparse coding is a strong image processing tool, but sparse coding algorithms tend to be quite slow. The goal: train an efficient approximate sparse coding model. Keep the quality, drop the runtime.

ACSC - advantages
Sparse coding has been shown to work well for image processing tasks, yet it tends to be slow, requiring multiple minutes on a CPU. Approximate convolutional sparse coding cuts the runtime to a fraction while maintaining sparse coding performance: it requires less than a second on a CPU.

ACSC - Image denoising example
ACSC (0.7 sec): PSNR 29.88 dB. KSVD (70 sec): PSNR 28.67 dB. A x100 speedup compared to OMP sparse coding in KSVD.

ACSC - Document salt-and-pepper (S&P) denoising
ACSC (0.43 sec): PSNR 30.2 dB.

ACSC - Document inpainting
ACSC (0.43 sec): PSNR 30.13 dB.

Sparse coding problem setup
Given an input X and an overcomplete dictionary D, find a sparse code Z such that X = DZ:
$\min_Z \|Z\|_0 \;\text{s.t.}\; X = DZ$
Solving the above is intractable due to the $\ell_0$ norm. It can be approximated by an $\ell_1$ relaxation, which leads to the LASSO minimization problem:
$\min_Z \tfrac{1}{2}\|X - DZ\|_2^2 + \lambda\|Z\|_1$

Sparse coding setup - solving the LASSO
Use the proximal gradient technique, an iterative method known as ISTA (Iterative Shrinkage-Thresholding Algorithm):
$Z_{k+1} = h_{\lambda/L}\big(Z_k + \tfrac{1}{L} D^T (X - D Z_k)\big)$
where $h_\theta(v) = \mathrm{sign}(v)\max(|v| - \theta, 0)$ is the soft-thresholding operator and $L$ is the Lipschitz constant of the gradient. Many iterations may be required for convergence!
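For concreteness, here is a minimal NumPy sketch of ISTA for the LASSO above (the function names and the spectral-norm choice of Lipschitz constant are our illustrative assumptions, not code from the talk):

```python
import numpy as np

def soft_threshold(z, theta):
    """Elementwise soft-thresholding: the proximal operator of the l1 norm."""
    return np.sign(z) * np.maximum(np.abs(z) - theta, 0.0)

def ista(X, D, lam, n_iter=100):
    """Minimize 0.5 * ||X - D @ Z||_2^2 + lam * ||Z||_1 with ISTA.

    X: (m,) input signal; D: (m, n) overcomplete dictionary; lam: sparsity weight.
    """
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
    Z = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ Z - X)           # gradient of the data-fidelity term
        Z = soft_threshold(Z - grad / L, lam / L)
    return Z
```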

Sparse coding setup - solving the LASSO
The long convergence time of ISTA can be an issue for time-sensitive applications, so an acceleration strategy that approximates the solution is required. Learned ISTA (LISTA) replaces the linear operations and the threshold in ISTA with learned ones.

Using Learned-ISTA as a sparse coding approximation
Given a fixed number of iterations, learn the matrices S and W and the thresholds. The network has a recurrent structure and is trained by unfolding it. [Learning Fast Approximations of Sparse Coding - Karol Gregor and Yann LeCun]

LISTA training
LISTA iteration: $Z_{k+1} = h_\theta(W X + S Z_k)$.
Train using the loss function $\|Z_T(X) - Z^*(X)\|_2^2$, where $Z^*(X)$ is the sparse code of $X$ (the target code from an exact solver). [Learning Fast Approximations of Sparse Coding - Karol Gregor and Yann LeCun]
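Under those definitions, a minimal PyTorch sketch of a LISTA encoder (the class name, threshold initialization, and per-coefficient thresholds are our assumptions rather than the authors' code):

```python
import torch
import torch.nn as nn

class LISTA(nn.Module):
    """T unfolded ISTA steps with learned W, S, and thresholds theta."""

    def __init__(self, m, n, T=3):
        super().__init__()
        self.W = nn.Linear(m, n, bias=False)              # learned analogue of (1/L) D^T
        self.S = nn.Linear(n, n, bias=False)              # learned analogue of I - (1/L) D^T D
        self.theta = nn.Parameter(torch.full((n,), 0.1))  # learned per-coefficient thresholds
        self.T = T

    def soft(self, v):
        return torch.sign(v) * torch.relu(torch.abs(v) - self.theta)

    def forward(self, x):                                 # x: (batch, m)
        b = self.W(x)                                     # computed once, reused every step
        z = self.soft(b)                                  # first iteration, starting from z = 0
        for _ in range(self.T - 1):
            z = self.soft(b + self.S(z))                  # one learned ISTA step
        return z
```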

LISTA disadvantages
LISTA is a patch-based method. Therefore:
- Spatial information is lost across patch boundaries.
- It is inefficient for large patches.
- Image properties (e.g., translation invariance) are not incorporated.
Solution: use a convolutional structure.

Approximate Convolutional Sparse Coding (ACSC)
Learn a convolutional sparse coding of the whole image instead of patches. A global algorithm:
- The image is processed as a whole.
- Efficient.
- The convolutional structure inherently incorporates image properties (e.g., translation invariance).
This overcomes the disadvantages of LISTA as a patch-based method: no loss of spatial information, no inefficiency for large patches, and image properties are incorporated.

Standard Convolutional Sparse Coding (CSC)
In convolutional sparse coding we replace the regular dictionary with a convolutional one. Given an input X, solve:
$\min_{\{d_i\},\{Z_i\}} \tfrac{1}{2}\big\|X - \sum_i d_i * Z_i\big\|_2^2 + \lambda \sum_i \|Z_i\|_1$
In other words: find a set of kernels $d_i$ that reconstruct $X$ with sparse feature maps $Z_i$.

Convolutional Sparse Coding as Regular Sparse Coding
CSC is equivalent to regular sparse coding (SC) with a structured dictionary: a convolutional dictionary is a concatenation of Toeplitz matrices, one per kernel, so $\sum_i d_i * Z_i = D Z$ with $D = [D_1, \ldots, D_m]$ and $Z$ the stacked feature maps.
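To make the equivalence concrete, a tiny 1-D NumPy check (the kernel and signal length are arbitrary illustrative choices):

```python
import numpy as np
from scipy.linalg import toeplitz

# 1-D check: convolution with a kernel equals multiplication by a Toeplitz matrix.
d = np.array([1.0, -2.0, 1.0])              # an arbitrary kernel
n = 6                                       # signal length
col = np.concatenate([d, np.zeros(n - 1)])  # first column of the full-convolution matrix
row = np.zeros(n)
row[0] = d[0]                               # first row (zeros past the first entry)
D = toeplitz(col, row)                      # (n + len(d) - 1) x n Toeplitz matrix
z = np.random.randn(n)
assert np.allclose(D @ z, np.convolve(d, z))  # identical results
```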

From SC to CSC
Rewrite ISTA as a solution to the CSC problem. Since D is a concatenation of Toeplitz matrices, the matrix multiplications in the ISTA iteration can be replaced with convolutions: applying $D$ becomes convolution with the kernels $d_i$, and applying $D^T$ becomes correlation with them. This yields convolutional ISTA.
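A sketch of one way to write convolutional ISTA in PyTorch, exploiting the fact that conv_transpose2d is the adjoint of conv2d (the function name and the explicit Lipschitz argument L are our choices, not the paper's code):

```python
import torch
import torch.nn.functional as F

def conv_ista(x, d, lam, L, n_iter=50):
    """Convolutional ISTA sketch.

    x: (B, 1, H, W) image batch; d: (n_filters, 1, k, k) kernel bank;
    L: Lipschitz constant of the gradient (passed in explicitly here).
    conv2d with d plays the role of D^T; conv_transpose2d, its adjoint,
    plays the role of D.
    """
    pad = d.shape[-1] // 2
    z = torch.zeros(x.shape[0], d.shape[0], x.shape[2], x.shape[3])
    for _ in range(n_iter):
        residual = x - F.conv_transpose2d(z, d, padding=pad)  # X - d * Z
        z = z + F.conv2d(residual, d, padding=pad) / L        # gradient step
        z = torch.sign(z) * F.relu(torch.abs(z) - lam / L)    # soft-threshold
    return z
```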

From CSC to Approximate CSC
Approximate convolutional sparse coding (ACSC) replaces the convolutional operations and the threshold in convolutional ISTA with learned ones.

Approximate CSC architecture
The proposed recurrent architecture for CSC is trained with an unfolding of T steps, each step being one learned convolutional ISTA iteration (an ACSC step).

Approximate CSC with dictionary learning
Train end to end as a sparse autoencoder: the encoder is a convolutional sparse encoder (the unfolded ACSC steps), and the decoder is the convolutional sparse dictionary. In our framework both the sparse coding and the dictionary are learned; in LISTA, only the sparse coding is learned.
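A hedged PyTorch sketch of such an autoencoder (the filter count, the single mixing convolution S, and the threshold initialization are our illustrative choices; the paper's exact parameterization may differ):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ACSC(nn.Module):
    """Convolutional LISTA encoder + convolutional dictionary decoder."""

    def __init__(self, n_filters=64, kernel_size=7, T=3):
        super().__init__()
        pad = kernel_size // 2
        self.We = nn.Conv2d(1, n_filters, kernel_size, padding=pad, bias=False)
        self.S = nn.Conv2d(n_filters, n_filters, kernel_size, padding=pad, bias=False)
        self.theta = nn.Parameter(torch.full((1, n_filters, 1, 1), 0.05))
        # The decoder convolution is the learned convolutional dictionary.
        self.decoder = nn.Conv2d(n_filters, 1, kernel_size, padding=pad, bias=False)
        self.T = T

    def soft(self, v):
        return torch.sign(v) * F.relu(torch.abs(v) - self.theta)

    def forward(self, x):                      # x: (B, 1, H, W)
        b = self.We(x)                         # learned analogue of the gradient input
        z = self.soft(b)
        for _ in range(self.T - 1):
            z = self.soft(b + self.S(z))       # one learned convolutional ISTA step
        return self.decoder(z)                 # reconstruct from sparse feature maps
```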

Training the ACSC autoencoder
- Initialization is important.
- We found kernels of size 7x7 to give the best results.
- We found unfolding for 3 time steps to be sufficient.
- Use the loss function of [H. Zhao, O. Gallo, I. Frosio, and J. Kautz, "Loss functions for image restoration with neural networks"].
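A minimal sketch of a denoising training loop for the autoencoder above (the Adam optimizer, noise level, hypothetical dataloader of clean patches, and the plain MSE stand-in for the Zhao et al. loss are all our assumptions):

```python
import torch
import torch.nn.functional as F

model = ACSC()                                       # the sketch above
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for clean in dataloader:                             # hypothetical DataLoader of clean patches
    noisy = clean + 0.1 * torch.randn_like(clean)    # synthetic Gaussian noise
    loss = F.mse_loss(model(noisy), clean)           # MSE stand-in for the Zhao et al. loss
    opt.zero_grad()
    loss.backward()
    opt.step()
```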

Approximate CSC and dictionary learning
Learned convolutional dictionary vs. classical sparse dictionary: the classical approach tends to learn the same atom at many translations, whereas in a convolutional dictionary a single atom covers all translations.

Approximate CSC Tasks
Example result on a denoising task: ACSC (0.56 sec): PSNR 32.11 dB; KSVD (57 sec): PSNR 32.09 dB. A x100 speedup compared to OMP sparse coding in KSVD.

Approximate CSC Tasks
Example result on a denoising task: ACSC (0.6 sec): PSNR 30.14 dB; KSVD (64 sec): PSNR 30.05 dB. A x100 speedup compared to OMP sparse coding in KSVD.

Approximate CSC Tasks
Example result on an inpainting task: ACSC (0.57 sec): PSNR 32.3 dB.

Approximate CSC Tasks
Example result on a denoising task: ACSC (0.83 sec): PSNR 30.32 dB; KSVD (78 sec): PSNR 30.21 dB. A x100 speedup compared to OMP sparse coding in KSVD.

Approximate CSC Tasks
Example result on an inpainting task: ACSC (0.48 sec): PSNR 30.32 dB; Heide et al. (320 sec): PSNR 30.21 dB. A x650 speedup compared to Heide et al. (Fast and Flexible Convolutional Sparse Coding).

Approximate CSC Tasks
Example result on a document denoising task: ACSC (0.44 sec): PSNR 29.8 dB.

Approximate CSC Tasks
Example result on a document denoising task: ACSC (0.38 sec): PSNR 29.4 dB.

Conclusion
LISTA is good for accelerating sparse coding on patches, yet it is slow when working on images as a whole. Our proposed learned CSC processes the whole image and provides performance comparable to other sparse-coding-based techniques with up to a x100 speedup.

Thank You!