K-SVD Dictionary-Learning for Analysis Sparse Models

K-SVD Dictionary-Learning for Analysis Sparse Models
Michael Elad, The Computer Science Department, The Technion – Israel Institute of Technology, Haifa 32000, Israel
* Joint work with Ron Rubinstein, Remi Gribonval, Mark Plumbley, Mike Davies, Sangnam Nam, Boaz Ophir, and Nancy Bertin
SPARS11 Workshop: Signal Processing with Adaptive Sparse Structured Representations, June 27-30, 2011, Edinburgh (Scotland, UK)

Part I – Background: Recalling the Synthesis Model and the K-SVD

The Synthesis Model – Basics
The synthesis model assumes $x = D\alpha$, with a dictionary D of size d×m (m ≥ d), and the synthesis representation α is expected to be sparse: $\|\alpha\|_0 = k \ll d$. Adopting a Bayesian point of view:
- Draw the support T (|T| = k) at random;
- Choose the non-zero coefficients randomly (e.g. iid Gaussians);
- Multiply by D to get the synthesis signal.
Such synthesis signals belong to a Union-of-Subspaces (UoS): this union contains $\binom{m}{k}$ subspaces, each of dimension k.
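
As a concrete illustration of this generative view, here is a minimal numpy sketch; the dimensions d, m and sparsity k are arbitrary choices for the example, not values from the talk:

```python
import numpy as np

rng = np.random.default_rng(0)
d, m, k = 20, 30, 3                       # signal dim, #atoms, sparsity (assumed)

D = rng.standard_normal((d, m))
D /= np.linalg.norm(D, axis=0)            # dictionary with normalized atoms

T = rng.choice(m, size=k, replace=False)  # draw the support at random
alpha = np.zeros(m)
alpha[T] = rng.standard_normal(k)         # iid Gaussian non-zero coefficients
x = D @ alpha                             # the resulting Sparse-land signal
```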

The Synthesis Model – Pursuit
Fundamental problem: given the noisy measurements $y = x + v$, recover the clean signal x – this is a denoising task. It can be posed as
$$\hat{\alpha} = \arg\min_{\alpha} \|y - D\alpha\|_2^2 \ \text{s.t.}\ \|\alpha\|_0 \le k, \qquad \hat{x} = D\hat{\alpha}.$$
While this is an NP-hard problem, an approximate solution can be obtained by pursuit algorithms:
- Replacing L0 with L1 (Basis Pursuit);
- Greedy methods (MP, OMP, LS-OMP);
- Hybrid methods (IHT, SP, CoSaMP).
Theoretical studies provide various guarantees for the success of these techniques, typically depending on k and on properties of D.
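
For instance, a bare-bones OMP (one of the greedy options above) can be sketched as follows; this is an illustrative implementation, not the exact variant discussed in the talk:

```python
import numpy as np

def omp(D, y, k):
    """Greedy pursuit sketch: pick k atoms one at a time,
    re-fitting the coefficients by least-squares on the support."""
    m = D.shape[1]
    support, residual = [], y.copy()
    coef = np.zeros(0)
    for _ in range(k):
        j = int(np.argmax(np.abs(D.T @ residual)))  # most correlated atom
        support.append(j)
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef
    alpha = np.zeros(m)
    alpha[support] = coef
    return alpha

# Denoising: given noisy y, the estimate is x_hat = D @ omp(D, y, k).
```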

The Synthesis Model – Dictionary Learning
[Field & Olshausen ('96), Engan et al. ('99), …, Gribonval et al. ('04), Aharon et al. ('04)]
Given a set of examples $X = [x_1, \dots, x_N]$, the goal is a dictionary D such that each example is a linear combination of atoms from D, i.e. each example has a sparse representation with no more than k atoms:
$$\min_{D,A}\ \|X - DA\|_F^2 \ \text{s.t.}\ \|\alpha_j\|_0 \le k,\ \ j = 1, \dots, N.$$

The Synthesis Model – K-SVD [Aharon, Elad & Bruckstein ('04)]
- Initialize D, e.g. by choosing a subset of the examples.
- Sparse Coding: use OMP or BP to compute the representations of the examples Y.
- Dictionary Update: update D column-by-column by an SVD computation.
Recall: the dictionary-update stage in the K-SVD is done one atom at a time, updating it using ONLY those examples that use it, while fixing the non-zero supports.
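
The atom-update step just described can be sketched as follows; this is a simplified rendition of the K-SVD update, assuming X holds the training examples and A their current sparse representations:

```python
import numpy as np

def ksvd_atom_update(D, X, A, j):
    """Update atom j of D and row j of A via a rank-1 SVD fit,
    using only the examples whose representation uses atom j."""
    users = np.nonzero(A[j, :])[0]       # examples that use atom j
    if users.size == 0:
        return
    # Residual without atom j's contribution, restricted to those examples
    E = X[:, users] - D @ A[:, users] + np.outer(D[:, j], A[j, users])
    U, s, Vt = np.linalg.svd(E, full_matrices=False)
    D[:, j] = U[:, 0]                    # best rank-1 fit: the new atom ...
    A[j, users] = s[0] * Vt[0, :]        # ... and its fixed-support coefficients
```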

Part II – Analysis: The Basics of the Analysis Model
S. Nam, M.E. Davies, M. Elad and R. Gribonval, "Co-sparse Analysis Modeling – Uniqueness and Algorithms", ICASSP, May 2011.
S. Nam, M.E. Davies, M. Elad and R. Gribonval, "The Co-sparse Analysis Model and Algorithms", submitted to ACHA, June 2011.

The Analysis Model – Basics
The analysis dictionary Ω is of size p×d (p ≥ d), and the analysis representation $z = \Omega x$ is expected to be sparse.
- Co-sparsity: ℓ – the number of zeros in z.
- Co-support: Λ – the set of rows of Ω that are orthogonal to x.
If Ω is in general position, then x cannot be orthogonal to d or more of its rows, so ℓ ≤ d−1, and thus we cannot expect to get a truly sparse analysis representation – is this a problem? No! Notice that in this model we put an emphasis on the zeros in the analysis representation z, rather than the non-zeros. In particular, the values of the non-zeros in z are not important for characterizing the signal.
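
In code, extracting the co-support and co-sparsity of a signal is straightforward; a small helper sketch (the tolerance is an arbitrary choice):

```python
import numpy as np

def cosupport(Omega, x, tol=1e-10):
    """Return the co-support (rows of Omega orthogonal to x)
    and the co-sparsity (the number of zeros in z = Omega @ x)."""
    z = Omega @ x
    Lambda = np.nonzero(np.abs(z) < tol)[0]
    return Lambda, Lambda.size
```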

The Analysis Model – Bayesian View
Analysis signals, just like synthesis ones, can be generated in a systematic way:
- Synthesis signals – Support: choose the support T (|T| = k) at random; Coefficients: choose $\alpha_T$ at random; Generate: synthesize by $x = D_T \alpha_T$.
- Analysis signals – Support: choose the co-support Λ (|Λ| = ℓ) at random; Coefficients: choose a random vector v; Generate: orthogonalize v w.r.t. $\Omega_\Lambda$.
Bottom line: an analysis signal x satisfies $\Omega_\Lambda x = 0$.
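
The analysis-side generation translates directly to code: pick ℓ rows at random and project a random vector onto their joint null-space. A sketch with assumed dimensions:

```python
import numpy as np

rng = np.random.default_rng(1)
d, p, ell = 20, 30, 6                            # assumed dimensions / co-sparsity

Omega = rng.standard_normal((p, d))
Lambda = rng.choice(p, size=ell, replace=False)  # random co-support

v = rng.standard_normal(d)                       # random vector ...
O_L = Omega[Lambda, :]
x = v - np.linalg.pinv(O_L) @ (O_L @ v)          # ... orthogonalized w.r.t. Omega_Lambda

assert np.allclose(O_L @ x, 0)                   # Omega_Lambda @ x = 0, as required
```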

The Analysis Model – Union of Subspaces
Analysis signals, just like synthesis ones, belong to a union of subspaces:
- Synthesis signals: subspace dimension k; $\binom{m}{k}$ subspaces; each is the span of the k chosen atoms from D.
- Analysis signals: subspace dimension d−ℓ; $\binom{p}{\ell}$ subspaces; each is the null-space of the ℓ chosen rows $\Omega_\Lambda$.
Example, for p = m = 2d:
- Synthesis with k = 1 (one atom): there are 2d subspaces of dimension 1.
- Analysis with ℓ = d−1: there are $\binom{2d}{d-1} \gg 2d$ subspaces of dimension 1.
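
A quick sanity check of this counting argument (d = 4 is an arbitrary choice):

```python
from math import comb

d = 4
m = p = 2 * d
print(comb(m, 1))       # synthesis, k = 1: 2d = 8 subspaces
print(comb(p, d - 1))   # analysis, ell = d-1: C(8, 3) = 56 subspaces
```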

The Analysis Model – Pursuit
Fundamental problem: given the noisy measurements $y = x + v$, recover the clean signal x – again a denoising task. This goal can be posed as
$$\hat{x} = \arg\min_x \|x - y\|_2^2 \ \text{s.t.}\ \|\Omega x\|_0 \le p - \ell.$$
This is an NP-hard problem, just as in the synthesis case (and even harder!!!). We can approximate its solution by:
- Replacing L0 with L1 (BP-analysis);
- Greedy methods (OMP, …);
- Hybrid methods (IHT, SP, CoSaMP, …).
Theoretical studies should provide guarantees for the success of these techniques, typically depending on the co-sparsity and on properties of Ω.

The Analysis Model – Backward Greedy (BG)
BG finds one row at a time from Ω, approximating the solution of the above pursuit problem: starting from $\hat{x} = y$, it repeatedly selects the row with the smallest inner product with the current estimate, adds it to the co-support, and re-projects y onto the orthogonal complement of the accumulated rows, stopping once ℓ rows are found.
Variations and improvements:
- Gram-Schmidt applied to the accumulated rows speeds up the algorithm.
- An exhaustive alternative, xBG, can be used: per each candidate row we test the decay in the projection energy, and choose the smallest of them as the next row.
- One could think of a forward alternative that detects the non-zero rows (GAP) – talk with Sangnam.
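
A naive rendition of the BG idea (without the Gram-Schmidt speedup, and projecting from scratch at every step) might look like this; treat it as an illustrative sketch rather than the exact algorithm from the paper:

```python
import numpy as np

def backward_greedy(Omega, y, ell):
    """BG sketch: grow the co-support one row at a time, each time picking
    the row most nearly orthogonal to the current estimate, then
    re-projecting y onto the null-space of the accumulated rows."""
    Lambda, x = [], y.copy()
    for _ in range(ell):
        scores = np.abs(Omega @ x)
        scores[Lambda] = np.inf                  # never re-pick a chosen row
        Lambda.append(int(np.argmin(scores)))    # row with smallest |w^T x|
        O_L = Omega[Lambda, :]
        x = y - np.linalg.pinv(O_L) @ (O_L @ y)  # project y off the chosen rows
    return x, Lambda
```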

The Analysis Model – Low-Spark Ω
What if spark(Ωᵀ) ≪ d? For example: a TV-like operator for image patches of size 6×6 pixels (Ω size is 72×36). Here are analysis signals generated for a co-sparsity (ℓ) of 32; their true co-sparsity is higher – see graph:
[Figure: histogram of the true co-sparsity (x-axis: co-sparsity, 10–80; y-axis: number of signals, up to 800).]
In such a case we may consider co-sparsity levels beyond the d−1 bound.
More info: S. Nam, M.E. Davies, M. Elad and R. Gribonval.

The Analysis Model – Low-Spark Ω – Pursuit
An example – performance of BG (and xBG) for these TV-like signals: 1000 signal examples, SNR = 25.
[Figure: accuracy of the recovered co-support and the denoising performance, for BG and xBG.]

The Analysis Model – Summary
The analysis and the synthesis models are similar, and yet very different. The two align for p = m = d: the non-redundant case. Just as for synthesis, we should work on:
- Pursuit algorithms (of all kinds) – design;
- Pursuit algorithms (of all kinds) – theoretical study;
- Dictionary learning from example signals;
- Applications …
Our experience with the analysis model: the theoretical study is harder, and different applications should be considered.

Part III – Dictionaries: Analysis Dictionary-Learning by a K-SVD-Like Algorithm
B. Ophir, M. Elad, N. Bertin and M.D. Plumbley, "Sequential Minimal Eigenvalues – An Approach to Analysis Dictionary Learning", EUSIPCO, August 2011.
R. Rubinstein and M. Elad, "The Co-sparse Analysis Model and Algorithms", to be submitted (very) soon to IEEE-TSP.

Analysis Dictionary Learning – The Signals
We are given a set of N contaminated (noisy) analysis signals, $Y = X + V$, and our goal is to recover their analysis dictionary Ω.

Analysis Dictionary Learning – Goal
- Synthesis: $\min_{D,A}\ \|Y - DA\|_F^2 \ \text{s.t.}\ \|\alpha_j\|_0 \le k \ \forall j$
- Analysis: $\min_{\Omega,X}\ \|Y - X\|_F^2 \ \text{s.t.}\ \|\Omega x_j\|_0 \le p - \ell \ \forall j$
We shall adopt an approach similar to the K-SVD for approximating the minimization of the analysis goal.

Assuming that  is fixed, we aim at updating X Analysis Dictionary – Sparse-Coding Assuming that  is fixed, we aim at updating X These are N separate analysis-pursuit problems. We suggest to use the BG or the xBG algorithms. K-SVD Dictionary-Learning for Analysis Sparse Models By: Michael Elad

Analysis Dictionary – Dictionary Update (1)
Assuming that X has been updated (and thus the co-supports Λⱼ are known), we now aim at updating one row (say, $w_k^T$) of Ω. We use only the signals $S_k$ that are found orthogonal to $w_k$. The requirements:
- Each example should keep its co-support Λⱼ\k;
- Each of the chosen examples should be orthogonal to the new row $w_k$;
- Avoid the trivial solution $w_k = 0$.

Analysis Dictionary – Dictionary Update (2)
The problem we have defined is too hard to handle. Intuitively, and in the spirit of the K-SVD, we could suggest the following alternative: choose the new row $w_k$ as the normalized direction that is as orthogonal as possible to the noisy examples in $S_k$, dropping the remaining co-support constraints.

Analysis Dictionary – Dictionary Update (3)
This alternative lacks one of the forces on $w_k$ that the original problem had. A better approximation for our original problem restores it, and the obtained problem is a simple rank-1 approximation problem, easily solved by an SVD.
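
The SVD solution of such a row update can be sketched as follows: the new row is the unit direction least correlated with the signals in $S_k$, i.e. the right singular vector of $Y_{S_k}^T$ with the smallest singular value. An illustrative sketch, not the paper's exact formulation:

```python
import numpy as np

def update_row(Omega, Y, k, S_k):
    """Replace row k of Omega by the unit vector most orthogonal
    to the examples in S_k (the smallest singular direction)."""
    _, _, Vt = np.linalg.svd(Y[:, S_k].T)   # full SVD: Vt is d x d
    Omega[k, :] = Vt[-1, :]                 # direction of the smallest singular value
    return Omega
```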

Analysis Dictionary Learning – Results (1)
Synthetic experiment #1: TV-like Ω.
- We generate 30,000 TV-like signals of the same kind described before (Ω: 72×36, ℓ = 32).
- We apply 300 iterations of the Analysis K-SVD with BG (fixed ℓ), and then 5 more using xBG.
- Initialization: each initial row is taken orthogonal to a randomly chosen set of 35 examples.
- Additive noise: SNR = 25. A row is declared detected if it is close enough to a row of the true Ω.
[Figure: relative number of recovered rows [%] vs. iteration (300 iterations).]
Even though we have not identified Ω completely (~92% this time), we got an alternative feasible analysis dictionary with the same number of zeros per example, and a residual error within the noise level.

Analysis Dictionary Learning – Results (1), continued
Synthetic experiment #1: TV-like Ω.
[Figure: the original analysis dictionary vs. the learned analysis dictionary.]

Analysis Dictionary Learning – Results (2)
Synthetic experiment #2: random Ω.
- Very similar to the above, but with a random (full-spark) analysis dictionary Ω of size 72×36.
- Experiment setup and parameters: the very same as above.
[Figure: relative number of recovered rows [%] vs. iteration (300 iterations).]
In both experiments, replacing BG by xBG leads to a consistent descent in the relative error, and better recovery results; however, the run-time is ~50 times longer. As in the previous example, even though we have not identified Ω completely (~80% this time), we got an alternative feasible analysis dictionary with the same number of zeros per example, and a residual error within the noise level.

Analysis Dictionary Learning – Results (3)
Experiment #3: a piece-wise constant image. We take 10,000 patches (+noise, σ = 5) to train on. Here is what we got:
[Figure: the original image, the patches used for training, the initial Ω, and the trained Ω (100 iterations).]

Analysis Dictionary Learning – Results (4)
Experiment #4: the image "House". We take 10,000 patches (+noise, σ = 10) to train on. Here is what we got:
[Figure: the original image, the patches used for training, the initial Ω, and the trained Ω (100 iterations).]

Part IV – We Are Done: Summary and Conclusions

Today We Have Seen that …
Sparsity and redundancy are practiced mostly in the context of the synthesis model. Is there any other way? Yes, the analysis model is a very appealing (and different) alternative, worth looking at. In the past few years there is a growing interest in better defining this model, suggesting pursuit methods, analyzing them, etc. So, what to do? What about dictionary learning? We propose new algorithms (e.g. K-SVD-like) for this task. The next step is applications that will benefit from this.
More on these (including the slides and the relevant papers) can be found at http://www.cs.technion.ac.il/~elad