Submodular Dictionary Selection for Sparse Representation
Volkan Cevher, Laboratory for Information and Inference Systems (LIONS)

Submodular Dictionary Selection for Sparse Representation
Volkan Cevher, Laboratory for Information and Inference Systems (LIONS) & Idiap Research Institute
Joint work with Andreas Krause, Learning and Adaptive Systems Group

Sparse Representations
Applications du jour: compressive sensing, inpainting, denoising, deblurring, …

Sparse Representations
Which dictionary should we use?

Which dictionary should we use? Suppose we have training data. This talk: learning D from the data.

Existing Solutions
Dictionary design:
– functional space assumptions (C2, Besov, Sobolev, …); e.g., natural images ↔ smooth regions + edges
– induced norms / designed bases & frames; e.g., wavelets, curvelets, etc.
Dictionary learning:
– regularization
– clustering: identify clusters in the data
– Bayesian non-parametric approaches; e.g., Indian buffet processes

Dictionary Selection Problem (DiSP)
Given:
– candidate columns V
– sparsity k and dictionary size n
– training data
Choose D ⊆ V to maximize variance reduction:
– reconstruction accuracy for each training signal
– variance reduction for the s-th signal
– overall variance reduction F(D)
Want to solve: find the size-n dictionary D* ⊆ V maximizing F(D).
DiSP combines dictionary design and dictionary learning.
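The formulas elided from this slide likely follow the standard DiSP formulation; the following is a reconstruction from context, with the symbols y_s (the s-th training signal) and w (sparse coefficient vector) chosen here for illustration:

```latex
% Reconstruction accuracy of y_s using at most k columns of D:
L_s(D) \;=\; \min_{\substack{D' \subseteq D \\ |D'| \le k}} \; \min_{w} \; \lVert y_s - D' w \rVert_2^2
% Variance reduction for the s-th signal, and overall:
F_s(D) \;=\; \lVert y_s \rVert_2^2 - L_s(D),
\qquad
F(D) \;=\; \sum_{s} F_s(D)
% Dictionary selection problem:
D^\ast \;=\; \operatorname*{argmax}_{D \subseteq V,\; |D| \le n} \; F(D)
```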

Combinatorial Challenges in DiSP
1. Evaluating F(D): a sparse reconstruction problem!
2. Finding D*: NP-hard!

Combinatorial Challenges in DiSP
1. Evaluating F(D): a sparse reconstruction problem!
2. Finding D*: NP-hard!
Key observations:
– F(D) is approximately submodular
– submodularity ↔ efficient algorithms with provable guarantees

(Approximate) Submodularity
A set function F is submodular if adding a column v to a small set D' gives at least as large an improvement as adding it to any superset D ⊇ D' (diminishing returns). F is approximately submodular if this holds up to a small slack.
[Figure: adding v to the small dictionary D' yields a large improvement; adding v to the larger dictionary D yields a small improvement.]
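In symbols, the diminishing-returns picture above corresponds to the standard definitions (ε denotes the approximation slack):

```latex
% F is submodular if, for all D' \subseteq D and v \notin D:
F(D' \cup \{v\}) - F(D') \;\ge\; F(D \cup \{v\}) - F(D)
% F is \varepsilon-approximately submodular if the inequality holds up to \varepsilon:
F(D' \cup \{v\}) - F(D') \;\ge\; F(D \cup \{v\}) - F(D) - \varepsilon
```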

A Greedy Algorithm for Submodular Optimization
Greedy algorithm:
– Start with D = ∅
– For i = 1:n do: choose the column v with the largest marginal gain F(D ∪ {v}) − F(D); set D ← D ∪ {v}
This greedy algorithm empirically performs surprisingly well!
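A minimal sketch of this greedy scheme, assuming F is available as a black-box set function (the names and the toy objective below are illustrative, not from the talk):

```python
def greedy_select(F, candidates, n):
    """Greedily pick up to n elements to maximize the set function F.

    Starts from the empty set and repeatedly adds the candidate with
    the largest marginal gain F(D | {v}) - F(D).
    """
    D = set()
    for _ in range(n):
        best_v, best_gain = None, float("-inf")
        for v in candidates - D:
            gain = F(D | {v}) - F(D)
            if gain > best_gain:
                best_v, best_gain = v, gain
        if best_v is None:
            break
        D.add(best_v)
    return D

# Toy check with a modular (hence submodular) objective:
weights = {"a": 3.0, "b": 1.0, "c": 2.0}
F = lambda S: sum(weights[v] for v in S)
selected = greedy_select(F, set(weights), 2)  # picks the two heaviest: a and c
```

For a truly submodular F, lazy evaluations can avoid recomputing most marginal gains, but the plain loop above already shows the structure of the method.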

Submodularity and the Greedy Algorithm
Theorem [Nemhauser et al., '78]: for the greedy solution A_G, it holds that F(A_G) ≥ (1 − 1/e) max_{|A| ≤ k} F(A).
Krause et al. '08: for approximately submodular F, the guarantee degrades only by an additive term proportional to the slack ε.
Key question: how can we show that F is approximately submodular?

A Sufficient Condition: Incoherence
Incoherence of columns: μ = maximum absolute inner product between distinct candidate columns.
Define a modular approximation of F that scores each column independently.
Proposition: the modular approximation is (sub)modular! Furthermore, it is close to F whenever μ is small.
Thus F is approximately submodular!
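Assuming unit-norm candidate columns, the two elided definitions on this slide are plausibly the following (a hedged reconstruction from context, not verbatim from the slides):

```latex
% Incoherence of the candidate columns:
\mu \;=\; \max_{\substack{u, v \in V \\ u \neq v}} \; \lvert \langle u, v \rangle \rvert
% Surrogate objective: score each column independently against each signal,
% keeping, per signal, its k best-aligned columns:
\widehat{F}(D) \;=\; \sum_{s} \; \max_{\substack{D' \subseteq D \\ |D'| \le k}} \; \sum_{v \in D'} \langle y_s, v \rangle^2
```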

An Algorithm for DiSP: SDS_OMP (Sparsifying Dictionary Selection)
Algorithm SDS_OMP:
– Use Orthogonal Matching Pursuit (OMP) to evaluate F for fixed D
– Use the greedy algorithm to select the columns of D
Theorem: SDS_OMP will produce a dictionary such that …
Need n and k to be much less than d.
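Orthogonal Matching Pursuit itself can be sketched in a few lines of NumPy; this is an illustrative implementation (assuming unit-norm columns), not the authors' code:

```python
import numpy as np

def omp(y, D, k):
    """Greedy k-sparse approximation of y over the columns of D.

    At each step, select the column most correlated with the residual,
    then re-fit y on all selected columns by least squares.
    """
    support, residual = [], y.copy()
    coef = np.zeros(0)
    for _ in range(k):
        j = int(np.argmax(np.abs(D.T @ residual)))
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef
    return support, coef

# Toy check: y is exactly 1-sparse in an orthonormal dictionary.
D = np.eye(4)
y = 2.0 * D[:, 1]
support, coef = omp(y, D, 1)
```

Within SDS_OMP, the squared norm of the residual returned by such a routine gives the reconstruction accuracy term, and hence the variance-reduction value of the current candidate dictionary.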

Improved Guarantees: SDS_MA
Algorithm SDS_MA:
– Optimize the modular approximation instead of F
– Use the greedy algorithm to select the columns of D
Theorem: SDS_MA will produce a dictionary such that …
SDS_MA: faster + better guarantees. SDS_OMP: empirically performs better (in particular when μ is small).

Improved Guarantees for both SDS_MA & SDS_OMP
Replace incoherence with a restricted singular value condition.
Theorem [Das & Kempe '11]: SDS_MA will produce a dictionary such that …
This explains the good performance in practice.

Experiment: Finding a Basis in a Haystack
V: union of 8 orthonormal, 64-dimensional bases (Discrete Cosine Transform, Haar, Daubechies 4 & 8, Coiflets 1, 3, 5, and Discrete Meyer)
– Pick a dictionary of size n = 64 at random
– Generate 100 sparse (k = 5) signals at random
– Use SDS to pick dictionaries of increasing size
– Evaluate: fraction of correctly recovered columns; variance reduction

Reconstruction Performance
SDS_OMP: perfect reconstruction accuracy. SDS_MA: comparable variance reduction.

Battle of Bases on Natural Image Patches
Seek a dictionary among existing bases: discrete cosine transform (DCT), wavelets (Haar, Daub4), Coiflets, noiselets, and Gabor (frame).
SDS_OMP prefers DCT + Gabor; SDS_MA chooses Gabor (predominantly).
The optimized dictionary improves compression.

Sparsity on Average
hard sparsity → group sparsity
cardinality constraint → matroid constraint

Conclusions
– Dictionary Selection: a new problem combining dictionary learning + dictionary design
– Incoherence / restricted singular value ↔ approximate / multiplicative submodularity
– Two algorithms, SDS_MA and SDS_OMP, with guarantees
– Extensions to structured sparsity in ICML 2010 / J-STSP 2011
– A novel connection between sparsity and submodularity

Experiment: Inpainting
Dictionary selection from dimensionality-reduced measurements:
– Take Barbara with 50% of the pixels missing at random
– Partition the image into 8×8 patches
– Optimize the dictionary based on the observed pixel values
– "Inpaint" the missing pixels via sparse reconstruction
Caveat: this is not a representation problem.

Results: Inpainting
Comparable to state-of-the-art nonlocal TV, but faster!