
Sparse and low-rank recovery problems in signal processing and machine learning
Jeremy Watt and Aggelos Katsaggelos, Northwestern University, Department of EECS

Part 2: Quick and dirty optimization techniques

Big picture – a story of 2's

2 excellent greedy algorithms (narrow in problem type, broad in scale):
- Sparse least squares – OMP
- Dictionary learning – K-SVD

2 common smooth reformulations (broad in problem type, narrow in scale):
- The positive/negative split
- The epigraph trick

The greedy approaches provide large-scale solutions to specific problems; the reformulations provide small-to-medium-scale solutions for a wider array of sparse and low-rank problems. Knowing how to rewrite/reformulate problems is half the battle in optimization.

Greedy methods vs. smooth reformulations:

Greedy approaches to sparse least squares problems: Orthogonal Matching Pursuit

Models for small and sparse recovery: combinatorially difficult
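In standard notation (A the dictionary/measurement matrix, b the observation), the model in question is the ℓ0 problem:

```latex
\min_{x} \; \|x\|_0 \quad \text{subject to} \quad Ax = b
```

Solving this exactly requires a search over all candidate supports, hence the combinatorial difficulty.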

Orthogonal Matching Pursuit: a greedy method for approximately solving min_x ||Ax − b||_2 subject to ||x||_0 ≤ k, or min_x ||x||_0 subject to ||Ax − b||_2 ≤ ε.

Orthogonal Matching Pursuit (OMP)
- Intuitive algorithm [1]
- Effective in applications
- Extremely efficient
- Good theoretical recovery guarantees [2]
- CoSaMP is another excellent greedy algorithm for the problem [3]
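As a concrete reference, here is a minimal NumPy sketch of OMP. This is our own illustration, not the authors' code; the stopping tolerance and the assumption of unit-norm columns are choices we have made:

```python
import numpy as np

def omp(A, b, k):
    """Orthogonal Matching Pursuit: greedily build a k-sparse x with Ax ~= b.

    A: (m, n) dictionary with (ideally) unit-norm columns
    b: (m,) observation
    k: target sparsity
    """
    m, n = A.shape
    residual = b.copy()
    support = []
    x = np.zeros(n)
    for _ in range(k):
        # Pick the column most correlated with the current residual.
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        # Re-fit the coefficients by least squares on the current support
        # (this is the "orthogonal" part of OMP).
        coef, *_ = np.linalg.lstsq(A[:, support], b, rcond=None)
        x[:] = 0.0
        x[support] = coef
        residual = b - A @ x
        if np.linalg.norm(residual) < 1e-10:  # already an exact fit
            break
    return x
```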

K-SVD: a greedy algorithm for dictionary learning
- K-SVD is very effective, and uses OMP in its X-step (the sparse coding step)
- Has been applied to many image processing tasks [3]
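For orientation, a minimal sketch of one K-SVD sweep, reusing the `omp` routine above. This is a simplification of Aharon et al. [3]; the handling of unused atoms and the per-signal sparsity level `k` are our assumptions:

```python
import numpy as np

def ksvd_step(D, Y, k):
    """One K-SVD sweep: sparse-code Y over D with OMP, then update each atom.

    D: (m, K) dictionary, Y: (m, N) training signals, k: per-signal sparsity.
    """
    m, K = D.shape
    # Sparse coding (the X-step): column-by-column OMP.
    X = np.column_stack([omp(D, Y[:, i], k) for i in range(Y.shape[1])])
    # Dictionary update: refit one atom at a time via a rank-1 SVD.
    for j in range(K):
        users = np.nonzero(X[j, :])[0]          # signals that use atom j
        if users.size == 0:
            continue                            # atom unused; leave it alone
        X[j, users] = 0.0
        E = Y[:, users] - D @ X[:, users]       # residual without atom j
        U, s, Vt = np.linalg.svd(E, full_matrices=False)
        D[:, j] = U[:, 0]                       # best rank-1 atom for E
        X[j, users] = s[0] * Vt[0, :]           # matching coefficients
    return D, X
```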

Smooth reformulation tricks for sparse problems: the positive/negative split [4, 5]

Basis pursuit (e.g. compressive sensing)
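The basis pursuit problem, in its standard form:

```latex
\min_{x} \; \|x\|_1 \quad \text{subject to} \quad Ax = b
```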

Pos/Neg decomposition
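Spelled out, the decomposition replaces x by its positive and negative parts, which makes the ℓ1 norm linear:

```latex
x = x^{+} - x^{-}, \qquad x^{+}, x^{-} \geq 0, \qquad \|x\|_1 = \mathbf{1}^{T}\!\left(x^{+} + x^{-}\right)
```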

Basis pursuit (e.g. compressive sensing), after the pos/neg split. Note that the dimension of the space has doubled – but now we solve a standard LP, for which many solvers are freely available online.

Basis pursuit as a standard Linear Program: linear objective, linear constraints. If you don't know how to solve LPs, there are many standard toolboxes available – CVX being the easiest to use.

Basis pursuit – phrased as an LP via the pos/neg split.
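As a concrete check, a minimal SciPy sketch of the split LP (our own illustration, not from the slides; `linprog`'s bounds enforce x⁺, x⁻ ≥ 0 directly):

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit_lp(A, b):
    """Solve min ||x||_1 s.t. Ax = b as an LP via the pos/neg split."""
    m, n = A.shape
    c = np.ones(2 * n)                 # objective: 1^T (x+ ; x-)
    A_eq = np.hstack([A, -A])          # A x+ - A x- = b
    res = linprog(c, A_eq=A_eq, b_eq=b, bounds=(0, None))
    z = res.x
    return z[:n] - z[n:]               # recover x = x+ - x-
```

The split only doubles the variable count, so any off-the-shelf LP solver recovers the basis pursuit solution.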

The lasso as a standard Quadratic Program, via the same pos/neg split. Again the dimension of the space has doubled – but now we solve a standard QP, for which many solvers are available.
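Under the split, the lasso's reformulation reads as follows (a standard form; λ denotes the regularization weight):

```latex
\min_{x^{+},\, x^{-} \geq 0} \;\; \tfrac{1}{2}\,\bigl\|A(x^{+} - x^{-}) - b\bigr\|_2^2 \;+\; \lambda\, \mathbf{1}^{T}\!\left(x^{+} + x^{-}\right)
```

The objective is quadratic and the constraints are simple nonnegativity bounds, which is exactly the standard QP template.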

Smooth reformulation tricks for sparse problems: the epigraph trick [6]

Absolute deviations, phrased as a standard Quadratic Program via the epigraph trick. The dimension of the space has doubled – but we again arrive at a standard problem with many solvers available.

Absolute deviations (continued) – the same doubling of dimension applies, and the result is a standard LP.

Absolute deviations via the epigraph trick: another standard LP.
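Written out for the least absolute deviations objective min_x ||Ax − b||_1 (a standard instance of the trick), the epigraph trick bounds each residual by a new variable t, making the objective linear:

```latex
\min_{x,\, t} \; \mathbf{1}^{T} t \quad \text{subject to} \quad -t \;\leq\; Ax - b \;\leq\; t
```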

“Medium-sized problems”
- Many solvers are available online for these reformulations; CVX [11] is highly recommended for MATLAB users
- Most reformulations are solved via interior-point methods [9]
- Practically limited to tens of thousands of variables
- Some nice extensions for basic sparse recovery problems have been developed [6]
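For readers working in Python rather than MATLAB, CVXPY offers the same disciplined-convex-programming style as CVX (CVXPY is our substitution, not something the slides mention); a minimal sketch for basis pursuit:

```python
import cvxpy as cp

def basis_pursuit_cvxpy(A, b):
    """Basis pursuit via disciplined convex programming (analogous to CVX)."""
    n = A.shape[1]
    x = cp.Variable(n)
    prob = cp.Problem(cp.Minimize(cp.norm1(x)), [A @ x == b])
    prob.solve()  # the modeling layer performs the LP reformulation for you
    return x.value
```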

Final thought: nuclear norm reformulations work analogously, as do reformulations as Second-Order Cone Programs (SOCPs) [7, 10].
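The analogy rests on the nuclear norm's semidefinite representation (see Liu and Vandenberghe [8]), which is what lets generic conic solvers handle low-rank problems:

```latex
\|X\|_{*} \;=\; \min_{W_1,\, W_2} \; \tfrac{1}{2}\bigl(\operatorname{tr} W_1 + \operatorname{tr} W_2\bigr) \quad \text{subject to} \quad \begin{bmatrix} W_1 & X \\ X^{T} & W_2 \end{bmatrix} \succeq 0
```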

References
1. Tropp, Joel A., and Anna C. Gilbert. "Signal recovery from random measurements via orthogonal matching pursuit." IEEE Transactions on Information Theory 53.12 (2007): 4655-4666.
2. Needell, Deanna, and Joel A. Tropp. "CoSaMP: Iterative signal recovery from incomplete and inaccurate samples." Applied and Computational Harmonic Analysis 26.3 (2009): 301-321.
3. Aharon, Michal, Michael Elad, and Alfred Bruckstein. "K-SVD: An algorithm for designing overcomplete dictionaries for sparse representation." IEEE Transactions on Signal Processing 54.11 (2006): 4311-4322.
4. Figueiredo, Mário A. T., Robert D. Nowak, and Stephen J. Wright. "Gradient projection for sparse reconstruction: Application to compressed sensing and other inverse problems." IEEE Journal of Selected Topics in Signal Processing 1.4 (2007): 586-597.
5. Tibshirani, Robert, et al. "Sparsity and smoothness via the fused lasso." Journal of the Royal Statistical Society: Series B (Statistical Methodology) 67.1 (2005): 91-108.
6. Candès, Emmanuel J., Michael B. Wakin, and Stephen P. Boyd. "Enhancing sparsity by reweighted ℓ1 minimization." Journal of Fourier Analysis and Applications 14.5-6 (2008): 877-905.
7. Kim, Seung-Jean, et al. "An interior-point method for large-scale ℓ1-regularized least squares." IEEE Journal of Selected Topics in Signal Processing 1.4 (2007): 606-617.
8. Liu, Zhang, and Lieven Vandenberghe. "Interior-point method for nuclear norm approximation with application to system identification." SIAM Journal on Matrix Analysis and Applications 31.3 (2009): 1235-1256.
9. Boyd, Stephen, and Lieven Vandenberghe. Convex Optimization. Cambridge University Press, 2004.
10. Lobo, Miguel Sousa, et al. "Applications of second-order cone programming." Linear Algebra and its Applications 284.1 (1998): 193-228.
11. Grant, Michael, Stephen Boyd, and Yinyu Ye. "CVX: Matlab software for disciplined convex programming." (2008).