Sparse and low-rank recovery problems in signal processing and machine learning
Jeremy Watt and Aggelos Katsaggelos Northwestern University Department of EECS
Part 2: Quick and dirty optimization techniques
Big picture – a story of 2’s
2 excellent greedy algorithms, narrow in problem type but broad in scale:
- Sparse least squares: OMP
- Dictionary learning: K-SVD
2 common smooth reformulations, broad in problem type but narrow in scale:
- The positive/negative split
- The epigraph trick
The greedy approaches provide large-scale solutions to specific problems; the reformulations provide small-to-medium-scale solutions for a wider array of sparse and low-rank problems. Knowing how to rewrite/reformulate problems is half the battle in optimization.
Greedy methods: OMP and K-SVD. Smooth reformulations: the positive/negative split and the epigraph trick.
Greedy approaches to sparse Least Squares problems
Orthogonal Matching Pursuit
Models for small and sparse recovery
Combinatorially difficult: e.g., minimizing ||Ax - b||_2^2 subject to ||x||_0 <= k requires searching over all supports of size k.
Orthogonal Matching Pursuit
A greedy method for approximately solving min_x ||Ax - b||_2^2 subject to ||x||_0 <= k, or min_x ||x||_0 subject to Ax = b.
Orthogonal Matching Pursuit (OMP)
- Intuitive algorithm [1]
- Effective in applications
- Extremely efficient
- Good theoretical recovery guarantees [2]
- CoSaMP is another excellent greedy algorithm for the problem [2]
A minimal sketch of the algorithm appears below.
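To make the greedy step concrete, here is a minimal Python/NumPy sketch of OMP (the function name omp and the helper structure are illustrative, not from the slides): at each iteration it picks the column of A most correlated with the residual, then re-fits all active coefficients by least squares on the grown support.

```python
import numpy as np

def omp(A, b, k):
    """Orthogonal Matching Pursuit: greedily build a k-sparse x with A x ~= b."""
    m, n = A.shape
    residual = b.copy()
    support = []
    coef = np.zeros(0)
    for _ in range(k):
        # Greedy step: column most correlated with the current residual.
        j = int(np.argmax(np.abs(A.T @ residual)))
        support.append(j)
        # "Orthogonal" step: re-fit all active coefficients by least squares.
        coef, *_ = np.linalg.lstsq(A[:, support], b, rcond=None)
        residual = b - A[:, support] @ coef
    x = np.zeros(n)
    x[support] = coef
    return x
```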
K-SVD: a greedy algorithm for Dictionary Learning
K-SVD is very effective and uses OMP in its sparse coding (X) step. It has been applied to many image processing tasks [3]. A sketch of the dictionary update step appears below.
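As a sketch of K-SVD's dictionary update (one atom at a time, written from the description in [3]; the function name and variable layout are hypothetical): the atom and its nonzero coefficients are replaced by the best rank-one fit to the residual restricted to the signals that actually use that atom.

```python
import numpy as np

def ksvd_atom_update(D, X, Y, j):
    """Update column j of dictionary D and row j of code matrix X so that
    Y ~= D X improves, following the K-SVD rank-one update."""
    omega = np.nonzero(X[j, :])[0]           # signals that use atom j
    if omega.size == 0:
        return D, X                          # unused atom: nothing to update
    # Residual of those signals with atom j's own contribution removed.
    E = Y[:, omega] - D @ X[:, omega] + np.outer(D[:, j], X[j, omega])
    U, s, Vt = np.linalg.svd(E, full_matrices=False)
    D[:, j] = U[:, 0]                        # new atom: top left singular vector
    X[j, omega] = s[0] * Vt[0, :]            # matching coefficients
    return D, X
```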
Smooth reformulation tricks for sparse problems
Positive-negative split [4,5]
Basis pursuit (e.g. compressive sensing): min_x ||x||_1 subject to Ax = b.
Pos/neg decomposition: write x = x+ - x- with x+, x- >= 0, so that ||x||_1 becomes the linear function sum(x+) + sum(x-).
Basis pursuit (e.g. compressive sensing)
Pos/neg split: note that the dimension of the space has doubled, but we now solve a standard LP, for which there are many solvers available online.
Basis pursuit as a standard Linear Program: linear objective, linear constraints. If you don't know how to solve LPs, many standard toolboxes are available; CVX is the easiest to use.
Basis Pursuit – phrased as an LP
Pos/neg split: minimize sum(x+) + sum(x-) subject to A(x+ - x-) = b and x+, x- >= 0. The dimension of the space has doubled, but this is a standard LP for which there are many solvers available online; a sketch using an off-the-shelf LP solver appears below.
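A minimal sketch of the split LP using SciPy's linprog (an off-the-shelf solver choice; the slides recommend CVX instead):

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit_lp(A, b):
    """Basis pursuit min ||x||_1 s.t. A x = b via the pos/neg split:
    x = xp - xn with xp, xn >= 0, so ||x||_1 = sum(xp) + sum(xn)."""
    m, n = A.shape
    c = np.ones(2 * n)                   # linear objective sum(xp) + sum(xn)
    A_eq = np.hstack([A, -A])            # A (xp - xn) = b
    res = linprog(c, A_eq=A_eq, b_eq=b, bounds=(0, None), method="highs")
    z = res.x
    return z[:n] - z[n:]                 # recover x = xp - xn
```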
The Lasso, min_x ||Ax - b||_2^2 + lambda ||x||_1, under the pos/neg split becomes a standard Quadratic Program: the dimension of the space has doubled, but the objective is a smooth quadratic over the nonnegative orthant, and standard QP solvers apply. A sketch using a generic bound-constrained solver appears below.
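Because the split objective is smooth on the nonnegative orthant, even a generic bound-constrained solver works; here is a sketch using SciPy's L-BFGS-B (one possible solver choice, in the spirit of the gradient-projection method of [4]):

```python
import numpy as np
from scipy.optimize import minimize

def lasso_posneg(A, b, lam):
    """Lasso min ||A x - b||_2^2 + lam ||x||_1 via the pos/neg split,
    solved as a smooth bound-constrained problem."""
    m, n = A.shape

    def f_and_grad(z):
        xp, xn = z[:n], z[n:]
        r = A @ (xp - xn) - b
        f = r @ r + lam * z.sum()
        g = 2 * A.T @ r
        # Gradient w.r.t. xp is g + lam; w.r.t. xn is -g + lam.
        return f, np.concatenate([g + lam, -g + lam])

    res = minimize(f_and_grad, np.zeros(2 * n), jac=True, method="L-BFGS-B",
                   bounds=[(0, None)] * (2 * n))
    return res.x[:n] - res.x[n:]
```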
Smooth reformulation tricks for sparse problems
The epigraph trick [6]
Absolute deviations, min_x ||Ax - b||_1, via the epigraph trick: introduce a vector t with -t <= Ax - b <= t and minimize sum(t). The dimension of the space has grown, but we now solve a standard LP for which there are many solvers available online.
Absolute deviations after the epigraph trick: another standard LP. A sketch using an off-the-shelf LP solver appears below.
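A sketch of the epigraph LP with SciPy's linprog (the solver choice is illustrative):

```python
import numpy as np
from scipy.optimize import linprog

def least_absolute_deviations(A, b):
    """min ||A x - b||_1 via the epigraph trick: add t with
    -t <= A x - b <= t and minimize sum(t), a standard LP."""
    m, n = A.shape
    c = np.concatenate([np.zeros(n), np.ones(m)])   # minimize sum(t)
    I = np.eye(m)
    A_ub = np.block([[A, -I], [-A, -I]])            # A x - b <= t and b - A x <= t
    b_ub = np.concatenate([b, -b])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] * n + [(0, None)] * m,
                  method="highs")
    return res.x[:n]
```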
“Medium-sized problems”
- Many solvers are available online for these reformulations; CVX [11] is highly recommended for MATLAB users (a Python analogue is sketched below).
- Most reformulations are solved via interior-point methods [9], which are practically limited to tens of thousands of variables.
- Some nice extensions for basic sparse recovery problems have been developed [6].
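CVX itself is MATLAB software; for Python users a close analogue is CVXPY (assumed installed here), which accepts the nonsmooth problem directly and performs a reformulation like the ones above internally. A toy basis pursuit example:

```python
import cvxpy as cp
import numpy as np

# Toy compressive-sensing instance: recover a 3-sparse vector from 20 measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 50))
x_true = np.zeros(50)
x_true[[3, 17, 41]] = [1.0, -2.0, 0.5]
b = A @ x_true

x = cp.Variable(50)
prob = cp.Problem(cp.Minimize(cp.norm1(x)), [A @ x == b])
prob.solve()
print(np.round(x.value, 3))   # should be close to x_true
```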
Final thought: nuclear norm reformulations work analogously for low-rank recovery. Many of these problems can also be reformulated as Second-Order Cone Programs (SOCPs) [7,10]. A sketch of the nuclear norm analogue appears below.
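A sketch of the nuclear norm analogue in CVXPY (toy data; normNuc is CVXPY's nuclear norm atom): recover a low-rank matrix from a subset of its entries, the matrix counterpart of basis pursuit.

```python
import cvxpy as cp
import numpy as np

# Toy matrix completion: a rank-1 matrix observed on ~70% of its entries.
rng = np.random.default_rng(0)
M = np.outer(rng.standard_normal(8), rng.standard_normal(6))   # rank-1 target
mask = (rng.random((8, 6)) < 0.7).astype(float)                # 1 = observed

X = cp.Variable((8, 6))
constraints = [cp.multiply(mask, X) == cp.multiply(mask, M)]
prob = cp.Problem(cp.Minimize(cp.normNuc(X)), constraints)
prob.solve()
print("recovered rank:", np.linalg.matrix_rank(X.value, tol=1e-6))
```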
References
[1] Tropp, J. A., and A. C. Gilbert. "Signal recovery from random measurements via orthogonal matching pursuit." IEEE Transactions on Information Theory (2007).
[2] Needell, D., and J. A. Tropp. "CoSaMP: Iterative signal recovery from incomplete and inaccurate samples." Applied and Computational Harmonic Analysis 26.3 (2009).
[3] Aharon, M., M. Elad, and A. Bruckstein. "K-SVD: An algorithm for designing overcomplete dictionaries for sparse representation." IEEE Transactions on Signal Processing (2006): 4311.
[4] Figueiredo, M. A. T., R. D. Nowak, and S. J. Wright. "Gradient projection for sparse reconstruction: Application to compressed sensing and other inverse problems." IEEE Journal of Selected Topics in Signal Processing 1.4 (2007).
[5] Tibshirani, R., et al. "Sparsity and smoothness via the fused lasso." Journal of the Royal Statistical Society: Series B (Statistical Methodology) 67.1 (2005).
[6] Candès, E. J., M. B. Wakin, and S. P. Boyd. "Enhancing sparsity by reweighted ℓ1 minimization." Journal of Fourier Analysis and Applications (2008).
[7] Kim, S.-J., et al. "An interior-point method for large-scale ℓ1-regularized least squares." IEEE Journal of Selected Topics in Signal Processing 1 (2007).
[8] Liu, Z., and L. Vandenberghe. "Interior-point method for nuclear norm approximation with application to system identification." SIAM Journal on Matrix Analysis and Applications 31.3 (2009).
[9] Boyd, S., and L. Vandenberghe. Convex Optimization. Cambridge University Press, 2004.
[10] Lobo, M. S., et al. "Applications of second-order cone programming." Linear Algebra and its Applications (1998).
[11] Grant, M., S. Boyd, and Y. Ye. "CVX: MATLAB software for disciplined convex programming." (2008).