Model-based Compressive Sensing


Model-based Compressive Sensing. Volkan Cevher, volkan@rice.edu

Chinmay Hegde, Richard Baraniuk, Marco Duarte

Dimensionality Reduction. Compressive sensing: non-adaptive measurements. Sparse Bayesian learning: dictionary of features. Information theory: coding frame. Computer science: sketching matrix / expander.

Underdetermined Linear Regression. Challenge: the measurement matrix has a nontrivial nullspace, so many signals map to the same measurements.
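
A minimal statement of the setting, written in the standard notation y, Phi, x (the slide's own symbols did not survive transcription):

```latex
y = \Phi x, \qquad \Phi \in \mathbb{R}^{M\times N},\ M \ll N,
\qquad \Phi(x+z) = \Phi x \ \text{ for every } z \in \mathcal{N}(\Phi),
```

so some prior structure on x is needed to single out the right solution.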

Key Insights from Compressive Sensing. Sparsity or compressibility alone is not sufficient. The projection must be information preserving (restricted isometry property, RIP). The decoding algorithms must be tractable.

Deterministic Priors. Sparse signal: only K out of N coordinates are nonzero. Model: union of K-dimensional subspaces aligned with the coordinate axes.

Deterministic Priors. Sparse signal: only K out of N coordinates are nonzero; model: union of K-dimensional subspaces. Compressible signal: sorted coordinates decay rapidly to zero; model: weak ℓp ball (power-law decay).

Deterministic Priors. Sparse signal: only K out of N coordinates are nonzero; model: union of K-dimensional subspaces. Compressible signal: sorted coordinates decay rapidly to zero; well approximated by a K-sparse signal (simply by thresholding).

Restricted Isometry Property (RIP). Preserves the structure of sparse/compressible signals. RIP of order 2K implies that distances between all K-sparse x1 and x2 are approximately preserved, i.e., the union of K-planes is stably embedded.
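
The implied inequality, in its standard form (the slide's formula was an image and is reconstructed here; δ2K denotes the order-2K isometry constant):

```latex
(1-\delta_{2K})\,\|x_1 - x_2\|_2^2 \;\le\; \|\Phi x_1 - \Phi x_2\|_2^2 \;\le\; (1+\delta_{2K})\,\|x_1 - x_2\|_2^2 .
```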

Restricted Isometry Property (RIP). Preserves the structure of sparse/compressible signals. A random subgaussian (iid Gaussian, Bernoulli) matrix has the RIP with high probability if the number of measurements M is large enough relative to K and N.
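
The standard sufficient condition (reconstructed; c is an unspecified constant):

```latex
M \;\ge\; c\,K\log\frac{N}{K}, \qquad \text{i.e. } M = O\!\left(K\log\tfrac{N}{K}\right).
```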

Recovery Algorithms. Goal: given the measurements y, recover the signal x. Convex optimization formulations: basis pursuit, Lasso, scalarization, iterative re-weighted algorithms. Greedy algorithms: CoSaMP, IHT, SP.
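
For concreteness, a minimal basis-pursuit sketch in Python (not code from the talk; it uses the standard linear-programming reformulation and scipy.optimize.linprog):

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(Phi, y):
    """Solve min ||x||_1 s.t. Phi @ x = y by splitting x = u - v with u, v >= 0."""
    M, N = Phi.shape
    c = np.ones(2 * N)                      # objective: sum(u) + sum(v) = ||x||_1
    A_eq = np.hstack([Phi, -Phi])           # Phi @ (u - v) = y
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None), method="highs")
    u, v = res.x[:N], res.x[N:]
    return u - v

# Toy usage: recover a 5-sparse signal from 40 random measurements.
rng = np.random.default_rng(0)
N, M, K = 128, 40, 5
x = np.zeros(N); x[rng.choice(N, K, replace=False)] = rng.standard_normal(K)
Phi = rng.standard_normal((M, N)) / np.sqrt(M)
x_hat = basis_pursuit(Phi, Phi @ x)
print(np.linalg.norm(x - x_hat))
```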

Performance of Recovery. Tractability: polynomial time. Sparse signals: noise-free measurements give exact recovery; noisy measurements give stable recovery. Compressible signals: recovery as good as the best K-sparse approximation. The CS recovery error is bounded by the signal's K-term approximation error plus the noise.
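
One common form of this bound (reconstructed; C1, C2 are constants, σK(x)1 is the best K-term approximation error in ℓ1, and n is the measurement noise):

```latex
\|x - \hat{x}\|_2 \;\le\; C_1\,\frac{\sigma_K(x)_1}{\sqrt{K}} \;+\; C_2\,\|n\|_2 .
```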

From Sparsity to Structured Sparsity

Sparse Models. Pixels: background-subtracted images. Wavelets: natural images. Gabor atoms: chirps/tones.

Sparse Models. The sparse/compressible signal model captures simplistic primary structure (e.g., a sparse image).

Beyond Sparse Models. The sparse/compressible signal model captures simplistic primary structure. Modern compression/processing algorithms capture richer secondary coefficient structure. Pixels: background-subtracted images. Wavelets: natural images. Gabor atoms: chirps/tones.

Sparse Signals Defn: K-sparse signals comprise a particular set of K-dim canonical subspaces

Model-Sparse Signals Defn: A K-sparse signal model comprises a particular (reduced) set of K-dim canonical subspaces
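
Written out in standard notation (a reconstruction of the two definitions; ΣK is the set of all K-sparse signals and MK is the model with allowed supports Ω1, …, ΩmK):

```latex
\Sigma_K \;=\; \bigcup_{|\Omega|=K} \operatorname{span}\{e_i : i\in\Omega\},
\qquad
\mathcal{M}_K \;=\; \bigcup_{m=1}^{m_K} \operatorname{span}\{e_i : i\in\Omega_m\},
\qquad m_K \ll \binom{N}{K}.
```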

Model-Sparse Signals. Defn: A K-sparse signal model comprises a particular (reduced) set of K-dim canonical subspaces. Structured subspaces <=> fewer subspaces <=> relaxed RIP <=> fewer measurements.

Model-Sparse Signals. Defn: A K-sparse signal model comprises a particular (reduced) set of K-dim canonical subspaces. Structured subspaces <=> increased signal discrimination <=> improved recovery performance <=> faster recovery.

Model-based CS Running Example: Tree-Sparse Signals [Baraniuk, VC, Duarte, Hegde]

Wavelet Sparse. 1-D signals and their 1-D wavelet transforms (axes: time, amplitude; scale, coefficients). Typical of wavelet transforms of natural signals and images (piecewise smooth).

Tree-Sparse. Model: K-sparse coefficients + significant coefficients lie on a rooted subtree. Typical of wavelet transforms of natural signals and images (piecewise smooth).

Tree-Sparse. Model: K-sparse coefficients + significant coefficients lie on a rooted subtree. Sparse approx: find the best set of K coefficients (sorting / hard thresholding). Tree-sparse approx: find the best rooted subtree of coefficients (condensing sort and select [Baraniuk]; dynamic programming [Donoho]).
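
A minimal Python sketch of projecting onto the tree-sparse model (a simple greedy heuristic, not the exact condensing-sort-and-select or dynamic-programming algorithms cited above; heap indexing of the wavelet tree is assumed):

```python
import heapq
import numpy as np

def greedy_tree_approx(w, K):
    """Keep K coefficients of w forming a rooted subtree (children of node i
    are 2*i+1 and 2*i+2). Greedily add the largest-magnitude coefficient whose
    parent is already selected."""
    N = len(w)
    selected = set()
    frontier = [(-abs(w[0]), 0)]            # max-heap of candidates, start at the root
    while frontier and len(selected) < K:
        _, i = heapq.heappop(frontier)
        if i in selected:
            continue
        selected.add(i)
        for child in (2 * i + 1, 2 * i + 2):
            if child < N:
                heapq.heappush(frontier, (-abs(w[child]), child))
    out = np.zeros_like(w)
    idx = list(selected)
    out[idx] = w[idx]
    return out
```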

Sparse. Model: K-sparse coefficients. RIP: stable embedding of the union of K-planes.

Tree-Sparse. Model: K-sparse coefficients + significant coefficients lie on a rooted subtree. Tree-RIP: stable embedding of the reduced union of K-planes.

Tree-Sparse. Model: K-sparse coefficients + significant coefficients lie on a rooted subtree. Tree-RIP: stable embedding. Recovery: new model-based algorithms [VC, Duarte, Hegde, Baraniuk; Baraniuk, VC, Duarte, Hegde].

Standard CS Recovery: Iterative Thresholding [Nowak, Figueiredo; Kingsbury, Reeves; Daubechies, Defrise, De Mol; Blumensath, Davies; …]. Iterate: update the signal estimate; prune the signal estimate (best K-term approx); update the residual.
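
For concreteness, a textbook iterative-hard-thresholding loop in Python (a sketch, not the speakers' implementation; unit step size and roughly unit-norm columns of Phi are assumed):

```python
import numpy as np

def iht(Phi, y, K, iters=100):
    """Iterative hard thresholding: x <- H_K(x + Phi^T (y - Phi x))."""
    x = np.zeros(Phi.shape[1])
    for _ in range(iters):
        r = y - Phi @ x                      # update residual
        x = x + Phi.T @ r                    # update signal estimate
        keep = np.argsort(np.abs(x))[-K:]    # best K-term approximation
        pruned = np.zeros_like(x)
        pruned[keep] = x[keep]
        x = pruned
    return x
```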

Model-based CS Recovery: Iterative Model Thresholding [VC, Duarte, Hegde, Baraniuk; Baraniuk, VC, Duarte, Hegde]. Iterate: update the signal estimate; prune the signal estimate (best K-term model approx); update the residual.
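
The only change from the loop above is the pruning step, which now projects onto the structured sparsity model (sketched here with a generic model_project callable, e.g. the greedy tree approximation shown earlier; again an illustration, not the published algorithm):

```python
import numpy as np

def model_iht(Phi, y, K, model_project, iters=100):
    """Iterative model thresholding: the best K-term approximation is replaced
    by a projection onto the signal model."""
    x = np.zeros(Phi.shape[1])
    for _ in range(iters):
        r = y - Phi @ x                        # update residual
        x = model_project(x + Phi.T @ r, K)    # best K-term *model* approximation
    return x

# e.g. tree-sparse recovery:  x_hat = model_iht(Phi, y, K, greedy_tree_approx)
```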

Tree-Sparse Signal Recovery (target signal, N=1024, M=80): CoSaMP (MSE=1.12), L1-minimization (MSE=0.751), tree-sparse CoSaMP (MSE=0.037).

Compressible Signals. Real-world signals are compressible, not sparse. Recall: compressible <=> well approximated by sparse; compressible signals lie close to a union of subspaces, i.e., the approximation error decays rapidly as K grows. If the measurement matrix has the RIP, then both sparse and compressible signals are stably recoverable.
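
One standard form of this decay condition (reconstructed; s > 0 is the decay rate and σK(x)2 is the best K-sparse approximation error):

```latex
\sigma_K(x)_2 \;=\; \min_{\bar{x}\in\Sigma_K} \|x-\bar{x}\|_2 \;\le\; C\,K^{-s}.
```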

Model-Compressible Signals. Model-compressible <=> well approximated by model-sparse; model-compressible signals lie close to a reduced union of subspaces, i.e., the model-approximation error decays rapidly as K grows.

Model-Compressible Signals. Model-compressible <=> well approximated by model-sparse; model-compressible signals lie close to a reduced union of subspaces, i.e., the model-approximation error decays rapidly as K grows. While the model-RIP enables stable model-sparse recovery, the model-RIP is not sufficient for stable model-compressible recovery at that reduced number of measurements!

Stable Recovery. Stable model-compressible signal recovery at the reduced measurement rate requires that the measurement matrix have both the RIP and the Restricted Amplification Property (RAmP). The RAmP controls the nonisometry of the matrix on the approximation's residual subspaces: the optimal K-term and 2K-term model recoveries have error controlled by the RIP, but the residual subspace between them does not.
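
For reference, the RAmP as usually stated in the model-based CS literature (reconstructed from [Baraniuk, VC, Duarte, Hegde], so the exact form should be checked against the source; εK and r are the amplification constant and growth rate, and Rj,K are the residual subspaces):

```latex
\|\Phi u\|_2^2 \;\le\; (1+\epsilon_K)\, j^{2r}\, \|u\|_2^2
\qquad \text{for all } u \in \mathcal{R}_{j,K},\ 1 \le j \le \lceil N/K \rceil .
```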

Tree-RIP, Tree-RAmP. Theorem: an M x N iid subgaussian random matrix has the Tree(K)-RIP if M is large enough (condition below). Theorem: an M x N iid subgaussian random matrix has the Tree(K)-RAmP if M is large enough (condition below).
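
The scaling of the sufficient conditions, as given in the model-based CS paper (stated here without the constants, which should be checked against the source; the key point is that the log(N/K) factor of standard CS disappears):

```latex
M = O(K) \ \text{suffices for the Tree}(K)\text{-RIP and Tree}(K)\text{-RAmP},
\qquad \text{vs.}\qquad M = O\!\big(K\log\tfrac{N}{K}\big) \ \text{for the standard RIP}.
```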

Simulation: number of samples needed for correct recovery. Piecewise cubic signals + wavelets. Models/algorithms: compressible (CoSaMP) vs. tree-compressible (tree-CoSaMP).

Performance of Recovery. Using model-based iterative thresholding or CoSaMP with the RIP and RAmP: Model-sparse signals: noise-free measurements give exact recovery; noisy measurements give stable recovery. Model-compressible signals: recovery as good as the best K-term model-sparse approximation. The CS recovery error is bounded by the signal's K-term model approximation error plus the noise. [Baraniuk, VC, Duarte, Hegde]

Other Useful Models. When the model-based framework makes sense: a model with a fast approximation algorithm; a sensing matrix with the model-RIP and model-RAmP. Ex: block sparsity / signal ensembles [Tropp, Gilbert, Strauss], [Stojnic, Parvaresh, Hassibi], [Eldar, Mishali], [Baron, Duarte et al], [Baraniuk, VC, Duarte, Hegde]. Ex: clustered signals [VC, Duarte, Hegde, Baraniuk], [VC, Indyk, Hegde, Baraniuk]. Ex: neuronal spike trains [Hegde, Duarte, VC; best student paper award].

Clustered Sparsity. Model the clustering of significant pixels in the space domain using a graphical model (MRF): Ising model, approximation via graph cuts [VC, Duarte, Hegde, Baraniuk]. Figure panels: target, Ising-model recovery, CoSaMP recovery, LP (FPC) recovery.

Conclusions. Why CS works: stable embedding for signals with concise geometric structure. Sparse signals <=> model-sparse signals; compressible signals <=> model-compressible signals. Upshot: fewer measurements, more stable recovery; new concept: RAmP.