Recovery of Clustered Sparse Signals from Compressive Measurements
Volkan Cevher (volkan@rice.edu), Piotr Indyk, Chinmay Hegde, Richard Baraniuk
The Digital Universe
Size: ~300 billion gigabytes generated in 2007
digital bits > stars in the universe; growing by a factor of 10 every 5 years
> Avogadro's number (6.02×10^23) in 15 years
Growth fueled by multimedia / multisensor data: audio, images, video, surveillance cameras, sensor nets, …
In 2007, digital data generated > total storage; by 2011, ½ of the digital universe will have no home
[Source: IDC Whitepaper "The Diverse and Exploding Digital Universe," March 2008]
Challenges Acquisition Compression Processing
Approaches
Do nothing / ignore: be content with where we are… (generalizes well, robust)
Approaches Finite Rate of Innovation Sketching / Streaming Compressive Sensing [Vetterli, Marziliano, Blu; Blu, Dragotti, Vetterli, Marziliano, Coulot; Gilbert, Indyk, Strauss, Cormode, Muthukrishnan; Donoho; Candes, Romberg, Tao; Candes, Tao]
Approaches: the common thread is SPARSITY
Finite Rate of Innovation / Sketching / Streaming / Compressive Sensing
[Vetterli, Marziliano, Blu; Blu, Dragotti, Vetterli, Marziliano, Coulot; Gilbert, Indyk, Strauss, Cormode, Muthukrishnan; Donoho; Candes, Romberg, Tao; Candes, Tao]
Agenda A short review of compressive sensing Beyond sparse models Potential gains via structured sparsity Block-sparse model (K,C)-sparse model Conclusions
Compressive Sensing 101
Goal: recover a sparse or compressible signal x from measurements y = Φx
Problem: the random projection Φ is not full rank
Solution: exploit the sparsity/compressibility geometry of the acquired signal
Compressive Sensing 101
Goal: recover a sparse or compressible signal x from measurements y = Φx
Problem: the random projection Φ is not full rank, but satisfies the Restricted Isometry Property (RIP): Φ iid Gaussian, iid Bernoulli, …
Solution: exploit the sparsity/compressibility geometry of the acquired signal
Compressive Sensing 101
Goal: recover a sparse or compressible signal x from measurements y = Φx
Problem: the random projection Φ is not full rank
Solution: exploit the model geometry of the acquired signal
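For concreteness, a minimal sketch of the CS acquisition model with an iid Gaussian Φ; the dimensions N, M, K below are illustrative assumptions, not values from the talk:

```python
import numpy as np

# Minimal sketch of the CS measurement model y = Phi @ x.
# N, M, K are illustrative choices (ambient dim, # measurements, sparsity).
N, M, K = 512, 128, 10
rng = np.random.default_rng(0)

# iid Gaussian measurement matrix; such matrices satisfy the RIP
# with high probability once M is on the order of K log(N/K)
Phi = rng.standard_normal((M, N)) / np.sqrt(M)

# K-sparse signal: K nonzero coordinates at random locations
x = np.zeros(N)
support = rng.choice(N, size=K, replace=False)
x[support] = rng.standard_normal(K)

y = Phi @ x   # compressive measurements: M << N, so Phi is not full column rank
```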
Basic Signal Models
Sparse signal: only K out of N coordinates nonzero
Model: union of K-dimensional subspaces aligned w/ coordinate axes
[plot: sorted coefficient magnitude vs. sorted index]
Basic Signal Models
Sparse signal: only K out of N coordinates nonzero; model: union of K-dimensional subspaces
Compressible signal: sorted coordinates decay rapidly to zero; well-approximated by a K-sparse signal (simply by thresholding)
[plot: power-law decay of sorted coefficients vs. sorted index]
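The "best K-term approximation by thresholding" mentioned above is a one-liner; a minimal sketch (hard_threshold is our name for it, reused in the recovery sketches later):

```python
import numpy as np

def hard_threshold(x, K):
    """Best K-term approximation: keep the K largest-magnitude coordinates."""
    xK = np.zeros_like(x)
    keep = np.argsort(np.abs(x))[-K:]   # indices of the K largest entries (K >= 1)
    xK[keep] = x[keep]
    return xK
```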
Recovery Algorithms
Goal: given y = Φx, recover x
Convex optimization formulations: basis pursuit, Dantzig selector, Lasso, …
Greedy algorithms: iterative thresholding (IT), compressive sampling matching pursuit (CoSaMP), subspace pursuit (SP)
At their core: iterative sparse approximation
Performance of Recovery
Using ℓ1 methods, IT, CoSaMP, SP:
Sparse signals: noise-free measurements give exact recovery; noisy measurements give stable recovery
Compressible signals: recovery as good as the best K-sparse approximation
$\|x - \hat{x}\|_2 \le C_1 \|x - x_K\|_1 / \sqrt{K} + C_2 \|n\|_2$   (CS recovery error ≤ signal K-term approx error + noise)
Sparse Signals
pixels: background subtracted images
wavelets: natural images
Gabor atoms: chirps/tones
Sparsity as a Model
Sparse/compressible signal model captures simplistic primary structure
[figure: sparse image]
Beyond Sparse Models
Sparse/compressible signal model captures simplistic primary structure
Modern compression/processing algorithms capture richer secondary coefficient structure
pixels: background subtracted images; wavelets: natural images; Gabor atoms: chirps/tones
Sparse Signals Defn: K-sparse signals comprise a particular set of K-dim canonical subspaces
Model-Sparse Signals Defn: A K-sparse signal model comprises a particular (reduced) set of K-dim canonical subspaces
Model-Sparse Signals
Defn: A K-sparse signal model comprises a particular (reduced) set of K-dim canonical subspaces
Structured subspaces ⇔ fewer subspaces ⇔ relaxed RIP ⇔ fewer measurements [Blumensath and Davies]
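Quantitatively, the chain above rests on counting subspaces; a hedged restatement of the bound from [Blumensath and Davies], with constants suppressed and $m_K$ denoting the number of subspaces in the model:

```latex
% generic K-sparse model: \binom{N}{K} subspaces, so the RIP requires
M = O\!\bigl(K \log \tfrac{N}{K}\bigr)
% model-sparse: only m_K \ll \binom{N}{K} subspaces, so the model-RIP requires
M = O\!\bigl(K + \log m_K\bigr)
```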
Model-Sparse Signals
Defn: A K-sparse signal model comprises a particular (reduced) set of K-dim canonical subspaces
Structured subspaces ⇔ increased signal discrimination ⇔ improved recovery performance ⇔ faster recovery
Block Sparse Signals
Block-Sparsity Motivation
Signal model: sensor networks; intra-sensor: sparsity; inter-sensor: common sparse supports
Union of subspaces when the signals are concatenated
[figure: sparse spectra, frequency vs. sensors]
*Individual block sizes may vary…
[Tropp, Eldar, Mishali; Stojnic, Parvaresh, Hassibi; Baraniuk, VC, Duarte, Hegde]
Block-Sparsity Recovery
Sampling bound (B = # of blocks)
Problem-specific solutions
Mixed ℓ2/ℓ1-norm solutions: ℓ2 within a block, ℓ1 across blocks [Eldar, Mishali (provable); Stojnic, Parvaresh, Hassibi]
Greedy solutions: simultaneous orthogonal matching pursuit [Tropp]
Model-based recovery framework [Baraniuk, VC, Duarte, Hegde]
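For concreteness, a sketch of the mixed ℓ2/ℓ1 penalty (ℓ2 within a block, ℓ1 across blocks), assuming equal-length blocks for simplicity even though the slides note block sizes may vary:

```python
import numpy as np

def mixed_l2_l1_norm(x, block_size):
    """l2 norm within each block, summed (l1) across blocks.
    Minimizing this forces entire blocks to zero, promoting block sparsity."""
    blocks = x.reshape(-1, block_size)   # assumes len(x) is a multiple of block_size
    return float(np.sum(np.linalg.norm(blocks, axis=1)))
```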
Standard CS Recovery
Iterative Thresholding [Nowak, Figueiredo; Kingsbury, Reeves; Daubechies, Defrise, De Mol; Blumensath, Davies; …]
update signal estimate: $b = \hat{x} + \Phi^T (y - \Phi \hat{x})$
prune signal estimate: $\hat{x} \leftarrow$ best K-term approx of $b$
update residual: $r = y - \Phi \hat{x}$
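A minimal sketch of the loop above (iterative hard thresholding), reusing hard_threshold from the earlier sketch; the step size and iteration count are illustrative assumptions, not tuned values:

```python
import numpy as np

def iht(y, Phi, K, iters=100, step=1.0):
    """Iterative hard thresholding: gradient step, then best K-term prune."""
    x_hat = np.zeros(Phi.shape[1])
    for _ in range(iters):
        r = y - Phi @ x_hat                 # update residual
        b = x_hat + step * Phi.T @ r        # update signal estimate
        x_hat = hard_threshold(b, K)        # prune: best K-term approximation
    return x_hat
```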
Model-based CS Recovery
Iterative Model Thresholding [VC, Duarte, Hegde, Baraniuk; Baraniuk, VC, Duarte, Hegde]
update signal estimate; prune signal estimate (best K-term model approx); update residual
(a sketch of this variant follows below)
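The model-based variant changes only the pruning step; a sketch where model_approx is any best K-term model-approximation oracle (the block and clustered versions sketched below both fit this slot):

```python
import numpy as np

def model_iht(y, Phi, model_approx, iters=100, step=1.0):
    """Same loop as standard IHT, but prune onto the structured model."""
    x_hat = np.zeros(Phi.shape[1])
    for _ in range(iters):
        b = x_hat + step * Phi.T @ (y - Phi @ x_hat)   # gradient step
        x_hat = model_approx(b)   # best K-term *model* approximation
    return x_hat
```

For example, model_iht(y, Phi, lambda b: block_threshold(b, 16, 4)) would run the block-sparse version with the pruning sketch below (block size and block count are hypothetical).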
Model-based CS Recovery
Provable guarantees [Baraniuk, VC, Duarte, Hegde]:
$\|x - \hat{x}\|_2 \le C_1\, \sigma_{\mathcal{M}_K}(x) + C_2 \|n\|_2$   (CS recovery error ≤ signal K-term model approx error + noise)
where $\sigma_{\mathcal{M}_K}(x)$ is the best K-term model approximation error
update signal estimate; prune signal estimate (best K-term model approx); update residual
Block-Sparse CS Recovery
Iterative Block Thresholding: ℓ2 energy within each block; sort + threshold across blocks
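A sketch of that pruning step, assuming equal-length, pre-specified blocks (as in the experiment on the next slide); it keeps the B most energetic blocks:

```python
import numpy as np

def block_threshold(x, block_size, B):
    """Best block-sparse approximation: l2 energy within each block,
    then sort + threshold across blocks, keeping the top B."""
    blocks = x.reshape(-1, block_size)     # assumes len(x) is a multiple of block_size
    energy = np.sum(blocks ** 2, axis=1)   # within-block l2 energy
    keep = np.argsort(energy)[-B:]         # the B most energetic blocks (B >= 1)
    out = np.zeros_like(blocks)
    out[keep] = blocks[keep]
    return out.reshape(-1)
```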
Block-Sparse Signal
[figure: target; CoSaMP recovery (MSE = 0.723); block-sparse model recovery (MSE = 0.015)]
Blocks are pre-specified.
Clustered Sparse Signals: (K,C)-sparse signal model
Clustered Sparsity
(K,C)-sparse signals (1-D): K-sparse with support contained in at most C clusters
Stable recovery w/ the model-based framework
Model approximation via dynamic programming (recursive / bottom-up); a sketch follows below
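One straightforward way to realize that dynamic program: choose at most K coordinates forming at most C contiguous clusters so as to capture the most energy, scanning left to right with state (coordinates used, clusters used, whether the previous coordinate is selected). This bottom-up O(NKC) sketch is our illustration of the idea, not the authors' exact implementation:

```python
import numpy as np

def kc_sparse_approx(x, K, C):
    """Best (K,C)-sparse approximation of a 1-D signal: keep at most K
    coordinates forming at most C contiguous clusters, maximizing energy."""
    N, w, NEG = len(x), x ** 2, -np.inf
    # dp[k, c, b]: best energy so far with k coords in c clusters;
    # b = 1 iff the most recently scanned coordinate was selected.
    dp = np.full((K + 1, C + 1, 2), NEG)
    dp[0, 0, 0] = 0.0
    prev = np.zeros((N, K + 1, C + 1, 2), dtype=np.int8)  # predecessor b, for backtracking
    for i in range(N):
        ndp = np.full_like(dp, NEG)
        for k in range(K + 1):
            for c in range(C + 1):
                for b in (0, 1):
                    v = dp[k, c, b]
                    if v == NEG:
                        continue
                    if v > ndp[k, c, 0]:            # skip coordinate i
                        ndp[k, c, 0] = v
                        prev[i, k, c, 0] = b
                    if k < K:                        # select coordinate i
                        nc = c if b == 1 else c + 1  # extend cluster vs. start one
                        if nc <= C and v + w[i] > ndp[k + 1, nc, 1]:
                            ndp[k + 1, nc, 1] = v + w[i]
                            prev[i, k + 1, nc, 1] = b
        dp = ndp
    # best final state, then walk backwards to recover the support
    k, c, b = max(((k, c, b) for k in range(K + 1) for c in range(C + 1)
                   for b in (0, 1)), key=lambda s: dp[s])
    support = []
    for i in range(N - 1, -1, -1):
        pb = int(prev[i, k, c, b])
        if b == 1:                # coordinate i was selected
            support.append(i)
            k -= 1
            if pb == 0:           # it opened a new cluster
                c -= 1
        b = pb
    out = np.zeros_like(x)
    out[support] = x[support]
    return out
```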
Example
Simulation via Block-Sparsity
Clustered sparse ⇔ approximable by block-sparse*
*if we are willing to pay a 3× sampling penalty
Proof by (adversarial) construction …
Clustered Sparsity in 2D
Model clustering of significant pixels in the space domain using a graphical model (MRF): Ising model; approximation via graph cuts [VC, Duarte, Hegde, Baraniuk]
[figures: target; Ising-model recovery; CoSaMP recovery; LP (FPC) recovery]
Conclusions
Why CS works: stable embedding for signals with special geometric structure
Sparse signals >> model-sparse signals
Greedy model-based signal recovery algorithms; upshot: provably fewer measurements, more stable recovery; new model: clustered sparsity
Compressible signals >> model-compressible signals
Volkan Cevher / volkan@rice.edu