Published by Easter Banks, modified over 9 years ago
1
Recovery of Clustered Sparse Signals from Compressive Measurements
Volkan Cevher Piotr Indyk Chinmay Hegde Richard Baraniuk
2
The Digital Universe
Size: ~300 billion gigabytes generated in 2007
digital bits > stars in the universe; growing by a factor of 10 every 5 years; > Avogadro's number (6.02×10^23) in 15 years
Growth fueled by multimedia / multisensor data: audio, images, video, surveillance cameras, sensor nets, …
In 2007, digital data generated > total storage; by 2011, ½ of the digital universe will have no home
[Source: IDC whitepaper "The Diverse and Exploding Digital Universe," March 2008]
3
Challenges Acquisition Compression Processing
4
Approaches
Do nothing / ignore: be content with where we are… (generalizes well, robust)
5
Approaches
Finite Rate of Innovation / Sketching / Streaming / Compressive Sensing
[Vetterli, Marziliano, Blu; Blu, Dragotti, Vetterli, Marziliano, Coulot; Gilbert, Indyk, Strauss, Cormode, Muthukrishnan; Donoho; Candes, Romberg, Tao; Candes, Tao]
6
Approaches: SPARSITY
Finite Rate of Innovation / Sketching / Streaming / Compressive Sensing: the common thread is SPARSITY
[Vetterli, Marziliano, Blu; Blu, Dragotti, Vetterli, Marziliano, Coulot; Gilbert, Indyk, Strauss, Cormode, Muthukrishnan; Donoho; Candes, Romberg, Tao; Candes, Tao]
7
Agenda
A short review of compressive sensing
Beyond sparse models
Potential gains via structured sparsity: block-sparse model, (K,C)-sparse model
Conclusions
8
Compressive Sensing 101
Goal: recover a sparse or compressible signal from measurements
Problem: random projection not full rank
Solution: exploit the sparsity/compressibility geometry of the acquired signal
9
Compressive Sensing 101
Goal: recover a sparse or compressible signal from measurements
Problem: random projection not full rank, but satisfies the Restricted Isometry Property (RIP)
(e.g., iid Gaussian, iid Bernoulli, …)
Solution: exploit the sparsity/compressibility geometry of the acquired signal
10
Compressive Sensing 101
Goal: recover a sparse or compressible signal from measurements
Problem: random projection not full rank
Solution: exploit the model geometry of the acquired signal
11
Basic Signal Models
Sparse signal: only K out of N coordinates nonzero
Model: union of K-dimensional subspaces aligned with the coordinate axes
12
Basic Signal Models
Sparse signal: only K out of N coordinates nonzero
Model: union of K-dimensional subspaces
Compressible signal: sorted coordinates decay rapidly to zero (power-law decay); well-approximated by a K-sparse signal (simply by thresholding)
13
Recovery Algorithms
Goal: given the measurements, recover the signal
ℓ₁ and convex optimization formulations: basis pursuit, Dantzig selector, Lasso, …
Greedy algorithms: iterative thresholding (IT), compressive sensing matching pursuit (CoSaMP), subspace pursuit (SP)
At their core: iterative sparse approximation
14
Performance of Recovery
Using ℓ₁ methods, IT, CoSaMP, SP:
Sparse signals: noise-free measurements give exact recovery; noisy measurements give stable recovery
Compressible signals: recovery as good as the K-sparse approximation
CS recovery error ≤ const · (signal K-term approx error) + const · noise
15
Sparse Signals
pixels: background-subtracted images
wavelets: natural images
Gabor atoms: chirps/tones
16
Sparsity as a Model
Sparse/compressible signal model captures simplistic primary structure (e.g., a sparse image)
17
Beyond Sparse Models
Sparse/compressible signal model captures simplistic primary structure
Modern compression/processing algorithms capture richer secondary coefficient structure
pixels: background-subtracted images; wavelets: natural images; Gabor atoms: chirps/tones
18
Sparse Signals Defn: K-sparse signals comprise a particular set of K-dim canonical subspaces
19
Model-Sparse Signals Defn: A K-sparse signal model comprises a particular (reduced) set of K-dim canonical subspaces
20
Model-Sparse Signals Defn: A K-sparse signal model comprises a particular (reduced) set of K-dim canonical subspaces Structured subspaces <> fewer subspaces <> relaxed RIP <> fewer measurements [Blumensath and Davies]
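A back-of-the-envelope sketch of why fewer subspaces means fewer measurements (an illustration, not the authors' derivation): the sampling bound scales like K plus the log of the number of subspaces in the model [Blumensath and Davies], so counting subspaces directly compares generic sparsity against a structured model. The numbers below (N = 1000, K = 50, block size J = 10) are hypothetical.

```python
import math

def sampling_bound(K, num_subspaces):
    # Blumensath-Davies style bound: M = O(K + log #subspaces);
    # constants dropped for illustration
    return K + math.ceil(math.log(num_subspaces))

N, K = 1000, 50
generic = math.comb(N, K)        # all K-dim canonical subspaces
J = 10                           # assumed block size
blocky = math.comb(N // J, K // J)  # block-sparse: choose K/J of N/J blocks
print(sampling_bound(K, generic), sampling_bound(K, blocky))
```

The structured model needs far fewer subspaces, so its (order-of-magnitude) sampling bound is much smaller.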
21
Model-Sparse Signals Defn: A K-sparse signal model comprises a particular (reduced) set of K-dim canonical subspaces Structured subspaces <> increased signal discrimination <> improved recovery perf. <> faster recovery
22
Block-Sparse Signals
23
Block-Sparsity Motivation
Signal model: sensor networks
intra-sensor: sparsity
inter-sensor: common sparse supports
union of subspaces when signals are concatenated
*Individual block sizes may vary…
[Tropp, Eldar, Mishali; Stojnic, Parvaresh, Hassibi; Baraniuk, VC, Duarte, Hegde]
24
Block-Sparsity Recovery
Sampling bound: M = O(K + B log(N/B)) (B = # of blocks)
Problem-specific solutions
Mixed-norm solutions (ℓ₂ within block, ℓ₁ across blocks) [Eldar, Mishali (provable); Stojnic, Parvaresh, Hassibi]
Greedy solutions: simultaneous orthogonal matching pursuit [Tropp]
Model-based recovery framework [Baraniuk, VC, Duarte, Hegde]
25
Standard CS Recovery
Iterative Thresholding [Nowak, Figueiredo; Kingsbury, Reeves; Daubechies, Defrise, De Mol; Blumensath, Davies; …]
update signal estimate
prune signal estimate (best K-term approx)
update residual
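The update/prune/residual loop above can be sketched in a few lines of NumPy (a minimal illustration, not any author's implementation; the step size and iteration count are assumptions):

```python
import numpy as np

def iht(y, Phi, K, iters=100):
    """Iterative hard thresholding (sketch): alternate a gradient step
    on ||y - Phi x||^2 with pruning to the best K-term approximation."""
    x = np.zeros(Phi.shape[1])
    mu = 1.0 / np.linalg.norm(Phi, 2) ** 2   # assumed step size, keeps the iteration stable
    for _ in range(iters):
        x = x + mu * Phi.T @ (y - Phi @ x)   # update signal estimate via the residual
        small = np.argsort(np.abs(x))[:-K]   # indices of all but the K largest entries
        x[small] = 0.0                       # prune: best K-term approximation
    return x
```

With a well-conditioned measurement matrix, the K retained entries lock onto the true support after a few iterations.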
26
Model-based CS Recovery
Iterative Model Thresholding [VC, Duarte, Hegde, Baraniuk; Baraniuk, VC, Duarte, Hegde]
update signal estimate
prune signal estimate (best K-term model approx)
update residual
27
Model-based CS Recovery
Provable guarantees [Baraniuk, VC, Duarte, Hegde]:
CS recovery error ≤ const · (signal K-term model approx error) + const · noise
update signal estimate
prune signal estimate (best K-term model approx)
update residual
28
Block-Sparse CS Recovery
Iterative Block Thresholding
within block: ℓ₂ energy
across blocks: sort + threshold
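The within-block / across-blocks prune can be sketched as follows (an illustration assuming equal block sizes; as noted earlier, individual block sizes may vary in general):

```python
import numpy as np

def block_threshold(x, block_size, B):
    """Block-sparse prune: keep the B blocks with the largest l2 energy."""
    blocks = x.reshape(-1, block_size)
    energy = np.sum(blocks ** 2, axis=1)   # within block: l2 energy
    keep = np.argsort(energy)[-B:]         # across blocks: sort + threshold
    out = np.zeros_like(blocks)
    out[keep] = blocks[keep]
    return out.reshape(-1)
```

Substituting this prune for the best K-term approximation in the iterative-thresholding loop gives a block-sparse recovery of the kind shown above.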
29
Block-Sparse Signal
target; CoSaMP (MSE = 0.723); block-sparse model recovery (MSE = 0.015)
Blocks are pre-specified.
30
Clustered Sparse Signals: (K,C)-sparse signal model
31
Clustered Sparsity
(K,C)-sparse signals (1-D): K-sparse within at most C clusters
Stable recovery w/ the model-based framework
model approximation via dynamic programming (recursive / bottom-up)
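The dynamic program for the best (K,C)-sparse approximation can be sketched as below: a bottom-up DP over coordinates that maximizes captured energy subject to at most K nonzeros in at most C contiguous clusters. The state layout is an illustrative assumption, not the authors' exact recursion.

```python
import numpy as np

def kc_sparse_approx(x, K, C):
    """Best (K,C)-sparse approximation of x: keep at most K coordinates,
    arranged in at most C contiguous clusters, maximizing captured energy.
    State = (coords seen, nonzeros used, clusters opened,
    whether the previous coordinate is selected)."""
    N, e, NEG = len(x), x ** 2, -np.inf
    dp = np.full((N + 1, K + 1, C + 1, 2), NEG)
    dp[0, 0, 0, 0] = 0.0
    for i in range(N):
        for k in range(K + 1):
            for c in range(C + 1):
                for b in (0, 1):
                    v = dp[i, k, c, b]
                    if v == NEG:
                        continue
                    # option 1: skip coordinate i
                    dp[i + 1, k, c, 0] = max(dp[i + 1, k, c, 0], v)
                    # option 2: select coordinate i (opens a cluster when b == 0)
                    nc = c + (1 - b)
                    if k < K and nc <= C:
                        dp[i + 1, k + 1, nc, 1] = max(dp[i + 1, k + 1, nc, 1],
                                                      v + e[i])
    # backtrack from the best final state to recover the support
    k, c, b = np.unravel_index(np.argmax(dp[N]), dp[N].shape)
    xhat = np.zeros_like(x)
    for i in range(N - 1, -1, -1):
        v = dp[i + 1, k, c, b]
        if b == 1:                       # coordinate i is in the support
            xhat[i] = x[i]
            for pb in (0, 1):
                pc = c - (1 - pb)
                if pc >= 0 and dp[i, k - 1, pc, pb] == v - e[i]:
                    k, c, b = k - 1, pc, pb
                    break
        else:                            # coordinate i was skipped
            b = 1 if dp[i, k, c, 1] == v else 0
    return xhat
```

The table has O(N·K·C) states with O(1) transitions each; using this as the prune step in the model-based loop gives clustered-sparse recovery.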
32
Example
33
Simulation via Block-Sparsity
Clustered sparse <> approximable by block-sparse*
*if we are willing to pay a 3× sampling penalty
Proof by (adversarial) construction…
34
Clustered Sparsity in 2D
Model clustering of significant pixels in the space domain using a graphical model (MRF): the Ising model
approximation via graph cuts [VC, Duarte, Hegde, Baraniuk]
target / Ising-model recovery / CoSaMP recovery / LP (FPC) recovery
35
Conclusions
Why CS works: stable embedding for signals with special geometric structure
Sparse signals >> model-sparse signals
Compressible signals >> model-compressible signals
Greedy model-based signal recovery algorithms
upshot: provably fewer measurements, more stable recovery
new model: clustered sparsity
36
Volkan Cevher / volkan@rice.edu