1
compressive nonsensing
Richard Baraniuk, Rice University
2
Chapter 1: The Problem
3
Challenge 1: data too expensive
4
Case in Point: MR Imaging
Measurements are very expensive: $1-3 million per machine, 30 minutes per scan.
5
Case in Point: IR Imaging
6
Challenge 2: too much data
7
Case in Point: DARPA ARGUS-IS
1.8 Gpixel image sensor; video-rate output: Gbits/s; comm data rate: Mbits/s (a factor of 1600x gap), way out of reach of existing compression technology. Reconnaissance without conscience: too much data to transmit to a ground station, too much data to make effective real-time decisions.
8
Chapter 2: The Promise
9
COMPRESSIVE SENSING
10
Innovation 1: sparse signal models
11
Sparsity
An image with many pixels has only a few large wavelet coefficients (blue = 0); a wideband signal with many samples has only a few large Gabor (time-frequency) coefficients.
12
Sparsity
The few large wavelet coefficients (blue = 0) make the image a sparse signal; the set of signals with K nonzero entries is a nonlinear signal model (a union of K-dimensional subspaces).
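To make the sparsity idea concrete, here is a small plain-numpy sketch (the toy three-tone signal and its length are illustrative choices, not a figure from the talk): the signal is dense in time but has only a handful of large DFT coefficients.

import numpy as np

N = 512
t = np.arange(N)
# Toy signal: three tones (illustrative choice, not data from the talk)
x = (np.sin(2 * np.pi * 13 * t / N)
     + 0.5 * np.sin(2 * np.pi * 57 * t / N)
     + 0.2 * np.sin(2 * np.pi * 201 * t / N))

coeffs = np.fft.fft(x) / N
print("nonzero time samples:   ", int(np.sum(np.abs(x) > 1e-6)))       # essentially all N
print("large DFT coefficients: ", int(np.sum(np.abs(coeffs) > 1e-6)))  # just 6 (3 tones + conjugates)

# Note the nonlinearity of the model: the K-sparse set is a union of subspaces,
# so the sum of two K-sparse signals with different supports is generally 2K-sparse.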
13
Innovation 2: dimensionality reduction for sparse signals
14
Dimensionality Reduction
When data is sparse or compressible, we can directly acquire a compressed representation with little or no information loss through linear dimensionality reduction: y = Φx, where x is the sparse signal (length N, K nonzero entries), Φ is an M x N measurement matrix with M << N, and y holds the M measurements.
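A minimal numerical rendering of this acquisition model, with illustrative dimensions rather than values from the talk:

import numpy as np

rng = np.random.default_rng(0)
N, M, K = 1000, 100, 10        # illustrative sizes: ambient dim, measurements, nonzeros

x = np.zeros(N)                # K-sparse signal
x[rng.choice(N, K, replace=False)] = rng.standard_normal(K)

Phi = rng.standard_normal((M, N)) / np.sqrt(M)   # M x N measurement matrix, M << N
y = Phi @ x                    # the compressed representation: only M numbers
print(x.size, "->", y.size)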
15
Stable Embedding
An information-preserving projection Φ preserves the geometry of the set of sparse signals, which is a union of K-dimensional subspaces. A stable embedding ensures that (1 - δ)||x1 - x2||² ≤ ||Φx1 - Φx2||² ≤ (1 + δ)||x1 - x2||² for all K-sparse x1, x2.
17
Random Embedding is Stable
Measurements y = random linear combinations of the entries of the sparse signal x (y = Φx with a random Φ): no information loss for sparse vectors with high probability.
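As a quick, hedged check of this stability claim (sizes and the Gaussian Φ are assumptions for illustration), one can verify empirically that a random M x N matrix approximately preserves distances between sparse vectors:

import numpy as np

rng = np.random.default_rng(1)
N, M, K, pairs = 1024, 128, 10, 1000   # illustrative sizes

Phi = rng.standard_normal((M, N)) / np.sqrt(M)   # rows = random linear combinations

def k_sparse():
    v = np.zeros(N)
    v[rng.choice(N, K, replace=False)] = rng.standard_normal(K)
    return v

ratios = []
for _ in range(pairs):
    a, b = k_sparse(), k_sparse()
    ratios.append(np.linalg.norm(Phi @ (a - b)) / np.linalg.norm(a - b))

print("distance ratios over %d pairs: min %.2f, max %.2f (ideal: close to 1)"
      % (pairs, min(ratios), max(ratios)))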
18
Innovation 3: sparsity-based signal recovery
19
Signal Recovery
Goal: recover the signal x from the measurements y = Φx. Problem: the M x N random projection Φ has a nontrivial null space (M < N), so the inverse problem is ill-posed. Solution: exploit the sparse/compressible geometry of the acquired signal. Recovery via a (convex) sparsity penalty or greedy algorithms [Donoho; Candès, Romberg, Tao, 2004].
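As a sketch of the greedy route mentioned above, here is a minimal Orthogonal Matching Pursuit recovery in plain numpy; the dimensions and the Gaussian Φ are illustrative assumptions, not the talk's experiments.

import numpy as np

rng = np.random.default_rng(2)
N, M, K = 256, 64, 8                     # illustrative sizes

x = np.zeros(N)                          # K-sparse ground truth
x[rng.choice(N, K, replace=False)] = rng.standard_normal(K)
Phi = rng.standard_normal((M, N)) / np.sqrt(M)
y = Phi @ x                              # compressive measurements

def omp(Phi, y, K):
    """Greedy recovery: pick the column most correlated with the residual,
    then re-fit by least squares on the columns selected so far."""
    residual, idx = y.copy(), []
    for _ in range(K):
        idx.append(int(np.argmax(np.abs(Phi.T @ residual))))
        coef, *_ = np.linalg.lstsq(Phi[:, idx], y, rcond=None)
        residual = y - Phi[:, idx] @ coef
    x_hat = np.zeros(Phi.shape[1])
    x_hat[idx] = coef
    return x_hat

x_hat = omp(Phi, y, K)
print("relative recovery error:", np.linalg.norm(x - x_hat) / np.linalg.norm(x))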
21
“Single-Pixel” CS Camera
The scene is imaged onto a digital micromirror device (DMD), the chip used in projectors, displaying a random pattern. Each mirror multiplies the random pattern value by the light intensity of the corresponding image pixel, and a lens focuses the light reflected by the DMD array onto a single photodiode (photon detector), whose output feeds image reconstruction or processing. (With Kevin Kelly.)
22
“Single-Pixel” CS Camera
Flip the mirror array M times to acquire M measurements, then reconstruct the N-pixel image from the M << N measurements via sparsity-based recovery.
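The measurement process can be mimicked in a few lines of numpy (a toy simulation with an assumed synthetic scene, not the Rice hardware): each random DMD pattern yields one photodiode reading, and no N-pixel image is ever formed during acquisition.

import numpy as np

rng = np.random.default_rng(3)
n = 32                                    # assumed scene size: n x n pixels
N, M = n * n, 300                         # M << N compressive measurements
scene = np.zeros((n, n))
scene[8:24, 8:24] = 1.0                   # a bright square stands in for the scene

patterns = rng.integers(0, 2, size=(M, N))   # M random 0/1 DMD mirror patterns
y = patterns @ scene.ravel()                 # photodiode: one number per pattern
print("acquired", y.size, "measurements for a", N, "pixel scene")
# Recovery would then proceed as on the previous slides (l1 or greedy), exploiting
# the sparsity of the scene; acquisition itself never forms the N-pixel image.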
23
Random Demodulator
Problem: in contrast to Moore's Law, ADC performance doubles only every 6-8 years. CS enables sampling near the signal's (low) "information rate" rather than its (high) Nyquist rate: the analog-to-information (A2I) sampling rate scales as K log(W/K), where K is the number of tones per acquisition window and W is the Nyquist bandwidth.
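A discrete-time sketch of the random-demodulator chain (window length, number of tones, and output rate are assumed for illustration): mix with a pseudorandom +/-1 chipping sequence, then integrate-and-dump down to the low output rate.

import numpy as np

rng = np.random.default_rng(4)
W = 1024                                  # Nyquist-rate samples per window (assumed)
R = 64                                    # low-rate output samples, R << W (assumed)
t = np.arange(W)

# A sparse multitone input: a few active tones out of W/2 possible frequencies
tones = rng.choice(W // 2, size=5, replace=False)
x = sum(np.cos(2 * np.pi * f * t / W) for f in tones)

chips = rng.choice([-1.0, 1.0], size=W)   # pseudorandom chipping sequence
demod = chips * x                         # mixer: smears each tone across frequency
y = demod.reshape(R, W // R).sum(axis=1)  # integrate-and-dump -> R low-rate samples
print("Nyquist samples:", W, " low-rate measurements:", y.size)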
24
Example: Frequency Hopper
The hopper is sparse in time-frequency, enabling 20x sub-Nyquist sampling. [Figure: spectrogram from Nyquist-rate sampling vs. the "sparsogram" recovered from the compressive samples.]
25
Challenge 1: data too expensive
CS means fewer expensive measurements are needed for the same resolution scan.
26
Challenge 2: too much data
CS means we compress on the fly as we acquire data.
27
EXCITING!!!
28
2004-2014: 9797 citations, 6640 citations; dsp.rice.edu/cs archive: >1500 papers; nuit-blanche.blogspot.com: >1 posting/sec
29
Chapter 3: The Hype
30
CS is Growing Up
31
Gerhard Richter 4096 Colours
32
muralsoflajolla.com/roy-mcmakin-mural
35
“L1 is the new L2” - Stan Osher
36
Exponential Growth
38
?
39
Chapter 4: The Fallout
40
“L1 is the new L2” - Stan Osher
41
CS for “Face Recognition”
42
From: M. V.
Subject: Interesting application for compressed sensing
Date: June 10, 2011 at 11:37:31 PM EDT
To: Drs. Candes and Romberg,
You may have already been approached about this, but I feel I should say something in case you haven't. I'm writing to you because I recently read an article in Wired Magazine about compressed sensing. I'm excited about the applications CS could have in many fields, but today I was reminded of a specific application where CS could conceivably settle an area of dispute between mainstream historians and Roswell UFO theorists. As outlined in the linked video below, Dr. Rudiak has analyzed photos from 1947 in which a General Ramey appears holding a typewritten letter from which Rudiak believes he has been able to discern a number of words which he believes substantiate the extraterrestrial hypothesis for the Roswell Incident. For your perusal, I've located a "hi-res" copy of the cropped image of the letter in Ramey's hand. I hope to hear back from you. Is this an application where compressed sensing could be useful? Any chance you would consider trying it? Thank you for your time, M. V.
P.S. - Out of personal curiosity, are there currently any commercial entities involved in developing CS-based software for use by the general public?
45
Chapter 5: Back to Reality
46
Back to Reality: “There's no such thing as a free lunch.”
There are no “something for nothing” theorems, and dimensionality reduction is no exception. The result: compressive nonsensing.
47
Nonsense 1: Robustness
48
Measurement Noise
Stable recovery with additive measurement noise: y = Φx + n, where the noise n is added to the measurements. Stability: the noise is only mildly amplified in the recovered signal.
49
Signal Noise
We often seek recovery with additive signal noise: y = Φ(x + n), where the noise n is added to the signal x itself. Noise folding: the signal noise is amplified in the recovered signal by 3 dB for every doubling of the undersampling factor N/M. The same effect is seen in classical “bandpass subsampling” [Davenport, Laska, Treichler, Baraniuk 2011].
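A small simulation of this noise-folding effect (sizes, noise level, and the oracle least-squares recovery are illustrative assumptions): the recovered SNR drops by roughly 3 dB each time N/M doubles.

import numpy as np

rng = np.random.default_rng(5)
N, K, trials = 1024, 8, 50                # illustrative sizes
sigma = 0.01                              # assumed white-noise level on the signal

for M in (512, 256, 128, 64):             # N/M = 2, 4, 8, 16
    snr = []
    for _ in range(trials):
        supp = rng.choice(N, K, replace=False)
        x = np.zeros(N)
        x[supp] = rng.standard_normal(K)
        n = sigma * rng.standard_normal(N)            # noise on the *signal*, y = Phi(x + n)
        Phi = rng.standard_normal((M, N)) / np.sqrt(M)
        y = Phi @ (x + n)
        # Oracle recovery: least squares on the true support, to isolate noise
        # folding from the behavior of any particular recovery algorithm.
        x_hat = np.zeros(N)
        x_hat[supp], *_ = np.linalg.lstsq(Phi[:, supp], y, rcond=None)
        snr.append(10 * np.log10(np.sum(x**2) / np.sum((x - x_hat)**2)))
    print(f"N/M = {N // M:2d}: recovered SNR ~ {np.mean(snr):5.1f} dB")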
50
Noise Folding in CS. [Figure: CS recovered signal SNR vs. undersampling N/M; the measured slope is -3 dB per doubling, confirming noise folding.]
51
“Tail Folding”: compressible (approximately sparse) signals can be modeled as “signal” (the K largest sorted coefficients) plus “tail” (the rest). The tail “folds” into the recovered signal as the undersampling increases [Davies, Guo 2011; Davenport, Laska, Treichler, Baraniuk 2011].
52
All Is Not Lost – Dynamic Range
In wideband ADC applications, as the amount of subsampling grows, we can employ an ADC with a lower sampling rate and hence a higher-resolution quantizer.
53
Dynamic Range: CS can significantly boost the effective number of bits (ENOB) of an ADC system for sparse signals. [Figure: stated number of bits vs. log sampling frequency, comparing a conventional ADC with a CS ADC exploiting sparsity.]
54
Dynamic Range: as the amount of subsampling grows, we can employ an ADC with a lower sampling rate and hence a higher-resolution quantizer. Thus the dynamic range of a CS ADC can significantly exceed that of a Nyquist ADC. With current ADC trends, the dynamic range gain is theoretically 7.9 dB for each doubling of N/M.
55
Dynamic Range. [Figure: measured dynamic range vs. undersampling N/M; slope = +5 dB per doubling, approaching the theoretical 7.9 dB.]
56
Tradeoff
SNR: 3 dB loss for each doubling of N/M.
Dynamic range: up to 7.9 dB gain for each doubling of N/M.
57
Adaptivity
Say we know the locations of the nonzero entries in x. Then, by sensing with a matrix Φ′ whose columns are restricted to those locations, we boost the SNR and bypass the noise-folding tradeoff; this motivates adaptive sensing strategies [Haupt, Castro, Nowak, Baraniuk 2009; Candès, Davenport 2011].
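A hedged numerical sketch of this adaptivity argument (sizes, noise level, and oracle least-squares recovery are assumptions for illustration): spending the same number of measurements only on the known support keeps the off-support noise from folding in.

import numpy as np

rng = np.random.default_rng(6)
N, K, M, trials = 1024, 8, 64, 100        # illustrative sizes
sigma = 0.01                              # assumed signal-noise level

def snr_db(x, x_hat):
    return 10 * np.log10(np.sum(x**2) / np.sum((x - x_hat)**2))

blind, adaptive = [], []
for _ in range(trials):
    supp = rng.choice(N, K, replace=False)
    x = np.zeros(N)
    x[supp] = rng.standard_normal(K)
    n = sigma * rng.standard_normal(N)

    # Non-adaptive: y = Phi (x + n) with a full M x N Phi, then oracle least squares.
    Phi = rng.standard_normal((M, N)) / np.sqrt(M)
    x_hat = np.zeros(N)
    x_hat[supp], *_ = np.linalg.lstsq(Phi[:, supp], Phi @ (x + n), rcond=None)
    blind.append(snr_db(x, x_hat))

    # Adaptive: Phi' has columns only for the known support, so noise on the other
    # N - K coordinates never enters the measurements.
    Phi_p = rng.standard_normal((M, K)) / np.sqrt(M)
    x_ad = np.zeros(N)
    x_ad[supp], *_ = np.linalg.lstsq(Phi_p, Phi_p @ (x[supp] + n[supp]), rcond=None)
    adaptive.append(snr_db(x, x_ad))

print(f"non-adaptive SNR ~ {np.mean(blind):.1f} dB, adaptive SNR ~ {np.mean(adaptive):.1f} dB")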
58
Nonsense 2: Quantization
59
CS and Quantization: the vast majority of work in CS assumes the measurements are real-valued. In practice, measurements must be quantized (a nonlinear operation). CS performance should be measured in terms of the number of measurement bits rather than the number of (real-valued) measurements. Progress so far is limited to the two extremes: a large number of bits per measurement, or 1 bit per measurement.
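A hedged sketch of the bit-budget view (the quantizer, sizes, and bit allocations are assumptions, not the experiment behind the figure that follows): with a fixed total budget of measurement bits, fewer finely-quantized measurements trade off against more coarsely-quantized ones, down to the 1-bit extreme.

import numpy as np

rng = np.random.default_rng(7)
N, K, total_bits = 2000, 20, 2400         # bit budget is an assumed example value

def quantize(y, bits):
    """Uniform scalar quantizer with 2**bits levels spanning the observed range."""
    lo, hi = y.min(), y.max()
    levels = 2 ** bits
    q = np.round((y - lo) / (hi - lo) * (levels - 1))
    return q / (levels - 1) * (hi - lo) + lo

x = np.zeros(N)
x[rng.choice(N, K, replace=False)] = rng.standard_normal(K)

for bits in (12, 8, 4, 2):
    M = total_bits // bits                # spend the same total number of bits
    Phi = rng.standard_normal((M, N)) / np.sqrt(M)
    y = Phi @ x
    err = np.linalg.norm(y - quantize(y, bits)) / np.linalg.norm(y)
    print(f"{bits:2d} bits/meas -> M = {M:4d}, relative quantization error = {err:.4f}")

# At the 1-bit extreme only the signs survive: 1-bit CS algorithms work from
# M = total_bits sign measurements and can recover only the direction of x.
y_sign = np.sign(rng.standard_normal((total_bits, N)) @ x)
print(" 1 bit/meas  -> M =", total_bits, "sign measurements")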
60
CS and Quantization. [Figure: recovery performance for N = 2000, K = 20, and M = (total bits)/(bits per measurement), with curves for 12, 10, 8, 6, 4, 2, and 1 bits per measurement.]
61
Nonsense 3: Weak Models
62
Weak Models
Sparsity models in CS emphasize discrete bases and frames (DFT, wavelets, …). But in real data acquisition problems, the world is continuous, not discrete.
63
The Grid Problem
Consider a “frequency sparse” signal, which suggests the DFT as the sparsity basis. Easy CS problem: K = 1 frequency lying exactly on the DFT grid. Hard CS problem: K = 1 frequency lying off the grid, where the DFT coefficients decay slowly due to sinc interpolation of off-grid sinusoids (asymptotically, the signal is not even in L1).
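The grid problem is easy to see numerically; in this short sketch (signal length and frequencies are illustrative choices), an on-grid tone is 1-sparse in the DFT while an off-grid tone leaks into many slowly decaying coefficients.

import numpy as np

N = 512                                   # assumed signal length
t = np.arange(N)

on_grid  = np.exp(2j * np.pi * 10.0 * t / N)    # frequency exactly on a DFT bin
off_grid = np.exp(2j * np.pi * 10.5 * t / N)    # frequency halfway between bins

for name, sig in (("on-grid", on_grid), ("off-grid", off_grid)):
    mags = np.sort(np.abs(np.fft.fft(sig)))[::-1] / N
    energy = np.cumsum(mags**2) / np.sum(mags**2)
    k99 = int(np.searchsorted(energy, 0.99)) + 1    # coefficients for 99% of energy
    print(f"{name:9s}: top coefficients {np.round(mags[:4], 3)}, "
          f"coefficients needed for 99% energy = {k99}")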
64
Going Off the Grid
Spectral CS [Duarte, Baraniuk 2010] uses a discrete formulation; CS Off the Grid [Tang, Bhaskar, Shah, Recht 2012] uses a continuous formulation. [Figure: best-case, average-case, and worst-case recovery performance for Spectral CS (~20 dB) and CS Off the Grid.]
65
Nonsense 4: Focus on Recovery
66
Misguided Focus on Recovery
Recall the data-deluge problem in sensing (e.g., large-scale imaging, hyperspectral imaging, video, ultrawideband ADC): the data's ambient dimension N is too large. When N ~ billions, signal recovery becomes problematic, if not impossible. Solution: perform signal exploitation directly on the compressive measurements.
67
Compressive Signal Processing
Many applications involve signal inference rather than reconstruction (in increasing order of difficulty: detection < classification < estimation < reconstruction). Good news: CS supports efficient learning, inference, and processing directly on the compressive measurements, since random projections act approximately as sufficient statistics for signals with concise geometrical structure.
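One way to see why inference can bypass reconstruction (a toy sketch with assumed sizes): for a random Φ, the compressed inner product <Φx, Φs> tracks the ambient inner product <x, s>, so a matched-filter-style detection statistic can be computed from M << N numbers.

import numpy as np

rng = np.random.default_rng(8)
N, M = 4096, 256                          # illustrative sizes

s = rng.standard_normal(N)                # known template
x_present = 0.8 * s + 0.5 * rng.standard_normal(N)   # scene containing the template
x_absent  = rng.standard_normal(N)                   # scene without it

Phi = rng.standard_normal((M, N)) / np.sqrt(M)
for name, x in (("present", x_present), ("absent", x_absent)):
    stat_full = x @ s                     # classical matched-filter statistic
    stat_comp = (Phi @ x) @ (Phi @ s)     # same statistic from M << N numbers
    print(f"template {name:7s}: full-dimensional = {stat_full:8.1f}, "
          f"compressive = {stat_comp:8.1f}")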
68
Classification
A simple object classification problem: with AWGN, classification reduces to a nearest-neighbor classifier. Common issue: L unknown articulation parameters. Common solution: the matched filter, i.e., find the nearest neighbor under all articulations.
69
CS-based Classification
Target images form a low-dimensional manifold as the target articulates, and random projections preserve the information in these manifolds if the number of measurements M scales with the manifold dimension rather than N. CS-based classifier: the “smashed filter”, which finds the nearest neighbor under all articulations after random projection [Davenport, Baraniuk, et al. 2006].
70
Smashed Filter
Random shift and rotation (an L = 3 dimensional manifold); white Gaussian noise added to the measurements. Goals: identify the most likely shift/rotation parameters and the most likely class. [Figures: classification rate (%) and average shift estimate error vs. number of measurements M, for increasing noise levels.]
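A toy 1-D version of the smashed filter (the boxcar/triangle templates and sizes are my own assumptions, not the image experiment in the figure): classify a shifted template from compressive measurements by nearest neighbor over class and shift, computed entirely in the compressed domain.

import numpy as np

rng = np.random.default_rng(9)
N, M = 256, 32                            # assumed toy sizes

# Two toy "classes", each subject to an unknown circular shift.
box = np.zeros(N); box[:40] = 1.0
tri = np.zeros(N); tri[:40] = np.concatenate([np.linspace(0, 1, 20), np.linspace(1, 0, 20)])
templates = {"box": box, "tri": tri}

Phi = rng.standard_normal((M, N)) / np.sqrt(M)

# Unknown scene: a shifted triangle plus a little noise, observed only via y = Phi x.
true_shift = 77
x = np.roll(tri, true_shift) + 0.01 * rng.standard_normal(N)
y = Phi @ x

# Smashed filter: nearest neighbor over (class, shift) using compressed templates.
best = min((np.linalg.norm(y - Phi @ np.roll(t, s)), name, s)
           for name, t in templates.items() for s in range(N))
print(f"estimated: {best[1]} shifted by {best[2]} (true: tri shifted by {true_shift})")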
71
Frequency Tracking: Compressive Phase-Locked Loop (PLL)
Key idea: the phase detector in a PLL computes an inner product between the incoming signal and the oscillator output; the RIP ensures we can compute this inner product from the corresponding low-rate CS measurements instead. [Figure: CS-PLL tracking with 20x undersampling.]
72
Nonsense 5: Weak Guarantees
73
Performance Guarantees
CS performance guarantees rest on the RIP, incoherence, and phase transitions. To date, rigorous results exist only for random matrices, which are not practically useful and are often pessimistic. We need rigorous guarantees for non-random, structured sampling matrices with fast algorithms, analogous to the progress in coding theory from Shannon's original random codes to modern codes.
74
Chapter 6: All Is Not Lost!
75
Sparsity, convex optimization, dimensionality reduction
76
12-Step Program To End Compressive Nonsensing
Don’t give in to the hype surrounding CS
Resist the urge to blindly apply L1 minimization
Face up to robustness issues
Deal with measurement quantization
Develop more realistic signal models
Develop practical sensing matrices beyond random
Develop more efficient recovery algorithms
Develop rigorous performance guarantees for practical CS systems
Exploit signals directly in the compressive domain