Compressive Signal Processing
Richard Baraniuk, Rice University
dsp.rice.edu/cs
(Speaker note: this came out of my personal experience with 301, Fourier analysis and linear systems.)
Compressive Sensing (CS)
When data is sparse or compressible, we can directly acquire a condensed representation with little or no information loss.
A random projection will work: the measurements y = Φx capture a signal x that is sparse in some basis. [Candès-Romberg-Tao, Donoho, 2004]
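A minimal numpy sketch of this measurement model under assumed toy dimensions (N = 256 samples, M = 64 measurements, K = 8 nonzeros); the i.i.d. Gaussian Φ used here is one convenient choice of random projection.

```python
import numpy as np

rng = np.random.default_rng(0)
N, M, K = 256, 64, 8              # assumed toy sizes: signal length, measurements, sparsity

# K-sparse signal in the canonical (time) basis
x = np.zeros(N)
x[rng.choice(N, K, replace=False)] = rng.standard_normal(K)

# Random Gaussian measurement matrix with M << N rows
Phi = rng.standard_normal((M, N)) / np.sqrt(M)

# Condensed representation: M non-adaptive linear measurements
y = Phi @ x
print(y.shape)                    # (64,)
```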
CS Signal Recovery
Reconstruction/decoding: given the measurements y = Φx (an ill-posed inverse problem, since M < N), find the sparse signal x, i.e. locate its K nonzero entries.
CS Signal Recovery
L2 recovery: minimize ||x||_2 subject to y = Φx. Fast (a closed-form pseudoinverse solution), but wrong: the minimum-energy solution is not the sparse signal.
Why L2 Doesn't Work
The least-squares, minimum-L2 solution is almost never sparse. Geometrically, the solutions of y = Φx form the null space of Φ translated to the true x (oriented at a random angle); the minimum-L2 point on this set is the one closest to the origin, which generically has all of its entries nonzero.
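A short numpy check of this claim, reusing the assumed toy dimensions from the sketch above: the minimum-L2 solution of y = Φx (via the pseudoinverse) spreads energy over essentially every entry.

```python
import numpy as np

rng = np.random.default_rng(0)
N, M, K = 256, 64, 8

x = np.zeros(N)
x[rng.choice(N, K, replace=False)] = rng.standard_normal(K)
Phi = rng.standard_normal((M, N)) / np.sqrt(M)
y = Phi @ x

# Minimum-L2-norm solution of the underdetermined system: Phi^+ y = Phi^T (Phi Phi^T)^{-1} y
x_l2 = np.linalg.pinv(Phi) @ y

print(np.count_nonzero(x))                    # 8 (the true sparsity)
print(np.count_nonzero(np.abs(x_l2) > 1e-6))  # ~256: almost every entry is nonzero
print(np.linalg.norm(x_l2 - x))               # large: the L2 guess misses the sparse x
```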
CS Signal Recovery
L0 recovery: minimize ||x||_0 subject to y = Φx, where ||x||_0 counts the nonzero entries, i.e. find the sparsest consistent solution. Correct, but slow: the search over candidate supports is combinatorial. In principle only M = K+1 measurements are required to perfectly reconstruct a K-sparse signal [Bresler; Rice].
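A hedged illustration of why exhaustive L0 recovery is correct but slow, on deliberately tiny assumed sizes (N = 20, K = 2, and M = 5 rather than the theoretical minimum K+1, for numerical comfort): it tries every candidate support, so the cost grows as N choose K.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)
N, M, K = 20, 5, 2                 # tiny assumed sizes; M = K+1 suffices in principle

x = np.zeros(N)
x[rng.choice(N, K, replace=False)] = rng.standard_normal(K)
Phi = rng.standard_normal((M, N))
y = Phi @ x

# Exhaustive L0 search: least-squares fit on every K-column subset of Phi
best_support, best_coeffs, best_resid = None, None, np.inf
for support in combinations(range(N), K):
    A = Phi[:, list(support)]
    coeffs = np.linalg.lstsq(A, y, rcond=None)[0]
    resid = np.linalg.norm(A @ coeffs - y)
    if resid < best_resid:
        best_support, best_coeffs, best_resid = support, coeffs, resid

x_l0 = np.zeros(N)
x_l0[list(best_support)] = best_coeffs
print(np.allclose(x_l0, x))        # True, but the loop ran comb(N, K) = 190 times
```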
CS Signal Recovery
L1 recovery: minimize ||x||_1 subject to y = Φx. Correct with only mild oversampling [Candès et al., Donoho], and it can be solved as a linear program.
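A sketch of that linear program using scipy's generic LP solver (basis pursuit via the standard split x = u - v with u, v >= 0); the dimensions N = 128, M = 40, K = 5 are assumptions chosen so that recovery typically succeeds.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
N, M, K = 128, 40, 5               # assumed sizes with mild oversampling (M ~ 8K)

x = np.zeros(N)
x[rng.choice(N, K, replace=False)] = rng.standard_normal(K)
Phi = rng.standard_normal((M, N)) / np.sqrt(M)
y = Phi @ x

# Basis pursuit as an LP: write x = u - v with u, v >= 0, then
#   minimize  sum(u) + sum(v)   subject to   Phi (u - v) = y
c = np.ones(2 * N)
A_eq = np.hstack([Phi, -Phi])
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None), method="highs")

x_l1 = res.x[:N] - res.x[N:]
print(np.allclose(x_l1, x, atol=1e-6))   # typically True: exact recovery w.h.p.
```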
Why L1 Works
The minimum-L1 solution equals the sparsest solution (with high probability) if the number of random measurements is sufficient; the standard result asks for roughly M ≥ c K log(N/K). Geometrically, the L1 ball is pointy along the coordinate axes, so it touches the translated null space of Φ at the sparse solution rather than at a generic point.
Universality
A random Gaussian (white noise) measurement basis is incoherent with any fixed orthonormal basis (with high probability).
Signal sparse in the time domain: measure directly, y = Φx.
Signal sparse in the frequency domain: x = Ψα with Ψ orthonormal, so y = Φx = (ΦΨ)α, and the product ΦΨ remains white Gaussian.
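A small numpy/scipy check of that last point, with the DCT standing in as the fixed orthonormal basis Ψ (my choice for illustration; any orthonormal Ψ behaves the same): rotating the white Gaussian Φ by Ψ leaves its statistics unchanged.

```python
import numpy as np
from scipy.fft import dct

rng = np.random.default_rng(0)
N, M = 256, 64

Phi = rng.standard_normal((M, N))            # white Gaussian measurement matrix

# Fixed orthonormal basis: orthonormalized DCT-II atoms as the columns of Psi
Psi = dct(np.eye(N), norm="ortho", axis=0)

# Measuring a signal sparse in Psi amounts to measuring its coefficient
# vector with the product matrix Phi @ Psi
A = Phi @ Psi

# Because Psi is orthonormal, each row of A is again i.i.d. N(0, 1):
# the product keeps the same white Gaussian statistics as Phi itself
print(round(Phi.mean(), 3), round(Phi.std(), 3))
print(round(A.mean(), 3), round(A.std(), 3))
```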