Bounds for Optimal Compressed Sensing Matrices


Bounds for Optimal Compressed Sensing Matrices
Shriram Sarvotham, DSP Group, ECE, Rice University

Compressed Sensing
A signal of length N has only K non-zero coefficients. Are there efficient ways to measure and recover it?
Traditional DSP approach: acquisition first obtains all N samples, then compresses, throwing away all but the K significant coefficients.
Emerging Compressed Sensing (CS) approach: acquisition obtains just M measurements, with K < M << N. No unnecessary measurements / computations. [Candès et al.; Donoho]

Compressed Sensing
CS measurements are a matrix multiplication y = Φx: an M×N measurement matrix Φ maps the K-sparse, length-N signal x to the M measurements y.
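A minimal numpy sketch of this acquisition model (the sizes N = 50, M = 12, K = 3 and the Gaussian choice of Φ are illustrative, not from the talk):

```python
import numpy as np

rng = np.random.default_rng(0)
N, M, K = 50, 12, 3          # signal length, measurements, sparsity (illustrative)

# K-sparse signal: only K of the N coefficients are non-zero
x = np.zeros(N)
support = rng.choice(N, size=K, replace=False)
x[support] = rng.standard_normal(K)

# CS acquisition is a single matrix multiplication: y = Phi @ x
Phi = rng.standard_normal((M, N)) / np.sqrt(M)   # i.i.d. Gaussian CS matrix
y = Phi @ x

print(y.shape)   # M numbers acquired instead of N samples
```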

Contributions
- Bounds on the quality of CS matrices, in terms of the CS parameters (M, N, K); quality metric: the Restricted Isometry constant
- Designing CS matrices for fast reconstruction: sparse matrices, fast algorithms
- CS rate-distortion


Restricted Isometry Property (RIP)
Key idea: ensure that Φ is an approximate isometry when restricted to the domain of K-sparse signals. Restricted Isometry Property of order K: (1 − δ_K) ||x||² ≤ ||Φx||² ≤ (1 + δ_K) ||x||² for all K-sparse x.
RIP ensures the columns of Φ are locally almost orthogonal, rather than globally perfectly orthogonal.
Measure of quality of a CS matrix: the RIP constant δ_K (smaller is better).
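Since δ_K is determined by the column sub-matrices of Φ, it can be computed by brute force for tiny examples. A sketch (the helper `rip_constant` is ours, not from the talk):

```python
import itertools
import numpy as np

def rip_constant(Phi, K):
    """Brute-force RIP constant delta_K: worst deviation of the squared
    singular values of any K-column sub-matrix from 1.
    Exponential in N, so only feasible for tiny examples."""
    N = Phi.shape[1]
    delta = 0.0
    for S in itertools.combinations(range(N), K):
        s = np.linalg.svd(Phi[:, list(S)], compute_uv=False)
        delta = max(delta, abs(s[0] ** 2 - 1), abs(s[-1] ** 2 - 1))
    return delta

# Orthonormal columns form a perfect restricted isometry: delta_K = 0
assert rip_constant(np.eye(4)[:, :3], K=2) < 1e-12

# An i.i.d. Gaussian matrix is only approximately isometric: delta_2 > 0
rng = np.random.default_rng(0)
Phi = rng.standard_normal((8, 16)) / np.sqrt(8)
print(round(rip_constant(Phi, K=2), 3))
```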

Restricted Isometry Property (RIP)
A good CS matrix has small δ_K. Question: for given (M, N, K), what can we say about δ_K for the best CS matrix? Answer: it is determined by two bounds, a structural bound and a packing bound.


I) Structural bound: tied to the SVDs of sub-matrices.

Role of sub-matrices: the RIP constant of order K depends only on the M×K column sub-matrices of Φ.


Theorem of Thompson [Thompson 1972]: denote the singular-value characteristic polynomial of a matrix by p(·). Then the singular-value characteristic polynomials of the sub-matrices satisfy a fixed algebraic relation with that of the full matrix.



Significance to RIP: the zeros of the Thompson polynomial govern the extreme singular values of the K-column sub-matrices, and δ_K is minimized when the sub-matrix singular values are all equal. This yields the structural bound.


Geometrical meaning: relates the volumes of the hyper-ellipse defined by Φ to those of its sub-matrices. In the special case of square sub-matrices, equating constant terms gives the Generalized Pythagorean Theorem.

GPT for areas in R³: the squared area of a parallelogram equals the sum of the squared areas of its projections onto the three coordinate planes.

Geometrical meaning: the Thompson equation extends the GPT, relating the K-volumes of the hyper-ellipses in higher dimensions.
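The Generalized Pythagorean Theorem in this volume form is the Cauchy–Binet identity: the squared K-volume spanned by the rows of a K×N matrix equals the sum of the squared volumes of its projections onto the coordinate K-planes. A quick numerical check (sizes illustrative):

```python
import itertools
import numpy as np

rng = np.random.default_rng(1)
K, N = 2, 4
A = rng.standard_normal((K, N))

# Squared K-volume of the parallelepiped spanned by the rows of A ...
vol_sq = np.linalg.det(A @ A.T)

# ... equals the sum of squared volumes of its coordinate projections
# (Cauchy-Binet), i.e. the Generalized Pythagorean Theorem
proj_sq = sum(np.linalg.det(A[:, list(S)]) ** 2
              for S in itertools.combinations(range(N), K))

assert np.isclose(vol_sq, proj_sq)
```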


Structural bound: this gives a converse bound on the δ_K achievable by any CS matrix.


How tight is the structural bound? We answer this by comparison with the best known constructions.


Comparison with the best known construction: the structural bound is good up to some sparsity level; beyond that point, the RIP constant of the best known construction diverges from the bound. This hints at another mechanism controlling RIP!


Connections to Equi-angular Tight Frames (ETF): a Φ that meets the structural bound is an ETF [Sustik, Tropp et al.]. ETFs satisfy three conditions:
- the columns are equi-normed,
- the angle between every pair of columns is the same,
- ΦΦ* = cI (tightness).
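These conditions can be checked on a standard small example, the 2×3 "Mercedes-Benz" frame of three unit vectors at 120°, which is an ETF (the example is ours, not from the slides):

```python
import numpy as np

# 2x3 "Mercedes-Benz" frame: three unit vectors at 120 degrees
angles = 2 * np.pi * np.arange(3) / 3
A = np.vstack([np.cos(angles), np.sin(angles)])   # shape (2, 3)

# 1) equi-normed columns
assert np.allclose(np.linalg.norm(A, axis=0), 1.0)

# 2) equiangular: every pairwise inner product is cos(120 deg) = -1/2
gram = A.T @ A
off_diagonal = gram[~np.eye(3, dtype=bool)]
assert np.allclose(off_diagonal, -0.5)

# 3) tightness: A A* = (N/M) I = (3/2) I
assert np.allclose(A @ A.T, 1.5 * np.eye(2))
```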

Singular and eigenvalues of sub-matrices: a series of 9 papers by R. C. Thompson, with additional results on singular vectors, useful in constructions.
Prof. Robert C. Thompson: born 1931; Ph.D. (Caltech, 1960); Professor (UCSB, 1963–1995); published 4 books and 120 papers; died Dec. 10, 1995.


II) Packing bound: tied to packing in Euclidean spaces. Derived here for sparsity K = 2.

Role of column norms on SVs.

Role of angle on SVs: the ratio of the singular values of a two-column sub-matrix depends only on the angle θ between the columns.
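This has a simple closed form: for two unit-norm columns at angle θ, the Gram matrix is [[1, cos θ], [cos θ, 1]], so the squared singular values are 1 ± cos θ and depend only on the angle. A numerical check (the helper is ours):

```python
import numpy as np

def svs_of_pair(theta):
    """Singular values of a matrix whose two unit columns meet at angle theta."""
    a = np.array([1.0, 0.0])
    b = np.array([np.cos(theta), np.sin(theta)])
    return np.linalg.svd(np.column_stack([a, b]), compute_uv=False)

theta = 0.7
s = svs_of_pair(theta)
# closed form: sigma^2 = 1 +- cos(theta), so the SVs depend only on the angle
expected = np.sort([1 - np.cos(theta), 1 + np.cos(theta)])
assert np.allclose(np.sort(s ** 2), expected)
```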



Role of column norms on RIP: this motivates restricting our attention to Φ with equi-normed columns.

Design of a good Φ for K = 2 is equivalent to maximizing the minimum angle between lines, i.e., to codes in Grassmannian space.


Packing bound: a packing argument gives the converse bound on the minimum angle θ, and a covering argument gives the achievable bound [Shannon, Chabauty, Wyner].
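A standard quantitative form of this line-packing limit is the Welch bound on coherence (the bound itself is classical; its use here as an illustration is ours): any M×N matrix with unit-norm columns has maximum pairwise |cos(angle)| at least sqrt((N−M)/(M(N−1))), with equality exactly for equiangular tight frames.

```python
import numpy as np

def coherence(A):
    """Largest |cos(angle)| between any pair of (unit-normalized) columns."""
    A = A / np.linalg.norm(A, axis=0)
    G = np.abs(A.T @ A)
    np.fill_diagonal(G, 0.0)       # ignore self-inner-products
    return G.max()

def welch_bound(M, N):
    # Packing-type lower bound on the coherence of any M x N matrix (N > M)
    return np.sqrt((N - M) / (M * (N - 1)))

# The 2x3 equiangular frame meets the bound with equality
angles = 2 * np.pi * np.arange(3) / 3
A = np.vstack([np.cos(angles), np.sin(angles)])
assert np.isclose(coherence(A), welch_bound(2, 3))
```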

OK, let's put this all together: structural bound + packing bound + covering bound.

Comparison of bounds: Gaussian i.i.d. construction, achievable (covering) bound, best known CS matrix, converse (structural) bound, and converse (packing) bound.





Relevance of the Thompson polynomial: comparing its zeros with histograms of the singular values of random sub-matrices. Useful in stochastic RIP!

Future directions
- Connections to the Johnson-Lindenstrauss Lemma
- Packing bounds for K > 2
- Explicit constructions of optimal matrices
- Extensions to Universal Compressed Sensing

Summary
- Derived deterministic bounds for RIP: a structural bound based on the SVD, and a packing bound based on sphere/cone packing
- Connections to coding: codes on Grassmannian spaces
- Geometric interpretations: the Generalized Pythagorean Theorem and equi-angular tight frames

BACKUP SLIDES


Structural bound and equi-angular tight frames: columns of Φ such that the norms are equal, the angle between every pair is the same, and ΦΦ* = cI.

Equi-angular tight frames: Φ meets the structural bound iff it is an ETF.

(Comparison plots for M = 4, M = 8, and M = 16.)