Slide 1: A Motivating Application: Sensor Array Signal Processing
Goal: estimate the directions of arrival of acoustic sources using a microphone array. [Figure: data collection setup and the underlying "sparse" spatial spectrum f*, related by the forward and inverse mappings.]
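A minimal sketch of the forward model this slide alludes to, assuming a narrowband uniform linear array and a discretized grid of candidate directions; the function name `steering_matrix`, the sensor count, spacing, and grid are illustrative choices, not taken from the deck.

```python
import numpy as np

# Hypothetical narrowband forward model for a uniform linear array (ULA):
# each column of A is the array response for one candidate direction, so a
# spatial spectrum f that is sparse on the DOA grid gives measurements y = A f.
def steering_matrix(n_sensors, doa_grid_deg, spacing_wavelengths=0.5):
    doas = np.deg2rad(np.asarray(doa_grid_deg))
    positions = np.arange(n_sensors)[:, None]            # sensor index m
    phase = 2j * np.pi * spacing_wavelengths * positions * np.sin(doas)[None, :]
    return np.exp(phase)                                 # n_sensors x n_grid

grid = np.arange(-90, 91, 1)                             # 1-degree DOA grid
A = steering_matrix(8, grid)                             # 8 microphones, 181 columns
f_true = np.zeros(len(grid))
f_true[grid == -20] = 1.0                                # two sources on the grid
f_true[grid == 35] = 0.7
y = A @ f_true                                           # noiseless array snapshot
```

Recovering f from y is the inverse problem; because the grid is much finer than the number of microphones, it is underdetermined, which is where the sparsity machinery of the following slides comes in.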
Slide 2: Underdetermined Linear Inverse Problems
Basic problem: find an estimate of $f$ in $y = A f$, where $A$ is $N \times M$ with $N < M$. The system is underdetermined, so the solution is non-unique; additional information or constraints are needed to single out a unique solution. A typical approach is the minimum-norm solution $\hat f = \arg\min_f \|f\|_2$ subject to $y = A f$, given in closed form by $\hat f = A^T (A A^T)^{-1} y$. What if we know $f$ is sparse (i.e. has few non-zero elements)?
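A short sketch of the minimum-norm solution on a random underdetermined system, showing that it matches the data exactly but is not at all sparse; the dimensions and spike locations are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
N, M = 16, 128                          # underdetermined: more unknowns than data
A = rng.standard_normal((N, M))

f_true = np.zeros(M)                    # a sparse ground truth (3 spikes)
f_true[[5, 40, 90]] = [1.0, -2.0, 0.5]
y = A @ f_true

# Minimum l2-norm solution: f_hat = A^T (A A^T)^{-1} y  (equivalently pinv(A) @ y)
f_min_norm = A.T @ np.linalg.solve(A @ A.T, y)
print(np.allclose(A @ f_min_norm, y))                 # consistent with the data ...
print(np.count_nonzero(np.abs(f_min_norm) > 1e-6))    # ... but essentially all 128 entries are non-zero
```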
Slide 3: Sparsity constraints
Prefer the sparsest solution: $\hat f = \arg\min_f \|f\|_0$ subject to $y = A f$, where $\|f\|_0$ is the number of non-zero elements of $f$. This can be viewed as finding a sparse representation of the signal $y$ in an overcomplete dictionary $A$. It is an intractable combinatorial optimization problem. Are there tractable alternatives that might produce the same result? Empirical observation: l1-norm-based techniques produce solutions that look sparse, and the l1 cost function can be optimized by linear programming (see the sketch below).
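A hedged sketch of the linear-programming route mentioned above: the problem $\min \|f\|_1$ s.t. $Af = y$ can be rewritten as an LP with auxiliary bound variables $t_i \ge |f_i|$ and handed to scipy.optimize.linprog. This assumes real-valued $A$ and $y$ and is one standard formulation, not necessarily the one used by the deck's authors.

```python
import numpy as np
from scipy.optimize import linprog

def l1_min(A, y):
    """min ||f||_1 s.t. A f = y, posed as a linear program over x = [f; t]
    with element-wise bounds |f_i| <= t_i."""
    N, M = A.shape
    c = np.concatenate([np.zeros(M), np.ones(M)])        # minimize sum(t)
    A_eq = np.hstack([A, np.zeros((N, M))])              # A f = y
    I = np.eye(M)
    A_ub = np.vstack([np.hstack([I, -I]),                #  f - t <= 0
                      np.hstack([-I, -I])])              # -f - t <= 0
    b_ub = np.zeros(2 * M)
    bounds = [(None, None)] * M + [(0, None)] * M        # f free, t >= 0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=y,
                  bounds=bounds, method="highs")
    return res.x[:M]
```

For the complex-valued operators that arise in array processing, the same idea leads to a second-order cone program rather than an LP.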
Slide 4: l1-norm and sparsity – a simple example
[Figure: a sparse signal f1 (l2 norm 1.4142, l1 norm 2.0000) and a non-sparse signal f2 (l2 norm 0.5816, l1 norm 3.5549).] For these two signals we have A f1 = A f2, where A is a 16x128 DFT operator, yet the sparse signal has the smaller l1 norm. See lp_norm_example.m, which produces these plots as well as plots of the lp-norm vs. p for the two signals. Goal: a rigorous characterization of the l1 – sparsity link.
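A sketch in the spirit of the referenced lp_norm_example.m (not the original file; the choice of operator rows and spike locations is a guess, so the numbers for f2 will not match the figure): it builds a 16x128 real low-frequency measurement operator, a two-spike sparse signal f1, and a second signal f2 with identical measurements, then compares their lp norms.

```python
import numpy as np

M = 128
n = np.arange(M)
cos_rows = [np.cos(2 * np.pi * k * n / M) for k in range(8)]      # frequencies 0..7
sin_rows = [np.sin(2 * np.pi * k * n / M) for k in range(1, 9)]   # frequencies 1..8
A = np.vstack(cos_rows + sin_rows)                                # 16 x 128 operator

f1 = np.zeros(M)
f1[[20, 60]] = 1.0                               # sparse: two unit spikes
f2 = np.linalg.pinv(A) @ (A @ f1)                # same measurements, minimum l2 norm

for p in (2.0, 1.0, 0.5):
    lp = lambda f, p=p: np.sum(np.abs(f) ** p) ** (1.0 / p)
    print(f"p={p}:  ||f1||_p = {lp(f1):.4f}   ||f2||_p = {lp(f2):.4f}")
```

The non-sparse f2 wins under the l2 norm but loses under the l1 norm (and loses even more badly for p < 1), which is exactly the behavior the slide's plots illustrate.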
Slide 5: l0 uniqueness conditions
Prefer the sparsest solution: $\hat f = \arg\min_f \|f\|_0$ subject to $y = A f$, where $\|f\|_0$ is the number of non-zero elements of $f$. Let $y = A f^*$. When is $\hat f = f^*$? Definition: the index of ambiguity $K(A)$ is the largest integer such that any set of $K(A)$ columns of $A$ is linearly independent. Thm. 1: if $\|f^*\|_0 \le K(A)/2$, then $f^*$ is the unique l0 solution. What can we say about more tractable formulations like l1?
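A brute-force way to compute K(A) for toy cases; this is a sketch with an illustrative 2x4 matrix of my own, and the cost is exponential in the number of columns, so it is only meant for very small examples.

```python
import numpy as np
from itertools import combinations

def index_of_ambiguity(A, tol=1e-10):
    """Brute-force K(A): the largest k such that EVERY set of k columns of A
    is linearly independent."""
    N, M = A.shape
    K = 0
    for k in range(1, min(N, M) + 1):
        if all(np.linalg.matrix_rank(A[:, list(cols)], tol=tol) == k
               for cols in combinations(range(M), k)):
            K = k
        else:
            break
    return K

A = np.array([[1., 0., 1.,  1.],
              [0., 1., 1., -1.]])
print(index_of_ambiguity(A))   # 2: every pair of columns is linearly independent
```

By Thm. 1, any f* with at most K(A)/2 = 1 non-zero entry is then guaranteed to be the unique sparsest explanation of its measurements.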
Slide 6: l1 equivalence conditions
Consider the l1 problem: $\hat f = \arg\min_f \|f\|_1$ subject to $y = A f$. Can we ever hope to get $\hat f = f^*$? Definition: $M(A)$ is the maximum absolute dot product between distinct (unit-normalized) columns of $A$. Thm. 2(*): if $f^*$ is sparse enough, namely $\|f^*\|_0 < \tfrac{1}{2}\bigl(1 + 1/M(A)\bigr)$, then the exact solution is obtained by l1 optimization: the l1 solution equals the l0 solution! We can solve a combinatorial optimization problem by convex optimization. (*) Donoho and Elad obtained a similar result concurrently.
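A small sketch that computes M(A) and the resulting sufficient sparsity level from Thm. 2; the random 16x128 dictionary here is arbitrary, not the deck's DFT operator.

```python
import numpy as np

def mutual_coherence(A):
    """M(A): maximum absolute inner product between distinct normalized columns."""
    An = A / np.linalg.norm(A, axis=0, keepdims=True)
    G = np.abs(An.conj().T @ An)
    np.fill_diagonal(G, 0.0)
    return G.max()

rng = np.random.default_rng(0)
A = rng.standard_normal((16, 128))
M_A = mutual_coherence(A)
# Thm. 2: l1 recovers f* whenever ||f*||_0 < (1 + 1/M(A)) / 2
print(M_A, 0.5 * (1 + 1 / M_A))
```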
Slide 7: lp (p ≤ 1) equivalence conditions
Consider the lp problem: $\hat f = \arg\min_f \|f\|_p$ subject to $y = A f$, with $p \le 1$. How about $\hat f = f^*$? Thm. 3: if $f^*$ is sparse enough, with a threshold that now depends on $p$, the lp solution equals the l0 solution. Smaller $p$ means more non-zero elements are tolerated, and as $p \to 0$ we recover the l0 uniqueness condition, namely $\|f^*\|_0 \le K(A)/2$.
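Minimizing $\|f\|_p$ for $p < 1$ is non-convex; one common heuristic is iteratively reweighted least squares in the style of FOCUSS, sketched below under the assumption of a noiseless constraint $Af = y$. This is offered as an illustration of how the lp objective is attacked in practice, not as the method behind Thm. 3.

```python
import numpy as np

def irls_lp(A, y, p=0.8, n_iter=100, eps=1e-8):
    """Heuristic for argmin ||f||_p s.t. A f = y via iteratively reweighted
    least squares: each step solves a weighted minimum-norm problem whose
    weights shrink the small entries of the current iterate."""
    f = np.linalg.pinv(A) @ y                         # start at the min l2-norm solution
    for _ in range(n_iter):
        q = (np.abs(f) ** 2 + eps) ** ((2 - p) / 2)   # q_i ~ |f_i|^(2-p)
        AQ = A * q                                    # A @ diag(q)
        f = q * (A.conj().T @ np.linalg.solve(AQ @ A.conj().T, y))
    return f
```

In practice the iterates concentrate on a few large entries, which is the qualitative behavior Thm. 3 explains when f* is sparse enough.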