Basic Algorithms Christina Gallner 6.11.2014
Basic Algorithms
Sparse recovery problem: $\min \|z\|_0$ s.t. $Az = y$
Overview: Optimization Methods, Greedy Methods, Thresholding-based Methods
Optimization Methods
Optimization Methods
General optimization problem: $\min_{x \in \mathbb{R}^n} F_0(x)$ s.t. $F_i(x) \le b_i$, $i \in [n]$
Approximation of the sparse recovery problem: $\min \|z\|_q$ s.t. $Az = y$
Optimization Methods: $\ell_1$-minimization
Input: measurement matrix $A$, measurement vector $y$
Instruction: $x^\# = \operatorname{argmin} \|z\|_1$ s.t. $Az = y$
Output: vector $x^\#$
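For real-valued data, the $\ell_1$-minimization instruction can be sketched as a linear program via the standard split $z = u - v$ with $u, v \ge 0$; SciPy's `linprog` is my choice of solver here, not prescribed by the slides, and the demo data are made up.

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(A, y):
    """min ||z||_1 subject to A z = y, for real A and y."""
    m, N = A.shape
    c = np.ones(2 * N)          # objective: sum(u) + sum(v) = ||u - v||_1 at the optimum
    A_eq = np.hstack([A, -A])   # equality constraint: A u - A v = y
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None))
    u, v = res.x[:N], res.x[N:]
    return u - v

# Hypothetical demo: a 1-sparse vector and random Gaussian measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 8))
x_true = np.zeros(8); x_true[2] = 1.5
y = A @ x_true
x_hat = basis_pursuit(A, y)
```

The returned vector is feasible ($Ax^\# = y$) and has $\ell_1$ norm no larger than that of any other feasible vector, which connects to the theorem on sparsity of $\ell_1$ minimizers below.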
Optimization Methods: Theorem ($\ell_1$ minimizers are sparse)
Let $A \in \mathbb{R}^{m \times N}$ be a measurement matrix with columns $a_1, \dots, a_N$. Assuming the uniqueness of a minimizer $x^\#$ of $\min_{z \in \mathbb{R}^N} \|z\|_1$ s.t. $Az = y$, the system $(a_j,\ j \in \operatorname{supp}(x^\#))$ is linearly independent, and in particular $\|x^\#\|_0 = \operatorname{card}(\operatorname{supp}(x^\#)) \le m$.
Optimization Methods: Convex setting
$\min \|z\|_1$ s.t. $\|Az - y\|_2 \le \eta$
For $z \in \mathbb{C}^N$ write $u = \operatorname{Re}(z)$, $v = \operatorname{Im}(z)$ and introduce $c_j \ge |z_j| = \sqrt{u_j^2 + v_j^2}$ for all $j \in [N]$. This yields the equivalent real program
$$\min_{c,u,v \in \mathbb{R}^N} \sum_{j=1}^{N} c_j \quad \text{s.t.} \quad \left\| \begin{pmatrix} \operatorname{Re}(A) & -\operatorname{Im}(A) \\ \operatorname{Im}(A) & \operatorname{Re}(A) \end{pmatrix} \begin{pmatrix} u \\ v \end{pmatrix} - \begin{pmatrix} \operatorname{Re}(y) \\ \operatorname{Im}(y) \end{pmatrix} \right\|_2 \le \eta, \qquad \sqrt{u_j^2 + v_j^2} \le c_j, \ j = 1, \dots, N$$
Optimization Methods: Quadratically constrained basis pursuit
Input: measurement matrix $A$, measurement vector $y$, noise level $\eta$
Instruction: $x^\# = \operatorname{argmin} \|z\|_1$ s.t. $\|Az - y\|_2 \le \eta$
Output: vector $x^\#$
Optimization Methods: LASSO
Problem: $\min_{z \in \mathbb{C}^N} \|Az - y\|_2$ s.t. $\|z\|_1 \le \tau$
If $x$ is a unique minimizer of the quadratically constrained basis pursuit ($x^\# = \operatorname{argmin} \|z\|_1$ s.t. $\|Az - y\|_2 \le \eta$) with $\eta \ge 0$, then there exists $\tau = \tau_x \ge 0$ such that $x$ is a unique minimizer of the LASSO.
Optimization Methods: Homotopy method
Goal: solve $x^\# = \operatorname{argmin} \|z\|_1$ s.t. $Az = y$.
To this end, consider the functional $F_\lambda(x) = \tfrac{1}{2}\|Ax - y\|_2^2 + \lambda \|x\|_1$.
Every cluster point of $(x_{\lambda_n})$ with $\lim_{n \to \infty} \lambda_n = 0^+$ is a minimizer.
Optimization Methods: Homotopy method
Start: $x_{\lambda^{(0)}} = 0$, with $\lambda^{(0)} = \|A^* y\|_\infty$.
Move as far as possible in the direction of the component $j$ maximizing $|(A^* y)_j|$, so that the residual error $\|y - Ax\|_2$ decreases as much as possible.
Update $\lambda$.
Repeat until $\lambda = 0$.
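The slides follow the homotopy (path-following) scheme; as a simpler stand-in, the sketch below minimizes the same functional $F_\lambda$ by iterative soft thresholding (ISTA) along a decreasing sequence $\lambda_0 > \lambda_1 > \dots$ with warm starts. ISTA is a different solver for $F_\lambda$, not the homotopy method itself, and the data are made up.

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t * ||.||_1
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def ista_path(A, y, lambdas, n_iter=200):
    """Minimize F_lambda(x) = 0.5||Ax - y||_2^2 + lambda ||x||_1 for decreasing lambdas."""
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for lam in lambdas:                  # warm-start each stage from the previous one
        for _ in range(n_iter):
            x = soft_threshold(x + A.T @ (y - A @ x) / L, lam / L)
    return x

rng = np.random.default_rng(3)
A = rng.standard_normal((8, 16))
x_true = np.zeros(16); x_true[5] = 3.0
y = A @ x_true
lam0 = np.linalg.norm(A.T @ y, np.inf)   # lambda^(0) = ||A* y||_inf, as on the slide
x_hat = ista_path(A, y, lambdas=[lam0 / 4, lam0 / 16, lam0 / 64])
```

As $\lambda$ shrinks toward $0$, the iterates approach equality-constrained $\ell_1$ minimizers, mirroring the cluster-point statement above.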
Greedy Methods
Greedy Methods: Orthogonal matching pursuit (OMP)
Input: measurement matrix $A$, measurement vector $y$
Initialization: $S^0 = \emptyset$, $x^0 = 0$
Iteration:
$S^{n+1} = S^n \cup \{j_{n+1}\}$, $j_{n+1} = \operatorname{argmax}_{j \in [N]} |(A^*(y - Ax^n))_j|$
$x^{n+1} = \operatorname{argmin}_{z \in \mathbb{C}^N} \|y - Az\|_2$ s.t. $\operatorname{supp}(z) \subset S^{n+1}$
Output: $\bar n$-sparse vector $x^\# = x^{\bar n}$, where the algorithm stops at $\bar n$ because of a stopping criterion.
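The OMP iteration above can be sketched in NumPy for real-valued $A$ and $y$; the function name and demo data are my own, not from the slides.

```python
import numpy as np

def omp(A, y, n_iter):
    N = A.shape[1]
    S = []                                # support set S^n
    x = np.zeros(N)
    for _ in range(n_iter):
        corr = A.T @ (y - A @ x)          # A*(y - A x^n)
        j = int(np.argmax(np.abs(corr)))  # greedy index j_{n+1}
        if j not in S:
            S.append(j)
        # x^{n+1}: least-squares fit restricted to S^{n+1}
        coef, *_ = np.linalg.lstsq(A[:, S], y, rcond=None)
        x = np.zeros(N)
        x[S] = coef
    return x

rng = np.random.default_rng(1)
A = rng.standard_normal((10, 20))
x_true = np.zeros(20); x_true[[3, 7]] = [2.0, -1.0]
y = A @ x_true
x_hat = omp(A, y, n_iter=2)
```

After $n$ iterations the iterate is supported on at most $n$ indices, and the residual never exceeds $\|y\|_2$ since $z = 0$ is always feasible in the least-squares step.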
Greedy Methods: Orthogonal matching pursuit, Lemma
Let $S \subset [N]$, let $x^n$ satisfy $\operatorname{supp}(x^n) \subset S$, and let $j \in [N]$. If $w := \operatorname{argmin}_{z \in \mathbb{C}^N} \|y - Az\|_2$ s.t. $\operatorname{supp}(z) \subset S \cup \{j\}$, then $\|y - Aw\|_2^2 \le \|y - Ax^n\|_2^2 - |(A^*(y - Ax^n))_j|^2$.
Greedy Methods: Orthogonal matching pursuit, Lemma
Given an index set $S \subset [N]$, if $v := \operatorname{argmin}_{z \in \mathbb{C}^N} \|y - Az\|_2$ s.t. $\operatorname{supp}(z) \subset S$, then $(A^*(y - Av))_S = 0$.
Greedy Methods: Stopping criteria
For example:
$Ax^n = y$, or, to allow for calculation and measurement errors, $\|y - Ax^n\|_2 \le \varepsilon$
Given an estimate $s$ of the sparsity: stop at $\bar n = s$, so that $x^{\bar n}$ is $s$-sparse
Greedy Methods: Orthogonal matching pursuit, Proposition
Given a matrix $A \in \mathbb{C}^{m \times N}$, every nonzero vector $x \in \mathbb{C}^N$ supported on a set $S$ of size $s$ is recovered from $y = Ax$ after at most $s$ iterations of orthogonal matching pursuit if and only if the matrix $A_S$ is injective and $\max_{j \in S} |(A^* r)_j| > \max_{l \in \overline{S}} |(A^* r)_l|$ for all nonzero $r \in \{Az,\ \operatorname{supp}(z) \subset S\}$.
Greedy Methods: Compressive sampling matching pursuit (CoSaMP)
Input: measurement matrix $A$, measurement vector $y$, sparsity level $s$
Initialization: $s$-sparse vector $x^0$
Iteration: repeat until $n = \bar n$:
$U^{n+1} = \operatorname{supp}(x^n) \cup L_{2s}(A^*(y - Ax^n))$
$u^{n+1} = \operatorname{argmin}_{z \in \mathbb{C}^N} \|y - Az\|_2$ s.t. $\operatorname{supp}(z) \subset U^{n+1}$
$x^{n+1} = H_s(u^{n+1})$
Output: $s$-sparse vector $x^\# = x^{\bar n}$
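The CoSaMP iteration can be sketched as follows, where $L_k(\cdot)$ returns the indices of the $k$ largest-magnitude entries and $H_s$ is the hard thresholding operator; helper names and demo data are my own.

```python
import numpy as np

def hard_threshold(z, s):
    # H_s: keep the s largest-magnitude entries, zero out the rest
    out = np.zeros_like(z)
    keep = np.argsort(np.abs(z))[-s:]
    out[keep] = z[keep]
    return out

def cosamp(A, y, s, n_iter=10):
    N = A.shape[1]
    x = np.zeros(N)
    for _ in range(n_iter):
        corr = A.T @ (y - A @ x)                 # A*(y - A x^n)
        L2s = np.argsort(np.abs(corr))[-2 * s:]  # L_{2s}: 2s largest correlations
        U = np.union1d(np.flatnonzero(x), L2s)   # U^{n+1} = supp(x^n) ∪ L_{2s}(...)
        # u^{n+1}: least-squares fit restricted to U^{n+1}
        coef, *_ = np.linalg.lstsq(A[:, U], y, rcond=None)
        u = np.zeros(N)
        u[U] = coef
        x = hard_threshold(u, s)                 # x^{n+1} = H_s(u^{n+1})
    return x

rng = np.random.default_rng(4)
A = rng.standard_normal((15, 30))
x_true = np.zeros(30); x_true[[1, 8, 20]] = [1.0, 2.0, -1.5]
y = A @ x_true
x_hat = cosamp(A, y, s=3)
```

Unlike OMP, each iteration refits on a candidate set of up to $3s$ indices and then prunes back to $s$, so mistakes made in earlier iterations can be corrected.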
Thresholding-based methods
Thresholding-Based Methods: Basic thresholding
Input: measurement matrix $A$, measurement vector $y$, sparsity level $s$
Instruction:
$S^\# = L_s(A^* y)$
$x^\# = \operatorname{argmin}_{z \in \mathbb{C}^N} \|y - Az\|_2$ s.t. $\operatorname{supp}(z) \subset S^\#$
Output: $s$-sparse vector $x^\#$
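Basic thresholding is a one-shot procedure: pick the support from the $s$ largest entries of $A^* y$, then least-squares fit on that support. A sketch with made-up demo data:

```python
import numpy as np

def basic_thresholding(A, y, s):
    N = A.shape[1]
    S = np.argsort(np.abs(A.T @ y))[-s:]   # S# = L_s(A* y)
    # x#: least-squares fit restricted to S#
    coef, *_ = np.linalg.lstsq(A[:, S], y, rcond=None)
    x = np.zeros(N)
    x[S] = coef
    return x

rng = np.random.default_rng(5)
A = rng.standard_normal((10, 20))
A /= np.linalg.norm(A, axis=0)   # l2-normalize columns before correlating (my addition)
x_true = np.zeros(20); x_true[6] = 2.0
y = A @ x_true
x_hat = basic_thresholding(A, y, s=1)
```

There is no iteration: success hinges entirely on the correlation condition in the proposition below.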
Thresholding-Based Methods: Basic thresholding, Proposition
A vector $x \in \mathbb{C}^N$ supported on a set $S$ is recovered from $y = Ax$ via basic thresholding if and only if $\min_{j \in S} |(A^* y)_j| > \max_{l \in \overline{S}} |(A^* y)_l|$.
Thresholding-Based Methods: Iterative hard thresholding (IHT)
Input: measurement matrix $A$, measurement vector $y$, sparsity level $s$
Initialization: $s$-sparse vector $x^0$
Iteration: repeat until $n = \bar n$:
$x^{n+1} = H_s(x^n + A^*(y - Ax^n))$
Output: $s$-sparse vector $x^\# = x^{\bar n}$
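The IHT iteration is a gradient step followed by hard thresholding. Note that the plain iteration is only stable when the spectral norm of $A$ is at most about $1$ (e.g. under a restricted isometry condition), so this sketch rescales $A$ first; the rescaling and demo data are my additions.

```python
import numpy as np

def hard_threshold(z, s):
    # H_s: keep the s largest-magnitude entries, zero out the rest
    out = np.zeros_like(z)
    keep = np.argsort(np.abs(z))[-s:]
    out[keep] = z[keep]
    return out

def iht(A, y, s, n_iter=100):
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        # x^{n+1} = H_s(x^n + A*(y - A x^n))
        x = hard_threshold(x + A.T @ (y - A @ x), s)
    return x

rng = np.random.default_rng(2)
A = rng.standard_normal((12, 24))
A /= np.linalg.norm(A, 2)        # keep ||A||_2 <= 1 so the iteration stays bounded
x_true = np.zeros(24); x_true[[4, 9, 17]] = [1.0, -2.0, 0.5]
y = A @ x_true
x_hat = iht(A, y, s=3)
```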
Thresholding-Based Methods: Hard thresholding pursuit (HTP)
Input: measurement matrix $A$, measurement vector $y$, sparsity level $s$
Initialization: $s$-sparse vector $x^0$
Iteration: repeat until $n = \bar n$:
$S^{n+1} = L_s(x^n + A^*(y - Ax^n))$
$x^{n+1} = \operatorname{argmin}_{z \in \mathbb{C}^N} \|y - Az\|_2$ s.t. $\operatorname{supp}(z) \subset S^{n+1}$
Output: $s$-sparse vector $x^\# = x^{\bar n}$
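HTP combines the IHT support selection with the OMP-style least-squares refit. A NumPy sketch, again with a rescaled matrix and made-up demo data of my own:

```python
import numpy as np

def htp(A, y, s, n_iter=20):
    N = A.shape[1]
    x = np.zeros(N)
    for _ in range(n_iter):
        # S^{n+1} = L_s(x^n + A*(y - A x^n)): support of the s largest entries
        S = np.argsort(np.abs(x + A.T @ (y - A @ x)))[-s:]
        # x^{n+1}: least-squares fit restricted to S^{n+1}
        coef, *_ = np.linalg.lstsq(A[:, S], y, rcond=None)
        x = np.zeros(N)
        x[S] = coef
    return x

rng = np.random.default_rng(6)
A = rng.standard_normal((12, 24))
A /= np.linalg.norm(A, 2)        # rescale so the gradient step is stable (my addition)
x_true = np.zeros(24); x_true[[2, 11]] = [1.0, -1.0]
y = A @ x_true
x_hat = htp(A, y, s=2)
```

Because the iterate is always an exact least-squares solution on an $s$-element support, HTP typically stabilizes in few iterations: once the correct support is found, it is a fixed point.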
Any questions?