Basic Algorithms
Christina Gallner
Basic Algorithms
Sparse recovery problem: $\min \|z\|_0$ s.t. $Az = y$
Optimization Methods Greedy Methods Thresholding-based Methods
Optimization Methods
Optimization Methods
General form of an optimization problem:
$\min_{x \in \mathbb{R}^N} F_0(x)$ s.t. $F_i(x) \le b_i, \quad i \in [n]$
Approximation of the sparse recovery problem: $\min \|z\|_1$ s.t. $Az = y$
Optimization Methods
$\ell_1$-minimization
Input: measurement matrix $A$, measurement vector $y$
Instruction: $x^\# = \operatorname{argmin} \|z\|_1$ s.t. $Az = y$
Output: vector $x^\#$
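The $\ell_1$-minimization above can be posed as a linear program via the standard split $z = u - v$ with $u, v \ge 0$. A minimal real-valued sketch using SciPy (the slides allow complex data; this sketch assumes real $A$ and $y$, and the name `basis_pursuit` is illustrative):

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(A, y):
    """min ||z||_1 s.t. Az = y, via the LP split z = u - v with u, v >= 0."""
    m, N = A.shape
    c = np.ones(2 * N)             # objective: sum(u) + sum(v) = ||z||_1
    A_eq = np.hstack([A, -A])      # equality constraint A(u - v) = y
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None))
    return res.x[:N] - res.x[N:]

# Example: recover a 2-sparse vector from 10 Gaussian measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((10, 30))
x = np.zeros(30)
x[3], x[17] = 1.0, -2.0
x_hat = basis_pursuit(A, A @ x)
```

Since $x$ itself is feasible for the LP, the optimum never exceeds $\|x\|_1$; whether $x$ is recovered exactly depends on the measurement matrix.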
Optimization Methods
Theorem ($\ell_1$-minimizers are sparse). Let $A \in \mathbb{C}^{m \times N}$ be a measurement matrix with columns $a_1, \dots, a_N$. Assuming the uniqueness of a minimizer $x^\#$ of $\min_{z \in \mathbb{C}^N} \|z\|_1$ s.t. $Az = y$, the system $(a_j,\ j \in \operatorname{supp} x^\#)$ is linearly independent, and in particular $\|x^\#\|_0 = \operatorname{card}(\operatorname{supp} x^\#) \le m$.
Optimization Methods
Convex setting: $\min \|z\|_1$ s.t. $\|Az - y\|_2 \le \eta$, over $z \in \mathbb{C}^N$.
Write $u = \operatorname{Re}(z)$, $v = \operatorname{Im}(z)$ and introduce $c_j \ge |z_j| = \sqrt{u_j^2 + v_j^2}$ for all $j \in [N]$. Then solve
$\min_{c,u,v \in \mathbb{R}^N} \sum_{j=1}^{N} c_j$ s.t. $\left\| \begin{pmatrix} \operatorname{Re} A & -\operatorname{Im} A \\ \operatorname{Im} A & \operatorname{Re} A \end{pmatrix} \begin{pmatrix} u \\ v \end{pmatrix} - \begin{pmatrix} \operatorname{Re} y \\ \operatorname{Im} y \end{pmatrix} \right\|_2 \le \eta, \qquad \sqrt{u_j^2 + v_j^2} \le c_j, \quad j \in [N].$
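The realification used above can be checked numerically: stacking $\operatorname{Re}(Az)$ over $\operatorname{Im}(Az)$ equals the block matrix applied to $(u, v)$, so the complex constraint $\|Az - y\|_2 \le \eta$ is equivalent to its real counterpart. A small sketch:

```python
import numpy as np

rng = np.random.default_rng(1)
m, N = 4, 6
A = rng.standard_normal((m, N)) + 1j * rng.standard_normal((m, N))
z = rng.standard_normal(N) + 1j * rng.standard_normal(N)

# Real block matrix [[Re A, -Im A], [Im A, Re A]] acting on (u, v) = (Re z, Im z)
B = np.block([[A.real, -A.imag], [A.imag, A.real]])
uv = np.concatenate([z.real, z.imag])
w = B @ uv

# w equals Re(Az) stacked over Im(Az), and the Euclidean norms agree.
assert np.allclose(w, np.concatenate([(A @ z).real, (A @ z).imag]))
assert np.isclose(np.linalg.norm(w), np.linalg.norm(A @ z))
```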
Optimization Methods
Quadratically constrained basis pursuit
Input: measurement matrix $A$, measurement vector $y$, noise level $\eta$
Instruction: $x^\# = \operatorname{argmin} \|z\|_1$ s.t. $\|Az - y\|_2 \le \eta$
Output: vector $x^\#$
Optimization Methods
LASSO problem: $\min_{z \in \mathbb{C}^N} \|Az - y\|_2$ s.t. $\|z\|_1 \le \tau$
If $x$ is a unique minimizer of the quadratically constrained basis pursuit ($x^\# = \operatorname{argmin} \|z\|_1$ s.t. $\|Az - y\|_2 \le \eta$) with $\eta \ge 0$, then there exists $\tau = \tau(x) \ge 0$ such that $x$ is a unique minimizer of the LASSO.
Optimization Methods
Homotopy Method: solve $x^\# = \operatorname{argmin} \|z\|_1$ s.t. $Az = y$.
To this end, consider $F_\lambda(x) = \tfrac{1}{2}\|Ax - y\|_2^2 + \lambda \|x\|_1$.
Every cluster point of $(x_{\lambda_n})$ with $\lim_{n \to \infty} \lambda_n = 0^+$ is a minimizer.
Optimization Methods: Homotopy Method
Start: $x^{\lambda_0} = 0 \Rightarrow \lambda^{(0)} = \|A^* y\|_\infty$
Go as far as possible in the direction $(A^* y)_j$ such that the error $\|y - Ax\|_2$ becomes as small as possible.
Update the support set $S$.
Repeat until $\lambda = 0$.
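The homotopy path itself requires tracking the active support as $\lambda$ decreases. As a simpler stand-in (not the homotopy method of the slides), the functional $F_\lambda$ for a fixed $\lambda$ can be minimized by iterative soft thresholding (ISTA), sketched here for real data:

```python
import numpy as np

def soft_threshold(x, t):
    """Componentwise soft thresholding, the proximal map of t*||.||_1."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(A, y, lam, n_iter=500):
    """Minimize F_lam(x) = 0.5*||Ax - y||_2^2 + lam*||x||_1 by ISTA:
    a gradient step on the quadratic part followed by soft thresholding."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = soft_threshold(x - A.T @ (A @ x - y) / L, lam / L)
    return x

# Small demo problem with illustrative data.
rng = np.random.default_rng(4)
A = rng.standard_normal((8, 20))
y = rng.standard_normal(8)
lam = 0.1
x_ista = ista(A, y, lam)
```

With step size $1/L$, each ISTA iteration does not increase $F_\lambda$, so the final iterate is at least as good as the starting point $x = 0$.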
Greedy Methods
Greedy Methods
Orthogonal matching pursuit
Input: measurement matrix $A$, measurement vector $y$
Initialization: $S^0 = \emptyset$, $x^0 = 0$
Iteration:
$S^{n+1} = S^n \cup \{j_{n+1}\}$, where $j_{n+1} = \operatorname{argmax}_{j \in [N]} |(A^*(y - Ax^n))_j|$
$x^{n+1} = \operatorname{argmin}_{z \in \mathbb{C}^N} \{\|y - Az\|_2,\ \operatorname{supp}(z) \subseteq S^{n+1}\}$
Output: $\bar{n}$-sparse vector $x^\# = x^{\bar{n}}$, where the algorithm stops at $\bar{n}$ because of a stopping criterion.
Greedy Methods: Orthogonal matching pursuit
Lemma. Let $S \subseteq [N]$, $x^n$ with $\operatorname{supp}(x^n) \subseteq S$, and $j \in [N]$. If $w \in \operatorname{argmin}_{z \in \mathbb{C}^N} \{\|y - Az\|_2,\ \operatorname{supp}(z) \subseteq S \cup \{j\}\}$, then
$\|y - Aw\|_2^2 \le \|y - Ax^n\|_2^2 - |(A^*(y - Ax^n))_j|^2.$
Greedy Methods: Orthogonal matching pursuit
Lemma. Given an index set $S \subseteq [N]$, if $v \in \operatorname{argmin}_{z \in \mathbb{C}^N} \{\|y - Az\|_2,\ \operatorname{supp}(z) \subseteq S\}$, then $(A^*(y - Av))_S = 0$.
Greedy Methods
Stopping criteria, for example:
$Ax^n = y$, or, because of calculation and measurement errors, $\|y - Ax^n\|_2 \le \varepsilon$
Estimation for the sparsity $s$: $\bar{n} = s \Rightarrow x^{\bar{n}}$ is $s$-sparse
Greedy Methods: Orthogonal matching pursuit
Proposition. Given a matrix $A \in \mathbb{C}^{m \times N}$, every nonzero vector $x \in \mathbb{C}^N$ supported on a set $S$ of size $s$ is recovered from $y = Ax$ after at most $s$ iterations of orthogonal matching pursuit if and only if the matrix $A_S$ is injective and
$\max_{j \in S} |(A^* r)_j| > \max_{\ell \notin S} |(A^* r)_\ell|$
for all nonzero $r \in \{Az,\ \operatorname{supp}(z) \subseteq S\}$.
Greedy Methods
Compressive sampling matching pursuit (CoSaMP)
Input: measurement matrix $A$, measurement vector $y$, sparsity level $s$
Initialization: $s$-sparse vector $x^0$
Iteration: repeat until $n = \bar{n}$:
$U^{n+1} = \operatorname{supp}(x^n) \cup L_{2s}(A^*(y - Ax^n))$
$u^{n+1} = \operatorname{argmin}_{z \in \mathbb{C}^N} \{\|y - Az\|_2,\ \operatorname{supp}(z) \subseteq U^{n+1}\}$
$x^{n+1} = H_s(u^{n+1})$
Output: $s$-sparse vector $x^\# = x^{\bar{n}}$
Here $L_{2s}(\cdot)$ denotes an index set of $2s$ largest-magnitude entries, and $H_s$ is the hard thresholding operator keeping the $s$ largest-magnitude entries.
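A real-valued sketch of the CoSaMP iteration above, with $L_{2s}$ and $H_s$ implemented via `argsort` on absolute values (a fixed iteration count stands in for the stopping criterion):

```python
import numpy as np

def hard_threshold(x, s):
    """H_s: keep the s largest-magnitude entries, zero out the rest."""
    out = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-s:]
    out[idx] = x[idx]
    return out

def cosamp(A, y, s, n_iter=20):
    """CoSaMP: merge the old support with the 2s largest correlation
    entries, least-squares fit on the merged set, then threshold to s."""
    N = A.shape[1]
    x = np.zeros(N)
    for _ in range(n_iter):
        corr = A.T @ (y - A @ x)
        L2s = np.argsort(np.abs(corr))[-2 * s:]   # L_{2s}(A*(y - Ax))
        U = np.union1d(np.nonzero(x)[0], L2s)     # U^{n+1}
        u = np.zeros(N)
        z, *_ = np.linalg.lstsq(A[:, U], y, rcond=None)
        u[U] = z                                  # least-squares fit on U
        x = hard_threshold(u, s)                  # x^{n+1} = H_s(u^{n+1})
    return x

# Demo on a 2-sparse signal with illustrative Gaussian measurements.
rng = np.random.default_rng(6)
A = rng.standard_normal((15, 30))
x = np.zeros(30)
x[2], x[11] = 1.5, -1.0
x_hat = cosamp(A, A @ x, 2)
```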
Thresholding-based methods
Thresholding-Based Methods
Basic thresholding
Input: measurement matrix $A$, measurement vector $y$, sparsity level $s$
Instruction:
$S^\# = L_s(A^* y)$
$x^\# = \operatorname{argmin}_{z \in \mathbb{C}^N} \{\|y - Az\|_2,\ \operatorname{supp}(z) \subseteq S^\#\}$
Output: $s$-sparse vector $x^\#$
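A real-valued sketch of basic thresholding. The demo uses a matrix with orthonormal columns, for which $A^* y = x$, so the thresholding step finds the true support and the least-squares fit recovers $x$ exactly:

```python
import numpy as np

def basic_thresholding(A, y, s):
    """Pick the support as the s largest entries of |A* y|, then solve a
    least-squares problem restricted to those columns."""
    S = np.argsort(np.abs(A.T @ y))[-s:]      # S# = L_s(A* y)
    x = np.zeros(A.shape[1])
    z, *_ = np.linalg.lstsq(A[:, S], y, rcond=None)
    x[S] = z
    return x

# Demo with orthonormal columns, where recovery is exact.
rng = np.random.default_rng(2)
Q, _ = np.linalg.qr(rng.standard_normal((30, 12)))  # Q has orthonormal columns
x = np.zeros(12)
x[2], x[7] = 3.0, -1.5
x_hat = basic_thresholding(Q, Q @ x, 2)
```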
Thresholding-Based Methods: Basic thresholding
Proposition. A vector $x \in \mathbb{C}^N$ supported on a set $S$ is recovered from $y = Ax$ via basic thresholding if and only if
$\min_{j \in S} |(A^* y)_j| > \max_{\ell \notin S} |(A^* y)_\ell|.$
Thresholding-Based Methods
Iterative hard thresholding
Input: measurement matrix $A$, measurement vector $y$, sparsity level $s$
Initialization: $s$-sparse vector $x^0$
Iteration: repeat until $n = \bar{n}$:
$x^{n+1} = H_s(x^n + A^*(y - Ax^n))$
Output: $s$-sparse vector $x^\# = x^{\bar{n}}$
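The single-line iteration above is easy to sketch in NumPy. This real-valued sketch rescales $A$ by its spectral norm, an assumption made here so the gradient step is well behaved:

```python
import numpy as np

def hard_threshold(x, s):
    """H_s: keep the s largest-magnitude entries, zero out the rest."""
    out = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-s:]
    out[idx] = x[idx]
    return out

def iht(A, y, s, n_iter=100):
    """Iterative hard thresholding: a gradient step on 0.5*||Ax - y||_2^2
    followed by projection onto the set of s-sparse vectors."""
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = hard_threshold(x + A.T @ (y - A @ x), s)
    return x

# Demo: scale A so that ||A||_2 = 1 (an assumption of this sketch).
rng = np.random.default_rng(5)
A = rng.standard_normal((15, 30))
A /= np.linalg.norm(A, 2)
x = np.zeros(30)
x[1], x[9] = 1.0, -1.0
x_hat = iht(A, A @ x, 2)
```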
Thresholding-Based Methods
Hard thresholding pursuit
Input: measurement matrix $A$, measurement vector $y$, sparsity level $s$
Initialization: $s$-sparse vector $x^0$
Iteration: repeat until $n = \bar{n}$:
$S^{n+1} = L_s(x^n + A^*(y - Ax^n))$
$x^{n+1} = \operatorname{argmin}_{z \in \mathbb{C}^N} \{\|y - Az\|_2,\ \operatorname{supp}(z) \subseteq S^{n+1}\}$
Output: $s$-sparse vector $x^\# = x^{\bar{n}}$
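Hard thresholding pursuit combines the support selection of iterative hard thresholding with the least-squares refit of orthogonal matching pursuit. A real-valued sketch with a fixed iteration count standing in for the stopping criterion:

```python
import numpy as np

def htp(A, y, s, n_iter=50):
    """Hard thresholding pursuit: select the support from the IHT-style
    update, then least-squares fit on the selected columns."""
    N = A.shape[1]
    x = np.zeros(N)
    for _ in range(n_iter):
        S = np.argsort(np.abs(x + A.T @ (y - A @ x)))[-s:]  # S^{n+1} = L_s(...)
        x = np.zeros(N)
        z, *_ = np.linalg.lstsq(A[:, S], y, rcond=None)
        x[S] = z                                            # refit on S^{n+1}
    return x

# Demo on a 2-sparse signal with illustrative Gaussian measurements.
rng = np.random.default_rng(7)
A = rng.standard_normal((15, 30))
A /= np.linalg.norm(A, 2)      # scaling assumption, as for IHT
x = np.zeros(30)
x[5], x[22] = 2.0, 1.0
x_hat = htp(A, A @ x, 2)
```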
Any questions?