
1 Basic Algorithms Christina Gallner, 6.11.2014

2 Basic Algorithms
Sparse recovery problem: $\min \|z\|_0$ s.t. $Az = y$
Overview: Optimization Methods, Greedy Methods, Thresholding-based Methods

3 Optimization Methods

4 Optimization Methods
General form: $\min_{x \in \mathbb{R}^n} F_0(x)$ s.t. $F_i(x) \le b_i$, $i \in [n]$
Approximation of the sparse recovery problem: $\min \|z\|_q$ s.t. $Az = y$

5 Optimization Methods: $\ell_1$-minimization
Input: measurement matrix $A$, measurement vector $y$
Instruction: $x^\# = \operatorname{argmin}_z \|z\|_1$ subject to $Az = y$
Output: the vector $x^\#$
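In the real-valued case this $\ell_1$ problem can be rewritten as a linear program by splitting $z = z^+ - z^-$ with $z^\pm \ge 0$. The sketch below illustrates that reformulation on synthetic data (not from the slides; the helper name is mine), using SciPy's linprog:

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(A, y):
    """Solve min ||z||_1 s.t. Az = y (real case) as a linear program."""
    m, N = A.shape
    c = np.ones(2 * N)            # objective: sum of z_plus and z_minus entries
    A_eq = np.hstack([A, -A])     # A z_plus - A z_minus = y
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None), method="highs")
    return res.x[:N] - res.x[N:]  # recombine z = z_plus - z_minus

# small synthetic example: Gaussian measurements of a 3-sparse vector
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 40))
x_true = np.zeros(40); x_true[[2, 7, 19]] = [1.0, -2.0, 0.5]
x_hat = basis_pursuit(A, A @ x_true)
print(np.linalg.norm(x_hat - x_true))
```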

6 Optimization Methods: Theorem ($\ell_1$ minimizers are sparse)
Let $A \in \mathbb{R}^{m \times N}$ be a measurement matrix with columns $a_1, \dots, a_N$. Assuming the uniqueness of a minimizer $x^\#$ of $\min_{z \in \mathbb{R}^N} \|z\|_1$ s.t. $Az = y$, the system $(a_j,\ j \in \operatorname{supp} x^\#)$ is linearly independent, and in particular $\|x^\#\|_0 = \operatorname{card}(\operatorname{supp} x^\#) \le m$.

7 Optimization Methods: Convex Setting
$\min \|z\|_1$ s.t. $\|Az - y\|_2 \le \eta$
For $z \in \mathbb{C}^N$ write $u = \operatorname{Re}(z)$, $v = \operatorname{Im}(z)$ and introduce $c_j \ge |z_j| = \sqrt{u_j^2 + v_j^2}$ for all $j \in [N]$. The problem becomes
$\min_{c, u, v \in \mathbb{R}^N} \sum_{j=1}^{N} c_j$ s.t. $\left\| \begin{pmatrix} \operatorname{Re} A & -\operatorname{Im} A \\ \operatorname{Im} A & \operatorname{Re} A \end{pmatrix} \begin{pmatrix} u \\ v \end{pmatrix} - \begin{pmatrix} \operatorname{Re} y \\ \operatorname{Im} y \end{pmatrix} \right\|_2 \le \eta$, $\quad \sqrt{u_1^2 + v_1^2} \le c_1,\ \dots,\ \sqrt{u_N^2 + v_N^2} \le c_N$
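A quick numerical check of the real block-matrix identity used above, on randomly generated complex data (a sketch, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(1)
m, N = 6, 8
A = rng.standard_normal((m, N)) + 1j * rng.standard_normal((m, N))
y = rng.standard_normal(m) + 1j * rng.standard_normal(m)
z = rng.standard_normal(N) + 1j * rng.standard_normal(N)

# real block matrix [[Re A, -Im A], [Im A, Re A]] acting on (u, v) = (Re z, Im z)
B = np.block([[A.real, -A.imag], [A.imag, A.real]])
uv = np.concatenate([z.real, z.imag])
b = np.concatenate([y.real, y.imag])

# the complex residual norm equals the stacked real residual norm
print(np.linalg.norm(A @ z - y), np.linalg.norm(B @ uv - b))
```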

8 Optimization Methods: Quadratically constrained basis pursuit
Input: measurement matrix $A$, measurement vector $y$, noise level $\eta$
Instruction: $x^\# = \operatorname{argmin}_z \|z\|_1$ s.t. $\|Az - y\|_2 \le \eta$
Output: the vector $x^\#$
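In practice this convex program can be handed to a modeling tool directly; the sketch below assumes CVXPY is available and uses synthetic noisy data (not from the slides):

```python
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(9)
m, N, eta = 25, 60, 0.1
A = rng.standard_normal((m, N))
x_true = np.zeros(N); x_true[[3, 20, 45]] = [1.0, -1.5, 2.0]
y = A @ x_true + 0.01 * rng.standard_normal(m)   # noisy measurements

# quadratically constrained basis pursuit: min ||z||_1 s.t. ||Az - y||_2 <= eta
z = cp.Variable(N)
problem = cp.Problem(cp.Minimize(cp.norm(z, 1)), [cp.norm(A @ z - y, 2) <= eta])
problem.solve()
print(np.linalg.norm(z.value - x_true))
```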

9 Optimization Methods: LASSO
Problem: $\min_{z \in \mathbb{C}^N} \|Az - y\|_2$ s.t. $\|z\|_1 \le \tau$
If $x$ is a unique minimizer of the quadratically constrained basis pursuit ($x^\# = \operatorname{argmin}_z \|z\|_1$ s.t. $\|Az - y\|_2 \le \eta$) with $\eta \ge 0$, then there exists $\tau = \tau(x) \ge 0$ such that $x$ is a unique minimizer of the LASSO.

10 Optimization Methods: Homotopy Method
Goal: solve $x^\# = \operatorname{argmin}_z \|z\|_1$ s.t. $Az = y$.
To this end consider $F_\lambda(x) = \tfrac{1}{2}\|Ax - y\|_2^2 + \lambda \|x\|_1$.
Every cluster point of $(x_{\lambda_n})$ with $\lim_{n \to \infty} \lambda_n = 0^+$ is a minimizer.

11 Optimization Methods: Homotopy Method
Start: $x_{\lambda^{(0)}} = 0$, where $\lambda^{(0)} = \|A^* y\|_\infty$
Go as far as possible in the direction of the entry $j$ where $|(A^* y)_j|$ is maximal, so that the residual $\|y - Ax\|_2$ gets as small as possible
Update $\lambda$
Repeat until $\lambda = 0$
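The exact homotopy path requires tracking the active set as $\lambda$ decreases; as a simpler hedged illustration of the underlying idea (minimize $F_\lambda$ for a decreasing sequence $\lambda_n \to 0^+$), the sketch below uses plain iterative soft thresholding (ISTA) rather than the homotopy update itself:

```python
import numpy as np

def soft_threshold(x, t):
    """Entrywise soft thresholding: the proximal map of t * ||.||_1."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def minimize_F_lambda(A, y, lam, n_iter=2000):
    """Minimize F_lambda(x) = 0.5 * ||Ax - y||_2^2 + lam * ||x||_1 by ISTA."""
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = soft_threshold(x + A.T @ (y - A @ x) / L, lam / L)
    return x

rng = np.random.default_rng(2)
A = rng.standard_normal((20, 50))
x_true = np.zeros(50); x_true[[3, 11, 40]] = [1.5, -1.0, 2.0]
y = A @ x_true

lam0 = np.linalg.norm(A.T @ y, np.inf)       # lambda^(0) = ||A* y||_inf from the slide
for lam in [lam0, lam0 / 10, lam0 / 100, lam0 / 1000]:
    x_lam = minimize_F_lambda(A, y, lam)
    print(f"lambda = {lam:.2e}, error = {np.linalg.norm(x_lam - x_true):.2e}")
```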

12 Greedy Methods

13 Greedy Methods: Orthogonal matching pursuit
Input: measurement matrix $A$, measurement vector $y$
Initialization: $S^0 = \emptyset$, $x^0 = 0$
Iteration:
$S^{n+1} = S^n \cup \{j_{n+1}\}$ with $j_{n+1} = \operatorname{argmax}_{j \in [N]} |(A^*(y - Ax^n))_j|$
$x^{n+1} = \operatorname{argmin}_{z \in \mathbb{C}^N} \{\|y - Az\|_2,\ \operatorname{supp}(z) \subset S^{n+1}\}$
Output: the $\bar{n}$-sparse vector $x^\# = x^{\bar{n}}$, where the algorithm stops at $\bar{n}$ because of a stopping criterion.
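A minimal NumPy sketch of this iteration (real case, helper name mine; the least-squares step is restricted to the current support, and the loop uses the stopping criteria from the later slide, i.e. a small residual or $s$ selected indices):

```python
import numpy as np

def omp(A, y, s, tol=1e-10):
    """Orthogonal matching pursuit: greedily grow a support set of size <= s."""
    m, N = A.shape
    S = []                                    # current support S^n
    x = np.zeros(N)
    residual = y.copy()
    while len(S) < s and np.linalg.norm(residual) > tol:
        correlations = A.T @ residual         # A*(y - A x^n)
        j = int(np.argmax(np.abs(correlations)))
        if j in S:                            # no new index helps; stop early
            break
        S.append(j)
        coef, *_ = np.linalg.lstsq(A[:, S], y, rcond=None)
        x = np.zeros(N)
        x[S] = coef                           # least squares on the support
        residual = y - A @ x
    return x, S

rng = np.random.default_rng(3)
A = rng.standard_normal((30, 80))
x_true = np.zeros(80); x_true[[5, 17, 60]] = [2.0, -1.0, 0.7]
x_hat, support = omp(A, A @ x_true, s=3)
print(sorted(support), np.linalg.norm(x_hat - x_true))
```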

14 Greedy Methods: Orthogonal matching pursuit
Lemma: Let $S \subset [N]$, let $x^n$ satisfy $\operatorname{supp} x^n \subset S$, and let $j \in [N]$. If $w := \operatorname{argmin}_{z \in \mathbb{C}^N} \{\|y - Az\|_2,\ \operatorname{supp} z \subset S \cup \{j\}\}$, then $\|y - Aw\|_2^2 \le \|y - Ax^n\|_2^2 - |(A^*(y - Ax^n))_j|^2$.

15 Greedy Methods: Orthogonal matching pursuit
Lemma: Given an index set $S \subset [N]$, if $v := \operatorname{argmin}_{z \in \mathbb{C}^N} \{\|y - Az\|_2,\ \operatorname{supp} z \subset S\}$, then $(A^*(y - Av))_S = 0$.

16 Greedy Methods: Stopping criteria
For example:
$Ax^n = y$, or, allowing for calculation and measurement errors, $\|y - Ax^n\|_2 \le \varepsilon$
With an estimate of the sparsity $s$: stop at $\bar{n} = s$, so that $x^{\bar{n}}$ is $s$-sparse

17 Greedy Methods: Orthogonal matching pursuit
Proposition: Given a matrix $A \in \mathbb{C}^{m \times N}$, every nonzero vector $x \in \mathbb{C}^N$ supported on a set $S$ of size $s$ is recovered from $y = Ax$ after at most $s$ iterations of orthogonal matching pursuit if and only if the matrix $A_S$ is injective and $\max_{j \in S} |(A^* r)_j| > \max_{l \in \bar{S}} |(A^* r)_l|$ for all nonzero $r \in \{Az,\ \operatorname{supp}(z) \subset S\}$.

18 Greedy Methods: Compressive sampling matching pursuit (CoSaMP)
Input: measurement matrix $A$, measurement vector $y$, sparsity level $s$
Initialization: an $s$-sparse vector $x^0$
Iteration: repeat until $n = \bar{n}$:
$U^{n+1} = \operatorname{supp}(x^n) \cup L_{2s}\big(A^*(y - Ax^n)\big)$
$u^{n+1} = \operatorname{argmin}_{z \in \mathbb{C}^N} \{\|y - Az\|_2,\ \operatorname{supp}(z) \subset U^{n+1}\}$
$x^{n+1} = H_s(u^{n+1})$
(Here $L_k(v)$ denotes an index set of $k$ largest-magnitude entries of $v$, and $H_s$ keeps the $s$ largest-magnitude entries of a vector and sets the rest to zero.)
Output: the $s$-sparse vector $x^\# = x^{\bar{n}}$
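A hedged NumPy sketch of one way to realize this iteration (real case; $L_{2s}$ and $H_s$ are implemented by sorting absolute values, and a fixed iteration count stands in for the stopping rule):

```python
import numpy as np

def hard_threshold(v, s):
    """H_s: keep the s largest-magnitude entries of v, zero out the rest."""
    out = np.zeros_like(v)
    keep = np.argsort(np.abs(v))[-s:]
    out[keep] = v[keep]
    return out

def cosamp(A, y, s, n_iter=20):
    """Compressive sampling matching pursuit (CoSaMP), real-valued sketch."""
    m, N = A.shape
    x = np.zeros(N)                                # s-sparse initialization x^0 = 0
    for _ in range(n_iter):
        corr = A.T @ (y - A @ x)                   # A*(y - A x^n)
        L2s = np.argsort(np.abs(corr))[-2 * s:]    # indices of the 2s largest entries
        U = np.union1d(np.flatnonzero(x), L2s)     # U^{n+1}
        u = np.zeros(N)
        coef, *_ = np.linalg.lstsq(A[:, U], y, rcond=None)
        u[U] = coef                                # least squares on U^{n+1}
        x = hard_threshold(u, s)                   # x^{n+1} = H_s(u^{n+1})
    return x

rng = np.random.default_rng(4)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100); x_true[[1, 30, 77, 98]] = [1.0, -0.5, 2.0, 1.5]
x_hat = cosamp(A, A @ x_true, s=4)
print(np.linalg.norm(x_hat - x_true))
```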

19 Thresholding-based methods

20 Thresholding-Based Methods: Basic thresholding
Input: measurement matrix $A$, measurement vector $y$, sparsity level $s$
Instruction:
$S^\# = L_s(A^* y)$
$x^\# = \operatorname{argmin}_{z \in \mathbb{C}^N} \{\|y - Az\|_2,\ \operatorname{supp}(z) \subset S^\#\}$
Output: the $s$-sparse vector $x^\#$
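A short NumPy sketch of this two-step procedure (real case, synthetic data, helper name mine):

```python
import numpy as np

def basic_thresholding(A, y, s):
    """Select the s indices where |A* y| is largest, then least-squares on them."""
    S = np.argsort(np.abs(A.T @ y))[-s:]     # S# = L_s(A* y)
    x = np.zeros(A.shape[1])
    coef, *_ = np.linalg.lstsq(A[:, S], y, rcond=None)
    x[S] = coef
    return x

rng = np.random.default_rng(5)
A = rng.standard_normal((50, 120))
A /= np.linalg.norm(A, axis=0)               # work with normalized columns
x_true = np.zeros(120); x_true[[10, 55]] = [3.0, -2.5]
x_hat = basic_thresholding(A, A @ x_true, s=2)
print(np.linalg.norm(x_hat - x_true))
```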

21 Thresholding-Based Methods: Basic thresholding
Proposition: A vector $x \in \mathbb{C}^N$ supported on a set $S$ is recovered from $y = Ax$ via basic thresholding if and only if $\min_{j \in S} |(A^* y)_j| > \max_{l \in \bar{S}} |(A^* y)_l|$.
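A small check of this condition for a given support, reusing the real-valued setup above (a sketch, not from the slides):

```python
import numpy as np

def basic_thresholding_succeeds(A, x):
    """Check min_{j in S} |(A* y)_j| > max_{l not in S} |(A* y)_l| for y = A x."""
    y = A @ x
    S = np.flatnonzero(x)
    corr = np.abs(A.T @ y)
    on_support = np.zeros(A.shape[1], dtype=bool)
    on_support[S] = True
    return corr[on_support].min() > corr[~on_support].max()

rng = np.random.default_rng(6)
A = rng.standard_normal((50, 120))
A /= np.linalg.norm(A, axis=0)
x = np.zeros(120); x[[10, 55]] = [3.0, -2.5]
print(basic_thresholding_succeeds(A, x))
```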

22 Thresholding-Based Methods: Iterative hard thresholding
Input: measurement matrix $A$, measurement vector $y$, sparsity level $s$
Initialization: an $s$-sparse vector $x^0$
Iteration: repeat until $n = \bar{n}$:
$x^{n+1} = H_s\big(x^n + A^*(y - Ax^n)\big)$
Output: the $s$-sparse vector $x^\# = x^{\bar{n}}$
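A minimal NumPy sketch (real case; a fixed iteration count stands in for the stopping rule, and the matrix is scaled so the iteration behaves like a contraction on sparse vectors, which plain iterative hard thresholding needs in order to converge):

```python
import numpy as np

def iht(A, y, s, n_iter=200):
    """Iterative hard thresholding: x^{n+1} = H_s(x^n + A*(y - A x^n))."""
    x = np.zeros(A.shape[1])                 # s-sparse initialization x^0 = 0
    for _ in range(n_iter):
        v = x + A.T @ (y - A @ x)            # gradient step on 0.5*||Ax - y||^2
        keep = np.argsort(np.abs(v))[-s:]    # H_s keeps the s largest entries
        x = np.zeros_like(v)
        x[keep] = v[keep]
    return x

rng = np.random.default_rng(7)
A = rng.standard_normal((100, 200)) / np.sqrt(100)   # columns roughly unit norm
x_true = np.zeros(200); x_true[[2, 40, 99]] = [1.0, -2.0, 1.5]
x_hat = iht(A, A @ x_true, s=3)
print(np.linalg.norm(x_hat - x_true))
```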

23 Thresholding-Based Methods: Hard thresholding pursuit
Input: measurement matrix $A$, measurement vector $y$, sparsity level $s$
Initialization: an $s$-sparse vector $x^0$
Iteration: repeat until $n = \bar{n}$:
$S^{n+1} = L_s\big(x^n + A^*(y - Ax^n)\big)$
$x^{n+1} = \operatorname{argmin}_{z \in \mathbb{C}^N} \{\|y - Az\|_2,\ \operatorname{supp}(z) \subset S^{n+1}\}$
Output: the $s$-sparse vector $x^\# = x^{\bar{n}}$
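A matching NumPy sketch of hard thresholding pursuit under the same assumptions (real case, fixed iteration count):

```python
import numpy as np

def htp(A, y, s, n_iter=50):
    """Hard thresholding pursuit: threshold a gradient step, then least squares."""
    N = A.shape[1]
    x = np.zeros(N)                              # s-sparse initialization x^0 = 0
    for _ in range(n_iter):
        v = x + A.T @ (y - A @ x)                # gradient step
        S = np.argsort(np.abs(v))[-s:]           # S^{n+1} = L_s(...)
        x = np.zeros(N)
        coef, *_ = np.linalg.lstsq(A[:, S], y, rcond=None)
        x[S] = coef                              # least squares on S^{n+1}
    return x

rng = np.random.default_rng(8)
A = rng.standard_normal((100, 200)) / np.sqrt(100)
x_true = np.zeros(200); x_true[[4, 50, 123]] = [1.0, -2.0, 1.5]
x_hat = htp(A, A @ x_true, s=3)
print(np.linalg.norm(x_hat - x_true))
```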

24 Any questions?

