Sparse Regression-based Hyperspectral Unmixing
Marian-Daniel Iordache (1,2), José M. Bioucas-Dias (2), Antonio Plaza (1)
(1) Department of Technology of Computers and Communications, University of Extremadura, Cáceres, Spain
(2) Instituto de Telecomunicações, Instituto Superior Técnico, Technical University of Lisbon, Lisbon, Portugal
IGARSS 2011
Hyperspectral imaging concept
Outline
- Linear mixing model
- Spectral unmixing
- Sparse regression-based unmixing
- Sparsity-inducing regularizers (l1, TV, group lasso)
- Algorithms
- Results
Linear mixing model (LMM)
Incident radiation interacts with only one component (checkerboard-type scenes):
$y = M\alpha + n$,
where $y$ is the observed spectral vector, $M$ is the mixing matrix (endmember signatures in its columns), $\alpha$ is the vector of fractional abundances, and $n$ is noise.
Hyperspectral linear unmixing: estimate $M$ and $\alpha$.
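A minimal simulation sketch of the LMM above, in Python; the sizes (L bands, p endmembers) and all variable names are illustrative assumptions, not taken from the slides:

```python
# Minimal sketch of the linear mixing model: y = M @ alpha + n.
# Sizes and names (L, p, M, alpha, n, y) are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

L, p = 224, 3                       # number of bands, number of endmembers
M = rng.uniform(0.0, 1.0, (L, p))   # endmember signatures (columns of M)

alpha = rng.dirichlet(np.ones(p))   # fractional abundances: alpha >= 0, sum to 1
n = 0.001 * rng.standard_normal(L)  # additive noise

y = M @ alpha + n                   # observed mixed spectrum for one pixel
```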
Algorithms for SLU
Classical three-step approach:
1. Dimensionality reduction: identify the subspace spanned by the columns of $M$
2. Endmember determination: identify the columns of $M$
3. Inversion: for each pixel, identify the vector of proportions $\alpha$
Alternative: sparse regression, which sidesteps endmember determination (next slides).
Sparse regression-based SLU
Spectral vectors can be expressed as linear combinations of a few pure spectral signatures taken from a (potentially very large) spectral library $A$.
Unmixing: given $y$ and $A$, find the sparsest solution $x$ of $y = Ax + n$.
Advantage: sidesteps endmember estimation.
Sparse regression-based SLU
The library $A$ has many more columns than bands, so the system is underdetermined.
Problem P0: $\min_x \|x\|_0$ subject to $Ax = y$ — very difficult (NP-hard).
Approximations to P0:
- OMP – orthogonal matching pursuit [Pati et al., 1993]
- BP – basis pursuit [Chen et al., 1998]
- BPDN – basis pursuit denoising
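As a concrete illustration of one of these approximations, here is a hedged sketch of OMP in Python; the function and its arguments (A, y, k) are assumptions for illustration, not the implementation used in the cited works:

```python
# Hedged sketch of orthogonal matching pursuit (OMP), a greedy approximation
# to the NP-hard P0 problem. Names (A, y, k) are illustrative.
import numpy as np

def omp(A, y, k):
    """Greedy k-sparse approximation of y ~ A x."""
    residual = y.copy()
    support = []
    x = np.zeros(A.shape[1])
    for _ in range(k):
        # select the column most correlated with the current residual
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        # least-squares fit restricted to the selected columns
        coeffs, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        x[:] = 0.0
        x[support] = coeffs
        residual = y - A @ x
    return x
```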
Convex approximations to P0
CBPDN – constrained basis pursuit denoising:
$\min_x \|x\|_1$ subject to $\|Ax - y\|_2 \le \delta$
Equivalent (unconstrained) problem:
$\min_x \frac{1}{2}\|Ax - y\|_2^2 + \lambda\|x\|_1$
Striking result: under conditions related to the mutual coherence among the columns of the matrix $A$, BP(DN) yields the sparsest solution ([Donoho 06], [Candès et al. 06]).
Efficient solvers for CBPDN: SUnSAL, C-SUnSAL [Bioucas-Dias, Figueiredo, 2010]
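To make the unconstrained criterion concrete, here is a simple proximal-gradient (ISTA-style) sketch with a nonnegativity constraint on the abundances; this is not SUnSAL or C-SUnSAL, and the parameters lam and n_iter are illustrative assumptions:

```python
# NOT SUnSAL / C-SUnSAL: a simpler proximal-gradient (ISTA-style) sketch of
#   min_x 0.5 * ||A x - y||_2^2 + lam * ||x||_1   subject to x >= 0,
# meant only to make the criterion concrete. lam and n_iter are illustrative.
import numpy as np

def nonneg_l1_ista(A, y, lam=1e-3, n_iter=500):
    step = 1.0 / np.linalg.norm(A, 2) ** 2    # 1 / Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)              # gradient of the data-fidelity term
        x = x - step * grad                   # gradient step
        x = np.maximum(x - step * lam, 0.0)   # joint soft-threshold + nonnegativity prox
    return x
```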
Application of CBPDN to SLU
Extensively studied in [Iordache et al., 10, 11]
Six libraries (A1, …, A6)
Simulated data: endmembers randomly selected from the libraries; fractional abundances uniformly distributed over the simplex
Real data: AVIRIS Cuprite; library: calibrated version of USGS (A1)
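A hedged sketch of this simulated-data protocol (random endmembers from a library, abundances uniform over the simplex via a flat Dirichlet); the random "library", k, and n_pixels below are stand-in assumptions:

```python
# Hedged sketch of the simulated-data protocol: endmembers drawn at random
# from a library and abundances uniform over the simplex (flat Dirichlet).
# The random "library", k, and n_pixels are stand-in assumptions.
import numpy as np

rng = np.random.default_rng(1)

library = rng.uniform(0.0, 1.0, (224, 500))            # stand-in for a spectral library A
k = 5                                                  # number of active endmembers
idx = rng.choice(library.shape[1], size=k, replace=False)

n_pixels = 1000
abundances = rng.dirichlet(np.ones(k), size=n_pixels)  # uniform over the k-simplex
Y = abundances @ library[:, idx].T                     # (n_pixels, bands) noiseless mixtures
```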
Hyperspectral libraries
Bad news: hyperspectral libraries exhibit high mutual coherence
Good news: hyperspectral mixtures are sparse (k ≤ 5 very often)
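Mutual coherence is the largest absolute correlation between two distinct, normalized library columns; a short routine to compute it (the function name is illustrative):

```python
# Mutual coherence of a library A: the largest absolute correlation between
# two distinct, unit-normalized columns. The function name is illustrative.
import numpy as np

def mutual_coherence(A):
    An = A / np.linalg.norm(A, axis=0, keepdims=True)  # normalize each column
    G = np.abs(An.T @ An)                              # absolute Gram matrix
    np.fill_diagonal(G, 0.0)                           # drop self-correlations
    return float(G.max())
```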
Reconstruction errors (SNR = 30 dB); comparison includes ISMA [Rogge et al., 2006]
Real data – AVIRIS Cuprite
Beyond l1 regularization
Rationale: introduce new sparsity-inducing regularizers to counter the sparse-regression limits imposed by the high coherence of hyperspectral libraries.
New regularizers: total variation (TV) and group lasso (GL).
Let $X = [x_1, \ldots, x_n]$ be the matrix collecting the vectors of fractions of all pixels. Three regularizers act on $X$:
- l1 regularizer: $\|X\|_{1,1}$ (sparsity of the fractions)
- TV regularizer: $TV(X)$ (spatial regularity of the fractions)
- GL regularizer: a sum of norms over groups of library atoms (group sparsity)
Total variation and group lasso regularizers
TV regularizer: $TV(X) = \sum_{\{i,j\}\in\mathcal{E}} \|x_i - x_j\|_1$, where $x_i$ holds the fractions of the i-th pixel and $\mathcal{E}$ links neighboring pixels; acting on each image band, it promotes similarity between neighboring fractions.
GL regularizer: $\sum_g \|X_g\|_2$, where $X_g$ collects the rows of $X$ indexed by the atoms in group $g$; it promotes groups of atoms of $A$ (group sparsity).
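A hedged numerical sketch of both regularizer values for an abundance cube X of shape (rows, cols, m); the grouping of the m library atoms passed in `groups` is an assumption for illustration:

```python
# Hedged sketch of the two regularizer values for an abundance cube X of
# shape (rows, cols, m), m being the library size. The atom grouping passed
# in `groups` is an assumption for illustration.
import numpy as np

def tv_value(X):
    """Anisotropic TV: sum of l1 differences between neighboring pixels, all bands."""
    dh = np.abs(np.diff(X, axis=1)).sum()  # horizontal neighbors
    dv = np.abs(np.diff(X, axis=0)).sum()  # vertical neighbors
    return dh + dv

def group_lasso_value(X, groups):
    """Sum of l2 norms over groups of library atoms (promotes group sparsity)."""
    Xf = X.reshape(-1, X.shape[-1])        # pixels x atoms
    return sum(np.linalg.norm(Xf[:, g]) for g in groups)
```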
GLTV_SUnSAL for hyperspectral unmixing
Criterion: $\min_{X \ge 0} \frac{1}{2}\|AX - Y\|_F^2 + \lambda_1\|X\|_{1,1} + \lambda_{TV}\,TV(X) + \lambda_G \sum_g \|X_g\|_2$
GLTV_SUnSAL algorithm: based on C-SALSA [Afonso et al., 11]. It applies the augmented Lagrangian method and alternating optimization to decompose the initial problem into a sequence of simpler optimizations.
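The alternating scheme described above boils down to cycling through simple proximal steps; the sketch below shows the standard proximal operators of the three nonsmooth terms (soft-thresholding for l1, block soft-thresholding for the group term, projection for nonnegativity). It illustrates the general recipe, not the authors' exact implementation:

```python
# Standard proximal operators of the three nonsmooth terms in the criterion;
# cycling through such steps is the flavor of the augmented-Lagrangian /
# alternating optimization referred to above (a sketch, not the authors' code).
import numpy as np

def prox_l1(V, tau):
    """Soft-thresholding: prox of tau * ||.||_1, applied entrywise."""
    return np.sign(V) * np.maximum(np.abs(V) - tau, 0.0)

def prox_group(V, tau, groups):
    """Block soft-thresholding: prox of tau * sum_g ||V_g||_2 (atoms grouped by column)."""
    out = V.copy()
    for g in groups:
        norm = np.linalg.norm(V[:, g])
        scale = max(0.0, 1.0 - tau / norm) if norm > 0 else 0.0
        out[:, g] = scale * V[:, g]
    return out

def prox_nonneg(V):
    """Projection enforcing nonnegativity of the abundances."""
    return np.maximum(V, 0.0)
```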
GLTV_SUnSAL results: l1 and GL regularizers
Example (library A2, 2 groups active): GLTV_SUnSAL (l1) SRE = 5.2 dB vs. GLTV_SUnSAL (l1+GL) SRE = 15.4 dB

k (no. active groups)   no. endmembers   SRE (l1) [dB]   SRE (l1+GL) [dB]
1                       3                9.7             16.3
2                       6                7.8             14.5
3                       9                6.7             14.0
4                       12               4.8             12.3

MC runs = 20, SNR = 1
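SRE is the signal-to-reconstruction error, the figure of merit in the table above; a minimal sketch of its computation in dB (x_true and x_hat are illustrative names for the true and estimated abundance vectors):

```python
# Signal-to-reconstruction error (SRE) in dB, the figure of merit reported in
# the table above. x_true and x_hat are illustrative names.
import numpy as np

def sre_db(x_true, x_hat):
    return 10.0 * np.log10(np.sum(x_true ** 2) / np.sum((x_true - x_hat) ** 2))
```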
GLTV_SUnSAL results: l1 and TV regularizers
[Figure: estimated abundance maps for endmember #5, comparing l1 vs. l1+TV at SNR = 20 dB and SNR = 30 dB]
Real data – AVIRIS Cuprite
Concluding remarks
- Showed that the sparse regression framework has strong potential for linear hyperspectral unmixing
- Tailored new regression criteria to cope with the high coherence of hyperspectral libraries
- Developed optimization algorithms for the above criteria
- To be done: research dictionary learning techniques