Sparse Regression-based Hyperspectral Unmixing

Sparse Regression-based Hyperspectral Unmixing
Marian-Daniel Iordache 1,2, José M. Bioucas-Dias 2, Antonio Plaza 1
1 Department of Technology of Computers and Communications, University of Extremadura, Caceres, Spain
2 Instituto de Telecomunicações, Instituto Superior Técnico, Technical University of Lisbon, Lisbon
IGARSS 2011

Hyperspectral imaging concept

Outline
- Linear mixing model
- Spectral unmixing
- Sparse regression-based unmixing
- Sparsity-inducing regularizers
- Algorithms
- Results

Linear mixing model (LMM)
- Incident radiation interacts with only one component (checkerboard-type scenes)
- Hyperspectral linear unmixing: estimate the endmember signatures and their fractional abundances
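As a rough illustration (not from the slides), the LMM writes each observed spectrum as y = M a + n, where the columns of M are endmember signatures and the abundances a are nonnegative and sum to one. A minimal NumPy sketch with made-up dimensions:

```python
import numpy as np

rng = np.random.default_rng(0)

L, p = 224, 3            # bands, endmembers (illustrative sizes)
M = rng.random((L, p))   # endmember signatures as columns (assumed data)

a = np.array([0.6, 0.3, 0.1])    # abundances: a >= 0 and sum(a) == 1
noise = 0.01 * rng.standard_normal(L)

y = M @ a + noise        # linear mixing model: y = M a + n
```

Unmixing then runs this model in reverse: given y (and, in the sparse-regression setting, a library of candidate signatures), recover a.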

Algorithms for SLU
Three-step approach:
1. Dimensionality reduction (identify the subspace spanned by the columns of the mixing matrix)
2. Endmember determination (identify the columns of the mixing matrix)
3. Inversion (for each pixel, identify the vector of proportions)
Alternative: sparse regression

Sparse regression-based SLU
- Spectral vectors can be expressed as linear combinations of a few pure spectral signatures obtained from a (potentially very large) spectral library A
- Unmixing: given y and A, find the sparsest solution of y = Ax
- Advantage: sidesteps endmember estimation

Sparse regression-based SLU (library A, underdetermined system)
Problem P0: find the sparsest solution. Very difficult (NP-hard).
Approximations to P0:
- OMP – orthogonal matching pursuit [Pati et al., 1993]
- BP – basis pursuit [Chen et al., 1998]
- BPDN – basis pursuit denoising
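To make the greedy option concrete, here is a bare-bones OMP sketch (not the authors' code): at each step it picks the library column most correlated with the residual, then refits by least squares on the selected support.

```python
import numpy as np

def omp(A, y, k):
    """Greedy OMP sketch: select k columns of A to approximate y."""
    residual = y.copy()
    support = []
    x = np.zeros(A.shape[1])
    for _ in range(k):
        # column most correlated with the current residual
        j = int(np.argmax(np.abs(A.T @ residual)))
        support.append(j)
        # least-squares refit on the current support
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x[support] = coef
    return x
```

With a low-coherence A this recovers very sparse x exactly; the point of the later slides is that hyperspectral libraries are not low-coherence, which is what makes the problem hard.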

Convex approximations to P0
- CBPDN – constrained basis pursuit denoising
- Equivalent (Lagrangian) problem
- Striking result: under conditions related to the coherence among the columns of matrix A, BP(DN) yields the sparsest solution ([Donoho 06], [Candès et al. 06])
- Efficient solvers for CBPDN: SUnSAL, CSUnSAL [Bioucas-Dias, Figueiredo, 2010]
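SUnSAL itself is an ADMM-based solver; as a simpler stand-in (an assumption of this note, not the method on the slide), the Lagrangian BPDN objective min_x 0.5·||Ax − y||² + λ||x||₁ can be minimized with plain ISTA, i.e. gradient steps followed by soft-thresholding:

```python
import numpy as np

def ista_bpdn(A, y, lam, n_iter=500):
    """ISTA sketch for min_x 0.5*||Ax - y||^2 + lam*||x||_1."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L with L the gradient's Lipschitz constant
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)             # gradient of the quadratic term
        z = x - step * grad
        # soft-thresholding = proximal operator of lam*||.||_1
        x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)
    return x
```

ADMM solvers such as SUnSAL converge much faster in practice and can also enforce the abundance nonnegativity and sum-to-one constraints, which this sketch omits.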

Application of CBPDN to SLU
Extensively studied in [Iordache et al., 10, 11]; six libraries (A1, …, A6)
- Simulated data: endmembers randomly selected from the libraries; fractional abundances uniformly distributed over the simplex
- Real data: AVIRIS Cuprite; library: calibrated version of USGS (A1)

Hyperspectral libraries
- Bad news: hyperspectral libraries exhibit high mutual coherence
- Good news: hyperspectral mixtures are sparse (k ≤ 5 very often)
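Mutual coherence is the largest absolute inner product between distinct normalized columns of the library; a quick way to check it on any library matrix (a generic sketch, not tied to the A1–A6 libraries):

```python
import numpy as np

def mutual_coherence(A):
    """Largest |cosine| between distinct columns of A."""
    An = A / np.linalg.norm(A, axis=0, keepdims=True)  # unit-norm columns
    G = np.abs(An.T @ An)                              # Gram matrix of normalized columns
    np.fill_diagonal(G, 0.0)                           # ignore self-correlations
    return float(G.max())
```

Values near 1 (typical for spectral libraries, whose signatures are smooth and similar) are exactly the regime where the classical BP(DN) recovery guarantees become weak, motivating the extra regularizers below.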

Reconstruction errors (SNR = 30 dB); comparison with ISMA [Rogge et al., 2006]

Real data – AVIRIS Cuprite

Real data – AVIRIS Cuprite

Beyond l1 regularization
Rationale: introduce new sparsity-inducing regularizers to counter the sparse-regression limits imposed by the high coherence of hyperspectral libraries.
New regularizers: total variation (TV) and group lasso (GL), applied to the matrix collecting all vectors of fractions, alongside the l1 regularizer.

Total variation and group lasso regularizers
- TV, applied to each image band, promotes similarity between neighboring fractions
- GL, applied to each pixel, promotes groups of atoms of A (group sparsity)
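For intuition, the TV term penalizes differences between spatially neighboring abundances in each band. A minimal anisotropic-TV sketch for one 2-D abundance map (illustrative only; the paper's criterion embeds this inside the optimization rather than evaluating it standalone):

```python
import numpy as np

def tv_aniso(X):
    """Anisotropic TV of a 2-D abundance map: sum of |neighbor differences|."""
    dh = np.abs(np.diff(X, axis=1)).sum()   # horizontal neighbor differences
    dv = np.abs(np.diff(X, axis=0)).sum()   # vertical neighbor differences
    return float(dh + dv)
```

A piecewise-constant abundance map (the expected structure of real mineral maps) has small TV, so penalizing TV pushes solutions toward spatially coherent fractions.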

GLTV_SUnSAL for hyperspectral unmixing
Criterion: least squares with l1, TV, and GL regularizers.
GLTV_SUnSAL algorithm: based on C-SALSA [Afonso et al., 11]; applies the augmented Lagrangian method and alternating optimization to decompose the initial problem into a sequence of simpler optimizations.

GLTV_SUnSAL results: l1 and GL regularizers
Library A2, 2 groups active: GLTV_SUnSAL (l1) SRE = 5.2 dB; GLTV_SUnSAL (l1+GL) SRE = 15.4 dB

k (no. act. groups)   no. endmembers   SRE (l1), dB   SRE (l1+GL), dB
1                     3                9.7            16.3
2                     6                7.8            14.5
3                     9                6.7            14.0
4                     12               4.8            12.3

MC runs = 20, SNR = 1

GLTV_SUnSAL results: l1 and TV regularizers
(Abundance maps for endmember #5: l1 vs. l1+TV, at SNR = 20 dB and SNR = 30 dB)

Real data – AVIRIS Cuprite

Concluding remarks
- Showed that the sparse regression framework has strong potential for linear hyperspectral unmixing
- Tailored new regression criteria to cope with the high coherence of hyperspectral libraries
- Developed optimization algorithms for the above criteria
- To be done: research dictionary learning techniques