1
Orthogonal Subspace Projection - Matched Filter
Reduce dimensionality by projecting pixel vectors onto the subspace orthogonal to the undesired features. After the undesired features are removed, projecting the residuals onto the signature of interest yields a maximized S/N ratio and a single-component image that represents a class map for that signature. The method is extensible to more than one signature.
Harsanyi, J. C., & Chang, C.-I. (1994). Hyperspectral image classification and dimensionality reduction: An orthogonal subspace projection approach. IEEE Transactions on Geoscience and Remote Sensing, Vol. 32, No. 4.
Chapter 19
2
Orthogonal Subspace Projection
Definition of terms (N.B. all terms are a function of pixel location (x, y)):

(1)  $r = M\alpha + n$

where $r$ is an $\ell \times 1$ vector representing the (mixed) pixel in an $\ell$-band image, and $M$ is an $\ell \times p$ matrix whose columns (assumed linearly independent) are the $p$ end member vectors included in the analysis; $p-1$ of these are background and $d$ is the target.
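The linear mixing model of Eq. (1) can be sketched numerically. This is a minimal illustration; the band count, end member values, and fractions are assumptions chosen for the example, not taken from the slides.

```python
import numpy as np

# Hypothetical sizes: an l = 50 band image with p = 3 end members.
rng = np.random.default_rng(0)
l, p = 50, 3

M = rng.random((l, p))              # l x p end member matrix (columns linearly independent)
alpha = np.array([0.2, 0.3, 0.5])   # p x 1 vector of end member fractions
n = 0.01 * rng.standard_normal(l)   # i.i.d. zero-mean Gaussian noise

r = M @ alpha + n                   # Eq. (1): the mixed pixel vector
print(r.shape)                      # (50,)
```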
3
Orthogonal Subspace Projection
$\alpha$ is a $p \times 1$ vector of end member fractions, and $n$ is an $\ell \times 1$ vector representing random noise, which is assumed to be independent, identically distributed (i.i.d.) Gaussian with zero mean and covariance $\sigma^2 I$ (N.B. this may be forced by preprocessing to orthogonalize and whiten the noise).
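The whitening preprocessing mentioned in the parenthetical can be sketched as follows: given an estimated noise covariance $\Sigma$, applying $F = \Sigma^{-1/2}$ to the data makes the transformed noise covariance the identity, matching the i.i.d. assumption. The covariance below is a synthetic assumption for illustration only.

```python
import numpy as np

rng = np.random.default_rng(6)
l = 20

# Hypothetical correlated noise covariance (illustrative assumption):
A = rng.standard_normal((l, l))
Sigma = A @ A.T / l + np.eye(l)

# Whitening transform F = Sigma^{-1/2} via the eigendecomposition:
w, V = np.linalg.eigh(Sigma)
F = V @ np.diag(w ** -0.5) @ V.T

# After whitening, the noise covariance becomes the identity.
Sigma_white = F @ Sigma @ F.T
print(np.allclose(Sigma_white, np.eye(l)))   # True
```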
4
Orthogonal Subspace Projection
We can rewrite this as

(2)  $r = d\alpha_p + U\gamma + n$

where $d$ is the $\ell \times 1$ target end member vector, $\alpha_p$ is the fraction of the target in the pixel, $U$ is the $\ell \times (p-1)$ matrix containing the end members other than $d$, and $\gamma$ is the $(p-1) \times 1$ vector of fractions for the backgrounds.
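The partition in Eq. (2) is just a column split of $M$ and the corresponding split of $\alpha$. A minimal numpy sketch (sizes and fractions are illustrative assumptions); taking the target as the first column is an arbitrary convention for the example:

```python
import numpy as np

rng = np.random.default_rng(1)
l, p = 50, 3
M = rng.random((l, p))

# Partition M into target d and background matrix U; alpha splits likewise.
d = M[:, 0]                    # l x 1 target end member vector
U = M[:, 1:]                   # l x (p-1) background end members
alpha_p = 0.2                  # fraction of the target in the pixel
gamma = np.array([0.3, 0.5])   # (p-1) x 1 background fractions
n = 0.01 * rng.standard_normal(l)

r = d * alpha_p + U @ gamma + n   # Eq. (2), identical to Eq. (1)
```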
5
Orthogonal Subspace Projection
We desire an operator that will minimize the effects of the backgrounds, which are represented by the columns of $U$. This is accomplished by projecting $r$ onto a subspace that is orthogonal to the columns of $U$. The resulting vector should only contain energy (variance) associated with the target signature and random noise.
6
Orthogonal Subspace Projection
From least squares theory, the projection that minimizes energy (variance) from the signatures in the matrix $U$ is achieved with the operator

(3)  $P = I - UU^{\#}$

where $U^{\#} = (U^{T}U)^{-1}U^{T}$ is the pseudo-inverse of $U$. Operating on the image with $P$ yields

(4)  $Pr = Pd\alpha_p + PU\gamma + Pn = Pd\alpha_p + Pn$

reducing the contribution of $U$ to zero in the new projection space (i.e., we've minimized the interference).
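The annihilation property $PU = 0$ in Eq. (4) is easy to verify numerically. A minimal sketch, with an arbitrary random background matrix standing in for real end members:

```python
import numpy as np

rng = np.random.default_rng(2)
l = 50
U = rng.random((l, 2))                     # background end members (illustrative)

# Eq. (3): P = I - U U#, with U# = (U^T U)^{-1} U^T the pseudo-inverse of U.
U_pinv = np.linalg.inv(U.T @ U) @ U.T
P = np.eye(l) - U @ U_pinv

# P annihilates anything in the column space of U, so P U = 0:
print(np.max(np.abs(P @ U)))               # ~0 up to floating point
```

$P$ is also symmetric and idempotent ($P^2 = P$), as expected of an orthogonal projector.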
7
Orthogonal Subspace Projection
In addition, we seek to maximize the signal, or more precisely, the S/N energy in the scene. We seek a $1 \times \ell$ operator $x^{T}$ which, when applied to our background-suppressed vectors,

(5)  $x^{T}Pr$

will maximize the S/N ratio energy $\lambda$, expressed as

(6)  $\lambda = \dfrac{\alpha_p^{2}\, x^{T}Pd\, d^{T}P^{T}x}{x^{T}P\,E\{nn^{T}\}\,P^{T}x}$
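That $x = kd$ maximizes Eq. (6) can be checked numerically. With i.i.d. noise ($E\{nn^{T}\} = \sigma^2 I$) and $P$ symmetric idempotent ($P^{T}P = P$), the ratio reduces to $(x^{T}Pd)^2 / (x^{T}Px)$ up to the constant $\alpha_p^2/\sigma^2$; the sketch below compares $x = d$ against many random candidates (all values are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(5)
l = 30
d = rng.random(l)
U = rng.random((l, 2))
P = np.eye(l) - U @ np.linalg.inv(U.T @ U) @ U.T

def snr(x):
    # Eq. (6) up to the constant alpha_p^2 / sigma^2, using P^T P = P.
    return (x @ P @ d) ** 2 / (x @ P @ x)

best_random = max(snr(rng.standard_normal(l)) for _ in range(1000))
print(snr(d) >= best_random)   # True: x = k d attains the maximum
```

By the Cauchy–Schwarz inequality in the inner product $\langle a,b\rangle = a^{T}Pb$, the ratio is bounded by $d^{T}Pd$, which $x = d$ attains exactly.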
8
Orthogonal Subspace Projection
where $E$ is the expected value and $\lambda$ is a scalar. Maximization of $\lambda$ with an operator $x^{T}$ is a classic eigenvector problem, which in this case yields the convenient result

(7)  $x = k\,d$

where $k$ is an arbitrary scalar. Thus, the overall operator is the $1 \times \ell$ vector having the form

(8)  $q^{T} = d^{T}P$

i.e., we first null the background with $P$ and then apply a matched filter $d^{T}$ to maximize the SNR.
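Putting Eq. (3) and Eq. (8) together gives the full OSP detector. A minimal sketch with synthetic signatures (all signatures and fractions are illustrative assumptions): the operator scores a pure-background pixel near zero and a target-bearing pixel in proportion to its target abundance.

```python
import numpy as np

rng = np.random.default_rng(3)
l = 50
d = rng.random(l)                  # target signature (illustrative)
U = rng.random((l, 2))             # background signatures (illustrative)

P = np.eye(l) - U @ np.linalg.inv(U.T @ U) @ U.T
q = d @ P                          # Eq. (8): the 1 x l OSP operator q^T = d^T P

# Apply q^T to a pure background pixel and to a pixel containing the target:
r_bg = U @ np.array([0.4, 0.6])              # no target present
r_tgt = 0.5 * d + U @ np.array([0.2, 0.3])   # 50% target abundance
print(q @ r_bg)    # ~0: background fully suppressed
print(q @ r_tgt)   # positive, proportional to target abundance
```

Applied pixel-by-pixel across the image, $q^{T}r$ yields the single-component class map described on the first slide.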
9
Orthogonal Subspace Projection
This problem can be extended to multiple target vectors by generating the $k \times \ell$ matrix operator

(9)  $Q = \begin{bmatrix} q_1^{T} \\ q_2^{T} \\ \vdots \\ q_k^{T} \end{bmatrix}$

where each vector $q_i^{T} = d_i^{T}P_i$ is formed from the desired and undesired signature vectors.
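The multi-target extension of Eq. (9) can be sketched by treating each end member in turn as the target and the rest as undesired. Sizes and signatures below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(4)
l, k = 50, 3
E = rng.random((l, k))             # k end members, each treated in turn as the target

rows = []
for i in range(k):
    d = E[:, i]
    U = np.delete(E, i, axis=1)    # the other end members are "undesired"
    P = np.eye(l) - U @ np.linalg.inv(U.T @ U) @ U.T
    rows.append(d @ P)             # q_i^T = d_i^T P_i

Q = np.stack(rows)                 # Eq. (9): the k x l matrix operator
print(Q.shape)                     # (3, 50)
```

Each row of $Q$ suppresses the other end members, so $Q r$ gives one abundance-like score per target.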
10
Orthogonal Subspace Projection
Note some limitations of this approach are pointed out in Farrand and Harsanyi (1995), Journal of Geophysical Research, Vol. 100, No. E1:
1. You need to know the end member vectors that make up the $U$ matrix.
2. If your target looks like an end member, you may have suppressed it in the suppression step.
11
Orthogonal Subspace Projection
3. Since the noise is assumed Gaussian i.i.d., you have to work in a space where this is approximately true, e.g., image DC or radiance space. They suggest that transforming to reflectance space may distort the noise and invalidate this approach. (Is this a valid concern? Will gains and biases remove the identically distributed assumption?) Is data more likely to be Gaussian i.i.d. in reflectance, radiance, or DC space?
Comments: The end members don't need to be identified; they can be image derived (e.g., using the PPI algorithm). This still leaves us with problem #2 if our target represents a large portion of the image.
12
Orthogonal Subspace Projection
Notes on Harsanyi: $r$ could be in DN, radiance, or reflectance, depending on how the end members and noise are defined; $f_i$ denotes the fraction for the $i$th end member.