Double Nearest Proportion Feature Extraction for Hyperspectral-Image Classification

Bibliographic Details
Published in: IEEE Transactions on Geoscience and Remote Sensing, Vol. 48, No. 11, pp. 4034-4046
Main Authors: Huang, Hsiao-Yun; Kuo, Bor-Chen
Format: Journal Article
Language: English
Published: New York: IEEE, 01.11.2010
Publisher: The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
Summary: For classifying land-cover types in a hyperspectral image, particularly in the small-sample-size situation, feature extraction is a standard approach for reducing dimensionality and increasing classification accuracy. Fisher's linear discriminant analysis (LDA) is one of the most popular feature-extraction methods. However, it cannot be applied directly to hyperspectral-image classification because of several properties of Fisher's criterion. Nonparametric discriminant analysis (NDA) and nonparametric weighted feature extraction are two extensions of LDA built on the idea of emphasizing the boundary structure of the class distributions. However, neither accounts for class overlap, which reduces their robustness. This paper introduces a new feature-extraction method based on a structure named the double nearest proportion. This structure enables the proposed method to reduce the effect of overlap, allows a new regularization technique to be embedded, and includes LDA and NDA as special cases. These properties make the proposed method more robust and, generally, give it better performance.
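The summary positions the proposed method as a generalization of Fisher's LDA. For background, here is a minimal sketch of classical LDA feature extraction (not the paper's double-nearest-proportion method): project the data onto the leading eigenvectors of the within-class scatter inverse times the between-class scatter. The function name and the small ridge term (added to keep the within-class scatter invertible in the small-sample case the summary mentions) are illustrative assumptions, not from the paper:

```python
import numpy as np

def lda_features(X, y, n_components, ridge=1e-6):
    """Classical Fisher LDA feature extraction.

    Projects X (n_samples x d) onto the top eigenvectors of
    inv(Sw) @ Sb, where Sw/Sb are the within/between-class scatter
    matrices. `ridge` regularizes Sw, which is singular when the
    number of samples is small relative to d (the small-sample-size
    situation typical of hyperspectral data).
    """
    classes = np.unique(y)
    d = X.shape[1]
    mean_all = X.mean(axis=0)
    Sw = np.zeros((d, d))
    Sb = np.zeros((d, d))
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)          # within-class scatter
        diff = (mc - mean_all).reshape(-1, 1)
        Sb += len(Xc) * (diff @ diff.T)        # between-class scatter
    # Solve the generalized eigen-problem via inv(Sw + ridge*I) @ Sb.
    M = np.linalg.solve(Sw + ridge * np.eye(d), Sb)
    eigvals, eigvecs = np.linalg.eig(M)
    order = np.argsort(-eigvals.real)          # largest eigenvalues first
    W = eigvecs[:, order[:n_components]].real  # d x n_components projection
    return X @ W
```

Note that with C classes Sb has rank at most C-1, so LDA can extract at most C-1 meaningful features; relaxing that limitation is one motivation for the nonparametric extensions (NDA and the method of this paper).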
ISSN: 0196-2892, 1558-0644
DOI: 10.1109/TGRS.2010.2058580