Semisupervised Hyperspectral Image Classification Based on Generative Adversarial Networks

Bibliographic Details
Published in: IEEE Geoscience and Remote Sensing Letters, vol. 15, no. 2, pp. 212-216
Main Authors: Zhan, Ying; Hu, Dan; Wang, Yuntao; Yu, Xianchuan
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.02.2018

Summary: Because the collection of ground-truth labels is difficult, expensive, and time-consuming, classifying hyperspectral images (HSIs) with few training samples is a challenging problem. In this letter, we propose a novel semisupervised algorithm for the classification of hyperspectral data by training a generative adversarial network (GAN) customized for hyperspectral data. The GAN sets up an adversarial game between a generator and a discriminator: the generator produces samples that the discriminator cannot distinguish from real data, and the discriminator determines whether a given sample is real or generated. We design a semisupervised framework for HSI data based on a 1-D GAN (HSGAN), which enables the automatic extraction of spectral features for HSI classification. When HSGAN is trained on unlabeled hyperspectral data, the generator learns to produce hyperspectral samples similar to the real data, while the discriminator learns features that can be used to classify hyperspectral data with only a small number of labeled samples. The performance of HSGAN is evaluated on Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) image data, and the results show that the proposed framework achieves very promising results with a small number of labeled samples.
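
The letter itself publishes no code; the following is a minimal PyTorch sketch of the semisupervised-GAN idea the summary describes: a generator that synthesizes 1-D spectra and a discriminator whose (K+1)-way head reserves an extra class for generated samples, so its learned features double as a spectral classifier. Everything here is an illustrative assumption rather than the published HSGAN: the constants NUM_BANDS, NUM_CLASSES, and NOISE_DIM, the layer sizes, the use of fully connected layers (the letter's 1-D GAN presumably uses 1-D convolutions), and the (K+1)-class loss formulation in the style of common semisupervised GANs.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# All sizes below are illustrative assumptions, not values from the letter.
NUM_BANDS = 200    # assumed spectral length (AVIRIS-like)
NUM_CLASSES = 16   # assumed number of land-cover classes
NOISE_DIM = 100    # assumed latent dimension

class Generator(nn.Module):
    """Maps a noise vector to a synthetic 1-D spectrum."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(NOISE_DIM, 256), nn.ReLU(),
            nn.Linear(256, NUM_BANDS), nn.Tanh(),  # spectra rescaled to [-1, 1]
        )
    def forward(self, z):
        return self.net(z)

class Discriminator(nn.Module):
    """K real-class logits plus one extra 'fake' logit (a (K+1)-way head)."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Linear(NUM_BANDS, 256), nn.LeakyReLU(0.2),
            nn.Linear(256, 128), nn.LeakyReLU(0.2),
        )
        self.head = nn.Linear(128, NUM_CLASSES + 1)  # index NUM_CLASSES == "fake"
    def forward(self, x):
        return self.head(self.features(x))

G, D = Generator(), Discriminator()
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)

# Random stand-ins for a real hyperspectral data loader.
labeled_x = torch.randn(8, NUM_BANDS)                # few labeled spectra
labeled_y = torch.randint(0, NUM_CLASSES, (8,))
unlabeled_x = torch.randn(32, NUM_BANDS)             # many unlabeled spectra

# --- Discriminator step ---
fake_x = G(torch.randn(32, NOISE_DIM)).detach()
fake_y = torch.full((32,), NUM_CLASSES, dtype=torch.long)   # target: "fake" class
p_fake_unl = F.softmax(D(unlabeled_x), dim=1)[:, NUM_CLASSES]
loss_d = (F.cross_entropy(D(labeled_x), labeled_y)          # supervised term
          + F.cross_entropy(D(fake_x), fake_y)              # generated -> "fake"
          - torch.log(1 - p_fake_unl + 1e-8).mean())        # unlabeled reals -> "not fake"
opt_d.zero_grad(); loss_d.backward(); opt_d.step()

# --- Generator step: make D assign fakes a low "fake" probability ---
p_fake = F.softmax(D(G(torch.randn(32, NOISE_DIM))), dim=1)[:, NUM_CLASSES]
loss_g = -torch.log(1 - p_fake + 1e-8).mean()
opt_g.zero_grad(); loss_g.backward(); opt_g.step()
```

Note how only the first cross-entropy term consumes labels: the other two terms train on unlabeled and generated spectra, which is what lets the discriminator's feature trunk serve as a classifier in the few-label regime the summary describes. Again, this is a sketch of the general semisupervised-GAN recipe, not a reproduction of HSGAN.
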
ISSN: 1545-598X
EISSN: 1558-0571
DOI: 10.1109/LGRS.2017.2780890