Semi-Supervised Remote-Sensing Image Scene Classification Using Representation Consistency Siamese Network

Bibliographic Details
Published in: IEEE Transactions on Geoscience and Remote Sensing, Vol. 60, pp. 1-14
Main Authors: Miao, Wang; Geng, Jie; Jiang, Wen
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2022

Summary: Deep learning has achieved excellent performance in remote-sensing image scene classification because large annotated datasets are available for training. In practical applications, however, remote-sensing imagery typically offers only a few annotated samples and a large number of unannotated samples, which leads to overfitting of deep models and degrades scene-classification performance. To address these problems, a semi-supervised representation consistency Siamese network (SS-RCSN) is proposed for remote-sensing image scene classification. First, considering the intraclass diversity and interclass similarity of remote-sensing images, an Involution generative adversarial network (Involution-GAN) is utilized to extract discriminative features from remote-sensing images via unsupervised learning. Then, a Siamese network with a representation consistency loss is proposed for semi-supervised classification, which aims to reduce the differences between labeled and unlabeled data. Experimental results on the UC Merced dataset, the RESISC-45 dataset, the aerial image dataset (AID), and the RS dataset demonstrate that our method yields superior classification performance compared with other semi-supervised learning (SSL) methods.
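
The representation-consistency idea described in the summary can be illustrated with a minimal sketch: a shared-weight (Siamese) encoder maps inputs to embeddings, a consistency term penalizes the distance between embeddings of two views of unlabeled images, and a standard cross-entropy term is applied to the few labeled samples. This is an assumption-laden illustration, not the paper's exact formulation; the encoder, feature dimension, view generation, and the weight lam are hypothetical placeholders, and the Involution-GAN feature extractor is not reproduced here.

    # Minimal sketch (not the authors' exact method) of a Siamese network with a
    # representation-consistency loss for semi-supervised classification.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class SiameseClassifier(nn.Module):
        def __init__(self, encoder: nn.Module, feat_dim: int, num_classes: int):
            super().__init__()
            self.encoder = encoder                 # shared backbone (hypothetical stand-in for the paper's feature extractor)
            self.head = nn.Linear(feat_dim, num_classes)

        def forward(self, x):
            z = self.encoder(x)                    # representation (embedding)
            return z, self.head(z)                 # embedding and class logits

    def consistency_loss(z1, z2):
        # Mean-squared distance between L2-normalized embeddings of two views.
        return F.mse_loss(F.normalize(z1, dim=1), F.normalize(z2, dim=1))

    def semi_supervised_step(model, x_lab, y_lab, x_unl_v1, x_unl_v2, lam=1.0):
        # Supervised cross-entropy on the few labeled samples.
        _, logits = model(x_lab)
        loss_sup = F.cross_entropy(logits, y_lab)
        # Representation-consistency term on two augmented views of unlabeled samples.
        z1, _ = model(x_unl_v1)
        z2, _ = model(x_unl_v2)
        return loss_sup + lam * consistency_loss(z1, z2)

    # Example usage with a toy encoder (illustrative only):
    # enc = nn.Sequential(nn.Flatten(), nn.Linear(3 * 64 * 64, 128))
    # model = SiameseClassifier(enc, feat_dim=128, num_classes=21)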
ISSN: 0196-2892, 1558-0644
DOI: 10.1109/TGRS.2022.3140485