Multigranularity Decoupling Network With Pseudolabel Selection for Remote Sensing Image Scene Classification


Bibliographic Details
Published in: IEEE Transactions on Geoscience and Remote Sensing, Vol. 61, pp. 1–13
Main Authors: Miao, Wang; Geng, Jie; Jiang, Wen
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2023

More Information
Summary: Existing deep networks have shown excellent performance in remote sensing scene classification (RSSC), but they generally require a large number of class-balanced training samples. With imbalanced training data, deep networks underfit the minority classes because they easily bias toward the majority classes. To address these problems, a multigranularity decoupling network (MGDNet) is proposed for remote sensing image scene classification. First, we design a multigranularity complementary feature representation (MGCFR) method to extract fine-grained features from remote sensing images, using region-level supervision to guide the attention of the decoupling network. Second, a class-imbalanced pseudolabel selection (CIPS) approach is proposed to evaluate the credibility of unlabeled samples. Finally, a diversity component feature (DCF) loss function is developed to force the local features to be more discriminative. Our model performs satisfactorily on three public datasets: UC Merced (UCM), NWPU-RESISC45, and Aerial Image Dataset (AID). Experimental results show that the proposed model yields superior performance compared with other state-of-the-art methods.
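The record describes CIPS only at a high level. As an illustrative sketch (not the paper's actual algorithm), class-aware pseudolabel selection is often approximated by keeping a fixed per-class quota of the most confident predictions, so minority classes are not crowded out by one global confidence threshold. The function name and `per_class_quota` parameter below are assumptions introduced for illustration.

```python
import numpy as np

def select_pseudolabels(probs: np.ndarray, per_class_quota: int) -> list:
    """Sketch of class-balanced pseudolabel selection.

    probs: (n_samples, n_classes) softmax outputs on unlabeled data.
    For each predicted class, keep at most `per_class_quota` samples
    with the highest confidence, so every class contributes pseudolabels.
    Returns sorted indices of the selected unlabeled samples.
    """
    preds = probs.argmax(axis=1)   # predicted class per sample
    conf = probs.max(axis=1)       # confidence of that prediction
    keep = []
    for c in range(probs.shape[1]):
        idx = np.where(preds == c)[0]
        # most confident samples of class c, limited to the quota
        idx = idx[np.argsort(-conf[idx])][:per_class_quota]
        keep.extend(idx.tolist())
    return sorted(keep)
```

A global threshold would instead favor majority classes, whose predictions tend to be more confident; the per-class quota is one simple way to counteract that bias.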
ISSN: 0196-2892, 1558-0644
DOI: 10.1109/TGRS.2023.3244565