Domain Adaptation in Remote Sensing Image Classification: A Survey

Bibliographic Details
Published in: IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, Vol. 15, pp. 1-18
Main Authors: Peng, Jiangtao; Huang, Yi; Sun, Weiwei; Chen, Na; Ning, Yujie; Du, Qian
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2022
Summary: Traditional remote sensing (RS) image classification methods rely heavily on labeled samples for model training. When labeled samples are unavailable, or when their distribution differs from that of the samples to be classified, the classification model may fail. Cross-domain (or cross-scene) RS image classification addresses this case, in which an existing labeled image is used for training and an unknown image from a different scene or domain must be classified. The distribution inconsistency may be caused by differences in acquisition environment conditions, acquisition scene, acquisition time, and/or sensors. To cope with the cross-domain RS image classification problem, many domain adaptation (DA) techniques have been developed. This article reviews DA methods in the field of RS, especially hyperspectral image classification, and categorizes them into traditional shallow DA methods (e.g., instance-based, feature-based, and classifier-based adaptations) and recently developed deep DA methods (e.g., discrepancy-based and adversarial-based adaptations).
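The discrepancy-based deep DA methods mentioned in the summary typically minimize a statistical distance between source- and target-domain feature distributions. A widely used choice is the maximum mean discrepancy (MMD). The sketch below is illustrative only and is not code from the surveyed article; the RBF kernel bandwidth `gamma` and the synthetic "domain" data are assumptions for demonstration.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """RBF kernel matrix: k(x, y) = exp(-gamma * ||x - y||^2)."""
    sq_dists = (
        np.sum(X**2, axis=1)[:, None]
        + np.sum(Y**2, axis=1)[None, :]
        - 2.0 * X @ Y.T
    )
    return np.exp(-gamma * sq_dists)

def mmd2(source, target, gamma=1.0):
    """Squared maximum mean discrepancy between two sample sets.

    Small values mean the two feature distributions are close;
    discrepancy-based DA adds this quantity to the training loss.
    """
    k_ss = rbf_kernel(source, source, gamma)
    k_tt = rbf_kernel(target, target, gamma)
    k_st = rbf_kernel(source, target, gamma)
    return k_ss.mean() + k_tt.mean() - 2.0 * k_st.mean()

# Synthetic stand-ins for source/target feature vectors (hypothetical data).
rng = np.random.default_rng(0)
src = rng.normal(0.0, 1.0, size=(200, 10))       # source-domain features
tgt_near = rng.normal(0.0, 1.0, size=(200, 10))  # same distribution
tgt_far = rng.normal(2.0, 1.0, size=(200, 10))   # shifted distribution

print(mmd2(src, tgt_near))  # small: distributions match
print(mmd2(src, tgt_far))   # larger: domain shift detected
```

In a deep DA network, this term would be computed on intermediate feature maps of source and target batches and added to the classification loss, so that the feature extractor learns domain-invariant representations.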
ISSN: 1939-1404, 2151-1535
DOI: 10.1109/JSTARS.2022.3220875