Cross-scene wetland mapping on hyperspectral remote sensing images using adversarial domain adaptation network

Bibliographic Details
Published in: ISPRS Journal of Photogrammetry and Remote Sensing, Vol. 203, pp. 37–54
Main Authors: Huang, Yi; Peng, Jiangtao; Chen, Na; Sun, Weiwei; Du, Qian; Ren, Kai; Huang, Ke
Format: Journal Article
Language: English
Published: Elsevier B.V., 01.09.2023
Summary: Wetlands are among the most important ecosystems on Earth, and fine wetland mapping with hyperspectral remote sensing (RS) is important for restoring and protecting the natural resources of coastal wetlands. However, the high cost of collecting labeled samples and inconsistent acquisition conditions across different geographic regions or scenes make wetland mapping and classification difficult. To mitigate these difficulties, a spatial–spectral weighted adversarial domain adaptation (SSWADA) network is proposed for cross-scene wetland mapping using hyperspectral images (HSI). SSWADA employs weighted adversarial discrimination to align the feature distributions of the source and target scenes: a generator (feature extractor) with joint 2D–3D convolution extracts spatial–spectral features from the HSI, a weighted discriminator performs source-instance weighting, and a multi-classifier structure improves classification performance on target samples. Experimental results on four different tasks show that SSWADA outperforms existing domain adaptation methods for cross-scene wetland mapping.
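The summary mentions source-instance weighting inside an adversarial alignment loss, but this record does not give the exact formulation. The following is a minimal NumPy sketch of one common variant of that idea, under the assumption that weights come from the discriminator's own output (source samples the discriminator confidently flags as source-only are down-weighted); the function name and weighting rule are illustrative, not the paper's.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def weighted_adversarial_loss(d_src, d_tgt):
    """Instance-weighted adversarial (domain-discriminator) loss.

    d_src, d_tgt: raw discriminator logits for source / target features.
    A generic sketch: source instances with high P(domain = source) are
    treated as source-specific outliers and contribute less to alignment.
    """
    p_src = sigmoid(d_src)              # P(domain = source) per source sample
    w = 1.0 - p_src                     # down-weight confidently source-only samples
    w = w / w.sum()                     # normalize instance weights
    # Weighted binary cross-entropy: source labeled 1, target labeled 0.
    loss_src = -(w * np.log(p_src + 1e-8)).sum()
    loss_tgt = -np.log(1.0 - sigmoid(d_tgt) + 1e-8).mean()
    return loss_src + loss_tgt
```

In an adversarial setup this loss would be minimized by the discriminator while the feature extractor is trained against it (e.g. via a gradient-reversal layer), so only the weighting scheme is shown here.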
ISSN: 0924-2716, 1872-8235
DOI: 10.1016/j.isprsjprs.2023.07.009