Semantic Segmentation Network of Remote Sensing Images With Dynamic Loss Fusion Strategy

Bibliographic Details
Published in: IEEE Access, Vol. 9, pp. 70406-70418
Main Authors: Liu, Wenjie; Zhang, Yongjun; Yan, Jun; Zou, Yongjie; Cui, Zhongwei
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2021
Summary: Remote sensing (RS) images are widely used across many industries, and semantic segmentation of RS images is a common research direction. At the same time, the complexity of target information and the high similarity of features between classes make this task very challenging. In recent years, a steady stream of semantic segmentation algorithms for RS images has emerged, but most of them focus on the scale features of the target, and their accuracy still leaves considerable room for improvement. Against this background, we propose a semantic segmentation framework for RS images with dynamic perceptual loss. The framework builds on the Inception-v4 network and incorporates contextual semantic fusion and dual-channel atrous spatial pyramid pooling (ASPP); the resulting semantic segmentation network has an encoder-decoder structure. In addition, by observing how the network's losses change during training, we design a dynamic perceptual loss module and a dynamic loss fusion strategy that better refine the classification details. Finally, we conduct experiments on the ISPRS 2D Semantic Labeling Contest Vaihingen dataset and the Massachusetts Buildings dataset. Compared with several segmentation networks, our model shows excellent performance.
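
The abstract names a dual-channel ASPP head but gives no implementation details here. Below is a minimal PyTorch sketch of what such a module could look like; the dilation-rate sets (rates_a, rates_b), the global-context branch, and the 1x1 fusion convolution are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DualChannelASPP(nn.Module):
    """Hypothetical dual-channel ASPP: two parallel groups of atrous
    convolutions with different dilation-rate sets, plus a global-context
    branch, fused by a 1x1 projection. Rates are illustrative guesses."""
    def __init__(self, in_ch, out_ch, rates_a=(1, 6, 12), rates_b=(3, 9, 18)):
        super().__init__()
        def branch(rate):
            # A 1x1 conv for rate 1, otherwise a 3x3 atrous conv.
            k, pad = (1, 0) if rate == 1 else (3, rate)
            return nn.Sequential(
                nn.Conv2d(in_ch, out_ch, k, padding=pad, dilation=rate, bias=False),
                nn.BatchNorm2d(out_ch),
                nn.ReLU(inplace=True),
            )
        self.channel_a = nn.ModuleList([branch(r) for r in rates_a])
        self.channel_b = nn.ModuleList([branch(r) for r in rates_b])
        self.image_pool = nn.Sequential(  # global-context branch
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(in_ch, out_ch, 1, bias=False),
            nn.ReLU(inplace=True),
        )
        n_branches = len(rates_a) + len(rates_b) + 1
        self.project = nn.Conv2d(n_branches * out_ch, out_ch, 1)

    def forward(self, x):
        h, w = x.shape[-2:]
        feats = [b(x) for b in self.channel_a] + [b(x) for b in self.channel_b]
        pooled = F.interpolate(self.image_pool(x), size=(h, w),
                               mode='bilinear', align_corners=False)
        return self.project(torch.cat(feats + [pooled], dim=1))
```

For instance, DualChannelASPP(1536, 256) would fit a 1536-channel encoder output, which is the width of Inception-v4's final feature stage.

The dynamic loss fusion strategy is likewise only described at a high level ("observing the loss changes of the network"). The sketch below shows one plausible reading: track running averages of a pixel-wise cross-entropy term and a perceptual term, and rescale the perceptual term each step so the two stay on comparable magnitudes. The update rule, the momentum value, and the perceptual_fn callable are all assumptions, not the paper's exact strategy.

```python
import torch
import torch.nn as nn

class DynamicLossFusion(nn.Module):
    """Illustrative dynamic fusion of a pixel-wise cross-entropy loss with
    a perceptual (feature-space) loss. Running averages of both terms are
    tracked, and the perceptual term is rescaled each step so that the two
    terms stay on a comparable scale. This is one guess at what 'observing
    the loss changes' could mean, not the paper's published rule."""
    def __init__(self, perceptual_fn, momentum=0.9):
        super().__init__()
        self.ce = nn.CrossEntropyLoss()
        self.perceptual_fn = perceptual_fn  # e.g. L1 between frozen-CNN features
        self.momentum = momentum
        self.register_buffer('ce_avg', torch.tensor(1.0))
        self.register_buffer('pl_avg', torch.tensor(1.0))

    def forward(self, logits, target):
        l_ce = self.ce(logits, target)
        l_pl = self.perceptual_fn(logits, target)
        with torch.no_grad():
            m = self.momentum
            self.ce_avg.mul_(m).add_((1.0 - m) * l_ce.detach())
            self.pl_avg.mul_(m).add_((1.0 - m) * l_pl.detach())
        # Dynamic weight: bring the perceptual term to the CE magnitude.
        w = (self.ce_avg / self.pl_avg.clamp_min(1e-8)).item()
        return l_ce + w * l_pl
```

Using running averages rather than the raw per-step losses keeps the fusion weight from oscillating with noisy mini-batch losses; the momentum of 0.9 is a conventional default, not a value from the paper.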
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2021.3078742