Double Branch Parallel Network for Segmentation of Buildings and Waters in Remote Sensing Images

Bibliographic Details
Published in: Remote Sensing (Basel, Switzerland), Vol. 15, No. 6, p. 1536
Main Authors: Chen, Jing; Xia, Min; Wang, Dehao; Lin, Haifeng
Format: Journal Article
Language: English
Published: Basel: MDPI AG, 01.03.2023
Summary: The segmentation of buildings and waters is extremely important for the efficient planning and utilization of land resources, and the temporal and spatial coverage of remote sensing imagery continues to grow. Because a generic convolutional neural network (CNN) is insensitive to spatial position information in remote sensing images, location and edge details can be lost, leading to low segmentation accuracy. To address these issues, this research proposes a double-branch parallel interactive network that fully exploits the global-information interactivity of a Swin Transformer while integrating a CNN to capture deeper local information. A cross-scale multi-level fusion module then combines the features gathered by the convolutional neural network with those derived from the Swin Transformer, effectively extracting spatial and contextual semantic information. Finally, a multi-scale fusion up-sampling module is proposed: it uses high-level feature information to guide the low-level feature information and recover high-resolution pixel-level features. Experimental results show that the proposed network maximizes the benefits of the two models and increases the accuracy of semantic segmentation of buildings and waters.
ISSN: 2072-4292
DOI: 10.3390/rs15061536
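
The summary describes three components: a parallel CNN/Swin Transformer encoder, a cross-branch feature fusion step, and an up-sampling stage that recovers pixel-level resolution. The PyTorch sketch below is a minimal illustration of that dual-branch pattern only, not the authors' implementation: the plain transformer encoder standing in for the Swin Transformer, the single 1x1 fusion convolution, and all module names and channel sizes are assumptions.

```python
# Minimal PyTorch sketch of the dual-branch pattern described in the
# summary. Module names, channel sizes, and the plain transformer encoder
# standing in for the Swin Transformer are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class LocalBranch(nn.Module):
    """CNN branch: stacked 3x3 convolutions capture local detail
    (edges, small structures), downsampling the input by 4x."""

    def __init__(self, in_ch=3, ch=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, ch, 3, stride=2, padding=1),
            nn.BatchNorm2d(ch), nn.ReLU(inplace=True),
            nn.Conv2d(ch, ch, 3, stride=2, padding=1),
            nn.BatchNorm2d(ch), nn.ReLU(inplace=True),
        )

    def forward(self, x):
        return self.net(x)  # (B, ch, H/4, W/4)


class GlobalBranch(nn.Module):
    """Stand-in for the Swin Transformer branch: 4x4 patch embedding
    followed by a plain transformer encoder modeling global context."""

    def __init__(self, in_ch=3, dim=64, depth=2, heads=4):
        super().__init__()
        self.embed = nn.Conv2d(in_ch, dim, kernel_size=4, stride=4)
        layer = nn.TransformerEncoderLayer(dim, heads, dim * 4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, depth)

    def forward(self, x):
        t = self.embed(x)                               # (B, dim, H/4, W/4)
        b, c, h, w = t.shape
        t = self.encoder(t.flatten(2).transpose(1, 2))  # (B, HW, dim)
        return t.transpose(1, 2).reshape(b, c, h, w)


class DualBranchSegNet(nn.Module):
    """Run both branches in parallel, fuse their features, and recover
    full-resolution pixel-level predictions."""

    def __init__(self, num_classes=3, ch=64):
        super().__init__()
        self.local_branch = LocalBranch(ch=ch)
        self.global_branch = GlobalBranch(dim=ch)
        # a single 1x1 conv stands in for the cross-scale multi-level fusion module
        self.fuse = nn.Conv2d(2 * ch, ch, kernel_size=1)
        self.head = nn.Conv2d(ch, num_classes, kernel_size=1)

    def forward(self, x):
        f = torch.cat([self.local_branch(x), self.global_branch(x)], dim=1)
        logits = self.head(self.fuse(f))
        # bilinear upsampling stands in for the multi-scale fusion up-sampling module
        return F.interpolate(logits, size=x.shape[-2:],
                             mode="bilinear", align_corners=False)


if __name__ == "__main__":
    net = DualBranchSegNet(num_classes=3)  # e.g. background / building / water
    out = net(torch.randn(1, 3, 128, 128))
    print(out.shape)                       # torch.Size([1, 3, 128, 128])
```

In the paper, fusion happens across multiple scales and levels, and the decoder uses high-level features to guide low-level ones; both steps are collapsed into single operations here to keep the two-branch skeleton readable.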