Dual-stream shadow detection network: biologically inspired shadow detection for remote sensing images
Published in | Neural Computing & Applications, Vol. 34, No. 12, pp. 10039–10049
Main Authors |
Format | Journal Article
Language | English
Published | London: Springer London, 01.06.2022 (Springer Nature B.V.)
Subjects |
Summary | Deep learning has achieved state-of-the-art results in various image classification and image segmentation tasks. However, due to the lack of well-labeled datasets, the insufficiency of deep feature extraction, and the complex distribution of shadows in remote sensing images, popular deep neural networks still fall short of satisfactory shadow detection in remote sensing images. Inspired by the brain's mechanism for processing visual signals, this paper proposes a new Dual-stream Shadow Detection Network (DSSDN) specifically designed for detecting shadows in remote sensing images. In DSSDN, the pooling stream extracts high-level features by merging multiple atrous pooling feature maps after the encoder, while the residual stream maintains low-level features and carries out the interaction between the two streams' features. The network also features three new sub-modules. We manually labeled 1724 remote sensing images with shadows to form a new dataset for training and testing DSSDN. In a quantitative comparison on this dataset, DSSDN achieves the lowest Balanced Error Rate (BER), 6.6%, among all compared models and networks. In the qualitative analysis, the shadows detected by DSSDN also show the best contours and details compared with the results of other approaches.
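The record does not spell out how the Balanced Error Rate is computed. In the shadow-detection literature it is commonly defined over shadow (positive) and non-shadow (negative) pixels; assuming the paper follows that convention, it reads:

$$\mathrm{BER} = \left(1 - \frac{1}{2}\left(\frac{TP}{N_p} + \frac{TN}{N_n}\right)\right) \times 100,$$

where $TP$ and $TN$ are the numbers of correctly classified shadow and non-shadow pixels, and $N_p$ and $N_n$ are the total numbers of shadow and non-shadow pixels; lower values are better.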
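The abstract only names the two streams, so the sketch below is a minimal, hypothetical illustration of the general idea it describes (an ASPP-style merge of atrous feature maps on the high-level path, fused with a low-level residual path), not the authors' actual DSSDN. All module names, channel sizes, and dilation rates are assumptions.

```python
# Illustrative dual-stream decoder head (PyTorch); not the published DSSDN architecture.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AtrousPoolingMerge(nn.Module):
    """ASPP-style block: parallel atrous convolutions merged by a 1x1 convolution."""
    def __init__(self, in_ch, out_ch, rates=(1, 6, 12, 18)):
        super().__init__()
        self.branches = nn.ModuleList([
            nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=r, dilation=r)
            for r in rates
        ])
        self.merge = nn.Conv2d(out_ch * len(rates), out_ch, kernel_size=1)

    def forward(self, x):
        feats = [F.relu(branch(x)) for branch in self.branches]
        return F.relu(self.merge(torch.cat(feats, dim=1)))

class DualStreamSketch(nn.Module):
    """Toy dual-stream head producing a 1-channel shadow probability map."""
    def __init__(self, high_ch=256, low_ch=64, out_ch=128):
        super().__init__()
        self.pooling_stream = AtrousPoolingMerge(high_ch, out_ch)        # high-level features
        self.residual_stream = nn.Conv2d(low_ch, out_ch, kernel_size=1)  # low-level features
        self.head = nn.Conv2d(out_ch, 1, kernel_size=1)

    def forward(self, high_feat, low_feat):
        pooled = self.pooling_stream(high_feat)
        # Upsample the pooled features to the low-level resolution before fusing.
        pooled = F.interpolate(pooled, size=low_feat.shape[2:],
                               mode="bilinear", align_corners=False)
        fused = pooled + self.residual_stream(low_feat)  # dual-stream interaction
        return torch.sigmoid(self.head(fused))           # shadow probability map
```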
ISSN | 0941-0643; 1433-3058
DOI | 10.1007/s00521-022-06989-w