ABCNet: Attentive bilateral contextual network for efficient semantic segmentation of fine-resolution remotely sensed imagery

Bibliographic Details
Published in: ISPRS Journal of Photogrammetry and Remote Sensing, Vol. 181, pp. 84-98
Main Authors: Li, Rui; Zheng, Shunyi; Zhang, Ce; Duan, Chenxi; Wang, Libo; Atkinson, Peter M.
Format: Journal Article
Language: English
Published: Elsevier B.V., 01.11.2021

Summary: Semantic segmentation of remotely sensed imagery plays a critical role in many real-world applications, such as environmental change monitoring, precision agriculture, environmental protection, and economic assessment. Following rapid developments in sensor technologies, vast numbers of fine-resolution satellite and airborne remote sensing images are now available, for which semantic segmentation is potentially a valuable method. However, because of the rich complexity and heterogeneity of information provided with an ever-increasing spatial resolution, state-of-the-art deep learning algorithms commonly adopt complex network structures for segmentation, which often result in significant computational demand. Particularly, the frequently-used fully convolutional network (FCN) relies heavily on fine-grained spatial detail (fine spatial resolution) and contextual information (large receptive fields), both imposing high computational costs. This impedes the practical utility of FCN for real-world applications, especially those requiring real-time data processing. In this paper, we propose a novel Attentive Bilateral Contextual Network (ABCNet), a lightweight convolutional neural network (CNN) with a spatial path and a contextual path. Extensive experiments, including a comprehensive ablation study, demonstrate that ABCNet has strong discrimination capability with competitive accuracy compared with state-of-the-art benchmark methods while achieving significantly increased computational efficiency. Specifically, the proposed ABCNet achieves a 91.3% overall accuracy (OA) on the Potsdam test dataset and outperforms all lightweight benchmark methods significantly. The code is freely available at https://github.com/lironui/ABCNet.
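
To make the bilateral, two-path idea described in the summary concrete, the following is a minimal PyTorch sketch of such a network: a shallow spatial path that preserves fine detail, a deeper contextual path whose features are re-weighted by a simple channel attention, and a fusion head. This is an illustration only, not the authors' ABCNet implementation (available at https://github.com/lironui/ABCNet); all module names, channel widths, and the attention and fusion choices below are assumptions.

# Illustrative sketch of a bilateral segmentation network in PyTorch.
# NOT the authors' ABCNet code; layer widths and the attention/fusion
# choices are assumptions made to show the spatial-path / contextual-path idea.
import torch
import torch.nn as nn
import torch.nn.functional as F


def conv_bn_relu(in_ch, out_ch, stride=1):
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, stride=stride, padding=1, bias=False),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
    )


class SpatialPath(nn.Module):
    """Shallow, wide path that keeps fine spatial detail at 1/8 resolution."""
    def __init__(self):
        super().__init__()
        self.block = nn.Sequential(
            conv_bn_relu(3, 64, stride=2),
            conv_bn_relu(64, 64, stride=2),
            conv_bn_relu(64, 128, stride=2),
        )

    def forward(self, x):
        return self.block(x)  # (B, 128, H/8, W/8)


class ContextPath(nn.Module):
    """Deep, narrow path that enlarges the receptive field for context."""
    def __init__(self):
        super().__init__()
        self.block = nn.Sequential(
            conv_bn_relu(3, 32, stride=2),
            conv_bn_relu(32, 64, stride=2),
            conv_bn_relu(64, 128, stride=2),
            conv_bn_relu(128, 128, stride=2),
            conv_bn_relu(128, 128, stride=2),
        )
        # Simple channel attention (squeeze-and-excitation style) used here as a
        # stand-in for the attention mechanism described in the paper.
        self.attn = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(128, 128, 1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        feat = self.block(x)           # (B, 128, H/32, W/32)
        feat = feat * self.attn(feat)  # re-weight channels by global context
        return F.interpolate(feat, scale_factor=4, mode="bilinear", align_corners=False)


class BilateralSegNet(nn.Module):
    """Fuse the two paths and predict a per-pixel class map."""
    def __init__(self, num_classes=6):
        super().__init__()
        self.spatial = SpatialPath()
        self.context = ContextPath()
        self.fuse = conv_bn_relu(256, 128)
        self.head = nn.Conv2d(128, num_classes, 1)

    def forward(self, x):
        s = self.spatial(x)  # fine detail at 1/8 resolution
        c = self.context(x)  # attended context, upsampled to 1/8 resolution
        out = self.head(self.fuse(torch.cat([s, c], dim=1)))
        return F.interpolate(out, size=x.shape[2:], mode="bilinear", align_corners=False)


if __name__ == "__main__":
    model = BilateralSegNet(num_classes=6)       # e.g. the 6 ISPRS Potsdam classes
    logits = model(torch.randn(1, 3, 256, 256))  # -> (1, 6, 256, 256)
    print(logits.shape)

The intent of the two-path design, as stated in the summary, is that neither path alone has to be both deep and high-resolution, which is what keeps the computational cost low compared with a single heavy FCN backbone.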
ISSN: 0924-2716 (print); 1872-8235 (electronic)
DOI: 10.1016/j.isprsjprs.2021.09.005