Adaptive Multi-Scale Attentional Network for Semantic Segmentation of Remote Sensing Images
| Published in | International eConference on Computer and Knowledge Engineering (Online), pp. 200-206 |
|---|---|
| Main Authors | , |
| Format | Conference Proceeding |
| Language | English |
| Published | IEEE, 19.11.2024 |
| Subjects | |
| ISSN | 2643-279X |
| DOI | 10.1109/ICCKE65377.2024.10874765 |
Summary: High-resolution remote sensing image segmentation presents challenges due to the complexity and diversity of ground objects, variations in scale, and the presence of small, hard-to-detect targets. Traditional methods often fail to capture multi-scale contextual information and to accurately segment small objects in cluttered backgrounds. In this paper, we propose the Adaptive Multi-Scale Attentional Network (AMSANet), which addresses these challenges by leveraging a Hierarchical Multi-Scale Dilated Convolution Module (HMSDCM) and an Efficient Residual Atrous Spatial Pyramid Pooling (ER-ASPP) module to enhance feature extraction and refinement. The AMSANet architecture integrates these modules to capture and merge features at multiple scales, employing coordinate attention mechanisms for precise final refinement. Validated on the ISPRS Vaihingen and Potsdam datasets, AMSANet achieves a 2.1% improvement in mean intersection over union (mIoU), a 1.44% increase in overall accuracy (OA), and a 1.54% boost in mean F1 score over state-of-the-art methods. These results underscore AMSANet's effectiveness in improving segmentation accuracy and robustness, particularly on high-resolution remote sensing images with complex details and heterogeneous landscapes.
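The abstract describes AMSANet's components only at a high level; the internals of HMSDCM and ER-ASPP are not specified here. As a rough illustration of the general pattern the abstract names (parallel dilated convolutions at several rates to capture multi-scale context, followed by coordinate attention for refinement), the PyTorch sketch below may be useful; the module structure, dilation rates, and channel sizes are assumptions chosen for illustration, not the authors' implementation.

```python
# Illustrative sketch only: HMSDCM and ER-ASPP are not specified in the
# abstract, so these modules, dilation rates, and channel sizes are
# assumptions, not the paper's actual design.
import torch
import torch.nn as nn


class MultiScaleDilatedBlock(nn.Module):
    """Parallel dilated 3x3 convolutions fused by a 1x1 conv (hypothetical
    stand-in for a multi-scale dilated-convolution module like HMSDCM)."""

    def __init__(self, in_ch, out_ch, dilations=(1, 2, 4, 8)):
        super().__init__()
        # One branch per dilation rate; padding = dilation keeps spatial size.
        self.branches = nn.ModuleList(
            nn.Sequential(
                nn.Conv2d(in_ch, out_ch, 3, padding=d, dilation=d, bias=False),
                nn.BatchNorm2d(out_ch),
                nn.ReLU(inplace=True),
            )
            for d in dilations
        )
        self.fuse = nn.Conv2d(out_ch * len(dilations), out_ch, 1)

    def forward(self, x):
        # Concatenate multi-scale responses, then merge channels.
        return self.fuse(torch.cat([b(x) for b in self.branches], dim=1))


class CoordinateAttention(nn.Module):
    """Simplified coordinate attention (after Hou et al., 2021): pool along
    height and width separately so attention stays position-aware."""

    def __init__(self, ch, reduction=16):
        super().__init__()
        mid = max(8, ch // reduction)
        self.conv1 = nn.Sequential(
            nn.Conv2d(ch, mid, 1), nn.BatchNorm2d(mid), nn.ReLU(inplace=True)
        )
        self.conv_h = nn.Conv2d(mid, ch, 1)
        self.conv_w = nn.Conv2d(mid, ch, 1)

    def forward(self, x):
        n, c, h, w = x.shape
        pool_h = x.mean(dim=3, keepdim=True)                   # (N, C, H, 1)
        pool_w = x.mean(dim=2, keepdim=True).transpose(2, 3)   # (N, C, W, 1)
        y = self.conv1(torch.cat([pool_h, pool_w], dim=2))     # (N, mid, H+W, 1)
        y_h, y_w = torch.split(y, [h, w], dim=2)
        a_h = torch.sigmoid(self.conv_h(y_h))                  # (N, C, H, 1)
        a_w = torch.sigmoid(self.conv_w(y_w.transpose(2, 3)))  # (N, C, 1, W)
        return x * a_h * a_w                                   # broadcast gates


if __name__ == "__main__":
    x = torch.randn(2, 64, 128, 128)  # e.g. backbone features of an image tile
    feats = MultiScaleDilatedBlock(64, 64)(x)
    refined = CoordinateAttention(64)(feats)
    print(refined.shape)  # torch.Size([2, 64, 128, 128])
```

Coordinate attention pools along height and width separately, so the resulting gates retain positional information along each axis; this is one reason such mechanisms are commonly used to refine small, spatially localized targets, which the abstract identifies as a key difficulty in remote sensing scenes.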