Factory Extraction from Satellite Images: Benchmark and Baseline


Bibliographic Details
Published in: Remote Sensing (Basel, Switzerland), Vol. 14, No. 22, p. 5657
Main Authors: Deng, Yifei; Li, Chenglong; Lu, Andong; Li, Wenjie; Luo, Bin
Format: Journal Article
Language: English
Published: Basel: MDPI AG, 01.11.2022
Summary: Factory extraction from satellite images is a key step in urban factory planning and plays a crucial role in ecological protection and land-use optimization. However, factory extraction is greatly underexplored in the existing literature due to the lack of large-scale benchmarks. In this paper, we contribute a challenging benchmark dataset named SFE4395, which consists of 4395 satellite images acquired from Google Earth. SFE4395 features rich multiscale factory instances and a wide variety of factory types, with diverse challenges. To provide a strong baseline for this task, we propose a novel bidirectional feature aggregation and compensation network called BACNet. In particular, we design a bidirectional feature aggregation module that integrates multiscale features in a bidirectional manner, improving extraction of targets of different sizes. To recover the detailed information lost through repeated downsampling, we design a feature compensation module that adds the detail of low-level features to high-level features in an attention-guided manner. In addition, a point-rendering module is introduced in BACNet to refine the results. Experiments on SFE4395 and public datasets demonstrate the effectiveness of the proposed BACNet against state-of-the-art methods.
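
The abstract describes BACNet's components only at a high level. The following is a minimal, hypothetical PyTorch sketch of what bidirectional multiscale feature aggregation and attention-guided feature compensation could look like; module names, channel sizes, and the fusion scheme are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch of the two ideas named in the abstract: bidirectional
# aggregation of a feature pyramid and attention-guided compensation of
# low-level detail. All design choices here are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class BidirectionalAggregation(nn.Module):
    """Fuse a 4-level feature pyramid top-down, then bottom-up (FPN/PAN-style)."""

    def __init__(self, channels):
        super().__init__()
        self.smooth = nn.ModuleList(
            nn.Conv2d(channels, channels, 3, padding=1) for _ in range(4)
        )

    def forward(self, feats):  # feats: [C2, C3, C4, C5], high to low resolution
        # Top-down pass: propagate coarse semantics to the finer scales.
        td = [feats[-1]]
        for f in reversed(feats[:-1]):
            up = F.interpolate(td[0], size=f.shape[-2:], mode="bilinear", align_corners=False)
            td.insert(0, f + up)
        # Bottom-up pass: propagate fine spatial detail back to the coarser scales.
        bu = [td[0]]
        for f in td[1:]:
            down = F.interpolate(bu[-1], size=f.shape[-2:], mode="bilinear", align_corners=False)
            bu.append(f + down)
        return [conv(f) for conv, f in zip(self.smooth, bu)]


class FeatureCompensation(nn.Module):
    """Add low-level detail to a high-level map, gated by a spatial attention mask."""

    def __init__(self, low_ch, high_ch):
        super().__init__()
        self.align = nn.Conv2d(low_ch, high_ch, 1)
        self.attn = nn.Sequential(nn.Conv2d(high_ch, 1, 1), nn.Sigmoid())

    def forward(self, low, high):
        low = self.align(low)
        low = F.interpolate(low, size=high.shape[-2:], mode="bilinear", align_corners=False)
        return high + self.attn(high) * low


# Example with dummy backbone features of decreasing resolution.
feats = [torch.randn(1, 64, s, s) for s in (128, 64, 32, 16)]
agg = BidirectionalAggregation(64)(feats)
out = FeatureCompensation(64, 64)(feats[0], agg[-1])  # compensate the coarsest map
```

In this sketch the attention mask is computed from the high-level map and gates how much low-level detail is added back, which is one plausible reading of "attention-guided" compensation; the paper itself should be consulted for the actual design.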
ISSN: 2072-4292
DOI: 10.3390/rs14225657