mapSR: A Deep Neural Network for Super-Resolution of Raster Map
Published in: ISPRS International Journal of Geo-Information, Vol. 12, No. 7, p. 258
Main Authors:
Format: Journal Article
Language: English
Published: Basel: MDPI AG, 01.07.2023
Summary: The purpose of multisource map super-resolution is to reconstruct high-resolution maps from low-resolution maps, which is valuable for content-based map tasks such as map recognition and classification. However, there is no super-resolution method specific to maps, and existing image super-resolution methods often miss details when reconstructing them. We propose a map super-resolution (mapSR) model that fuses local and global features for super-resolution reconstruction of low-resolution maps. The proposed model consists of three main modules: a shallow feature extraction module, a deep feature fusion module, and a map reconstruction module. First, the shallow feature extraction module extracts initial image features and embeds the images with appropriate dimensions. The deep feature fusion module then uses a Transformer and a Convolutional Neural Network (CNN) to extract global and local features, respectively, and fuses them by weighted summation. Finally, the map reconstruction module uses upsampling to reconstruct the fused features into the high-resolution map. We constructed a high-resolution map dataset for training and validating the model. Compared with other models, the proposed method achieved the best results in map super-resolution.
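The summary describes a fusion-by-weighted-summation step followed by upsampling. A minimal NumPy sketch of these two operations is below; the function names, the fusion weight `alpha`, and the nearest-neighbour upsampling are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def fuse_features(global_feat: np.ndarray, local_feat: np.ndarray,
                  alpha: float = 0.5) -> np.ndarray:
    """Weighted summation of global (Transformer-style) and local
    (CNN-style) feature maps of identical shape. `alpha` is a
    hypothetical fusion weight."""
    assert global_feat.shape == local_feat.shape
    return alpha * global_feat + (1.0 - alpha) * local_feat

def upsample_nearest(feat: np.ndarray, scale: int = 4) -> np.ndarray:
    """Nearest-neighbour upsampling (H, W, C) -> (H*scale, W*scale, C),
    standing in for the map reconstruction module."""
    return feat.repeat(scale, axis=0).repeat(scale, axis=1)

# Toy example: 16x16 feature maps with 8 channels, 4x super-resolution.
g = np.random.rand(16, 16, 8)   # global features (placeholder)
l = np.random.rand(16, 16, 8)   # local features (placeholder)
fused = fuse_features(g, l, alpha=0.6)
hr = upsample_nearest(fused, scale=4)
print(fused.shape, hr.shape)  # (16, 16, 8) (64, 64, 8)
```

In the actual model the upsampling would be learned (e.g. sub-pixel convolution) rather than nearest-neighbour, but the shape arithmetic is the same.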
ISSN: 2220-9964
DOI: 10.3390/ijgi12070258