Super-resolution cropland mapping with Sentinel-2 images based on a self-training learning network
Published in: Remote Sensing Letters, Vol. 15, No. 11, pp. 1143-1152
Format: Journal Article
Language: English
Published: Abingdon: Taylor & Francis, 01.11.2024
Summary: The Sentinel-2 image is widely used for cropland mapping, but the limitation of its 10 m spatial resolution may significantly impact the accuracy of results in areas characterized by severe cropland fragmentation. To address this issue, this letter proposes a super-resolution cropland mapping model for Sentinel-2 images. The proposed model improves the spatial resolution of Sentinel-2 images to 2.5 m through self-training learning with the Swin Transformer for Image Restoration (SwinIR) model, without needing fine-resolution training samples. Random forest classification is then applied to map cropland from the super-resolved 2.5 m resolution image. The proposed model was assessed in the Jianghan Plain, China, and the results show that the super-resolution images produce cropland maps with higher accuracy than the original Sentinel-2 images. Comparing cropland mapping results before and after super-resolution, the overall accuracy improves from 0.82 to 0.89, while the commission error decreases from 0.09 to 0.05 and the omission error from 0.17 to 0.11. This method alleviates the difficulty of collecting fine-resolution training samples and improves the accuracy of cropland mapping, contributing to more reliable agricultural remote sensing applications.
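The two-stage pipeline the summary describes (super-resolve the image, then classify each fine pixel) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the nearest-neighbour `upsample` helper is a stand-in for the self-trained SwinIR 10 m-to-2.5 m step, and the toy array shapes, band count, and random labels are all assumptions for demonstration.

```python
# Hedged sketch of the abstract's pipeline:
# (1) super-resolve a coarse multispectral patch (stand-in upsampler here;
#     the paper uses a self-trained SwinIR model), then
# (2) classify every super-resolved pixel with a random forest.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def upsample(img, factor=4):
    """Nearest-neighbour stand-in for the SwinIR 10 m -> 2.5 m step."""
    return img.repeat(factor, axis=0).repeat(factor, axis=1)

rng = np.random.default_rng(0)
coarse = rng.random((16, 16, 4))      # toy 4-band Sentinel-2-like patch
sr = upsample(coarse)                 # (64, 64, 4): a 4x finer pixel grid

# Toy labels, 0 = non-cropland, 1 = cropland (random here; in the paper
# these would come from reference samples).
labels = rng.integers(0, 2, size=sr.shape[:2])

X = sr.reshape(-1, sr.shape[2])       # per-pixel spectral feature vectors
y = labels.ravel()
rf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
cropland_map = rf.predict(X).reshape(sr.shape[:2])
print(cropland_map.shape)             # (64, 64)
```

The key design point mirrored here is that classification happens on the super-resolved grid, so each 2.5 m pixel gets its own label, which is what lets fragmented fields be delineated more finely than at the native 10 m resolution.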
ISSN: 2150-704X; 2150-7058
DOI: 10.1080/2150704X.2024.2411068