Pansharpening based on convolutional autoencoder and multi-scale guided filter

Bibliographic Details
Published in: EURASIP Journal on Image and Video Processing, Vol. 2021, No. 1, pp. 1-20
Main Authors: AL Smadi, Ahmad; Yang, Shuyuan; Kai, Zhang; Mehmood, Atif; Wang, Min; Alsanabani, Ala
Format: Journal Article
Language: English
Published: Cham: Springer International Publishing, 19.07.2021
Springer Nature B.V
SpringerOpen

Summary: In this paper, we propose a pansharpening method based on a convolutional autoencoder. The convolutional autoencoder is a type of convolutional neural network (CNN) whose objective is to reduce the input dimensionality and represent image features with high accuracy. First, the autoencoder network is trained to minimize the difference between degraded panchromatic image patches and the reconstructed original panchromatic image patches. The intensity component, derived by the adaptive intensity-hue-saturation (AIHS) method, is then fed into the trained convolutional autoencoder network to generate an enhanced intensity component of the multi-spectral image. Pansharpening is accomplished by refining the panchromatic image against the enhanced intensity component using a multi-scale guided filter; the extracted spatial detail is then injected into the upsampled multi-spectral image. Real and degraded datasets are used in the experiments, which show that the proposed technique preserves high spatial detail and high spectral fidelity simultaneously. Furthermore, the experimental results demonstrate that the proposed method achieves state-of-the-art performance in both subjective and objective assessments on remote sensing data.
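The detail-injection stage described in the abstract can be sketched with a standard guided filter (He et al.) applied at several window radii. This is a minimal illustration, not the authors' implementation: the trained autoencoder is omitted, the AIHS intensity weights are replaced by a simple band mean, and the radii and regularization value `eps` are arbitrary choices.

```python
import numpy as np
from scipy.ndimage import uniform_filter


def guided_filter(guide, src, r, eps):
    """Edge-preserving filter of `src`, guided by `guide` (He et al.)."""
    size = 2 * r + 1
    mean_g = uniform_filter(guide, size=size)
    mean_s = uniform_filter(src, size=size)
    corr_gs = uniform_filter(guide * src, size=size)
    corr_gg = uniform_filter(guide * guide, size=size)
    var_g = corr_gg - mean_g ** 2
    cov_gs = corr_gs - mean_g * mean_s
    a = cov_gs / (var_g + eps)          # local linear coefficients
    b = mean_s - a * mean_g
    mean_a = uniform_filter(a, size=size)
    mean_b = uniform_filter(b, size=size)
    return mean_a * guide + mean_b


def pansharpen(pan, ms_up, radii=(2, 4, 8), eps=1e-3):
    """Multi-scale guided-filter detail injection (illustrative sketch).

    pan   : (H, W) panchromatic image in [0, 1]
    ms_up : (H, W, B) multi-spectral image upsampled to pan resolution
    """
    # Stand-in intensity component: plain band mean instead of AIHS weights.
    intensity = ms_up.mean(axis=2)
    # Accumulate spatial detail of the pan image across several filter scales.
    detail = np.zeros_like(pan)
    base = pan
    for r in radii:
        smooth = guided_filter(intensity, base, r, eps)
        detail += base - smooth
        base = smooth
    # Inject the extracted detail into every multi-spectral band.
    return np.clip(ms_up + detail[..., None], 0.0, 1.0)
```

In the paper the intensity component fed to the guided filter would first pass through the trained autoencoder; here the raw intensity stands in for that enhanced component.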
ISSN: 1687-5281
1687-5176
DOI: 10.1186/s13640-021-00565-3