Pansharpening Based on Variational Fractional-Order Geometry Model and Optimized Injection Gains
Published in | IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, Vol. 15, pp. 2128-2141 |
---|---|
Main Authors | , , , , |
Format | Journal Article |
Language | English |
Published | Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2022 |
Subjects | |
Summary: | Pansharpening techniques fuse the complementary information from panchromatic (PAN) and multispectral (MS) images to obtain a high-resolution MS image. However, the majority of existing pansharpening techniques suffer from spectral distortion owing to the low correlation between the MS and PAN images and the difficulty of obtaining appropriate injection gains. To address these issues, this article presents a novel pansharpening method based on the variational fractional-order geometry (VFOG) model and optimized injection gains. Specifically, to improve the correlation between the PAN and MS images, the VFOG model is constructed to generate a refined PAN image whose spatial structure is similar to that of the MS image while preserving the gradient information of the original PAN image. Furthermore, to obtain accurate injection gains, and considering that the gains for vegetated and nonvegetated regions should be dissimilar, an optimized adaptive injection gain based on the normalized difference vegetation index (NDVI) is designed. The final pansharpened image is obtained by an injection model using the refined PAN image and the optimized injection gains. Extensive experiments on various satellite datasets demonstrate that the proposed method offers superior spectral and spatial fidelity compared to existing state-of-the-art algorithms. |
---|---|
ISSN: | 1939-1404; 2151-1535 |
DOI: | 10.1109/JSTARS.2022.3154642 |
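The summary above describes a detail-injection pipeline: a refined PAN image, NDVI-driven adaptive gains, and an injection model that adds spatial detail to the upsampled MS bands. As a rough illustration of that general framework only, the sketch below implements a generic detail-injection fusion step in Python with an NDVI-thresholded gain; the mean-of-bands intensity, band indices, NDVI threshold, and gain values are all illustrative assumptions and do not reproduce the paper's VFOG refinement or its optimized gains.

```python
import numpy as np

def injection_pansharpen(ms_up, pan_refined, nir_idx=3, red_idx=2,
                         g_veg=1.2, g_nonveg=0.8, ndvi_thresh=0.3):
    """Generic detail-injection fusion: fused_b = MS_b + g * (P - I).

    ms_up       -- upsampled MS image, shape (H, W, B), floats in [0, 1]
    pan_refined -- refined PAN image, shape (H, W), floats in [0, 1]

    The intensity approximation, NDVI threshold, and gain values are
    placeholders for illustration, not the paper's method.
    """
    eps = 1e-6
    # Intensity component approximating a low-resolution PAN.
    intensity = ms_up.mean(axis=2)
    # NDVI from the NIR and red bands selects vegetated vs. nonvegetated gains.
    nir, red = ms_up[..., nir_idx], ms_up[..., red_idx]
    ndvi = (nir - red) / (nir + red + eps)
    gain = np.where(ndvi > ndvi_thresh, g_veg, g_nonveg)
    # Inject the spatial detail (refined PAN minus intensity) into every band.
    detail = pan_refined - intensity
    fused = ms_up + gain[..., None] * detail[..., None]
    return np.clip(fused, 0.0, 1.0)
```

A typical call would pass a bicubically upsampled four-band MS image and a histogram-matched PAN image of the same spatial size; the per-pixel gain simply switches between two constants at the NDVI threshold, whereas the paper derives spatially adaptive, optimized gains.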