A Pansharpening Based on the Non-Subsampled Contourlet Transform and Convolutional Autoencoder: Application to QuickBird Imagery
| Published in | IEEE Access, Vol. 10, pp. 44778–44788 |
|---|---|
| Main Authors | , , , , |
| Format | Journal Article |
| Language | English |
| Published | Piscataway: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 2022 |
| Summary | This paper presents a pansharpening technique based on the non-subsampled contourlet transform (NSCT) and a convolutional autoencoder (CAE). NSCT is exceptionally proficient at representing orientation information and capturing the internal geometry of objects. First, it is used to decompose the multispectral (MS) and panchromatic (PAN) images into high-frequency and low-frequency components using the same number of decomposition levels. Second, a CAE network is trained to reconstruct original low-frequency PAN images from their spatially degraded versions; the low-resolution multispectral images are then fed into this trained network to estimate high-resolution multispectral images. Third, another CAE network is trained to reconstruct original high-frequency PAN images from their spatially degraded versions, and the output of the low-pass CAE is fed to this trained high-pass CAE to refine the estimated high-resolution multispectral images. The final pan-sharpened image is obtained by injecting the detail map of the spectral bands into the corresponding estimated high-resolution multispectral bands. The proposed method is tested on QuickBird datasets and compared with existing pansharpening techniques; objective and subjective results demonstrate its efficiency. |
|---|---|
| ISSN | 2169-3536 |
|---|---|
| DOI | 10.1109/ACCESS.2022.3169698 |
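The three-stage pipeline in the summary can be sketched in simplified form. This is a minimal illustration, not the paper's method: a box filter stands in for the NSCT low-frequency band, the two trained CAE networks are represented by placeholder identity functions (`cae_low`, `cae_high` are hypothetical names), and the MS band is assumed to be already upsampled to the PAN grid.

```python
import numpy as np

def lowpass(img, k=5):
    # Crude k-by-k box filter as a stand-in for the NSCT
    # low-frequency component (the paper uses a true NSCT).
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img, dtype=float)
    h, w = img.shape
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + h, dx:dx + w]
    return out / (k * k)

def decompose(img):
    # Split an image into low-frequency and high-frequency (detail) parts.
    low = lowpass(img)
    return low, img - low

def cae_low(x):
    # Placeholder for the trained low-frequency CAE (identity here).
    return x

def cae_high(x):
    # Placeholder for the trained high-frequency CAE (identity here).
    return x

def pansharpen_band(ms_band, pan):
    # Stage 1: decompose PAN and the MS band.
    _, pan_high = decompose(pan)
    ms_low, _ = decompose(ms_band)
    # Stages 2-3: pass the MS low-frequency part through both CAEs
    # to estimate a high-resolution band.
    estimated = cae_high(cae_low(ms_low))
    # Final step: inject the PAN detail map into the estimated band.
    return estimated + pan_high

rng = np.random.default_rng(0)
pan = rng.random((64, 64))
ms_band = rng.random((64, 64))  # assumed upsampled to PAN resolution
fused = pansharpen_band(ms_band, pan)
```

In a real implementation, the CAE placeholders would be replaced by trained encoder-decoder networks and the box filter by a multi-level NSCT decomposition with matched levels for the MS and PAN inputs.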