Unsupervised Recurrent Hyperspectral Imagery Super-Resolution Using Pixel-Aware Refinement

Bibliographic Details
Published in: IEEE Transactions on Geoscience and Remote Sensing, Vol. 60, pp. 1-15
Main Authors: Wei, Wei; Nie, Jiangtao; Zhang, Lei; Zhang, Yanning
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.01.2022

Summary: Unsupervised fusion-based hyperspectral imagery (HSI) super-resolution (SR) is an essential task in HSI processing, which aims to reconstruct a high-resolution (HR) HSI using only an observed low-resolution HSI and a conventional HR image. Although a large number of unsupervised HSI SR methods have been proposed, the heuristic handcrafted image priors adopted by most of them limit their capacity to capture image-specific characteristics of the HSI and their ability to generalize to noisy observations. In this study, we investigate a fusion-based HSI SR framework built on the deep image prior, in which a deep neural network, rather than a heuristic handcrafted prior, is exploited to capture rich image statistics. Within this framework, we further propose an unsupervised recurrent HSI SR method with pixel-aware refinement, which uses intermediate reconstruction results to provide self-supervision for the unsupervised learning process. Because it captures image-specific characteristics, the proposed method achieves better accuracy and greater robustness to noise than existing methods. Extensive experiments on four HSI data sets demonstrate its effectiveness.
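For intuition, below is a minimal sketch of the kind of deep-image-prior fusion loop such a framework builds on, assuming PyTorch. The toy CNN, the band count of 31, the variable names (lr_hsi, hr_rgb, srf, scale), and the simple degradation operators (average-pool downsampling and a linear spectral response) are illustrative assumptions; the paper's recurrent pixel-aware refinement and its self-supervision scheme are not reproduced here.

```python
# Hypothetical deep-image-prior style fusion loop for HSI SR.
# All names and the toy network are illustrative assumptions,
# not the architecture or training procedure from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F

def fusion_losses(hr_hsi, lr_hsi, hr_rgb, srf, scale):
    # Spatial degradation: average-pool the HR estimate down to the LR grid.
    est_lr = F.avg_pool2d(hr_hsi, kernel_size=scale)
    # Spectral degradation: project bands through the spectral response
    # srf (shape [3, bands]) to compare against the HR RGB observation.
    est_rgb = torch.einsum('cb,nbhw->nchw', srf, hr_hsi)
    return F.mse_loss(est_lr, lr_hsi) + F.mse_loss(est_rgb, hr_rgb)

net = nn.Sequential(  # stand-in generator; 31 bands assumed
    nn.Conv2d(31, 64, 3, padding=1), nn.ReLU(),
    nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(),
    nn.Conv2d(64, 31, 3, padding=1),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

def fit(z, lr_hsi, hr_rgb, srf, scale=8, iters=2000):
    # z: fixed random input with the same band count as the target HSI;
    # lr_hsi, hr_rgb: the two observations; srf: known or estimated response.
    for _ in range(iters):
        opt.zero_grad()
        hr_hsi = net(z)  # current HR HSI estimate
        loss = fusion_losses(hr_hsi, lr_hsi, hr_rgb, srf, scale)
        loss.backward()
        opt.step()
    return net(z).detach()
```

The loop repeatedly compares spatially and spectrally degraded versions of the network output against the two observed images, so no external training data are required, which is the sense in which such a fusion framework is unsupervised.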
ISSN: 0196-2892
EISSN: 1558-0644
DOI: 10.1109/TGRS.2020.3039534