Deep Hyperspectral Image Sharpening
| Published in | IEEE Transactions on Neural Networks and Learning Systems, Vol. 29, no. 11, pp. 5345-5355 |
|---|---|
| Main Authors | , , , |
| Format | Journal Article |
| Language | English |
| Published | United States: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.11.2018 |
| Subjects | |
| Online Access | Get full text |
| Summary | Hyperspectral image (HSI) sharpening, which aims at fusing an observed low-spatial-resolution (LR) HSI (LR-HSI) with a high-spatial-resolution (HR) multispectral image (HR-MSI) of the same scene to obtain an HR-HSI, has recently attracted much attention. Most recent HSI sharpening approaches are based on image prior modeling and are usually sensitive to parameter selection and time-consuming. This paper presents a deep HSI sharpening method (named DHSIS) for the fusion of an LR-HSI with an HR-MSI, which directly learns the image priors via deep convolutional neural network-based residual learning. The DHSIS method incorporates the learned deep priors into the LR-HSI and HR-MSI fusion framework. Specifically, we first initialize the HR-HSI from the fusion framework by solving a Sylvester equation. Then, we map the initialized HR-HSI to the reference HR-HSI via deep residual learning to learn the image priors. Finally, the learned image priors are returned to the fusion framework to reconstruct the final HR-HSI. Experimental results demonstrate the superiority of the DHSIS approach over existing state-of-the-art HSI sharpening approaches in terms of reconstruction accuracy and running time. (A toy sketch of this three-stage pipeline follows the record.) |
| ISSN | 2162-237X, 2162-2388 |
| DOI | 10.1109/TNNLS.2018.2798162 |
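
The summary above describes a three-stage pipeline: initialize the HR-HSI by solving a Sylvester equation, learn the image prior by training a residual CNN that maps the initialization toward the reference, and feed the learned prior back into the fusion framework for the final reconstruction. The Python sketch below only illustrates that flow under heavily simplified, assumed settings: the spatial blur and downsampling are folded into a pre-upsampled LR-HSI surrogate, the fusion objective is a plain ridge-regularized least-squares fit (so its Sylvester equation is nearly trivial), and the `ResidualPrior` network, problem sizes, and weights `mu` and `eta` are illustrative choices rather than the authors' configuration.

```python
import numpy as np
from scipy.linalg import solve_sylvester
import torch
import torch.nn as nn

# Hypothetical problem sizes (the paper uses real HSI datasets; these are toy values).
bands, msi_bands, H, W = 31, 3, 32, 32
N = H * W

rng = np.random.default_rng(0)
R = rng.random((msi_bands, bands)) / bands                    # assumed-known spectral response
X_true = rng.random((bands, N))                               # reference HR-HSI (demo stand-in)
Y_lr_up = X_true + 0.05 * rng.standard_normal((bands, N))     # upsampled LR-HSI surrogate
Y_msi = R @ X_true + 0.01 * rng.standard_normal((msi_bands, N))

# --- Stage 1: initialize the HR-HSI by solving a Sylvester equation -----------
# Minimizing ||Y_lr_up - X||^2 + ||Y_msi - R X||^2 + mu ||X||^2 over X yields
# (I + R^T R) X + mu X = Y_lr_up + R^T Y_msi, i.e. A X + X B = C with B = mu I.
mu = 1e-3
A = np.eye(bands) + R.T @ R
B = mu * np.eye(N)
C = Y_lr_up + R.T @ Y_msi
X_init = solve_sylvester(A, B, C)

# --- Stage 2: learn the image prior via deep residual learning ----------------
class ResidualPrior(nn.Module):
    """Tiny residual CNN mapping the initialized HR-HSI toward the reference."""
    def __init__(self, bands):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(bands, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, bands, 3, padding=1),
        )

    def forward(self, x):
        return x + self.body(x)   # residual learning: the network predicts a correction

net = ResidualPrior(bands)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
x_in = torch.from_numpy(X_init.reshape(1, bands, H, W)).float()
x_ref = torch.from_numpy(X_true.reshape(1, bands, H, W)).float()
for _ in range(50):               # short demo training loop
    opt.zero_grad()
    loss = nn.functional.mse_loss(net(x_in), x_ref)
    loss.backward()
    opt.step()

# --- Stage 3: return the learned prior to the fusion framework ----------------
# Treat the network output as a prior estimate X_prior and add eta ||X - X_prior||^2
# to the data-fit terms; the minimizer is again a Sylvester equation.
with torch.no_grad():
    X_prior = net(x_in).numpy().reshape(bands, N)
eta = 1.0
A2 = np.eye(bands) + R.T @ R
B2 = (mu + eta) * np.eye(N)
C2 = Y_lr_up + R.T @ Y_msi + eta * X_prior
X_final = solve_sylvester(A2, B2, C2)

print("initialization MSE:", np.mean((X_init - X_true) ** 2))
print("final MSE:         ", np.mean((X_final - X_true) ** 2))
```

In the paper's actual framework the Sylvester equation also accounts for the spatial blurring and downsampling that produce the LR-HSI, so the initialization and final solves are more involved than the identity-based stand-ins used in this sketch.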