A novel SAR and optical image fusion algorithm based on an improved SPCNN and phase congruency information

Bibliographic Details
Published in: International Journal of Remote Sensing, Vol. 44, No. 4, pp. 1328-1347
Main Authors: Fu, Yukai; Yang, Shuwen; Li, Yikun; Yan, Heng; Zheng, Yao
Format: Journal Article
Language: English
Published: London: Taylor & Francis, 16.02.2023
Summary: Noise and significant spectral differences between SAR and optical images lead to severe spectral and spatial distortions in their fusion results. We propose a fusion method based on phase congruency information and an improved simplified pulse-coupled neural network (PC-SPCNN). PC-SPCNN builds its basic fusion framework on the generalized intensity-hue-saturation (GIHS) transform and the nonsubsampled contourlet transform (NSCT). When fusing the low-frequency coefficients, a rule that couples phase congruency with gain injection is adopted to reduce the spectral distortion caused by nonlinear radiometric differences between the images. Meanwhile, an improved simplified pulse-coupled neural network model is used to fuse the high-frequency coefficients of the SAR and optical images. Three groups of multi-source, multi-scale, and multi-scene remote sensing images are used to verify the feasibility of PC-SPCNN and to compare it with existing fusion algorithms. The results indicate that PC-SPCNN is superior to existing algorithms in both visual effect and objective evaluation and has better fusion performance.
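
To make the framework described in the summary more concrete, the following is a minimal NumPy sketch, not the authors' implementation: it shows a standard GIHS additive injection and a plain simplified-PCNN (SPCNN) firing-count rule for choosing between two high-frequency coefficient maps. The NSCT decomposition, the phase-congruency low-frequency rule, and the paper's improved SPCNN parameterization are omitted, and all names and parameter values here (spcnn_fire_counts, beta, alpha_f, v_e, the 3x3 linking kernel) are illustrative assumptions rather than values taken from the paper.

import numpy as np
from scipy.ndimage import convolve

def gihs_inject(ms, fused_intensity):
    """Generalized IHS (GIHS) additive injection: add the difference between
    a fused intensity and the original intensity back into every band."""
    intensity = ms.mean(axis=2)                      # I = mean of the bands
    return ms + (fused_intensity - intensity)[..., None]

def spcnn_fire_counts(coeff, n_iter=20, beta=0.2, alpha_f=0.1,
                      alpha_e=1.0, v_e=20.0):
    """Accumulated firing map of a standard simplified PCNN driven by the
    absolute coefficient values; parameters are illustrative defaults,
    not the improved SPCNN settings used in the paper."""
    s = np.abs(coeff) / (np.abs(coeff).max() + 1e-12)   # normalized stimulus
    w = np.array([[0.5, 1.0, 0.5],
                  [1.0, 0.0, 1.0],
                  [0.5, 1.0, 0.5]])                      # linking weights
    u = np.zeros_like(s)     # internal activity
    e = np.ones_like(s)      # dynamic threshold
    y = np.zeros_like(s)     # pulse output
    fires = np.zeros_like(s)
    for _ in range(n_iter):
        link = convolve(y, w, mode="constant")
        u = np.exp(-alpha_f) * u + s * (1.0 + beta * link)
        y = (u > e).astype(float)
        e = np.exp(-alpha_e) * e + v_e * y
        fires += y
    return fires

def fuse_high(c_opt, c_sar):
    """Keep, per pixel, the high-frequency coefficient whose SPCNN fired more."""
    return np.where(spcnn_fire_counts(c_opt) >= spcnn_fire_counts(c_sar),
                    c_opt, c_sar)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    optical = rng.random((64, 64, 3))        # stand-in optical image in [0, 1]
    sar = rng.random((64, 64))               # stand-in SAR intensity in [0, 1]
    # Placeholder sub-bands; in the paper these are NSCT high-frequency
    # coefficients of the GIHS intensity component and of the SAR image.
    fused_high = fuse_high(optical.mean(axis=2) - 0.5, sar - 0.5)
    # Placeholder fused intensity standing in for the NSCT reconstruction
    # of the fused low- and high-frequency coefficients.
    fused_intensity = 0.5 * (optical.mean(axis=2) + sar)
    fused_ms = gihs_inject(optical, fused_intensity)
    print(fused_high.shape, fused_ms.shape)  # (64, 64) (64, 64, 3)

In the paper's pipeline, the placeholder sub-bands and fused intensity above would be replaced by NSCT coefficients of the GIHS intensity component and the SAR image and by their reconstruction after the phase-congruency and SPCNN fusion rules.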
ISSN: 0143-1161; 1366-5901
DOI: 10.1080/01431161.2023.2179899