MGFCTFuse: A Novel Fusion Approach for Infrared and Visible Images

Bibliographic Details
Published in Electronics (Basel) Vol. 12; no. 12; p. 2740
Main Authors Hao, Shuai, Li, Jiahao, Ma, Xu, Sun, Siya, Tian, Zhuo, Cao, Le
Format Journal Article
Language English
Published Basel: MDPI AG, 01.06.2023

Summary: Traditional deep-learning-based fusion algorithms usually take the original images as input for feature extraction, which often leaves the fusion results lacking rich detail and background information. To address this issue, we propose a fusion algorithm based on mutually guided image filtering and cross-transmission, termed MGFCTFuse. First, an image decomposition method based on mutually guided image filtering is designed, which decomposes each original image into a base layer and a detail layer. Second, to preserve as much background and detail as possible during feature extraction, each base layer is concatenated with its corresponding original image before deeper features are extracted. Moreover, to enrich the texture details in the fusion results, the visible and infrared detail layers are fused, and an enhancement module is constructed to strengthen the contrast of those textures. Finally, to improve the interaction between different features, a decoding network based on cross-transmission is designed for feature reconstruction, further improving fusion quality. To verify the advantages of the proposed algorithm, experiments are conducted on the TNO, MSRS, and RoadScene image fusion datasets; the results demonstrate that MGFCTFuse outperforms nine comparative algorithms in both subjective and objective evaluations.
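
To make the first step concrete, the sketch below illustrates a base/detail decomposition of the kind the summary describes. It is a minimal sketch, not the authors' implementation: OpenCV's guided filter (cv2.ximgproc.guidedFilter, from the opencv-contrib-python package) stands in for the paper's mutually guided image filtering, and the file names, radius, and eps values are illustrative assumptions.

import numpy as np
import cv2  # requires opencv-contrib-python for cv2.ximgproc

def decompose(image, guide, radius=8, eps=0.01):
    # Stand-in for mutually guided image filtering: smooth one modality
    # under the other's guidance, so the base layer keeps structure shared
    # with the guide, while the residual detail layer keeps the
    # modality-specific texture.
    image = image.astype(np.float32) / 255.0
    guide = guide.astype(np.float32) / 255.0
    base = cv2.ximgproc.guidedFilter(guide, image, radius, eps)
    detail = image - base
    return base, detail

# Hypothetical file names; decompose each modality under the other's guidance.
ir = cv2.imread("ir.png", cv2.IMREAD_GRAYSCALE)
vis = cv2.imread("vis.png", cv2.IMREAD_GRAYSCALE)
ir_base, ir_detail = decompose(ir, vis)
vis_base, vis_detail = decompose(vis, ir)

Per the summary, each base layer would then be concatenated with its original image for deeper feature extraction, while the two detail layers are fused and contrast-enhanced before the cross-transmission decoder reconstructs the fused image.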
ISSN:2079-9292
DOI:10.3390/electronics12122740