ECFFNet: Effective and Consistent Feature Fusion Network for RGB-T Salient Object Detection
Published in | IEEE Transactions on Circuits and Systems for Video Technology, Vol. 32, No. 3, pp. 1224-1235 |
Main Authors | |
Format | Journal Article |
Language | English |
Published | New York: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 01.03.2022 |
Subjects | |
Summary: | Under ideal environmental conditions, RGB-based deep convolutional neural networks can achieve high performance in salient object detection (SOD). In scenes with cluttered backgrounds and many objects, depth maps have been combined with RGB images to better distinguish spatial positions and structures during SOD, achieving high accuracy. However, under low-light and uneven lighting conditions, RGB and depth information may be insufficient for detection. Thermal images are insensitive to lighting and weather conditions and can capture important objects even at night. Combining thermal and RGB images, we propose an effective and consistent feature fusion network (ECFFNet) for RGB-T SOD. In ECFFNet, an effective cross-modality fusion module fully fuses features of corresponding sizes from the RGB and thermal modalities. A bilateral reversal fusion module then fuses foreground and background information bilaterally, enabling the full extraction of salient object boundaries. Finally, a multilevel consistent fusion module combines features across levels to obtain complementary information. Comprehensive experiments on three RGB-T SOD datasets show that ECFFNet outperforms 12 state-of-the-art methods across multiple evaluation metrics. |
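The summary outlines a three-stage fusion pipeline: cross-modality fusion of RGB and thermal features, bilateral foreground/background reversal, and multilevel aggregation. The PyTorch sketch below is a rough illustration of that flow only; the class names, layer choices, tensor sizes, and module internals are assumptions made for exposition and are not taken from the paper's actual ECFFNet implementation.

```python
# Illustrative sketch of the fusion pipeline described in the abstract.
# All module internals below are assumptions; the record does not specify
# ECFFNet's real layer structure.
import torch
import torch.nn as nn
import torch.nn.functional as F


class CrossModalityFusion(nn.Module):
    """Stand-in for the effective cross-modality fusion module:
    merges same-resolution RGB and thermal features."""
    def __init__(self, channels: int):
        super().__init__()
        self.conv = nn.Conv2d(2 * channels, channels, kernel_size=3, padding=1)

    def forward(self, rgb, thermal):
        # Concatenate the two modalities and project back to one feature map.
        return F.relu(self.conv(torch.cat([rgb, thermal], dim=1)))


class BilateralReversalFusion(nn.Module):
    """Stand-in for the bilateral reversal fusion module: weights features
    by a foreground map and its reversal (background), one common reading
    of fusing foreground and background information bilaterally."""
    def __init__(self, channels: int):
        super().__init__()
        self.pred = nn.Conv2d(channels, 1, kernel_size=1)
        self.conv = nn.Conv2d(2 * channels, channels, kernel_size=3, padding=1)

    def forward(self, feat):
        fg = torch.sigmoid(self.pred(feat))  # coarse foreground attention
        bg = 1.0 - fg                        # reversed map highlights background
        return F.relu(self.conv(torch.cat([feat * fg, feat * bg], dim=1)))


class MultilevelConsistentFusion(nn.Module):
    """Stand-in for the multilevel consistent fusion module: upsamples a
    deeper (coarser) feature map and combines it with a shallower one."""
    def __init__(self, channels: int):
        super().__init__()
        self.conv = nn.Conv2d(2 * channels, channels, kernel_size=3, padding=1)

    def forward(self, shallow, deep):
        deep = F.interpolate(deep, size=shallow.shape[2:], mode="bilinear",
                             align_corners=False)
        return F.relu(self.conv(torch.cat([shallow, deep], dim=1)))


if __name__ == "__main__":
    # Feature sizes are arbitrary placeholders, not the paper's settings.
    rgb = torch.randn(1, 64, 88, 88)      # RGB backbone feature
    thermal = torch.randn(1, 64, 88, 88)  # thermal backbone feature
    deep = torch.randn(1, 64, 44, 44)     # deeper-level fused feature

    fused = CrossModalityFusion(64)(rgb, thermal)
    refined = BilateralReversalFusion(64)(fused)
    out = MultilevelConsistentFusion(64)(refined, deep)
    print(out.shape)  # torch.Size([1, 64, 88, 88])
```

The reversal step (fg and 1 - fg) is the key design idea suggested by the abstract: attending to the background as well as the foreground keeps evidence from both sides of an object's boundary, which is what the authors credit for sharper boundary extraction.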
ISSN: | 1051-8215 (print), 1558-2205 (electronic) |
DOI: | 10.1109/TCSVT.2021.3077058 |