JAMSNet: A Remote Pulse Extraction Network Based on Joint Attention and Multi-Scale Fusion

Bibliographic Details
Published in: IEEE Transactions on Circuits and Systems for Video Technology, Vol. 33, No. 6, pp. 2783-2797
Main Authors: Zhao, Changchen; Wang, Hongsheng; Chen, Huiling; Shi, Weiwei; Feng, Yuanjing
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.06.2023

Summary: Remote photoplethysmography (rPPG) has been an active research topic in recent years. While most existing methods focus on eliminating motion artifacts in the raw traces obtained from a single-scale region of interest (ROI), some noise components cannot be effectively separated in single-scale space but can be separated more easily in multi-scale space. In this paper, we analyze the distribution of the pulse signal and motion artifacts across the layers of a Gaussian pyramid. We propose a method that combines multi-scale analysis with a neural network to extract the pulse at different scales, together with a layer-wise attention mechanism that adaptively fuses the features according to signal strength. In addition, we propose a spatial-temporal joint attention module and a channel-temporal joint attention module to learn and emphasize pulse features in their respective joint spaces. The proposed remote pulse extraction network is called the Joint Attention and Multi-Scale fusion Network (JAMSNet). Extensive experiments have been conducted on two publicly available datasets and one self-collected dataset. The results show that JAMSNet outperforms state-of-the-art methods.
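The record contains no implementation details, but two ideas named in the abstract, building a Gaussian pyramid of the input and fusing per-level features with layer-wise attention weights, can be sketched in rough form. The snippet below is a minimal, hypothetical PyTorch sketch, not the authors' JAMSNet code; every name (gaussian_pyramid, LayerwiseFusion) and design choice (5x5 binomial blur kernel, globally pooled per-level features, softmax-normalized layer scores) is an illustrative assumption.

# Minimal sketch (NOT the authors' code) of two ideas from the abstract:
# (1) a Gaussian pyramid of the input frames, and
# (2) layer-wise attention fusion of per-level features.
# All module and variable names are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F

def gaussian_pyramid(x, levels=3):
    """Return `levels` tensors, each blurred and downsampled by 2 from the previous one."""
    # 5x5 binomial kernel as a separable Gaussian approximation (an assumption)
    k = torch.tensor([1., 4., 6., 4., 1.])
    kernel_2d = torch.outer(k, k)
    kernel_2d = kernel_2d / kernel_2d.sum()
    c = x.shape[1]
    weight = kernel_2d.view(1, 1, 5, 5).repeat(c, 1, 1, 1).to(x.device, x.dtype)
    pyramid = [x]
    for _ in range(levels - 1):
        blurred = F.conv2d(pyramid[-1], weight, padding=2, groups=c)  # depthwise blur
        pyramid.append(F.avg_pool2d(blurred, kernel_size=2))          # downsample by 2
    return pyramid

class LayerwiseFusion(nn.Module):
    """Weight per-level feature vectors with softmax-normalized learned scores."""
    def __init__(self, feat_dim, levels=3):
        super().__init__()
        self.score = nn.Linear(feat_dim, 1)
        self.levels = levels

    def forward(self, level_feats):                        # list of (B, feat_dim) tensors
        stacked = torch.stack(level_feats, dim=1)          # (B, levels, feat_dim)
        attn = torch.softmax(self.score(stacked), dim=1)   # (B, levels, 1), weights per level
        return (attn * stacked).sum(dim=1)                 # (B, feat_dim), fused feature

# Usage: pool each pyramid level to a fixed-length vector, then fuse.
if __name__ == "__main__":
    frames = torch.rand(2, 3, 128, 128)                    # batch of RGB frames
    pyr = gaussian_pyramid(frames, levels=3)
    feats = [F.adaptive_avg_pool2d(p, 1).flatten(1) for p in pyr]   # (B, 3) per level
    fused = LayerwiseFusion(feat_dim=3, levels=3)(feats)
    print(fused.shape)                                      # torch.Size([2, 3])

In this sketch the learned scores play the role of the abstract's layer-wise attention, letting levels with a stronger pulse signal dominate the fused representation; the paper's actual feature extractors and joint attention modules are not reproduced here.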
ISSN: 1051-8215, 1558-2205
DOI: 10.1109/TCSVT.2022.3227348