Correlation-and-Correction Fusion Attention Network for Occluded Pedestrian Detection
Published in: IEEE Sensors Journal, Vol. 23, No. 6, p. 1
Main Authors: , , , ,
Format: Journal Article
Language: English
Published: New York: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 15.03.2023
Summary: As a significant task in computer vision, pedestrian detection has made notable progress with the support of deep learning. However, pedestrian detection in congested scenes still suffers from feature loss and obfuscation. To address this issue, we propose a pedestrian detection network based on a correlation-and-correction fusion attention mechanism. First, a multi-mask correction attention module is proposed that generates visible-part masks of pedestrians, enhancing the features of visible regions and correcting false ones. The module also preserves the features of multiple pedestrian classes by generating multiple masks. Then, we fuse in a correlation channel attention module to strengthen the correlation among the body-part features of different pedestrians. Next, we study three ways of fusing the correlation and correction attention mechanisms and find that the serial connection "correlation first and correction behind" works best. Finally, we extend our method to multi-class pedestrian detection in congested scenes. Experimental results on the CityPersons, Caltech, and CrowdHuman datasets demonstrate the effectiveness of our method. On the CityPersons dataset, where more than 70% of pedestrians are occluded, our method outperforms the baseline by 1.12% on the heavy-occlusion subset and surpasses many state-of-the-art methods.
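The fusion order the summary reports as best ("correlation first and correction behind") can be illustrated with a minimal NumPy sketch: an SE-style channel-attention step followed by a visible-part mask correction step, applied serially. The gating functions, mask handling, and module internals below are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


def correlation_channel_attention(feat):
    """Toy channel attention: global average pooling -> per-channel gate.

    feat: (C, H, W) feature map. The sigmoid gate here is a placeholder
    for the paper's correlation channel attention module.
    """
    pooled = feat.mean(axis=(1, 2))          # (C,) channel descriptors
    weights = sigmoid(pooled)                # hypothetical gating function
    return feat * weights[:, None, None]


def mask_correction_attention(feat, masks):
    """Toy mask correction: boost features inside visible-part masks.

    masks: (K, H, W) visible-part masks in [0, 1], one per pedestrian class
    (the multi-mask idea from the abstract, simplified to a union).
    """
    combined = masks.max(axis=0)             # union of visible regions
    return feat * (1.0 + combined)           # emphasize visible regions


def fused_attention(feat, masks):
    """Serial fusion order from the abstract: correlation first, correction behind."""
    return mask_correction_attention(correlation_channel_attention(feat), masks)
```

In this toy form, swapping the call order (correction first) would let the mask boost be rescaled by the channel gate, which is one plausible reason a particular serial order could matter; the paper evaluates three fusion variants empirically.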
ISSN: 1530-437X, 1558-1748
DOI: 10.1109/JSEN.2023.3242082