Orientation-Aware Pedestrian Attribute Recognition based on Graph Convolution Network

Bibliographic Details
Published in: IEEE Transactions on Multimedia, Vol. 26, pp. 1-13
Main Authors: Lu, Wei-Qing; Hu, Hai-Miao; Yu, Jinzuo; Zhou, Yibo; Wang, Hanzi
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.01.2024

More Information
Summary: Pedestrian attribute recognition (PAR) aims to generate a structured description of pedestrians and plays an important role in surveillance. Current work focusing on 2D images can achieve decent performance when there is no variation in the captured pedestrian orientation. However, the performance of these works cannot be maintained in scenarios where pedestrian orientation varies but is ignored. To mitigate this problem, this paper proposes an orientation-aware pedestrian attribute recognition method based on a graph convolution network (GCN), which is composed of an orientation-aware spatial attention (OSA) module and an orientation-guided attribute-relation learning (OAL) module. Since some attributes can be invisible from certain orientations, the OSA module is proposed for orientation-aware feature extraction to enhance the learned representation of the visual attributes. Moreover, since different orientations result in different relations among attributes, the OAL module is proposed to obtain distinguishable and impactful attribute relations by eliminating the confusion of attribute relations across orientations. Experiments on three challenging datasets (PETA, RAP, and PA100K) demonstrate that the proposed method outperforms state-of-the-art methods by considerable margins.
ISSN: 1520-9210, 1941-0077
DOI: 10.1109/TMM.2023.3259686
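
The summary above describes two components: an orientation-aware spatial attention (OSA) block that re-weights backbone features using the pedestrian's orientation, and an orientation-guided attribute-relation learning (OAL) block that propagates attribute features over a graph whose relations depend on orientation. The following PyTorch sketch only illustrates that general idea under assumed names and shapes (OrientationAwareSpatialAttention, AttributeRelationGCN, a discrete orientation label, a learnable per-orientation adjacency); it is not the authors' implementation.

# Minimal illustrative sketch of the OSA/OAL idea; all names, shapes,
# and hyperparameters are assumptions, not the paper's implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class OrientationAwareSpatialAttention(nn.Module):
    """Hypothetical OSA block: spatial attention conditioned on a
    predicted pedestrian-orientation label (e.g., front/back/side)."""

    def __init__(self, in_channels: int, num_orientations: int = 4):
        super().__init__()
        self.orient_embed = nn.Embedding(num_orientations, in_channels)
        self.attn_conv = nn.Conv2d(in_channels, 1, kernel_size=1)

    def forward(self, feat: torch.Tensor, orientation: torch.Tensor) -> torch.Tensor:
        # feat: (B, C, H, W); orientation: (B,) integer orientation labels.
        o = self.orient_embed(orientation)[:, :, None, None]   # (B, C, 1, 1)
        attn = torch.sigmoid(self.attn_conv(feat * o))         # (B, 1, H, W)
        return feat * attn                                     # orientation-aware re-weighting


class AttributeRelationGCN(nn.Module):
    """Hypothetical OAL block: one graph-convolution layer over attribute
    nodes, with a learnable adjacency selected per orientation so that
    attribute relations differ across orientations."""

    def __init__(self, num_attrs: int, dim: int, num_orientations: int = 4):
        super().__init__()
        self.adj = nn.Parameter(torch.randn(num_orientations, num_attrs, num_attrs))
        self.gc = nn.Linear(dim, dim)

    def forward(self, attr_feats: torch.Tensor, orientation: torch.Tensor) -> torch.Tensor:
        # attr_feats: (B, num_attrs, dim)
        A = torch.softmax(self.adj[orientation], dim=-1)       # (B, A, A), row-normalized
        return F.relu(self.gc(torch.bmm(A, attr_feats)))       # propagate attribute relations


if __name__ == "__main__":
    feat = torch.randn(2, 256, 8, 4)                 # backbone feature map
    orientation = torch.tensor([0, 2])               # e.g., front and left-facing
    osa = OrientationAwareSpatialAttention(256)
    pooled = osa(feat, orientation).mean(dim=(2, 3))             # (B, 256)
    attr_feats = pooled[:, None, :].repeat(1, 10, 1)             # (B, 10 attributes, 256)
    oal = AttributeRelationGCN(num_attrs=10, dim=256)
    print(oal(attr_feats, orientation).shape)                    # torch.Size([2, 10, 256])

The per-orientation adjacency here is simply one plausible way to realize "different relations among attributes for different orientations"; the mechanism used in the paper may differ.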