A New Partitioned Spatial–Temporal Graph Attention Convolution Network for Human Motion Recognition

Bibliographic Details
Published in: Applied Sciences, Vol. 13, No. 3, p. 1647
Main Authors: Guo, Keyou; Wang, Pengshuo; Shi, Peipeng; He, Chengbo; Wei, Caili
Format: Journal Article
Language: English
Published: Basel: MDPI AG, 01.02.2023
Summary: Human action recognition is now applicable in many domains because skeleton data convey intuitive motion information without being affected by environmental factors. Existing skeleton-based methods, however, tend to focus only on local information. To address this, we introduce a neural network model for human motion recognition named NEW-STGCN-CA. The model is based on the spatial–temporal graph convolution network (ST-GCN) and adds a new partition strategy and a coordinate attention (CA) mechanism. Integrating the CA mechanism enables the network to focus on input-relevant information, ignore unnecessary information, and prevent information loss. In addition, a new partitioning strategy for the sampled regions strengthens the connection between local and global information. The Top-1 accuracy of NEW-STGCN-CA reaches 84.86% on the NTU-RGB+D 60 dataset, 1.7% higher than the original model, and 32.40% on the Kinetics-Skeleton dataset, 3.17% higher than the original model. The experimental results show that NEW-STGCN-CA effectively improves accuracy while maintaining high robustness and performance.
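The coordinate attention reweighting the summary refers to can be sketched in simplified form: the feature map is pooled separately along its two spatial axes, each pooled descriptor is projected and squashed into a gate, and the gates rescale every position by its row and column attention. This is a minimal NumPy illustration, not the paper's implementation; the projection matrices `w_h` and `w_w` are hypothetical stand-ins for the learned 1×1 convolutions used in the actual CA block.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def coordinate_attention(x, w_h, w_w):
    """Simplified coordinate-attention reweighting for one feature map.

    x:   (C, H, W) feature map
    w_h: (C, C) projection for the height branch (stand-in for a learned 1x1 conv)
    w_w: (C, C) projection for the width branch (stand-in for a learned 1x1 conv)
    """
    # Pool along width -> one descriptor per row; along height -> one per column
    pool_h = x.mean(axis=2)          # (C, H)
    pool_w = x.mean(axis=1)          # (C, W)
    # Project each descriptor and squash it into a gate in (0, 1)
    a_h = sigmoid(w_h @ pool_h)      # (C, H) attention along the height axis
    a_w = sigmoid(w_w @ pool_w)      # (C, W) attention along the width axis
    # Each position (c, i, j) is rescaled by its row gate and its column gate
    return x * a_h[:, :, None] * a_w[:, None, :]
```

Because both gates lie in (0, 1), the block attenuates positions whose row or column descriptors are weak while preserving the input's shape, which matches the summary's description of emphasizing input-relevant information and suppressing the rest.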
ISSN:2076-3417
DOI:10.3390/app13031647