Temporal Reasoning Graph for Activity Recognition


Bibliographic Details
Published in: IEEE Transactions on Image Processing, Vol. 29, p. 1
Main Authors: Zhang, Jingran; Shen, Fumin; Xu, Xing; Shen, Heng Tao
Format: Journal Article
Language: English
Published: United States, IEEE, 01.01.2020
Publisher: The Institute of Electrical and Electronics Engineers, Inc. (IEEE)

Summary: Despite the great success achieved in activity analysis, many challenges remain. Most existing work on activity recognition focuses on designing efficient architectures or video sampling strategies. However, because actions are fine-grained and videos have long-term structure, activity recognition is expected to reason about temporal relations between video sequences. In this paper, we propose an efficient temporal reasoning graph (TRG) that simultaneously captures appearance features and temporal relations between video sequences at multiple time scales. Specifically, we construct learnable temporal relation graphs to explore temporal relations over multi-scale ranges. Additionally, to facilitate multi-scale temporal relation extraction, we design a multi-head temporal adjacency matrix to represent multiple kinds of temporal relations. Finally, a multi-head temporal relation aggregator is proposed to extract the semantic meaning of the features convolved through the graphs. Extensive experiments are performed on widely used large-scale datasets, such as Something-Something, Charades and Jester, and the results show that our model achieves state-of-the-art performance. Further analysis shows that temporal relation reasoning with our TRG extracts discriminative features for activity recognition.
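
The abstract does not include implementation details, so the following is only a minimal PyTorch-style sketch of one interpretation of the described components: a set of learnable, per-head temporal adjacency matrices over per-frame features and a multi-head relation aggregator. The class name MultiHeadTemporalRelation, the attention-style construction of the adjacency, the head count, and the residual connection are assumptions for illustration, not the authors' code, and the multi-scale graph construction is omitted.

# Illustrative sketch only (assumed design, not the paper's implementation).
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiHeadTemporalRelation(nn.Module):
    """Builds H learnable temporal adjacency matrices over T frame features
    and aggregates the graph-convolved features across heads."""

    def __init__(self, feat_dim: int, num_heads: int = 4):
        super().__init__()
        self.num_heads = num_heads
        # One query/key projection per head to produce a T x T adjacency.
        self.query = nn.Linear(feat_dim, feat_dim * num_heads)
        self.key = nn.Linear(feat_dim, feat_dim * num_heads)
        # Aggregator fuses the per-head relational features back to feat_dim.
        self.aggregate = nn.Linear(feat_dim * num_heads, feat_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, T, feat_dim) per-frame appearance features.
        b, t, d = x.shape
        q = self.query(x).view(b, t, self.num_heads, d).transpose(1, 2)  # (b, H, T, d)
        k = self.key(x).view(b, t, self.num_heads, d).transpose(1, 2)    # (b, H, T, d)
        # Learnable temporal adjacency per head: (b, H, T, T), rows normalized.
        adj = F.softmax(q @ k.transpose(-1, -2) / d ** 0.5, dim=-1)
        # Graph convolution: propagate frame features along each head's adjacency.
        rel = adj @ x.unsqueeze(1)                                        # (b, H, T, d)
        rel = rel.transpose(1, 2).reshape(b, t, self.num_heads * d)
        # Multi-head relation aggregation plus a residual connection.
        return x + self.aggregate(rel)


# Example: two clips, each with 8 sampled frames of 256-dim features.
frames = torch.randn(2, 8, 256)
out = MultiHeadTemporalRelation(256, num_heads=4)(frames)
print(out.shape)  # torch.Size([2, 8, 256])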
ISSN: 1057-7149, 1941-0042
DOI: 10.1109/TIP.2020.2985219