Action Recognition With Motion Diversification and Dynamic Selection

Bibliographic Details
Published in: IEEE Transactions on Image Processing, Vol. 31, pp. 4884–4896
Main Authors: Zhuang, Peiqin; Guo, Yu; Yu, Zhipeng; Zhou, Luping; Bai, Lei; Liang, Ding; Wang, Zhiyong; Wang, Yali; Ouyang, Wanli
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2022

More Information
Summary: Motion modeling is crucial in modern action recognition methods. Because motion dynamics such as movement tempo and action amplitude can vary considerably across video clips, adaptively capturing the proper motion information poses a great challenge. To address this issue, we introduce a Motion Diversification and Selection (MoDS) module to generate diversified spatio-temporal motion features and then dynamically select the suitable motion representation for categorizing the input video. Specifically, we first propose a spatio-temporal motion generation (StMG) module to construct a bank of diversified motion features with varying spatial neighborhoods and time ranges. Then, a dynamic motion selection (DMS) module is leveraged to choose the most discriminative motion feature, both spatially and temporally, from the feature bank. As a result, our proposed method can make full use of the diversified spatio-temporal motion information while maintaining computational efficiency at the inference stage. Extensive experiments on five widely used benchmarks demonstrate the effectiveness of the method, and we achieve state-of-the-art performance on Something-Something V1 & V2, which exhibit large motion variation.
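The core idea of the abstract, generating a bank of motion features at several time ranges and then dynamically selecting the most responsive one per input, can be illustrated with a toy sketch. This is a hypothetical simplification, not the authors' implementation: real StMG/DMS modules operate on learned spatio-temporal feature maps with learned gating, whereas here frames are scalars, "motion" is a frame difference at a chosen temporal stride, and the gate is a simple mean-magnitude score.

```python
# Hypothetical, minimal sketch of the MoDS idea (not the paper's code):
# build a bank of motion features over varying time ranges, then pick one
# dynamically per input via a simple scoring gate.

def motion_feature(frames, time_range):
    """Frame-difference 'motion' computed at a given temporal stride."""
    return [frames[t + time_range] - frames[t]
            for t in range(len(frames) - time_range)]

def select_motion(frames, time_ranges):
    """Toy stand-in for dynamic motion selection (DMS): score each
    candidate by its mean absolute magnitude and keep the most
    responsive one."""
    bank = {r: motion_feature(frames, r) for r in time_ranges}
    scores = {r: sum(abs(v) for v in f) / len(f) for r, f in bank.items()}
    best = max(scores, key=scores.get)
    return best, bank[best]

# A slow, small-amplitude motion: per-frame differences are tiny, but a
# longer time range accumulates a larger, more discriminative signal.
slow = [0.0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7]
best_range, feat = select_motion(slow, time_ranges=(1, 2, 4))
print(best_range)  # the largest stride wins for this slow motion
```

The selection step illustrates why a fixed temporal stride is suboptimal: for slow motion the long-range feature dominates, while a fast action would favor a short stride, which is the variation the MoDS module is designed to handle adaptively.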
Bibliography:ObjectType-Article-1
SourceType-Scholarly Journals-1
ObjectType-Feature-2
ISSN: 1057-7149, 1941-0042
DOI: 10.1109/TIP.2022.3189811