Object-Level Trajectories Based Fine-Grained Action Recognition in Visual IoT Applications

Bibliographic Details
Published in: IEEE Access, Vol. 7, pp. 103629-103638
Main Authors: Xiong, Jian; Lu, Liguo; Wang, Hengbing; Yang, Jie; Gui, Guan
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2019
Summary: Emerging computer vision and deep learning technologies are being applied to the intelligent analysis of sports training videos. In this paper, a deep learning based fine-grained action recognition (FGAR) method is proposed to analyze soccer training videos. The proposed method was applied to indoor training equipment to evaluate whether a player has successfully stopped a soccer ball. First, the problem of FGAR is modeled as human-object (player-ball) interactions, and object-level trajectories are proposed as a new descriptor for identifying fine-grained sports videos. The proposed descriptor takes advantage of high-level semantics and human-object interaction motions. Second, a cascaded scheme of deep networks based on the object-level trajectories is proposed to realize FGAR. The cascaded network is constructed by concatenating a detector network with a classifier network (a long short-term memory (LSTM)-based network). The cascaded scheme combines the detector's high efficiency in object detection with the LSTM-based network's strong performance in processing time series. The experimental results show that the proposed method achieves an accuracy of 93.24%.
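The summary describes a cascaded pipeline: a detector network produces per-frame player and ball boxes, these are stacked into object-level trajectories, and an LSTM-based classifier labels the clip. The sketch below illustrates only that general idea, not the authors' implementation; it assumes PyTorch, a hypothetical TrajectoryLSTM module, a simple per-frame feature of two (x, y, w, h) boxes, and placeholder hidden size and class labels, none of which are specified in this record.

```python
# Minimal sketch (not the paper's code): an LSTM classifier over
# object-level trajectories built from player and ball detections.
import torch
import torch.nn as nn


class TrajectoryLSTM(nn.Module):
    """Classifies a clip-level trajectory as, e.g., 'ball stopped'
    vs. 'ball not stopped' (hypothetical two-class labels)."""

    def __init__(self, feat_dim=8, hidden_dim=64, num_classes=2):
        super().__init__()
        # feat_dim = 4 box coordinates per object x 2 objects (player, ball)
        self.lstm = nn.LSTM(feat_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, num_classes)

    def forward(self, traj):           # traj: (batch, frames, feat_dim)
        out, _ = self.lstm(traj)       # per-frame hidden states
        return self.head(out[:, -1])   # classify from the last time step


def boxes_to_feature(player_box, ball_box):
    """Concatenate per-frame detections into one feature vector.
    Each box is (x, y, w, h); a missing detection could be zero-filled."""
    return torch.tensor(list(player_box) + list(ball_box), dtype=torch.float32)


# Toy usage: a 30-frame clip with placeholder detector outputs.
frames = [boxes_to_feature((0.40, 0.50, 0.20, 0.60), (0.55, 0.80, 0.05, 0.05))
          for _ in range(30)]
clip = torch.stack(frames).unsqueeze(0)   # shape (1, 30, 8)

model = TrajectoryLSTM()
logits = model(clip)
print(logits.shape)                        # torch.Size([1, 2])
```

In this sketch the detector stage is stood in by hard-coded boxes; in the described method those features would come from the detector network that precedes the LSTM in the cascade.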
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2019.2931471