Extensible Hierarchical Method of Detecting Interactive Actions for Video Understanding

Bibliographic Details
Published in: ETRI Journal, Vol. 39, No. 4, pp. 502-513
Main Authors: Moon, Jinyoung; Jin, Junho; Kwon, Yongjin; Kang, Kyuchang; Park, Jongyoul; Park, Kyoung
Format: Journal Article
Language: English
Published: Electronics and Telecommunications Research Institute (ETRI; 한국전자통신연구원), 01.08.2017

Summary: For video understanding, namely analyzing who did what in a video, actions along with objects are primary elements. Most studies on actions have handled recognition problems for well-trimmed videos and focused on enhancing classification performance. However, action detection, which includes localization as well as recognition, is required because, in general, actions intersect in time and space. In addition, most studies have not considered extensibility to a newly added action that was not part of the original training. Therefore, this paper proposes an extensible hierarchical method for detecting generic actions, which combine object movements and spatial relations between two objects, and inherited actions, which are determined by the related objects through an ontology- and rule-based methodology. The hierarchical design of the method enables it to detect any interactive action based on the spatial relations between two objects. The method, which uses object information, achieves an F-measure of 90.27%. Moreover, this paper describes the extensibility of the method for a new action contained in a video from a domain different from that of the dataset used.
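
The abstract describes the approach only at a high level; as a reading aid, below is a minimal Python sketch of what a hierarchical, rule-based design of this kind might look like. It is not the authors' implementation: the relation names ("overlapping", "apart"), the IoU threshold, the action labels, and the ontology entries (Box, generic_action, ONTOLOGY, inherited_action) are all hypothetical stand-ins, since the paper's actual rules and ontology are not given in this record.

# Illustrative sketch only, not the method from the paper: all thresholds,
# relation names, and action labels below are hypothetical stand-ins.
from dataclasses import dataclass
from typing import List


@dataclass
class Box:
    """Axis-aligned bounding box of a tracked object in one frame."""
    x1: float
    y1: float
    x2: float
    y2: float


def iou(a: Box, b: Box) -> float:
    """Intersection-over-union of two boxes (0 when they are disjoint)."""
    ix1, iy1 = max(a.x1, b.x1), max(a.y1, b.y1)
    ix2, iy2 = min(a.x2, b.x2), min(a.y2, b.y2)
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    union = ((a.x2 - a.x1) * (a.y2 - a.y1)
             + (b.x2 - b.x1) * (b.y2 - b.y1) - inter)
    return inter / union if union > 0 else 0.0


def spatial_relation(a: Box, b: Box) -> str:
    """Coarse per-frame spatial relation between two objects (arbitrary threshold)."""
    return "overlapping" if iou(a, b) > 0.05 else "apart"


def generic_action(track_a: List[Box], track_b: List[Box]) -> str:
    """Combine per-frame spatial relations into a generic interactive action."""
    relations = [spatial_relation(a, b) for a, b in zip(track_a, track_b)]
    if relations and relations[0] == "apart" and relations[-1] == "overlapping":
        return "approach_and_touch"
    if relations and all(r == "overlapping" for r in relations):
        return "hold"
    return "no_interaction"


# Toy "ontology": a generic action plus the class of the second object yields a
# more specific, inherited action label; unknown pairs fall back to the generic one.
ONTOLOGY = {
    ("hold", "cup"): "drink",
    ("hold", "phone"): "make_phone_call",
    ("approach_and_touch", "cup"): "pick_up_cup",
    ("approach_and_touch", "door"): "open_door",
}


def inherited_action(generic: str, object_class: str) -> str:
    return ONTOLOGY.get((generic, object_class), generic)


if __name__ == "__main__":
    person = [Box(0, 0, 10, 30), Box(5, 0, 15, 30), Box(10, 0, 20, 30)]
    cup = [Box(12, 10, 16, 14)] * 3
    g = generic_action(person, cup)
    print(g, "->", inherited_action(g, "cup"))  # approach_and_touch -> pick_up_cup

Under these assumptions, extending the detector to a new interactive action amounts to adding another rule or ontology entry rather than retraining a classifier, which is the kind of extensibility the abstract emphasizes.
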
Bibliography: http://onlinelibrary.wiley.com/doi/10.4218/etrij.17.0116.0054/epdf
ISSN: 1225-6463, 2233-7326
DOI: 10.4218/etrij.17.0116.0054