Attention-based encoder-decoder networks for workflow recognition

Bibliographic Details
Published in: Multimedia Tools and Applications, Vol. 80, No. 28-29, pp. 34973-34995
Main Authors: Zhang, Min; Hu, Haiyang; Li, Zhongjin; Chen, Jie
Format: Journal Article
Language: English
Published: New York: Springer US, 01.11.2021 (Springer Nature B.V.)
Summary: Behavior recognition is a fundamental yet challenging task in intelligent surveillance systems, and it plays an increasingly important role in the move toward “Industry 4.0”. However, monitoring the workflow of both workers and machines during production is difficult in complex industrial environments. In this paper, we propose a novel workflow recognition framework, built on a well-designed encoder-decoder structure, to recognize the behavior of working subjects; we term it the attention-based workflow recognition (AWR) framework. To improve recognition accuracy, a temporal attention cell (AttCell) draws a dynamic attention distribution in the last stage of the framework. In addition, a Rough-to-Refine phase localization model improves localization accuracy by effectively identifying the boundaries of a specific phase instance in long untrimmed videos. Comprehensive experiments show a 1.4% mAP@IoU=0.4 boost on the THUMOS’14 dataset and a 3.4% mAP@IoU=0.4 boost on a hand-crafted workflow detection dataset compared to the advanced GTAN pipeline. More remarkably, the effectiveness of the workflow recognition system is validated in a real-world production scenario.
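
The abstract describes AttCell only at this level of detail, so no concrete formulation is available from this record. The following is a minimal PyTorch sketch assuming a standard additive (Bahdanau-style) temporal attention over encoder features; the class name AttCell comes from the abstract, but every layer, dimension, and signature is an illustrative assumption, not the authors' implementation.

import torch
import torch.nn as nn


class AttCell(nn.Module):
    """Sketch of a temporal attention cell (assumed additive attention):
    scores each encoder time step against the current decoder state and
    returns the attention-weighted context over the encoder features."""

    def __init__(self, enc_dim: int, dec_dim: int, attn_dim: int = 128):
        super().__init__()
        # All dimensions are illustrative assumptions.
        self.enc_proj = nn.Linear(enc_dim, attn_dim)
        self.dec_proj = nn.Linear(dec_dim, attn_dim)
        self.score = nn.Linear(attn_dim, 1)

    def forward(self, enc_feats: torch.Tensor, dec_state: torch.Tensor):
        # enc_feats: (batch, T, enc_dim) frame-level encoder features
        # dec_state: (batch, dec_dim) current decoder hidden state
        energy = torch.tanh(
            self.enc_proj(enc_feats) + self.dec_proj(dec_state).unsqueeze(1)
        )  # (batch, T, attn_dim)
        weights = torch.softmax(self.score(energy).squeeze(-1), dim=1)  # (batch, T)
        context = torch.bmm(weights.unsqueeze(1), enc_feats).squeeze(1)  # (batch, enc_dim)
        return context, weights


# Usage sketch: two clips of 64 frames with 512-d features, 256-d decoder state.
cell = AttCell(enc_dim=512, dec_dim=256)
ctx, w = cell(torch.randn(2, 64, 512), torch.randn(2, 256))

The softmax over the time axis is what would make the attention distribution "dynamic" in the sense of the abstract: the weights over frames are recomputed at each decoding step as dec_state changes.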
ISSN: 1380-7501, 1573-7721
DOI: 10.1007/s11042-021-10633-5