The integration of contactless static pose recognition and dynamic hand motion tracking control system for industrial human and robot collaboration

Bibliographic Details
Published in: Industrial Robot, Vol. 42, No. 5, pp. 416-428
Main Authors: Tang, Gilbert; Asif, Seemal; Webb, Phil
Format: Journal Article
Language: English
Published: Bedford: Emerald Group Publishing Limited, 17.08.2015
Summary:
Purpose – The purpose of this paper is to describe the integration of a gesture control system for an industrial collaborative robot. Human–robot collaborative systems can be a viable manufacturing solution, but efficient control and communication are required for operations to be carried out effectively and safely.
Design/methodology/approach – The integrated system consists of facial recognition, static pose recognition and dynamic hand motion tracking. Each sub-system was tested in isolation before integration and demonstration on a sample task.
Findings – It is demonstrated that combining multiple gesture control methods can broaden the potential applications of gesture control for industrial robots.
Originality/value – The novelty of the system lies in the combination of dual gesture control methods, which allows operators to command an industrial robot by posing hand gestures as well as to control the robot's motion by moving one hand in front of the sensor. A facial verification system is integrated to improve the robustness, reliability and security of the control system, and it also allows permission levels to be assigned to different users.
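
Illustration: the summary above describes the architecture only at a high level (facial verification gating per-user permission levels, static poses mapped to discrete commands, dynamic hand tracking driving continuous motion). The minimal Python sketch below shows one way such a fusion could be organised; every name in it (Permission, Frame, permission_for, interpret, the operator registry and the pose-to-command table) is a hypothetical assumption for illustration and is not taken from the paper.

from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional, Tuple


class Permission(Enum):
    # Permission levels assigned per verified user, as described in the abstract.
    NONE = auto()
    POSE_ONLY = auto()   # may issue discrete pose commands only
    FULL = auto()        # may also drive continuous hand-motion control


@dataclass
class Frame:
    # Placeholder for one sensor frame (e.g. an image plus tracked hand data).
    image: object
    hand_position: Optional[Tuple[float, float, float]]  # metres, sensor frame
    pose_label: Optional[str]                            # e.g. "open_palm", "fist"
    face_id: Optional[str]                               # verified operator ID, if any


def permission_for(face_id: Optional[str]) -> Permission:
    # Hypothetical lookup standing in for the facial-verification sub-system.
    registry = {"operator_A": Permission.FULL, "operator_B": Permission.POSE_ONLY}
    return registry.get(face_id, Permission.NONE)


def interpret(frame: Frame) -> Optional[dict]:
    # Fuse the three sub-systems into a single robot command:
    # facial verification gates everything; static poses map to discrete
    # commands; dynamic hand motion maps to a continuous jog target.
    level = permission_for(frame.face_id)
    if level is Permission.NONE:
        return None  # unverified user: ignore all gestures

    # Static pose recognition -> discrete commands (assumed mapping).
    pose_commands = {"fist": {"cmd": "stop"}, "open_palm": {"cmd": "resume"}}
    if frame.pose_label in pose_commands:
        return pose_commands[frame.pose_label]

    # Dynamic hand tracking -> continuous motion, full-permission users only.
    if level is Permission.FULL and frame.hand_position is not None:
        x, y, z = frame.hand_position
        return {"cmd": "jog", "target": (x, y, z)}

    return None


if __name__ == "__main__":
    frame = Frame(image=None, hand_position=(0.2, 0.0, 0.5),
                  pose_label=None, face_id="operator_A")
    print(interpret(frame))  # -> {'cmd': 'jog', 'target': (0.2, 0.0, 0.5)}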
ISSN: 0143-991X
eISSN: 1758-5791
DOI: 10.1108/IR-03-2015-0059