Contextual task-aware shared autonomy for assistive mobile robot teleoperation

Bibliographic Details
Published in: Proceedings of the ... IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 3311 - 3318
Main Authors: Gao, Ming; Oberlander, Jan; Schamm, Thomas; Zollner, J. Marius
Format: Conference Proceeding
Language: English
Published: IEEE, 01.09.2014
ISSN: 2153-0858
DOI: 10.1109/IROS.2014.6943023

More Information
Summary: For robot applications in unknown or even hazardous environments, such as search and rescue, it is difficult and stressful for a human operator to teleoperate a mobile robot without assistance from the robot itself. Consequently, means of facilitating efficient shared autonomy between human and robot are the subject of much research in robotics. This paper proposes a novel shared autonomy system that recognizes user intention by estimating, from contextual information, the task the user is executing, and provides motion assistance according to these inferences. To account for the uncertainty of contextual task recognition, a Gaussian Mixture Regression model combined with a recursive Bayesian filter is adopted, which adapts to the implicit user model of task execution during operation. The proposed method is applied to the problem of controlling a flying robot in the context of two task types, doorway crossing and object inspection, and its benefits are demonstrated in simulation.
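The abstract describes inferring the operator's current task with a recursive Bayesian filter driven by contextual observations, then shaping motion assistance from the resulting task estimate. As a rough illustration only, and not the authors' implementation, the Python sketch below runs a recursive Bayesian filter over the two task hypotheses named in the abstract (doorway crossing and object inspection); the single-Gaussian observation models, the 1-D context feature, and the numeric values are hypothetical stand-ins for the paper's Gaussian Mixture Regression models.

import numpy as np

# Illustrative sketch only (not the authors' code): a recursive Bayesian
# filter over discrete task hypotheses. Each task's observation likelihood
# is a single Gaussian over one hypothetical context feature (e.g. distance
# to the nearest doorway), a simplified stand-in for the GMR models
# described in the abstract.

TASKS = ["doorway_crossing", "object_inspection"]

# Hypothetical per-task observation models: (mean, std) of the feature.
OBS_MODELS = {
    "doorway_crossing": (0.5, 0.3),   # operator steers close to a doorway
    "object_inspection": (2.0, 0.8),  # operator hovers farther away, near an object
}

def gaussian_pdf(x, mean, std):
    return np.exp(-0.5 * ((x - mean) / std) ** 2) / (std * np.sqrt(2.0 * np.pi))

def bayes_update(belief, observation, stay_prob=0.9):
    """One recursive Bayesian filter step: predict with a sticky transition
    model, then weight each task hypothesis by its observation likelihood."""
    n = len(TASKS)
    # Prediction: the operator usually keeps executing the same task.
    trans = np.full((n, n), (1.0 - stay_prob) / (n - 1))
    np.fill_diagonal(trans, stay_prob)
    predicted = trans.T @ belief
    # Correction: likelihood of the observed feature under each task model.
    likelihood = np.array([gaussian_pdf(observation, *OBS_MODELS[t]) for t in TASKS])
    posterior = likelihood * predicted
    return posterior / posterior.sum()

if __name__ == "__main__":
    belief = np.full(len(TASKS), 1.0 / len(TASKS))  # uniform prior over tasks
    for z in [1.8, 1.5, 0.9, 0.6, 0.4]:             # hypothetical feature stream
        belief = bayes_update(belief, z)
        print({t: round(float(p), 3) for t, p in zip(TASKS, belief)})

In such a scheme, the belief would shift toward the doorway-crossing hypothesis as the feature values approach that task's model, and the assistance behavior (e.g. alignment with the doorway) could be blended in proportion to the posterior; how the paper blends assistance with operator commands is not specified here.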