Hierarchical Deep Learning for Intention Estimation of Teleoperation Manipulation in Assembly Tasks


Bibliographic Details
Published in: arXiv.org
Main Authors: Cai, Mingyu; Patel, Karankumar; Iba, Soshi; Li, Songpo
Format: Paper; Journal Article
Language: English
Published: Ithaca: Cornell University Library, arXiv.org, 28.03.2024

Summary: In human-robot collaboration, shared control presents an opportunity to teleoperate robotic manipulation and improve the efficiency of manufacturing and assembly processes. Robots are expected to assist in executing the user's intentions. To this end, robust and prompt intention estimation is needed, relying on behavioral observations. The framework presented here estimates intentions at two hierarchical levels, i.e., low-level actions and high-level tasks, by incorporating multi-scale hierarchical information in neural networks. Technically, we employ a hierarchical dependency loss to boost overall accuracy. Furthermore, we propose a multi-window method that assigns proper prediction windows of input data to each hierarchical level. An analysis of predictive power with various inputs demonstrates the advantage of the deep hierarchical model in terms of prediction accuracy and early intention identification. We implement the algorithm in a virtual reality (VR) setup to teleoperate robotic hands in simulation across various assembly tasks, demonstrating the effectiveness of online estimation.
ISSN: 2331-8422
DOI: 10.48550/arxiv.2403.19770
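
The summary above mentions two ideas, a two-level (action/task) prediction head and a hierarchical dependency loss that couples the levels. The sketch below is a minimal, hypothetical illustration of that coupling, not the authors' implementation: the network architecture, feature dimensions, the fixed action-to-task membership matrix, and the loss weighting `lam` are all assumptions made for the example.

```python
# Minimal sketch (PyTorch) of a two-level intention classifier with a
# hierarchy-consistency term. All names, sizes, and the action->task mapping
# are illustrative assumptions, not the paper's actual model.
import torch
import torch.nn as nn
import torch.nn.functional as F

NUM_ACTIONS, NUM_TASKS, FEATURE_DIM = 8, 3, 32

# Hypothetical membership matrix: actions 0-2 belong to task 0, 3-5 to task 1, 6-7 to task 2.
action_to_task = torch.zeros(NUM_ACTIONS, NUM_TASKS)
action_to_task[0:3, 0] = 1.0
action_to_task[3:6, 1] = 1.0
action_to_task[6:8, 2] = 1.0

class HierarchicalEstimator(nn.Module):
    """Shared sequence encoder with two heads: low-level action and high-level task."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.GRU(FEATURE_DIM, 64, batch_first=True)
        self.action_head = nn.Linear(64, NUM_ACTIONS)
        self.task_head = nn.Linear(64, NUM_TASKS)

    def forward(self, x):
        _, h = self.encoder(x)      # h: (1, batch, 64) final hidden state
        h = h.squeeze(0)
        return self.action_head(h), self.task_head(h)

def hierarchical_dependency_loss(action_logits, task_logits, action_y, task_y, lam=0.5):
    """Cross-entropy at both levels plus a consistency penalty on probability
    mass placed on (action, task) pairs that violate the assumed hierarchy."""
    ce = F.cross_entropy(action_logits, action_y) + F.cross_entropy(task_logits, task_y)
    p_action = F.softmax(action_logits, dim=-1)
    p_task = F.softmax(task_logits, dim=-1)
    implied_task = p_action @ action_to_task   # task distribution implied by the action head
    consistency = F.mse_loss(implied_task, p_task)
    return ce + lam * consistency

# Toy usage: one window of behavioral features per sample.
model = HierarchicalEstimator()
x = torch.randn(4, 20, FEATURE_DIM)                 # 4 windows, 20 timesteps each
action_y = torch.randint(0, NUM_ACTIONS, (4,))
task_y = action_to_task[action_y].argmax(dim=-1)    # task labels consistent with actions
a_logits, t_logits = model(x)
loss = hierarchical_dependency_loss(a_logits, t_logits, action_y, task_y)
loss.backward()
```

In this toy version the dependency term simply pulls the task head toward the task distribution implied by the action head through the fixed membership matrix; the paper's multi-window input handling and its exact loss formulation are not reproduced here.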