Adaptive-Constrained Impedance Control for Human-Robot Co-Transportation


Bibliographic Details
Published in: IEEE Transactions on Cybernetics, Vol. 52, No. 12, pp. 13237-13249
Main Authors: Yu, Xinbo; Li, Bin; He, Wei; Feng, Yanghe; Cheng, Long; Silvestre, Carlos
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.12.2022

Summary: Human-robot co-transportation allows a human and a robot to cooperatively transport an object in a shared environment. This class of applications raises numerous theoretical and practical challenges, arising mainly from the unknown human-robot interaction model and from the difficulty of accurately modeling the robot dynamics. In this article, an adaptive impedance controller for human-robot co-transportation is put forward in task space. Vision and force sensing are employed to obtain the human hand position and to measure the interaction force between the human and the robot. Drawing on recent developments in nonlinear control theory, we propose a robot end-effector controller that tracks the motion of the human partner under actuator input constraints, unknown initial conditions, and unknown robot dynamics. The proposed adaptive impedance control algorithm ensures safe interaction between the human and the robot and achieves smooth control behavior across the different phases of the co-transportation task. Simulations and experiments illustrate the performance of the proposed techniques in a co-transportation task.
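The record does not include the paper's control law, but the summary's core idea of a task-space impedance relation with bounded actuator commands can be sketched generically. The following is a minimal illustration, not the authors' controller: the function names, the choice of a mass-damper-stiffness target impedance, and the tanh-based saturation used to respect input constraints are all assumptions for illustration.

```python
import numpy as np

def impedance_accel(x, xd, x_ref, xd_ref, f_ext, Md, Dd, Kd):
    """Generic task-space target impedance (illustrative, not the paper's law):
        Md * a + Dd * (xd - xd_ref) + Kd * (x - x_ref) = f_ext,
    solved for the commanded end-effector acceleration a.
    x, xd      : end-effector position / velocity
    x_ref, xd_ref : reference (e.g., estimated human-hand) position / velocity
    f_ext      : measured human-robot interaction force
    Md, Dd, Kd : desired inertia, damping, stiffness matrices
    """
    e = x - x_ref          # position tracking error
    ed = xd - xd_ref       # velocity tracking error
    return np.linalg.solve(Md, f_ext - Dd @ ed - Kd @ e)

def saturate(u, u_max):
    """One smooth way to keep commanded inputs within actuator limits |u| < u_max."""
    return u_max * np.tanh(u / u_max)
```

With identity inertia and stiffness, zero force, and a unit position error, the commanded acceleration simply pulls the end-effector back toward the reference; the saturation then bounds whatever torque or acceleration command is derived from it.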
ISSN: 2168-2267, 2168-2275
DOI: 10.1109/TCYB.2021.3107357