Shared Human-Machine Control for Self-Aware Prostheses

Bibliographic Details
Published in: 2018 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 6593 - 6597
Main Authors: Dantas, Henrique; Nieveen, Jacob; Davis, Tyler S.; Fu, Xiao; Clark, Gregory A.; Warren, David J.; Mathews, V. John
Format: Conference Proceeding
Language: English
Published: IEEE, 01.04.2018

Summary: This paper presents a framework for shared, human-machine control of a prosthetic arm. The method employs electromyogram and peripheral neural signals to decode motor intent, and incorporates a higher-level goal in the controller to augment human effort. The controller is derived using Markov decision processes. The system is trained with a gradient-ascent approach in which the policy is parameterized by a Kalman filter, and the goal is incorporated by adapting the Kalman filter output online. The paper presents an experimental performance analysis of the shared controller when the goal information is imperfect. These results, obtained from an amputee subject and a subject with intact arms, demonstrate that a system controlled jointly by the human user and the machine exhibits better performance than systems employing machine-only or human-only control.
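The summary only sketches the control scheme, so the following Python sketch illustrates the general idea of blending a Kalman-filter intent decoder with a goal-directed machine command. The state and feature dimensions, the filter matrices, the fixed sharing weight alpha, and the simulated reach are all illustrative assumptions, not the paper's actual parameterization (which trains the policy by gradient ascent and adapts the Kalman filter output online).

import numpy as np

rng = np.random.default_rng(0)

n_state = 2   # intended 2-D end-effector velocity (assumed dimensionality)
n_obs = 8     # number of EMG/neural feature channels (assumed)

# Placeholder Kalman filter model; in the paper these matrices would be
# fit from training data relating recorded features to intended movement.
A = 0.95 * np.eye(n_state)                 # state transition (smooth intent)
W = 0.01 * np.eye(n_state)                 # process noise covariance
H = rng.standard_normal((n_obs, n_state))  # feature-encoding matrix
Q = 0.5 * np.eye(n_obs)                    # observation noise covariance

def kalman_step(x, P, z):
    """One predict/update step of the neural-intent decoder."""
    x_pred = A @ x
    P_pred = A @ P @ A.T + W
    S = H @ P_pred @ H.T + Q               # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)    # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(n_state) - K @ H) @ P_pred
    return x_new, P_new

def shared_command(v_human, pos, goal, alpha):
    """Blend the decoded human command with a goal-directed machine command.

    alpha (0 = machine only, 1 = human only) is a hypothetical fixed
    sharing weight; the paper instead adapts the Kalman filter output
    online, so this blending rule is a stand-in for that mechanism.
    """
    to_goal = goal - pos
    dist = np.linalg.norm(to_goal)
    v_machine = to_goal / dist if dist > 1e-6 else np.zeros_like(to_goal)
    return alpha * v_human + (1.0 - alpha) * v_machine

# Closed-loop reach toward a target, with imperfect goal information
# available to the machine (the setting the experimental analysis probes).
x, P = np.zeros(n_state), np.eye(n_state)
pos = np.zeros(2)
goal_true = np.array([1.0, 0.5])
goal_est = goal_true + rng.normal(scale=0.1, size=2)   # noisy goal estimate

for _ in range(100):
    intent = goal_true - pos                            # user's true intent
    z = H @ intent + rng.normal(scale=0.5, size=n_obs)  # simulated features
    x, P = kalman_step(x, P, z)
    v = shared_command(x, pos, goal_est, alpha=0.5)
    pos = pos + 0.05 * v                                # integrate velocity

print("final position:", pos, "true target:", goal_true)

With alpha = 1 the loop reduces to human-only (decoder-driven) control and with alpha = 0 to machine-only control, the two baselines the summary compares against; an alpha adapted online from decoder confidence or goal uncertainty would be closer in spirit to the paper's approach.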
ISSN: 2379-190X
DOI: 10.1109/ICASSP.2018.8461440