A Bayesian Shared Control Approach for Wheelchair Robot With Brain Machine Interface

Bibliographic Details
Published in: IEEE Transactions on Neural Systems and Rehabilitation Engineering, Vol. 28, No. 1, pp. 328-338
Main Authors: Deng, Xiaoyan; Yu, Zhu Liang; Lin, Canguang; Gu, Zhenghui; Li, Yuanqing
Format: Journal Article
Language: English
Published: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), United States, 01.01.2020
Summary: To enhance the performance of brain-actuated robot systems, a novel shared controller based on a Bayesian approach is proposed to intelligently combine automatic robot control with brain-actuated control, taking into account the uncertainty of robot perception, robot action, and human control. Based on maximum a posteriori (MAP) estimation, the method establishes probabilistic models of the human and robot control commands to realize optimal control of a brain-actuated shared control system. An application of this intelligent Bayesian shared control scheme, built on a steady-state visual evoked potential (SSVEP)-based brain-machine interface (BMI), is presented for an all-time continuous wheelchair navigation task. Moreover, to provide the shared controller with more accurate brain control commands and to adapt the system to the uncertainty of the electroencephalogram (EEG) signal, a hierarchical brain control mechanism with a feedback rule is designed. Experiments were conducted to verify the proposed system in several scenarios. Eleven subjects participated, and the results demonstrate the effectiveness of the proposed method.
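The abstract's core idea, fusing a human (BMI) command distribution with a robot (autonomy) command distribution and selecting the maximum a posteriori command, can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact formulation: the command set, the conditional-independence assumption, and the function names are all assumptions for illustration.

```python
import numpy as np

# Hypothetical candidate commands for the wheelchair (illustrative).
COMMANDS = ["forward", "left", "right", "stop"]

def map_fuse(p_human, p_robot, prior=None):
    """Fuse human (BMI decoder) and robot (autonomy) command likelihoods.

    Assuming the two observations are conditionally independent given the
    intended command c:
        P(c | h, r)  is proportional to  P(h | c) * P(r | c) * P(c)
    Returns the MAP command and the normalized posterior.
    """
    p_human = np.asarray(p_human, dtype=float)
    p_robot = np.asarray(p_robot, dtype=float)
    if prior is None:
        # Uniform prior over commands when no preference is given.
        prior = np.ones_like(p_human) / len(p_human)
    posterior = p_human * p_robot * prior
    posterior /= posterior.sum()  # normalize to a probability distribution
    return COMMANDS[int(np.argmax(posterior))], posterior

# Example: the SSVEP decoder weakly favors "left" and the robot's obstacle
# avoidance strongly favors "left"; the fused MAP command is "left".
cmd, post = map_fuse([0.2, 0.5, 0.2, 0.1], [0.1, 0.7, 0.1, 0.1])
# cmd == "left"
```

In the paper's setting, the robot-side distribution would come from its perception and navigation modules and the human-side distribution from the SSVEP classifier; here both are hand-set vectors purely for demonstration.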
ISSN: 1534-4320
EISSN: 1558-0210
DOI: 10.1109/TNSRE.2019.2958076