Cortical control of a prosthetic arm for self-feeding

Bibliographic Details
Published in: Nature, Vol. 453, No. 7198, pp. 1098–1101
Main Authors: Velliste, Meel; Perel, Sagi; Spalding, M. Chance; Whitford, Andrew S.; Schwartz, Andrew B.
Format: Journal Article
Language: English
Published: London: Nature Publishing Group UK, 19.06.2008
ISSN: 0028-0836, 1476-4687, 1476-4679
DOI: 10.1038/nature06996

Summary: Robotic limbs: Controlled by a thought. Brain–machine interfaces have previously been used mostly to move cursors on computer displays. Experiments on macaque monkeys now show that brain activity signals can control a multi-jointed prosthetic device in real time: the monkeys used motor cortical activity to control a human-like prosthetic arm in a self-feeding task, with a greater sophistication of control than previously possible. This work could be important for the development of more practical neuro-prosthetic devices in the future.

Arm movement is well represented in populations of neurons recorded from the motor cortex [1-7]. Cortical activity patterns have been used in the new field of brain–machine interfaces [8-11] to show how cursors on computer displays can be moved in two- and three-dimensional space [12-22]. Although the ability to move a cursor can be useful in its own right, this technology could be applied to restore arm and hand function for amputees and paralysed persons. However, the use of cortical signals to control a multi-jointed prosthetic device for direct real-time interaction with the physical environment (‘embodiment’) has not been demonstrated. Here we describe a system that permits embodied prosthetic control; we show how monkeys (Macaca mulatta) use their motor cortical activity to control a mechanized arm replica in a self-feeding task. In addition to the three dimensions of movement, the subjects’ cortical signals also proportionally controlled a gripper on the end of the arm. Owing to the physical interaction between the monkey, the robotic arm and objects in the workspace, this new task presented a higher level of difficulty than previous virtual (cursor-control) experiments. Apart from an example of simple one-dimensional control [23], previous experiments have lacked physical interaction even in cases where a robotic arm [16,19,24] or hand [20] was included in the control loop, because the subjects did not use it to interact with physical objects, an interaction that cannot be fully simulated. This demonstration of multi-degree-of-freedom embodied prosthetic control paves the way towards the development of dexterous prosthetic devices that could ultimately achieve arm and hand function at a near-natural level.
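The record above does not spell out how cortical activity is turned into a four-dimensional command (3-D endpoint velocity plus proportional gripper control), so the following is only a minimal, hypothetical sketch of a population-vector-style decoder of the kind commonly used in motor cortical brain–machine interfaces: each unit is assumed to be cosine-tuned to a preferred direction in the 4-D control space, and the decoded command is the firing-rate-weighted sum of those preferred directions. All unit counts, tuning parameters, and gains below are illustrative assumptions, not values from the paper.

import numpy as np

# Hypothetical population-vector-style decoder (illustrative sketch only).
# Each recorded motor-cortical unit is modelled as cosine-tuned to a
# "preferred direction" in a 4-D control space: x, y, z endpoint velocity
# plus gripper aperture velocity. The decoded command is the firing-rate-
# weighted sum of the units' preferred directions.

rng = np.random.default_rng(0)

N_UNITS = 100   # number of recorded units (assumed, not from the paper)
N_DIMS = 4      # vx, vy, vz + gripper aperture velocity

# Preferred directions: unit vectors in the 4-D control space.
pref_dirs = rng.normal(size=(N_UNITS, N_DIMS))
pref_dirs /= np.linalg.norm(pref_dirs, axis=1, keepdims=True)

# Baseline firing rate and modulation depth per unit (spikes/s, assumed).
baseline = rng.uniform(5.0, 20.0, size=N_UNITS)
modulation = rng.uniform(5.0, 15.0, size=N_UNITS)

def decode_command(rates, gain=0.1):
    """Map one bin of firing rates (shape: N_UNITS) to a 4-D command."""
    # Normalize each unit's rate relative to its baseline and modulation.
    normalized = (rates - baseline) / modulation
    # Population vector: rate-weighted sum of preferred directions.
    return gain * (normalized @ pref_dirs) / N_UNITS

# Example: simulate rates for an intended movement and decode them back.
intended = np.array([1.0, 0.0, 0.0, 0.5])
intended /= np.linalg.norm(intended)
rates = baseline + modulation * (pref_dirs @ intended) + rng.normal(0.0, 1.0, N_UNITS)

print("decoded command (vx, vy, vz, gripper):", decode_command(rates))

In a real closed-loop system the decoded command would be integrated into arm endpoint position and gripper aperture at each control cycle, with the animal correcting errors visually; here the command is simply printed.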