Decoding static and dynamic arm and hand gestures from the JPL BioSleeve
Published in | 2013 IEEE Aerospace Conference, pp. 1 - 9
---|---
Main Authors | , , , , , , ,
Format | Conference Proceeding
Language | English
Published | IEEE, 01.03.2013
Summary | This paper presents methods for inferring arm and hand gestures from forearm surface electromyography (EMG) sensors and an inertial measurement unit (IMU). These sensors, together with their electronics, are packaged in an easily donned device, termed the BioSleeve, worn on the forearm. The gestures decoded from BioSleeve signals can provide natural user interface commands to computers and robots, without encumbering the user's hands and without the problems that hinder camera-based systems. Potential aerospace applications for this technology include gesture-based crew-autonomy interfaces, high-degree-of-freedom robot teleoperation, and astronauts' control of power-assisted gloves during extra-vehicular activity (EVA). We have developed techniques to interpret both static (stationary) and dynamic (time-varying) gestures from the BioSleeve signals, enabling a diverse and adaptable command library. For static gestures, we achieved over 96% accuracy on 17 gestures and nearly 100% accuracy on 11 gestures, based solely on EMG signals. Nine dynamic gestures were decoded with an accuracy of 99%. This combination of wearable EMG and IMU hardware and accurate algorithms for decoding both static and dynamic gestures thus shows promise for natural user interface applications.
ISBN | 9781467318129; 1467318124
ISSN | 1095-323X; 2996-2358
DOI | 10.1109/AERO.2013.6497171
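
The summary describes static-gesture classification from multi-channel surface EMG but gives no implementation details. The sketch below is a generic illustration only, not the paper's algorithm: it assumes windowed multi-channel EMG recordings, per-channel RMS amplitude features, and a simple nearest-centroid classifier. The channel count, window length, and gesture names are hypothetical placeholders.

```python
import numpy as np

# Illustrative sketch only -- not the BioSleeve paper's method. It shows one
# common baseline for static-gesture classification from surface EMG:
# per-channel RMS features plus a nearest-centroid classifier.

N_CHANNELS = 16   # hypothetical number of EMG electrodes in the sleeve
WINDOW = 200      # hypothetical samples per analysis window


def rms_features(window_samples: np.ndarray) -> np.ndarray:
    """Root-mean-square amplitude per channel for one analysis window.

    window_samples: array of shape (WINDOW, N_CHANNELS).
    Returns a feature vector of shape (N_CHANNELS,).
    """
    return np.sqrt(np.mean(window_samples ** 2, axis=0))


class NearestCentroidGestureClassifier:
    """Stores the mean feature vector of each static gesture class and labels
    new windows by the closest centroid (Euclidean distance)."""

    def __init__(self):
        self.centroids = {}  # gesture label -> mean feature vector

    def fit(self, feature_vectors: np.ndarray, labels: list[str]) -> None:
        for label in set(labels):
            mask = np.array([l == label for l in labels])
            self.centroids[label] = feature_vectors[mask].mean(axis=0)

    def predict(self, feature_vector: np.ndarray) -> str:
        return min(
            self.centroids,
            key=lambda label: np.linalg.norm(feature_vector - self.centroids[label]),
        )


# Example usage with synthetic data standing in for calibration recordings;
# the gesture names are hypothetical.
rng = np.random.default_rng(0)
train_windows = rng.normal(size=(40, WINDOW, N_CHANNELS))
train_labels = ["fist"] * 20 + ["open_hand"] * 20
features = np.array([rms_features(w) for w in train_windows])

clf = NearestCentroidGestureClassifier()
clf.fit(features, train_labels)
print(clf.predict(rms_features(train_windows[0])))
```

In practice, the classifier would be trained per user from short calibration recordings of each gesture; dynamic gestures would additionally require time-series features (for example from the IMU), which this sketch does not cover.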