Wearable Smart Band for American Sign Language Recognition With Polymer Carbon Nanocomposite-Based Pressure Sensors

Bibliographic Details
Published in: IEEE Sensors Letters, Vol. 5, No. 6, pp. 1-4
Main Authors: Ramalingame, Rajarajan; Barioul, Rim; Li, Xupeng; Sanseverino, Giuseppe; Krumm, Dominik; Odenwald, Stephan; Kanoun, Olfa
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.06.2021

Summary: Conventional camera-based systems and electronic gloves for gesture recognition are limited by lighting conditions, occlusions, and movement restrictions. A wearable smart band with integrated nanocomposite pressure sensors has been developed to overcome these shortcomings. The sensors consist of carbon nanotubes homogeneously dispersed in a polydimethylsiloxane polymer matrix and are prepared by an optimized synthesis process. Because of the sensors' high sensitivity at low forces and their stability, the band can actively monitor contractions and relaxations of the muscles in the arm. The band has eight sensors placed on a stretchable adhesive textile material and connected to a data logger with a multiplexed sensor interface and wireless communication capabilities. The novel smart band was validated by measurements on ten subjects, each performing the numerical gestures 0 to 9 of American Sign Language in ten trials. The data were recorded at 100 Hz, yielding a total of 100 datasets per subject. By feeding the datasets to an extreme learning machine (ELM) algorithm that selects features, weights, and biases to classify the gestures, an overall gesture recognition accuracy of 93% was achieved.
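For illustration, the classification stage can be sketched as a basic extreme learning machine: random input weights and biases project each feature vector into a hidden layer, and only the output weights are fitted in closed form by least squares. The Python sketch below is illustrative rather than the authors' implementation; the per-channel statistics, hidden-layer size, and synthetic data are assumptions standing in for details the summary does not give (the eight channels, digits 0-9, and 100 Hz sampling rate are taken from the summary).

import numpy as np

rng = np.random.default_rng(0)

N_SENSORS = 8     # pressure channels on the band (from the summary)
N_CLASSES = 10    # ASL digits 0-9 (from the summary)
N_HIDDEN = 200    # hidden-node count: an assumption, tune on validation data

def extract_features(window):
    """Reduce one (samples x 8) pressure window to a fixed-length vector of
    simple per-channel statistics (assumed; the paper's features may differ)."""
    return np.concatenate([window.mean(axis=0),
                           window.std(axis=0),
                           window.max(axis=0) - window.min(axis=0)])

def train_elm(X, y, n_hidden=N_HIDDEN):
    """Fit ELM output weights by least squares; the hidden layer stays random."""
    W = rng.normal(size=(X.shape[1], n_hidden))  # random input weights
    b = rng.normal(size=n_hidden)                # random biases
    H = np.tanh(X @ W + b)                       # hidden-layer activations
    T = np.eye(N_CLASSES)[y]                     # one-hot targets
    beta = np.linalg.pinv(H) @ T                 # closed-form output weights
    return W, b, beta

def predict_elm(X, W, b, beta):
    return np.argmax(np.tanh(X @ W + b) @ beta, axis=1)

# Synthetic stand-in for the recorded data: one 1 s window (100 samples at
# 100 Hz, 8 channels) per trial, 100 trials per class. Random noise here,
# only so the script runs end to end.
windows = rng.normal(size=(N_CLASSES * 100, 100, N_SENSORS))
labels = np.repeat(np.arange(N_CLASSES), 100)
X = np.vstack([extract_features(w) for w in windows])

W, b, beta = train_elm(X, labels)
print("training accuracy:", (predict_elm(X, W, b, beta) == labels).mean())

On real recordings, the per-subject trials would be split into training and test sets before reporting an accuracy figure such as the 93% quoted above.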
ISSN: 2475-1472
DOI: 10.1109/LSENS.2021.3081689