Trbaggboost: an ensemble-based transfer learning method applied to Indian Sign Language recognition

Bibliographic Details
Published in: Journal of Ambient Intelligence and Humanized Computing, Vol. 13, No. 7, pp. 3527–3537
Main Authors: Sharma, S., Gupta, R., Kumar, A.
Format: Journal Article
Language: English
Published: Berlin/Heidelberg: Springer Berlin Heidelberg, 01.07.2022 (Springer Nature B.V.)
Summary: An efficient sign language recognition (SLR) system would help speech- and hearing-impaired people communicate with people who do not have such impairments. This work aims to develop an SLR system for Indian Sign Language using data acquired from multichannel surface electromyogram sensors, tri-axis accelerometers and tri-axis gyroscopes placed on both forearms of the signers. A novel ensemble-based transfer learning algorithm called Trbaggboost is proposed, which uses a small amount of labelled data from a new subject along with labelled data from other subjects to train an ensemble of learners for predicting unlabelled data from the new subject. Conventional machine learning algorithms such as decision tree, support vector machine and random forest (RF) are used as base learners. The sign classification results of Trbaggboost are compared with commonly used transfer learning algorithms such as TrAdaboost, TrResampling and TrBagg, and with a simple bagging approach such as RF. The average accuracy for classification of signs performed by a new subject is 69.56% when RF is used without transfer learning. When just two observations of labelled data from the new subject are integrated with the training data of an existing SLR system, the average classification accuracies for TrAdaboost, TrResampling, TrBagg and RF are 71.07%, 72.92%, 76.10% and 76.79%, respectively. For the same amount of labelled data from the new subject, however, Trbaggboost yields an average classification accuracy of 80.44%, indicating the effectiveness of the algorithm. Moreover, the classification accuracy of Trbaggboost improves up to 97.04% as the amount of labelled data from the new user increases.
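For readers who want a concrete picture of the bagging-with-filtering idea sketched in the abstract, the following Python sketch shows one way such a transfer ensemble could be organised: base learners are trained on bootstrap samples of pooled data from other subjects plus the few labelled observations from the new subject, and only learners that validate well on those new-subject observations are retained for majority voting. The function names, the filtering rule and the 0.5 accuracy threshold are illustrative assumptions; this is not the published Trbaggboost algorithm, whose exact boosting and weighting steps are defined in the paper.

import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.utils import resample

def bagged_transfer_ensemble(X_source, y_source, X_target_small, y_target_small,
                             n_estimators=50, keep_threshold=0.5, random_state=0):
    # Pool labelled data from other subjects with the few labelled
    # observations from the new subject.
    rng = np.random.RandomState(random_state)
    X_pool = np.vstack([X_source, X_target_small])
    y_pool = np.concatenate([y_source, y_target_small])

    ensemble = []
    for _ in range(n_estimators):
        # Bootstrap-sample the pooled data and fit one base learner
        # (a decision tree here; SVM or random forest could also serve).
        Xb, yb = resample(X_pool, y_pool, random_state=rng.randint(1 << 30))
        clf = DecisionTreeClassifier(random_state=rng.randint(1 << 30))
        clf.fit(Xb, yb)
        # Filtering step (assumed): keep a learner only if it classifies
        # the small labelled new-subject sample reasonably well.
        if clf.score(X_target_small, y_target_small) >= keep_threshold:
            ensemble.append(clf)
    return ensemble

def predict_majority(ensemble, X):
    # Majority vote over the retained base learners
    # (integer-coded class labels assumed).
    votes = np.stack([clf.predict(X) for clf in ensemble]).astype(int)
    return np.array([np.bincount(col).argmax() for col in votes.T])

In this sketch, unlabelled signs from the new subject would be classified with predict_majority(ensemble, X_new); the experiments reported in the paper instead evaluate the full Trbaggboost, TrAdaboost, TrResampling, TrBagg and RF pipelines on the sEMG, accelerometer and gyroscope features.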
ISSN: 1868-5137, 1868-5145
DOI: 10.1007/s12652-020-01979-z