Convolutional Neural Network Based American Sign Language Static Hand Gesture Recognition

Bibliographic Details
Published in: International Journal of Ambient Computing and Intelligence, Vol. 10, No. 3, pp. 60-73
Main Authors: Ahuja, Ravinder; Jain, Daksh; Sachdeva, Deepanshu; Garg, Archit; Rajput, Chirag
Format: Journal Article
Language: English
Published: Hershey: IGI Global, 01.07.2019
ISSN: 1941-6237, 1941-6245
DOI: 10.4018/IJACI.2019070104

Summary: Communicating with one another through hand gestures is known as sign language, a widely accepted means of communication among deaf and mute people. The deaf and mute community faces many obstacles in day-to-day communication with their acquaintances. A recent study by the World Health Organization reports that a very large share of the world's population, around 360 million people (5.3% of the total), has hearing loss. This motivates the development of an automated system that converts hand gestures into meaningful words and sentences. A Convolutional Neural Network (CNN) is applied to 24 static hand signals of American Sign Language to enhance the ease of communication. OpenCV was used for supporting steps such as image preprocessing. The results demonstrate that the CNN achieves an accuracy of 99.7% on a dataset found on kaggle.com.
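The summary describes the core pipeline: a preprocessed grayscale gesture image is passed through convolution, non-linearity, and pooling before a classifier over 24 output classes (static ASL letters, typically A-Y excluding the motion-based J and Z). The record does not give the paper's architecture, so the sketch below is only a minimal, NumPy-only illustration of those CNN building blocks; the 28x28 input size and all layer shapes are assumptions, not details from the article.

```python
import numpy as np

def conv2d(img, kernel):
    """Valid 2-D convolution of a single-channel image with one kernel."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(x, size=2):
    """Non-overlapping max pooling, cropping any ragged border."""
    h2, w2 = x.shape[0] // size, x.shape[1] // size
    return x[:h2 * size, :w2 * size].reshape(h2, size, w2, size).max(axis=(1, 3))

def softmax(z):
    """Numerically stable softmax over the class scores."""
    e = np.exp(z - z.max())
    return e / e.sum()

rng = np.random.default_rng(0)
img = rng.random((28, 28))                  # assumed 28x28 grayscale gesture image
kernel = rng.standard_normal((3, 3))        # one untrained 3x3 filter
feat = np.maximum(conv2d(img, kernel), 0.0) # convolution + ReLU -> 26x26 feature map
pooled = max_pool(feat)                     # 2x2 max-pool -> 13x13
W = rng.standard_normal((24, pooled.size)) * 0.01  # dense layer over 24 gesture classes
probs = softmax(W @ pooled.ravel())         # class probabilities, shape (24,)
```

In a trained network the kernel and dense weights would be learned from labeled gesture images rather than drawn at random, and frameworks such as Keras or PyTorch would replace these hand-rolled loops.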