Indian Sign Language Interpreter for Deaf & Mute People
Sign language is the primary mode of communication for the deaf and mute population. According to a World Health Organization (WHO) survey, 460 million people, including men, women and children, suffer from hearing disability; of these, about 12.3 million live in India. Most of this population relies on a third person to translate hand-sign gestures into the regionally spoken language, but involving a translator compromises the privacy of the conversation. Rapid developments in Artificial Intelligence (AI) and Machine Learning (ML) make it possible to convert hand-sign gestures into words that hearing people can understand. The Indian sign language interpreter for deaf and mute people follows a vision-based approach, using a 3D Convolutional Neural Network (3D-CNN) and a Long Short-Term Memory (LSTM) neural network to map input Indian Sign Language gesture images to their meanings. The system aims to interpret Indian Sign Language hand-gesture images for alphabets, numbers, sentences and emergency words, converting images of hand gestures to text or audio. The proposed technique, trained on data from various sources, will help bridge the communication gap between deaf and mute people and others.
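To make the described pipeline concrete, the sketch below shows one plausible way to combine a 3D-CNN with an LSTM for gesture classification. The abstract does not give the authors' exact architecture, so the layer sizes, class count, and clip dimensions here are illustrative assumptions, not the paper's actual model.

```python
import torch
import torch.nn as nn

class SignInterpreter(nn.Module):
    """Hypothetical 3D-CNN + LSTM classifier for sign-gesture clips.

    Input: a batch of video clips shaped (batch, channels, frames, H, W).
    The 3D convolutions capture short-range spatio-temporal features;
    the LSTM models the longer gesture sequence; a linear head maps the
    final hidden state to sign classes (alphabets, numbers, words).
    """

    def __init__(self, num_classes: int = 50):
        super().__init__()
        self.cnn3d = nn.Sequential(
            nn.Conv3d(3, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool3d((1, 2, 2)),            # pool space, keep time
            nn.Conv3d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool3d((1, 2, 2)),
            nn.AdaptiveAvgPool3d((None, 1, 1)),  # collapse H and W
        )
        self.lstm = nn.LSTM(input_size=64, hidden_size=128, batch_first=True)
        self.head = nn.Linear(128, num_classes)

    def forward(self, clips: torch.Tensor) -> torch.Tensor:
        feats = self.cnn3d(clips)               # (B, 64, T, 1, 1)
        feats = feats.squeeze(-1).squeeze(-1)   # (B, 64, T)
        feats = feats.transpose(1, 2)           # (B, T, 64) for the LSTM
        _, (h_n, _) = self.lstm(feats)
        return self.head(h_n[-1])               # logits per sign class

# Example: 8 clips of 16 RGB frames at 64x64 resolution.
model = SignInterpreter(num_classes=50)
logits = model(torch.randn(8, 3, 16, 64, 64))
print(logits.shape)  # torch.Size([8, 50])
```

In this arrangement the 3D convolutions downsample only the spatial axes so the frame dimension survives for the LSTM; predicted class indices would then be mapped to text, and optionally to audio via a text-to-speech stage, as the abstract describes.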
| Journal | International Journal of Creative Research Thoughts |
|---|---|
| Publisher | IJCRT |
| Open Access | Yes |