A Chat Application for Disabled using Convolutional Neural Network Deep Learning Algorithm
DOI: https://doi.org/10.54368/qijirse.2.2.0014

Keywords: Convolutional Neural Networks (CNN), Deep Learning, Deep Learning Algorithm, Deep Learning Techniques, Preference

Abstract
This research paper concentrates on creating a video chat application designed for individuals who are unable to speak or hear, with a specific focus on Indian Sign Language (ISL). The application employs a Deep Learning algorithm, specifically a CNN, to recognize the hand gestures performed by users. Once a user begins displaying hand gestures to the camera, the algorithm identifies the corresponding phrase, number, or letter and transmits it to the front end, where sentences are constructed. The goal of this project is to create a tool that allows people to communicate with individuals who have been deaf or unable to speak since birth. The project contributes to the growing research area of Sign Language Recognition, which is becoming increasingly important in helping people with disabilities interact with others and lead more fulfilling lives.
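To make the recognition pipeline described above concrete, the sketch below shows a minimal CNN image classifier of the kind that could map a single camera frame of a hand gesture to a letter, number, or phrase class. This is an illustrative sketch only, not the paper's actual architecture: the input size, layer configuration, and the assumed 36 classes (26 letters plus 10 digits) are placeholders, and the phrase vocabulary and training data are not specified here.

```python
# Illustrative sketch: a small Keras CNN for hand-gesture frame classification.
# Input shape, layer sizes, and NUM_CLASSES are assumptions, not the authors' design.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 36  # assumed: 26 letters + 10 digits; the paper also mentions phrases


def build_gesture_cnn(input_shape=(64, 64, 3), num_classes=NUM_CLASSES):
    """Build a small CNN that maps a hand-gesture frame to a class label."""
    model = models.Sequential([
        layers.Conv2D(32, (3, 3), activation="relu", input_shape=input_shape),
        layers.MaxPooling2D((2, 2)),
        layers.Conv2D(64, (3, 3), activation="relu"),
        layers.MaxPooling2D((2, 2)),
        layers.Conv2D(128, (3, 3), activation="relu"),
        layers.MaxPooling2D((2, 2)),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model


if __name__ == "__main__":
    model = build_gesture_cnn()
    model.summary()  # prints the layer stack; training data would be labeled ISL gesture frames
```

In a deployment along the lines described in the abstract, each predicted class label would be sent from the recognition back end to the chat front end, where consecutive predictions are assembled into words and sentences.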
License
Copyright (c) 2023 Quing: IJIRSE
This work is licensed under a Creative Commons Attribution 4.0 International License.