
Sign to Text Language Translation in Real Time Using a Convolutional Neural Network

Fenil Panseriya, Harshil Patel, Dhruvil Patel
Dept. of IT, DDU
Nadiad, India.

Abstract:-
Sign language is one of the most reliable ways of communicating with people with special needs, as it can be used anywhere. However, most people do not understand sign language. We have therefore devised a desktop application that can recognize sign language and convert it to text in real time. This research uses American Sign Language (ASL) datasets and a Convolutional Neural Network (CNN) classifier. A CNN is highly efficient at tackling computer vision problems and, given sufficient training, can detect the desired features with a high degree of accuracy. In classification, the hand image is first passed through a filter; the filtered image is then passed to a classifier that predicts the class of the hand gesture. This research focuses on recognition accuracy. The application achieved almost 96% accuracy across the 26 letters of the alphabet.

Keywords:- CNN, ASL, training, Sign Language Recognition.
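The filter step described in the abstract (preprocessing the hand image before it reaches the classifier) can be illustrated with a minimal sketch. This is a hypothetical example, not the paper's implementation: the grayscale luminance weights are standard, but the 0.5 threshold and the function name are assumptions.

```python
import numpy as np

# Illustrative "filter" step from the pipeline: convert an RGB frame to
# grayscale, then binarize it so the hand silhouette stands out before
# it reaches the CNN classifier. The 0.5 threshold is an assumption.
def filter_hand(frame: np.ndarray, threshold: float = 0.5) -> np.ndarray:
    """frame: H x W x 3 array of floats in [0, 1]; returns a binary mask."""
    gray = frame @ np.array([0.299, 0.587, 0.114])  # standard luminance weights
    return (gray > threshold).astype(np.uint8)      # 1 where the hand is bright

frame = np.zeros((4, 4, 3))
frame[1:3, 1:3] = 1.0          # a bright 2x2 "hand" patch on a dark background
mask = filter_hand(frame)
print(mask.sum())              # 4 pixels above threshold
```

In practice a real system would use a more robust segmentation (e.g. skin-color or background-subtraction filters), but the shape of the step is the same: raw frame in, cleaned-up hand region out.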

1. Introduction
Artificial intelligence (AI) is the field of building systems capable of solving problems. Computer vision is a subfield of AI whose goal is to extract useful information from images, although it is challenging to implement. Computer vision has been used in robotics and photo scanning, and is also applied in the automotive, medical, mathematical, and industrial fields [1, 2]. American Sign Language (ASL) is a natural language with the same linguistic properties as spoken languages but its own distinct grammar; it is expressed through movements of the hands and body. For deaf and hard-of-hearing people in North America, it is a reliable means of communication. There is no single formal or universal sign language; different sign languages have developed in different regions. For example, British Sign Language (BSL) is an entirely different language from ASL, and people familiar with ASL would not easily understand BSL, although some nations have adopted features of ASL into their own sign languages. Sign language is a means of communication for people affected by speech and hearing loss. Around 360 million people worldwide suffer from hearing loss, of whom 328 million are adults and 32 million are children. A hearing impairment of more than 40 decibels in the better ear is classified as disabling hearing loss. With the growing number of people with deafness, there is a corresponding rise in demand for translators. Minimizing the communication gap between hearing-impaired and hearing people is necessary to ensure effective communication for all. Sign language translation is one of the fastest-growing lines of research today, and it is the most natural mode of communication for people with hearing impairments.

A hand gesture recognition system can offer deaf people a way to communicate with hearing people without the need for an interpreter. The system is built for the automatic conversion of ASL into text and speech. As such, this research aims to recognize hand gestures in ASL, which the system converts into readable text in real time, making communication with people with special needs easier. Hand gesture recognition is also relevant to Human-Computer Interaction (HCI), the study, planning, and design of the interaction between users and computers, because the user interacts with the system directly. One functional interaction for a hand gesture recognition system is displaying text composed of the letters read by the system [5, 6]. In this research, we utilize computer vision and pattern recognition technology to create a desktop application that detects hand movements in real time using a webcam or live camera. We then apply American Sign Language (ASL) datasets and a Convolutional Neural Network (CNN) classifier. This research focuses on the accuracy of recognizing letters of the alphabet and presents the results as text in real time.
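The last step of the real-time loop described above, turning the CNN's 26-way output into the text shown to the user, can be sketched as follows. This is an illustrative sketch only: the confidence threshold, the function name, and the idea of suppressing low-confidence predictions are assumptions, not details from the paper.

```python
import string
import numpy as np

# Decode a CNN softmax output vector (26 scores, one per letter) into the
# letter displayed to the user in real time. Illustrative sketch only:
# the 0.5 confidence cutoff is an assumption.
LETTERS = string.ascii_uppercase  # 'A'..'Z', one per class

def decode_prediction(probs: np.ndarray, min_confidence: float = 0.5):
    """Return (letter, confidence); letter is None if the model is unsure."""
    idx = int(np.argmax(probs))
    conf = float(probs[idx])
    letter = LETTERS[idx] if conf >= min_confidence else None
    return letter, conf

probs = np.zeros(26)
probs[7] = 0.96                      # e.g. the model is 96% sure of 'H'
print(decode_prediction(probs))      # ('H', 0.96)
```

In a live application this decode step would run once per captured frame, appending each confidently recognized letter to the on-screen text; frame capture itself would typically come from a webcam library such as OpenCV.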

2. Literature Survey

3. Methodology
3.1) Dataset

3.2) Implementing Algorithm

3.3) Evaluation

4. Results and Discussion

5. Conclusion
