- Open Access
- Authors: Kuo-Pao Yang, Bonnie Achee, Ashley Lewis, Sanele Hamon, Laura Sulzer
- Paper ID: IJERTV13IS120057
- Volume & Issue: Volume 13, Issue 12 (December 2024)
- Published (First Online): 20-12-2024
- ISSN (Online): 2278-0181
- Publisher Name: IJERT
- License: This work is licensed under a Creative Commons Attribution 4.0 International License
Talk to the Hand: Sign Language Glove
Kuo-Pao Yang, Bonnie Achee, Ashley Lewis, Sanele Hamon, Laura Sulzer
Computer Science Department, Southeastern Louisiana University, Hammond, LA 70402, USA
Abstract—This paper presents a device for people who are deaf or hard of hearing, giving them the ability to communicate with others who do not know sign language. American Sign Language (ASL) is not a common form of communication, with just over half a million people using it throughout the United States. Many deaf or hearing-impaired individuals have a hard time conveying what they mean to people who do not sign. Of the 333 million people living in the U.S. (from the 2022 census), 6.6 million are profoundly deaf. The discrepancy between ASL users and deaf individuals is stark, and it is even starker against the overall population. With today's technology, the development of a glove capable of translating sign language into text is feasible, and such a glove can break the language barrier between deaf or hearing-impaired people and people unfamiliar with American Sign Language. The glove uses five flex sensors, an accelerometer, a Bluetooth module, and a microcontroller board to translate sign language in real time, as the combination of these components allows the glove to detect the motions of ASL.
Keywords—American Sign Language (ASL); Arduino; Flex Sensor; Accelerometer; Bluetooth
INTRODUCTION
Very few people know American Sign Language (ASL) relative to the American population. Those with limited ways to communicate generally rely on writing and gesturing to express their thoughts to the public. In service settings, such as restaurants or drive-thru lanes, speech- or hearing-impaired people will often type their orders on their phones or write them down on a piece of paper, and they decipher what non-ASL users are saying either through writing or by reading lips. This inefficient form of communication can be time-consuming or confusing for at least one party involved. For someone whose main form of communication is ASL, it would be far easier to speak in the way they know best and be understood more efficiently, allowing them to communicate comfortably and without misunderstanding even with someone unfamiliar with ASL. The Sign Language Glove makes this possible by translating ASL in real time into text on an app.
Along with providing a more efficient way of communicating with non-ASL users, the glove can be used in a teaching setting for those wanting to learn ASL. Anyone wanting immediate feedback and an interactive, hands-on learning experience can use the glove, and it can make learning American Sign Language more engaging for those who were unsure or wary of it in the first place. The Sign Language Glove is an important device [1] because it breaks the language barrier between non-ASL users and deaf or hard-of-hearing individuals; innovating devices such as a sign language translation glove gives people around the world the ability to communicate and converse with each other in their own language.
It is important to note that several devices and tools already aid those who are hearing-impaired or deaf [2]. One such solution is the hearing aid, and although this technology has proven extremely useful, it has caveats. Its usefulness depends heavily on the person's level of hearing loss, and in some cases it provides an unfavorable experience, making the wearer's environment uncomfortable through amplified loud noises and high frequencies. Rather than creating an uncomfortable experience, the Sign Language Glove lets the user communicate in their own language in an environment that is comfortable for them, allowing for a more effective way of communicating. Another solution would be an interpreter, but problems arise here as well. For day-to-day communication, an interpreter will not be available all the time, and a personal interpreter breaks the connection of in-person contact: the non-ASL user ends up focusing on and talking to the interpreter instead of the deaf or hearing-impaired individual. The Sign Language Glove preserves a personal connection between a deaf or hearing-impaired person and a non-ASL user by translating ASL in real time.
This paper provides insight into the development of a glove that translates sign language in real time into text on an app called Dabble. The problems facing deaf or hearing-impaired individuals are discussed thoroughly, and other solutions and devices meant to help these individuals are compared with the Sign Language Glove, along with a discussion of what could be done differently or explored further. The implementation of the glove is fully explained, covering the methodology and design process behind a glove capable of translating sign language in real time. Finally, the paper concludes with future work for this project and how it could be improved.
RELATED WORK
The development of ASL gloves and technology that recognizes sign language has been a significant area of research, driven by the goal of bridging the communication gap between deaf or hard-of-hearing individuals and the general population. To explain in greater depth the concepts and ideas behind this project, it is necessary to survey the works that preceded and inspired the creation of the Sign Language Glove.
Today, the technological landscape offers some very prominent services for those who are hearing-impaired. Many innovations originally intended for the hearing-impaired can be used by, and benefit, individuals without the impairment as well. One very common technology that fits this description is real-time captioning; it is used not only by the deaf but also by most people in everyday life. Television applications often offer caption options that provide the same visualization of spoken words. This helps the deaf understand more accurately what is going on around them and gives them the ability to participate in society in a way they otherwise could not.
These accessibility innovations differ from the goal of this project in that they help the deaf understand those who can hear rather than express themselves. When faced with social interactions that require a steady flow of communication, the deaf usually keep pre-written scripts on notepads or phones, and they often manually type or write their responses to questions asked in everyday scenarios. There is also an app that is in essence the reverse of the Sign Language Glove: Hand Talk. Instead of translating ASL into English, it renders English words into signed output. Both innovations have a similar purpose in terms of bridging the gap between the hearing and the deaf.
The evolution of wearable technology and assistive devices has significantly contributed to the development of innovative solutions for sign language recognition and communication. Wearable devices such as smart gloves, wristbands, and sensors, equipped with advanced technologies including accelerometers, gyroscopes, flex sensors, and communication interfaces, have been employed to capture, analyze, and interpret hand movements and gestures for sign language recognition and translation. In 2001, the first, albeit primitive, sign glove was created by a student in Colorado named Ryan Patterson. The leather glove was outfitted with ten sensors, and when the wearer finger-spelled a word, the gesture was relayed to a computer.
There are several approaches to implementing technology for a similar purpose. One such method is flex sensor-based wearable gloves for robotic gripper control [3]. There has also been work on recognizing words in Thai Sign Language using flex sensors and gyroscopes [4]; "Veritas," a sign-language-to-text translator using machine learning and computer vision [5]; and interactive hand pose estimation using a stretch-sensing soft glove [6]. Additionally, highlighting the potential of technology-assisted learning in sign language education, a virtual environment for learning ASL [7] has been developed. Another invention of note that inspired the Sign Language Glove is Gollner's Mobile Lorm Glove, a communication device specifically designed for deaf-blind individuals, which further underscores the diverse applications and potential of wearable technology in enhancing communication accessibility.
A newer way to implement gesture recognition for sign language is through machine learning and artificial intelligence. Machine learning for gesture recognition involves training a model to learn and discern hand positions, movements, and various gesticulations from input data. Cameras and sensors detect the necessary information from each image or set of data they pick up [8]. There have been several cases where machine learning was used for sign language interpretation and translation [9].
Similarly, Paudyal [10] conducted a comparative study on ASL alphabet recognition using wearable armbands in 2019, and Huenerfauth [11] focused on motion-capture glove calibration to ensure that their data collection was accurate. Both lines of research highlight the importance of precision in gesture recognition for reliability. Saquib [12] also applied machine learning algorithms to analyze and classify hand signals. This research used wearable technology to capture gestures with sensors and thus demonstrated the effectiveness of machine learning in enhancing the precision, accuracy, and reliability of sign language recognition technology. The ASL alphabet is shown in Fig. 1.
Fig. 1. American Sign Language (ASL) Alphabet
Such wearable technology, combined with precise gesture recognition algorithms, has been instrumental in advancing systems that can detect and recognize ASL. Another innovation to be highlighted was created by Mummadi [13], who explored real-time embedded recognition of sign language alphabet fingerspelling, comparable to Ryan Patterson's glove. That work also demonstrated the potential of sensor-based wearables to capture and interpret ASL based on hand and finger movements and positions.
These past projects have many similarities to, as well as differences from, the glove presented in this paper. Chief among the differences: instead of using machine learning, each value derived from the sensors was monitored and its thresholds implemented manually.
IMPLEMENTATION
Software Approach and Implementation
To implement the code for the Sign Language Glove, the Arduino Integrated Development Environment (IDE) is used to run the program. The Arduino IDE is a software application specific to Arduino boards and is used here to program the Elegoo Mega R3 board [14]. The board is virtually identical to an Arduino Mega and is compatible with the Arduino IDE. The code written inside the IDE gets loaded onto the board's microcontroller [15]. The code is based on C/C++ but with Arduino-specific functions and structure.
The process of deriving each value comes from a block of code separate from the final program, which tracks the bend of each flex sensor (one at a time, for more precision), as shown in Fig. 2, while the raw numerical values are printed in the serial monitor. Each sensor reads in the range 0-1023, so, according to finger position within that range, the tester can record an accurate reading of each sensor's value at a given position. Once each finger's value is recorded and implemented into the code, the wearer can test whether the readings are accurate enough to produce the desired letter or word.
Fig. 2. Flex Sensor Value Graph
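A minimal calibration sketch along these lines might look as follows; it is an illustrative reconstruction rather than the authors' exact code, and the pin follows the thumb assignment used later in Fig. 3.

// Calibration sketch: read one flex sensor and print its raw value.
const int FLEX_PIN = A0; // thumb sensor, per the pin mapping in Fig. 3

void setup() {
  Serial.begin(9600); // open the serial monitor connection
}

void loop() {
  int value = analogRead(FLEX_PIN); // 0-1023, varies with the bend
  Serial.println(value); // watch the reading while bending the finger
  delay(100); // keep the output readable
}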
In Fig. 3, each flex sensor value is first read in: each finger is assigned a corresponding analog pin. Detecting a letter then comes down to an if-statement, also depicted in Fig. 3, that combines the five flex sensor values with the hand position. The first branch prints the letter 'a' to the terminal (Serial.print()) and passes it to the Bluetooth module (Terminal.print()) when each finger's value falls in the specified range. The else-if branch follows the same idea but ensures that overlapping ranges are not confused, so that similar letters such as 'b' are not printed at the same time. A delay at the end gives the wearer enough time to sign a letter or word without registering anything by mistake while switching hand or finger positions.
To display the program's output as text, a pre-existing app called Dabble is used. The app (available on both Android and iOS) is initialized at the beginning of the program with a call to Dabble.begin(). Each letter, word, or phrase in the program can be printed to the app through the aforementioned Terminal.print() function. Upon starting the app and selecting the HC-06 as the Bluetooth device to connect to, the app displays the output of the program.
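For illustration, a minimal sketch of that initialization, assuming the standard Dabble library conventions (the Terminal module must be enabled before the include, and the serial pins serving the HC-06 depend on the board and library version):

// Minimal Dabble setup: enable only the Terminal module, then
// echo a test letter to the connected phone once per second.
#define CUSTOM_SETTINGS
#define INCLUDE_TERMINAL_MODULE
#include <Dabble.h>

void setup() {
  Serial.begin(9600); // USB serial monitor
  Dabble.begin(9600); // baud rate of the HC-06 Bluetooth link
}

void loop() {
  Dabble.processInput(); // keep the connection to the app serviced
  Terminal.print("a"); // appears in the app's Terminal screen
  delay(1000);
}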
The program handles the user signing words differently than letters. This is because a word does not correspond to a single static set of sensor values: words often require movement or changes in finger position to be accurate to ASL. To account for this, different functions had to be applied so the code could distinguish a wave or moving gesture from a static letter (a sketch of this idea follows Fig. 3).
a = analogRead(A0); // Thumb (Black)
b = analogRead(A1); // Index (Brown)
c = analogRead(A2); // Middle (Grey)
d = analogRead(A6); // Ring (Purple)
e = analogRead(A7); // Little (Blue)
// 'ay' is the raw Y-axis accelerometer reading from the MPU-6050
if (a <= 590 && b > 850 && c > 790 && d > 900 && e > 900 && ay < 8000) {
Serial.print("a"); Terminal.print("a"); // letter 'a': serial monitor and Dabble app
}
else if (880 < a && a < 970 && b < 200 && c < 200 && d < 200 && e < 200 && ay < 6000) {
Serial.print("b"); Terminal.print("b"); // letter 'b'
}
Fig. 3. Sample Code for Detecting Letters
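The word-detection functions themselves are not reproduced in this paper; as a hedged sketch of the idea, motion could be flagged by comparing successive accelerometer samples. WAVE_THRESHOLD and readAccelY() below are assumptions for illustration, not the authors' actual values or code.

// Hypothetical sketch: flag a wave by the jump between successive
// MPU-6050 Y-axis samples (threshold and helper are assumed values).
const int WAVE_THRESHOLD = 4000;
int16_t lastAy = 0;

int16_t readAccelY() {
  return 0; // placeholder: real code would query the MPU-6050 here
}

bool detectWave(int16_t ay) {
  bool moved = abs(ay - lastAy) > WAVE_THRESHOLD; // big jump = hand motion
  lastAy = ay;
  return moved;
}

void setup() { Serial.begin(9600); }

void loop() {
  if (detectWave(readAccelY())) {
    Serial.print("hello "); // a moving gesture maps to a word, not a letter
  }
  delay(200);
}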
Overall, the process of coding the letters and words for the Sign Language Glove was tedious, as the very narrow range for the position of each finger had to be derived accurately enough that a similar hand position would not be mistakenly printed as well. Because of this, the glove is only completely accurate for one wearer; since each hand is different, there would likely be erroneous outputs due to a change in hand size or positioning if a new wearer did not sign the letter precisely as the original wearer did.
Hardware Approach, Equipment, and Programming
The Sign Language Glove is composed of five flex sensors, an HC-06 Bluetooth module, an MPU-6050 accelerometer, and an Elegoo Mega R3 board. Fig. 4 provides a diagram of these components and how they are connected, allowing the Sign Language Glove to work.
The project first began with an Arduino Uno, but later in the process there was a need for more analog input pins. The issue arose because the MPU-6050 needs specific analog pins on the Arduino Uno that were already being used by the flex sensors. Because of this, the Elegoo Mega R3 board was adopted, as it provides enough analog input pins for every component to work at the same time.
The HC-06 module is implemented in this design and gives the user the ability to translate sign language in real time through text on an app. Because the Bluetooth module is not needed to detect the movement of the glove, it is plugged directly into the breadboard instead. The use of Bluetooth is extremely important for this project: it provides a direct connection between the Elegoo Mega R3 board and the user's phone, which is what allows the translated text to reach the app.
In Fig. 4, each flex sensor is connected to the breadboard through a 10 kΩ resistor, forming the connection to 5V. The larger breadboard is used for this project because it allows all the components to be powered at once through the 5V connection. Each flex sensor also runs to ground and is connected to its own analog pin: A0 is assigned to the thumb, A1 to the index finger, A2 to the middle finger, A6 to the ring finger, and A7 to the pinky finger.
Fig. 4. Circuit Diagram for Sign Language Glove
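As a sketch of the divider math behind those readings (the orientation and resistance figures below are illustrative assumptions; actual readings depend on the specific sensors and wiring):

// Illustrative voltage-divider math, not measurements from this glove.
// Assuming the flex sensor runs from the analog-pin junction to ground
// and the 10 kOhm resistor from the junction to 5 V:
//   Vout = 5 V * R_flex / (R_flex + 10 kOhm)
//   ADC  = 1023 * Vout / 5 V
float adcForResistance(float rFlex) {
  const float R_FIXED = 10000.0; // 10 kOhm series resistor
  return 1023.0 * rFlex / (rFlex + R_FIXED); // 10-bit ADC count
}

void setup() {
  Serial.begin(9600);
  Serial.println(adcForResistance(25000.0)); // ~731 at a nominal 25 kOhm
  Serial.println(adcForResistance(100000.0)); // ~930 at a nominal 100 kOhm
}

void loop() {}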
The Sign Language Glove operates with two types of sensors that detect sign language in real time: the MPU-6050 and the flex sensors. In Fig. 5, the five flex sensors are hot-glued onto each finger of the glove, and the MPU-6050 is sewn onto the back of the palm to provide more precise and stable readings. These two components are the key to sensing sign language. The flex sensors detect how much the fingers bend, producing numerical values that depend on the direction and degree of the bend. The MPU-6050 detects the tilt and orientation of the hand, giving the glove the ability to detect when the user's hand rotates.
Fig. 5. Hardware Setup for Sign Language Glove
EVALUATION
To test our solution, two separate graphs were created from pre-existing libraries, one of which was sourced from GitHub: mpu6050. Finding the right function required a lot of trial and error, as well as reading through as much of the source code as possible. The MPU-6050 combines a 3-axis gyroscope with a 3-axis accelerometer to estimate the device's orientation and track its motion by calculating roll, pitch, and yaw. With the library included in our program, many functions help detect the orientation and rotation of the user's hand.
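The paper names the GitHub library only as "mpu6050"; a minimal reading loop, assuming the widely used i2cdevlib MPU6050 library, might look like this:

// Read raw MPU-6050 motion data over I2C (assumes the i2cdevlib
// MPU6050 library; the exact library used is only named "mpu6050").
#include <Wire.h>
#include <I2Cdev.h>
#include <MPU6050.h>

MPU6050 mpu;
int16_t ax, ay, az, gx, gy, gz;

void setup() {
  Serial.begin(9600);
  Wire.begin();
  mpu.initialize(); // wake the sensor at the default I2C address (0x68)
}

void loop() {
  // Raw 16-bit accelerometer and gyroscope samples; 'ay' is the value
  // the letter-detection code compares against thresholds such as 8000
  mpu.getMotion6(&ax, &ay, &az, &gx, &gy, &gz);
  Serial.println(ay);
  delay(100);
}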
To determine the values of the flex sensors, each flex sensor was analyzed separately, allowing more precise detection of the different ranges required to distinguish each letter or phrase and ensuring that similar letters or hand positions would not be mistaken for other gestures. This method proved highly effective, as the values were monitored and implemented manually, one flex sensor at a time. Combining the MPU-6050 and the five flex sensors yielded promising results: the glove detects letters from the sign language alphabet along with simple words like "hello" and phrases like "I am," as shown in Fig. 6.
Fig. 6. Detected Letters Displayed on the User's Phone
Another solution for supporting deaf and hard-of-hearing people is the hearing aid. Hearing aids are well-established assistive devices, but although the technology has proven beneficial, it is restricted by the user's level of hearing loss, excluding individuals whose hearing loss is too profound. Some who have used hearing aids report that they can be overstimulating, as the wearer is not used to hearing loud noises at higher frequencies than usual, and the transition to using hearing aids in one's daily life is another big change that takes time to get used to.
The Sign Language Glove is an innovative device that aids individuals who are deaf or hearing-impaired. The glove helps break the language barrier, enabling communication between individuals who use ASL in their daily life and non-ASL users. By translating sign language in real time into text on an app, it provides a comfortable experience for the user to communicate. This solution to the language-barrier problem is an innovative piece of technology because the combination of flex sensors, an MPU-6050, and an HC-06 Bluetooth module can detect both the bending of the user's fingers and the orientation of the user's hand.
CONCLUSION AND FUTURE WORK
This research introduced the problem that deaf and hard-of-hearing individuals have limited ways of communicating with non-ASL users; often they must write down what they want to say on a piece of paper or a mobile phone. The Sign Language Glove allows its users to communicate with the rest of the world using sign language, in a more comfortable setting, instead of struggling to make themselves understood to someone unfamiliar with sign language.
The Sign Language Glove can recognize the whole alphabet and can detect basic words used in everyday conversation, like "hello" and "I am." Because the glove translates sign language into text in real time, the user can show their conversation partner the words they are signing through the pre-existing app Dabble. The Sign Language Glove discussed in this research improves on other solutions for hearing-impaired individuals because it allows them to use ASL even when speaking to someone who does not know ASL. The glove does not strip deaf or hard-of-hearing people of their ability to communicate in a manner they are comfortable with, and it breaks the language barrier between ASL users and non-ASL users.
In the future, the glove will be improved by implementing more recognizable words. As of now, the user can only communicate a simple greeting along with the letters of the alphabet. Another important addition would be a second glove, allowing the user to sign with both hands instead of being limited to one; two gloves, with their larger vocabulary, would allow for more complex conversations. Visually, the design will be streamlined by covering the wires and the flex sensors, and design options for the glove itself will be offered so users can fit it to their style. These future implementations provide further insight into how to improve this project.
REFERENCES
[1] K. P. Yang, B. Achee, P. Laskie, B. Norris, and L. Vires, "The Environmental (EVE) Sleeve: A Tool to Monitor the Environment for Hazardous Gases," International Journal of Engineering Research & Technology (IJERT), ISSN 2278-0181, 12(10): 1-5, IJERTV12IS100054, October 2023.
[2] K. P. Yang, P. McDowell, R. Demourelle, T. Parker, and E. Langstonirst, "3D Printing: A Custom-Built 3D Printer with Wireless Connectivity," SSRG International Journal of Computer Science and Engineering (SSRG-IJCSE), ISSN 2348-8387, 7(10): 1-5, October 2020.
[3] K. Roy, D. Idiwal, A. Agrawal, and B. Hazra, "Flex Sensor Based Wearable Gloves for Robotic Gripper Control," Proceedings of the 2015 Conference on Advances in Robotics (AIR '15), 2015, pp. 1-5.
[4] R. Jitcharoenport, P. Senachakr, M. Dahlan, A. Suchato, E. Chuangsuwanich, and P. Punyabukkana, "Recognizing Words in Thai Sign Language Using Flex Sensors and Gyroscopes," Proceedings of the 11th International Convention on Rehabilitation Engineering and Assistive Technology (i-CREATe '17), 2017, pp. 1-4.
[5] S. Njazi and S. Ng, "Veritas: A Sign Language-To-Text Translator Using Machine Learning and Computer Vision," Proceedings of the 2021 4th International Conference on Computational Intelligence and Intelligent Systems (CIIS '21), 2021, pp. 55-60.
[6] O. Glauser, S. Wu, D. Panozzo, O. Hilliges, and O. Sorkine-Hornung, "Interactive Hand Pose Estimation Using a Stretch-Sensing Soft Glove," ACM Transactions on Graphics, 38(4): 1-15, 2019.
[7] J. Schioppo, Z. Meyer, D. Fabiano, and S. Canavan, "Sign Language Recognition: Learning American Sign Language in a Virtual Environment," Extended Abstracts of the 2019 CHI Conference on Human Factors in Computing Systems (CHI EA '19), 2019, pp. 1-6.
[8] K. P. Yang, P. McDowell, P. Devkota, S. Pradhan, R. Bhandari, and Z. Madewell, "Detecting Gas Leaks: A Case Study in IoT Technologies," European Journal of Engineering and Technology Research (EJ-ENG), ISSN 2736-576X, 6(7): 103-106, December 2021.
[9] M. Munir, F. Alam, S. Ishrak, S. Hussain, Md. Shalahuddin, and M. Islam, "A Machine Learning Based Sign Language Interpretation System for Communication with Deaf-mute People," Proceedings of the XXI International Conference on Human Computer Interaction (Interacción '21), 2021, pp. 1-9.
[10] P. Paudyal, J. Lee, A. Banerjee, and S. Gupta, "A Comparison of Techniques for Sign Language Alphabet Recognition Using Armband Wearables," ACM Transactions on Interactive Intelligent Systems, 9(2-3): 1-26, 2019.
[11] M. Huenerfauth and P. Lu, "Accurate and Accessible Motion-Capture Glove Calibration for Sign Language Data Collection," ACM Transactions on Accessible Computing, 3(1): 1-32, 2010.
[12] N. Saquib and A. Rahman, "Application of Machine Learning Techniques for Real-Time Sign Language Detection Using Wearable Sensors," Proceedings of the 11th ACM Multimedia Systems Conference (MMSys '20), 2020, pp. 178-189.
[13] C. Mummadi, F. Leo, K. Verma, S. Kasireddy, P. Scholl, and K. Laerhoven, "Real-time Embedded Recognition of Sign Language Alphabet Fingerspelling in an IMU-Based Glove," Proceedings of the 4th International Workshop on Sensor-based Activity Recognition and Interaction (iWOAR '17), 2017, pp. 1-6.
[14] K. P. Yang, G. Alkadi, and T. Parker, "Converting SVG to G-code for 3D Printers," International Journal of Scientific Engineering and Science (IJSES), ISSN 2456-7361, 5(4): 36-39, April 2021.
[15] K. P. Yang, G. Alkadi, B. Gautam, A. Sharma, D. Amatya, S. Charchut, and M. Jones, "Park-A-Lot: An Automated Parking Management System," Journal of Computer Science and Information Technology (CSIT), ISSN 2331-6063, 1(4): 276-279, December 2013.