Hand Modelling on Virtual Board using Leap Motion Sensor and Image Processing
Houva K Basheer, Nimisha N.N, S.C Chandana, Roshni Alex
Department of Electronics, MES College, Marampally
Abstract: People with disabilities face many problems while communicating with others who have limited knowledge of sign language. The proposed work helps such people communicate and also frees them from holding a mouse or typing letters on a keyboard. Hand modelling on a virtual board creates a virtual touch surface in the air, realized using a Leap Motion sensor and a Raspberry Pi; the idea aims to make next-generation interfaces smarter. The Leap Motion sensor detects hand and finger motion as input but requires no hand contact or touching, and the signal is transferred to the Raspberry Pi, where the corresponding image is created and displayed. The system operates in two stages, training and testing. In the training stage, a support vector machine (SVM) classifier on the Raspberry Pi is trained. During the testing stage, the sensor captures the written letter; after the input image is created, it is compared with the images learned by the SVM classifier and the corresponding output is displayed.
Keywords: Leap Motion sensor, SVM classifier, DCT, histogram
INTRODUCTION
Some people suffer from a combination of disabilities such as speech and hearing impairments. This category includes individuals with hearing impairments, some partial and others total, a condition called deafness. Individuals with voice or speech disorders use sign languages for their communication. Sign languages are natural languages that use different means of expression for communication; American Sign Language, British Sign Language, Indian Sign Language, and Japanese Sign Language are examples. Writing, by contrast, is a slow and inefficient way of communicating. The proposed idea helps the people mentioned above in several ways. Hand modelling on a virtual board creates a virtual touch surface in the air. The process is carried out using a Leap Motion sensor and a Raspberry Pi. The Leap Motion sensor detects hand or finger motion without any direct contact with the sensor and can track all 10 fingers to within 1/100th of a millimeter. The tracked signals are fed into the Raspberry Pi, where feature sets such as the discrete cosine transform (DCT) and the histogram are extracted in the training stage. In the testing stage, the user writes a letter or word, the sensor senses the motion, and an image is created. The generated image is compared with the images learned by the SVM classifier and the matching character is displayed. The output of the Leap Motion controller is depth data consisting of the palm direction, fingertip positions, and other relevant points, and its feature extraction time is very short.
PROPOSED WORK
[Block Diagram: the Leap Motion sensor feeds the Raspberry Pi, which drives the screen/system.]
Hand modelling on a virtual board using a Leap Motion sensor and image processing is achieved with a Leap Motion controller and a Raspberry Pi. Virtual touch events are created in the air. The Leap Motion sensor detects hand or finger motion, from which the corresponding image is formed and displayed. The system operates in two stages, training and testing. In the training stage, the SVM classifier on the Raspberry Pi is trained with different sets of coefficients such as DCT and histogram features; the captured image is cropped and its features are extracted. During the testing stage, the user writes a word or letter, the sensor captures it, the resulting image is compared with the images learned by the SVM classifier on the Raspberry Pi, and the corresponding character is displayed as the output.
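To make this pipeline concrete, the sketch below shows one way the captured fingertip trajectory could be rasterized into a binary character image before cropping and feature extraction. It is only an illustration, not the authors' implementation: the canvas size, stroke width, and normalization step are assumed, and OpenCV is used purely for convenience.

```python
# Illustrative sketch (not the authors' implementation): convert a list of
# (x, y) fingertip positions sampled from Leap Motion frames into a binary
# character image. Canvas size and stroke width are assumed values.
import numpy as np
import cv2


def trajectory_to_image(points, size=64, thickness=2):
    """points: iterable of (x, y) floats recorded while the user 'writes'."""
    pts = np.asarray(points, dtype=np.float32)

    # Normalize the stroke into the [0, size-1] square, preserving aspect ratio.
    mins = pts.min(axis=0)
    span = max(float((pts.max(axis=0) - mins).max()), 1e-6)
    pts = (pts - mins) / span * (size - 1)

    # Draw the stroke as connected line segments on a blank canvas.
    canvas = np.zeros((size, size), dtype=np.uint8)
    for p, q in zip(pts[:-1], pts[1:]):
        cv2.line(canvas,
                 (int(p[0]), int(p[1])),
                 (int(q[0]), int(q[1])),
                 color=255, thickness=thickness)
    return canvas
```

The resulting image can then be cropped and passed to the feature-extraction and classification steps described under Methodology.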
METHODOLOGY
Leap Motion sensor: The Leap Motion controller is a device that can be used as an input device for a computer. It can be placed on a desk or, if desired, strapped onto a virtual reality headset. It senses hand and finger motions and can track all 10 fingers to within 1/100th of a millimeter.
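For illustration, the sketch below polls fingertip positions for a short period, assuming the legacy Leap Motion Python SDK (v2) bindings; the module name `Leap`, the `frontmost` finger accessor, and the polling rate are assumptions and may differ across SDK versions.

```python
# Illustrative sketch: poll the Leap Motion controller for a fixed duration
# and collect 2-D fingertip positions while a hand is visible.
# Assumes the legacy Leap Motion SDK v2 Python bindings; names may differ.
import time
import Leap  # legacy Leap Motion Python module


def capture_stroke(duration_s=2.0, poll_hz=60):
    controller = Leap.Controller()
    points = []
    end_time = time.time() + duration_s
    while time.time() < end_time:
        frame = controller.frame()
        if not frame.hands.is_empty:
            tip = frame.fingers.frontmost.tip_position  # Leap.Vector in mm
            points.append((tip.x, tip.y))
        time.sleep(1.0 / poll_hz)
    return points
```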
Raspberry Pi: It is a credit-card-sized computer that can easily be plugged into a computer or monitor. It is powered by a 5 V micro-USB AC charger or at least 4 AA batteries. The specifications are:
- 1 GB RAM
- Quad-core 1.2 GHz Broadcom BCM2837 64-bit CPU
- 40-pin GPIO
- 4 USB 2.0 ports
- 3.3 V operating voltage
SVM classifier: A support vector machine (SVM) constructs a hyperplane or a set of hyperplanes in a high- or infinite-dimensional space, which can be used for classification, regression, or other tasks. Intuitively, a good separation is achieved by the hyperplane that has the largest distance to the nearest training-data point of any class, since in general the larger the margin, the lower the generalization error of the classifier.
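As a concrete illustration of this step, the sketch below trains a multi-class SVM on feature vectors extracted from the training images and classifies a newly captured sample. It assumes scikit-learn's `SVC`; the RBF kernel and the regularization constant are placeholder choices rather than the settings used in this work.

```python
# Illustrative sketch: train an SVM on feature vectors extracted from the
# training character images and classify a newly captured sample.
# scikit-learn is assumed; kernel and C are placeholder choices.
import numpy as np
from sklearn.svm import SVC


def train_classifier(X_train, y_train):
    """X_train: (n_samples, n_features) feature matrix; y_train: labels."""
    clf = SVC(kernel="rbf", C=10.0, gamma="scale")
    clf.fit(X_train, y_train)
    return clf


def predict_character(clf, feature_vector):
    """Return the predicted character label for one feature vector."""
    return clf.predict(np.asarray(feature_vector).reshape(1, -1))[0]
```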
Histogram: An image histogram is a graphical representation of the tonal distribution in a digital image. It plots the number of pixels at each tonal value. By looking at the histogram of a specific image, a viewer can judge the entire tonal distribution at a glance.
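A rough sketch of how the DCT and histogram feature sets could be combined into a single feature vector is given below; the separable 2-D DCT via `scipy.fftpack.dct`, the number of retained coefficients, and the histogram bin count are all illustrative assumptions.

```python
# Illustrative sketch: build a feature vector from a cropped character image
# by concatenating low-frequency 2-D DCT coefficients with a grey-level
# histogram. Block size and bin count are illustrative choices.
import numpy as np
from scipy.fftpack import dct


def extract_features(image, n_dct=8, n_bins=32):
    img = image.astype(np.float32)

    # Separable 2-D DCT (rows then columns); keep the top-left n_dct x n_dct
    # block, which holds the low-frequency shape information.
    coeffs = dct(dct(img, axis=0, norm="ortho"), axis=1, norm="ortho")
    dct_feat = coeffs[:n_dct, :n_dct].ravel()

    # Grey-level histogram of pixel intensities, normalized to sum to 1.
    hist, _ = np.histogram(image, bins=n_bins, range=(0, 255))
    hist_feat = hist / max(hist.sum(), 1)

    return np.concatenate([dct_feat, hist_feat])
```

The resulting vector is what the SVM classifier would be trained on and queried with in the sketch above.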
ADVANTAGES AND FUTURE SCOPE
- Medium is not required.
- User friendly.
- Reliable.
- Faster character prediction.
- Can replace touch-screen technology.
- Virtualizes the digital world.
- Can replace costly and bulky input systems.
- Will be of great use to blind people and people with speech impairments.
CONCLUSION
The main aim of the proposed model is to facilitate communication for deaf and speech-impaired people and to change the way the next generation uses smartphones. It will also help people with disabilities who have difficulty holding a pen, using a mouse, or typing letters on a keyboard. This technology can also transform the robotics industry, and finally it can enable much better realizations in virtual reality technology and artificial intelligence.