- Authors : M. Vasavi, P. Y. Ramesh, Dr. D. N. Rao
- Paper ID : IJERTV2IS101143
- Volume & Issue : Volume 02, Issue 10 (October 2013)
- Published (First Online): 26-10-2013
- ISSN (Online) : 2278-0181
- Publisher Name : IJERT
- License: This work is licensed under a Creative Commons Attribution 4.0 International License
Gesture Recognition For User Interaction With Computer Without Sensors
M. Vasavi, M.Tech (Embedded Systems), JBREC
P. Y. Ramesh, Associate Professor, JBREC
Dr. D. N. Rao, Principal, JBREC
Abstract
The widespread use of the PC and its applications has made it part of daily activities such as listening to music and playing games. Earlier, the computer was controlled only with a mouse. Ever-changing technology has altered the way the computer is controlled: it is no longer restricted to the mouse, and control can also be achieved through hand gestures without any sensors. This paper aims at better communication between human and computer. An ARM11 processor is used for image processing, and the hand itself acts as a mouse: depending on the movement of the fingers (up, down, and so on), the cursor moves accordingly. The approach avoids wearable sensors and reduces hardware cost.
Keywords: hand gestures, ARM11, computer.
-
Introduction
Nowadays, computers are used in almost every aspect of life, so interaction between humans and computers happens constantly. Because human-computer interaction has reached into everyday life, an input interface that is easy to use is required. Many input interfaces already exist, for example the keyboard and the mouse.
Many parts of the human body can be recorded and used as input. A concrete example is the fingerprint, which is used to grant users permission to access a computer. Besides fingerprints, many other parts of the body can serve as input sources, and that input can be turned into computer commands: body movements, hand gestures, faces, and many more. Human posture and gesture analysis has emerged as an important interdisciplinary area with many potential applications, such as surveillance, advanced human-computer interaction, intelligent driver assistance systems, 3-D animation, health-care monitoring, and robot control. Compared with earlier technologies that rely on wearable sensors or markers, marker-less vision-based approaches provide more natural and less intrusive solutions that are more convenient for real-world deployment. In this paper, we develop a system for human-machine interactivity that recognizes human gestures from marker-less multi-view input. We deal with hand gestures only, since the hand is simpler than the full body and yet conveys important information about several human activities in which the arms carry the most influential information.
Controlling a computer in this way must happen in real time and in place. The input device therefore has to record the hand gesture directly; devices that can do so are cameras and video cameras. In this paper we use a camera as the input device. The camera records the hand gesture of the user's hand, and the recorded gesture becomes the command intended to control the computer. The captured image is processed using Haar-like features, and the result is compared with the already-stored images; if it matches a stored gesture, the corresponding operation is performed.
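As a rough illustration of this pipeline, the sketch below uses OpenCV's Python bindings. The cascade file hand_cascade.xml is a hypothetical placeholder for a trained hand detector (OpenCV does not ship one), and the paper's own implementation runs on the ARM board rather than a desktop Python environment.

```python
# Minimal capture -> Haar detection -> display sketch (assumptions: OpenCV
# Python bindings and a pre-trained hand cascade file "hand_cascade.xml").
import cv2

cascade = cv2.CascadeClassifier("hand_cascade.xml")   # hypothetical hand cascade
cap = cv2.VideoCapture(0)                             # webcam as input device

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)    # preprocessing: grayscale
    hands = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in hands:
        # Draw the rectangular box around the detected hand, as in the paper.
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 0, 255), 2)
    cv2.imshow("gesture", frame)
    if cv2.waitKey(1) & 0xFF == 27:                   # Esc key quits the loop
        break

cap.release()
cv2.destroyAllWindows()
```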
-
Main Method
An embedded system built around an ARM processor provides image/video processing using the various feature-extraction and classification algorithms that have been proposed for detection tasks, and it avoids the sensor and hardware cost of wearable approaches. The system captures an image through a web camera connected to the ARM processor over USB, and the captured image is processed with image-processing techniques: it is first preprocessed and the hand is then detected with the Haar-like features algorithm.
The processed data are sent to a MAX232 level converter and, through the serial port over an RS-232 cable, the detected hand movement is transmitted to the PC, where it controls various applications. The hand gestures thus act as a mouse for the PC: depending on the movement of the hand, the various functions of a mouse are carried out by hand gestures.
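The paper does not specify the exact serial protocol. As a sketch only, assuming pyserial on the board's Linux system, a one-byte code per gesture, and the device node /dev/ttySAC0 (all assumptions, not values from the paper), the transmitting side could look like:

```python
# Board-side sketch: send a detected gesture code to the PC over RS-232.
import serial

GESTURE_CODES = {            # hypothetical one-byte codes understood by the PC
    "LEFT": b"L", "RIGHT": b"R", "UP": b"U", "DOWN": b"D",
    "CLICK": b"C", "DOUBLE_CLICK": b"W",
}

# The UART device node and baud rate depend on the board and kernel setup.
port = serial.Serial("/dev/ttySAC0", baudrate=9600, timeout=1)

def send_gesture(name):
    """Write the one-byte code for a recognized gesture to the serial link."""
    port.write(GESTURE_CODES[name])

send_gesture("LEFT")         # e.g. hand detected on the left side of the frame
```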
-
Framework
Step 1: Start the ARM11 processor board.
Step 2: Capture a real-time image of the hand gesture from the webcam.
Step 3: Preprocessing: the pixels of the captured image are adjusted and, if the image is in a different orientation, it is brought back to the normal view.
Step 4: In the detection phase the Haar-like features algorithm is used. Haar-like features are digital image features used for object recognition. The preprocessed image is taken, the background is removed, and the algorithm is applied to the foreground image to obtain the exact position of the hand gesture. In this paper we use simple rectangular Haar-like features. The value of a Haar-like feature is the difference between the sums of the pixel values within the white and black rectangles, from which the hand gesture is detected.
Step 5: Compare the processed image with the database; if it matches the stored data, proceed to the next step, otherwise repeat Step 2.
Step 6: Draw a rectangular box and calculate its centroid so that the movement of the finger can be identified easily. On the LCD display a rectangular box shows that a hand has been detected at that position; to draw the box, the position of the hand on the LCD is calculated.
Step 7: Depending on the movement of the hand, the corresponding mouse operation is performed. If the hand is detected on the left side, a rectangular box appears on the left of the LCD, indicating that the hand is on the left, and the cursor moves left. The same is done for up, down, right, single click, and double click; a double click can, for example, open or close a file. A code sketch of this loop is given after Figure 1.
Step 8: Stop the process.
Figure 1. Framework
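The following is a minimal sketch of Steps 6 and 7 in Python. The one-third/two-thirds region boundaries and the gesture names are illustrative assumptions rather than values taken from the paper.

```python
# Sketch of Steps 6-7: from a detected hand bounding box, compute the centroid
# and decide which mouse operation it maps to.
def centroid(box):
    """Centre of a detection box given as (x, y, w, h)."""
    x, y, w, h = box
    return (x + w / 2.0, y + h / 2.0)

def classify(box, frame_w, frame_h):
    """Map the centroid position to a cursor command (illustrative regions)."""
    cx, cy = centroid(box)
    if cx < 0.33 * frame_w:
        return "LEFT"
    if cx > 0.66 * frame_w:
        return "RIGHT"
    if cy < 0.33 * frame_h:
        return "UP"
    if cy > 0.66 * frame_h:
        return "DOWN"
    return "DOUBLE_CLICK"    # e.g. hand held in the centre region

print(classify((500, 40, 100, 100), 640, 480))   # prints "RIGHT"
```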
-
Algorithm Used For Detection
In this paper we use the Haar-like features algorithm. A distinction is made between hand postures and hand gestures. A hand posture is a static hand configuration, without any movement of the hand or change of its location. A hand gesture is a sequence of hand postures connected by continuous motion over a short time span, i.e., a collection of different postures that together form a movement; the postures act as transition states. Gesture recognition can therefore be treated at two levels: low-level posture recognition and high-level gesture recognition. For hand postures, repeatability is usually poor because of the high degree of freedom of the hand and the difficulty of reproducing the same working environment, such as the background. To address this, we use a statistical approach based on a set of Haar-like features that focus on the information within a certain area of the image rather than on each single pixel. Haar-like features are digital image features used for object recognition. Three kinds of Haar-like features exist: simple rectangular Haar-like features, tilted Haar-like features, and fast computation of Haar-like features. Here we use simple rectangular Haar-like features for the detection process.
The simple Haar-like features are those used in the Viola-Jones algorithm. There are two motivations for employing Haar-like features rather than raw pixel values. The first is that Haar-like features can encode ad-hoc domain knowledge that is difficult to learn from a finite quantity of training data. Compared with the raw pixels of the image, Haar-like features efficiently reduce the in-class variability and increase the out-of-class variability, which makes classification easier. A Haar-like feature captures the ratio between bright and dark areas, and a single feature can capture that characteristic efficiently. The second motivation is that a Haar-like feature-based system can operate much faster than a pixel-based system. Besides these advantages, Haar-like features are relatively robust to noise and lighting changes, because they compute the grey-level difference between the black and white rectangles: noise and lighting variations affect the pixel values over the whole feature area, so their influence is largely cancelled out. Each Haar-like feature consists of two or three connected white and black rectangles; Fig. 2 shows the extended Haar-like features. The value of a Haar-like feature is the difference between the sums of the pixel values within the black and white rectangles.
Figure 2. Haar Like Features
The integral image (Fig. 3) at pixel location $(x, y)$ contains the sum of the pixel values above and to the left of this pixel, inclusive:

$P(x, y) = \sum_{x_1 \le x,\; y_1 \le y} p(x_1, y_1)$

According to this definition, the sum of the pixel values within area D in Fig. 3 can be computed as

$P_1 + P_4 - P_2 - P_3$

where $P_1 = A$, $P_2 = A + B$, $P_3 = A + C$, and $P_4 = A + B + C + D$.
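These identities can be checked with a short sketch. The code below (NumPy assumed; OpenCV's cv2.integral would produce an equivalent table) builds the integral image, evaluates the $P_1 + P_4 - P_2 - P_3$ rule, and computes one two-rectangle Haar-like feature value.

```python
import numpy as np

def integral_image(img):
    """P(x, y) = sum of p(x1, y1) for all x1 <= x, y1 <= y (inclusive)."""
    return img.cumsum(axis=0).cumsum(axis=1)

def rect_sum(P, x, y, w, h):
    """Sum of pixels in the w-by-h rectangle with top-left corner (x, y),
    computed as P1 + P4 - P2 - P3 from the four corner values."""
    p1 = P[y - 1, x - 1] if x > 0 and y > 0 else 0
    p2 = P[y - 1, x + w - 1] if y > 0 else 0
    p3 = P[y + h - 1, x - 1] if x > 0 else 0
    p4 = P[y + h - 1, x + w - 1]
    return p4 + p1 - p2 - p3

img = np.arange(16, dtype=np.int64).reshape(4, 4)
P = integral_image(img)
assert rect_sum(P, 1, 1, 2, 2) == img[1:3, 1:3].sum()   # the "area D" check

# A two-rectangle Haar-like feature: white-rectangle sum minus black-rectangle
# sum (here, left half of a 4x4 window minus its right half).
feature = rect_sum(P, 0, 0, 2, 4) - rect_sum(P, 2, 0, 2, 4)
```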
To detect a region of interest, the image is scanned by a sub-window containing a Haar-like feature. Based on each Haar-like feature $f_j$, a corresponding weak classifier $h_j(x)$ is defined by

$h_j(x) = \begin{cases} 1, & \text{if } p_j f_j(x) < p_j \theta_j \\ 0, & \text{otherwise} \end{cases}$

where $x$ is a sub-window, $\theta_j$ is a threshold, and $p_j$ indicates the direction of the inequality sign. A small sketch of this classifier is given after Figure 3.
Figure 3. Concept of Integral Image
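The sketch below illustrates the definition of $h_j$. The mean-brightness feature, threshold, and parity values are illustrative assumptions; in the Viola-Jones framework they are chosen during training.

```python
import numpy as np

def weak_classifier(f_j, theta_j, p_j):
    """Return h_j: x -> 1 if p_j * f_j(x) < p_j * theta_j, else 0."""
    def h_j(x):
        return 1 if p_j * f_j(x) < p_j * theta_j else 0
    return h_j

# Toy usage: the "feature" is just the mean brightness of the sub-window.
brightness = lambda window: float(np.mean(window))
h = weak_classifier(brightness, theta_j=100.0, p_j=1)
print(h(np.full((24, 24), 80)))    # 1: 80 is below the threshold
print(h(np.full((24, 24), 200)))   # 0: 200 is above the threshold
```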
-
Results
Figure 4 below shows the captured hand gesture displayed on the LCD. Depending on the movement of the hand in front of the webcam, towards the left, right, up, or down, as shown on the LCD, the cursor moves accordingly; a PC-side sketch of this cursor control follows the figure.
Figure 4. Captured hand gesture shown on the LCD
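The paper states only that the PC cursor is controlled through the serial link. As an illustrative sketch, assuming the one-byte codes from the earlier transmission example, pyserial, and the pyautogui library (the port name and the 20-pixel step size are likewise assumptions), the receiving side could look like:

```python
# PC-side sketch: read gesture codes from the serial port and drive the cursor.
import serial
import pyautogui

STEP = 20                                                  # pixels per gesture
port = serial.Serial("COM1", baudrate=9600, timeout=1)     # or /dev/ttyS0 on Linux

while True:
    code = port.read(1)            # returns b"" on timeout, one byte otherwise
    if code == b"L":
        pyautogui.moveRel(-STEP, 0)
    elif code == b"R":
        pyautogui.moveRel(STEP, 0)
    elif code == b"U":
        pyautogui.moveRel(0, -STEP)
    elif code == b"D":
        pyautogui.moveRel(0, STEP)
    elif code == b"C":
        pyautogui.click()
    elif code == b"W":
        pyautogui.doubleClick()
```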
-
Hand gesture recognition for a double click
Figure 5. Cursor on media player
The above Figure 5 shows the cursor positioned on the VLC media player before any hand-gesture operation is performed.
Figure 6. Hand gesture recognition for a double click
The above Figure 6 shows that when a hand gesture is detected at the top right of the LCD, a red box appears on the LCD, indicating that a double click is performed.
Figure 7. Media Player is opened
The above Figure 7 shows the VLC media player opened as a result of the double click.
-
Conclusion
Commonly, a mouse is used to control the PC. As time passes, technology keeps producing better and easier alternatives, such as touch-screen technology. Now the world is beginning to turn to hand-gesture technology. This technology does not need a special monitor to be controlled; it only needs a camera to capture the gesture and a program that turns the gesture into a command for the PC.
We set out to control the PC with hand gestures. A camera captures the hand movements, and the Haar-like features algorithm is used to recognize the gesture. When the user moves a hand, the camera captures the movement, the Haar-like features read the posture of the hand, and the recognized posture is sent to the PC, where the corresponding application is controlled.
In the proposed method, hand gestures are captured by a webcam and the PC is controlled on the basis of the identified gestures. In the future the same method can be implemented using ZigBee or Bluetooth technology. At present the project is implemented on a MINI6410 board; in the future it can be implemented on a BeagleBoard, whose speed is 150 times that of the ARM11 board.
-
References
[1] Cuong Tran and Mohan Manubhai Trivedi, "3-D Posture and Gesture Recognition for Interactivity in Smart Spaces," IEEE Transactions on Industrial Informatics, vol. 8, no. 1, February 2012.
[2] Xu Zhang, Xiang Chen, Yun Li, Vuokko Lantz, Kongqiao Wang, and Jihai Yang, "A Framework for Hand Gesture Recognition Based on Accelerometer and EMG Sensors," IEEE Transactions on Systems, Man, and Cybernetics, vol. 41, no. 6, November 2011.
[3] Chan Wah Ng and S. Ranganath, "Real-time gesture recognition system and application," Image and Vision Computing, vol. 20, pp. 993-1007, 2002.
[4] Chen-Chiung Hsieh and Dung-Hua Liou, "A Real Time Hand Gesture Recognition System Using Motion History Image," 2nd International Conference on Signal Processing Systems (ICSPS), 2010.
[5] http://opencv.willowgarage.com
[6] http://www.friendlyarm.net/products/mini2440
[7] http://qt-project.org/wiki/Category:Tools::QtCreator
[8] http://docs.opencv.org/modules/objdetect/doc/cascade_classification.html
[9] http://docs.opencv.org/modules/core/doc/drawing_functions.html#rectangle
[10] http://www.alldatasheet.com/datasheet-pdf/pdf/83786/SAMSUNG/S3C2440A.html
[11] http://www.andahammer.com/assets/Uploads/All2440/LinuxDNW.pdf
Authors
M. Vasavi is pursuing the M.Tech in Electronics and Communication Engineering (Embedded Systems) at Joginpally B.R. Engineering College, Moinabad, Ranga Reddy District, Hyderabad, Andhra Pradesh. She received the B.Tech degree in Electronics and Communication Engineering from Sree Visvesvaraya Institute of Technology and Science, Mahabubnagar, in 2011.
P. Ramesh, Associate Professor at JBREC, Hyderabad, completed his B.E. from Osmania University and his Master of Technology from JNTUH, and is now pursuing a Ph.D. from JNTU Hyderabad. He has published 3 international papers, attended 2 national conferences, and attended many workshops in India. He has 5 years of industrial experience and 8 years of teaching experience, and has guided many PG and UG students in his teaching career. His research areas include low-power VLSI design, embedded systems, and SoC design. He has served as a judge for national-level paper and poster presentations.
Dr. D. N. Rao, B.Tech, M.E., Ph.D., is the Principal of JBREC, Hyderabad. His career spans nearly three decades of teaching, administration, R&D, and other diversified in-depth experience in academics and administration. He has been actively involved in organizing various conferences and workshops. He has published over 11 international journal papers based on his research work and has presented more than 15 research papers at various national and international conferences. He has been an approved reviewer for IASTED international journals and conferences since 2006. He also guides the projects of P.G./Ph.D. students of various universities.