- Authors : Sakshi Sharma, Shubhank Sharma, Piyush Yadav
- Paper ID : IJERTV6IS040352
- Volume & Issue : Volume 06, Issue 04 (April 2017)
- DOI : http://dx.doi.org/10.17577/IJERTV6IS040352
- Published (First Online): 12-04-2017
- ISSN (Online) : 2278-0181
- Publisher Name : IJERT
- License: This work is licensed under a Creative Commons Attribution 4.0 International License
Design and Implementation of Robotic Hand Control Using Gesture Recognition
Sakshi Sharma, Shubhank Sharma, Piyush Yadav
UG Student1,2 , Assistant Professor3
Department of Electronics and Communication Engineering, G.L. Bajaj Institute of Technology and Management
Greater Noida, India
Abstract: The idea is to change the perception of remote controls for actuating a manually operated robotic hand. This paper provides a way to eliminate buttons and joysticks and replace them with a different technique: controlling the complete robotic hand through the user's hand movements, or gestures. We describe the design and development of a Five Fingered Robotic Hand (FFRH) using an Arduino board, sensors and wireless feedback. The design of the system is based on a simple, flexible and minimal control strategy. The robotic hand has independent commands for opening and closing all five fingers, wrist up and down, base clockwise and counter-clockwise rotation, movement of the bot, pick and place, and a home position for the fingers. Implementation of pick-and-place operations on different objects using these commands is discussed. The sensor-based human-hand replication system can keep people out of harmful situations, such as handling radioactive or biohazardous material, and can also assist in very precise work, such as a doctor operating on a patient through a robot rather than with their own hands. The technology has many useful applications in robotics, surgical operations, humanoid robots, etc.
Keywords: Robotic Hand, Object Hunting, Wired and Wireless Feedback
-
INTRODUCTION
This paper deals with the design and implementation of a wireless, gesture-controlled robotic hand with vision. The system consists of three parts: the accelerometer part, the robotic hand and the platform. It is essentially an accelerometer-based system that controls a robotic hand wirelessly, using a small, low-cost, 3-axis (three-DOF) accelerometer and RF signals. The robotic hand is mounted on a movable platform which is also controlled wirelessly by a second accelerometer [1].
One accelerometer is attached to one of the user's hands, capturing its gestures and postures so that the robotic arm moves accordingly; the other accelerometer is mounted on the user's other hand, whose gestures and postures drive the platform. The robotic arm and the platform are thus synchronised with the gestures and postures of the operator's two hands, respectively. The motions performed by the robotic hand are: PICK and PLACE / DROP, and RAISING and LOWERING objects.
The motions performed by the platform are: FORWARD, BACKWARD, RIGHT and LEFT. The system is also equipped with an IP-based camera which can stream real-time video wirelessly to any Internet-enabled device such as a mobile phone or laptop [2].
The main objective of this paper is to design and implement a Five Fingered Robotic Hand (FFRH) that provides a simple reflexive grasp usable on a wide variety of objects. The FFRH is designed as a servo-driven, point-to-point, cylindrical robot structure with a five-pronged gripper (five fingers). The approach focuses primarily on grasping objects of different shapes rather than on manipulating or assembling them. A grasping device of this type has a variety of applications in object-retrieval systems for the handicapped, planetary and underwater exploration, and robotic surgery. This paper deals mainly with picking up and dropping objects, and it works on hand gestures using glove-based technology [3].
-
RELATED WORK
Today, a number of robotic hands are used in robotics research, each with different features and design criteria. This section briefly reviews some recent widely used and/or influential robotic arms. In the robotics field, several research efforts have been directed towards recognizing human gestures. A few popular systems are:
-
Vision-based Gesture Recognition
This work was carried out in the field of service robotics, where the researchers designed a robot to perform cleaning tasks. They built a hand-gesture-based interface to control a mobile robot equipped with a manipulator, using a camera to track a person and recognize gestures involving arm motion. A fast, adaptive tracking algorithm enables the robot to track and follow a person through office environments with changing lighting conditions [4].
-
Motion Capture Sensor Recognition
This recognition technique made it possible to implement an accelerometer-based system that communicates with an industrial robotic arm wirelessly. In this project the robotic arm is powered by an ARM7-based LPC1768 core. The MEMS device is a three-dimensional accelerometer sensor that captures gestures of the human arm and produces three analog output voltages, one per axis. Two flex sensors are used for gripper movement [5].
-
Finger Gesture Recognition System based on Active Tracking Mechanisms
The prime aim of the system proposed by the author, based on the above-mentioned recognition methodology, is to enable interaction with a portable device or a computer through the recognition of finger gestures. Apart from gestures, speech can also be another mode of interaction, so the system can form part of a so-called Perceptual User Interface (PUI). The system could be used for Virtual Reality or Augmented Reality applications [6].
-
Accelerometer-based Gesture Recognition
This gesture-recognition methodology has become increasingly popular in a very short span of time. Two factors make it an effective tool for detecting and recognizing human body gestures: the moderate cost and the relatively small size of the accelerometer. Several studies have been conducted on the recognition of gestures from acceleration data using Artificial Neural Networks (ANNs) [7][8][9].
-
TECHNICAL REQUIREMENTS
The components required to build the robot are an Arduino Mega, an HC-05 Bluetooth module, servo motors and a battery. These components are described below:
-
Arduino Mega 2560
The Mega 2560 is a microcontroller board based on the ATmega2560. It has 54 digital input/output pins (of which 15 can be used as PWM outputs), 16 analog inputs, 4 UARTs (hardware serial ports), a 16 MHz crystal oscillator, a USB connection, a power jack, an ICSP header, and a reset button.
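As an illustration of how the Mega's PWM pins can drive the hand, the short sketch below attaches five finger servos using the standard Arduino Servo library; the pin numbers and the simple open/close test pattern are assumptions made for illustration, not the firmware used in this paper.

```cpp
// Illustrative sketch (not from the paper): driving five finger servos
// from an Arduino Mega 2560. Pin assignments are assumptions.
#include <Servo.h>

const int FINGER_PINS[5] = {2, 3, 4, 5, 6};  // PWM-capable pins on the Mega
Servo fingers[5];

void setup() {
  for (int i = 0; i < 5; i++) {
    fingers[i].attach(FINGER_PINS[i]);
    fingers[i].write(0);                     // start with the hand open
  }
}

void loop() {
  // Close and open all fingers once per second as a simple test pattern.
  for (int i = 0; i < 5; i++) fingers[i].write(180);  // close
  delay(1000);
  for (int i = 0; i < 5; i++) fingers[i].write(0);    // open
  delay(1000);
}
```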
-
Arduino Nano 3.0
The Arduino Nano is a small, complete and breadboard-friendly board based on the ATmega328 (Arduino Nano 3.0) or ATmega168 (Arduino Nano 2.x). It has more or less the same functionality as the Arduino Duemilanove, but in a different package. It lacks only a DC power jack and works with a Mini-B USB cable instead of a standard one. The Nano was designed and is produced by Gravitech.
-
Accelerometer
An accelerometer measures acceleration or gravitational force. By tilting an accelerometer along a measured axis, one can read the gravitational force relative to the amount of tilt. Most accelerometers available today are small surface-mount components, so they can easily be interfaced to a microcontroller [10]. An accelerometer can measure three axes, labelled X, Y and Z. Each measured axis represents a separate Degree of Freedom (DOF) of the sensor; thus a triple-axis accelerometer might be labelled 3-DOF. In this paper, only two axes, X and Y, are used. The accelerometer used in this paper is the ADXL3xx [11].
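A minimal sketch for reading the two axes used here might look as follows; the analog pins, centre value and threshold are assumptions, since the exact calibration depends on the supply voltage and mounting of the ADXL3xx.

```cpp
// Illustrative sketch (pins and thresholds are assumptions): reading the
// X and Y axes of an ADXL3xx analog accelerometer and classifying tilt.
const int X_PIN = A0;
const int Y_PIN = A1;
const int CENTRE = 512;     // approximate mid-scale reading when level
const int THRESHOLD = 80;   // dead-band around the centre value

void setup() {
  Serial.begin(9600);
}

void loop() {
  int x = analogRead(X_PIN);
  int y = analogRead(Y_PIN);

  if (x > CENTRE + THRESHOLD)      Serial.println("TILT FORWARD");
  else if (x < CENTRE - THRESHOLD) Serial.println("TILT BACKWARD");
  else if (y > CENTRE + THRESHOLD) Serial.println("TILT RIGHT");
  else if (y < CENTRE - THRESHOLD) Serial.println("TILT LEFT");
  else                             Serial.println("LEVEL");

  delay(200);
}
```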
-
Camera
This system uses a smartphone camera for continuous real-time video streaming of the system and its surroundings. An IP-based Android application [12] running on the smartphone enables the system to transmit the real-time video wirelessly [13].
-
OVERALL DESIGN OF THE SYSTEM
-
Proposed Block Diagram
The overall design of the system is shown in Figure 1. It includes flex sensors, servo motors, an Arduino Mega, an Arduino Nano, an LCD, Bluetooth, an L293D motor driver and an accelerometer.
Figure. 1: Proposed Block Diagram
The robotic glove houses the circuitry which controls the robotic hand. It contains an Arduino Mega programmed to transfer the required data through a transmitter module. At the same time, the flex sensor sends the degree of movement of each finger to the Arduino Nano. The processed values are then transmitted from the module (NRF transmitter) to the robotic arm. The module takes feedback from the arm and sends newly processed signals to it. Figure 2 shows the robotic glove we designed.
Figure. 2: Robotic Glove
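To illustrate the glove-side firmware described above, the sketch below reads one flex sensor and sends the raw value over an nRF24L01 module using the open-source RF24 library; the pin assignments, pipe address and use of a single sensor are assumptions made for the sake of a compact example.

```cpp
// Illustrative glove-side transmitter (assumptions: flex sensor on A0,
// nRF24L01 with CE=9, CSN=10, and the open-source RF24 library).
#include <SPI.h>
#include <nRF24L01.h>
#include <RF24.h>

RF24 radio(9, 10);                       // CE, CSN
const byte ADDRESS[6] = "GLOVE";

void setup() {
  radio.begin();
  radio.openWritingPipe(ADDRESS);
  radio.stopListening();                 // this node only transmits
}

void loop() {
  int flex = analogRead(A0);             // bend of one finger, 0-1023
  radio.write(&flex, sizeof(flex));      // send raw reading to the hand
  delay(50);
}
```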
-
Major Parts of Robotic Hand
In this paper, we design a robotic hand with three degrees of freedom, which is able to pick up different objects and place them at different locations. Based on functionality, the system has been divided into the following parts:
- Robotic arm
- Platform
- Communication system
- Wireless video transmission
All of these parts are discussed below:
-
Robotic Arm
This is the vital part of the system, as it performs the pick-up and drop tasks of the project. The robotic arm is equipped with a gripper (for picking and placing objects) and an arm (for raising and lowering objects); both the arm and the gripper are fitted with servo motors to control their movement. These movements are synchronised with the hand gestures of the user operating the robotic arm.
Figure. 3: Robotic hand (Front View)
The lowest servo is attached in such a way that it rotates the upper base horizontally from 0 to 180 degrees, depending on the values received from the NRF module.
Figure. 4: Robotic Hand (Top View)
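The mapping from a received reading to a base-servo angle can be done with Arduino's map() and constrain() functions, as in the sketch below; the servo pin and the assumption that the value arrives as a 10-bit number are illustrative only.

```cpp
// Illustrative receiver-side sketch (assumptions: base servo on pin 3,
// 'received' holds a 10-bit value delivered by the NRF link).
#include <Servo.h>

Servo baseServo;
int received = 512;                            // placeholder; set by the radio code

void setup() {
  baseServo.attach(3);
}

void loop() {
  int angle = map(received, 0, 1023, 0, 180);  // scale reading to degrees
  baseServo.write(constrain(angle, 0, 180));   // rotate the upper base
  delay(20);
}
```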
The figures below show the robotic hand grasping different objects; the hand gestures used to control the arm, shown in Figures 5 to 9, are described as follows:
GESTURE 1: To keep the Arm stable
GESTURE 2: To Lower the Arm
GESTURE 3: To Raise the Arm
GESTURE 4: To move the Arm clockwise, pick up and drop the object
GESTURE 5: To move the Arm anticlockwise, pick up and drop the object.
Figure. 5: To keep the Robotic Arm stable
Figure. 6: To Lower the Arm
Figure. 7: To Raise the Arm
Figure. 8: To move Arm clockwise, pick up and drop the object
Figure. 9: To move Arm anticlockwise, pick up and drop the object.
The robotic hand grasps different objects, as shown in Figure 10:
Figure. 10: Robotic Hand picking different objects
-
Platform
The platform is the part of the project onto which the robotic arm is mounted. It is fitted with DC motors, and its movement is synchronised with the gestures of the user's other hand. One accelerometer is mounted on one of the user's hands to capture the gestures that control the arm, while the gestures of the other hand move the platform. This is the part of the project that carries the whole system from one place to another.
GESTURE 1: To keep the platform stable
GESTURE 2: To make the platform move in the Forward direction
GESTURE 3: To make the platform move in the Backward direction
GESTURE 4: To make the platform take a turn towards the Right
GESTURE 5: To make the platform take a turn towards the Left
Figure. 11: To keep the Platform stable
Figure. 12: To move Platform in Forward Direction
Figure. 13: To move Platform in Backward Direction
Figure. 14: To move Platform towards Right
Figure. 15: To move Platform towards Left
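A minimal sketch for driving the platform's two DC motors through the L293D is shown below; the input-pin wiring and the helper-function names are assumptions, since the paper does not list its motor-driver connections.

```cpp
// Illustrative platform drive code (L293D input-pin numbers are assumptions).
const int LEFT_FWD = 4, LEFT_REV = 5, RIGHT_FWD = 6, RIGHT_REV = 7;

void setup() {
  pinMode(LEFT_FWD, OUTPUT);  pinMode(LEFT_REV, OUTPUT);
  pinMode(RIGHT_FWD, OUTPUT); pinMode(RIGHT_REV, OUTPUT);
}

// Set both motor channels of the L293D in one call.
void drive(int lf, int lr, int rf, int rr) {
  digitalWrite(LEFT_FWD, lf);  digitalWrite(LEFT_REV, lr);
  digitalWrite(RIGHT_FWD, rf); digitalWrite(RIGHT_REV, rr);
}

void forward()   { drive(HIGH, LOW,  HIGH, LOW ); }
void backward()  { drive(LOW,  HIGH, LOW,  HIGH); }
void turnRight() { drive(HIGH, LOW,  LOW,  HIGH); }
void turnLeft()  { drive(LOW,  HIGH, HIGH, LOW ); }
void stopAll()   { drive(LOW,  LOW,  LOW,  LOW ); }

void loop() {
  // In the real system the gesture code received over the radio would
  // select one of the helpers above; here the platform simply stays stopped.
  stopAll();
}
```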
-
Communication System
The entire project depends on communication; no system can work without a communication system, and this project is no exception. The RF module, details of which are mentioned under Section 3.2, is the only communication equipment required in this paper. This module is used to transmit the different hand gestures made by the user, encoded as 4-bit digital data, wirelessly to the receiver [14], which decodes the received 4-bit data and moves the arm, gripper and platform accordingly. The block diagrams shown in Figure 16 and Figure 17 depict the entire communication system of the project. The linker (the circle labelled A) in Figures 16 and 17 shows the connection (flow of signals) between the transmitter end and the receiver end.
Figure. 16: Transmitter
Figure. 17: Receiver
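On the receiver side, the 4-bit gesture code can be decoded with a simple switch statement, as sketched below; the particular code-to-action mapping and the use of serial prints in place of the real servo and motor calls are assumptions for illustration.

```cpp
// Illustrative decoder for the 4-bit gesture code received over the radio.
// The code-to-action mapping below is an assumption, not the paper's table.
void handleGesture(byte code) {
  switch (code & 0x0F) {               // keep only the lower 4 bits
    case 0x0: Serial.println("arm: hold steady");               break;
    case 0x1: Serial.println("arm: lower");                     break;
    case 0x2: Serial.println("arm: raise");                     break;
    case 0x3: Serial.println("base: clockwise, pick/drop");     break;
    case 0x4: Serial.println("base: anticlockwise, pick/drop"); break;
    case 0x5: Serial.println("platform: forward");              break;
    case 0x6: Serial.println("platform: backward");             break;
    case 0x7: Serial.println("platform: right");                break;
    case 0x8: Serial.println("platform: left");                 break;
    default:  Serial.println("unknown code: stop");             break;
  }
}

void setup() { Serial.begin(9600); }
void loop()  { handleGesture(0x2); delay(500); }   // example: "raise arm"
```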
-
Wireless Video Transmission
In this paper we integrate an IP-based camera with the system for real-time video streaming. The camera captures video that is transmitted over the Internet and can be viewed on any Internet-enabled device by entering the camera's IP address in the URL bar.
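With the IP Webcam application referenced here [12][14], for example, the stream is typically reachable at an address of the form http://<phone-ip-address>:8080/video, although the exact port and path depend on the application's settings.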
-
CONCLUSION
The objective of this paper, developing the hardware and software for a gesture-based robotic hand, has been achieved. Observations show that its movement is precise and accurate, and that it is easy to control and user-friendly. The robotic hand has been developed successfully, as its movement can be controlled precisely. This control method is expected to address problems such as placing or picking objects that are out of the user's reach, picking and placing hazardous objects quickly and easily, and generally augmenting our ability to perform such tasks.
REFERENCES
[1] Pedro Neto, J. Norberto Pires, A. Paulo Moreira, "Accelerometer-Based Control of an Industrial Robotic Arm". Available at: http://arxiv.org/ftp/arxiv/papers/1309/1309.2090.pdf
[2] Dr. R. V. Dharaskar, S. A. Chhabria, Sandeep Ganorkar, "Robotic Arm Control Using Gesture and Voice", International Journal of Computer, Information Technology & Bioinformatics (IJCITB), Vol. 1, Issue 1, pp. 41-46. Available at: http://www.ijcitb.com/issues/paper_9.pdf
[3] Ikuo Yamano et al., "Five-Fingered Robot Hand using Ultrasonic Motors and Elastic Elements", Proceedings of the 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems, 2005, pp. 2673-2678.
[4] S. Waldherr, R. Romero and S. Thrun, "A gesture based interface for human-robot interaction", Autonomous Robots (Springer), Vol. 9, Issue 2, 2000, pp. 151-173. Available at: http://www.cs.cmu.edu/~thrun/papers/waldherr.gesturesjournal.pdf
[5] K. Brahmani, K. S. Roy, Mahaboob Ali, "ARM 7 Based Robotic Arm Control by Electronic Gesture Recognition Unit Using MEMS", International Journal of Engineering Trends and Technology, Vol. 4, Issue 4, April 2013. Available at: http://www.ijettjournal.org/volume-4/issue4/IJETT-V4I4P347.pdf
[6] S. Perrin, A. Cassinelli and M. Ishikawa, "Gesture Recognition Using Laser-Based Tracking System", Proceedings of the Sixth IEEE International Conference on Automated Face and Gesture Recognition, May 2004, pp. 541-546. Available at: http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=1301589
[7] Y. Song, S. Shin, S. Kim, D. Lee, and K. H. Lee, "Speed estimation from a tri-axial accelerometer using neural networks", 29th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBS 2007), 2007, pp. 3224-3227. Available at: http://www.ncbi.nlm.nih.gov/pubmed/1802682
[8] J. Yang, W. Bang, E. Choi, S. Cho, J. Oh, J. Cho, S. Kim, E. Ki and D. Kim, "A 3D Hand-drawn Gesture Input Device using Fuzzy ARTMAP-based Recognizer", Journal of Systemics, Cybernetics and Informatics, Vol. 4, Issue 3, 2006, pp. 1-7. Available at: http://www.iiisci.org/journal/CV$/sci/pdfs/P771618.pdf
[9] K. Murakami and H. Taguchi, "Gesture Recognition using Recurrent Neural Networks", Proceedings of the ACM CHI'91 Conference on Human Factors in Computing Systems, New Orleans, USA, 1991, pp. 237-242. Available at: http://openexhibits.org/wp-content/uploads/papers/Murakami_NNgesturerecognition_1993.pdf
[10] Accelerometer. Available at: http://en.wikipedia.org/wiki/Accelerometer
[11] ADXL3xx accelerometer. Available at: http://arduino.cc/en/tutorial/ADXL3xx
[12] Android Application: IP Webcam. Available at: https://play.google.com/store/apps/details?id=com.pas.webcam&hl=en
[13] IP Camera. Available at: http://en.wikipedia.org/wiki/IP_camera
[14] Android Application: IP Webcam. Available at: https://play.google.com/store/apps/details?id=com.pas.webcam&hl=en