- Open Access
- Authors : Grace Mary Mathew, Melbin George
- Paper ID : IJERTCONV9IS13041
- Volume & Issue : NCREIS – 2021 (Volume 09 – Issue 13)
- Published (First Online): 02-08-2021
- ISSN (Online) : 2278-0181
- Publisher Name : IJERT
- License: This work is licensed under a Creative Commons Attribution 4.0 International License
Real Time Imitation of Human Hand
Grace Mary Mathew
PG Scholar,
Department of Electronics and Communication, College of Engineering Kidangoor, Kerala, India
Melbin George
Assistant Professor
Department of Electronics and Communication, College of Engineering Kidangoor, Kerala, India
Abstract- This paper puts forward a system that imitates, or replicates, the gestures of the human hand. It consists of a sensory transmitter glove which captures the movement of the human hand, collects the sensor values and sends them to the receiver section through an RF module. The receiver section, which receives these values through a matching RF module, is the robotic arm. The robotic arm is made through 3D printing and contains a motor driver and servo motors to produce its movement. An Arduino Nano is used in both the transmitter and the receiver sections.
Keywords- Arduino nano, transceiver, flex sensors, servo motors, robotic arm, wireless glove.
-
INTRODUCTION
Nowadays Human-Robot interaction is a widespread field of study based on communication between humans and robots. The prevalence of the subject has brought many communication and control methods. This paper aims at social learning and skill acquisition by a robotic hand through teaching and imitation. The subject of Human-Robot collaboration, which includes the theme of this paper, is a common field of experiments in our age of technology. With the help of this technology, many disabilities can be overcome, and many tasks that a human being would not otherwise be able to do can be accomplished.
As an example, a robotic hand can be a light of hope for a person who does not have a hand or wants to hold an object remotely over the internet. So, in this paper, it is explained how a robotic hand can learn by imitation. In the experiment a robotic hand, which was printed by a 3D printer, was used and controlled wirelessly by a computer that recognizes human hand gestures via motion capturing algorithms.
The communication between the computer and the robot is provided with a 2.4 – 2.5 GHz single chip radio transceiver. This project presents a platform to implement and evaluate a learning by imitation framework which enables humanoid robots to learn hand gestures from human beings. A marker-based system is used to capture human motion data.
-
PROPOSED SYSTEM
The proposed system mainly consists of two sections.
-
Transmitter Section
The transmitter section is the wireless glove depicted in fig 1. It consists of flex sensors, a microcontroller and a transceiver. The flex sensors sense the movement of the human hand, and the sensor values are given to the microcontroller. Since the flex sensors are variable resistors, each one is read through a voltage divider built with a 10K resistor. An ATmega328-based Arduino Nano is used as the microcontroller. The values from the sensors are encoded and sent to the receiver section using an nRF24L01 transceiver, and a 9V dc battery is used as the input power. As the human demonstrator moves his hand, the flex sensors in the wireless glove change their values. These values are analog and are converted into digital values, between 0 and 1023, by the Analog to Digital Converter (ADC). The values of each demonstrator's hand must be checked while it is open and while it is closed; these two values are noted, readings within this range are allowed, and all other values are restricted.
Fig 1. Wireless Glove at the Transmitter section
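As an illustration of how the glove side can be realised in software, the sketch below reads five flex-sensor dividers, restricts each reading to a per-user calibration range as described above, maps it to a servo angle and transmits the packet through the nRF24L01 using the common TMRh20 RF24 Arduino library. The pin assignments, pipe address and calibration constants are assumptions made for the sketch, not values taken from this work.

```cpp
// Glove (transmitter) side: read five flex sensors, map to finger angles,
// send over nRF24L01. Illustrative sketch; pins, address and calibration
// limits are assumed, not taken from the paper.
#include <SPI.h>
#include <RF24.h>

RF24 radio(9, 10);                        // CE, CSN pins (assumed wiring)
const byte pipeAddress[6] = "GLOVE";      // hypothetical pipe name
const int flexPin[5] = {A0, A1, A2, A3, A4};

// Per-user calibration: ADC readings noted with the hand fully open / closed.
const int FLEX_OPEN[5]   = {180, 190, 175, 185, 200};   // placeholder values
const int FLEX_CLOSED[5] = {520, 540, 510, 530, 550};   // placeholder values

int fingerAngle[5];                       // mapped servo angles, 0-180 degrees

void setup() {
  radio.begin();
  radio.openWritingPipe(pipeAddress);
  radio.setPALevel(RF24_PA_LOW);
  radio.stopListening();                  // configure as transmitter
}

void loop() {
  for (int i = 0; i < 5; i++) {
    int raw = analogRead(flexPin[i]);                     // 10-bit value, 0-1023
    raw = constrain(raw, FLEX_OPEN[i], FLEX_CLOSED[i]);   // restrict to calibrated range
    fingerAngle[i] = map(raw, FLEX_OPEN[i], FLEX_CLOSED[i], 0, 180);
  }
  radio.write(fingerAngle, sizeof(fingerAngle));          // 10-byte payload
  delay(20);                              // roughly 50 updates per second
}
```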
-
Receiver Section
The receiver section consists of the robotic arm, which is 3D printed according to the structure delineated in fig 3. It contains an nRF24L01 transceiver, which receives the output from the transmitter section, and an Arduino Nano microcontroller that decodes the sensory values and passes them to the motor driver. An L293D motor driver is used here; it acts as an interface between the microcontroller and the servo motors, as shown in fig 2. The servo motors produce the movement of the robotic arm: each motor has a shaft, and an elastic thread is connected between the robotic fingers and the shaft, with a separate motor for each finger. A 12V dc supply is required to operate this section. Since the motor driver consumes considerable power, a 12V dc battery would have to be replaced frequently; a 220Vac/12Vdc power adapter solves this problem.
Fig 2. Robotic hand at the receiver section
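A matching receiver-side sketch might look as follows: it listens on the same nRF24L01 pipe and moves one hobby servo per finger with the standard Arduino Servo library. For simplicity the servos are driven directly from the Arduino pins here, and the pin numbers and pipe address are again assumptions rather than the exact wiring used in the paper.

```cpp
// Robotic arm (receiver) side: receive the five finger angles and drive
// one servo per finger. Illustrative sketch; pins and address are assumed.
#include <SPI.h>
#include <RF24.h>
#include <Servo.h>

RF24 radio(9, 10);                        // CE, CSN pins (assumed wiring)
const byte pipeAddress[6] = "GLOVE";      // must match the transmitter
const int servoPin[5] = {3, 5, 6, 7, 8};  // one digital pin per finger servo

Servo finger[5];
int fingerAngle[5];

void setup() {
  for (int i = 0; i < 5; i++) {
    finger[i].attach(servoPin[i]);
  }
  radio.begin();
  radio.openReadingPipe(1, pipeAddress);
  radio.setPALevel(RF24_PA_LOW);
  radio.startListening();                 // configure as receiver
}

void loop() {
  if (radio.available()) {
    radio.read(fingerAngle, sizeof(fingerAngle));
    for (int i = 0; i < 5; i++) {
      finger[i].write(constrain(fingerAngle[i], 0, 180));  // move the finger servo
    }
  }
}
```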
-
SYSTEM SETUP
Design
Fig 4. 3D Printed Robotic Arm
Embedded System
The microcontroller, servo control, flex sensor control and wireless control with the nRF24L01, shown in fig 5, form the core embedded part of this project. The flex sensors require a small circuit to be compatible with the Arduino: since they are variable resistors, the use of a voltage divider is recommended, and a 10K resistor is used here.
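For a rough sense of the signal levels involved, assume the flex sensor is connected between Vcc and the analog pin with the 10K resistor from that pin to ground, and take a typical datasheet resistance of about 25 kΩ when flat and 100 kΩ when fully bent (illustrative figures, not measurements from this work). The divider output is Vout = Vcc × R10K / (Rflex + R10K), so with Vcc = 5 V the Arduino would read roughly 1.43 V (ADC count ≈ 292) with the finger straight and about 0.45 V (ADC count ≈ 93) with it fully bent, which is the kind of range that is then restricted and mapped per user.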
In this paper, 3D robot hand assembly, servo control, flex sensor control and wireless control with the nRF24L01 are employed. A wireless glove is used to control the robotic hand. The servo motors are aligned at 10 or 170 degrees before the servo pulleys are attached to them. When mounting the servo pulleys, keep the fingers in the closed or open position (according to the chosen servo angles), then wrap the braided wires or strings around the servo pulley until they become taut.
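A small helper sketch such as the one below can hold every finger servo at the chosen alignment angle (10 degrees in this illustration; 170 degrees works the same way) while the pulleys and strings are fitted; the pin numbers are assumptions.

```cpp
// Alignment helper: park each finger servo at a fixed angle so the pulleys
// can be mounted with the fingers in a known position. Pins are assumed.
#include <Servo.h>

const int servoPin[5] = {3, 5, 6, 7, 8};
const int ALIGN_ANGLE = 10;               // or 170, depending on the chosen convention

Servo finger[5];

void setup() {
  for (int i = 0; i < 5; i++) {
    finger[i].attach(servoPin[i]);
    finger[i].write(ALIGN_ANGLE);         // servo holds this angle while pulleys are fitted
  }
}

void loop() {
  // Nothing to do: the servos simply hold the alignment position.
}
```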
Fig 3. 3D Printed Robotic Arm
-
Assembly
Assembly of the robotic arm parts is very detailed and complex. Each of the four fingers and the thumb of each hand is actuated by a servo in the forearm. The servos are connected to the fingers via tendons made from braided fishing line, and there are left and right versions of the hands. The robotic arm in fig 4 is the receiver section. The servo motors have integrated gears and shafts that can be controlled precisely, and the section also contains a motor driver, which acts as an interface between the motors and the embedded circuit.
For the transmitter section, we have a wireless glove with sensors, an embedded system and a transceiver module. The sensor used is the flex sensor, which senses the movement of the hand by measuring the change in its resistance. The values obtained from the flex sensors are encoded by the microcontroller and sent to the robotic arm via the transceiver module.
Fig 5. Embedded circuit
-
RESULT AND ANALYSIS
The setup proposed in this paper is a low-powered system with maximum efficiency. The energy-saving methodology is achieved through the use of flex sensors, which are more accurate and consume less power. The system has a low latency of around 130 µs because data transmission uses the nRF24L01+ with automatic packet handling. Each flex sensor forms a voltage divider whose varying resistance follows the movement of the fingers of the human hand. The voltage signal is captured by the analog ports of the Arduino and processed by the ATmega328P. The signals are processed and mapped within a time period of 50 nanoseconds. The signal is routed to the transmitter and received at the robotic hand at a frequency of 7.7 kHz. The robotic arm then emulates the hand in accordance with the mapped values, controlled by the L293D motor driver at a frequency of 1.25 MHz. The total time required for a single bit of emulation is 180 µs.
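One way to sanity-check the per-packet figure quoted above is to time a blocking radio.write() on the glove side, as in the hedged sketch below. The number obtained depends on payload size, air data rate and retry settings, so this illustrates the measurement method rather than reproducing the 130 µs value.

```cpp
// Latency probe: measure how long a blocking nRF24L01 write takes.
// Illustrative sketch; pins, address and data-rate choice are assumptions.
#include <SPI.h>
#include <RF24.h>

RF24 radio(9, 10);
const byte pipeAddress[6] = "GLOVE";
int fingerAngle[5] = {0, 45, 90, 135, 180};   // dummy payload

void setup() {
  Serial.begin(115200);
  radio.begin();
  radio.setDataRate(RF24_2MBPS);          // highest air data rate of the nRF24L01+
  radio.openWritingPipe(pipeAddress);
  radio.stopListening();
}

void loop() {
  unsigned long t0 = micros();
  bool ok = radio.write(fingerAngle, sizeof(fingerAngle));  // blocks until ACK or timeout
  unsigned long dt = micros() - t0;
  Serial.print(ok ? "sent in " : "failed after ");
  Serial.print(dt);
  Serial.println(" us");
  delay(500);                             // one probe every half second
}
```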
The antecedent system is much slower than the proposed system since it works in the domain of image capturing and image processing. It involves more power-draining and time-consuming processes, and precise equipment is required. The images of the human arm are captured by the capturing device and then transferred to the processor. The processing requires at least four algorithms to be executed, which consume 0.6741 sec for pixel count, 1.181 sec for detection of circles, 2.0053 sec for morphological operations and 1.0564 sec for the scanning method; the total time required for processing itself is about 3.86 seconds. The crucial advantage of the proposed system over the other systems juxtaposed in table 1 is the efficiency and accuracy produced by the setup with less consumption of power. If the proposed system is upgraded with more advanced hardware, it will not lag in processing and parameter handling; the implementation of advanced transceivers can distribute data to multiple recipients without causing any delay, and suppression of noise can help to achieve more accuracy while transmitting. Front-end flex sensors can help capture the minute positions of the human fingers, which also improves the accuracy of the input stage. So, the speed, accuracy and latency of the individual subsystems help to maximize the performance of the whole system.
Comparison of the Existing Systems with the Proposed System
Study | Gesture imitation using ANN | Gesture imitation using Image Processing | Proposed system
Equipment | Image capturing device, powerful system for neural network processing | High capture cards, image processing techniques | Bend sensor, low power processor
Latency / Speed | Depends on the number of trained stacks | 3.6 seconds | 180 µs
Procedure | Captured image is processed in the neural network in order to match with the most accurate trained stacks | High quality images of the hand position are processed to get the accurate structure of the fingers through various image processing techniques | Position of the fingers is detected and mapped with the help of flex sensors
Pros | Processes all the inputs with a wide range of trained stacks | Compatible with all input capturing environments | Faster processing, energy efficient, cipher saver, remotely operable
Cons | Remote access is much time consuming | Obstacles or noises in the input frame cause cipher crash | Capturing environment must be modified for different inputs
TABLE 1. Comparison of the existing systems and the proposed system.
The proposed system is much better than existing systems such as gesture imitation using ANN or image processing.
-
CONCLUSION
The developed core system attains maximum performance through the combined efforts of its subsystems: high speed in input acquisition, accuracy in processing, and low latency in data transfer. The system can be useful in various domains such as medical, military and space applications.
REFERENCES
[1] P. Dhepekar and Y. G. Adhav, "Wireless robotic hand for remote operations using flex sensor," 2016 International Conference on Automatic Control and Dynamic Optimization Techniques (ICACDOT), Pune, 2016, pp. 114-118, doi: 10.1109/ICACDOT.2016.7877562.
[2] N. Rodriguez, G. Carbone and M. Ceccarelli, "Antropomorphic design and operation of a new low-cost humanoid robot," The First IEEE/RAS-EMBS International Conference on Biomedical Robotics and Biomechatronics (BioRob 2006), 2006, pp. 933-938, doi: 10.1109/BIOROB.2006.1639211.
[3] L. Sbernini, A. Pallotti and G. Saggio, "Evaluation of a Stretch Sensor for its inedited application in tracking hand finger movements," 2016 IEEE International Symposium on Medical Measurements and Applications (MeMeA), Benevento, 2016, pp. 1-6, doi: 10.1109/MeMeA.2016.7533809.
[4] Ji Jun, Yu MengSun, Zhou YuBin and Jin ZhangRui, "A Wireless EEG Sensors System for Computer Assisted Detection of Alpha Wave in Sleep," 2005 IEEE Engineering in Medicine and Biology 27th Annual Conference, Shanghai, 2005, pp. 5351-5353, doi: 10.1109/IEMBS.2005.1615690.
[5] M. B. H. Flores, C. M. B. Siloy, C. Oppus and L. Agustin, "User-oriented finger-gesture glove controller with hand movement virtualization using flex sensors and a digital accelerometer," 2014 International Conference on Humanoid, Nanotechnology, Information Technology, Communication and Control, Environment and Management (HNICEM), Palawan, 2014, pp. 1-4, doi: 10.1109/HNICEM.2014.7016195.