- Open Access
- Authors : R. Balamurugan, N. R. Nagarajan
- Paper ID : IJERTCONV5IS13084
- Volume & Issue : ICONNECT – 2017 (Volume 5 – Issue 13)
- Published (First Online): 24-04-2018
- ISSN (Online) : 2278-0181
- Publisher Name : IJERT
- License: This work is licensed under a Creative Commons Attribution 4.0 International License
Automatic Robotic Arm using Hand Gestures
R. Balamurugan1
Electronics and Communication Engineering, K. Ramakrishnan College of Engineering, Thiruchirapalli
N. R. Nagarajan2
Electronics and Communication Engineering, K. Ramakrishnan College of Engineering, Thiruchirapalli
Abstract: This system presents a model for a gesture-controlled user interface (GCUI) and identifies trends in technology, application and usability. We present an integrated approach in which real-time, gesture-based data derived from the user's hand movements control the movement and manipulation of the robotic arm. A three-axis accelerometer is adopted as the sensing element: as the person moves their hand, the accelerometer moves with it, and the resulting gesture is captured by the accelerometer and processed.
Keywords: Gesture recognition, Arduino microcontroller, Wireless sensor networks.
INTRODUCTION
An embedded system is a computer system with a dedicated function within a larger mechanical or electrical system, often with real-time computing constraints. It is embedded as part of a complete device, often including hardware and mechanical parts. Embedded systems control many devices in common use today; ninety-eight percent of all microprocessors are manufactured as components of embedded systems. Nowadays, robots have become a helping hand for human beings, since a robot can perform, in very little time and according to human instruction, tasks that are very difficult for a human to complete: medical surgeries, mass production and pick-and-place of heavy parts in industry, work in agriculture, in hazardous conditions, in land-mine detection, in vacuum chambers, and so on. In this paper we propose a robotic arm to perform such tasks. The robotic arm is controlled with hand gestures. Human-robot interaction is an active area of robotic research, and research in robotics has developed along two fields: robotic manipulation and the input feeding system. In in-situ standoff control, human gestures are given as input. Gesture recognition is commonly achieved using a camera and methodologies such as grey-scale conversion, inverse algorithms and thresholding models. The aim is to make the robotic arm understand human body language, thereby building a bridge between the human and the robot. This system consists of a mechanically based robotic arm whose movements, such as forward, backward, left, right and rotate, are controlled using hand gestures. The range between the robotic arm and the human hand is 100 meters, as the two are spatially separated.
PROPOSED METHOD
The transmitting unit at the human end can be mounted on a glove worn on the human hand. It consists of a number of sensors that monitor the angle variation of the human hand, and the collected information is then sent to the receiving unit on the robotic arm. The accelerometer in the transmitter measures the angular variation of the hand along three axes (x, y, z). It converts physical values into electrical values, which are then given to the Arduino microcontroller. Using a wireless network (i.e. Wi-Fi), the data are transferred to the receiver.

Fig. 1: Transmitter Block (accelerometer, Arduino microcontroller, power supply, Wi-Fi transceiver, LCD)
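As a rough sketch of the transmitter side (the pin assignments, the serial-attached wireless module and the angle scaling are assumptions for illustration, not the exact implementation described here), the Arduino reads the three accelerometer axes on its ADC pins, maps each reading to an angle and sends the values as a simple frame:

```cpp
// Transmitter sketch (illustrative): read a 3-axis analog accelerometer
// and send the tilt angles to a serial-attached wireless module.
// Pin choices and scaling are assumptions, not the paper's exact values.

const int X_PIN = A0;   // accelerometer X axis
const int Y_PIN = A1;   // accelerometer Y axis
const int Z_PIN = A2;   // accelerometer Z axis

void setup() {
  Serial.begin(9600);   // wireless module (e.g. ESP8266/HC-12) on the UART
}

void loop() {
  // The ADC gives 0-1023; map each axis to a 0-180 degree angle.
  int xAngle = map(analogRead(X_PIN), 0, 1023, 0, 180);
  int yAngle = map(analogRead(Y_PIN), 0, 1023, 0, 180);
  int zAngle = map(analogRead(Z_PIN), 0, 1023, 0, 180);

  // Simple comma-separated frame: "x,y,z\n"
  Serial.print(xAngle);
  Serial.print(',');
  Serial.print(yAngle);
  Serial.print(',');
  Serial.println(zAngle);

  delay(100);           // roughly ten updates per second
}
```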
The receiving unit controls the robotic arm and varies its joint angles according to the acquired data. The accelerometer sensors on the controlling side monitor the angle variation of the human hand; the microcontroller connected to these sensors collects the analog values and converts them into digital data. The digitized data are transmitted to the receiver through the RF transmitter. At the robotic arm the received signal is processed and motors are driven to control the arm's movement. The robotic arm can monitor the pressure applied at the object's surface and can increase its grip if the object is slipping. The receiving unit consists of an IR sensor, a pressure sensor, servo motors, a power supply and an accelerometer. The IR sensor is used to determine the grip on the object, the pressure sensor measures the pressure applied to the object, the motors produce the robotic hand movement, and an additional LCD displays the angle variation.

Fig. 2: Receiver Block (Wi-Fi transceiver, microcontroller, power supply, IR sensor, pressure sensor, accelerometer, motor driver, motors 1-3)
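A corresponding sketch of the receiver side (again an illustration under assumed pins, thresholds and frame format, not the authors' exact code) parses the received angles, drives the servos, shows the angles and the pressure reading on the LCD, and tightens the grip when the pressure sensor suggests the object is slipping while the IR sensor still detects it:

```cpp
#include <Servo.h>
#include <LiquidCrystal.h>

// Receiver sketch (illustrative): drive the arm from the received angles,
// show them on the LCD, and tighten the grip when the object seems to slip.
// All pins, thresholds and the frame format are assumed for this example.

Servo baseServo, elbowServo, wristServo, gripServo;
LiquidCrystal lcd(12, 11, 5, 4, 3, 2);   // RS, E, D4-D7 (standard wiring)

const int PRESSURE_PIN = A0;             // analog pressure sensor on the gripper
const int IR_PIN       = A1;             // IR sensor watching the object
const int SLIP_LEVEL   = 300;            // below this pressure the object may slip
const int OBJECT_SEEN  = 300;            // IR reading above this means object present

int gripAngle = 90;

void setup() {
  Serial.begin(9600);                    // wireless module on the UART
  baseServo.attach(6);
  elbowServo.attach(7);
  wristServo.attach(8);
  gripServo.attach(9);
  gripServo.write(gripAngle);
  lcd.begin(16, 2);
}

void loop() {
  // Expect "x,y,z\n" frames from the transmitter sketch above.
  if (Serial.available()) {
    int x = Serial.parseInt();
    int y = Serial.parseInt();
    int z = Serial.parseInt();
    if (Serial.read() == '\n') {
      baseServo.write(constrain(x, 0, 180));
      elbowServo.write(constrain(y, 0, 180));
      wristServo.write(constrain(z, 0, 180));
      lcd.setCursor(0, 0);
      lcd.print("X:"); lcd.print(x);
      lcd.print(" Y:"); lcd.print(y);
      lcd.print("  ");
    }
  }

  // Grip control: if pressure drops while the IR sensor still sees the
  // object, tighten the gripper a little, as described in the text.
  int pressure = analogRead(PRESSURE_PIN);
  if (pressure < SLIP_LEVEL && analogRead(IR_PIN) > OBJECT_SEEN && gripAngle < 180) {
    gripAngle += 2;
    gripServo.write(gripAngle);
  }

  lcd.setCursor(0, 1);
  lcd.print("P:"); lcd.print(pressure); lcd.print("  ");
  delay(50);
}
```

The grip adjustment here simply nudges the gripper servo a few degrees per cycle; in practice the slip threshold would be tuned against the actual pressure and IR sensors.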
SIMULATION
For simulation, the Proteus software (version 7.7) is used. Since the accelerometer is not available in the simulation, potential dividers are used in its place. Proteus is easy to handle and provides a virtual terminal to display the results. It accepts only HEX or debug files, so the Arduino compiler is used to convert the transmitter and receiver programs into HEX files.
Fig. 3: Transmitter Simulation
In the transmitter simulation, the analog values from the accelerometer are converted into digital values; six potential dividers are used to vary the angles.
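Under the same assumptions as the sketches above, each potential divider in the Proteus model can be read on an analog pin, mapped to a 0-180 degree angle and printed to the virtual terminal:

```cpp
// Simulation stand-in (illustrative): six potentiometers replace the
// accelerometer; each one is mapped to a 0-180 degree angle and printed
// so the Proteus virtual terminal can display it.
const int POT_PINS[6] = {A0, A1, A2, A3, A4, A5};

void setup() {
  Serial.begin(9600);
}

void loop() {
  for (int i = 0; i < 6; i++) {
    int angle = map(analogRead(POT_PINS[i]), 0, 1023, 0, 180);
    Serial.print(angle);
    Serial.print(i < 5 ? ',' : '\n');
  }
  delay(200);
}
```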
Fig. 4: Receiver Simulation
In the receiver simulation, the LCD shows the angle variation of the robotic arm, and servo motors produce its movement.
CONCLUSION
This approach provides an intuitive and easy way to control a robotic arm using an accelerometer, while leaving open the possibility of controlling the robot by other wireless means. Many applications that require precise, human-like manipulation can be implemented with this method, and it offers a more flexible control mechanism.
REFERENCES
- Babul Salam Kader Ibrahim, "Wireless Mobile Robotic Arm," June 2012.
- Babul Salam Kader Ibrahim, "Internet Controlled Robotic Arm," July 2012.
- Chan Wah Ng and Surendra Ranganath, "Real-Time Gesture Recognition System and Application," August 2002.
- Ashutosh Pattnaik and Rajiv Ranjan, "Robotic Arm Control through Human Arm," March 2013.
- Seyed Eghbal Ghobadi and Omar Edmond Loepprich, "Real Time Hand Based Robot Control Using Multimodal Images," May 2008.
- Mihara, Y. Yamauchi and M. Doi, "A Real-Time Vision-Based Interface," February 2015.