- Open Access
- Authors : T. Arunkumar , S. Dinesh Sundar
- Paper ID : IJERTCONV3IS04042
- Volume & Issue : NCRTET – 2015 (Volume 3 – Issue 04)
- Published (First Online): 30-07-2018
- ISSN (Online) : 2278-0181
- Publisher Name : IJERT
- License: This work is licensed under a Creative Commons Attribution 4.0 International License
A Novel System for Implementation of a Mouse to Control a Robotic Arm
T. Arunkumar1
PG Student1
1Department of Electronics and Communication Engineering
Parisutham Institute of Technology and Science, Thanjavur, Tamilnadu, India.
Mr. S. Dinesh Sundar2
Assistant Professor2
2Department of Electronics and Communication Engineering
Parisutham Institute of Technology and Science, Thanjavur, Tamilnadu, India
Abstract- Human manipulation is necessary for making decisions and for controlling a robot, particularly in unstructured, dynamic environments. Robots have replaced human beings in a wide variety of industries and in difficult situations. The basic and contradictory needs of an industrial process are repetitive operation and high accuracy. A remote-controlled robotic arm can act as a solution to such problems. This paper proposes the development of a robotic arm controlled by a wireless mouse. Two servo motors are placed at the shoulder joint, one at the elbow, and two at the wrist joint for the gripper. These servo motors can be programmed to rotate to a specific angle from 0 to 180 degrees. In addition, a wireless mouse is a device that is very widely used, and any layperson has expertise in its usage. Hence, we have built a robotic arm controlled by a wireless mouse.
Keywords: Robot, Wireless mouse, Servo motor, Microcontroller, Zigbee.
INTRODUCTION
Human manipulation is necessary for making decisions and for controlling a robot, particularly in unstructured, dynamic environments. Some commonly used human-robot interfaces [1] include joysticks [2], dials [3], and robot-arm replicas [4]. However, these mechanical contact devices make teleoperation tasks that control robot-arm movements cumbersome. In industry, most robots have no vision system; they simply follow predefined paths that they have learned previously. No decisions are made by them, and almost no artificial intelligence is implemented in their control software.
Nowadays, the commonly used human-robot interfaces are contacting mechanical devices such as joysticks and robot replicas. With these devices, the operator can simply move the wireless mouse and arm to complete teleoperation tasks. However, these motions are mechanical, and the operator has to practice for a long time to manipulate the robot effectively. There is another class of human-robot interfaces that can track the position and orientation of the operator's hand in real time, such as electromagnetic tracking devices, inertial sensors, and data gloves, which are also used in robot teleoperation. However, because these devices are also contacting, people encounter the same problem while using them: the motion of the operator is unnatural.
Robotic arms controlled by the human hand are a vital part of almost all industries. A robotic arm performs various tasks in industry, such as welding, trimming, and picking and placing. Moreover, the biggest advantage of these arms is that they can work in hazardous areas and in areas that cannot be accessed by humans. Many variants of these robotic arms are available or are designed as per requirement; a few variants are keypad controlled, voice controlled, and gesture controlled. However, most industrial robots are still programmed using the typical teaching process, which remains a tedious and time-consuming task requiring technical expertise. Therefore, there is a need for new and easier ways of programming robots.
EXISTING SYSTEM
In the existing system, human hand gesture commands control the robotic arm to move left, right, up, down, and so on, and also to pick up the desired object and place it at the desired location. Based on functionality, the system has been categorized into the following parts:
- Camera
- Blob analysis technique
- Robotic arm
- Hand gesture processing and personal computer
Fig. 1 System architecture
LQE Algorithm
LQE (linear quadratic estimation), also called the Kalman filter algorithm, is used to estimate the state of the IMU from a set of noisy and incomplete measurements, because both the gyroscope and the magnetometer exhibit white noise and random walk. LQE is a stochastic technique that estimates the state at time k from the state at time k-1.
There are two analyses in the Kalman filter algorithm:
- Object tracking and analysis
- Merge-split handling
Fig.2 Object tracking and analysis
The Kalman filter was originally developed to solve specific problems in the areas of rocket tracking and autonomous or assisted spacecraft navigation (e.g., the Apollo space program). Since then, the Kalman filter has found applications in hundreds of diverse areas, including all forms of navigation (aerospace, land, and marine), signal processing and communication, nuclear power plant instrumentation, and demographic modeling.
Fig.3 Merge split handling
The Kalman filter consists of two steps: prediction and correction. In the first step, the state is predicted with the dynamic model; in the second step, it is corrected with the observation model. The LQE function predicts the position of a moving object based on its past values. LQE is otherwise called the Kalman filter: a recursive estimator that estimates the state of a dynamic system from a series of noisy measurements. Kalman filtering has a wide range of applications in areas such as signal processing, object tracking, and navigation.
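To make the two-step cycle concrete, the following minimal Python sketch implements a one-dimensional predict/correct loop under a constant-state dynamic model; the noise variances q and r and the test data are illustrative assumptions, not values from this system.

```python
import numpy as np

def kalman_1d(measurements, q=1e-3, r=0.1):
    """Minimal 1-D Kalman filter (LQE): predict with the dynamic model,
    then correct with each noisy measurement.
    q and r are assumed process- and measurement-noise variances."""
    x_est = 0.0   # initial state estimate
    p_est = 1.0   # initial estimate covariance
    estimates = []
    for z in measurements:
        # Prediction step: the dynamic model assumes the state is constant between samples
        x_pred = x_est
        p_pred = p_est + q
        # Correction step: blend the prediction with the new observation
        k_gain = p_pred / (p_pred + r)           # Kalman gain
        x_est = x_pred + k_gain * (z - x_pred)
        p_est = (1.0 - k_gain) * p_pred
        estimates.append(x_est)
    return estimates

# Example: smooth a noisy, roughly constant sensor reading around 5.0
noisy = 5.0 + 0.3 * np.random.randn(50)
print(kalman_1d(noisy)[-1])   # converges toward roughly 5.0
```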
Blob analysis
In the field of computer vision, blob detection refers to methods that are aimed at detecting regions in a digital image that differ in properties, such as color, from the areas surrounding them. Informally, a blob is a region of a digital image in which some properties are constant or vary within a prescribed range of values; all the points in a blob can be considered, in some sense, similar to each other.
Blob analysis Example
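As a rough illustration of blob extraction, the sketch below thresholds a grayscale image and labels connected regions; the threshold, minimum area, and synthetic test image are assumptions for demonstration only, and SciPy's ndimage is used in place of whichever blob library the original system relied on.

```python
import numpy as np
from scipy import ndimage

def find_blobs(gray, threshold=128, min_area=20):
    """Label connected bright regions ('blobs') and return their bounding box,
    area, and approximate centroid. threshold and min_area are illustrative."""
    binary = gray > threshold                  # pixels whose property (brightness) exceeds the threshold
    labeled, num = ndimage.label(binary)       # connected-component labeling
    blobs = []
    for label, sl in enumerate(ndimage.find_objects(labeled), start=1):
        area = int(np.sum(labeled[sl] == label))
        if area >= min_area:                   # discard tiny noise regions
            cy = (sl[0].start + sl[0].stop) / 2.0
            cx = (sl[1].start + sl[1].stop) / 2.0
            blobs.append({"bbox": sl, "area": area, "centroid": (cx, cy)})
    return blobs

# Synthetic test image containing one bright square blob
img = np.zeros((100, 100), dtype=np.uint8)
img[30:60, 40:70] = 255
print(find_blobs(img))   # one blob, centroid near (55.0, 45.0)
```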
Working Principle
From the outset of this project it was clear that a system would be developed that can capture a hand gesture performed by the user in front of a camera; the captured image is then processed by a specific algorithm to identify a valid gesture and execute the corresponding operation. The overall implementation of the process is described as follows:
Fig. 4 Interaction among the components
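One possible shape for this capture-process-execute loop is sketched below, assuming an OpenCV camera and reusing the find_blobs helper from the previous sketch; send_command is a hypothetical callback standing in for the gesture-to-robot mapping.

```python
import cv2  # OpenCV is assumed available for camera capture

def run_gesture_loop(send_command, camera_index=0):
    """Capture frames, locate the dominant hand blob, and forward its motion
    to `send_command`, a hypothetical callback that drives the robot."""
    cap = cv2.VideoCapture(camera_index)
    prev_centroid = None
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        blobs = find_blobs(gray)               # blob-analysis sketch shown earlier
        if blobs:
            centroid = max(blobs, key=lambda b: b["area"])["centroid"]
            if prev_centroid is not None:
                dx = centroid[0] - prev_centroid[0]
                dy = centroid[1] - prev_centroid[1]
                send_command(dx, dy)           # translate hand motion into a robot command
            prev_centroid = centroid
    cap.release()
```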
PROPOSED SYSTEM
This project deals with the design, fabrication, and control of a robotic arm having features like grip, mobility, and placement, with high accuracy and speed. A robotic arm with 5 degrees of freedom is controlled through 8 functions using a wireless mouse. These 8 functions are: x-y movement of the mouse, left, right, arm up, arm down, forward, backward, and pick and place. The system is implemented using a PIC16F877A microcontroller board, which controls the servo motors and also responds to the mouse. These servo motors can rotate a maximum of 180 degrees.
The positions of these motors are:
- Base: for horizontal movement of the arm
- Shoulder: for vertical movement of the upper arm
- Elbow: for vertical movement of the lower arm
- Wrist: for moving the wrist in the clockwise and anticlockwise directions
- Grip: for opening and closing of the palm
The objective of this robotic arm is to easily transport objects held in the gripper from one place to another, with accuracy and speed. The main feature of this project is that it works with only one microcontroller board, which replaces the USB host shield and servo motor driver boards. This reduces the overall bulkiness and the cost of the project.
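As one way of realizing this control, the hedged sketch below accumulates mouse deltas into two of the joint angles and clamps them to the 0-180 degree servo range; the scale factor and the choice of base/shoulder joints are assumptions for illustration, not the actual firmware logic.

```python
def mouse_to_servo_angles(dx, dy, state, scale=0.5):
    """Accumulate mouse deltas into base and shoulder servo angles,
    clamped to the 0-180 degree range. `scale` (degrees per mouse count)
    is an assumed tuning value."""
    clamp = lambda a: max(0.0, min(180.0, a))
    state["base"] = clamp(state["base"] + dx * scale)          # horizontal movement of the arm
    state["shoulder"] = clamp(state["shoulder"] - dy * scale)  # vertical movement of the upper arm
    return state

# Example: start centred, then move the mouse right (dx=20) and up (dy=-10)
angles = {"base": 90.0, "shoulder": 90.0}
print(mouse_to_servo_angles(20, -10, angles))   # {'base': 100.0, 'shoulder': 95.0}
```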
TRANSMITTER SECTION:
RECEIVER SECTION:
Working Principle:
Fig.5 Data Flow and block diagram for proposed system
Zigbee Protocol
From the wireless remote mouse control unit, the 2.4 GHz Zigbee transmitter sends signals to the main circuit board. Each address/data input can be set to one of two logic states.
The programmed address/data bits from the PIC16F628 are transmitted by RF. A transmitting antenna (TX antenna) sends the signals, via a receiving antenna (RX antenna), to the PIC16F877 microcontroller on the main board, which decodes the signal to perform the appropriate instructions. On the robot unit, the XBee receiver receives the signals from the 2.4 GHz RF receiver module through the RX antenna, and the PIC16F877 microcontroller executes the instructions transmitted from the PIC16F628. The microcontroller is programmed in assembly language; it processes the data input required for the servo motors and executes instructions for the direction and movement of the robot. The servo motors are driven with the necessary instructions, which enable the wheels of the motor car to move and the robotic arm to move in the required direction. The system data flow diagram is shown in Fig. 5.
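The frame layout below is not the paper's actual protocol; it is a hedged sketch of how a servo command could be packed on the transmitter side and validated on the receiver side of the Zigbee link, with an assumed start byte, command fields, and a simple checksum.

```python
START_BYTE = 0xAA   # assumed frame delimiter, not specified in the paper

def encode_command(servo_id, angle):
    """Pack a servo command into a 4-byte frame: start byte, servo id, angle, checksum."""
    angle = max(0, min(180, int(angle)))
    checksum = (servo_id + angle) & 0xFF
    return bytes([START_BYTE, servo_id, angle, checksum])

def decode_command(frame):
    """Validate and unpack a received frame; returns (servo_id, angle), or None if corrupt."""
    if len(frame) != 4 or frame[0] != START_BYTE:
        return None
    servo_id, angle, checksum = frame[1], frame[2], frame[3]
    if (servo_id + angle) & 0xFF != checksum:
        return None   # reject frames corrupted during RF transmission
    return servo_id, angle

# The transmitter would write encode_command(...) to the XBee serial port,
# and the receiver-side firmware would perform the equivalent of decode_command(...).
frame = encode_command(servo_id=2, angle=135)
print(decode_command(frame))   # (2, 135)
```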
With the control unit powered and in operation, the process of controlling the robot will proceed through the following:
- The wireless remote user sends a command signal to the robot's receiver.
- The robot's receiver receives the command signal sent from the user's wireless remote control.
- The robot's receiver decodes the signal and sends the command to the microcontroller.
- The microcontroller issues commands to the robot's parts, such as the wheels and motors, for movement and direction respectively.
PIC16F877A
TABLE I
DC Motor Operation
Algorithms of SEA and MFA
The x- and y-axis coordinates increase when the mouse moves right and down, respectively. The coordinates are redefined based on the screen resolution. The SEA automatically provides suggestions when the user moves the mouse to certain coordinates. (X, Y)current denotes the current coordinates of the mouse based on the resolution acquired by the Resolution Function. The MFA Function is developed for implementing a function in applications with the MFA. Agree-decision and Disagree-decision constants are set up for applications developed with the SEA to record the final decision of the user and the last coordinates of the mouse.
In the MFA, Tolerance_Value denotes the threshold value set by the user, and (X, Y)icon denotes the coordinates of the target icon the user wants to click. When |X_current - X_icon| or |Y_current - Y_icon| falls within the predefined Tolerance_Value, the system automatically provides notifications and suggestions given in the Suggestion Message. Finally, applications can record the preference of the user and store it in the database for training by implementing the Record Function. Training algorithms will be developed in the future.
Fig. 6 Mouse coordinates settings for SEA and MFA
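The tolerance check described above can be summarized by the short sketch below; the variable names follow the paper's description, while the suggestion text and example coordinates are illustrative assumptions.

```python
def mfa_check(current_xy, icon_xy, tolerance_value):
    """Return a Suggestion Message when the mouse is within Tolerance_Value
    of the target icon on either axis, as described for the MFA."""
    x_cur, y_cur = current_xy
    x_icon, y_icon = icon_xy
    if abs(x_cur - x_icon) < tolerance_value or abs(y_cur - y_icon) < tolerance_value:
        return "Suggestion: the target icon is nearby; confirm to proceed?"
    return None   # mouse is still far from the icon, no suggestion

print(mfa_check((812, 405), (800, 400), tolerance_value=25))  # within tolerance -> suggestion
print(mfa_check((100, 900), (800, 400), tolerance_value=25))  # far away -> None
```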
Advantages
Because the robotic arm is controlled by a mouse, it has many more movements than one controlled with an ordinary remote. A remote is also required to be in line of sight with the robotic arm, which is not needed for the wireless mouse. The robotic arm can cover a distance of more than a foot in the x direction with 180 degrees of rotation, allowing it to pick up objects. The controls of the robotic arm are simple to understand and implement. Other advantages are a simple kinematic model that is easy to visualize, good access into cavities and machine openings, and high power when hydraulic drives are used. The kinematic structure of the robotic arm allows us to position its gripper at any (x, y, z) location in 3D space within a specific range.
Applications
- Mechanical: The robotic arm has a variety of applications in the mechanical field. It can be used for welding in high-temperature environments that are usually dangerous for a human to work in. It is used by many automobile industries for the transport and placement of automobile parts.
- Medical: The robotic arm finds a number of applications in this field. Surgeons use artificial robotic hands for performing surgeries that require high precision and stability. They are useful in removing tumours and in performing cataract operations.
- Space: The Remote Manipulator System has robotic arms with many degrees of freedom, used for performing inspections with cameras and sensors attached at the gripper end. These robotic arms can be autonomous or manually controlled.
- Household: Robotic arms are not only useful in industries but also find a variety of applications in the household. Using a cordless mouse, they can be operated from a distance, proving beneficial for household chores such as cleaning, placing objects on a shelf, etc.
RESULT
In this experiment, the objective is to analyze how a human controls the robot with the mouse. We used the SEA- and MFA-based approaches, and the resulting images and output are included. In the center position, the x and y values are constant. For the up direction, the y-axis value decreases, and for the down direction, the y-axis value increases. Based on these values, the gesture direction is displayed on the system and the gesture is transmitted to the robot using the Zigbee protocol. Based on these gesture directions, the robot arm performs the corresponding movements.
Fig.7 Structure of a Robotic Arm
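A minimal sketch of this direction classification is given below, assuming a small dead zone around the center position; the dead-zone size is an illustrative value, not one reported in the experiment.

```python
def classify_gesture(prev_xy, curr_xy, dead_zone=5):
    """Classify mouse movement as center/up/down/left/right.
    Movements inside the assumed dead zone count as 'center'."""
    dx = curr_xy[0] - prev_xy[0]
    dy = curr_xy[1] - prev_xy[1]
    if abs(dx) <= dead_zone and abs(dy) <= dead_zone:
        return "center"                       # x and y effectively constant
    if abs(dy) >= abs(dx):
        return "up" if dy < 0 else "down"     # y decreases when the mouse moves up
    return "right" if dx > 0 else "left"

print(classify_gesture((400, 300), (402, 260)))   # y decreased -> 'up'
```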
CONCLUSION
In this paper we have proposed a mouse-controlled robotic arm that aims at providing assistance in industrial as well as domestic applications. We present a cost-effective, easy-to-operate robotic arm with good range. In conclusion, we propose an efficient artificial machine, in line with the recent developments in the robotics field. It can be improved further to upgrade the standard of living of human beings.
ACKNOWLEDGEMENT
I would like to thank our guide, Mr. S. Dinesh Sundar, Assistant Professor, Department of Electronics and Communication Engineering, Parisutham Institute of Technology and Science, Thanjavur, for his help and guidance in enabling us to propose this system.
REFERENCES
[1] C. Mitsantisuk, S. Katsura, and K. Ohishi, "Force control of human-robot interaction using twin direct-drive motor system based on modal space design," IEEE Trans. Ind. Electron., vol. 57, no. 4, pp. 1338-1392, Apr. 2010.
[2] S. Hirche and M. Buss, "Human-oriented control for haptic teleoperation," Proc. IEEE, vol. 100, no. 3, pp. 623-647, Mar. 2012.
[3] T. Ando, R. Tsukahara, and M. Seki, "A haptic interface 'Force Blinker 2' for navigation of the visually impaired," IEEE Trans. Ind. Electron., vol. 59, no. 11, pp. 4112-4119, Nov. 2012.
[4] K. Kiguchi, S. Kariya, and K. Watanabe, "An exoskeleton robot for human elbow motion support-sensor fusion, adaptation, and control," IEEE Trans. Syst., Man, Cybern. B, Cybern., vol. 31, no. 3, pp. 353-361, Jun. 2001.
[5] J. Kofman, X. Wu, T. J. Luu, and S. Verma, "Teleoperation of a robot manipulator using a vision-based human-robot interface," IEEE Trans. Ind. Electron., vol. 52, no. 5, pp. 1206-1219, Oct. 2005.
[6] J. H. Park, Y. D. Shin, J. H. Bae, and M. H. Baeg, "Spatial uncertainty model for visual features using a Kinect sensor," Sensors, vol. 12, no. 7, pp. 8640-8662, Jun. 2012.
[7] P. Vijaya Kumar, N. R. V. Praneeth, and Sudheer, "Hand and finger gesture recognition system for robotic application," International Journal of Computer Communication and Information Systems (IJCCIS), vol. 2, no. 1, ISSN: 0976-1349, Jul.-Dec. 2010.
[8] S. Verma, "Hand gestures remote controlled robotic arm," Advance in Electronic and Electric Engineering, ISSN 2231-1297, vol. 3, no. 5, pp. 601-606, 2013.
[9] G. Antonelli, S. Chiaverini, and G. Fusco, "A new on-line algorithm for inverse kinematics of robot manipulators ensuring path tracking capability under joint limits," IEEE Trans. Robot. Autom., vol. 19, no. 1, Feb. 2003.
[10] L. M. Muñoz and A. Casals, "Improving the human-robot interface through adaptive multispace transformation," IEEE Trans. Robot., vol. 25, no. 5, Oct. 2009.
[11] R. Marín, P. J. Sanz, P. Nebot, and R. Wirz, "A multimodal interface to control a robot arm via web: A case study on remote programming," Dept. of Computer Science and Engineering, University Jaume I, Castellón, Spain.