Bomb Defusing Robotic Arm using Gesture Control

DOI: 10.17577/IJERTV4IS020105




Siddharth Narayanan, Research Center Imarat, DRDO, Hyderabad, India 500069

C. Ramesh Reddy

Scientist-E, Research Center Imarat, DRDO, Hyderabad, India 500069

Abstract- In this paper, a human-machine interface for Explosive Ordnance Disposal based on gesture control is proposed. With Remotely Operated Vehicles playing a growing role in bomb defusing scenarios, aiding experts to locate, handle and destroy hazardous objects, new intuitive gesture-based systems modeled on human hand movements can make the control of a complex gripper arm instinctive. Integrating such an arm for more precise control in accurately manipulating explosive devices would allow the ROV to actively assist in defusing bombs as well. Currently, the control station used by experts is a laptop-like device consisting of a monitor showing the robot's point of view and its surroundings, plus a joystick and control panel to manipulate the arm and maneuver the tracks. In this study, an adaptive manipulation scheme is proposed through a communication interface between an Arduino Uno microcontroller, a Leap Motion controller and an OWI robotic arm. The results of the implementation demonstrate the ease of operation and effectiveness of gesture control as a technique.

Keywords- Explosive Ordnance Disposal (EOD), Leap Motion Controller, Remotely Operated Vehicle (ROV), H-Bridge Circuit, Gesture control

  1. INTRODUCTION

    The latest research and development has resulted in faster, more maneuverable, better equipped robots with a dexterity that could rival an explosive ordnance disposal technician operating in person. Electrically powered, remotely controlled bomb defusing systems now play pivotal roles. One of the most widely used bomb-disposal robots today is the Cobham tEODor, a twin-track bomb-disposal vehicle with a host of military applications. iRobot's PackBot was massively popular with US and international coalition forces in Iraq and Afghanistan. Northrop Grumman rolled out CUTLASS for the British Armed Forces in 2012 as a next-generation Unmanned Ground Vehicle. Even the Defence Research & Development Organisation, India broke new ground in 2011 when it handed over the first batch of Daksh Remotely Operated Vehicles (ROVs) to the Indian Army.

    While technology now allows bomb disposal teams to stay as far away from their work site as humanly possible, preferably interacting only via remote-controlled, expendable robots, bomb disposal remains an extremely delicate job, and most robots lack the finesse to properly disarm a bomb. For example, the robotic arm used by the bomb squad in San Francisco in 2011 dropped a grenade it was carrying and then drove over it, even though the driver of the robot was completely oblivious to the incident [7]. The gripper was unable to hold the bomb while carrying it out of a garage, and once it dropped the bomb, the robot took four minutes to pick it up again, exposing the deficiencies in the system. Even then, the robot did not have the capability to disarm the grenade; instead, it had to hand the explosive to a person to disarm it. Only a few current designs can safely retrieve an explosive after it has fallen.

    More complex grippers with the ability to adapt their grasp to different objects can increase the stability of the grasp as well as make the entire process faster and more reliable [8]. Since gestural control is both intuitive and hands-free, operating a device by gesture demands much less of the operator's attention and provides an immersive environment for far more precise manipulation of objects.

    The paper is organized as follows. Section 2 gives a brief overview of the Leap Motion technology. Section 3 discusses gesture control as a means for more precise remote handling of dangerous devices. Section 4 describes the implementation, and the results are presented in Section 5. Section 6 gives an insight into applications. Section 7 provides the conclusion, and references follow in Section 8.

  2. RESEARCH ELABORATIONS

    The Leap Motion controller introduces a novel gesture and position tracking system that can track all 10 human fingers simultaneously with sub-millimeter accuracy [1]. Its low purchase cost, ease of use and highly active developer base make it an ideal platform on which to demonstrate intuitive and adaptive manipulation together with the OWI 535 arm.

    The evolution of and increasing research in computer interaction has produced a large set of input devices. The keyboard and mouse have long been the main instruments for providing input to a computer. Newer technologies, however, allow a more natural and intuitive exchange of information: communication between people and sensor-based devices increasingly mimics human-to-human interaction. Innovative technologies empower users to be more natural and spontaneous, while systems adopting these technologies show increased efficiency, speed, power, and realism. Many users are so comfortable with traditional interaction methods like mice and keyboards that they are often unwilling to embrace new, alternative interfaces. A possible reason is the complexity of these new technologies; users often find it off-putting to spend a long time learning and adapting to new devices. Gesture-based human-computer interaction offers a potential solution to this problem, since gestures are among the most basic and expressive forms of human communication [2].

    In the last few years, different optical sensors have been developed that allow the mapping and acquisition of 3-D information. Many applications have been introduced to exploit the increasing robustness and decreasing cost of 3-D sensors [3]. Applications range from industrial use, object tracking, motion detection and analysis, to 3-D scene reconstruction and gesture-based human-machine interfaces [4]. These applications differ in their requirements for resolution, frame-rate throughput, and operating distance. Sensor accuracy is a particular challenge in gesture-based user interfaces [3, 5].

    The Leap Motion Controller is based on infrared optics and cameras rather than depth sensors. Its motion sensing precision detects each fingertip of the user to within approximately 0.01 mm at frame rates of up to 300 fps, and it offers a very wide field of view [6], capturing all of the user's real-world movements in 3-D. Information on the Cartesian position of predefined objects such as the fingertips or palm is provided through the manufacturer's Software Development Kit (SDK), along with information on the rotations of the hand (e.g. roll, pitch, and yaw). All delivered positions are measured relative to the Leap Motion Controller's center point.
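    To make the data concrete, the sketch below polls a single tracking frame and reads out the palm position and orientation angles described above. It uses the Leap Motion C++ API rather than the Java bindings employed later in this paper, and the connection wait and output format are illustrative choices, not part of the original implementation.

```cpp
// Minimal sketch (Leap Motion C++ API): read one frame and print the palm pose.
#include <cstdio>
#include "Leap.h"

int main() {
    Leap::Controller controller;            // connects to the Leap service
    while (!controller.isConnected()) { }   // crude wait until the device is ready

    Leap::Frame frame = controller.frame();        // latest tracking frame
    Leap::Hand hand = frame.hands().frontmost();   // hand closest to the screen
    if (hand.isValid()) {
        Leap::Vector p = hand.palmPosition();      // mm, relative to device center
        float roll  = hand.palmNormal().roll();    // radians
        float pitch = hand.direction().pitch();
        float yaw   = hand.direction().yaw();
        std::printf("x=%.1f y=%.1f z=%.1f  roll=%.2f pitch=%.2f yaw=%.2f\n",
                    p.x, p.y, p.z, roll, pitch, yaw);
    }
    return 0;
}
```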

  3. PROPOSED CONCEPT

    After surveying the literature on object manipulation and the limitations of typical EOD grippers, a robotic hand design is proposed that can mimic the motion of the user's hand and could be attached to bomb disposal robots. The proposed concept relies on what is termed natural human-robot interaction. Such a system would combine the ability to perform complex tasks, as in soft robotics, with the force and stability of a mechanical arm. The growing number of improvised explosive devices and their increasing prevalence in conflict areas over the past few years [13] make a strong case for more advanced systems to deal with threats, especially in urban areas where the gripper must perform tasks in small spaces. In the application of bomb disposal, a gripper must be strong enough to accomplish all of the tasks current robots are capable of, such as opening doors and carrying bombs; yet it must also be dexterous enough to reach into an explosive device and disarm it.

    The Leap Motion Controller is used to operate the 5-DOF OWI robotic arm in a more intuitive way than the conventional approach via keyboard or joystick, which, depending on the complexity of the robotic arm, requires a series of configuration and mode-selection routines, with sequences of button presses to select an operating mode or perform a specific trajectory. The controller monitors the user's hand(s) and fingers along with all the accompanying positions and angles. All information on the user's palm position in Cartesian space is retrieved from the controller and fed to the algorithm. The algorithm uses the current and previous readings to achieve an optimum, realistic mapping between the user's real arm and the robotic arm, while the gripper is programmed to follow all grasp and release operations performed by the user's fingers. Additionally, the hand's angular features, namely the roll, pitch, and yaw angles, are incorporated into the mapping procedure, enabling a more realistic imitation of the human arm.

  4. IMPLEMENTATION

    The user's hand movements are captured by the Leap Motion Controller and sent to the computer. The software algorithm performs all necessary computations, and information is exchanged with the Arduino Uno via Bluetooth. Additional sensors, actuators, and display systems can also be attached via the microcontroller board. The Arduino board in turn drives the motors of the robotic arm.

    Figure 1: Information flow between the different entities, from the user's hand movements to the robotic arm output

    1. Hardware

      The ATmega328-based Arduino Uno microcontroller outputs controlling voltage signals to the DC motors and receives feedback from the robotic arm joints (via potentiometers on the DC motors only). The H-bridge circuit acts as a selective switch to turn each motor clockwise or counter-clockwise according to the desired angle. The hardware is powered by an external high-current DC power supply and the Arduino 5V output pin. L293 drivers were used to drive the motors on the OWI arm. An H-bridge motor driver is a common way to drive DC motors in two directions under computer control. H-bridges can be built from scratch with bipolar junction transistors (BJTs) or field-effect transistors (FETs), or purchased as an integrated unit in a single IC package such as the L293. Each L293 can control two motors, using PWM for speed control. Since the robotic arm has five motors, three motor drivers were required.
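      As an illustration of this driving scheme, the following Arduino sketch runs one joint motor through a single L293 channel. The pin assignments and speed values are assumptions for demonstration, not the wiring used on the actual arm.

```cpp
// Minimal Arduino sketch: one DC motor on one L293 channel.
const int IN1 = 7;   // direction input 1A (assumed pin)
const int IN2 = 8;   // direction input 2A (assumed pin)
const int EN  = 9;   // enable pin, PWM-capable, sets speed (assumed pin)

void setup() {
  pinMode(IN1, OUTPUT);
  pinMode(IN2, OUTPUT);
  pinMode(EN, OUTPUT);
}

// speed in [-255, 255]: sign selects rotation direction, magnitude sets PWM duty
void driveJoint(int speed) {
  digitalWrite(IN1, speed > 0 ? HIGH : LOW);
  digitalWrite(IN2, speed < 0 ? HIGH : LOW);
  analogWrite(EN, constrain(abs(speed), 0, 255));
}

void loop() {
  driveJoint(200);   // clockwise
  delay(1000);
  driveJoint(-200);  // counter-clockwise
  delay(1000);
  driveJoint(0);     // stop: both direction inputs LOW
  delay(1000);
}
```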

    2. Software

      A computer's built-in Bluetooth module is used to send Leap Motion data wirelessly to a Bluetooth-enabled Arduino. The Arduino is interfaced with Java, which communicates with the serial port via the RXTX Java library. The Eclipse IDE was used with the correct Java Runtime Environment, and the laptop's Bluetooth serial port was referenced in the code. LeapJava.jar (along with the native DLLs) and RXTXcomm.jar were added to the Eclipse project.
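      The paper does not specify how the commands sent over the Bluetooth serial link are framed, so the Arduino-side sketch below assumes a simple two-byte protocol (a motor index digit followed by a direction character) purely for illustration.

```cpp
// Assumed command framing: '0'..'4' selects a joint, then '+'/'-'/'0' selects
// forward, backward, or stop. This protocol is illustrative, not the paper's.
void setup() {
  Serial.begin(9600);   // Bluetooth module bridged to the UART
}

void loop() {
  if (Serial.available() >= 2) {
    char motor  = Serial.read();   // which of the five joints
    char action = Serial.read();   // direction or stop
    int idx = motor - '0';
    if (idx >= 0 && idx < 5) {
      switch (action) {
        case '+': /* drive the selected joint forward via its L293 channel */ break;
        case '-': /* drive the selected joint backward */ break;
        default:  /* stop the selected joint */ break;
      }
    }
  }
}
```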

    3. Gripper

      The gripper of the robotic arm was programmed to follow all grasp and release operations performed by the user; Figure 6 depicts a demonstration of a grasp and release routine. The hand's angular characteristics such as roll, pitch, and yaw were also considered to enable a more realistic imitation of the user's arm. A two-finger gesture was used to control the simple gripper of the OWI robotic arm, and a similar line of thought can be used to implement more complex systems with higher degrees of freedom and better gripper configurations. The connection with the Arduino Uno microcontroller allows additional sensors, buttons or display systems to be implemented.
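      The detection logic behind the grasp/release gesture is not given in the paper; one plausible realization, sketched below with the Leap C++ API (SDK v2, which exposes a grab-strength measure), maps hand closure to open/close commands. The thresholds and command names are assumptions.

```cpp
// One possible trigger for the grasp/release routine, not the paper's exact logic.
#include "Leap.h"

enum GripperCommand { GRIP_CLOSE, GRIP_OPEN, GRIP_HOLD };

GripperCommand gripperCommand(const Leap::Hand& hand) {
    // grabStrength() runs from 0.0 (fully open hand) to 1.0 (closed fist)
    float grab = hand.grabStrength();
    if (grab > 0.8f) return GRIP_CLOSE;   // near-fist -> close the gripper
    if (grab < 0.2f) return GRIP_OPEN;    // open hand -> open the gripper
    return GRIP_HOLD;                     // in between -> leave gripper as-is
}
```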

    4. Threshold Value

      The threshold is a predefined value set by applying a conventional filtering technique (a moving average filter) to the readings, allowing the extreme noisy signals caused by hand tremor to be filtered out. Information on the Cartesian position of the user's arm is obtained through the Leap Motion SDK and fed to the Arduino IDE code. The algorithm uses the current and previous readings supplied by the Leap controller to achieve an optimum mapping between the user's real palm position and the OWI robotic arm position.
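      A minimal sketch of such a moving average filter is given below; the window length of eight samples is an assumed value, not one reported in the paper.

```cpp
// Moving-average filter over the last N palm readings, used to damp
// hand-tremor noise before the threshold comparison.
#include <cstddef>

class MovingAverage {
    static const std::size_t N = 8;   // assumed window length
    float buf[N];
    std::size_t count = 0, head = 0;
    float sum = 0.0f;
public:
    float update(float sample) {
        if (count == N) sum -= buf[head];   // full window: drop the oldest sample
        else ++count;                       // still filling the window
        buf[head] = sample;
        sum += sample;
        head = (head + 1) % N;              // advance the ring-buffer index
        return sum / count;                 // smoothed reading
    }
};
```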

      Figure 3: Overview of the mapping algorithm. NR is the new reading from the newly arrived frame, OR is the old reading from the previous frame, and T is the threshold value against which the difference of the readings is compared to decide on the necessary action

    5. Mapping

      The algorithm is designed to control both the Cartesian motion (X, Y, and Z) and the angular motion (roll, pitch, and yaw) of the robotic arm. It compares the reading of the previous Leap Motion frame with that of the new frame and acts accordingly: when the difference between the readings exceeds the threshold, the arm moves in the positive or negative direction, while a motion is ignored if the difference is less than the threshold.
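      The per-axis decision reduces to a few lines; the sketch below mirrors the NR/OR/T comparison of Figure 3, with the command names serving as placeholders for the serial commands sketched earlier rather than the paper's actual interface.

```cpp
// Per-axis decision from Figure 3: act only when the new reading (NR) differs
// from the old reading (OR) by more than the threshold (T).
enum AxisCommand { MOVE_POSITIVE, MOVE_NEGATIVE, NO_MOTION };

AxisCommand mapAxis(float newReading, float oldReading, float threshold) {
    float diff = newReading - oldReading;
    if (diff >  threshold) return MOVE_POSITIVE;  // NR - OR > T
    if (diff < -threshold) return MOVE_NEGATIVE;  // OR - NR > T
    return NO_MOTION;                             // |NR - OR| <= T: ignore
}
```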

  5. RESULTS

    All components of the final system performed as expected, allowing the robotic arm to move, open and close according to the user input. Rotation of the bottommost DC motor of the OWI arm was programmed to follow the x-axis of the Leap Motion controller, while the other four motors were programmed to work on the z-axis (forward and backward) and y-axis (up and down). The gesture-based scheme was also successfully tested by picking up and grabbing an object with the gripper arm. Images of the actual arm are provided in Figure 4, showing the movement of the gripper arm.


    Figure 4: (a) Robotic gripper in open state (b) Robotic gripper arm in closed state (c) L293 Driver circuit (d) Gripper arm grabbing an object

    The number of degrees of freedom gives a fair measure of the abilities of a robotic arm, and essentially refers to the number of separately controlled joints in the gripper arm. Grippers most commonly have two components that move independently, allowing for either one or two degrees of freedom; for comparison, the human hand comprises 27 degrees of freedom [10]. More degrees of freedom allow a gripper to grab and manipulate objects more easily and in a greater number of ways. When there are only a few degrees of freedom, the operator must first position the gripper arm accurately with respect to the device, and the gripper in turn must exert large forces on the object to maintain a sturdy grip [8]. Thus the stability and sensitivity of both the arm and the explosive are important factors to consider. Greater degrees of freedom allow a greater number of points of contact, which make the grasp sturdier while also reducing the force applied on the object.

  6. APPLICATION

    A more advanced system would allow the ROV to rotate, move and manipulate the explosive device with the support of the gripper arm. Designing grippers with many degrees of freedom often makes it challenging to control movements accurately: higher degrees of freedom can lead to complex, hard-to-follow interfaces for communicating with the gripper, which in turn requires specially trained expert operators. Gesture-based control would make the operation of complex robotic arms instinctive and easy. End effectors, or grippers, for bomb disposal robots are constantly under development; however, current grippers still have major limitations. The function of most grippers is limited to accessing, moving, and detonating explosive devices safely, but more advanced systems could be designed to completely replace the manual task of disarming a bomb. Furthermore, programmed robotic arms could be coupled with autonomous ROVs to handle certain types of devices independently, requiring minimal human intervention. Currently, only some specialized bomb disposal robots can be used to disarm specific types of bombs. The SAPBER robot, for example, can remove the end cap from pipe bombs and allow bomb experts to examine the inner materials; this system, however, is highly specialized and would be useless against other bombs, or even against pipe bombs made differently from traditional ones [11]. The challenge remains to make an effective bomb disposal robot that can disarm bombs with the same capability and adaptability as a human bomb expert. The gesture-based robotic arm interface would also allow much more flexibility in improvising and adapting to changing situations.

    While the primary application of the hand proposed in this paper is to be attached to an EOD robot and used to disarm bombs, numerous other applications exist for a similar system. Future projects in extreme environments could be aided by this hand, with suitable modifications to improve consistency and dexterity. The technique is especially useful in dangerous and risky application areas such as space exploration, surveillance, surgery, nuclear plants, and underwater operations. Nowadays, the use of tele-operation systems is spreading to non-hazardous environments as well; they are widely used around the world in applications ranging from space to entertainment [9]. Moreover, work from home for disabled people is now becoming possible. The business sector has also been positively affected by the introduction of such systems: operating costs fall as the real operator's share of the control process is reduced and the virtual operator's share is increased.

  7. CONCLUSION

    A human-machine interface is proposed for the intuitive manipulation of a robotic arm in a remote EOD system. The main objective of this study is to introduce a simple and straightforward robotic arm manipulation scheme, enabling the incorporation of robotic systems into ROVs to enhance their independence. After constructing a robotic hand according to the proposed design, it was concluded that the basic design of the hand was effective. All systems worked as designed, but limitations in the materials used reduced the efficiency and dexterity of the system. Future extensions could focus on programming the system to record the user's motions and later replay the same actions through the arm.

  8. REFERENCES

  1. Weichert, F., Bachmann, D., Rudak, B., Fisseler, D., Analysis of the accuracy and robustness of the Leap Motion controller, Sensors, 13(5) (2013) 6380-6393.

  2. Wachs, J.P., Kölsch, M., Stern, H., Edan, Y., Vision-Based Hand-Gesture Applications, Communications of the ACM, 54(2) (2011) 60-71.

  3. Khoshelham, K., Elberink, S.O., Accuracy and resolution of Kinect depth data for indoor mapping applications, Sensors, 12 (2012) 1437-1454.

  4. Biswas, K.K., Basu, S., Gesture Recognition using Microsoft Kinect, Proceedings of the IEEE International Conference on Automation, Robotics and Applications (ICARA), Delhi, India, 6-8 December 2011.

  5. Stoyanov, T., Louloudi, A., Andreasson, H., Lilienthal, A.J., Comparative Evaluation of Range Sensor Accuracy in Indoor Environments, Proceedings of the European Conference on Mobile Robots (ECMR), Sweden, September 2011, pp. 19-24.

  6. Leap Motion | Mac & PC Motion Controller for Games, Design, & More. 2014. Available at: http://www.leapmotion.com. [Accessed January 2014].

  7. Clumsy! The bomb disposal robot that dropped a live grenade (February 2011). Retrieved on 9/28/2012 from http://www.dailymail.co.uk

  8. Massa, B., Roccella, S., Carrozza, M.C., Dario, P., Design and Development of an Underactuated Prosthetic Hand, Proceedings of the 2002 IEEE International Conference on Robotics and Automation, May 2002, pp. 3374-3379.

  9. Pala, M., Lorencik, D., Sincak, P., Towards the Robotic Teleoperation Systems in Education, ICETA 2012, 10th IEEE International Conference on Emerging eLearning Technologies and Applications, November 2012, pp. 241-246.

  10. Smagt, P., Grebenstein, M., Urbanek, M., Fligge, N., Strohmayr, M., Stillfried, G., Parrish, J., Gustus, A. (2009) Robotics of human movements, Journal of Physiology-Paris, 103, 119-132. doi:10.1016/j.jphysparis.2009.07.009

  11. Lecher, C. (2012) A New Robot Dismantles Pipe Bombs While Leaving Forensic Evidence Intact. Retrieved 12/4/12 from www.popsci.com

  12. New Robots Planned for Bomb Disposal Teams (February 2011). Retrieved on 9/28/2012 from http://www.dailymail.co.uk

  13. Norton-Taylor, Richard (2014) More than 53,000 civilians killed or injured by IEDs in three years, The Guardian, July 2014. http://www.theguardian.com/world/2014/jul/03/ieds-kill-53000-civilians-in-three-years-70-per-cent-rise [Accessed 4 November 2014].
