Gesture-based Flying Robot for Precision Agriculture

DOI: 10.17577/IJERTCONV5IS06010


Chandan H S1, Ankitha Jain M B2, Roja K N3, Vivek B P4, Shubha C5

Department of Information Science and Engineering, Siddaganga Institute of Technology, Tumakuru, Karnataka-572103

Abstract: In traditional agricultural practice, farmers must visit their fields to identify pests, insects, weeds and animal attacks, to monitor crop condition and water levels, to spray pesticides and to carry out many other tasks, all of which are time consuming and demand considerable human resources. To reduce human effort and save time, this paper proposes an intuitive way to control a flying robot using predefined hand gestures. The flying robot captures the present condition of the field, and the collected data are analysed with precision-agriculture software to estimate the current status of the crop. A detailed report of the crop condition, together with suggestions and guidance for boosting yield using precision agricultural techniques, is then provided to the farmer.

Index Terms: Precision Agriculture, Gesture Recognition, Leap Motion Controller.

  1. INTRODUCTION

    In rural India, agriculture is the main source of livelihood and a significant contributor to the Indian economy. At present, many problems persist in the field because of traditional agricultural practice [3]. Traditional practice includes identifying pests, insects, weeds and animal attacks, monitoring crop condition and health, maintaining water levels, spraying pesticides and many other tasks. Farmers need to visit their fields frequently to check crop conditions; if the field is large and far from their residence, they must hire additional labourers and pay wages for each agricultural task [6]. This consumes a great deal of time and effort and becomes costly as the land area grows. To overcome this problem, the key idea is to use drone technology to handle many of the agricultural tasks: images are captured through the flying robot's inbuilt HD camera and the gathered data are analysed using precision-agriculture software to suggest an appropriate response to the identified crop condition.

    Using a flying robot for precision agriculture is an emerging approach, and this paper presents a framework for controlling such a robot and applying it in agriculture. There are currently two types of Unmanned Aerial Vehicles (UAVs) available in the market: autonomous aircraft and remotely piloted aircraft. Remotely piloted aircraft are difficult to control with hand-held remotes and require professional training. The proposed system instead applies the concept of human-computer interaction (HCI) based on gestures and postures, implementing a so-called natural user interface (NUI) [4]. Gesture interfaces play a central role in human-machine communication because gestures are a direct expression of mental concepts. For this reason, NUIs are increasingly used to communicate with and control robots in a natural and efficient way.

    The idea is to control the flying robot by means of natural, easily learnable hand gestures rather than complex hand-held remotes. The operator makes gestures in front of a gesture-recognition device, which senses them and sends them to a computing device for processing. The processed information serves as a command to control the flying robot. The gesture-based flying robot is an effective solution to existing problems in agriculture, reducing time and cost while boosting crop yield.

  2. RELATED WORK

    Numerous studies comparing gesture control with remote control, and examining how well an effective replacement can be designed, have already been published. To understand the methods followed in previous and current agricultural practice, and to provide an effective solution to the identified problems, the following aspects are considered.

    1. Agriculture

      Traditional agricultural practice includes ploughing fields with bullock ploughs, sowing seeds by hand, visiting fields to inspect for weeds and pests, monitoring crop condition and water levels, spraying pesticides and insecticides, applying fertilizers, harvesting, irrigating the land, predicting weather from experience, testing soil manually, clearing weeds, preventing animal attacks with manual sounding devices such as crackers, using bullock carts for transportation, and many other activities [3].

      To reduce some of this manual work, machinery such as tractors and tillers is used for ploughing, sowing, soil cultivation, transportation and so on. With advances in technology, Unmanned Ground Vehicles (UGVs) have been developed to carry out agricultural tasks such as harvesting, inspection and weed clearance, but some UGVs have not worked effectively: they need extra space between the crop rows for movement, they can disturb crop growth, and they do not operate well in wet soil.

    2. Unmanned Aerial Vehicles

      An unmanned aerial vehicle (UAV), part of an unmanned aircraft system (UAS) and commonly known as a drone, is an aircraft without a human pilot aboard. UAVs can operate in situations where manned aircraft cannot be used, for example where a machine or place is too dangerous for a human to handle. There are currently two types of UAVs, or flying robots: autonomous aircraft and remotely piloted aircraft. Autonomous aircraft fly without a human pilot and with little or no user intervention. A radio-controlled aircraft (often called an RC aircraft or RC plane) is a small flying machine controlled remotely by an operator on the ground using a hand-held radio transmitter. Controlling an RC plane requires training and basic knowledge of aircraft; among the various ways of learning to operate the remote, the RC flight simulator is the most popular and basic method of RC airplane training.

    3. Human Computer Interaction

      Human-Computer Interaction (HCI) is the field of computer technology that provides a platform for users to interact with computers; this interaction requires an interface. A Natural User Interface (NUI) is one that can be controlled with the user's voice or gestures and postures rather than through a conventional Graphical User Interface (GUI). Gesture-based interfaces fall under NUI and play an important role in Human-Machine Interaction (HMI). Human-Robot Interaction (HRI) is a subset of HCI; in its early stages, HRI systems used a glove fitted with a microcontroller and wired to cameras to communicate gestures, and many other wearable devices have since been used by human operators to control robots [4].

    4. Precision Agriculture

    Precision Agriculture (PA) aims to grow quality crops while reducing time, cost and environmental impact, and it is one of the major applications of Information Technology (IT) in farming. Broadly, any farming practice that uses technology to gain a measurable benefit can be termed precision agriculture. Precision agriculture changes traditional agricultural practices and farming methodologies by using technology to improve efficiency, accuracy and yield [5]. As technologies emerge, new agricultural techniques linked to precision agriculture need to be adopted.

  3. METHODOLOGY

    1. Proposed System

      Considered practically, traditional agricultural practice does not support mass food production: the yield mostly serves local consumption and the work is time consuming. To overcome the limitations of traditional practice and of Unmanned Ground Vehicles (UGVs), we identified the flying robot (UAV) as a platform that can handle many agricultural tasks more effectively, in terms of both time and effort, than manual work. The operator takes pictures and videos using the flying robot's inbuilt HD camera. These images reveal plant stress caused by nutrient deficiency, water stress, disease and insect infestation, as well as overall health status, crop damage for insurance claims, weed infestation and soil quality, and they can be used to build soil maps; such imagery shows changes in crop vigour sooner than they become visible to the eye. Sophisticated post-processing software is used to generate NDVI images, high-resolution geo-located photographs and 3D surface gradient maps. A typical flight captures as many images as the storage capacity allows, and the software stitches them together into a single high-resolution image.
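      The NDVI mentioned above is derived from the near-infrared (NIR) and red bands of the captured imagery as NDVI = (NIR - Red) / (NIR + Red). The short Python sketch below illustrates the computation with NumPy on toy band arrays standing in for the stitched mosaic; the array shapes and values are assumptions for illustration only, not the output of any particular post-processing package.

import numpy as np

def compute_ndvi(nir, red, eps=1e-6):
    """Per-pixel NDVI = (NIR - Red) / (NIR + Red)."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / (nir + red + eps)  # eps guards against division by zero

# Toy reflectance arrays standing in for the NIR and red channels of the mosaic.
rng = np.random.default_rng(0)
nir_band = rng.uniform(0.2, 0.9, size=(200, 200))
red_band = rng.uniform(0.1, 0.6, size=(200, 200))

ndvi = compute_ndvi(nir_band, red_band)
print("Mean NDVI over the field: {:.2f}".format(ndvi.mean()))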

      Remotely piloted aircraft are difficult to control, uncomfortable to operate, poorly received by pilots, hard to fly over long sessions and require professionally trained pilots. To overcome the limitations of such electro-mechanical devices, gesture and body-posture recognition techniques have been introduced; body postures can be recognized either with wearable sensors or with vision-based techniques [1]. These give humans a natural, expressive and intuitive way to control a robotic system, and one benefit is that they provide natural ways to send geometrical information to the robot, such as up or down. Humans mainly use the body as a communication medium that is easily understood, which is why gesture interfaces are used for communication [4]; children, for example, learn to communicate with gestures and body signs before they learn to speak, which illustrates that gestural language is a universal form of communication. We therefore propose an intuitive approach to controlling these flying robots using simple hand gestures rather than complicated hand-held remotes. Our aim is to develop an improved and more intuitive method of piloting through natural hand gestures, making piloting more enjoyable overall and creating a control system accessible to the most novice users.

      The main aim is to build a framework for interaction between the human and the flying robot. The user is the controller, so a new form of HRI can be experienced. Our project allows a user to control a flying robot (a type of UAV) using hand gestures. The operator makes these gestures in front of a Leap Motion device, which senses them. Our design is composed of two main parts: gesture recognition and flying-robot control.

    2. High Level System Diagram

      Fig. 1 represents the high-level working of the proposed system. The operator's hand gestures are tracked by the Leap Motion device, which is connected to the computing device via USB. The tracked information from the Leap Motion controller is sent to the computing device for processing, where the hand gestures are interpreted as commands and translated into control signals sent to the flying robot over Wi-Fi. Fig. 2 shows the connection established between the Leap Motion controller and the flying robot, with the computing device acting as the interface between the two. According to the gestures given by the user, the flying robot can thus be controlled with bare hands; the objective is to fully control the flying robot's (drone's) movements using a natural controller, the human.

      Fig.1: High level working model of proposed system
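      The sketch below illustrates this pipeline in Python. It assumes the Leap Motion SDK (v2) Python bindings are installed and that the flying robot accepts simple plain-text commands over a UDP socket; the command strings, drone address and position thresholds are hypothetical placeholders, since the actual drone protocol will differ.

import socket
import time

import Leap  # Leap Motion SDK (v2) Python bindings, assumed to be installed

DRONE_ADDR = ("192.168.1.1", 5556)  # hypothetical drone IP and command port
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def frame_to_command(frame):
    """Map the first tracked hand's palm position to a simple textual command."""
    if frame.hands.is_empty:
        return "HOVER"
    palm = frame.hands[0].palm_position  # millimetres relative to the device
    if palm.y > 250:
        return "UP"
    if palm.y < 150:
        return "DOWN"
    if palm.x > 80:
        return "RIGHT"
    if palm.x < -80:
        return "LEFT"
    return "HOVER"

controller = Leap.Controller()
while True:
    command = frame_to_command(controller.frame())
    sock.sendto(command.encode(), DRONE_ADDR)  # control signal sent over Wi-Fi
    time.sleep(0.05)  # roughly 20 commands per second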

      The Graphical User Interface (GUI) lets the operator remotely monitor many parameters of the platform, including navigation, flight altitude, battery level and streaming video from the flying robot's inbuilt cameras. The images captured by the flying robot are stored on the computer for further analysis, and the collected image sets are sent for processing. Precision-agriculture software and digital image processing techniques are used to determine the crop condition based on parameters prescribed by agro scientists. The detailed test results are studied to identify problems in the crops, and appropriate solutions, such as suitable fertilizers, pesticides, insecticides or herbicides to be applied in specified ratios, are suggested. Guidance on weather conditions, on growing crops suited to the soil type and climate, and other agricultural information is included with the documented report. The complete report is preferably prepared in the regional language in an easily understandable manner and finally handed over to the farmer.
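      As a simplified illustration of how such image-derived parameters can feed the report, the toy sketch below thresholds an NDVI map to estimate the stressed fraction of the field; the threshold values and advice strings are assumptions for illustration, not the parameters prescribed by agro scientists.

import numpy as np

def stress_report(ndvi, healthy_threshold=0.4):
    """Flag pixels whose NDVI falls below an assumed healthy threshold."""
    stressed_fraction = float((ndvi < healthy_threshold).mean())
    if stressed_fraction > 0.30:
        advice = "Large stressed area: field inspection and corrective inputs advised."
    elif stressed_fraction > 0.10:
        advice = "Localised stress: spot treatment of the flagged zones advised."
    else:
        advice = "Crop appears healthy overall."
    return stressed_fraction, advice

# Toy NDVI map standing in for the one computed from the stitched mosaic.
rng = np.random.default_rng(1)
ndvi_map = rng.uniform(-0.1, 0.9, size=(200, 200))

fraction, advice = stress_report(ndvi_map)
print("Stressed area: {:.1%}".format(fraction))
print(advice)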

      Fig.2: Connection between system components

    3. Control Flow of Proposed System

      The gesture interface developed to control the flying robot uses the concept of machine learning. The proposed approach consists of three steps: training, detection and recognition. The complete control flow of the proposed system is shown in Fig. 3.

      As soon as the connection between the Leap Motion and the computing device is established, a signal is generated from the IR sensor; if there is no signal, the connection is checked and the system again waits for the IR signal. Once the LED glows to indicate the ON state, the user can give hand gestures. The Leap Motion controller captures the hand movements frame by frame and is responsible for recording information about the operator through video and depth sensing; the captured information is sent for preprocessing. If no hand is detected by the Leap Motion, the hand gesture must be given again. Otherwise the contour of the hand is detected, and features are extracted from the current frame and compared with features computed over a reference image. The Leap Motion returns a set of relevant hand points and hand-pose features, which are extracted by recognizing the operator's hand and mapping the hand movements to the predetermined gestures learnt in the training phase. The extracted key points are the coordinates of the finger positions in the input gesture, and at runtime distances are calculated from these feature points. If the extracted features do not match a predefined gesture, the gesture must be given again; otherwise the gesture is recognized. The recognized gesture is converted into control signals that serve as commands to the flying robot, which flies accordingly.

      Fig.3: Control flow of proposed system
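      The recognition step described above, extracting finger-position coordinates, computing distances from the feature points and matching them against gestures learnt in training, can be sketched as a nearest-template classifier. The snippet below is a simplified illustration with assumed gesture names, feature layout and matching threshold, not the exact recogniser used in the system.

import numpy as np

# Hypothetical training output: one averaged feature vector per gesture,
# here the distances of the five fingertips from the palm centre (in mm).
GESTURE_TEMPLATES = {
    "TAKE_OFF": np.array([60.0, 75.0, 80.0, 75.0, 65.0]),
    "LAND":     np.array([20.0, 22.0, 24.0, 22.0, 20.0]),
    "FORWARD":  np.array([25.0, 78.0, 82.0, 24.0, 22.0]),
}
MATCH_THRESHOLD = 15.0  # assumed maximum mean deviation for a valid match

def extract_features(palm_centre, fingertips):
    """Distance of each fingertip from the palm centre."""
    return np.linalg.norm(np.asarray(fingertips) - np.asarray(palm_centre), axis=1)

def recognise(palm_centre, fingertips):
    """Return the closest gesture template, or None if nothing is close enough."""
    features = extract_features(palm_centre, fingertips)
    best_name, best_dist = None, float("inf")
    for name, template in GESTURE_TEMPLATES.items():
        dist = float(np.abs(features - template).mean())
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist < MATCH_THRESHOLD else None

# Example: palm centre and fingertip coordinates (x, y, z in mm) from one frame.
palm = [0.0, 200.0, 0.0]
tips = [[-40, 210, -30], [-20, 260, -45], [0, 270, -50], [20, 260, -45], [40, 215, -35]]
print(recognise(palm, tips))  # matches the open-hand "TAKE_OFF" template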

  4. ADVANTAGES

    1. Eliminates the need for professional training to control flying robots.

    2. Reduces human effort by simplifying the tasks.

    3. Saves time and boosts the yield of crops.

    4. Avoids crop destruction caused by manual inspection.

    5. Useful in field surveys.

    In future, the proposed idea may provide safe, fast, accurate and cost-effective alternatives to traditional manned systems. It would contribute to economic growth, increase exports and motivate people to take up agriculture. It may also bring about a revolution in the Indian agriculture sector similar to the Green Revolution of the 1960s.

  5. CONCLUSION

    The gesture-based flying robot is an effective solution to existing problems in agriculture and related fields. The aim is to serve farmers by reducing their effort, simplifying tasks, saving time and boosting crop yield, and this is achieved by combining available technologies with innovation. The proposed idea supports precision agriculture by overcoming the limits of traditional agricultural practice, and it also serves as a substitute for remotely controlled aircraft.

REFERENCES

  1. Richard Szeliski, Computer Vision: Algorithms and Applications, Springer Science & Business Media, 2010.

  2. Roger Pressman, Software Engineering: A Practitioner's Approach, 7th Edition.

  3. Md Sikandar Azam and M. Banumathi, The role of demographic factors in adopting organic farming: A logistic model approach, International Journal of Advanced Research, 2015.

  4. Kevin R. Wheeler, Device Control Using Gestures Sensed from EMG, IEEE International Workshop on Soft Computing in Industrial Applications, Binghamton University; NASA Ames Research Center, Moffett Field, CA 94035.

  5. L. Biggs and D. Giles, Current and future agricultural practices and technologies which affect fuel efficiency, Version 1.

  6. Agriculture in India [Online]. Available from: https://en.wikipedia.org/wiki/Agriculture_in_India
