Vision Based Object Tracking and Navigation of an Autonomous System

DOI: 10.17577/IJERTV3IS050901


P. P. Chavan 1, U. C. Kulkarni 2, P. R. Pethkar3

1 Assistant Prof, Department of Electronics & Telecommunication, K J College of Engineering & Management Research, Pune, Maharashtra, India

2,3 Students, Department of Electronics & Telecommunication, K J College of Engineering & Management Research, Pune, Maharashtra, India

Abstract – Robot navigation and object tracking systems fall into two major categories: sensor-based systems and vision-based systems. Sensor-based systems use various kinds of sensors, such as IR and ultrasonic sensors. A vision-based approach uses a vision system, usually a video or wireless camera, to extract the features it needs in order to navigate. In this paper, an autonomous robot uses a camera to track a ball of a specific colour, combined with GPS navigation techniques. Using this information, the robot controls two motors to steer towards the ball and updates the ball's location, as co-ordinates, to the server in the control unit. A microcontroller acts as the prime controlling element of the custom-built robot. A GPS receiver interfaced with the robot receives the location information, and the movement of the robot is detected by an IR sensor. The most challenging part of the project is synchronizing the motion-detecting camera and the GPS navigation unit.

Keywords: Object tracking, navigation, centroid method, wireless communication, obstacle avoidance, IR proximity sensor.

  1. INTRODUCTION

    A number of potential markets are slowly emerging for mobile robotic systems. Entertainment applications and household or office assistants are the primary targets in this area of development. These types of autonomous robots are designed to move around within often highly cluttered environments. Vision is one of the most powerful and popular sensing methods used for autonomous navigation. Compared with other sensing approaches, vision-based navigation continues to attract a great deal of attention from the robotics research community because of its ability to provide detailed information about the environment that may not be available from combinations of other sensors. Vision-based navigation for mobile robots is still an open research area.

    The proposal is to use novel object tracking algorithms to address the problem of mobile robot navigation by adopting a centre-following method, without the use of prior environmental information. Visual feedback is provided by a laptop camera, using a mapless navigation technique and the developed methodologies.

    Figure 1: Basic block description of the system.

  2. CENTROID METHOD

      1. Pre-processing:

        First of all, image frames are captured from the video stream of the camera. Then the pre-processing step is applied. This step consists of binarization and a morphological process. Binarization is done by thresholding. The morphological process, with basic dilation and erosion, is applied to remove remaining artifacts from the binary image.
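        As a concrete illustration (not the authors' MATLAB implementation), the following Python/OpenCV sketch shows one way to perform this pre-processing; the HSV thresholds are placeholder assumptions that would come from the calibration step described in Section 6.

            import cv2
            import numpy as np

            # Placeholder thresholds for a red ball; the real values come from calibration.
            LOWER = np.array([0, 120, 70])     # assumed lower HSV bound
            UPPER = np.array([10, 255, 255])   # assumed upper HSV bound

            def preprocess(frame):
                """Binarize a captured frame and clean it with erosion and dilation."""
                hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
                binary = cv2.inRange(hsv, LOWER, UPPER)            # thresholding -> binary image
                kernel = np.ones((5, 5), np.uint8)
                binary = cv2.erode(binary, kernel, iterations=1)   # remove small artifacts
                binary = cv2.dilate(binary, kernel, iterations=1)  # restore the ball region
                return binary

            cap = cv2.VideoCapture(0)            # capture frames from the camera stream
            ok, frame = cap.read()
            if ok:
                mask = preprocess(frame)
            cap.release()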

        Figure 2: Flowchart of the method (capturing frames, binarization, morphological process, window matching with the initial centroid algorithm, shift of origin).

      2. Initial Centroid Algorithm:

        The centroid of an object is the intersection of all hyperplanes that divide the object into two parts of equal moment; it is also defined as the centre of mass. The frame is divided into four quadrants to determine the position of the centroid.

        Figure 3: Calculation of the centroid.

        The centroid of the segmented region is computed as the mean of its pixel co-ordinates. For the circular ball of radius r, the area is A = πr², and with the circle centred at the origin the initial centroid is X = 0, Y = 0.

      3. Shift Of Origin:

    Once the centroid is calculated, the position of the ball is determined. The first value calculated is taken as the default (reference) value, indicating that the ball is in its initial position. Whenever the centroid deviates from this initial position, a new centroid is calculated; this is called the shift of origin. If the centroid shifts into a left quadrant the robot turns left, and if it shifts into a right quadrant the robot turns right, so the ball is followed accordingly.
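    The sketch below illustrates this logic in Python/OpenCV (an illustrative stand-in, not the authors' MATLAB code): the centroid is obtained from image moments of the binary mask, and the single-character commands "L", "R", "F" and "S" are hypothetical labels for left, right, forward and stop.

        import cv2

        def centroid(binary):
            """Return the (x, y) centroid of the white region, or None if it is empty."""
            m = cv2.moments(binary, binaryImage=True)
            if m["m00"] == 0:
                return None
            return (m["m10"] / m["m00"], m["m01"] / m["m00"])

        def steering_command(c, frame_width, dead_band=20):
            """Decide a direction from the horizontal shift of the centroid."""
            if c is None:
                return "S"                    # stop: ball not visible
            shift = c[0] - frame_width / 2.0  # shift of origin relative to the frame centre
            if shift < -dead_band:
                return "L"                    # centroid in a left quadrant -> turn left
            if shift > dead_band:
                return "R"                    # centroid in a right quadrant -> turn right
            return "F"                        # roughly centred -> move forward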

  3. COMMUNICATION

      1. Introduction

        A server–client (or peer-to-peer) link is established over Wi-Fi, which allows the robot and the control unit to form a local wireless network. The decentralized nature of such wireless ad hoc networks makes them suitable for applications where a central node cannot be relied upon. Serial communication is used to move data to the transmitter or to memory registers as required by the running process.
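        As an illustrative sketch only (the paper does not specify the wire format), a co-ordinate update could be sent to the control-unit server over a plain TCP socket on the Wi-Fi link; the server address, port and JSON message layout below are assumptions.

            import json
            import socket

            SERVER_ADDR = ("192.168.1.10", 5000)   # assumed address of the control-unit server

            def send_update(position):
                """Send one co-ordinate update to the control-unit server over Wi-Fi."""
                with socket.create_connection(SERVER_ADDR, timeout=2) as sock:
                    sock.sendall((json.dumps(position) + "\n").encode("ascii"))

            # Example with placeholder values only.
            send_update({"lat": 18.4575, "lon": 73.8508, "label": "ball"})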

      2. Wi-Fi Technology

        Wi-Fi is a technology that allows an electronic device to exchange data or connect to the internet wirelessly using microwaves in the 2.4 GHz and 5 GHz bands. Wi-Fi also allows communication directly from one computer to another without an access point; this is known as an ad hoc Wi-Fi transmission.

      3. Data Transmission

    Data enters the module's UART through the DI pin (pin 3) as an asynchronous serial signal. The signal should idle high when no data is being transmitted. Each data byte consists of a start bit (low), 8 data bits (least significant bit first) and a stop bit (high). Figure 4 illustrates the serial bit pattern of data passing through the module.

    Figure 4: UART data packet as transmitted through the RF module. Example data format: 8 data bits, no parity, 1 stop bit.
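    For illustration, the sketch below opens a serial link with this 8N1 framing using the pyserial library; the port name and baud rate are assumptions and must match the RF module's actual settings.

        import serial  # pyserial

        # Open the link to the RF module: 8 data bits, no parity, 1 stop bit ("8N1").
        port = serial.Serial(
            port="COM3",             # assumed port name
            baudrate=9600,           # assumed baud rate
            bytesize=serial.EIGHTBITS,
            parity=serial.PARITY_NONE,
            stopbits=serial.STOPBITS_ONE,
            timeout=1,
        )
        # Each byte is framed as: start bit (low), 8 data bits LSB first, stop bit (high).
        port.write(b"R")
        port.close()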

  4. GLOBAL POSITIONING SYSTEMS

    The Global Positioning System (GPS) is a space-based satellite navigation system that provides location and time information in all weather conditions, wherever there is an unobstructed line of sight to four or more GPS satellites. The system provides facilities to military, civil and commercial users around the world.

      1. To Receive GPS string Data

        The GPS unit typically outputs data continuously at 1 to 4 Hz, although a reading every 5 seconds or so is sufficient to track the robot accurately. The output is usually at a slow baud rate, from 2400 to 9600 bps; 4800 bps is a common standard, but not a rule. The datasheet of the GPS module gives the exact settings.

        The signal is very undemanding, which means that if the microcontroller does not have enough hardware UART ports, a software UART can be used without issue.

        The data is likely to be at TTL levels, or RS-232 if it is an older unit. The controller stores each received string as an array and parses it for the variables of interest. GPS strings are standardized (NMEA sentences), so suitable parsing code for most microcontrollers can be found online. It is best to use a GPS module that outputs TTL levels so that it can be interfaced directly with the microcontroller.
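        A minimal parsing sketch is shown below; it assumes a pyserial connection and a standard $GPGGA NMEA sentence, and the port name and baud rate are placeholders.

            import serial  # pyserial

            def nmea_to_degrees(value, hemisphere):
                """Convert an NMEA ddmm.mmmm / dddmm.mmmm field to signed decimal degrees."""
                if not value:
                    return None
                dot = value.index(".")
                degrees = float(value[:dot - 2])
                minutes = float(value[dot - 2:])
                deg = degrees + minutes / 60.0
                return -deg if hemisphere in ("S", "W") else deg

            gps = serial.Serial("COM4", 4800, timeout=2)   # assumed port and baud rate
            line = gps.readline().decode("ascii", errors="ignore").strip()
            if line.startswith("$GPGGA"):
                f = line.split(",")
                lat = nmea_to_degrees(f[2], f[3])
                lon = nmea_to_degrees(f[4], f[5])
                print("Fix:", lat, lon)
            gps.close()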

      2. GPS Accuracy

        The accuracy of a position determined with GPS depends on the type of receiver. Most GPS devices have an accuracy of about ±10 m. Differential GPS (DGPS) requires an additional receiver fixed at a known location nearby; observations from this fixed reference receiver are used to correct the positions recorded by the roving devices, producing an accuracy better than 1 metre.

  5. OBSTACLE DETECTION USING IR SENSOR

    The robot attempts to mimic the human senses for obstacle detection; the most economical solution is to add an IR proximity sensor to the robotic unit. The distance parameter is calculated in such a way that the robot does not get lost along its detection path or confuse the vital information about its position and orientation.

    Every possible detection across the complete arena is accounted for in the robotic system: after an obstacle is detected in its path, the robot keeps track of its position in the arena, its orientation and the arena parameters, avoids the obstacle, and continues to map the remaining fields.

  6. WORKING

    The system is initialized by turning on all the devices. Before running, it is necessary to calibrate, i.e. to learn the exact pixel values of the ball to be followed. This is done by capturing an image of the ball and checking its pixel values at various points; the display also shows the value of the pixel under the cursor at the bottom of the screen. For better accuracy, at least ten images are taken at different angles, the minimum and maximum pixel values of the ball are found, and these thresholds are entered into the code.
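    A possible calibration sketch in Python/OpenCV is shown below; the image file pattern and the hand-picked region containing the ball are assumptions.

        import glob
        import cv2
        import numpy as np

        mins, maxs = [], []
        for path in glob.glob("calibration/ball_*.jpg"):   # ten or more views of the ball
            img = cv2.imread(path)
            hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
            roi = hsv[100:160, 220:300]                    # hand-picked patch on the ball
            mins.append(roi.reshape(-1, 3).min(axis=0))
            maxs.append(roi.reshape(-1, 3).max(axis=0))

        if mins:
            lower = np.min(mins, axis=0)   # thresholds to paste into the tracking code
            upper = np.max(maxs, axis=0)
            print("lower =", lower, "upper =", upper)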

    Now the function moves as follows:

    The camera attached to the robot takes images at frequent intervals and processes each one by converting it to a binary image: a pixel is white if its value lies in the desired range and black otherwise. Next, the centroid of the image is determined so that the robot can move towards it. After finding the centroid, the serial port is opened for communication with the robot.

    Once the serial port is open and the centroid is known, the direction to be followed is sent to the robot over the serial port, and the robot, programmed as a simple PC-controlled system, follows accordingly. MATLAB sends the commands to the BASCOM-AVR firmware, which drives the motors to move and follow the ball.
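    Putting the pieces together, a simplified stand-in for this loop (the paper uses MATLAB with BASCOM-AVR firmware; this Python sketch reuses preprocess(), centroid() and steering_command() from the earlier sketches) might look as follows; the serial port and the one-character command protocol are assumptions.

        import cv2
        import serial

        cap = cv2.VideoCapture(0)
        link = serial.Serial("COM3", 9600, timeout=1)      # assumed port to the microcontroller

        while True:
            ok, frame = cap.read()
            if not ok:
                break
            mask = preprocess(frame)                       # binary image of the ball
            cmd = steering_command(centroid(mask), frame.shape[1])
            link.write(cmd.encode("ascii"))                # microcontroller steers the motors
            cv2.imshow("mask", mask)
            if cv2.waitKey(30) & 0xFF == ord("q"):         # press 'q' to stop
                break

        cap.release()
        link.close()
        cv2.destroyAllWindows()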

    Co-ordinates of the exact location of the ball are updated to the server simultaneously while the object is moving. A GPS navigation module with a transmitter and receiver is used. Finally, an application of tracking and avoiding an object using GPS navigation techniques is demonstrated.

    If no obstacle is detected, the robotic unit keeps updating its position and marks it in the terminal window on the screen. This process is displayed continuously in real time.

  7. FLOWCHART AND ALGORITHM

An overview of the project is depicted in the following flowchart and algorithm.

Figure 5: Flowchart of the project.

ALGORITHM:

  1. Start

  2. Turn all the devices on.

  3. Camera on: display the input video stream.

  4. Detection of object

    1. If object is detected go to next step

    2. If object is not detected go back to step 3.

  5. Detection of red colour.

  6. Compare the minimum and maximum threshold values with calibrated values.

  7. Extraction of the image: remove noise using filters.

  8. Calculate and find the centroid of the object.

  9. Calculating displacement of object using centroid algorithm.

    1. Initially calculate default value.

    2. If object centroid is shifted to right (compare with default value), move the robot towards right.

    3. If object centroid is shifted to left (compare with default value), move the robot towards left.

  10. Observe the displacement of the object.

    1. If object is changing location (Right shift or Left shift) go to next step.

    2. If object is not changing location go back to step 8.

  11. Follow the object.

    1. Matlab sends commands to Bascom AVR

    2. Serial Communication is initiated with microcontroller.

    3. Steer the motors of the robot towards the object.

  12. GPS module keeps a check on the movement of motors, if

    1. Movement is detected go to next step.

    2. Else go back to step 9.

  13. GPS module updates coordinates of current location of the object.

  14. Stop.

Figure 6: Object tracking and navigation system set-up.

8. APPLICATIONS

      1. Unmanned military vehicles: It can be used for land-soldier automation and, on the battlefield, to provide real-time video of intruders.

      2. Pipeline inspection: Using the same technology, different mini robots can be developed and used for pipeline inspection, wiring checks and device-fault detection.

      3. Mining: With an emergency torch or other lighting device attached, the user can see the view ahead during night operation, so this strategy can be used in mining.

      4. Metal detection: GPS navigation makes it easy to steer the robot, and with the wireless camera obstacles can be tracked and avoided, so metals can be detected and the robot tracked accordingly.

      5. Self-defensive machine gun: A laser diode generating a long-range red beam can be attached to a machine gun so that the system moves in the gun's direction.

      6. Counter-terrorism: It can be very useful in ground-level combat to save valuable human lives, continuously capturing video frames from the web camera along with the current location of the robot.

  9. RESULTS

    The figure below shows the arrangement: a laptop with a camera is placed on a chassis with two motors for displacement. The camera follows an object of a specific colour, for instance red. A GPS module is attached to the laptop and its co-ordinates are updated simultaneously; an LCD is used for display. Thus, a moving robot with the visual tracking and navigation method is realized.

    Figure 7: GPS co-ordinates updated simultaneously.

  10. ACKNOWLEDGEMENT

    The success of any project depends upon the guidance and encouragement of the mentor and many others. We would like to heartily thank our guide, Prof. Mrs. P. P. Chavan, and acknowledge her able guidance and constant encouragement. Her knowledge and experience in the field of Electronics and Telecommunication were of immense help to us during the preparation of this project. We also gratefully thank our HOD, Prof. Mr. P. U. Chavan; his experience, expertise and insistence on nothing less than our best efforts guided us to achieve our goal, and we are sincerely thankful to him.

    We are grateful to our institute for providing a favourable environment and all the facilities needed to complete this project. We would like to dedicate this project to our family and friends, who were our true pillars, both morally and financially. Their endless support and belief in us are the basis of our focus and dedication.

