- Open Access
- Total Downloads : 12
- Authors : Raunaq Chawhan, Suraksha Kulal, Srikant Battula
- Paper ID : IJERTCONV3IS06039
- Volume & Issue : ICONECT – 2015 (Volume 3 – Issue 06)
- Published (First Online): 24-04-2018
- ISSN (Online) : 2278-0181
- Publisher Name : IJERT
- License: This work is licensed under a Creative Commons Attribution 4.0 International License
System for Assisted Mobility using Eye Movement
Raunaq Chawhan1, Suraksha Kulal2, Srikant Battula3
Department of Electronics and Telecommunication Engineering1,2,3
K.C. College of Engineering and Management Studies and Research, Thane, India1,2,3
Abstract: This paper presents a system for controlling a robot using eye movements. An image of the eye looking in different directions is captured by a camera and passed to a MATLAB script. The MATLAB program processes this image and sends the result to a microcontroller, which drives the motors of the robot accordingly. This eye-controlled system can further be implemented on a wheelchair for patients with physical disabilities or chronic diseases who have no control over their limbs.
Keywords: Eye movement, Electro-oculography, Infra-red Oculography, Scleral Search Coil Method, MATLAB, Microcontroller.
INTRODUCTION
People with physical disabilities or chronic diseases often have problems with locomotion, and wheelchairs are used to assist their movement. To deal with different disabilities, various interfaces have been developed, such as the electric wheelchair controlled by a joystick and wheelchairs controlled by head or tongue movement. However, there are patients whose disabilities prevent them from using any of these interfaces. They have lost complete control over their body, so such interfaces are of no use to them, and an approach is needed that lets them move without depending on others. An eye-controlled wheelchair offers a possible alternative. The idea is therefore to create an eye-monitored system that controls a robot based on eye movements; this can later be implemented on a wheelchair to help fully paralyzed patients. In this system a camera constantly monitors the person's eye. The eye movements are captured and passed to a MATLAB script running on a laptop. The script processes the images and sends the resulting commands to the microcontroller, which drives the motors of the robot accordingly.
LITERATURE SURVEY
The most important part of any eye-controlled system is eye tracking and detection, because once the eye is detected properly, the appropriate signals can be sent to the microcontroller. The common techniques for eye detection are as follows:
Electro-oculography
Electrooculography (EOG) is a technique for measuring the corneo-retinal standing potential that exists between the front and the back of the human eye. The resulting signal is called the electrooculogram [2]. Due to the higher metabolic rate at the retina compared to the cornea, the eye maintains a voltage of +0.40 to +1.0 millivolts with respect to the retina. This corneo-retinal potential, which is roughly aligned with the optic axis and hence rotates with the direction of gaze, can be measured by surface electrodes placed on the skin around the eyes [8]. Pairs of electrodes are typically placed either above and below the eye or to the left and right of the eye. If the eye moves from the centre position toward one of the two electrodes, this electrode "sees" the positive side of the retina and the opposite electrode "sees" the negative side of the retina. When the eyes look straight ahead, a steady dipole exists between the two electrodes. When the gaze is shifted to the left, the positive cornea moves closer to the left electrode, which becomes more positive, with zero potential at the right electrode, and vice versa [1][3].
The advantage of this technique is that the potential difference can be measured easily in both light and dark conditions, and it is the most inexpensive eye-movement recording system. [4]
The disadvantage of this technique is that the system is less accurate and the signal also varies with skin resistance. [4]
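As a rough illustration of how an EOG signal could be turned into a gaze estimate (this is not part of the paper's camera-based system), the sketch below maps a differential electrode voltage to an approximate horizontal gaze angle. The sensitivity value is an assumed, order-of-magnitude calibration constant; a real system must be calibrated per user.

```matlab
% Illustrative only: map a differential EOG voltage to an approximate
% horizontal gaze angle, assuming a sensitivity of roughly 16 uV per degree
% (an assumed calibration constant, not a value from the paper).
sensitivity_uV_per_deg = 16;                 % assumed calibration constant
eog_uV   = [-320 -160 0 160 320];            % example differential voltages (uV)
gaze_deg = eog_uV / sensitivity_uV_per_deg;  % approximate gaze angles (degrees)
disp(gaze_deg)                               % prints -20 -10 0 10 20
```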
Infra-red oculography
Infra-red oculography is based on the principle that, if a fixed light source is directed at the eye, the amount of light reflected back to a fixed detector varies with the eye's position. The method measures the diffuse reflection of infra-red light from the frontal surface of the eyeball: a number of IR light sources provide the illumination, and photodetectors aimed at the eye pick up the reflected light. Such systems track the limbus (the boundary between sclera and iris) or the pupil-iris boundary to measure relative eye rotation. Infra-red light is used because it is "invisible" to the eye and does not distract the subject, and since infra-red detectors are not greatly influenced by other light sources, the ambient lighting level does not affect the measurements.
This technique works well for measuring horizontal eye movements, and its spatial resolution is good.
The disadvantage of this technique is that the signal is worse for vertical eye movements because the eyelids obstruct the iris-sclera boundary. Blinks are also a problem: not only do the lids cover the surface of the eye, but the eye also retracts slightly, altering the amount of reflected light for a short time after the blink. [4]
Scleral search coil method
When a coil of wire moves in a magnetic field, the field induces a voltage in the coil; if the coil is attached to the eye, a signal of eye position is produced. To measure human eye movements, small coils of wire are embedded in a modified contact lens, which is inserted into the eye after a local anesthetic has been applied. A wire from the coil leaves the eye at the temporal canthus. The field is generated by two field coils placed on either side of the head, which allows horizontal eye movement to be recorded. If vertical eye movements must also be monitored, a second set of field coils, usually set orthogonal to the first, is used. The two signals (one for horizontal and one for vertical eye movement) generated in the eye coil can then be separated using appropriate electronics.
The advantage of this technique is its good accuracy, and the coil signals provide better signal stability.
The disadvantage of this technique is that the thin wire connecting the coil to the measuring device is uncomfortable for the subject, and the method is invasive, requiring something to be placed into the eye. [4]
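For context, the physical principle behind the coil signal can be written out with Faraday's law. The following is a simplified sketch, not taken from the paper, assuming a single alternating field whose direction makes an angle theta with the coil's normal:

```latex
% Flux through a coil of N turns and area A in an alternating field
% B(t) = B_0 \sin(\omega t) whose direction makes an angle \theta
% with the coil normal:
\Phi(t) = N A B_0 \sin(\omega t)\cos\theta
% Induced voltage (Faraday's law); its amplitude scales with \cos\theta,
% so the demodulated amplitude encodes the eye's orientation:
\varepsilon(t) = -\frac{d\Phi}{dt} = -N A B_0 \omega \cos(\omega t)\cos\theta
```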
IMPLEMENTATION OF PROPOSED SYSTEM
Fig. 1. Methodology
The proposed system uses a camera that constantly monitors the eye. After the eye is detected and its movement captured, the image is passed to a MATLAB program running on a laptop, which processes it. The program determines in which direction the person wants to move and communicates this information over a serial link to a microcontroller, which drives the motors in the desired direction.
Eye-movement tracking is used in the proposed system to determine the motion of the eye, and based on this direction of motion, commands are given to the robot. The eye-motion tracking hardware consists of a camera, positioned so that it lies in front of one of the user's eyes. The camera drivers are installed on the PC to which the camera is connected. The image-processing software works in three modules: video capturing, frame extraction, and pixel color detection.
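A minimal sketch of the video-capturing and frame-extraction modules is given below. It assumes MATLAB's Image Acquisition Toolbox with a webcam exposed through the 'winvideo' adaptor; the adaptor name, device ID and output file name are assumptions, not details from the paper.

```matlab
% Hedged sketch of the video-capture and frame-extraction modules, assuming
% MATLAB's Image Acquisition Toolbox and a webcam on the 'winvideo' adaptor
% (adaptor name and device ID depend on the installed camera).
vid = videoinput('winvideo', 1);     % connect to the first webcam
vid.ReturnedColorspace = 'rgb';      % return RGB frames
frame = getsnapshot(vid);            % extract a single frame from the video stream
gray  = rgb2gray(frame);             % grayscale version for later thresholding
imwrite(frame, 'frame_001.png');     % save the sampled frame as a screenshot
delete(vid); clear vid;              % release the camera
```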
Fig. 2. Block diagram
The block diagram shows how the entire system works. A webcam captures and recognizes the object in view and tracks the user's eye movements using a computer-vision-based technique, then sends the data to the computer. The camera, in a sense, acts as a digital eye, seeing what the user sees. The data received by the computer is processed by MATLAB software running on it, and the generated output is fed to the microcontroller, which uses these signals to drive the motors of the robot.
Fig. 3. Circuit diagram
Our proposed system has two major components: eye detection and the microcontroller-controlled device.
Eye-Detection
Fig. 4. Eye tracking module
The flowchart above shows how eye detection is done. It is divided into three parts: first, eye isolation; second, eye calibration; and third, eye tracking. A camera is positioned so that it continuously monitors the user's eye, and a MATLAB program designed to process the captured images is interfaced with the camera. Since the image contains only the eye, a basic thresholding method is sufficient to extract the eyeball. For this purpose, the RGB image is first converted into a grayscale image and then, with a suitable threshold value, thresholded into a binary image from which the eyeball is extracted. Along with the eyeball, the eyelashes also get extracted; they are filtered out on the basis of the area of the extracted regions. The eyeball occupies the largest area in the image, so it is isolated by discarding the smaller eyelash regions, and its centroid is then computed to track the eyeball. MATLAB and its Image Processing Toolbox are used for this processing. Once the eyeball is tracked and the direction in which the robot should move has been extracted from the image, this signal is communicated to the microcontroller, which moves the robot in the desired direction.
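The following is a hedged sketch of the thresholding, area-filtering and centroid computation described above, using MATLAB's Image Processing Toolbox. The threshold value, the assumption that the eyeball is the darkest and largest region, and the left/centre/right decision bands are illustrative choices, not values given in the paper.

```matlab
% Hedged sketch of eyeball extraction and tracking, assuming 'frame' is an
% RGB image containing only the eye region and that the eyeball appears
% darker than the surrounding sclera and skin.
gray  = rgb2gray(frame);                      % RGB -> grayscale
bw    = gray < 60;                            % threshold: keep dark pixels (eyeball, lashes)
bw    = imfill(bw, 'holes');                  % fill specular highlights inside the pupil
stats = regionprops(bw, 'Area', 'Centroid');  % measure every extracted region
[~, idx] = max([stats.Area]);                 % the eyeball is the largest region;
c = stats(idx).Centroid;                      % smaller eyelash regions are discarded
w = size(frame, 2);                           % image width
if c(1) < w/3                                 % centroid in the left third of the image
    cmd = 'L';
elseif c(1) > 2*w/3                           % centroid in the right third
    cmd = 'R';
else
    cmd = 'S';                                % otherwise: straight ahead
end
```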
Fig. 5. Eye isolation
The flowchart shown above gives detailed information on how eye detection is performed so that an appropriate signal can be given to the robot.
Microcontroller-controlled device
A decision based on the processing done by the MATLAB application is communicated to and received by the microcontroller. On reception, the controller drives high the port pins to which the motors are connected, producing the desired motion of the device.
There are two parts to the code structure: the first part detects the eye movements and the other part drives the motors. The following steps explain the code structure:
- Initialization: We first set up the serial communication that will later be used for the interface between MATLAB and the controller, along with the video capture and the program variables.
- Image and Video Processing: We then take continuous video frames, sample the input, and save the samples as screenshots. Each frame is then converted into a black-and-white frame.
- Estimation: After working on each frame, we try to detect the eyes by estimating the position of the left as well as the right eye.
- Detection: In this step we actually detect the eye movements. Sometimes only one of the two eyes will be detected; in that case, we give preference to the eye that is currently detected.
- Motion: After detecting the eye movements, a decision algorithm determines how the controller should drive the motors.
- Serial Communication: According to the detected command, the MATLAB application transmits bits for left, right, and straight respectively to the controller, which drives the motors (a hedged sketch of this step follows the list).
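A minimal sketch of the serial-communication step is shown below. It assumes MATLAB's serialport interface (R2019b or later), port 'COM3' at 9600 baud, and single-character command codes matching the eye-detection sketch; all of these are assumptions rather than details given in the paper.

```matlab
% Hedged sketch of the serial-communication step: send the detected command
% to the microcontroller. Port name, baud rate and command encoding are
% assumptions, not values from the paper.
cmd = 'S';                         % example command from the decision step
s = serialport("COM3", 9600);      % open the link to the microcontroller
write(s, uint8(cmd), "uint8");     % transmit the command byte
clear s                            % close and release the port
```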
CONCLUSION
Various systems have been developed to assist fully paralyzed patients, such as electro-oculography, infra-red oculography, and the scleral search coil method. However, these systems require more careful handling and, to a certain extent, are likely to cause damage to the eyes, since they require physical contact with the eyes.
This paper has presented a system that tracks eye movement using a web camera. The user does not feel any strain on the eyes because the system is not physically connected to them. Eye movement is faster than other input media, which makes the system efficient, and the system does not require any particular training or coordination from the user.
REFERENCES
- Eye and Retina, http://thalamus.wustl.edu/course/eyeret.html
- http://en.wikipedia.org/wiki/Electrooculography
- http://www.ijstr.org/final-print/feb2014/Eye-Movement-Based-Electronic-Wheel-Chair-For-Physically-Challenged-Persons.pdf
- http://www.ijsrp.org/research-paper-0912/ijsrp-p0929.pdf
- http://www.realexposer.com/2010/01/advantages-and-disadvantages-of-electro.html
- Culpepper, B. J., Keller, R. M., "Enabling computer decisions based on EEG input", IEEE Transactions on Neural Systems and Rehabilitation Engineering, 11, pp. 354-360, 2003.
- Shafi, M., Chung, P. W. H., "A Hybrid Method for Eyes Detection in Facial Images", International Journal of Electrical, Computer, and Systems Engineering, pp. 231-236, 2009.
- Rafael C. Gonzalez, Richard Eugene Woods, Steven L. Eddins, Digital Image Processing Using MATLAB.
- http://waset.org/publications/2968/retina-based-mouse-control-rbmc-
- S. Tameemsultana and N. Kali Saranya, "Implementation of Head and Finger Movement Based Automatic Wheel Chair", Bonfring International Journal of Power Systems and Integrated Circuits, vol. 1, Special Issue, pp. 48-51, December 2011.
- A. B. Usakli, S. Gurkan, F. Aloise, G. Vecchiato and F. Babiloni, "On the Use of Electrooculogram for Efficient Human Computer Interfaces", Computational Intelligence and Neuroscience, Hindawi Publishing Corporation, vol. 2010, pp. 1-5, 2010.
- Manuel Mazo, Francisco J. Rodriguez, Jose L. Lazaro, Jesus Urena, Juan C. Garcia, Enrique Santiso, Pedro Revenga and J. Jesus Garcia, "Wheelchair for Physically Disabled People with Voice, Ultrasonic and Infrared Sensor Control", Autonomous Robots, vol. 2, no. 3, pp. 203-224, Sep. 1995.
- Q. X. Nguyen and S. Jo, "Electric wheelchair control using head pose free eye-gaze tracker", Electronics Letters, vol. 48, no. 13, 21 June 2012.
- http://mil.ufl.edu/4924/projects/s12/final/DelRosario_Kiep.pdf
- K. Arai and R. Mardiyanto, "Comparative Study on Blink Detection and Gaze Estimation Methods for HCI, in Particular, Gabor Filter Utilized Blink Detection Method", 2011 8th International Conference on Information Technology: New Generations, 2011, pp. 441-446.
- N. H. Cuong and H. T. Hoang, "Eye Gaze Detection with a Single Webcam Based on Geometry Features Extraction", 2010 11th International Conference on Control, Automation, Robotics and Vision, Singapore, 7-10 December 2010, pp. 2507-2512.