- Authors : Jeevan Y S, Manjunath N, Chandrashekar B N
- Paper ID : IJERTCONV6IS13019
- Volume & Issue : NCESC – 2018 (Volume 6 – Issue 13)
- Published (First Online): 24-04-2018
- ISSN (Online) : 2278-0181
- Publisher Name : IJERT
- License: This work is licensed under a Creative Commons Attribution 4.0 International License
Wheel Chair for Physically Challenged Person
Jeevan Y S
Vidyavardhaka College of Engineering, Mysuru, Karnataka, India
Manjunath N
Vidyavardhaka College of Engineering, Mysuru, Karnataka, India
Chandrashekar B N
Vidyavardhaka College of Engineering, Mysuru, Karnataka, India
Abstract:- A powered wheelchair is a mobility aid for persons with moderate or severe physical disabilities or chronic diseases, as well as for the elderly. To accommodate different disabilities, various kinds of interfaces have been developed for powered wheelchair control, such as joystick control, head control and sip-and-puff control. Many people with disabilities are still unable to operate a powered wheelchair through these interfaces. The proposed model is a possible alternative. In this paper, we use four modes to control the powered wheelchair: eye tracking mode, touch sensing mode, keypad sensing mode and normal mode.
General Terms:- Hough circles algorithm, pull-down logic.
INTRODUCTION
The ability to move is important to every person. Physically disabled and elderly persons are often unable to move without the help of another person. We therefore designed a prototype wheelchair that can be controlled by the eye movement of the user, by keypad input, by touch screen input, or, finally, with physical help. This is achieved by providing different modes; having more than one mode reduces the strain of relying on a single interface, so the chair can be used by most disabled persons.
PROBLEM STATEMENT
To develop a wheelchair that can be controlled by the eye movement of the user, by touch screen input, by keypad input, or by the physical strength of the user, so that it can move in any desired direction.
PROPOSED METHODOLOGY
System overview
The overview of the system is shown in Figure 9 and Figure 10. Figure 9 shows that the Raspberry Pi 3 board is the central processor of the transmitter block: it receives information from the Pi camera, the keypad and the mode switch, processes the signals, and sends the resulting command to the HT12E encoder and transmitter block, which transmits it to the receiver mounted on the prototype wheelchair. Figure 10 shows that the Arduino is the main block on the receiver side: the HT12D decoder and receiver block receives the transmitted code and passes it to the Arduino, the signals from the ultrasonic sensor and the touch screen are also passed to the Arduino, and the Arduino processes them, generates the corresponding code and sends it to the motor driver to move the motors.
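As a rough illustration of the transmitter-side flow described above, the sketch below shows how the Raspberry Pi could read the two-bit mode switch and place a four-bit movement command on the HT12E data lines through its GPIO pins. The pin numbers, the command encoding and the per-mode helper functions are assumptions introduced here for illustration; the paper does not specify them.

```python
# Transmitter-side sketch (Raspberry Pi): read the mode switch and drive the
# HT12E data inputs with a 4-bit movement command.
# Pin numbers and command codes are illustrative assumptions, not the paper's.
import RPi.GPIO as GPIO
import time

MODE_PINS = [5, 6]                    # SW1, SW2 of the mode switch (assumed BCM pins)
HT12E_DATA_PINS = [12, 16, 20, 21]    # AD8..AD11 data inputs of the HT12E (assumed)

CMD_STOP = 0b0000                     # assumed command encoding
CMD_FORWARD, CMD_REVERSE, CMD_LEFT, CMD_RIGHT = 0b0001, 0b0010, 0b0011, 0b0100

GPIO.setmode(GPIO.BCM)
GPIO.setup(MODE_PINS, GPIO.IN, pull_up_down=GPIO.PUD_DOWN)
GPIO.setup(HT12E_DATA_PINS, GPIO.OUT, initial=GPIO.LOW)

def read_mode():
    """Combine SW1/SW2 into mode 0..3 as in Table 1."""
    sw1, sw2 = (GPIO.input(p) for p in MODE_PINS)
    return (sw1 << 1) | sw2

def send_command(cmd):
    """Place the 4-bit command on the HT12E data lines."""
    for i, pin in enumerate(HT12E_DATA_PINS):
        GPIO.output(pin, (cmd >> i) & 1)

def command_from_eye():
    """Placeholder for the eye-tracking mode (see the Mode 1 sketch)."""
    return CMD_STOP

def command_from_keypad():
    """Placeholder for the keypad mode (see the Mode 2 sketch)."""
    return CMD_STOP

try:
    while True:
        mode = read_mode()
        if mode == 0:                 # normal mode: keep the chair stopped
            cmd = CMD_STOP
        elif mode == 1:               # eye recognition mode
            cmd = command_from_eye()
        elif mode == 2:               # keypad (remote interfacing) mode
            cmd = command_from_keypad()
        else:                         # touch mode is handled on the Arduino side
            cmd = CMD_STOP
        send_command(cmd)
        time.sleep(0.1)
finally:
    GPIO.cleanup()
```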
Description of the components
The components used in the proposed system are a Pi camera, Raspberry Pi 3, Arduino, keypad, touch screen, HT12E encoder, HT12D decoder, toggle switch, motors and motor drivers.
Pi camera
It is used to capture images of the user's eye. It is a 5-megapixel fixed-focus camera that supports 1080p30, 720p60 and VGA90 video modes, as well as still capture. It attaches via a 15 cm ribbon cable to the CSI port on the Raspberry Pi. It can be accessed through the MMAL and V4L APIs, and there are numerous third-party libraries built for it, including the picamera Python library.
Fig 1: Pi camera
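A minimal still-capture sketch using the picamera Python library mentioned above might look as follows; the resolution, warm-up delay and file name are arbitrary choices for illustration.

```python
# Minimal still capture with the picamera library (illustrative values).
from picamera import PiCamera
from time import sleep

camera = PiCamera()
camera.resolution = (640, 480)   # a small frame is enough for pupil detection
camera.start_preview()
sleep(2)                          # give the sensor time to adjust exposure
camera.capture('eye.jpg')         # frame later analysed for the pupil position
camera.stop_preview()
camera.close()
```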
Raspberry Pi 3
It is used to process the camera input and the keypad input from the user. The processed signal is sent to the HT12E encoder and transmitter to transmit the command. The Raspberry Pi is a credit-card-sized single-board computer developed in the UK by the Raspberry Pi Foundation. The Raspberry Pi 3 has a Broadcom BCM2837 system on a chip (SoC), which includes an ARM Cortex-A53 1.2 GHz 64-bit quad-core processor and a VideoCore IV GPU, and originally shipped with 1 GB of RAM.
Fig 2: Raspberry Pi 3
Arduino
It is used to interface with the touch screen, process the signals and generate the corresponding code to move the wheelchair, which is sent to the motor driver.
Fig 3: Arduino
Keypad
It is used to receive input from the user to move the wheelchair in the desired direction and pass the signal to the Raspberry Pi.
Touch screen
It is used to receive input from the user to move the wheelchair in the desired direction and pass the signal to the Arduino.
Toggle switch
It is used to switch the modes. There are four modes available: eye movement mode, touch screen mode, keypad mode and normal mode.
Ultrasonic sensor
It is used to detect obstacles within a range of 10 cm. When it detects an obstacle, it sends a signal to the Arduino to stop the wheelchair.
Fig 7: Ultrasonic sensor
Motors
They are used to move the wheelchair depending on the signals sent by the motor driver.
Fig 8: Motors
Working
The mode switch is used to switch from one mode to another. The modes are as shown in Table 1.
Table 1. Operating modes
SW1  SW2  MODE  System operation
0    0    0     Normal operation
0    1    1     Eye recognition system
1    0    2     Remote interfacing system
1    1    3     Touch sensing system
The mode switch is connected to the Raspberry Pi, where the processing takes place and the mode is selected. The working of the modes is explained below:
Mode 0
This is the normal mode. It is selected when the mode switch is set to 00. The Raspberry Pi generates a stop code so that the wheelchair does not move automatically; it can only be moved manually.
Mode 1
This is the eye movement mode. It is selected when the mode switch is set to 01. An image of the user's eye is captured and sent to the Raspberry Pi for further processing. The centre of the pupil is detected using the Hough circles algorithm, which gives the centre coordinates of the pupil. Depending on these coordinates, the code to move the wheelchair is generated.
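The paper does not give the exact parameters used with the Hough circles algorithm, so the following OpenCV sketch should be read as one plausible way to obtain the pupil centre from a captured eye image and map its horizontal position to a movement command; the Hough parameters, the left/right/forward bands and the fallback to "stop" are assumptions.

```python
# Pupil-centre detection with cv2.HoughCircles and a simple mapping from the
# x-coordinate to a direction. Parameters and thresholds are illustrative.
import cv2
import numpy as np

def pupil_direction(image_path):
    img = cv2.imread(image_path)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    gray = cv2.medianBlur(gray, 5)                  # suppress noise before Hough

    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1, minDist=50,
                               param1=50, param2=30, minRadius=5, maxRadius=40)
    if circles is None:
        return 'stop'                               # no pupil found: safe default

    x, y, r = np.round(circles[0, 0]).astype(int)   # strongest circle = pupil
    width = gray.shape[1]
    if x < width * 0.4:                             # pupil towards the left
        return 'left'
    if x > width * 0.6:                             # pupil towards the right
        return 'right'
    return 'forward'                                # roughly centred

print(pupil_direction('eye.jpg'))
```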
Mode 2
This is the keypad mode. It is selected when the mode switch is set to 10. A key press is detected using pull-down logic: when a key is pressed, a change of level is generated at the corresponding pins. Depending on the key pressed, the code to move the wheelchair is generated.
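A hedged sketch of this detection step is shown below: each direction key is read through a GPIO input whose level changes when the key is pressed. The pin assignment, and the use of the Pi's internal pull-down resistors with an active-high press, are assumptions; the exact polarity depends on how the keypad is wired.

```python
# Keypad reading sketch: each direction key is wired to one GPIO input.
# Internal pull-downs keep the pins low; a pressed key pulls its pin high.
# Pin numbers and polarity are assumptions for illustration.
import RPi.GPIO as GPIO
import time

KEY_PINS = {'forward': 17, 'reverse': 27, 'left': 22, 'right': 23}  # assumed BCM pins

GPIO.setmode(GPIO.BCM)
for pin in KEY_PINS.values():
    GPIO.setup(pin, GPIO.IN, pull_up_down=GPIO.PUD_DOWN)

def pressed_key():
    """Return the name of the first pressed key, or None."""
    for name, pin in KEY_PINS.items():
        if GPIO.input(pin):
            return name
    return None

try:
    while True:
        key = pressed_key()
        if key is not None:
            print('move', key)     # here the movement code would be generated
        time.sleep(0.05)           # simple polling; debouncing omitted
finally:
    GPIO.cleanup()
```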
Mode 3
This is the touch screen mode. It is selected when the mode switch is set to 11. The user's command through touch is detected, and the corresponding code to move the wheelchair is generated.
Fig 9: Block diagram representation of the Raspberry Pi part
Fig 10: Block diagram representation of the Arduino part
RESULTS
The input images and the corresponding output images are shown below:
Fig 11: Mode 0 (normal mode)
Fig 12: No movement
Fig 13: Mode 1 (eye movement mode)
Fig 14: Pupil position to stop
Fig 15: Pupil position to turn right
Fig 16: Pupil position to turn left
Fig 17: Pupil position to move forward
Fig 18: Pupil position to move reverse
Fig 19: Mode 2 (keypad mode)
Fig 20: Key to move forward
Fig 21: Key to turn left
Fig 22: Key to move reverse
Fig 23: Key to turn right
Fig 24: Mode 3 (touch screen mode)
Fig 25: Wheelchair moving forward
Fig 26: Wheelchair turning right
Fig 27: Wheelchair turning left
CONCLUSION
The prototype wheelchair responded to the commands and moved in the desired directions. All the modes work properly as per the requirements.