- Open Access
- Authors : Sangameshwar.Ch, Bt.Vedanth
- Paper ID : IJERTV2IS90974
- Volume & Issue : Volume 02, Issue 09 (September 2013)
- Published (First Online): 03-10-2013
- ISSN (Online) : 2278-0181
- Publisher Name : IJERT
- License: This work is licensed under a Creative Commons Attribution 4.0 International License
Hand Motion Based Crane Control Using Wireless Communication
Sangameshwar Ch (1)
(1) Student, ECE Department, Mallareddy Institute of Engineering and Technology, Hyderabad, AP, India

Bt. Vedanth (2)
(2) Assistant Professor, ECE Department, Mallareddy Institute of Engineering and Technology, Hyderabad, AP, India
Abstract: The proposed method designs a wireless, MEMS-based system using ZigBee technology that controls a crane with hand gestures (movements of the hand). In the existing method, cranes are controlled through wired levers or joysticks, with the operator selecting and controlling crane tasks by holding the joystick or lever in the hands; handling heavy loads with these methods is very difficult. The proposed system is a low-cost wireless technology that makes the crane very easy to operate and control. An AT89S52 (8051) controller is used at the hand unit, and the crane-side system uses compact circuitry built around the LPC2148 (ARM7) microcontroller. Programs are developed in Embedded C, and Flash Magic is used for loading the programs into the microcontroller.
Keywords: LPC2148 (ARM7), AT89S52, ZigBee, limit switch, hand movement sensor.
INTRODUCTION
Gesture recognition is a research area that has received much attention from many research communities, such as human-computer interaction and image processing. The increase in human-machine interaction in our daily lives has made user interface technology progressively more important. Physical gestures as intuitive expressions greatly ease the interaction process and enable humans to command computers or machines more naturally. For example, in tele-robotics, slave robots have been demonstrated to follow the master's hand motions remotely [1]. Other proposed applications of recognizing hand gestures include character recognition in 3-D space using inertial sensors [2], [3], gesture recognition to control a television set remotely [4], enabling a hand to act as a 3-D mouse [5], and using hand gestures as a control mechanism in virtual reality [6]. It can also be used to improve interaction between two humans. In our work, a miniature MEMS-accelerometer-based recognition system that can recognize eight hand gestures in 3-D space is built. The system has further potential uses, such as acting as a vocal aid for speech-impaired people.
Most existing gesture recognition systems follow image-based approaches, which require sophisticated image processing platforms. Cameras are mostly used as input devices, and the object must be present in front of the camera for gestures to be captured, which limits mobility; power consumption is also a challenge. Several other devices can capture gestures, such as the Wiimote, joystick, trackball, and touch tablet, and some of them can also be employed to provide input to a gesture recognizer. However, the technology employed for capturing gestures can be relatively expensive, such as a vision system or a data glove [8]. There are mainly two existing types of gesture recognition methods: vision-based, and accelerometer- and/or gyroscope-based. Existing gesture recognition approaches include template matching [11], dictionary lookup [12], statistical matching [13], linguistic matching [14], and neural networks [15]. Three further gesture recognition models are also available: 1) a sign-sequence and Hopfield-based model; 2) a velocity-increment-based model; and 3) a sign-sequence and template-matching-based model.
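To make the template-matching idea above concrete, the following minimal C sketch matches a captured sequence of 3-axis acceleration samples against stored gesture templates by nearest squared distance. The sample count, template count, and template data are hypothetical placeholders, not values from this paper.

```c
#include <stdio.h>

#define N_SAMPLES   8   /* acceleration samples per gesture (assumed) */
#define N_TEMPLATES 8   /* the system recognizes eight gestures       */

/* Pre-recorded reference gestures; real values would be captured
 * during a training phase (placeholder data here). */
static const int templates[N_TEMPLATES][N_SAMPLES][3] = { {{0}} };

/* Return the index of the stored template closest to the captured
 * sequence, using the sum of squared differences as the distance. */
int match_gesture(int seq[N_SAMPLES][3])
{
    int t, s, a, best = 0;
    long best_dist = -1;
    for (t = 0; t < N_TEMPLATES; t++) {
        long dist = 0;
        for (s = 0; s < N_SAMPLES; s++)
            for (a = 0; a < 3; a++) {
                long d = seq[s][a] - templates[t][s][a];
                dist += d * d;
            }
        if (best_dist < 0 || dist < best_dist) {
            best_dist = dist;
            best = t;
        }
    }
    return best;
}

int main(void)
{
    int captured[N_SAMPLES][3] = { { 0 } };  /* stand-in capture */
    printf("matched gesture: %d\n", match_gesture(captured));
    return 0;
}
```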
To overcome the limitations of the vision-based method, such as unexpected ambient optical noise, slower dynamic response, and relatively large data collection and processing requirements [9], and to strike a balance between the accuracy of the collected data and the cost of devices, a micro inertial measurement unit is utilized in this project to detect the accelerations of hand motions in three dimensions. The proposed recognition system is implemented with MEMS acceleration sensors. Since a heavy computational burden would be introduced if gyroscopes were used for inertial measurement [10], the current system is based on MEMS accelerometers only, and gyroscopes are not used for motion sensing. Fig. 1 shows the system architecture of the proposed gesture recognition system based on the MEMS accelerometer. The details of the individual steps are described below.
The sensing device senses acceleration along three axes. The sensed signals are conditioned and fed to the controller circuit. The controller compares the incoming signal values with pre-stored values. A command for each gesture is separately allotted to a channel in the voice chip: when an incoming acceleration value matches a pre-stored one, the corresponding channel is enabled and the command is displayed. The same command is played through the speaker after amplification, since the signal from the voice chip is very weak.
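As a hedged illustration of this compare-and-enable step, the C sketch below checks a conditioned X/Y sample against pre-stored acceptance windows and reports which voice-chip channel should be enabled. The window values are assumptions for illustration, not calibration data from the paper.

```c
#define N_GESTURES 8   /* eight recognizable gestures (from the text) */

/* Pre-stored acceptance window for each gesture (placeholder values). */
typedef struct {
    unsigned char x_lo, x_hi;  /* accepted X-axis range */
    unsigned char y_lo, y_hi;  /* accepted Y-axis range */
} gesture_window_t;

static const gesture_window_t windows[N_GESTURES] = {
    { 0, 31, 0, 31 },          /* gesture 0 (placeholder window) */
    /* ... remaining pre-stored windows ... */
};

/* Return the matching voice-chip channel, or -1 if no gesture fits. */
int match_channel(unsigned char x, unsigned char y)
{
    int i;
    for (i = 0; i < N_GESTURES; i++)
        if (x >= windows[i].x_lo && x <= windows[i].x_hi &&
            y >= windows[i].y_lo && y <= windows[i].y_hi)
            return i;          /* enable this channel's command */
    return -1;
}
```

In the actual hardware the comparator performs this check in analogue form; the sketch only mirrors the logic applied to the conditioned values.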
SYSTEM DESIGN MODEL
Software design module
It is possible to create the source files in a text editor such as Notepad, run the compiler on each C source file (specifying a list of controls), run the assembler on each assembler source file (specifying another list of controls), run either the library manager or the linker (again specifying a list of controls), and finally run the object-to-HEX converter to turn the linker output file into an Intel HEX file. Once that has been completed, the HEX file can be downloaded to the target hardware and debugged. Alternatively, Keil can be used to create the source files; automatically compile, link, and convert them using options set through an easy-to-use interface; and finally simulate, or perform debugging on the hardware with access to C variables and memory. Unless you have to use the tools on the command line, the choice is clear: Keil greatly simplifies the process of creating and testing an embedded application.
The use of Keil centers on projects. A project is a list of all the source files required to build a single application, together with all the tool options that specify exactly how to build the application and, if required, how it should be simulated. A project contains enough information to take a set of source files and generate exactly the binary code required for the application. Because of the high degree of flexibility required from the tools, there are many options that can be set to configure them to operate in a specific manner. It would be tedious to set these options up every time the application is built; they are therefore stored in a project file. Loading the project file into Keil tells it which source files are required, where they are, and how to configure the tools correctly, so that Keil can execute each tool with the right options. It is also possible to create new projects in Keil: source files are added to the project, the tool options are set as required, and the project is then saved to preserve the settings. When the project is reloaded and the simulator or debugger is started, all the desired windows are opened. Keil project files have their own dedicated extension.
Hardware design module
The crane controlled by hand gestures is a system that controls the crane in place of a joystick or lever. A typical wireless pointing device is IR-controlled, so the controlling distance is restricted to line of sight. To overcome this problem, the system is designed with an RF control scheme, and there is no need to hold the device: the unit is attached to the hand, the cursor on the screen changes based on hand movement, and a particular selection is made by pressing the limit switches connected to it.
BLOCK DIAGRAM
Figure: Transmitter section block diagram of the experimental set-up.
This method consists of two systems: one designed with the ARM7 controller and the other with the AT89S52 controller.
The AT89S52 controller system is placed at the hand and is interfaced with the hand-movement recognition sensor and a ZigBee module. The hand-movement recognition sensor is interfaced to the controller through a comparator. The sensor can recognize hand movement in the X and Y directions; it has four pins, one for Vcc, one for ground, and the remaining two for X and Y, which are given to the comparator. The comparator compares the values coming from the sensor with predefined values and generates the required output. When the hand position changes, the movement is recognized by the sensor and sent to the comparator, which compares it with the predefined values and passes the result to the controller. The controller then changes the direction of the cursor movement at the crane. In the same way, to make a selection (a right click), the limit switch connected to the controller is pressed; the limit-switch data is sent over ZigBee to the ZigBee module present at the crane side, from there to the microcontroller, and on to the crane through an RS-232 cable and a MAX-232 IC.
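A minimal sketch of the hand-unit firmware is given below, written in Keil C51 style for the AT89S52. The pin assignments, baud rate, and single-character command bytes ('X', 'Y', 'S') are assumptions made for illustration; the paper does not specify them.

```c
#include <reg52.h>             /* Keil C51 header for the AT89S52 */

sbit DIR_X   = P1^0;   /* comparator output: movement on X axis (assumed pin) */
sbit DIR_Y   = P1^1;   /* comparator output: movement on Y axis (assumed pin) */
sbit SEL_KEY = P1^2;   /* limit switch used for selection/click (assumed pin) */

static void uart_init(void)    /* 9600 baud assuming 11.0592 MHz crystal */
{
    TMOD |= 0x20;              /* timer 1, mode 2 (auto-reload) */
    TH1   = 0xFD;
    SCON  = 0x50;              /* 8-N-1, receiver enabled */
    TR1   = 1;
}

static void uart_send(unsigned char c)
{
    SBUF = c;
    while (!TI) ;              /* wait for transmit complete */
    TI = 0;
}

void main(void)
{
    uart_init();               /* UART feeds the ZigBee module */
    while (1) {
        if (DIR_X)    uart_send('X');  /* X-direction movement    */
        if (DIR_Y)    uart_send('Y');  /* Y-direction movement    */
        if (!SEL_KEY) uart_send('S');  /* limit switch pressed    */
    }
}
```

Polling the comparator outputs in a tight loop keeps the sketch simple; a debounced, rate-limited design would be more robust in practice.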
Figure: Receiver section block diagram of the experimental set-up.
The ARM7 system is interfaced to a ZigBee module and the crane. The data sent by the ZigBee module at the hand is received by the ZigBee module connected to the ARM7 through its antenna and passed from the TXD pin of the ZigBee module to the RXD pin of the microcontroller. The microcontroller then sends the received data to the crane through a MAX-232 IC and an RS-232 cable; the MAX-232 is needed because the RS-232 line does not use the controller's TTL logic levels.
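Below is a comparable hedged sketch for the crane-side LPC2148, relaying bytes received from the ZigBee module (UART0) out through the MAX-232 (UART1). Register names follow the standard LPC2148 header; the pin mapping, 15 MHz PCLK, and 9600-baud divisor are assumptions for illustration.

```c
#include <lpc214x.h>

/* UART0 receives from the ZigBee module; UART1 drives the MAX-232
 * that feeds the crane's RS-232 port. ~9600 baud assuming 15 MHz PCLK. */
static void uarts_init(void)
{
    PINSEL0 |= 0x00000005;     /* P0.0/P0.1 -> TXD0/RXD0 (ZigBee)  */
    PINSEL0 |= 0x00050000;     /* P0.8/P0.9 -> TXD1/RXD1 (MAX-232) */

    U0LCR = 0x83; U0DLL = 97; U0DLM = 0; U0LCR = 0x03;
    U1LCR = 0x83; U1DLL = 97; U1DLM = 0; U1LCR = 0x03;
}

static unsigned char uart0_recv(void)
{
    while (!(U0LSR & 0x01)) ;  /* wait until a byte is received */
    return U0RBR;
}

static void uart1_send(unsigned char c)
{
    while (!(U1LSR & 0x20)) ;  /* wait for empty transmit register */
    U1THR = c;
}

int main(void)
{
    uarts_init();
    while (1)
        uart1_send(uart0_recv()); /* relay hand-unit commands to crane */
}
```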
EXPERIMENTAL RESULTS
The controller programming was implemented in Embedded C. The software used in this project for simulation is Proteus (Labcenter Electronics). The simulated result for this work and the prototype model for the proposed system are shown below (figures). An advantage of this approach is the potential for mobility: the accelerometer can be used independently with an embedded processor or connected wirelessly to mobile devices such as mobile phones or PDAs. For the simulated model, the input device is a potential divider (trimmer potentiometer) instead of the MEMS accelerometer, as the accelerometer is not available in the software library; using this potentiometer, the acceleration value can be varied.
Figure: Experimental set-up implementation.
CONCLUSION
The project "Hand Motion Based Crane Control Using Wireless Communication" has been successfully designed and tested. It was developed by integrating the features of all the hardware components used. The presence of every module has been reasoned out and each has been placed carefully, contributing to the best working of the unit. Moreover, using highly advanced ICs and with the help of growing technology, the project has been successfully implemented.
REFERENCES
[1] T. H. Speeter (1992), "Transforming human hand motion for telemanipulation," Presence, vol. 1, no. 1, pp. 63-79.
[2] S. Zhou, Z. Dong, W. J. Li, and C. P. Kwong (2008), "Hand-written character recognition using MEMS motion sensing technology," in Proc. IEEE/ASME Int. Conf. Advanced Intelligent Mechatronics, pp. 1418-1423.
[3] J. K. Oh, S. J. Cho, W. C. Bang, et al. (2004), "Inertial sensor based recognition of 3-D character gestures with an ensemble of classifiers," presented at the 9th Int. Workshop on Frontiers in Handwriting Recognition.
[4] W. T. Freeman and C. D. Weissman (1995), "TV control by hand gestures," presented at the IEEE Int. Workshop on Automatic Face and Gesture Recognition, Zurich, Switzerland.
[5] L. Bretzner and T. Lindeberg (1998), "Relative orientation from extended sequences of sparse point and line correspondences using the affine trifocal tensor," in Proc. 5th Eur. Conf. Computer Vision, Berlin, Germany, Lecture Notes in Computer Science, vol. 1406, pp. 141-157, Springer-Verlag.
[6] D. Xu (2006), "A neural network approach for hand gesture recognition in virtual reality driving training system of SPG," presented at the 18th Int. Conf. Pattern Recognition.
[7] H. Je, J. Kim, and D. Kim (2007), "Hand gesture recognition to understand musical conducting action," presented at the IEEE Int. Conf. Robot & Human Interactive Communication.
[8] T. Yang and Y. Xu (1994), "Hidden Markov Model for Gesture Recognition," CMU-RI-TR-94-10, Robotics Institute, Carnegie Mellon Univ., Pittsburgh, PA.
[9] S. Zhou, Q. Shan, F. Fei, W. J. Li, C. P. Kwong, C. K. Wu, et al. (2009), "Gesture recognition for interactive controllers using MEMS motion sensors," in Proc. IEEE Int. Conf. Nano/Micro Engineered and Molecular Systems, pp. 935-940.
[10] S. Zhang, C. Yuan, and V. Zhang (2008), "Handwritten character recognition using orientation quantization based on 3-D accelerometer," presented at the 5th Annu. Int. Conf. Ubiquitous Systems.
[11] J. S. Lipscomb (1991), "A trainable gesture recognizer," Pattern Recognition, vol. 24, no. 9, pp. 895-907.
[12] W. M. Newman and R. F. Sproull (1979), Principles of Interactive Computer Graphics. New York: McGraw-Hill.
[13] D. H. Rubine (1991), "The Automatic Recognition of Gestures," Ph.D. dissertation, Computer Science Dept., Carnegie Mellon Univ., Pittsburgh, PA.
[14] K. S. Fu (1974), Syntactic Recognition in Character Recognition. New York: Academic Press, Mathematics in Science and Engineering, vol. 112.
[15] S. S. Fels and G. E. Hinton (1993), "Glove-Talk: A neural network interface between a data glove and a speech synthesizer," IEEE Trans. Neural Networks, vol. 4, no. 1, pp. 2-8.
[16] C. M. Bishop (2006), Pattern Recognition and Machine Learning, 1st ed. New York: Springer.
[17] T. Schlomer, B. Poppinga, N. Henze, and S. Boll (2008), "Gesture recognition with a Wii controller," in Proc. 2nd Int. Conf. Tangible and Embedded Interaction (TEI '08), Bonn, Germany, pp. 11-14.