- Authors : Ventylees Raj.S
- Paper ID : IJERTCONV1IS06026
- Volume & Issue : ICSEM – 2013 (Volume 1 – Issue 06)
- Published (First Online): 30-07-2018
- ISSN (Online) : 2278-0181
- Publisher Name : IJERT
- License: This work is licensed under a Creative Commons Attribution 4.0 International License
Pervasive and mobile computing based human activity recognition system
VENTYLEES RAJ.S,
M.E. Pervasive Computing Technologies, Kings College of Engg,
Punalkulam, Pudukkottai, India
ventyleesraj.pct@gmail.com
Abstract-Human Activity Recognition (HAR) based on wearable sensors is gaining increasing attention from the pervasive computing research community, especially for the development of context-aware systems. This paper presents our approach to HAR based on wearable accelerometers and supervised learning algorithms. Providing accurate and timely information on people's activities and behaviors is one of the most important tasks in pervasive computing. Innumerable applications can be envisioned, for instance, in medical, security, entertainment, and tactical scenarios. Although human activity recognition (HAR) has been an active field for more than a decade, there are still key issues that, if addressed, would bring about a substantial change in the way people interact with mobile devices. This paper presents the components of human behavior, how they might be integrated into computers, and how far we are from realizing the front end of human computing, that is, how far we are from enabling computers to understand human behavior.
Keywords: Pervasive Computing, HAR, context-aware, Human Computing
-
INTRODUCTION
In recent years, everyday devices in the environment have been turned into smart devices by means of computing technologies. The pervasive computing vision is one of environments saturated with computing and communication capability, yet gracefully integrated with human users. Human Activity Recognition (HAR) based on wearable sensors is gaining increasing attention from the pervasive computing research community, especially for the development of context-aware systems. This paper presents our approach to HAR based on wearable accelerometers and supervised learning algorithms. Human activity discovery and recognition play an important role in a wide range of applications, from assisted living to security and surveillance. One such application domain is smart environments.
Many definitions of a Human Activity Recognition (HAR) system are available in the literature. During the past decade, there has been an exceptional development of microelectronics and computer systems, enabling sensors and mobile devices with unprecedented characteristics. Their high computational power and low cost permit people to interact with smart devices as part of their daily living. That was the genesis of Pervasive Sensing, an active research area whose main intention is to extract knowledge from the data acquired by pervasive sensors [1]. In particular, the recognition of human activities has become a task of high interest within the area, especially for medical, military, and security applications. For instance, patients with diabetes or heart disease are often required to follow a well-determined exercise routine as part of their treatment [2]. Therefore, recognizing activities such as walking, running, or cycling becomes quite useful for providing feedback to the caregiver about the patient's behavior. Likewise, patients with dementia and other mental pathologies could be monitored to detect abnormal activities and thereby prevent undesirable consequences [3]. In tactical scenarios, precise information on soldiers' activities, along with their locations and health conditions, is highly beneficial for their performance and safety. Such information is also helpful to support decision making in both combat and training scenarios.
Smart homes [4]-[6] are a good example of an external sensing application. These systems are able to recognize fairly vital activities (e.g., eating, taking a shower, etc.) because they rely on data from a number of smart sensors placed in target objects with which users are supposed to interact. However, nothing can be done if the user is out of the
reach of the smart sensors, or performs activities that do not require interaction with them. Cameras have also been applied as external sensors for HAR. In fact, the recognition of activities and gestures from video sequences has been the focus of extensive research [7], [8]. This is especially suitable for security (e.g., intrusion detection) and interactive applications. The aforementioned restrictions motivate the use of wearable sensors in HAR. Most of the measured attributes are related to the user's movement (e.g., GPS), environmental variables (temperature and humidity), or physiological signals (heart rate). These measured attributes evolve naturally over time, permitting us to define the human activity recognition problem.
-
GENERAL STRUCTURE OF HAR SYSTEMS
The design of any HAR system depends on the activities to be recognized. In fact, changing the activity set immediately turns a given HAR problem into a completely different one. From the literature, seven groups of activities can be distinguished. These groups and the individual activities that belong to each group are summarized in Table 1.

Table 1. Types of activities recognized by HAR systems
Group              Activities
Ambulation         Walking, running, sitting, standing still, lying, ascending/descending stairs
Transportation     Riding a bus, cycling, driving
Phone usage        Text messaging, making a call
Daily activities   Eating, drinking, working at the PC, watching TV, reading, brushing teeth
Exercise/fitness   Rowing, lifting weights, spinning, Nordic walking, doing push-ups

3.1.1) Environmental attributes: These attributes, such as temperature, humidity, audio level, etc., are intended to provide context information describing the individual's surroundings. If the audio level and light intensity are fairly low, for instance, the subject may be sleeping. Various existing systems have utilized microphones, light sensors, humidity sensors, and thermometers, among others. Those sensors alone, though, might not provide sufficient information, as individuals can perform each activity under diverse contextual conditions in terms of weather, audio loudness, or illumination. Therefore, environmental sensors are generally accompanied by accelerometers and other sensors.
-
Acceleration: Triaxial accelerometers are perhaps the most broadly used sensors to recognize ambulation activities (e.g., walking, running, lying, etc.). Accelerometers are inexpensive, require relatively little power, and are embedded in most of today's cellular phones. Several papers have reported high recognition accuracies (92.25%, 97%, and up to 98%) under different evaluation methodologies. However, other daily activities such as eating, working at a computer, or brushing teeth are confusing from the acceleration point of view. For instance, eating might be confused with brushing teeth due to arm motion. The impact of the sensor specifications has also been analyzed; in particular, the behavior of the recognition accuracy has been studied as a function of the accelerometer sampling rate (which lies between 10 Hz and 100 Hz).
-
Physiological signals: Vital signs data (e.g., heart rate, respiration rate, skin temperature, skin conductivity, ECG, etc.) have also been considered in a few works [3]. One work proposed an activity recognition system that combines data from five triaxial accelerometers and a heart rate monitor; however, its authors concluded that the heart rate is not useful in a HAR context because, after performing physically demanding activities (e.g., running), the heart rate remains at a high level for a while, even if the individual is lying or sitting. Moreover, in order to measure physiological signals, additional sensors would be required, thereby increasing the system cost and introducing obtrusiveness. Also, these sensors generally use wireless communication, which entails higher energy expenditures.
3.2. Recognition performance
The performance of a HAR system depends on several aspects, such as the activity set, the quality of the training data, the feature extraction method, and the learning algorithm. In the first place, each set of activities brings a totally different pattern recognition problem. For example, discriminating among walking, running, and standing still turns out to be much easier than incorporating more complex activities such as watching TV, eating, ascending, and descending. Secondly, there should be a sufficient amount of training data, which should also be similar to the expected testing data. Finally, a comparative evaluation of several learning methods is desirable, as each data set exhibits distinct characteristics that can be either beneficial or detrimental for a particular method. Such interrelationships among data sets and learning methods can be very hard to analyze theoretically, which accentuates the need for an experimental study.
-
ACTIVITY RECOGNITION METHODS
As described in the previous section, to enable the recognition of human activities, raw data first have to pass through a process of feature extraction. Then, the recognition model is built from the set of feature instances by means of machine learning techniques. Once the model is trained, unseen instances (i.e., time windows) can be evaluated by the recognition model, yielding a prediction of the performed activity. Next, the most notable approaches to feature extraction and learning are covered.
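To make this train-then-predict flow concrete, the sketch below assumes the window features have already been extracted into a feature matrix; the use of scikit-learn, the synthetic data, and the decision tree as a placeholder classifier are illustrative assumptions, not details taken from this paper.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# X_train / X_test: feature vectors extracted from time windows (see the
# feature extraction sketches below); y_train: activity label of each window.
X_train, y_train = np.random.randn(300, 10), np.random.randint(0, 5, 300)
X_test = np.random.randn(20, 10)

# Build the recognition model from the training feature instances,
# then evaluate unseen instances (time windows) with it.
model = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)
predicted_activities = model.predict(X_test)
print(predicted_activities)
```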
-
Feature extraction
Human activities are performed over relatively long periods of time (on the order of seconds or minutes) compared to the sensors' sampling rate (up to 250 Hz). Besides, a single sample at a specific time instant does not provide sufficient information to describe the activity being performed. Thus, activities need to be recognized on a time-window basis rather than on a sample basis. Now, the question is: how do we compare two given time windows? It would be nearly impossible for the signals to be exactly identical, even if they come from the same subject performing the same activity. This is the main motivation for applying feature extraction (FE) methodologies to each time window: filtering relevant information and obtaining quantitative measures that allow signals to be compared.
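As an illustration of the time-window segmentation described above, the following sketch splits a raw triaxial acceleration stream into fixed-length, overlapping windows; the 50 Hz sampling rate, 2 s window length, and 50% overlap are values assumed only for the example, not parameters taken from this paper.

```python
import numpy as np

def sliding_windows(signal: np.ndarray, fs: float = 50.0,
                    win_sec: float = 2.0, overlap: float = 0.5) -> np.ndarray:
    """Split a (n_samples, n_axes) signal into overlapping time windows.

    fs      -- sampling rate in Hz (assumed 50 Hz here)
    win_sec -- window length in seconds
    overlap -- fraction of overlap between consecutive windows
    """
    win = int(win_sec * fs)                     # samples per window
    step = max(1, int(win * (1.0 - overlap)))   # hop size in samples
    windows = [signal[start:start + win]
               for start in range(0, len(signal) - win + 1, step)]
    return np.stack(windows)                    # (n_windows, win, n_axes)

# Example: 60 s of synthetic triaxial accelerometer data at 50 Hz
acc = np.random.randn(60 * 50, 3)
print(sliding_windows(acc).shape)               # -> (59, 100, 3)
```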
-
Acceleration: Acceleration signals (see Fig 3) are highly fluctuating and oscillatory, which makes it difficult to recognize the underlying patterns using their raw values. Existing HAR systems based on accelerometer data employ statistical feature extraction and, in most of the cases, either time or frequency domain features. Discrete Cosine Transform (DCT) and Principal Component Analysis (PCA) have also been applied with promising results, as well as autoregressive model coefficients.
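The sketch below computes a few representative time-domain statistics, a frequency-domain energy term, and the leading DCT coefficients for one acceleration window; this particular feature set is an illustrative selection from the categories mentioned above, not the exact feature set of any cited system.

```python
import numpy as np
from scipy.fft import dct

def window_features(window: np.ndarray) -> np.ndarray:
    """Compute simple time- and frequency-domain features for one
    (n_samples, n_axes) acceleration window."""
    feats = []
    for axis in range(window.shape[1]):
        x = window[:, axis]
        # time-domain statistics
        feats += [x.mean(), x.std(), np.abs(x).mean()]
        # frequency-domain energy (excluding the DC component)
        spectrum = np.abs(np.fft.rfft(x)) ** 2
        feats.append(spectrum[1:].sum() / len(x))
        # first few DCT coefficients as a compact spectral representation
        feats += list(dct(x, norm='ortho')[:3])
    return np.array(feats)

# Example: features for one 2 s window of synthetic triaxial data
window = np.random.randn(100, 3)
print(window_features(window).shape)   # -> (21,)
```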
-
Environment variables: Environmental attributes, along with acceleration signals, have been used to enrich context awareness. For instance, the values of air pressure and light intensity are helpful to determine whether the individual is outdoors or indoors. Also, audio signals are useful to conclude that the user is having a conversation rather than listening to music. Table 2 summarizes the feature extraction methods for environmental attributes.

Table 2. Feature extraction methods for environmental variables
Attribute            Features
Altitude             Time-domain
Audio                Speech recognizer
Barometric pressure  Time-domain
Humidity             Time-domain
Light, Temperature   Time- and frequency-domain
-
Evaluation metrics: The classification results are commonly organized in a confusion matrix, where each element (i, j) holds the number of instances of class i that were actually classified as class j. The following values can be obtained from the confusion matrix in a binary classification problem:
-
True Positives (TP): The number of positive instances that were classified as positive.
-
True Negatives (TN): The number of negative instances that were classified as negative.
-
False Positives (FP): The number of negative instances that were classified as positive.
-
False Negatives (FN): The number of positive instances that were classified as negative.
The accuracy is the most standard metric to summarize the overall classification performance for all classes, and it is defined as follows:

Accuracy = (TP + TN) / (TP + TN + FP + FN)

The precision, often referred to as positive predictive value, is the ratio of correctly classified positive instances to the total number of instances classified as positive:

Precision = TP / (TP + FP)

The recall, also called true positive rate, is the ratio of correctly classified positive instances to the total number of positive instances:

Recall = TP / (TP + FN)

The F-measure combines precision and recall in a single value:

F-measure = 2 * Precision * Recall / (Precision + Recall)
Although defined for binary classification, these metrics can be generalized for a problem with n classes. In such case, an instance could be positive or negative, according to a particular class, e.g., positives might be all instances of running while negatives would be all instances other than running.
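A minimal sketch of these definitions under that one-vs-rest view: given an n x n confusion matrix whose entry (i, j) counts instances of class i classified as class j, the metrics for any class treated as "positive" follow directly. The numpy-based helper and the example matrix are illustrative, not results from this paper.

```python
import numpy as np

def classification_metrics(conf: np.ndarray, positive: int):
    """Derive accuracy, precision, recall and F-measure for one class
    treated as 'positive', given an n x n confusion matrix whose entry
    (i, j) counts instances of class i classified as class j."""
    tp = conf[positive, positive]
    fp = conf[:, positive].sum() - tp   # other classes predicted as positive
    fn = conf[positive, :].sum() - tp   # positives predicted as other classes
    tn = conf.sum() - tp - fp - fn
    accuracy = (tp + tn) / conf.sum()
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f_measure = 2 * precision * recall / (precision + recall)
    return accuracy, precision, recall, f_measure

# Example: 3-class confusion matrix, class 0 taken as the positive class
conf = np.array([[50, 2, 3],
                 [4, 45, 1],
                 [2, 0, 48]])
print(classification_metrics(conf, positive=0))
```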
4.3. Wearable Prototype for HAR
I selected the following postures and movements for the classification task: sitting, standing, walking, standing up (transient movement), and sitting down (transient movement). These are the most common and basic activities, and their recognition has the potential to be combined with contextual information to feed context-aware systems that support collaboration. From the raw data, derived features are calculated according to the advice found in the literature review and our own experimental results. Table 3 lists the participants and their profiles.
Table 3. List of participants and profiles
Participant  Sex     Age      Height  Weight  Instances
A            Female  46 y.o.  1.62 m  67 kg   51,577
B            Female  28 y.o.  1.58 m  53 kg   49,797
C            Male    31 y.o.  1.71 m  83 kg   51,098
D            Male    75 y.o.  1.67 m  67 kg   13,161
-
Feature Selection: I used Mark Hall's feature selection algorithm to select the most valuable features. The main goal is to determine the minimum amount of data necessary to make a good prediction. It is also useful to support decisions about discarding a sensor: if a sensor's readings do not produce features that provide information gain, the sensor is discarded. As a result, 10 features were selected from 4 sensors: (1) accelerometer on the waist: mean of the acceleration module vector, and variance of pitch and roll; (2) accelerometer on the right thigh: mean of the acceleration module vector, the acceleration module vector, and variance of pitch; (3) accelerometer on the right ankle: mean of the acceleration module vector, and variance of pitch and roll; (4) accelerometer on the right upper arm: the acceleration module vector.
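Hall's method is a correlation-based feature selection (CFS); the sketch below is a rough stand-in that ranks candidate features by mutual information with the class labels and keeps those above a small threshold. The threshold value, the random data, and the function name are assumptions for illustration only.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif

def select_informative_features(X: np.ndarray, y: np.ndarray,
                                threshold: float = 1e-3):
    """Keep the columns of X whose mutual information with the labels
    exceeds a small threshold; return their indices and scores."""
    scores = mutual_info_classif(X, y, random_state=0)
    keep = np.where(scores > threshold)[0]
    return keep, scores

# Example with random data: 200 windows, 12 candidate features, 5 classes
X = np.random.randn(200, 12)
y = np.random.randint(0, 5, size=200)
idx, scores = select_informative_features(X, y)
print(idx, np.round(scores, 3))
```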
-
Experimental evaluation: The evaluation consisted of 10-fold cross-validation tests. The algorithms used in the experiments were Support Vector Machine (SVM), Voted Perceptron (one-against-all strategy), Multilayer Perceptron (backpropagation), and C4.5 decision trees. The best result was obtained with C4.5 with a confidence factor of 0.15 (accuracy of 98.2%). Later on, I used the AdaBoost ensemble method with 10 decision trees (C4.5). In simplified terms, with AdaBoost the C4.5 algorithm was trained with a different distribution of samples in each iteration, thus favoring the hardest samples. The overall recognition performance was 99.4% (weighted average) using 10-fold cross-validation. The accuracies per class were as follows: sitting 100%, sitting down 96.9%, standing 99.8%, standing up 96.9%, and walking 99.8%. The sensor on the upper arm was discarded as a result of the feature selection procedure.
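A sketch of this evaluation setup using scikit-learn: AdaBoost over 10 decision trees evaluated with 10-fold cross-validation. Note that scikit-learn provides CART rather than C4.5, so there is no direct equivalent of the 0.15 confidence factor; the random placeholder data stand in for the selected window features and activity labels.

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import StratifiedKFold, cross_val_score

# Placeholder data: in practice X holds the selected window features and
# y the activity labels (sitting, sitting down, standing, standing up, walking).
X = np.random.randn(500, 10)
y = np.random.randint(0, 5, size=500)

# AdaBoost over 10 decision trees (CART here, standing in for C4.5).
# Older scikit-learn versions name the `estimator` argument `base_estimator`.
clf = AdaBoostClassifier(
    estimator=DecisionTreeClassifier(),
    n_estimators=10,
    random_state=0,
)

cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
scores = cross_val_score(clf, X, y, cv=cv, scoring='accuracy')
print(f"10-fold CV accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```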
-
CONCLUSIONS
This paper presented the state of the art in human activity recognition based on wearable sensors. A two-level taxonomy was introduced that organizes HAR systems according to their response time and learning scheme. The fundamentals of feature extraction and machine learning were also covered, as they are important components of every HAR system. Finally, several ideas were proposed for future research to extend this field to more realistic and pervasive scenarios.
REFERENCES
[1] O. D. Lara, A. J. Perez, M. A. Labrador, and J. D. Posada, "Centinela: A human activity recognition system based on acceleration and vital sign data," Pervasive and Mobile Computing, 2011.
[2] A. J. Perez, M. A. Labrador, and S. J. Barbeau, "G-sense: A scalable architecture for global sensing and monitoring," IEEE Network, vol. 24, no. 4, pp. 57-64, 2010.
[3] J. Yin, Q. Yang, and J. Pan, "Sensor-based abnormal human-activity detection," IEEE Transactions on Knowledge and Data Engineering, vol. 20, no. 8, pp. 1082-1090, 2008.
[4] S. Ventylees Raj, "Implementation of pervasive computing based high-secure smart home system," in IEEE International Conference on Computational Intelligence and Computing Research, 2012.
[5] E. Kim, S. Helal, and D. Cook, "Human activity recognition and pattern discovery," IEEE Pervasive Computing, vol. 9, no. 1, pp. 48-53, 2010.
[6] J. Yang, J. Lee, and J. Choi, "Activity recognition based on RFID object usage for smart mobile devices," Journal of Computer Science and Technology, vol. 26, pp. 239-246, 2011.
[7] P. Turaga, R. Chellappa, V. Subrahmanian, and O. Udrea, "Machine recognition of human activities: A survey," IEEE Transactions on Circuits and Systems for Video Technology, vol. 18, no. 11, pp. 1473-1488, 2008.
[8] J. Candamo, M. Shreve, D. Goldgof, D. Sapper, and R. Kasturi, "Understanding transit scenes: A survey of human behavior-recognition algorithms," IEEE Transactions on Intelligent Transportation Systems, vol. 11, no. 1, pp. 206-224, 2010.
[9] J. Shotton, A. Fitzgibbon, M. Cook, T. Sharp, M. Finocchio, R. Moore, A. Kipman, and A. Blake, "Real-time human pose recognition in parts from single depth images," in IEEE Conference on Computer Vision and Pattern Recognition, 2011.