Improving the accuracy of complex activities recognition using accelerometer-embedded mobile phone classifiers

Using mobile phones for Human Activity Recognition (HAR) is very helpful for observing a user's daily habits and for early detection of health problems or accidents. Many published studies have investigated HAR with mobile phones; however, they have mainly focused on simple, single-locomotion activities, whereas in real-world situations human activities are often performed in complex ways. This study investigates the recognition of complex activities using classifiers commonly applied to human activity recognition. Data on complex activities were collected, features were extracted, and the activities were then classified. The experiments show that, for all seven classifiers, the recognition accuracy of low-level activities is higher than that of high-level activities. The highest accuracy was obtained by the IBk classifier (k-nearest neighbours, KNN), which outperformed the other classifiers at both sensor positions and across all three activity levels. Furthermore, all seven classifiers achieved higher accuracy with the armband position than with the waist position. The study concludes that these classifiers recognize low-level (simple) activities well, but their performance degrades as activity complexity increases, so classifiers better suited to complex activities are needed.
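The pipeline the abstract describes (collect accelerometer data, extract features over windows, classify) can be illustrated with a minimal sketch. The window length, overlap, feature set, and the use of scikit-learn's KNeighborsClassifier as a stand-in for Weka's IBk are illustrative assumptions, not the study's exact configuration; the synthetic data is purely for demonstration.

```python
# Minimal sketch of a HAR pipeline: window the raw tri-axial accelerometer
# stream, extract simple time-domain features, and classify with k-nearest
# neighbours (analogous to Weka's IBk). Window size, overlap, and features
# are assumptions for illustration only.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

def window_features(signal, labels, win=128, step=64):
    """Slide a fixed-size window over the (N, 3) accelerometer signal and
    compute per-axis mean, standard deviation, and signal magnitude area."""
    X, y = [], []
    for start in range(0, len(signal) - win + 1, step):
        seg = signal[start:start + win]            # (win, 3) slice: x, y, z
        feats = np.concatenate([
            seg.mean(axis=0),                      # per-axis mean
            seg.std(axis=0),                       # per-axis standard deviation
            [np.abs(seg).sum() / win],             # signal magnitude area
        ])
        X.append(feats)
        # Label each window by the majority label of its samples.
        vals, counts = np.unique(labels[start:start + win], return_counts=True)
        y.append(vals[np.argmax(counts)])
    return np.array(X), np.array(y)

# Hypothetical data: 10 minutes of 50 Hz samples with three activity classes.
rng = np.random.default_rng(0)
acc = rng.normal(size=(30_000, 3))
lab = rng.integers(0, 3, size=30_000)

X, y = window_features(acc, lab)
knn = KNeighborsClassifier(n_neighbors=1)          # IBk's default is k = 1
print("CV accuracy: %.3f" % cross_val_score(knn, X, y, cv=5).mean())
```

In practice, the synthetic arrays above would be replaced with labelled recordings from the phone's accelerometer at each sensor position, and the same cross-validation could be repeated for each of the compared classifiers.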
