Self-learning Based Motion Recognition Using Sensors Embedded in a Smartphone for Mobile Healthcare

Human motion recognition using wearable sensors has recently become a popular topic in mobile health. However, most previous studies have not handled unlabeled motion recognition well because of the limited learning ability of their systems. In this paper, we propose a self-learning based motion recognition scheme for mobile healthcare, in which a patient only needs to carry an ordinary smartphone equipped with common inertial sensors; both labeled and unlabeled motion types can then be recognized by a self-learning data analysis scheme. Experimental results demonstrate that the proposed self-learning scheme outperforms some existing ones, with an average recognition accuracy above 80%.
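The abstract only summarizes the approach, so as an illustration of the general self-training idea it refers to (pseudo-labeling high-confidence unlabeled sensor windows and retraining), the Python sketch below may help. It is not the authors' exact scheme: the windowed-feature input, the random-forest base classifier, the 0.9 confidence threshold, and the `self_train` helper are all assumptions made for illustration.

```python
# Minimal sketch of a self-training loop for smartphone motion recognition.
# Assumptions (not from the paper): inputs are windowed accelerometer/gyroscope
# feature vectors, the base classifier is a random forest, and predictions with
# probability >= 0.9 are treated as reliable pseudo-labels.

import numpy as np
from sklearn.ensemble import RandomForestClassifier


def self_train(X_labeled, y_labeled, X_unlabeled, threshold=0.9, max_rounds=10):
    """Iteratively grow the labeled pool with high-confidence predictions."""
    X_l, y_l = np.asarray(X_labeled), np.asarray(y_labeled)
    X_u = np.asarray(X_unlabeled)
    clf = RandomForestClassifier(n_estimators=100, random_state=0)

    for _ in range(max_rounds):
        if len(X_u) == 0:
            break
        clf.fit(X_l, y_l)
        proba = clf.predict_proba(X_u)
        conf = proba.max(axis=1)
        confident = conf >= threshold
        if not confident.any():
            break  # nothing left that the model can self-label reliably
        pseudo_labels = clf.classes_[proba[confident].argmax(axis=1)]
        # Move confidently classified windows into the labeled pool.
        X_l = np.vstack([X_l, X_u[confident]])
        y_l = np.concatenate([y_l, pseudo_labels])
        X_u = X_u[~confident]

    clf.fit(X_l, y_l)
    # Windows left in X_u resist confident labeling and could be routed to a
    # novelty-detection step to discover previously unseen motion types.
    return clf, X_u
```

In a setup like this, the leftover low-confidence windows are a natural place to look for new, unlabeled motion types, which matches the paper's goal of recognizing motions beyond the initially labeled set.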
