Automatic Annotation for Human Activity Recognition in Free Living Using a Smartphone

Data annotation is a time-consuming process that poses a major limitation to the development of Human Activity Recognition (HAR) systems. Supervised Machine Learning (ML) approaches require large amounts of labeled data, especially online and personalized approaches, which require user-specific datasets to be labeled. The availability of such datasets can help address common problems of smartphone-based HAR, such as inter-person variability. In this work, we (i) present an automatic labeling method that facilitates the collection of labeled datasets in free-living conditions using a smartphone, and (ii) investigate the robustness of common supervised classification approaches in the presence of label noise. We evaluated the results on a dataset consisting of 38 days of manually labeled data collected in free living. A comparison between the manually and the automatically labeled ground truth showed that labels could be obtained automatically with an average precision of 80–85%. The results also show that a supervised approach trained with automatically generated labels achieved an 84% f-score (using Neural Networks and Random Forests); however, label noise could lower the f-score to 64–74%, depending on the classification approach (Nearest Centroid and Multi-Class Support Vector Machine).
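The experiment the abstract describes, training a supervised classifier on automatically generated (hence noisy) labels and measuring the resulting f-score drop, can be sketched as follows. This is a minimal illustration, not the paper's pipeline: the dataset is synthetic (a stand-in for smartphone sensor features), and the noise rates and model parameters are assumptions chosen for demonstration.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for smartphone-derived activity features (4 activity classes);
# the real work used 38 days of free-living data instead.
X, y = make_classification(n_samples=2000, n_features=20, n_informative=10,
                           n_classes=4, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

def f1_with_label_noise(noise_rate):
    """Flip a fraction of training labels to a random other class
    (simulating imperfect automatic annotation), then train and score."""
    y_noisy = y_tr.copy()
    flip = rng.random(len(y_noisy)) < noise_rate
    # Shift flipped labels by 1..3 (mod 4) so they land on a different class.
    y_noisy[flip] = (y_noisy[flip] + rng.integers(1, 4, flip.sum())) % 4
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X_tr, y_noisy)                     # train on noisy labels
    y_pred = clf.predict(X_te)
    return f1_score(y_te, y_pred, average='weighted')  # score against clean labels

for rate in (0.0, 0.15, 0.30):
    print(f"label-noise rate {rate:.0%}: weighted f1 = {f1_with_label_noise(rate):.3f}")
```

A Random Forest tends to degrade gracefully under moderate symmetric label noise, which is consistent with the abstract's finding that some classifiers (Nearest Centroid, Multi-Class SVM) are hit harder than others.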
