On the Homogenization of Heterogeneous Inertial-Based Databases for Human Activity Recognition

In recent years, supervised machine learning techniques have been widely employed for automatic Human Activity Recognition (HAR) using inertial sensors, such as accelerometers and gyroscopes. HAR has applications in several domains, for example healthcare, sport, and entertainment. Machine learning scientists have made available to the community plenty of labeled databases for benchmarking that, unfortunately, are not consistent, both syntactically (e.g., different sampling frequencies) and semantically (e.g., labels with different meanings). Due to this inconsistency, scientists commonly evaluate their progress on each database separately, that is, training and testing on the same database. Coherent merging of existing databases would enable: 1) evaluation of the generalization capabilities of methods across databases; and 2) the use of deep learning techniques that, unlike traditional ones, require much larger amounts of labeled data for training. Moreover, the growing daily use of wearable devices will produce large amounts of inertial data that, if not correctly labeled, cannot be efficiently exploited for the study of automatic HAR. In this paper we propose a semi-automatic procedure to coherently merge existing databases based on signal and word similarity. Preliminary experiments demonstrate the effectiveness of the proposed procedure.
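
The following is a minimal sketch, not the authors' implementation, of the two homogenization steps the abstract describes: syntactic alignment by resampling inertial signals to a common frequency, and semantic alignment by proposing activity-label matches through word similarity for manual confirmation. The library choices (NumPy, SciPy, difflib) and all function names and thresholds are illustrative assumptions.

```python
# Sketch of database homogenization: signal resampling + label word similarity.
# All names and thresholds are assumptions, not the paper's actual procedure.
from difflib import SequenceMatcher

import numpy as np
from scipy.signal import resample


def resample_signal(signal, source_hz, target_hz):
    """Resample a 1-D inertial signal from source_hz to a common target_hz."""
    n_target = int(round(len(signal) * target_hz / source_hz))
    return resample(signal, n_target)


def label_similarity(label_a, label_b):
    """Word-level similarity between two activity labels, in [0, 1]."""
    return SequenceMatcher(None, label_a.lower(), label_b.lower()).ratio()


def propose_label_mapping(labels_a, labels_b, threshold=0.6):
    """Suggest cross-database label pairs to be confirmed by a human (semi-automatic step)."""
    proposals = []
    for la in labels_a:
        best = max(labels_b, key=lambda lb: label_similarity(la, lb))
        score = label_similarity(la, best)
        if score >= threshold:
            proposals.append((la, best, round(score, 2)))
    return proposals


# Example: align a 100 Hz accelerometer trace to 50 Hz and match label sets.
trace_100hz = np.random.randn(1000)
trace_50hz = resample_signal(trace_100hz, source_hz=100, target_hz=50)
print(propose_label_mapping(["walking", "going upstairs"],
                            ["walk", "walking upstairs", "sitting"]))
```

In this sketch the label mapping is only proposed, not applied automatically, which mirrors the semi-automatic nature of the procedure described in the abstract.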
