Deep Learning for Smartphone-Based Human Activity Recognition Using Multi-sensor Fusion

In the field of ubiquitous computing, machines must be aware of the current context to communicate with humans in an anticipatory manner. This awareness enables human-centric applications whose primary objective is to improve the Quality of Life (QoL) of their users. One important type of context information for such applications is the user's current activity, which can be derived from environmental and wearable sensors. Owing to its processing capabilities and the number of embedded sensors, the smartphone shows the most promise among existing technologies for human activity recognition (HAR) research. While machine learning-based solutions have been successful in past HAR studies, several of their design challenges can be readily resolved with deep learning. In this paper, we investigated Convolutional Neural Networks (CNNs) and Long Short-Term Memory (LSTM) networks for dealing with common challenges in smartphone-based HAR, such as device-location dependency, subject dependency, and manual feature extraction. We showed that the CNN model accomplished location- and subject-independent recognition with overall accuracies of 98.38% and 90.61%, respectively. The LSTM model also performed location-independent recognition with an accuracy of 97.17%, but achieved a subject-independent recognition accuracy of only 80.02%. Finally, optimal network performance was achieved by tuning the design hyperparameters through Bayesian optimization using Gaussian processes.
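
The abstract gives no implementation details, so the sketch below is only a minimal illustration of the kind of model described: a 1D CNN over fixed-length windows of fused accelerometer and gyroscope signals. The window length (128 samples), channel count (6), number of activity classes (6), and all layer sizes are assumptions made for this example, not the architecture reported in the paper.

# Minimal 1D-CNN sketch for smartphone HAR on fused motion-sensor windows.
# All shapes and layer sizes below are illustrative assumptions.
import numpy as np
from tensorflow.keras import layers, models

WINDOW_LEN = 128   # samples per window (assumed)
N_CHANNELS = 6     # tri-axial accelerometer + tri-axial gyroscope
N_CLASSES = 6      # number of activity labels (assumed)

def build_cnn():
    model = models.Sequential([
        layers.Input(shape=(WINDOW_LEN, N_CHANNELS)),
        layers.Conv1D(64, kernel_size=5, activation="relu"),
        layers.MaxPooling1D(pool_size=2),
        layers.Conv1D(128, kernel_size=5, activation="relu"),
        layers.GlobalAveragePooling1D(),
        layers.Dense(64, activation="relu"),
        layers.Dropout(0.5),   # regularization to reduce subject-specific overfitting
        layers.Dense(N_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Toy usage: random arrays stand in for segmented, fused sensor windows.
X = np.random.randn(32, WINDOW_LEN, N_CHANNELS).astype("float32")
y = np.random.randint(0, N_CLASSES, size=32)
build_cnn().fit(X, y, epochs=1, verbose=0)

An LSTM variant of the same sketch would replace the convolutional and pooling layers with one or two layers.LSTM(...) units operating over the same windows, since both networks consume identically segmented sensor sequences.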

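For the hyperparameter tuning step, one common realization of Bayesian optimization with a Gaussian-process surrogate is scikit-optimize's gp_minimize. The sketch below is an assumed setup: the search space is a placeholder, and the synthetic objective stands in for "train the network with these hyperparameters and return its validation error"; it is not the paper's actual configuration.

# Hypothetical Bayesian-optimization sketch using a Gaussian-process
# surrogate via scikit-optimize (pip install scikit-optimize).
from skopt import gp_minimize
from skopt.space import Integer, Real

# Assumed search space: typical CNN design hyperparameters.
space = [
    Integer(32, 256, name="filters"),                  # conv filters per layer
    Integer(3, 9, name="kernel_size"),                 # temporal kernel width
    Real(1e-4, 1e-2, prior="log-uniform", name="lr"),  # learning rate
]

def objective(params):
    filters, kernel_size, lr = params
    # Stand-in for training the HAR network and returning validation error;
    # a synthetic function keeps the sketch self-contained and runnable.
    return (0.1 * abs(filters - 128) / 128
            + 0.01 * abs(kernel_size - 5)
            + 10 * abs(lr - 1e-3))

result = gp_minimize(objective, space, n_calls=30, random_state=0)
print("best hyperparameters:", result.x, "objective:", result.fun)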