Convolutional Neural Network for Human Activity Recognition and Identification

The widespread use of smartphones has triggered active research on Human Activity Recognition and enabled a range of applications. At the same time, the smartphone's central role in daily life makes it a pivotal portal to personal information, digitalized services, and the digital systems its owner can access, so smartphone security and user authentication have become problems of paramount importance. This paper presents a deep learning approach to Human Activity Recognition and Identification (HARI): a one-dimensional convolutional neural network (1D-CNN) classifies an entity's "behavioral signature", represented by features extracted from the accelerometer and gyroscope signals of the smartphone the entity carries. Our primary research question is whether these signals can reveal not only the activity being performed but also the identity of the entity performing it. In initial experiments on the MotionSense dataset, the 1D-CNN achieved an accuracy of 96.77% on activity classification and 82.37% on identity classification from just a one-second time window. These accuracies reflect design decisions that may have limited the models' success, in particular the choice of a one-second sample length. This decision is discussed along with additional research topics that may help improve accuracy in future iterations of this work.
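To make the core idea concrete, the following is a minimal sketch of how a 1D convolution extracts temporal features from a one-second sensor window. The specifics are assumptions for illustration only, not taken from the paper: a 50 Hz sampling rate (so 50 samples per window), 6 input channels (3-axis accelerometer plus 3-axis gyroscope), and 8 filters of width 5 with a ReLU activation. A full HARI model would stack such layers and end in a softmax over activity or identity classes.

```python
import numpy as np

def conv1d_relu(x, kernels, bias):
    """Valid 1D convolution over the time axis, followed by ReLU.

    x:       (in_channels, time)            -- one sensor window
    kernels: (out_channels, in_channels, k) -- learned filters
    bias:    (out_channels,)
    """
    out_ch, in_ch, k = kernels.shape
    t_out = x.shape[1] - k + 1
    out = np.zeros((out_ch, t_out))
    for o in range(out_ch):
        for t in range(t_out):
            # each filter spans all input channels and k time steps
            out[o, t] = np.sum(kernels[o] * x[:, t:t + k]) + bias[o]
    return np.maximum(out, 0.0)  # ReLU non-linearity

# Hypothetical one-second window: 6 channels sampled at an assumed 50 Hz.
rng = np.random.default_rng(0)
window = rng.standard_normal((6, 50))
kernels = rng.standard_normal((8, 6, 5)) * 0.1  # 8 filters of width 5 (illustrative)
bias = np.zeros(8)

features = conv1d_relu(window, kernels, bias)
print(features.shape)  # (8, 46): 8 feature maps, 50 - 5 + 1 time steps
```

In a trained network the filters would be learned by backpropagation rather than drawn at random; the point here is only that the filters slide along the time axis while spanning all sensor channels at once, which is what lets a 1D-CNN pick up short motion patterns shared across the accelerometer and gyroscope streams.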
