An Efficient and Lightweight Deep Learning Model for Human Activity Recognition Using Smartphones

Traditional pattern recognition approaches to human activity recognition (HAR) remain popular, but they depend heavily on manual feature extraction, which limits the generality of the resulting models. HAR classifies sequences of accelerometer data recorded by smartphones into well-known movements. With the success and wide adoption of deep learning for activity recognition, such techniques are now widely deployed on wearable devices and smartphones. In this paper, convolutional layers are combined with a long short-term memory (LSTM) network in a deep learning model for HAR. The proposed model extracts features automatically and classifies them with a small set of model parameters. LSTM is a variant of the recurrent neural network (RNN) that is well suited to processing temporal sequences. The proposed architecture is evaluated on the UCI-HAR dataset, which records several human activities using a Samsung Galaxy S II smartphone. The CNN and LSTM models are connected in series: the CNN is applied to each input window, and its output is fed to the LSTM as one time step. The number of filter maps, which governs how different portions of the input are mapped, is the most important hyperparameter. The observations are transformed using Gaussian standardization. The proposed CNN-LSTM is an efficient and lightweight model that shows higher robustness and better activity-detection capability than traditional algorithms, achieving an accuracy of 97.89%.
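The preprocessing pipeline implied by the abstract, segmenting the raw accelerometer signal into fixed-length windows that serve as the LSTM's time steps, then applying Gaussian standardization (zero mean, unit variance) to each window, can be sketched as follows. This is a minimal illustration, not the authors' implementation; the function names, window size, and step are assumptions for the example.

```python
import math

def gaussian_standardize(window):
    """Gaussian standardization: rescale one window to zero mean and unit variance."""
    mean = sum(window) / len(window)
    var = sum((x - mean) ** 2 for x in window) / len(window)
    std = math.sqrt(var) or 1.0  # guard against a constant window
    return [(x - mean) / std for x in window]

def segment(signal, window_size, step):
    """Slice a 1-D accelerometer signal into fixed-length, possibly
    overlapping windows; each window becomes one time step for the LSTM."""
    return [signal[i:i + window_size]
            for i in range(0, len(signal) - window_size + 1, step)]

# Example: 10 samples, windows of 4 with 50% overlap, then standardize each.
windows = [gaussian_standardize(w) for w in segment(list(range(10)), 4, 2)]
```

In the full model, the CNN would be applied to every standardized window independently (e.g. via a time-distributed wrapper), and the resulting per-window feature vectors would form the input sequence for the LSTM classifier.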
