Sustainable Wearable System: Human Behavior Modeling for Life-Logging Activities Using K-Ary Tree Hashing Classifier

Human behavior modeling (HBM) is a challenging classification task for researchers seeking to develop sustainable systems that precisely monitor and record human life-logs. Although several models have been proposed in recent years, HBM remains an open problem that is only partly solved. This paper proposes a novel human behavior modeling framework based on wearable inertial sensors; the framework is composed of data acquisition, feature extraction, optimization, and classification stages. First, the inertial data are filtered with three different filters, namely Chebyshev, elliptic, and Bessel filters. Next, six different time- and frequency-domain features are extracted. Then, the probability-based incremental learning (PBIL) optimizer is applied to select the optimal feature values, and a K-ary tree hashing classifier is used to model the different human activities. The proposed model is evaluated on two benchmark datasets, DALIAC and PAMAP2, and one self-annotated dataset, IM-LifeLog, using a leave-one-out cross-validation scheme. The experimental results show that the model outperforms existing state-of-the-art methods, achieving accuracy rates of 94.23%, 94.07%, and 96.40% on the DALIAC, PAMAP2, and IM-LifeLog datasets, respectively. The proposed system can be applied in healthcare, physical activity detection, surveillance systems, and medical fitness fields.
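The filtering and feature-extraction stages outlined above can be illustrated with a minimal Python sketch using SciPy. The sampling rate, filter orders, cutoff frequency, and the particular six features shown here are assumptions for illustration only; the paper does not specify these values, and the function names below are hypothetical.

```python
# Minimal sketch of the filtering and feature-extraction stages described above.
# Filter orders, cutoffs, and feature choices are assumptions, not values from the paper.
import numpy as np
from scipy.signal import cheby1, ellip, bessel, filtfilt

FS = 50.0       # assumed sampling rate (Hz) of the wearable inertial sensor
CUTOFF = 5.0    # assumed low-pass cutoff (Hz) for body-motion signals

def low_pass(signal, kind="chebyshev"):
    """Filter a 1-D inertial signal with one of the three filter types named in the paper."""
    wn = CUTOFF / (FS / 2.0)                      # normalized cutoff frequency
    if kind == "chebyshev":
        b, a = cheby1(N=4, rp=1, Wn=wn, btype="low")
    elif kind == "elliptic":
        b, a = ellip(N=4, rp=1, rs=40, Wn=wn, btype="low")
    else:  # Bessel
        b, a = bessel(N=4, Wn=wn, btype="low")
    return filtfilt(b, a, signal)                 # zero-phase filtering

def extract_features(window):
    """Example time- and frequency-domain features computed on one signal window."""
    spectrum = np.abs(np.fft.rfft(window))
    psd = spectrum ** 2 / len(window)
    psd_norm = psd / (psd.sum() + 1e-12)
    return np.array([
        window.mean(),                            # time domain: mean
        window.std(),                             # time domain: standard deviation
        np.abs(np.diff(window)).sum(),            # time domain: waveform length
        spectrum.argmax() * FS / len(window),     # frequency domain: dominant frequency
        -(psd_norm * np.log2(psd_norm + 1e-12)).sum(),  # frequency domain: spectral entropy
        psd.sum(),                                # frequency domain: spectral energy
    ])

# Usage: filter a raw accelerometer axis, then compute features on a 2-second window.
raw = np.random.randn(int(FS) * 10)               # placeholder for recorded inertial data
clean = low_pass(raw, kind="elliptic")
features = extract_features(clean[: int(2 * FS)])
```

In a full pipeline of this kind, the resulting feature vectors would then be passed to the optimization (PBIL) and classification (K-ary tree hashing) stages described in the abstract.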
