An LSTM-based Descriptor for Human Activity Recognition using IMU Sensors

In this article, we present a public human activity dataset called ‘HAD-AW’. It consists of four types of 3D sensory signals (acceleration, angular velocity, rotation displacement, and gravity) for 31 activities of daily living (ADL), measured with a wearable smartwatch, and is intended as a benchmark for algorithm comparison. We succinctly survey several existing datasets and compare them to ‘HAD-AW’, with the goal of making the dataset usable and extensible by others. We introduce an ADL recognition framework that applies several pre-processing steps to extract statistical and physical features, which we call AMED. These features are then classified with an LSTM recurrent network, and the proposed approach is compared to a random-forest algorithm. Our experiments show that the joint use of all four sensors achieves the best prediction accuracy, reaching 95.3% over all activities, while saving 88% to 98% of the training and testing time compared to the random-forest classifier. To show the effectiveness of the proposed method, we also evaluate it on four other public datasets: CMU-MMAC, USC-HAD, REALDISP, and the Gomaa dataset.
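The abstract does not specify the AMED features themselves, but the general pipeline (windowing raw 3D IMU signals, computing statistical descriptors per axis, then feeding them to a classifier) can be sketched as below. This is a minimal illustration, not the authors' implementation: the window size and the particular descriptors (mean, standard deviation, lag-1 autocorrelation) are assumptions chosen as common choices for inertial signals.

```python
import numpy as np

def extract_features(window: np.ndarray) -> np.ndarray:
    """Compute simple statistical descriptors for one window of
    shape (samples, 3), i.e. the x/y/z axes of a single IMU sensor.
    These are illustrative stand-ins for the paper's AMED features."""
    feats = []
    for axis in range(window.shape[1]):
        sig = window[:, axis]
        mean = sig.mean()
        std = sig.std()
        # Lag-1 autocorrelation, a common descriptor for inertial signals.
        centered = sig - mean
        denom = np.dot(centered, centered)
        ac1 = np.dot(centered[:-1], centered[1:]) / denom if denom > 0 else 0.0
        feats.extend([mean, std, ac1])
    return np.asarray(feats)

# Example: a 128-sample window of synthetic accelerometer data
# (window length is an assumption, not taken from the paper).
rng = np.random.default_rng(0)
window = rng.normal(size=(128, 3))
features = extract_features(window)
print(features.shape)  # 3 descriptors x 3 axes -> (9,)
```

In the paper's framework, feature vectors like this (computed for each of the four sensors) would form the input sequence to the LSTM classifier; concatenating the features of all four sensors corresponds to the joint-sensor configuration that achieved the best accuracy.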
