Understanding and Improving Deep Neural Network for Activity Recognition

Activity recognition has become a popular research branch in the field of pervasive computing in recent years. Numerous experiments show that sensor-based activity data are characterized by variety, volume, and velocity. Deep learning, with its various model architectures, is one of the most effective ways of working with such data. Nevertheless, there is no clear understanding of why it performs so well or how to make it more effective. To address this problem, we first applied a convolutional neural network to the Human Activity Recognition Using Smartphones dataset. Second, we visualized the features that the network extracts from the sensor-based activity data. We then analyzed these visualizations in depth, explored the relationship between activities and features, and examined how the neural network identifies activities from these features. Finally, we extracted the features most relevant to each activity and fed them into a DNN-based fusion model, which improved the classification accuracy to 96.1%. To our knowledge, this is the first work to visualize abstract sensor-based activity features. Based on the results, the method proposed in this paper promises accurate classification for sensor-based activity recognition.
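
As a rough illustration of the pipeline described above, the sketch below shows a minimal 1-D convolutional network for the UCI Human Activity Recognition Using Smartphones dataset, whose raw inertial signals form 9-channel windows of 128 time steps labeled with one of six activities. This is an assumed PyTorch implementation, not the authors' exact architecture; the layer sizes, kernel widths, and the HARConvNet name are illustrative only. The intermediate feature maps produced by the convolutional stage are the kind of abstract features that can be plotted and inspected, as the abstract proposes.

# Minimal sketch (assumption: not the authors' exact model) of a 1-D CNN
# for the UCI HAR Using Smartphones dataset: 9 input channels, 128 time
# steps per window, 6 activity classes. Hyperparameters are illustrative.
import torch
import torch.nn as nn

class HARConvNet(nn.Module):
    def __init__(self, in_channels: int = 9, num_classes: int = 6):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(in_channels, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),                  # 128 time steps -> 64
            nn.Conv1d(64, 128, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),                  # 64 time steps -> 32
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(128 * 32, 128),
            nn.ReLU(),
            nn.Linear(128, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 9, 128) raw inertial signals
        feats = self.features(x)          # feature maps; plotting individual
                                          # channels here is one way to
                                          # visualize what the network learned
        return self.classifier(feats)

if __name__ == "__main__":
    model = HARConvNet()
    dummy = torch.randn(4, 9, 128)        # a batch of 4 signal windows
    logits = model(dummy)
    print(logits.shape)                   # torch.Size([4, 6])

Pooling twice halves the 128-step window to 32 steps, so the flattened vector fed to the classifier has 128 x 32 elements; inspecting the per-channel activations in feats before flattening is a straightforward way to relate activities to learned features before passing selected ones to a downstream fusion model.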
