Position independent activity recognition using shallow neural architecture and empirical modeling

The goal of the SHL recognition challenge 2019 is to recognize transportation modalities in a sensor-placement-independent manner. In this paper, Team Orion benchmarks the performance of shallow neural networks on the challenge dataset in such a placement-independent setting, using 156 handcrafted temporal and spectral features per sensor, computed with parallel processing and an out-of-memory architecture. Using the scaled conjugate gradient back-propagation (SCGB) algorithm, combining classes 7 and 8, and transferring 5000 frames of bag-hips-torso data from the validation set, a shallow two-layer feed-forward network achieved 87.2% classification accuracy on the validation data with the same merged labels. A second shallow network of similar architecture achieved 71% accuracy on the validation set for classes 7 and 8 via transfer of 2500 frames. Using an empirically observed variable-based transfer of 7088 frames from the hand validation data to the training set, 77.5% accuracy was obtained on the hand validation data for classes 1 to 7/8, and 70% accuracy was obtained for classes 7 and 8 via transfer of 1809 frames from the hand validation data. These results illustrate how carefully crafted features, coupled with empirical transfer of labeled knowledge and merging of problematic classes, can tune a neural classifier to work in a new feature space.
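The sketch below (not the authors' code) illustrates the kind of pipeline the abstract describes: handcrafted temporal and spectral features computed per frame, followed by a shallow feed-forward classifier. The feature subset, frame length, sampling rate, and hidden-layer size are assumptions made for illustration, and because scikit-learn does not provide scaled conjugate gradient training, the L-BFGS solver is substituted for SCGB.

import numpy as np
from scipy.fft import rfft
from sklearn.neural_network import MLPClassifier

def frame_features(frame, fs=100.0):
    # A few temporal and spectral features for one 1-D sensor frame
    # (hypothetical subset; the paper uses 156 features per sensor).
    spectrum = np.abs(rfft(frame))
    freqs = np.fft.rfftfreq(frame.size, d=1.0 / fs)
    return np.array([
        frame.mean(),                        # temporal: mean
        frame.std(),                         # temporal: standard deviation
        np.ptp(frame),                       # temporal: peak-to-peak range
        np.sqrt(np.mean(frame ** 2)),        # temporal: root mean square
        freqs[np.argmax(spectrum[1:]) + 1],  # spectral: dominant frequency (skip DC)
        float(spectrum.sum()),               # spectral: total spectral energy
    ])

def build_feature_matrix(frames, fs=100.0):
    # Stack per-frame feature vectors into an (n_frames, n_features) matrix.
    return np.vstack([frame_features(f, fs) for f in frames])

# Shallow feed-forward classifier with one hidden layer, loosely analogous
# to the two-layer pattern-recognition network in the paper. Hidden size
# and solver are assumptions; scikit-learn has no scaled conjugate gradient,
# so L-BFGS is used here instead of SCGB.
clf = MLPClassifier(hidden_layer_sizes=(10,), solver="lbfgs", max_iter=500)

# Toy usage with synthetic frames; labels 1..7, with classes 7 and 8
# treated as a single merged class as in the paper.
rng = np.random.default_rng(0)
frames = rng.standard_normal((200, 500))   # 200 frames, 5 s at 100 Hz (assumed)
labels = rng.integers(1, 8, size=200)
X = build_feature_matrix(frames)
clf.fit(X, labels)
print("training accuracy:", clf.score(X, labels))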
