ADL Classification Based on Autocorrelation Function of Inertial Signals

Recognition of human activities is one of the most promising research areas in artificial intelligence, driven by advances in sensing technology and the growing demand for mobile, context-aware, real-time applications. In this paper, we use a smartwatch to collect sensory data for 14 activities of daily living (ADLs). We collect three types of sensory signals: acceleration, angular velocity, and rotation displacement, each a tri-axial signal. From each signal we compute the autocorrelation function up to a certain lag and take the computed values as representative features of that signal. We then feed these features to a random-forest-based classifier for training and prediction. We experiment with different combinations of sensory data; the joint use of acceleration and angular velocity achieves the best performance, with prediction accuracy reaching about 80% over the full set of 14 activities.
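
To make the feature pipeline concrete, below is a minimal sketch of autocorrelation features feeding a random forest. It assumes fixed-length windows of synchronized accelerometer and gyroscope samples; the window length (128), maximum lag (50), sensor layout (3 accelerometer + 3 gyroscope axes), and scikit-learn settings are illustrative assumptions, not the paper's exact configuration.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    def autocorr_features(window, max_lag=50):
        # Sample autocorrelation per axis, for lags k = 1..max_lag:
        #   r(k) = sum_t (x_t - m)(x_{t+k} - m) / sum_t (x_t - m)^2
        x = window - window.mean(axis=0)      # (n_samples, n_axes), mean-centred
        denom = (x ** 2).sum(axis=0)          # per-axis normalizer
        feats = [(x[:-k] * x[k:]).sum(axis=0) / denom
                 for k in range(1, max_lag + 1)]
        return np.concatenate(feats)          # length: max_lag * n_axes

    # Placeholder data standing in for segmented sensor windows and labels:
    # each window is (128 samples, 6 axes); labels index the 14 ADL classes.
    rng = np.random.default_rng(0)
    windows = [rng.standard_normal((128, 6)) for _ in range(40)]
    labels = rng.integers(0, 14, size=40)

    X = np.vstack([autocorr_features(w) for w in windows])
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X, labels)

One convenient property of this representation is that the feature vector length depends only on the chosen maximum lag and the number of axes, not on the window length, so windows of different durations map to a fixed-size input for the classifier.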
