A Hierarchical Approach in Food and Drink Intake Recognition Using Wearable Inertial Sensors

Despite the increasing attention given to inertial sensors for Human Activity Recognition (HAR), efforts are principally focused on fitness applications in which quasi-periodic activities such as walking or running are studied. In contrast, activities such as eating or drinking cannot be considered periodic or quasi-periodic; instead, they consist of sporadically occurring gestures within continuous data streams. This paper presents an approach to gesture recognition for an Ambient Assisted Living (AAL) environment, focusing on food and drink intake gestures. First, waist-worn tri-axial accelerometer data are used to develop a low-computational-cost model that recognizes whether a person is in a moving, sitting, or standing state. With this information, data from a wrist-worn tri-axial Micro-Electro-Mechanical System (MEMS) are used to recognize a set of similar eating and drinking gestures. Promising preliminary results show that states can be recognized with 100% classification accuracy using the low-computational-cost model on a reduced 4-dimensional feature vector. Additionally, the recognition rate achieved for eating and drinking gestures was above 99%. Altogether, these results suggest that it is possible to develop a continuous monitoring system based on a bi-nodal inertial unit. This work is part of a larger project that aims to develop a continuous monitoring system for detecting self-neglect in older adults living independently.
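The hierarchical pipeline described above can be illustrated with a minimal sketch. The Python code below is an assumption-laden illustration, not the authors' implementation: it assumes scikit-learn decision trees for both stages and a hypothetical choice of four waist-worn features (per-axis means plus mean signal magnitude), since the abstract does not specify which features or classifiers were used.

    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    def waist_features(window):
        # Reduce an (N, 3) waist accelerometer window to a compact vector.
        # Hypothetical 4-D choice: per-axis means plus mean signal magnitude.
        magnitude = np.linalg.norm(window, axis=1)
        return np.array([window[:, 0].mean(),
                         window[:, 1].mean(),
                         window[:, 2].mean(),
                         magnitude.mean()])

    # Stage 1: low-cost state model (moving / sitting / standing),
    # trained beforehand on labelled 4-D waist feature vectors.
    state_clf = DecisionTreeClassifier(max_depth=3)

    # Stage 2: gesture model (eating / drinking / other) on wrist MEMS
    # features, consulted only when stage 1 reports a state in which
    # intake gestures are plausible. Must also be fitted before use.
    gesture_clf = DecisionTreeClassifier()

    def recognize(waist_window, wrist_feature_vector):
        state = state_clf.predict([waist_features(waist_window)])[0]
        if state in ("sitting", "standing"):
            gesture = gesture_clf.predict([wrist_feature_vector])[0]
            return state, gesture
        return state, None

Gating the wrist-worn gesture classifier on the waist-worn state output is what keeps the second, more detailed stage out of the loop while the wearer is moving, which is the intuition behind using a bi-nodal unit for continuous monitoring.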
