A Multimodal Benchmark Tool for Automated Eating Behaviour Recognition

In this paper, we present iEatSet (iCareNet Multimodal Eating Behaviour DataSet), a multimodal eating behaviour dataset. iEatSet is intended to serve as an algorithm benchmark and to facilitate research in automatic dietary monitoring, eating recognition, and activity recognition in general. It provides synchronised multimodal data streams, including multi-camera vision, inertial motion sensor data, and associated ground-truth labels, recorded from 15 participants over 5 meals in a natural restaurant environment. The recordings cover food selection and consumption without a scripted protocol, yielding naturalistic behaviour. Validated methods and tools for recognizing people's eating behaviour could advance research and coaching in applications related to nutrition and dieting. Dietary analysis is currently performed manually and is tedious, so an automated analysis tool is desirable. State-of-the-art activity recognition tools must first be trained before they can recognize activities. iEatSet is therefore particularly useful for benchmarking supervised recognition algorithms and can serve as a reference for evaluating unsupervised algorithms.
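
The abstract does not specify a file format or schema for the synchronised streams and ground-truth labels, so the following is only a minimal sketch of how such a recording could be turned into labelled windows for a supervised benchmark. The window length, the feature choice, and the interval-based label representation are assumptions for illustration, not part of the dataset description.

```python
import numpy as np

# Hypothetical layout: one inertial recording per participant/meal as an
# (N, 4) array of [timestamp_s, ax, ay, az], plus ground-truth intervals
# as (label, start_s, end_s) tuples. Field names and formats are assumed,
# not taken from the paper.

WINDOW_S = 2.0   # window length in seconds (arbitrary choice)
STEP_S = 1.0     # hop between consecutive windows

def label_for(t_start, t_end, intervals):
    """Return the ground-truth label covering most of the window, or None."""
    best, best_overlap = None, 0.0
    for label, s, e in intervals:
        overlap = max(0.0, min(t_end, e) - max(t_start, s))
        if overlap > best_overlap:
            best, best_overlap = label, overlap
    return best

def windows(imu, intervals):
    """Yield (feature_vector, label) pairs from one synchronised recording."""
    t0, t1 = imu[0, 0], imu[-1, 0]
    t = t0
    while t + WINDOW_S <= t1:
        mask = (imu[:, 0] >= t) & (imu[:, 0] < t + WINDOW_S)
        seg = imu[mask, 1:]
        if len(seg):
            # Simple per-axis mean/std features; a real benchmark would use
            # richer features or the raw windows.
            feats = np.concatenate([seg.mean(axis=0), seg.std(axis=0)])
            yield feats, label_for(t, t + WINDOW_S, intervals)
        t += STEP_S

if __name__ == "__main__":
    # Synthetic stand-in for one recording (100 Hz for 10 s).
    ts = np.arange(0, 10, 0.01)
    imu = np.column_stack([ts, np.random.randn(len(ts), 3)])
    intervals = [("eating", 2.0, 6.0), ("idle", 6.0, 10.0)]
    X, y = zip(*windows(imu, intervals))
    print(len(X), "windows, labels:", set(y))
```

The resulting (features, label) pairs could feed any standard supervised classifier; unsupervised methods could instead cluster the unlabelled windows and use the ground-truth intervals only as a reference for evaluation.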
