An analysis of eating activities for automatic food type recognition

Nowadays, chronic diseases such as type 2 diabetes and cardiovascular disease are considered among the most serious threats to a healthy life. These diseases are primarily caused by an unhealthy lifestyle, including lack of exercise, irregular meal patterns, and abuse of addictive substances such as alcohol, caffeine and nicotine. Observing our daily lives is therefore crucial for developing interventions that reduce the risk of lifestyle diseases. To manage and predict the progression of a patient's disease, objective measurement of lifestyle is essential; self-report questionnaires and interviews are limited by human error and the difficulty of administering them. In this paper, we analyse users' eating activities and their constituent sub-actions in order to develop an eating-activity recognition system based on a wristband with an embedded tri-axial accelerometer. By analysing the actions that make up eating activities, we can improve the accuracy of eating-activity recognition and also provide clues for identifying the type of food being eaten.
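
The abstract describes recognition from a wrist-worn tri-axial accelerometer but gives no implementation details. The sketch below is only a minimal illustration of the kind of pipeline such a system might use, assuming sliding-window feature extraction (per-axis mean and standard deviation plus signal magnitude area) feeding a generic classifier; the window size, step, feature set, and the RandomForestClassifier are illustrative assumptions, not the authors' method.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def window_features(acc, window=128, step=64):
    """Compute simple per-window features from tri-axial accelerometer
    samples (acc has shape n_samples x 3): per-axis mean, per-axis
    standard deviation, and signal magnitude area (SMA)."""
    feats = []
    for start in range(0, len(acc) - window + 1, step):
        w = acc[start:start + window]
        mean = w.mean(axis=0)                       # 3 features
        std = w.std(axis=0)                         # 3 features
        sma = np.abs(w).sum() / window              # 1 feature
        feats.append(np.concatenate([mean, std, [sma]]))
    return np.array(feats)

# Hypothetical demo with synthetic data standing in for real wristband
# recordings; in practice each window would be labelled with an eating
# sub-action (e.g. bringing food to the mouth) or a non-eating activity.
rng = np.random.default_rng(0)
acc = rng.normal(size=(4096, 3))                    # fake 3-axis signal
X = window_features(acc)
y = rng.integers(0, 2, size=len(X))                 # fake window labels
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
print(clf.predict(X[:5]))
```

In a real system the classifier output per window would be aggregated over time to decide whether an eating episode is taking place, and the sequence of recognised sub-actions could then serve as the clue to food type that the abstract mentions.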
