Detection of eating and drinking arm gestures using inertial body-worn sensors

We propose a two-stage recognition system for detecting arm gestures related to human meal intake. Information retrieved from such a system can be used for automatic dietary monitoring in the domain of behavioural medicine. We demonstrate that arm gestures can be clustered and detected using inertial sensors. To validate our method, we present experimental results covering 384 gestures from two subjects. Using HMM-based isolated discrimination, an accuracy of 94% is achieved; when spotting the gestures in continuous movement data, an accuracy of up to 87% is reached.
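The HMM-based isolated discrimination described above can be illustrated with a minimal sketch: one HMM per gesture class scores a quantised sensor sequence via the forward algorithm, and the class with the highest log-likelihood wins. The model parameters and the "eat"/"drink" labels below are hypothetical toy values, not the trained models from the paper.

```python
import numpy as np

def forward_loglik(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence under an HMM,
    computed with the scaled forward algorithm.
    pi: initial state probs, A: state transitions, B: emission probs."""
    alpha = pi * B[:, obs[0]]
    c = alpha.sum()
    loglik = np.log(c)
    alpha /= c
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        c = alpha.sum()
        loglik += np.log(c)
        alpha /= c
    return loglik

# Two hypothetical 2-state models over 3 quantised inertial-sensor symbols:
# one favouring symbol 0 ("eat"), one favouring symbol 2 ("drink").
pi = np.array([0.5, 0.5])
A = np.array([[0.8, 0.2],
              [0.2, 0.8]])
B_eat = np.array([[0.7, 0.2, 0.1],
                  [0.6, 0.3, 0.1]])
B_drink = np.array([[0.1, 0.2, 0.7],
                    [0.1, 0.3, 0.6]])

seq = [0, 0, 1, 0, 0]  # a quantised "eating-like" gesture segment
scores = {
    "eat": forward_loglik(seq, pi, A, B_eat),
    "drink": forward_loglik(seq, pi, A, B_drink),
}
label = max(scores, key=scores.get)  # classify by maximum likelihood
```

In the paper's second (spotting) stage, such class models would additionally compete against a rejection threshold so that non-gesture movement in the continuous data stream is discarded rather than forced into a class.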