User independent, multi-modal spotting of subtle arm actions with minimal training data

We address a specific, particularly difficult class of activity recognition problems defined by (1) subtle, hardly discriminative hand motions such as a short press or pull, (2) a large, ill-defined NULL class (any other hand motion a person may perform in everyday life), and (3) the difficulty of collecting training data that generalizes well from one user to many. In essence, we aim to spot activities such as opening a cupboard, pressing a button, or taking an object from a shelf within a large data stream of typical everyday activity. We focus on body-worn sensors without instrumenting objects, we exploit available infrastructure information, and we use a one-to-many-users training scheme to minimize training effort. We demonstrate that a state-of-the-art approach based on motion sensors performs poorly under such conditions (an Equal Error Rate of 18% in our experiments). We present and evaluate a new multi-modal system, based on a combination of indoor location with a wrist-mounted proximity sensor, camera, and inertial sensor, that raises the EER to 79%.
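The abstract does not spell out how the Equal Error Rate is computed; since higher values indicate better spotting here (18% for the poor baseline vs. 79% for the multi-modal system), one plausible reading is the precision–recall break-even point, i.e. the common value of precision and recall at the threshold where the two coincide. A minimal, hypothetical sketch of that reading (the function name and brute-force threshold sweep are ours, not the paper's evaluation code):

```python
def precision_recall_breakeven(scores, labels):
    """Precision-recall break-even point: the (approximately) common value
    of precision and recall at the threshold where they coincide.
    Higher is better. scores: detector confidences; labels: 1 = event, 0 = NULL.
    """
    pos = sum(labels)  # number of true event instances
    best_gap, best_val = float("inf"), 0.0
    # Sweep every observed score as a candidate decision threshold.
    for t in sorted(set(scores)):
        tp = sum(1 for s, l in zip(scores, labels) if s >= t and l == 1)
        fp = sum(1 for s, l in zip(scores, labels) if s >= t and l == 0)
        if tp + fp == 0:
            continue
        precision = tp / (tp + fp)
        recall = tp / pos
        gap = abs(precision - recall)
        if gap < best_gap:  # keep the threshold where precision and recall meet
            best_gap, best_val = gap, (precision + recall) / 2
    return best_val

# A perfectly separated toy detector reaches a break-even value of 1.0.
print(precision_recall_breakeven([0.9, 0.8, 0.7, 0.3, 0.2, 0.1],
                                 [1, 1, 1, 0, 0, 0]))
```

Under this reading, the reported jump from 18% to 79% corresponds to the break-even point moving much closer to the ideal value of 1.0.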
