A motion recognition method by constancy-decision

Many context-aware systems using accelerometers have been proposed. The contexts they recognize fall into postures (e.g. sitting), behaviors (e.g. walking), and gestures (e.g. a punch). Postures and behaviors are states that last for a certain length of time, whereas gestures are sporadic, one-off actions, and finding gestures buried in other contexts has been a challenging task. In this paper, we propose a method that classifies contexts into postures, behaviors, and gestures using the autocorrelation of the acceleration values and then recognizes each class with an appropriate method. We evaluated the recall and precision of recognizing seven kinds of gestures performed during five kinds of behaviors: the conventional method gave 0.75 and 0.59, whereas our method gave 0.93 and 0.93. Our system thus enables a user to provide input by gesturing even while he or she is performing a behavior.
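The abstract does not give concrete window sizes or thresholds, so the following is only a minimal sketch of how an autocorrelation-based constancy decision might look in practice; the function names and the parameters var_threshold, ac_threshold, and lag are illustrative assumptions, not the authors' values.

import numpy as np

def autocorrelation(window, lag):
    # Normalized autocorrelation of a 1-D window of acceleration magnitudes.
    x = window - window.mean()
    denom = np.dot(x, x)
    if denom == 0.0:
        return 1.0  # a perfectly constant signal is maximally self-similar
    return np.dot(x[:-lag], x[lag:]) / denom

def classify_window(window, var_threshold=0.05, ac_threshold=0.5, lag=10):
    # Constancy decision on one sliding window (thresholds are assumed):
    #   low variance         -> posture  (quasi-static, e.g. sitting)
    #   high autocorrelation -> behavior (periodic, e.g. walking)
    #   otherwise            -> gesture  (sporadic, e.g. a punch)
    if np.var(window) < var_threshold:
        return "posture"
    if autocorrelation(window, lag) > ac_threshold:
        return "behavior"
    return "gesture"

Under this reading, windows labeled as gestures would be handed to a dedicated gesture recognizer (for example, template matching with dynamic time warping), while posture and behavior windows go to a conventional activity classifier; this is one plausible interpretation of recognizing each class "with an appropriate method".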
