High-level activity recognition using low-resolution wearable vision

This paper presents a system intended to serve as the enabling platform for a wearable assistant. The method observes manipulations from a wearable camera and classifies activities from roughly stabilized low-resolution images (160 × 120 pixels) using a three-level Dynamic Bayesian Network and adapted temporal templates. Our motivation is to explore robust but computationally inexpensive visual methods that perform as much activity inference as possible without resorting to more complex object or hand detectors. We describe the method and the results obtained, and motivate further work in wearable visual sensing.
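To illustrate the "temporal template" component, the following is a minimal sketch of a motion history image (MHI) update, the classic temporal-template representation, applied at the paper's 160 × 120 resolution. The decay horizon `TAU`, the motion threshold, and the function name are illustrative assumptions, not parameters reported by the paper.

```python
import numpy as np

TAU = 15  # decay horizon in frames (assumed value, not from the paper)

def update_mhi(mhi, prev_frame, curr_frame, thresh=30):
    """One motion-history-image step: pixels whose frame-to-frame
    intensity change exceeds `thresh` are stamped with TAU; all
    other pixels decay by one, floored at zero."""
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    motion = diff > thresh
    return np.where(motion, TAU, np.maximum(mhi - 1, 0))

# Synthetic 120x160 frames (the paper's resolution) with a bright patch
# appearing in the second frame to simulate motion.
prev_frame = np.zeros((120, 160), dtype=np.uint8)
curr_frame = np.zeros((120, 160), dtype=np.uint8)
curr_frame[40:60, 50:70] = 255

mhi = np.zeros((120, 160), dtype=np.int16)
mhi = update_mhi(mhi, prev_frame, curr_frame)
print(int(mhi.max()))  # moving region is stamped at TAU
```

Repeated over a clip, the resulting MHI encodes where and how recently motion occurred; shape descriptors of such templates (e.g. Hu moments) can then feed a classifier such as the DBN described above.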
