Object interaction detection using hand posture cues in an office setting

Activity recognition plays a key role in providing information for context-aware applications. When attempting to model activities, some researchers have looked toward Activity Theory, which holds that activities have objectives and are accomplished through interactions with tools and objects. The goal of this paper is to determine whether hand posture can be used as a cue to identify the types of interactions a user has with objects in a desk/office environment. Furthermore, we wish to determine whether hand posture is user-independent, that is, consistent across users who interact with the same objects in a natural manner. Our experiments indicate that (a) hand posture can be used to determine object interaction, with accuracy rates around 97%, and (b) hand posture depends on the individual user when users are allowed to interact with objects as they naturally would.
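
The abstract does not describe the features or classifier used, so the following is only a minimal sketch of the kind of posture-based interaction classification it implies. The glove joint-angle features, synthetic data, class labels, and PCA-plus-linear-discriminant pipeline below are illustrative assumptions, not the authors' method.

# Hypothetical sketch: classifying object interactions from hand-posture
# features (e.g., data-glove joint angles). Feature dimensions, class labels,
# and classifier choice are assumptions for illustration only.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-in for glove data: 300 samples of 18 joint-angle readings,
# each labelled with one of 4 object-interaction classes (e.g., mouse, pen, phone, mug).
n_samples, n_joints, n_classes = 300, 18, 4
labels = rng.integers(0, n_classes, size=n_samples)
class_means = rng.normal(0.0, 1.0, size=(n_classes, n_joints))
features = class_means[labels] + rng.normal(0.0, 0.3, size=(n_samples, n_joints))

# Standardize, reduce dimensionality with PCA, then apply a linear discriminant classifier.
model = make_pipeline(StandardScaler(), PCA(n_components=8),
                      LinearDiscriminantAnalysis())

# A user-dependent evaluation would cross-validate within one user's data;
# a user-independence test would instead hold out entire users.
scores = cross_val_score(model, features, labels, cv=5)
print(f"mean cross-validation accuracy: {scores.mean():.2%}")
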
