The Gesture Pendant: A Self-illuminating, Wearable, Infrared Computer Vision System for Home Automation Control and Medical Monitoring

In this paper we present a wearable device for controlling home automation systems via hand gestures. This solution has several advantages over traditional home automation interfaces in that it can be used by people with impaired vision, motor skills, or mobility. By combining the pendant with other sources of context, we can reduce the number and complexity of gestures while maintaining functionality. As users input gestures, the system can also analyze their movements for pathological tremors. This information can then be used for medical diagnosis, therapy, and emergency services. Currently, the Gesture Pendant can recognize control gestures with an accuracy of 95% and user-defined gestures with an accuracy of 97%. It can detect tremors above 2 Hz to within 0.1 Hz.
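
The abstract does not say how the tremor analysis is performed, but a 0.1 Hz frequency resolution suggests spectral analysis over roughly ten seconds of tracked hand motion, since FFT bin spacing is about 1/T for a window of T seconds. The sketch below is a minimal illustration under that assumption: the function names, the 15 fps sampling rate, and the use of an FFT power spectrum are illustrative guesses, not the authors' implementation; only the 2 Hz threshold and 0.1 Hz resolution come from the abstract.

    import numpy as np

    def dominant_tremor_frequency(y_positions, fps=15.0):
        """Estimate the dominant oscillation frequency (Hz) of one tracked
        hand-centroid coordinate using an FFT power spectrum.

        Assumption (not from the paper): the pendant's camera tracks the hand
        centroid at `fps` frames per second, and about 10 s of samples are
        buffered so the FFT bin spacing is roughly 1/10 s = 0.1 Hz.
        """
        y = np.asarray(y_positions, dtype=float)
        y = y - y.mean()                       # remove DC offset (mean hand position)
        window = np.hanning(len(y))            # taper to reduce spectral leakage
        spectrum = np.abs(np.fft.rfft(y * window)) ** 2
        freqs = np.fft.rfftfreq(len(y), d=1.0 / fps)
        peak = np.argmax(spectrum[1:]) + 1     # skip the zero-frequency bin
        return freqs[peak]

    def looks_like_pathological_tremor(y_positions, fps=15.0, threshold_hz=2.0):
        """Flag oscillations at or above the 2 Hz threshold cited in the abstract."""
        return dominant_tremor_frequency(y_positions, fps) >= threshold_hz

    if __name__ == "__main__":
        fps, seconds = 15.0, 10.0              # ~10 s buffer gives ~0.1 Hz resolution
        t = np.arange(int(fps * seconds)) / fps
        simulated = 3.0 * np.sin(2 * np.pi * 4.5 * t) + 0.5 * np.random.randn(t.size)
        print(round(dominant_tremor_frequency(simulated, fps), 2))   # ~4.5 Hz
        print(looks_like_pathological_tremor(simulated, fps))        # True

The simulated trace above is only a usage check; in the device, the input would be the hand position extracted frame by frame from the infrared camera during gesture input.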
