The Pendle: A Personal Mediator for Mixed Initiative Environments

In this paper we propose a novel interaction model for augmented environments based on the concept of mixed-initiative interaction, and describe the design of the Pendle, a gesture-based wearable device. By splitting control between the user and the environment, the interaction model combines the advantages of explicit, direct manipulation with the power of sensor-based proactive environments, while avoiding the lack of user control and personalization usually associated with the latter. The Pendle is a personalizable wearable device capable of recognizing hand gestures. It acts as a mediator between user and environment and provides a simple, natural interface that lends itself to casual interaction. Experience with two concrete examples, the MusicPendle and the NewsPendle, demonstrates the advantages of a personalized user experience and the flexibility of the device architecture.
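The mediation idea above — the environment proactively offers services, while the user retains explicit control through gestures and personal preferences — can be sketched as follows. This is a minimal illustration only; the class, gesture, and preference names are our own assumptions, not the paper's actual API.

```python
# Hypothetical sketch of mixed-initiative mediation: the environment
# proposes, the user's personal device filters and decides.
# All names (Pendle, gesture vocabulary, services) are illustrative.

class Pendle:
    """Personal mediator between a user and a proactive environment."""

    def __init__(self, preferences):
        # Per-user personalization carried on the wearable device.
        self.preferences = preferences

    def on_environment_offer(self, service, options):
        """Environment-initiated: an offered service is filtered through
        the user's preferences but never auto-accepted on their behalf."""
        allowed = self.preferences.get(service, options)
        return [o for o in options if o in allowed]

    def on_gesture(self, gesture):
        """User-initiated: a recognized hand gesture maps to an explicit
        action, keeping the final decision with the user."""
        actions = {"nod": "accept", "shake": "reject", "circle": "next"}
        return actions.get(gesture, "ignore")

# A MusicPendle-style interaction: the room offers music channels,
# the Pendle narrows them to the wearer's taste, a gesture confirms.
pendle = Pendle({"music": ["jazz", "ambient"]})
offer = pendle.on_environment_offer("music", ["rock", "jazz", "pop"])
choice = pendle.on_gesture("nod")
```

The key design point of the split is visible even in this toy version: environment initiative (`on_environment_offer`) and user initiative (`on_gesture`) are separate paths, and personalization lives on the device rather than in the environment.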
