The BlueWand as interface for ubiquitous and wearable computing environments

The achievements of modern electronics increasingly enmesh our daily life with various wirelessly communicating gadgets. Some of them are carried around, e.g. cell phones, personal digital assistants, or portable music players. Others are absorbed into the environment, e.g. access control systems, vehicle electronics, remotely controlled home appliances, and consumer electronics. Typically, each such device is handled by a dedicated keypad, and miniaturization has sometimes already pushed the size of the keys beyond the limit of easy usage. This paper presents the BlueWand as a control means for such ubiquitous and wearable computing environments. The BlueWand is a small pen-like device that can be used to control other Bluetooth-enabled devices by hand movements. Based on a 6-axis accelerometer and gyroscope system, it detects its orientation and movement in space and transmits this data via Bluetooth to any device that can interpret these movements and execute the associated commands. The BlueWand is especially suited for scenarios that require only a limited set of commands, or where the controlled devices themselves are an unnatural place for the human-computer interface. Thus, the BlueWand facilitates a truly wireless experience.
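To illustrate the data flow described above, the following is a minimal, hypothetical sketch (not taken from the paper): a 6-axis IMU is sampled, a simple gesture is inferred from the sensor window, and the associated command is forwarded over a Bluetooth link. All names (ImuSample, classify_gesture, send_over_bluetooth) are illustrative assumptions, and the sensor read and radio transmission are stubbed out.

```python
import random
from dataclasses import dataclass
from typing import Optional, List


@dataclass
class ImuSample:
    ax: float; ay: float; az: float   # accelerometer axes (g)
    gx: float; gy: float; gz: float   # gyroscope axes (deg/s)


def read_imu() -> ImuSample:
    """Stand-in for the real sensor read; returns random values here."""
    return ImuSample(*(random.uniform(-2.0, 2.0) for _ in range(6)))


def classify_gesture(samples: List[ImuSample]) -> Optional[str]:
    """Crude example detector: a sustained rotation about the z axis is
    mapped to 'next', a sustained rotation the other way to 'previous'."""
    mean_gz = sum(s.gz for s in samples) / len(samples)
    if mean_gz > 1.0:
        return "next"
    if mean_gz < -1.0:
        return "previous"
    return None


def send_over_bluetooth(command: str) -> None:
    """Placeholder for the Bluetooth transmission to the controlled device."""
    print(f"BlueWand -> device: {command}")


if __name__ == "__main__":
    window = [read_imu() for _ in range(32)]      # one short sensor window
    gesture = classify_gesture(window)
    if gesture is not None:
        send_over_bluetooth(gesture)
```

In this sketch the wand only sends symbolic commands; the controlled device decides how to interpret them, which matches the paper's point that the receiving device executes the associated commands.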
