mCube - Towards a Versatile Gesture Input Device for Ubiquitous Computing Environments

We propose a novel, versatile gesture input device called mCube that supports both desktop and handheld interactions in ubiquitous computing environments. It enables desktop interactions by being moved across a planar surface, like a computer mouse. By lifting the device from the surface, users can seamlessly continue handheld interactions within the same application. Since mCube is a single, fully wireless device, it can be carried and used across different display platforms. We explore the use of multiple sensors to support a wide range of tasks, namely gesture commands, multi-dimensional manipulation and navigation, and tool selection on a pie menu. This paper presents the design and implementation of the device, guided by a set of design principles, and demonstrates its exploratory interaction techniques. We also discuss the results of a user evaluation and directions for future work.
