DigiTap: an eyes-free VR/AR symbolic input device

In this paper we present DigiTap, a wrist-worn device designed for symbolic input in virtual and augmented reality (VR/AR) environments. DigiTap robustly senses thumb-to-finger taps on the four fingertips and the eight minor knuckles. A tap is detected by an accelerometer, which triggers the capture of a short image sequence with a small wrist-mounted camera; the tap position is then extracted from the images by an image processing pipeline with low computational effort. The device is therefore very energy efficient and could potentially be integrated into a smartwatch-like form factor, allowing unobtrusive, always-available, eyes-free input. To demonstrate the feasibility of our approach, we conducted an initial user study with our prototype device, evaluating the suitability of the twelve tapping locations and identifying the most prominent sources of error. Our prototype system correctly classified 92% of the input locations.
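
The following is a minimal, illustrative sketch of the sensing idea described above, not the authors' implementation: an accelerometer spike gates a short burst of camera frames, and a cheap image-processing step maps the burst to one of the twelve tap locations (four fingertips plus eight minor knuckles). All thresholds, class names, feature choices, and the nearest-template classifier are assumptions made for the example only.

# Hypothetical DigiTap-style pipeline sketch (assumed names and thresholds).
import numpy as np

TAP_LOCATIONS = [f"fingertip_{i}" for i in range(1, 5)] + \
                [f"knuckle_{i}" for i in range(1, 9)]   # 12 tap classes

ACCEL_THRESHOLD = 2.0   # g; assumed trigger level for a thumb tap
BURST_LENGTH = 4        # frames captured per trigger (assumed)

def tap_detected(accel_magnitude: float) -> bool:
    """Accelerometer gate: only wake the camera when a tap-like spike occurs."""
    return accel_magnitude > ACCEL_THRESHOLD

def extract_features(frames: np.ndarray) -> np.ndarray:
    """Cheap per-burst feature: mean frame, coarsely downsampled and flattened.
    Stands in for the paper's low-effort image-processing pipeline."""
    mean_frame = frames.mean(axis=0)
    small = mean_frame[::8, ::8]          # coarse spatial downsampling
    return small.flatten() / 255.0

def classify(features: np.ndarray, templates: dict) -> str:
    """Nearest-template match over the twelve tap locations (illustrative only)."""
    return min(templates, key=lambda loc: np.linalg.norm(features - templates[loc]))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Fake per-location calibration templates (stand-in for real training data).
    templates = {loc: rng.random(8 * 8) for loc in TAP_LOCATIONS}

    accel_magnitude = 2.7  # simulated tap spike
    if tap_detected(accel_magnitude):
        frames = rng.integers(0, 256, size=(BURST_LENGTH, 64, 64)).astype(float)
        print("predicted tap location:", classify(extract_features(frames), templates))

The key design point the sketch mirrors is the energy argument: the camera and image processing run only for the few frames following an accelerometer trigger, rather than continuously.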
