Belt: An Unobtrusive Touch Input Device for Head-worn Displays

Belt is a novel unobtrusive input device for wearable displays that incorporates a touch surface encircling the user's hip. The wide input space is leveraged for a horizontal spatial mapping of quickly accessible information and applications. We discuss the social implications and interaction capabilities of unobtrusive touch input and present our hardware implementation along with a set of applications that benefit from the quick access time. In a qualitative user study with 14 participants, we found that for short interactions (2-4 seconds) most of the surface area is considered appropriate input space, while for longer interactions (up to 10 seconds) the front areas above the trouser pockets are preferred.
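To make the idea of a horizontal spatial mapping concrete, the following minimal Python sketch maps a normalized touch position along the belt circumference to one of several fixed application slots. All names, slot assignments, and parameters here are illustrative assumptions for exposition, not the authors' implementation.

    # Sketch: treat the belt's touch surface as a one-dimensional strip
    # around the hip and map each touch position to an application slot.
    # Slot names and count are hypothetical.
    APP_SLOTS = ["clock", "notifications", "music", "navigation", "calendar", "settings"]

    def slot_for_touch(position: float, slots=APP_SLOTS) -> str:
        """Map a normalized touch position (0.0-1.0 around the belt,
        measured from the buckle) to an application slot."""
        if not 0.0 <= position <= 1.0:
            raise ValueError("position must be normalized to [0, 1]")
        index = min(int(position * len(slots)), len(slots) - 1)
        return slots[index]

    # Example: a touch about a quarter of the way around the belt
    # falls into the second of six equal slots.
    print(slot_for_touch(0.26))  # -> "notifications"

A real implementation would also have to debounce touches and account for the dead zones reported in the study (e.g. the back of the belt for longer interactions), but the core mapping is this simple position-to-slot lookup.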
