On-body interaction: armed and dangerous

Recent technological advances in input sensing, as well as ultra-small projectors, have opened up new opportunities for interaction: the use of the body itself as both an input and output platform. Such on-body interfaces offer new interactive possibilities and the promise of access to computation, communication, and information literally in the palms of our hands. The unique context of on-body interaction allows us to take advantage of the extra dimensions of input our bodies naturally afford. In this paper, we consider how the arms and hands can be used to enhance on-body interactions, which are typically centered on finger input. To explore this opportunity, we developed Armura, a novel interactive on-body system supporting both input and graphical output. Using this platform as a vehicle for exploration, we prototyped many applications and interactions. This helped to confirm chief use modalities, identify fruitful interaction approaches, and, in general, better understand how interfaces operate on the body. We highlight the most compelling techniques we uncovered. Further, this paper is the first to consider and prototype how conventional interaction issues, such as cursor control and clutching, apply to the on-body domain. Finally, we bring to light several new and unique interaction techniques.
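
To make the clutching idea concrete, here is a minimal, hypothetical sketch of how an on-body cursor might use a hand pose as a clutch: the cursor follows hand motion only while a clutch pose is held, mirroring the role that lifting a mouse plays on the desktop. This is not the Armura implementation; the pose labels, class names, and gain parameter are all illustrative assumptions, and in a real system the per-frame hand samples would come from a vision pipeline.

```python
from dataclasses import dataclass


@dataclass
class HandSample:
    """One frame of tracked hand state (assumed input from a tracker)."""
    pose: str   # e.g. "fist", "flat", "point" (illustrative labels)
    x: float    # hand position in projected-display coordinates
    y: float


class ClutchedCursor:
    """Relative cursor control gated by a pose-based clutch.

    While the clutch pose is held, hand motion moves the cursor;
    releasing the pose lets the hand reposition freely without
    dragging the cursor along.
    """
    CLUTCH_POSE = "fist"  # assumed clutch gesture

    def __init__(self, x=0.0, y=0.0, gain=1.0):
        self.cursor_x, self.cursor_y = x, y
        self.gain = gain    # control-display gain for hand motion
        self._last = None   # last sample seen while clutched

    def update(self, sample: HandSample):
        if sample.pose == self.CLUTCH_POSE:
            if self._last is not None:
                # Apply relative hand motion to the cursor.
                self.cursor_x += self.gain * (sample.x - self._last.x)
                self.cursor_y += self.gain * (sample.y - self._last.y)
            self._last = sample
        else:
            # Clutch released: forget the anchor so the next clutch
            # starts fresh from wherever the hand is.
            self._last = None
        return self.cursor_x, self.cursor_y


# Usage: feed per-frame samples from a (hypothetical) hand tracker.
cursor = ClutchedCursor(gain=1.5)
for s in [HandSample("fist", 0.10, 0.20),
          HandSample("fist", 0.15, 0.20),   # moves cursor
          HandSample("flat", 0.90, 0.90),   # repositions hand only
          HandSample("fist", 0.90, 0.90),
          HandSample("fist", 0.85, 0.90)]:  # moves cursor again
    print(cursor.update(s))
```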
