Implications of location and touch for on-body projected interfaces

Very recently, a perfect storm of technical advances has culminated in the emergence of a new interaction modality: on-body interfaces. Such systems enable the wearer to use their body as an input and output platform with interactive graphics. Projects such as PALMbit and Skinput sought to answer the initial and fundamental question: whether or not on-body interfaces were technologically possible. Although considerable technical work remains, we believe it is important to begin shifting the question away from how and what, and towards where and, ultimately, why. These are the classes of questions that will inform the design of next-generation systems. To better understand and explore this expansive space, we employed a mixed-methods research process involving more than two thousand individuals. This started with high-resolution, but low-detail, crowdsourced data. We then combined this with rich expert interviews, exploring aspects ranging from aesthetics to kinesthetics. The results of this complementary, structured exploration point the way towards more comfortable, efficacious, and enjoyable on-body user experiences.

[1] E. Rosch, et al. The Embodied Mind: Cognitive Science and Human Experience, 1993.

[2] Angela Barnett. The dancing body as a screen: Synchronizing projected motion graphics onto the human form in contemporary dance, 2009, CIE.

[3] Matthew J. Hertenstein. Touch: Its Communicative Functions in Infancy, 2002, Human Development.

[4] Patrick Baudisch, et al. Imaginary phone: learning imaginary interfaces by transferring spatial memory from a familiar device, 2011, UIST.

[5] F. R. Wilson. The Hand: How Its Use Shapes the Brain, Language, and Human Culture, 1998.

[6] Stefan Weber, et al. A Portable Image Overlay Projection Device for Computer-Aided Open Liver Surgery, 2011, IEEE Transactions on Biomedical Engineering.

[7] Pattie Maes, et al. WUW - wear Ur world: a wearable gestural interface, 2009, CHI Extended Abstracts.

[8] Bruce H. Thomas, et al. Where Does the Mouse Go? An Investigation into the Placement of a Body-Attached TouchPad Mouse for Wearable Computers, 2002, Personal and Ubiquitous Computing.

[9] Marina Basu. The Embodied Mind: Cognitive Science and Human Experience, 2004.

[10] C. Davis. Touch, 1997, The Lancet.

[11] R. A. Faste, et al. The Role of Aesthetics in Engineering, 1995.

[12] Jonas Löwgren, et al. Toward an articulation of interaction esthetics, 2009, New Rev. Hypermedia Multim.

[13] Chris Harrison, et al. Where to locate wearable displays? Reaction time performance of visual alerts from tip to toe, 2009, CHI.

[14] P. Freedson, et al. Amount of time spent in sedentary behaviors in the United States, 2003-2004, 2008, American Journal of Epidemiology.

[15] Bill Tomlinson, et al. Who are the crowdworkers? Shifting demographics in Mechanical Turk, 2010, CHI Extended Abstracts.

[16] Desney S. Tan, et al. Skinput: appropriating the body as an input surface, 2010, CHI.

[17] C. Spence, et al. The science of interpersonal touch: An overview, 2010, Neuroscience & Biobehavioral Reviews.

[18] Stanley E. Jones, et al. A naturalistic study of the meanings of touch, 1985.

[19] J. Löwgren. Towards an articulation of interaction aesthetics, 2009.

[20] A. Montagu, et al. Touching: The Human Significance of the Skin, 1971.

[21] P. Milgram, et al. A Taxonomy of Mixed Reality Visual Displays, 1994.

[22] M. Tomita, et al. Exploratory Study of Touch Zones in College Students on Two Campuses, 2008.

[23] Kent Lyons, et al. Twiddler typing: one-handed chording text entry for mobile phones, 2004, CHI.

[24] J. Löwgren, et al. Touching a Stranger: Designing for Engaging Experience in Embodied Interaction, 2011.

[25] Steve Mann. Smart clothing: The wearable computer and wearcam, 2005, Personal Technologies.

[26] Per Ola Kristensson, et al. I did that! Measuring users' experience of agency in their own actions, 2012, CHI.

[27] Sung H. Han, et al. Body-based interfaces, 2004.

[28] David Zeltzer, et al. A survey of glove-based input, 1994, IEEE Computer Graphics and Applications.

[29] Desney S. Tan, et al. Enabling always-available input with muscle-computer interfaces, 2009, UIST.

[30] Maxim J. Schlossberg, et al. Touching: The Human Significance of the Skin (3rd ed.), 1987.

[31] Chris Harrison, et al. On-body interaction: armed and dangerous, 2012, TEI.

[32] Thad Starner, et al. Hambone: A Bio-Acoustic Gesture Interface, 2007, 11th IEEE International Symposium on Wearable Computers.

[33] Chris Harrison, et al. OmniTouch: wearable multitouch interaction everywhere, 2011, UIST.

[34] Jacob Buur, et al. Getting a grip on tangible interaction: a framework on physical space and social interaction, 2006, CHI.

[35] Erin Manning, et al. Politics of Touch: Sense, Movement, Sovereignty, 2006.

[36] Kosuke Sato, et al. PALMbit: A Body Interface Utilizing Light Projection onto Palms, 2007.

[37] M. Knapp, et al. Nonverbal communication in human interaction, 1972.

[38] S. Gallagher. How the body shapes the mind, 2005.

[39] Maggie Orth, et al. Smart fabric, or "wearable clothing", 1997, First International Symposium on Wearable Computers.

[40] Jürgen Steimle, et al. More than touch: understanding how people use skin as an input surface for mobile computing, 2014, CHI.

[41] Alva Noë. Action in Perception, 2006, Representation and Mind.