Emulating human attention-getting practices with wearable haptics

Our computers often need to get our attention, but they have inadequate means of modulating the intrusiveness with which they do so. Humans commonly use social touch to gain one another's attention. In this paper, we describe an early exploration of how an expressive, wearable or holdable haptic display could emulate human social practices, with the goal of evoking comparable responses from users. Our exploration spans three iterations of rapid prototyping and user evaluation, beginning with broad-ranging physical brainstorming before proceeding to higher-fidelity actuated prototypes. User reactions were incorporated along the way, including an assessment of the low-fidelity prototypes' expressiveness. Our observations suggest that simple, potentially unintrusive body-situated mechanisms such as a bracelet can convey a range of socially gradable attention-getting expressions that could be useful in real contexts.
