A First Step toward the Automatic Understanding of Social Touch for Naturalistic Human–Robot Interaction

Social robots should be able to automatically understand and respond to human touch. The meaning of touch depends not only on its form but also on the context in which it takes place. To gain more insight into the factors relevant for interpreting the meaning of touch within a social context, we elicited touch behaviors by letting participants interact with a robot pet companion in different affective scenarios. In a contextualized lab setting, participants (n = 31) acted as if they were coming home in different emotional states (i.e., stressed, depressed, relaxed, and excited) without being given specific instructions on the kinds of behaviors they should display. Based on video footage of the interactions and on interviews, we explored the touch behaviors used, the social messages expressed, and the robot pet responses expected. Results show that emotional state influenced both the social messages communicated to the robot pet and the expected responses. Furthermore, participants used multimodal cues to communicate with the robot pet: they often talked to it while touching it and making eye contact. Additionally, the findings indicate that categorizing touch behaviors into discrete touch gesture categories based on dictionary definitions is not a suitable approach for capturing the complex nature of touch behaviors in less controlled settings. These findings can inform the design of a behavioral model for robot pet companions; future directions for interpreting touch behaviors in less controlled settings are also discussed.
