Situational Context Directs How People Affectively Interpret Robotic Non-Linguistic Utterances

This paper presents an experiment investigating how situational context influences the way people affectively interpret Non-Linguistic Utterances made by a social robot. Subjects were shown five video conditions: the robot making a positive utterance, the robot making a negative utterance, the robot being subjected to an action (e.g. receiving a kiss or a slap), and two videos combining the action with the robot reacting with the positive and the negative utterance respectively. For each video, subjects provided an affective rating of valence reflecting how they thought the robot felt given what had happened in the video. This was repeated for five different action scenarios. Results show that the affective interpretation of an action appears to override that of an utterance, regardless of the affective charge of the utterance. Furthermore, when the meanings of the action and the utterance are aligned, the overall interpretation is amplified. These findings are considered with respect to the practical use of utterances in social HRI.
