Exploring the taxonomy and associative link between emotion and function for robot sound design

Sound is a medium that conveys functional and emotional information in multilayered streams. By exploiting this property, robot sound design can make communication in human-robot interaction more efficient. As a first step, we examined how individuals perceive the functional and emotional intent of robot sounds and whether this perception is associated with prior exposure to science-fiction movies. Sound clips were selected based on the context of the movie scenes in which they appear (i.e., WALL-E, R2-D2, BB-8, Transformers) and classified as functional (i.e., platform, monitoring, alerting, feedback) or emotional (i.e., positive, neutral, negative). Twelve participants were asked to identify the perceived properties of each of the 30 clips. We found that the perceived emotional and functional messages often diverged from those originally intended and varied with participants' prior movie experience.
