Mood expression through parameterized functional behavior of robots

Bodily expression of affect is crucial to human-robot interaction. We distinguish between emotion expression and mood expression, and focus on the latter. Bodily expression of an emotion is explicit behavior that typically interrupts ongoing functional behavior. In contrast, bodily mood expression is integrated into functional behaviors without interrupting them. We propose a parameterized behavior model with specific behavior parameters for bodily mood expression: robot mood controls pose and motion parameters, and those parameters modulate behavior appearance. We applied the model to two concrete behaviors of the NAO robot, waving and pointing, and conducted a user study in which participants (N=24) were asked to design the expression of positive, neutral, and negative moods by modulating the parameters of the two behaviors. Results show that participants created different parameter settings corresponding to different moods, and the settings were generally consistent across participants. Several parameter settings were also found to be behavior-invariant. These findings suggest that our model and parameter set are promising for expressing moods across a variety of behaviors.
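The core idea, mapping a scalar mood value onto pose and motion parameters that modulate how a functional behavior is executed, can be sketched as follows. This is a minimal illustration under assumed names and ranges; the parameter set, the values, and the function names are hypothetical and do not reflect the authors' actual model or the NAO API.

```python
def interpolate(lo, hi, t):
    """Linearly interpolate between two parameter extremes."""
    return lo + (hi - lo) * t

# Hypothetical pose/motion parameters, each with an assumed value for the
# most negative mood (mood = -1) and the most positive mood (mood = +1).
PARAM_RANGES = {
    "amplitude": (0.4, 1.0),   # spatial extent of the gesture
    "speed":     (0.5, 1.2),   # motion-speed multiplier
    "head_up":   (-0.2, 0.3),  # head pitch offset (radians)
}

def mood_to_parameters(mood):
    """Map a scalar mood in [-1, 1] to a concrete parameter setting.

    The behavior itself (e.g. waving or pointing) stays functional;
    only its appearance is modulated by these parameters.
    """
    t = (mood + 1.0) / 2.0  # rescale mood to [0, 1]
    return {name: interpolate(lo, hi, t)
            for name, (lo, hi) in PARAM_RANGES.items()}
```

A positive mood (`mood_to_parameters(1.0)`) would yield large, fast movements with a raised head, while a negative mood yields the opposite extremes; a neutral mood (mood = 0) lands midway between them.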
