Talking with Sentiment: Adaptive Expression Generation Behavior for Social Robots

This paper presents a neural-based approach for generating natural gesticulation movements for a humanoid robot, enriched with other relevant social signals driven by sentiment processing. In particular, we use simple head postures, voice parameters, and eye colors as elements that enhance expressiveness. A Generative Adversarial Network (GAN) allows the proposed system to extend the variability of basic gesticulation movements while avoiding repetitive and monotonous behavior. By applying sentiment analysis to the text the robot is about to speak, we derive an emotion valence value and coherently choose suitable parameters for the expressive elements. In this way, the robot adapts its expressive behavior while talking. Experiments validate the proposed approach by analyzing the contribution of each factor to the perceived naturalness of the robot's behavior.
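The valence-to-expression step described above can be sketched as a simple mapping from a sentiment valence score (e.g. the compound score in [-1, 1] produced by a lexicon-based analyzer such as VADER) to concrete expressive parameters. The function below is a minimal, hypothetical illustration: the color endpoints, pitch range, and speech-rate range are assumptions for the sketch, not the calibrated values used in the paper.

```python
def valence_to_expression(valence: float) -> dict:
    """Map a sentiment valence in [-1, 1] to illustrative expressive
    parameters: eye color (RGB), voice pitch, and speech rate.

    Hypothetical mapping for illustration only; the actual parameter
    choices would be tuned on the target robot platform.
    """
    v = max(-1.0, min(1.0, valence))   # clamp to the valid range
    t = (v + 1.0) / 2.0                # normalize to [0, 1]

    # Eye color: interpolate from blue (negative) to yellow (positive).
    neg_rgb, pos_rgb = (0, 0, 255), (255, 255, 0)
    eye_rgb = tuple(round(a + t * (b - a)) for a, b in zip(neg_rgb, pos_rgb))

    # Voice: higher pitch and faster rate for more positive valence.
    pitch_hz = 100 + 60 * t            # assumed range: 100-160 Hz
    speech_rate = 0.85 + 0.3 * t       # assumed relative rate: 0.85-1.15

    return {"eye_rgb": eye_rgb, "pitch_hz": pitch_hz,
            "speech_rate": speech_rate}
```

A neutral sentence (valence near 0) thus yields intermediate values for all three channels, so the expressive signals stay mutually coherent rather than being set independently.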
