Robots that express emotion elicit better human teaching

Does the emotional content of a robot's speech affect how people teach it? In this experiment, participants were asked to demonstrate several “dances” for a robot to learn. Participants moved their bodies in response to instructions displayed on a screen behind the robot, while the robot faced the participant and appeared to imitate the participant's movements. After each demonstration, the robot received an accuracy score, and the participant chose whether or not to demonstrate that dance again. Regardless of the participant's input, however, the robot's dancing and the scores it received were scripted in advance and identical across all participants. The only variation between groups was what the robot said in response to its scores. Participants were assigned to one of three conditions: appropriate emotional responses, often-inappropriate emotional responses, or apathetic responses. Participants who taught the robot with appropriate emotional responses demonstrated the dances significantly more often and significantly more accurately, on average, than participants in the other two conditions.
