How does affective robot feedback influence learner experience in a real-world treasure hunt?

We explore the effect of the feedback strategy used by a virtual robot agent in the context of a real-world treasure-hunt activity carried out by children aged 11–12. We compare two versions of a tablet-based virtual robot agent, one providing neutral feedback and the other affective feedback during the treasure hunt. The results suggest that using the tablet app increased the perceived difficulty of the instruction-following task compared to a paper-based version, while the affective robot feedback increased the perceived difficulty of the questions.
