Screen feedback in human-robot interaction: How to enhance robot expressiveness

Robot feedback is a powerful means to establish smooth human-robot interaction (HRI). We report on a user study assessing the applicability of a screen in a human-robot game-playing scenario. The screen was deployed to compensate for the expressive shortcomings of a social robot whose facial features are mechanically fixed and cannot move. Participants played Rock-Paper-Scissors with the robot, whose right hand we programmed to show the game gestures. Half of the participants received facial expressions via a screen during the game, whereas the other half received no screen feedback. We annotated the video-recorded interactions and collected questionnaire and interview data to assess the applicability of the screen. Our data revealed two noteworthy results: First, the robot that provided facial expressions was rated as more intelligent. Second, participants who had interacted with the robot showing facial expressions rated the task as more attractive than participants who had not received facial expressions from the robot. The fact that we could not detect any negative influence of the screen underlines the applicability of a screen for displaying facial expressions in an HRI game-playing scenario and its contribution to an enhanced interaction.
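As a purely illustrative sketch (not the authors' implementation), the two study conditions can be thought of as a game loop that maps the robot's and the participant's gestures to a round outcome and, only in the screen-feedback condition, selects a matching facial expression for the screen. All names here (Gesture, judge, pick_expression) and the outcome-to-expression mapping are hypothetical assumptions introduced for illustration.

```python
# Hypothetical sketch of the two study conditions described above; the names
# and the expression mapping are illustrative assumptions, not the paper's code.
from enum import Enum

class Gesture(Enum):
    ROCK = "rock"
    PAPER = "paper"
    SCISSORS = "scissors"

# Each gesture beats exactly one other gesture.
BEATS = {
    Gesture.ROCK: Gesture.SCISSORS,
    Gesture.PAPER: Gesture.ROCK,
    Gesture.SCISSORS: Gesture.PAPER,
}

def judge(robot: Gesture, human: Gesture) -> str:
    """Return the round outcome from the robot's perspective."""
    if robot == human:
        return "draw"
    return "win" if BEATS[robot] == human else "lose"

def pick_expression(outcome: str) -> str:
    """Map the round outcome to a facial expression shown on the screen."""
    return {"win": "happy", "lose": "sad", "draw": "neutral"}[outcome]

def play_round(robot: Gesture, human: Gesture, screen_feedback: bool) -> None:
    outcome = judge(robot, human)
    print(f"Robot shows {robot.value}; outcome for robot: {outcome}")
    if screen_feedback:
        # Condition 1: the screen displays a facial expression as feedback.
        print(f"Screen displays a {pick_expression(outcome)} face")
    # Condition 2 (no screen feedback): only the hand gesture is shown.

# Example round in the screen-feedback condition.
play_round(Gesture.ROCK, Gesture.SCISSORS, screen_feedback=True)
```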
