Screen feedback: How to overcome the expressive limitations of a social robot

The aim of this work is to investigate how the shortcomings of a social robot that stem from its expressive limitations can be overcome through multimodal feedback. We propose an experiment in which a robot that cannot produce facial expressions plays a game of rock, paper, scissors with people. A screen built into the robot's torso is used to compensate for these limitations in expressiveness and to provide the participant with facial expressions during the game. To assess the impact of the screen on the user's rating of the robot and of the interaction as such, a control condition is included in which the screen stays turned off. With this experiment we aim to show that a sophisticated feedback setup can help make a playful interaction between a human and a robot even more enjoyable.