Automatic Evaluation of Human-Robot Interaction

for the 10th Joint Symposium on Neural Computation

Joel Chenu, Ian Fasel, Takayuki Kanda, Hiroshi Ishiguro, Javier R. Movellan
Intelligent Robotics and Communication Laboratories, ATR, Kyoto, Japan
& Institute for Neural Computation, University of California, San Diego, CA 92093

The objective of this study is twofold: (1) to introduce and investigate a technique for automatically evaluating the quality of a human-robot social interaction based on the analysis of facial expressions, and (2) to determine whether the analysis of facial expressions is a valid means of judging the quality of this interaction. The study also evaluates techniques for integrating information from multiple cameras.

The study involves one-on-one interaction between human participants and Robovie, a social robot under development at ATR and Osaka University (H. Ishiguro et al., 2001). Fourteen participants, male and female, were instructed to interact with Robovie for 5 minutes while their facial expressions were recorded by 4 video cameras. Each session was followed by a questionnaire in which the participants evaluated different aspects of their interaction with Robovie. The participants' facial expressions were then automatically analyzed using an expression recognizer developed at the UCSD MPLab (I. Fasel et al., 2002; G. Littlewort et al., 2002).

We are interested in several questions: Is this a reasonable technique for the evaluation of human-robot interaction? Does the expression recognition system work in the field? Could the system potentially be used for real-time interaction? Social robots and agents designed to recognize facial expressions might provide a much more interesting and engaging social interaction.
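The evaluation pipeline described above can be illustrated with a minimal sketch. The abstract does not specify how the per-camera expression outputs are fused or how validity is assessed, so the following assumes a simple setup: frame-by-frame smile-detector probabilities from each camera are averaged across cameras and over time into one score per session, and validity is checked by correlating those scores with the questionnaire ratings. The function names and the averaging fusion rule are illustrative assumptions, not the authors' actual method.

```python
import numpy as np

def session_score(per_camera_probs):
    """Fuse multi-camera expression probabilities into one session score.

    per_camera_probs: array-like of shape (n_cameras, n_frames) holding,
    e.g., smile-detector outputs in [0, 1]. Averaging across cameras is
    an assumed fusion rule; the study may use a different scheme.
    """
    probs = np.asarray(per_camera_probs, dtype=float)
    fused = probs.mean(axis=0)   # fuse cameras frame-by-frame
    return float(fused.mean())   # summarize the session over time

def validity_check(session_scores, questionnaire_ratings):
    """Pearson correlation between automatic scores and self-reports.

    A high correlation would support facial-expression analysis as a
    valid measure of interaction quality.
    """
    r = np.corrcoef(session_scores, questionnaire_ratings)[0, 1]
    return float(r)
```

For example, a session recorded by two cameras with frame probabilities [0.2, 0.4] and [0.6, 0.8] yields a session score of 0.5; the resulting per-participant scores would then be correlated with the questionnaire responses.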