Autonomous development of gaze control for natural human-robot interaction

Gaze behavior is one of the most important nonverbal behaviors in close human-human encounters. For this reason, many researchers in natural human-robot interaction work on robots that can produce human-like gaze behavior, and many approaches have been proposed that base this behavior on careful analysis of how humans gaze during natural interactions. A limitation of most available approaches is that the behavior is hardwired into the robot, and learning techniques, if used at all, serve only to adjust the parameters of that behavior. In this paper we propose and evaluate a different approach in which the robot learns natural gaze behavior by watching natural interactions between humans. The proposed approach builds on the LiEICA architecture developed by the authors and is completely unsupervised, which leads to grounded behavior. We compare the resulting gaze controller with a state-of-the-art gaze controller that has been shown to achieve human-like behavior, and we show through subjective evaluations by human participants that the proposed approach produces more natural gaze behavior.
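The abstract does not detail the learning pipeline; the following minimal Python sketch only illustrates the general idea of extracting recurring gaze patterns from recorded human-human interactions in an unsupervised way. The corpus format, window length, number of primitives, and all function names here are assumptions made for illustration and are not the authors' LiEICA implementation.

# Illustrative sketch only: unsupervised extraction of candidate gaze-control
# primitives by clustering fixed-length windows of recorded gaze-direction data.
# This is NOT the paper's LiEICA pipeline; corpus format, window length, and
# cluster count are assumptions.
import numpy as np
from sklearn.cluster import KMeans

WINDOW = 30       # assumed window length (frames) of a gaze segment
N_PRIMITIVES = 5  # assumed number of recurring gaze patterns to extract

def sliding_windows(series: np.ndarray, window: int) -> np.ndarray:
    """Cut a (T, 2) gaze-direction series (e.g., yaw/pitch) into overlapping windows."""
    return np.stack([series[t:t + window].ravel()
                     for t in range(len(series) - window + 1)])

def learn_gaze_primitives(recordings: list[np.ndarray]) -> np.ndarray:
    """Cluster gaze windows from all recordings; the centroids act as candidate
    gaze-behavior primitives that a controller could later sequence."""
    windows = np.vstack([sliding_windows(r, WINDOW) for r in recordings])
    model = KMeans(n_clusters=N_PRIMITIVES, n_init=10, random_state=0).fit(windows)
    return model.cluster_centers_.reshape(N_PRIMITIVES, WINDOW, -1)

if __name__ == "__main__":
    # Synthetic stand-in for a human-human interaction corpus.
    rng = np.random.default_rng(0)
    corpus = [rng.standard_normal((300, 2)) for _ in range(3)]
    primitives = learn_gaze_primitives(corpus)
    print(primitives.shape)  # (N_PRIMITIVES, WINDOW, 2)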
