A study of a retro-projected robotic face and its effectiveness for gaze reading by humans

Reading gaze direction is important in human-robot interaction, as it supports, among other things, joint attention and non-linguistic communication. While most previous work focuses on enabling the robot to read human gaze direction, little is known about how well the human partner in a human-robot interaction can read gaze direction from a robot. The purpose of this paper is twofold: (1) to introduce a new technology for implementing robotic faces using retro-projected animated faces, and (2) to test how well this technology supports gaze reading by humans. We briefly describe the robot design and discuss the parameters that influence the ability to read gaze direction. We then present an experiment assessing users' ability to read gaze direction for a selection of robotic face designs, using an actual human face as a baseline. Results indicate that it is hard to match human-human interaction performance: when the robot face is implemented as a semi-sphere, performance is worst, while robot faces with a human-like physiognomy and, perhaps surprisingly, video projected onto a flat screen perform equally well, suggesting that these are good candidates for implementing joint attention in HRI.