Humotion: A Human Inspired Gaze Control Framework for Anthropomorphic Robot Heads

In recent years, attempts have been made to make robot control more intuitive and intelligible by exploiting and integrating anthropomorphic features to boost social human-robot interaction. The design and construction of anthropomorphic robots for this kind of interaction is not the only challenge -- smooth, expectation-matching motion control remains an unsolved problem. In this work we present a highly configurable, portable, and open control framework that facilitates anthropomorphic motion generation for humanoid robot heads by enhancing state-of-the-art neck-eye coordination with human-like eyelid saccades and animation. On top of that, the presented framework supports dynamic neck offset angles that allow animation overlays and changes in alignment to the robot's communication partner while retaining visual focus on a given target. To demonstrate the universal applicability of the proposed ideas, we used this framework to control both the Flobi and the iCub robot head, in simulation as well as on the physical robots. To foster further comparative studies of different robot heads, we will release all software resulting from this contribution under an open-source license.
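To make the neck-offset idea concrete, the following minimal Python sketch illustrates the decomposition the abstract describes: an animation or alignment overlay moves the neck, and the eyes counter-rotate so that the combined gaze direction stays on the target. All names here (GazeState, compensate, eye_limit) are illustrative assumptions for this sketch and are not taken from the Humotion API.

```python
# A minimal sketch (not the Humotion API) of neck-offset compensation:
# the eyes counter-rotate against a dynamic neck offset so the combined
# gaze stays on a given target, up to the oculomotor range.

from dataclasses import dataclass


@dataclass
class GazeState:
    neck_pan: float  # degrees, positive = left
    eye_pan: float   # degrees, relative to the head


def compensate(target_pan: float, neck_offset: float,
               eye_limit: float = 35.0) -> GazeState:
    """Split a desired gaze direction between neck and eyes.

    `neck_offset` is an animation/alignment overlay (e.g. turning the
    head toward a communication partner); the eyes absorb the
    difference so visual focus on `target_pan` is retained, clamped to
    a plausible oculomotor range `eye_limit`.
    """
    eye_pan = target_pan - neck_offset
    # Beyond the oculomotor range the eyes cannot compensate further
    # and the neck itself would have to move toward the target.
    eye_pan = max(-eye_limit, min(eye_limit, eye_pan))
    return GazeState(neck_pan=neck_offset, eye_pan=eye_pan)


# Example: gaze target 10 deg left while an animation overlay turns the
# neck 25 deg left; the eyes counter-rotate 15 deg to keep the target.
print(compensate(target_pan=10.0, neck_offset=25.0))
```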
