Deriving Minimal Features for Human-Like Facial Expressions in Robotic Faces

This study explores deriving minimal features for a robotic face to convey information (via facial expressions) that people can perceive and understand. Recent research in computer vision has shown that a small number of moving points/lines can capture the majority of information (~95%) in human facial expressions. Here, we apply these findings to a minimalist robot face design, which we evaluated in a series of experiments with human subjects (n = 75) exploring the effects of various factors, including added neck motion and degree of expression. Facial expression identification rates were similar to those reported for more complex robot faces. In addition, added neck motion significantly improved identification rates, reaching 100% for all expressions except Fear. The Negative Attitudes towards Robots Scale (NARS) and the Godspeed questionnaire were also administered to examine user perceptions, e.g., perceived animacy and intelligence. The project aims to answer a number of fundamental questions about robotic face design, as well as to develop inexpensive and replicable robotic faces for experimental purposes.
