Humanoid robots versus humans: How is emotional valence of facial expressions recognized by individuals with schizophrenia? An exploratory study

BACKGROUND The use of humanoid robots in a therapeutic role for individuals with social disorders such as autism is a newly emerging field, but it remains unexplored in schizophrenia. As the ability of robots to convey emotion appears to be of fundamental importance for human-robot interactions, we aimed to evaluate how patients with schizophrenia recognize positive and negative facial emotions displayed by a humanoid robot. METHODS We included 21 outpatients with schizophrenia and 17 healthy participants. In a reaction time task, they were shown photographs of human faces and of a humanoid robot (iCub) expressing either positive or negative emotions, as well as a non-social stimulus. Patients' symptomatology, mind perception, reaction times and number of correct answers were evaluated. RESULTS Both patients and controls recognized the emotional valence of facial expressions more accurately and faster when the expressions were displayed by humans than by the robot. Participants responded faster to positive than to negative human faces and, conversely, faster to negative than to positive robot faces. Importantly, participants performed worse when they perceived iCub as being capable of experiencing things (experience subscale of the mind perception questionnaire). In patients with schizophrenia, negative correlations emerged between negative symptoms and recognition accuracy for both robot and human negative faces. CONCLUSIONS Individuals do not respond to human facial emotions and to non-anthropomorphic emotional signals in the same way. Humanoid robots have the potential to convey emotions to patients with schizophrenia, but their appearance seems to be of major importance for human-robot interactions.
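The analyses summarized above (accuracy and reaction times per agent and valence, plus correlations between negative symptoms and recognition accuracy) can be illustrated with a minimal sketch. The snippet below is not the authors' pipeline: column names and values are hypothetical placeholders, and it only shows how the symptom-accuracy correlation described in RESULTS might be computed from per-participant summary data.

```python
# A minimal analysis sketch, assuming per-participant summary data with
# hypothetical column names; the values are illustrative, not the study's data.
import pandas as pd
from scipy.stats import pearsonr

patients = pd.DataFrame({
    "participant":        [1, 2, 3, 4, 5],
    "panss_negative":     [12, 18, 22, 15, 25],             # PANSS negative-symptom score
    "robot_neg_accuracy": [0.90, 0.75, 0.60, 0.85, 0.55],   # accuracy, negative iCub faces
    "human_neg_accuracy": [0.95, 0.85, 0.70, 0.90, 0.65],   # accuracy, negative human faces
    "robot_neg_rt_ms":    [780, 850, 910, 800, 960],        # mean RT, negative iCub faces
    "human_neg_rt_ms":    [640, 700, 760, 660, 790],        # mean RT, negative human faces
})

# Correlate negative symptoms with accuracy for negative faces (robot and human),
# mirroring in spirit the negative correlations reported in the abstract.
for col in ["robot_neg_accuracy", "human_neg_accuracy"]:
    r, p = pearsonr(patients["panss_negative"], patients[col])
    print(f"{col}: r = {r:.2f}, p = {p:.3f}")
```

In this toy data, higher negative-symptom scores go with lower accuracy, so both correlations come out negative; the real study would additionally compare reaction times and accuracy across agent (human vs. iCub) and valence conditions in both groups.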
