Unsupervised emotional state classification through physiological parameters for social robotics applications

Abstract Future social robots should personalize their behavior according to the user's emotional state, so as to fit better into ordinary users' activities and to improve human–robot interaction. Several works in the literature use cameras to recognize emotions. However, these approaches may not be effective in everyday life, owing to camera occlusions and to different types of stimulation, including interaction with other human beings. Therefore, this work investigates the electrocardiogram (ECG), electrodermal activity (EDA), and electroencephalography (EEG) physiological signals as the main informative channels. These signals were acquired through a wireless wearable sensor network. An experimental methodology was proposed to induce three different emotional states by means of social interaction. Two combinations of sensors were analyzed using three time-window lengths (180 s, 150 s, and 120 s) and classified with three unsupervised machine learning approaches (K-Means, K-Medoids, and Self-Organizing Maps). Finally, their classification performance was compared with that of three commonly used supervised techniques (i.e., Support Vector Machine, Decision Tree, and k-Nearest Neighbors) to determine the optimal combination of sensors, time-window length, and unsupervised classifier. Fifteen healthy young participants were recruited for the study, and more than 100 instances were analyzed. The proposed approaches achieved an accuracy of 77% in the best unsupervised case and 85% in the best supervised one.
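The following is a minimal sketch of the unsupervised-versus-supervised comparison the abstract describes, not the authors' implementation. It assumes feature extraction from the ECG/EDA/EEG time windows has already produced one fixed-length feature vector per window; the synthetic data below is purely illustrative, K-Means stands in for all three clusterers (K-Medoids and Self-Organizing Maps require third-party packages such as scikit-learn-extra and MiniSom), and the Hungarian-algorithm label alignment is one common way to score clustering accuracy, since the abstract does not specify the scoring scheme.

```python
# Hedged sketch: cluster physiological feature windows into 3 emotional
# states, score the clustering after label alignment, and compare with the
# supervised baselines named in the abstract. All data here is synthetic.
import numpy as np
from scipy.optimize import linear_sum_assignment
from sklearn.cluster import KMeans
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Hypothetical dataset: ~100 instances (time windows), 3 emotional states,
# a handful of physiological features per window.
n_per_class, n_features = 35, 8
X = np.vstack([rng.normal(loc=c, scale=1.0, size=(n_per_class, n_features))
               for c in range(3)])
y = np.repeat(np.arange(3), n_per_class)

# Standardize features so no single physiological channel dominates.
X = StandardScaler().fit_transform(X)

# --- Unsupervised: cluster the windows, then align cluster indices to the
# ground-truth states with the Hungarian algorithm before scoring accuracy.
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
contingency = np.zeros((3, 3), dtype=int)
for c, t in zip(clusters, y):
    contingency[c, t] += 1
row, col = linear_sum_assignment(-contingency)  # maximize matched counts
mapping = dict(zip(row, col))
unsup_acc = np.mean([mapping[c] == t for c, t in zip(clusters, y)])
print(f"K-Means accuracy (after label alignment): {unsup_acc:.2f}")

# --- Supervised baselines from the abstract, scored with cross-validation.
for name, clf in [("SVM", SVC(kernel="rbf")),
                  ("Decision Tree", DecisionTreeClassifier(random_state=0)),
                  ("k-NN", KNeighborsClassifier(n_neighbors=5))]:
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name} cross-validated accuracy: {acc:.2f}")
```

The label-alignment step matters because cluster indices are arbitrary: an unsupervised model can recover the three emotional states perfectly yet number them differently from the ground truth, so accuracy is only meaningful after the best cluster-to-state assignment is found.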
