Probabilistic Multisensory Emotion Estimation Framework for Assistive Robotic Applications

Computer-based emotion recognition is an emerging field with envisioned applications ranging from customer satisfaction evaluation to human-machine interaction. In this paper we present a general framework for continuous emotion inference based on Bayesian biometric data fusion and the circumplex model of affect. We apply this framework to the field of assistive robotics, focusing on elderly and impaired people who require a wheelchair for mobility. The objective is to provide an emotion-based safety layer that complements the classical collision-avoidance approaches typically included in these systems. In many real-world applications the calculation of emotional valence is not feasible, so we also present a promising novel context-based alternative currently under development.

Introduction

Humans seem most capable (probably due to self-introspection) of distinguishing certain emotional states in other individuals just by glancing at them for a short period of time, extracting behavioural, facial and other relevant cues. Computers, on the other hand, lack such capabilities and therefore need to be trained for that purpose. Due to the difference between the perceptual and analytical abilities of humans and computers, however, cues that are most suitable for us (e.g. gestures, poses, expressions) may prove significantly less useful for a computer system. Conversely, other information of a more "numerical nature" (e.g. electrocardiogram measurements), not particularly indicative for humans, could provide a computer system with key insights into the emotional state of an individual.

The work we present here is the result of our preliminary research into continuous emotion estimation and its application to the development of an emotion-based assistive navigation layer for an intelligent wheelchair, typically operated by a physically and/or cognitively impaired elderly person (our target user). This layer is completely user-centric and provides a technology that the user utilises unconsciously. Its task is to adapt the level of support that the user receives from the wheelchair based on the physical and cognitive capabilities of the patient, his/her driving performance, the navigation context and the user's biometric readings.

Our work differs from that of other authors in the field in a number of aspects. Firstly, it is concerned with real-time estimation of emotions and, as such, presents techniques and mathematical models that can operate online. Secondly, while most related work is concerned with user satisfaction and digital entertainment, our goal is to develop robotic systems capable of reacting and adapting their behaviour in real time (e.g. modifying the navigation strategy of a mobility platform) based on the user's emotions. Thirdly, due to the nature of our application and target population, we are restricted in the type and number of sensors we can use. Finally, we propose the use of context-based information to solve the problem of valence calculation without using intrusive or uncomfortable sensors.
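To make the idea of Bayesian biometric fusion on the circumplex model more concrete, the following is a minimal sketch, not the authors' implementation: it assumes each sensor channel yields an independent Gaussian estimate of arousal (or valence) and combines them by precision weighting under a flat prior. The sensor names, noise levels and example readings are illustrative assumptions only.

```python
import math

def fuse_gaussian(estimates):
    """Precision-weighted fusion of independent Gaussian estimates.

    estimates: list of (mean, std) pairs, one per sensor channel.
    Returns the posterior (mean, std) under a flat prior.
    """
    precision = sum(1.0 / s ** 2 for _, s in estimates)
    mean = sum(m / s ** 2 for m, s in estimates) / precision
    return mean, math.sqrt(1.0 / precision)

# Hypothetical per-sensor arousal estimates (mean, std), e.g. derived from
# normalised heart-rate and skin-conductance readings.
arousal_obs = [(0.62, 0.15),   # heart-rate channel
               (0.48, 0.25)]   # galvanic skin response channel

# Hypothetical valence estimate; obtaining valence is precisely the part the
# paper proposes to replace with context-based information.
valence_obs = [(-0.10, 0.30)]  # e.g. a facial-expression classifier

arousal, a_std = fuse_gaussian(arousal_obs)
valence, v_std = fuse_gaussian(valence_obs)

# Position on the circumplex: the angle encodes the emotion category,
# the radius its intensity.
angle = math.degrees(math.atan2(arousal, valence))
intensity = math.hypot(valence, arousal)
print(f"valence={valence:.2f}+/-{v_std:.2f}, arousal={arousal:.2f}+/-{a_std:.2f}")
print(f"circumplex angle={angle:.1f} deg, intensity={intensity:.2f}")
```

In a continuous setting this per-instant fusion would typically be embedded in a recursive filter over time; the sketch above only shows how multiple noisy channels can be reconciled into a single point in valence-arousal space.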
