Automatic Recognition of Multiple Affective States in Virtual Rehabilitation by Exploiting the Dependency Relationships

The automatic recognition of multiple affective states can be enhanced if the underlying computational models explicitly consider the interactions between the states. This work proposes a computational model that incorporates the dependencies between four states (tiredness, anxiety, pain, and engagement) known to appear in virtual rehabilitation sessions of post-stroke patients, in order to improve the automatic recognition of the patients' states. A dataset of five post-stroke patients was used, comprising their finger pressure (PRE), hand movements (MOV), and facial expressions (FAE) recorded during ten virtual rehabilitation sessions. Our proposal uses the Semi-Naive Bayesian classifier (SNBC) as the base classifier in a multiresolution approach, building a multimodal model over the three sensors (PRE, MOV, and FAE) with late fusion performed by another SNBC (the FSNB classifier). There is one FSNB classifier per state, and the classifiers are linked in a circular classifier chain (CCC) to exploit the dependency relationships between the states. The CCC achieves ROC AUC scores above 90% for the four states. Mutual-exclusion relationships between engagement and all the other states, as well as some co-occurrences between pain and anxiety, were detected across the five patients. Virtual rehabilitation platforms that incorporate the automatic recognition of multiple patient states could leverage intelligent and empathic interactions to promote adherence to rehabilitation exercises.
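
The abstract only outlines the architecture, so the following is a minimal, hypothetical sketch of the per-state late-fusion classifier and the circular chain inference loop it describes. It assumes binary labels per state and uses scikit-learn's GaussianNB as a stand-in for the semi-naive Bayesian base classifier; all names (FusedStateClassifier, circular_chain_predict, the sensor/state lists) are illustrative, not the authors' implementation.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB  # stand-in for the SNBC base classifier

STATES = ["tiredness", "anxiety", "pain", "engagement"]
SENSORS = ["PRE", "MOV", "FAE"]


class FusedStateClassifier:
    """Per-state classifier: one base classifier per sensor, whose class
    posteriors are fused (together with the other states' labels or current
    estimates) by a second-level classifier (late fusion)."""

    def __init__(self):
        self.base = {s: GaussianNB() for s in SENSORS}
        self.fusion = GaussianNB()

    def _sensor_posteriors(self, X):
        # X maps each sensor name to its feature matrix (n_samples x n_features).
        return np.hstack([self.base[s].predict_proba(X[s]) for s in SENSORS])

    def fit(self, X, y, other_states):
        # other_states: (n_samples x 3) matrix with the remaining states' labels.
        for s in SENSORS:
            self.base[s].fit(X[s], y)
        Z = np.hstack([self._sensor_posteriors(X), other_states])
        self.fusion.fit(Z, y)
        return self

    def predict_proba(self, X, other_states):
        Z = np.hstack([self._sensor_posteriors(X), other_states])
        return self.fusion.predict_proba(Z)[:, 1]


def circular_chain_predict(classifiers, X, n_samples, n_rounds=3):
    """Circular chain inference: start from uninformative estimates and
    repeatedly re-predict each state given the current estimates of the
    other three, cycling until the estimates stabilise."""
    probs = {s: np.full(n_samples, 0.5) for s in STATES}
    for _ in range(n_rounds):
        for s in STATES:
            extra = np.column_stack([probs[t] for t in STATES if t != s])
            probs[s] = classifiers[s].predict_proba(X, extra)
    return probs
```

During training, each state's fusion classifier would be fit with the other states' ground-truth labels as the extra inputs, mirroring the usual classifier-chain training scheme; at prediction time those labels are replaced by the chain's current probability estimates.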
