Multi-Label and Multimodal Classifier for Affective States Recognition in Virtual Rehabilitation

Computational systems that process multiple affective states may benefit from explicitly modelling the interactions between those states to improve recognition performance. This work combines a multi-label classifier, the Circular Classifier Chain (CCC), with a multimodal classifier, Fusion using a Semi-Naive Bayesian classifier (FSNBC), to explicitly include the dependencies between multiple affective states during automatic recognition. The combined classifier is applied in the context of virtual rehabilitation for post-stroke patients. We collected finger-pressure, hand-movement, and facial-expression data from post-stroke patients across ten longitudinal sessions. Clinicians labelled videos of the sessions for four states: tiredness, anxiety, pain, and engagement. Each state was modelled by an FSNBC that fused the finger-pressure, hand-movement, and facial-expression information. The four FSNBCs were linked in the CCC to exploit the dependency relationships between the states. The CCC converged within at most 5 iterations for every patient. Results (ROC AUC) of the CCC with the FSNBC are above $0.940 \pm 0.045$ ($mean \pm std.\,deviation$) for all four states. Mutual-exclusion relationships between engagement and each of the other states, as well as co-occurrence between pain and anxiety, were detected and discussed.
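The circular chain described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: scikit-learn's `GaussianNB` stands in for the FSNBC, and the function names (`ccc_fit`, `ccc_predict`) and the all-zeros initialisation are assumptions made here for brevity. The key idea it demonstrates is that each per-state classifier receives the other states' current predictions as extra inputs, and prediction cycles around the chain until the joint label vector stops changing.

```python
# Sketch of a Circular Classifier Chain (CCC) over four affective states.
# GaussianNB is a stand-in for the paper's FSNBC; ccc_fit/ccc_predict are
# hypothetical names chosen for this illustration.
import numpy as np
from sklearn.naive_bayes import GaussianNB

LABELS = ["tiredness", "anxiety", "pain", "engagement"]

def ccc_fit(X, Y):
    """Train one base classifier per label; each sees the features plus
    the ground-truth values of the other labels (chain augmentation)."""
    models = []
    for i in range(Y.shape[1]):
        others = np.delete(Y, i, axis=1)  # other labels become extra features
        models.append(GaussianNB().fit(np.hstack([X, others]), Y[:, i]))
    return models

def ccc_predict(models, X, max_iter=10):
    """Cycle around the chain until the label vector converges
    (the paper reports convergence within at most 5 iterations)."""
    Y = np.zeros((X.shape[0], len(models)), dtype=int)  # initial guess
    for _ in range(max_iter):
        prev = Y.copy()
        for i, model in enumerate(models):
            others = np.delete(Y, i, axis=1)
            Y[:, i] = model.predict(np.hstack([X, others]))
        if np.array_equal(Y, prev):  # fixed point reached
            break
    return Y
```

Because the chain is circular, there is no privileged label ordering: every classifier eventually conditions on up-to-date predictions of all the others, which is how the dependency relationships (e.g., pain co-occurring with anxiety) influence each state's decision.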
