A Multi-Task Cascaded Network for Prediction of Affect, Personality, Mood and Social Context Using EEG Signals

This paper presents a multi-task cascaded deep neural network that jointly predicts people's affective levels (valence and arousal) and personal factors from EEG signals recorded in response to affective multimedia content. The studied personal factors are the Big-Five personality traits, mood (Positive and Negative Affect Schedule scores), and social context (watching alone vs. in a group). The cascaded network consists of two prediction levels. The first level is a hybrid network that combines CNN and RNN units to predict affective levels for individual video segments; it also reduces the dimensionality of the input while preserving affective information. The second level is an RNN unit that predicts personal factors from the sequence of affective-level predictions over consecutive video segments. Fusing the decisions of the RNN and CNN networks improves performance by 3% and 4% F1-score for valence and arousal recognition, respectively, and our results for personal-factor recognition outperform baseline studies by at least 2.7% mean F1-score on average.
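
As a rough illustration of the two-level cascade described above, the PyTorch sketch below wires a hybrid CNN+RNN segment-level affect predictor into a second RNN that consumes the resulting sequence of affect predictions. All layer sizes, the EEG channel count, segment length, the concatenation-based fusion, and the number of personal-factor outputs are illustrative assumptions, not the paper's actual configuration.

```python
# Hypothetical sketch of the two-level cascaded architecture; shapes and
# hyperparameters below are assumptions made for illustration only.
import torch
import torch.nn as nn


class SegmentAffectNet(nn.Module):
    """Level 1: hybrid CNN + RNN predicting valence/arousal for one EEG segment."""

    def __init__(self, n_channels=14, hidden=64):
        super().__init__()
        # CNN branch over the raw EEG segment (channels x time samples)
        self.cnn = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=7, padding=3), nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        # RNN branch over the same segment treated as a time sequence
        self.rnn = nn.GRU(n_channels, hidden, batch_first=True)
        # Fuse both branches into valence/arousal scores
        self.head = nn.Linear(64 + hidden, 2)

    def forward(self, x):                    # x: (batch, channels, time)
        cnn_feat = self.cnn(x).squeeze(-1)   # (batch, 64)
        _, h = self.rnn(x.transpose(1, 2))   # h: (1, batch, hidden)
        return self.head(torch.cat([cnn_feat, h[-1]], dim=1))  # (batch, 2)


class PersonalFactorNet(nn.Module):
    """Level 2: RNN over the sequence of per-segment affect predictions."""

    def __init__(self, n_factors=8, hidden=32):
        # n_factors=8 assumes Big-Five traits + PANAS PA/NA + social context
        super().__init__()
        self.rnn = nn.GRU(2, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_factors)

    def forward(self, affect_seq):           # (batch, n_segments, 2)
        _, h = self.rnn(affect_seq)
        return self.head(h[-1])              # (batch, n_factors)


# Usage: run level 1 on each video segment, stack the outputs, feed level 2.
level1, level2 = SegmentAffectNet(), PersonalFactorNet()
segments = torch.randn(4, 10, 14, 512)       # (batch, segments, channels, samples)
affect = torch.stack([level1(segments[:, i]) for i in range(segments.size(1))], dim=1)
factors = level2(affect)                      # personal-factor predictions
```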
