PPG and EMG Based Emotion Recognition using Convolutional Neural Network

Emotion recognition is an essential part of human-computer interaction, and many signal sources can be used for it. In this study, physiological signals, specifically the electromyogram (EMG) and photoplethysmogram (PPG), are used to detect emotion. To classify emotions in finer detail, the conventional valence-arousal model of emotion is subdivided into four levels. A convolutional neural network (CNN) is adopted for feature extraction and emotion classification. We measure EMG and PPG signals from 30 subjects while they watch 32 selected videos, and our method is evaluated on the data acquired from these recordings.
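
The abstract does not specify the network architecture, so the following is only a minimal sketch of what a CNN over windowed EMG and PPG signals with four output levels might look like. The class name PhysioCNN, the two-channel input, the 512-sample window length, and all layer sizes are illustrative assumptions, not details taken from the paper.

```python
# Sketch of a 1D CNN classifier for two-channel physiological windows
# (EMG + PPG) predicting one of four valence/arousal levels.
# All hyperparameters below are assumptions for illustration only.
import torch
import torch.nn as nn

class PhysioCNN(nn.Module):
    def __init__(self, n_channels=2, n_classes=4, window_len=512):
        super().__init__()
        # Two convolution/pooling stages extract local temporal features.
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 16, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(16, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(4),
        )
        # Fully connected head maps pooled features to the four levels.
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * (window_len // 16), 64),
            nn.ReLU(),
            nn.Linear(64, n_classes),
        )

    def forward(self, x):  # x: (batch, channels, samples)
        return self.classifier(self.features(x))

# Example: a batch of 8 windows, each 512 samples of EMG + PPG.
logits = PhysioCNN()(torch.randn(8, 2, 512))  # -> shape (8, 4)
```

In practice the same backbone could be trained separately for valence and arousal, or with two output heads, depending on how the four-level labels are defined; the abstract does not say which.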
