Influence of EQ on the Difference of Biometric Emotion and Self-evaluated Emotion

There are many methods for estimating human emotions from sensor data. Well-known examples include emotion recognition from facial-expression images and speech emotion recognition from voice data. However, since facial expressions and speech can be changed arbitrarily, they lack the objectivity required for emotion estimation. For this reason, emotion analysis using biological signals such as heartbeat and brain waves has been studied. Biological signals cannot be changed arbitrarily and can therefore be considered objective, making them more suitable for emotion estimation. To measure the accuracy of an emotion estimation method based on biological signals, it is common to compute the degree of error between the method's output and a subjective evaluation of one's own emotion. The problem with this approach is that there is no guarantee that the subjective evaluation equals the actual feeling one is experiencing. Therefore, in this study, we evaluated a biological-signal-based emotion estimation method using the Emotional Intelligence Quotient (EQ). We examined whether the degree of error between the emotion estimated from biological signals and the subjective evaluation of one's emotion can be explained by the level of EQ. Emotions were estimated from biometric data derived from brainwave and heartbeat sensors. As a result, we were able to show the effectiveness of EQ as an indicator of how close the bio-estimated emotion is to the subjective emotion evaluation.
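
The analysis described above can be sketched in a few lines of Python. The snippet below is a minimal illustration, not the study's actual pipeline: it assumes, for the sake of example, that valence is derived from a normalized heart-rate-variability index and arousal from a normalized EEG index (both placed on Russell's valence-arousal plane), and that the error against self-report is then correlated with EQ scores. All variable names and numeric values are hypothetical.

```python
import numpy as np
from scipy.stats import pearsonr

def estimate_emotion(hrv_valence: float, eeg_arousal: float) -> np.ndarray:
    """Place a bio-estimated emotion on the valence-arousal plane.

    Hypothetical mapping: valence from a normalized heart-rate-variability
    index, arousal from a normalized EEG index, both scaled to [-1, 1].
    """
    return np.array([hrv_valence, eeg_arousal])

def estimation_error(bio_emotion: np.ndarray, self_report: np.ndarray) -> float:
    """Euclidean distance between bio-estimated and self-reported emotion."""
    return float(np.linalg.norm(bio_emotion - self_report))

# Per-participant data (illustrative values only, not from the study).
eq_scores = np.array([78, 92, 65, 88, 70])          # EQ test scores
errors = np.array([0.42, 0.18, 0.55, 0.21, 0.47])   # bio vs. self-report distances

# If EQ explains the discrepancy, a negative correlation is expected:
# higher EQ -> self-report closer to the bio-estimated emotion.
r, p = pearsonr(eq_scores, errors)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
```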
