Survey on AI-Based Multimodal Methods for Emotion Detection

Automatic emotion recognition is one of the great challenges in affective computing, offering new tools for more objective and faster diagnosis, communication, and research. Fast and accurate emotion recognition would enable computers, robots, and integrated environments to recognize human emotions and respond to them in accordance with social rules. The purpose of this paper is to investigate automated emotion representation, recognition, and prediction, its state of the art, and the main directions for further research. We focus on the impact of emotion analysis and on the state of the art in multimodal emotion detection. We review existing work, possibilities, and methods for analyzing emotion in text, sound, image, video, and physiological signals, and we highlight the most important features for each available emotion recognition modality. Finally, we present the available platforms and outline existing projects that deal with multimodal emotion analysis.
