AI-Enabled Emotion Communication

As AI technology develops, its applications are profoundly changing people's daily lives. While AI brings great convenience, people's attention has increasingly shifted from purely physical needs to emotional and psychological ones, so the demand for emotion-oriented services keeps growing. As a result, emotional AI systems and affective computing have attracted many researchers. However, existing emotional AI work focuses mainly on improving the accuracy of emotion recognition and lacks personalized emotional services for users. In this article, we therefore propose AI-EmoCom, which treats emotion as a communication medium in the network and makes the emotion communication system more intelligent by combining it with AI technology. We apply the AI-enabled emotion communication system to autonomous driving and propose "people-centered" hybrid driving to further reduce the incidence of traffic accidents. We also apply AI-enabled emotion communication to emotional social robots so that they can provide users with personalized emotional services. We then present the system architecture of AI-enabled emotion communication in detail, elaborate on the label-free learning model for dataset labeling and processing as well as the AI model for emotion recognition, and conduct experiments to verify the interaction delay of the AI-enabled emotion communication system and the accuracy of its emotion recognition.
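The paper's actual recognition model is not reproduced here. As a minimal sketch, assuming a log-mel spectrogram front end and a small discrete emotion label set, the following PyTorch snippet illustrates the kind of convolutional-recurrent classifier an emotion recognition module of this sort could use; the class name EmotionRecognizer, the feature dimensions, and the four-emotion label set are hypothetical choices for illustration only.

```python
# Illustrative sketch only: a hypothetical convolutional-recurrent classifier
# over speech features (log-mel spectrograms). It is NOT the AI-EmoCom model,
# just an example of an end-to-end emotion recognition architecture.
import torch
import torch.nn as nn

EMOTIONS = ["neutral", "happy", "sad", "angry"]  # assumed label set


class EmotionRecognizer(nn.Module):
    def __init__(self, n_mels: int = 64, hidden: int = 128,
                 n_classes: int = len(EMOTIONS)):
        super().__init__()
        # 1-D convolution over time extracts local spectral patterns.
        self.conv = nn.Sequential(
            nn.Conv1d(n_mels, hidden, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        # A GRU summarizes the sequence; its last hidden state feeds the classifier.
        self.gru = nn.GRU(hidden, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, mel: torch.Tensor) -> torch.Tensor:
        # mel: (batch, n_mels, time)
        x = self.conv(mel)              # (batch, hidden, time/2)
        x = x.transpose(1, 2)           # (batch, time/2, hidden)
        _, h = self.gru(x)              # h: (1, batch, hidden)
        return self.head(h.squeeze(0))  # (batch, n_classes) emotion logits


if __name__ == "__main__":
    model = EmotionRecognizer()
    dummy = torch.randn(2, 64, 200)     # two synthetic utterances
    print(model(dummy).shape)           # torch.Size([2, 4])
```

In a deployment like the one the abstract describes, the predicted emotion label would then be carried through the communication system to drive the personalized response, e.g. of a social robot or an in-vehicle assistant.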
