A happiness emotion detection method based on deep learning

Research shows that a learner's emotional state has an important impact on the affective and cognitive processes that influence learning, and that a positive emotional state can enhance learning outcomes. It is therefore important to detect a learner's emotional state unobtrusively during the learning process. Emotions are generally classified along two dimensions, valence and activation; happiness is an activating, positive-valence emotional state. This paper presents a happiness emotion detection method based on deep learning. First, face images containing static expressions are selected from an image database. Faces are detected with a face detector and aligned using eye locations, and the face images are then cropped to the input size of the convolutional neural network. In our classifier, the input layer accepts a single channel to process grayscale images, and the output layer produces two classes: happiness and non-happiness. Four-fold cross-validation is performed on the facial expression dataset, which is divided randomly into four subsets; in each round, one subset is used for testing and the other three are used for training. The experimental results show an average accuracy of 98.78 percent, which is sufficient for use in learning outcome evaluation.
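
The preprocessing pipeline described above (face detection, eye-based alignment, and cropping to the network's input size) can be sketched roughly as follows. This is a minimal illustration only: the paper does not name its face detector, so OpenCV Haar cascades stand in here, and the 48x48 crop size and rotation-based alignment are assumptions.

```python
# Hypothetical preprocessing sketch: detect a face, align it by eye locations,
# and crop/resize the grayscale face to the CNN input size.
# The Haar cascades and the 48x48 size are assumptions, not details from the paper.
import cv2
import numpy as np

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def preprocess(image_path, size=48):
    img = cv2.imread(image_path)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

    # Detect the face region in the grayscale image.
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = faces[0]
    face = gray[y:y + h, x:x + w]

    # Align by rotating so the line between the two detected eyes is horizontal.
    eyes = eye_cascade.detectMultiScale(face)
    if len(eyes) >= 2:
        eyes = sorted(eyes, key=lambda e: e[0])[:2]  # left-to-right
        (ex1, ey1, ew1, eh1), (ex2, ey2, ew2, eh2) = eyes
        left = (ex1 + ew1 / 2.0, ey1 + eh1 / 2.0)
        right = (ex2 + ew2 / 2.0, ey2 + eh2 / 2.0)
        angle = np.degrees(np.arctan2(right[1] - left[1], right[0] - left[0]))
        center = (face.shape[1] / 2.0, face.shape[0] / 2.0)
        rot = cv2.getRotationMatrix2D(center, angle, 1.0)
        face = cv2.warpAffine(face, rot, (face.shape[1], face.shape[0]))

    # Crop/resize the aligned grayscale face to the network's input size.
    return cv2.resize(face, (size, size))
```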
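
The binary classifier and the four-fold cross-validation protocol can likewise be sketched. The abstract specifies only a single-channel grayscale input and a two-class (happiness / non-happiness) output, so the layer sizes, optimizer, and epoch count below are illustrative assumptions rather than the paper's actual architecture.

```python
# Minimal sketch of a single-channel, two-class CNN with four-fold cross-validation.
# Layer sizes, optimizer, and epochs are assumptions; only the grayscale input,
# the two output classes, and the 4-fold protocol come from the abstract.
import numpy as np
from sklearn.model_selection import KFold
from tensorflow import keras
from tensorflow.keras import layers

def build_model(size=48):
    return keras.Sequential([
        layers.Input(shape=(size, size, 1)),      # single grayscale channel
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dense(2, activation="softmax"),    # happiness vs. non-happiness
    ])

def cross_validate(images, labels, folds=4, epochs=10):
    """images: (N, size, size, 1) floats in [0, 1]; labels: (N,) ints in {0, 1}."""
    accuracies = []
    for train_idx, test_idx in KFold(n_splits=folds, shuffle=True).split(images):
        model = build_model(images.shape[1])
        model.compile(optimizer="adam",
                      loss="sparse_categorical_crossentropy",
                      metrics=["accuracy"])
        # Train on three subsets, test on the held-out subset.
        model.fit(images[train_idx], labels[train_idx], epochs=epochs, verbose=0)
        _, acc = model.evaluate(images[test_idx], labels[test_idx], verbose=0)
        accuracies.append(acc)
    return float(np.mean(accuracies))   # average accuracy over the four folds
```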
