Framework for an embedded emotion assessment system for space science applications

In the context of a growing need for dedicated solutions supporting human space missions, this paper proposes an emotion assessment system together with a dedicated implementation architecture. The proposed system is described and a pilot implementation is developed using the open-source OpenFace toolkit to detect key facial features, called Action Units (AUs), in image sequences. The extended Cohn-Kanade (CK+) database is used to train an artificial neural network (ANN) that infers emotions from AU values. The system is evaluated in a preliminary study with respect to AU detection accuracy. The correlation between the area under the ROC curve (AUC) scores obtained by applying OpenFace to the CK+ database and the corresponding benchmark scores is 0.61. On the AU presence detection task, the F1 score of OpenFace on the CK+ database lies well within the standard deviation of the scores reported on four other databases. The execution performance of the AU detection and ANN stages is evaluated on the Jetson TK1, a low-power embedded platform, and performance bottlenecks are identified.
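The paper does not specify the ANN architecture or the feature pipeline in detail; the following is a minimal sketch of the AU-to-emotion classification stage under stated assumptions. It assumes OpenFace's FeatureExtraction tool has already written one CSV per CK+ sequence containing AU intensity columns, and it substitutes a scikit-learn MLPClassifier for the paper's ANN. The column names, file layout, and helper functions are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of the AU -> emotion classification stage.
# Assumption: OpenFace has produced per-sequence CSVs with AU intensity
# columns (AU01_r, AU02_r, ...); the MLPClassifier stands in for the
# paper's (unspecified) ANN.
import numpy as np
import pandas as pd
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

# Assumed set of AU intensity columns exported by OpenFace.
AU_COLUMNS = ["AU01_r", "AU02_r", "AU04_r", "AU05_r", "AU06_r", "AU07_r",
              "AU09_r", "AU10_r", "AU12_r", "AU14_r", "AU15_r", "AU17_r",
              "AU20_r", "AU23_r", "AU25_r", "AU26_r"]

def load_au_features(csv_path: str) -> np.ndarray:
    """Return the AU intensity vector of the last (peak-expression) frame."""
    df = pd.read_csv(csv_path)
    df.columns = [c.strip() for c in df.columns]  # tolerate padded header names
    return df[AU_COLUMNS].iloc[-1].to_numpy()

def train_emotion_ann(X: np.ndarray, y: np.ndarray) -> MLPClassifier:
    """Train a small feed-forward network on AU vectors X with emotion labels y.

    Building X and y from the CK+ sequences and their emotion annotations
    is omitted here; one row of X per sequence is assumed.
    """
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
    ann = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
    ann.fit(X_tr, y_tr)
    print(f"held-out accuracy: {ann.score(X_te, y_te):.2f}")
    return ann
```

Using the peak (last) frame of each sequence mirrors the common practice of labelling CK+ sequences at their expression apex; a real embedded deployment would instead classify AU vectors frame by frame as OpenFace emits them.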
