EmoGame: Towards a Self-Rewarding Methodology for Capturing Children Faces in an Engaging Context

Facial expression datasets are currently limited: most of them capture only the emotional expressions of adults. Researchers have begun to assert the importance of having child exemplars of the various emotional expressions in order to study the interpretation of these expressions developmentally. Capturing children's expressions is more complicated, as the protocols used for eliciting and recording expressions in adults are not necessarily adequate for children. This paper describes the creation of a flexible emotional game for capturing children's faces in an engaging context. The game is inspired by the well-known Guitar Hero™ gameplay, but instead of playing notes, the player must produce a series of facial expressions. In the current work, we measure the game's capacity to engage children, and we discuss the expression-recognition requirements needed to ensure a viable gameplay. Preliminary experiments conducted with a group of 12 children aged 7 to 11, in various settings and social contexts, show high levels of engagement and positive feedback.
