DynEmo: A corpus of dynamic and spontaneous emotional facial expressions

DynEmo is a publicly available database of significant size containing dynamic and authentic emotional facial expressions (EFEs) of annoyance, astonishment, boredom, cheerfulness, disgust, fright, curiosity, being moved, pride, and shame. The affective state behind each EFE is identified both by the expresser and by observers, with all methodological and contextual elements at the disposal of the scientific community. The database was elaborated by a multidisciplinary team, and the resulting multimodal corpus meets psychological, technical, and ethical criteria. 358 EFE videos (1 to 15 min long) of ordinary people (aged 25 to 65, half women and half men), recorded in natural (but experimental) conditions, are associated with two types of data: first, the affective state of the expresser (self-reported once the emotion-inducing task was completed), and second, a timeline of observers' assessments of the emotions displayed throughout the recording. This timeline allows easy emotion segmentation for any researcher interested in human nonverbal behavior analysis.
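To illustrate how such a timeline supports segmentation, the sketch below thresholds the proportion of observers reporting a target emotion at each time step and extracts the contiguous intervals where agreement is high. This is a minimal, hypothetical example: the function name, the `(time, proportion)` pair format, and the 0.5 threshold are assumptions for illustration, not DynEmo's actual data format or annotation protocol.

```python
def segment_timeline(timeline, threshold=0.5):
    """Return (start, end) intervals where the proportion of observers
    reporting the target emotion stays at or above `threshold`.

    `timeline` is a list of (time_in_seconds, proportion) pairs,
    assumed sampled at regular intervals and sorted by time.
    (Hypothetical format, for illustration only.)
    """
    segments = []
    start = None
    for t, p in timeline:
        if p >= threshold and start is None:
            start = t                      # a high-agreement segment opens
        elif p < threshold and start is not None:
            segments.append((start, t))    # segment closes at this time step
            start = None
    if start is not None:                  # segment runs to end of recording
        segments.append((start, timeline[-1][0]))
    return segments

# Example: proportion of observers judging "cheerfulness" each second
timeline = [(0, 0.1), (1, 0.6), (2, 0.8), (3, 0.4), (4, 0.7), (5, 0.2)]
print(segment_timeline(timeline))  # [(1, 3), (4, 5)]
```

A researcher could then cut the corresponding video intervals out of each recording to obtain emotion-labeled clips.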
