Construction and analysis of social-affective interaction corpus in English and Indonesian

Social-affective aspects of interaction play a vital role in making human communication a rich and dynamic experience. Observing such complex emotional phenomena requires richly labeled data of natural interaction. Although there has been increasing interest in constructing corpora of social interactions, spontaneous and emotionally rich corpora remain scarce. This paper presents a corpus of social-affective interactions in English and Indonesian, built from various television talk shows and containing natural conversations with occurrences of real emotion. We carefully annotate the corpus in terms of emotion and discourse structure to support such observation. The corpus is still at an early stage of development, opening wide-ranging possibilities for future work.
