The NOMCO Multimodal Nordic Resource - Goals and Characteristics

This paper presents the multimodal corpora that are being collected and annotated in the Nordic NOMCO project. The corpora will be used to study communicative phenomena such as feedback, turn management and sequencing. They already include video material for Swedish, Danish, Finnish and Estonian, and several social activities are represented. The data will make it possible to verify empirically how gestures (head movements, facial displays, hand gestures and body postures) and speech interact in all three of these aspects of communication. The data are being annotated following the MUMIN annotation scheme, which provides attributes describing both the shape and the communicative functions of head movements, facial expressions, body postures and hand gestures. After describing the corpora, the paper discusses how they will be used to study the way feedback is expressed in speech and gestures, and reports results from two pilot studies investigating the function of head gestures, both single and repeated, in combination with feedback expressions. The annotated corpora will be valuable sources for research on intercultural communication as well as on interaction in the individual languages.
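
To make the annotation layers more concrete, the following is a minimal sketch of how one MUMIN-style gesture annotation might be represented in memory. It is an illustration only: the field names and example values (such as "Nod" or "FeedbackGive") are assumptions chosen to mirror the attribute categories mentioned above (shape plus feedback, turn-management and sequencing functions), not the official MUMIN tag set or the project's actual annotation format.

from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class GestureAnnotation:
    # Hypothetical, simplified record for one annotated gesture.
    start: float                      # start time in seconds
    end: float                        # end time in seconds
    articulator: str                  # e.g. "head", "face", "hand", "body"
    shape: str                        # shape label, e.g. "Nod", "Shake" (illustrative)
    feedback: Optional[str] = None    # e.g. "FeedbackGive", "FeedbackElicit" (illustrative)
    turn: Optional[str] = None        # e.g. "TurnTake", "TurnHold" (illustrative)
    sequencing: Optional[str] = None  # e.g. "OpenSequence" (illustrative)
    speech_tokens: List[str] = field(default_factory=list)  # co-occurring speech, if any

# Example: a head nod overlapping a Danish feedback word.
nod = GestureAnnotation(
    start=12.34, end=12.98,
    articulator="head", shape="Nod",
    feedback="FeedbackGive",
    speech_tokens=["ja"],
)

A record like this pairs each gesture's form with the communicative function it serves, which is the kind of alignment the pilot studies on single and repeated head gestures rely on.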
