Exploring annotation of head gesture forms in spontaneous human interaction.

Face-to-face interaction is characterised by head gestures that vary greatly in form and function. We present ongoing exploratory work in characterising the form of these gestures. In particular, we define a kinematic annotation scheme and compute various agreement measures between two trained annotators. Gesture type mismatches between annotators are compared against kinematic characteristics of head gesture classes derived from motion capture data.
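The abstract does not specify which agreement measures are computed; as a minimal sketch, Cohen's kappa for two annotators assigning categorical gesture labels to the same items could look like this (the gesture labels and annotator data below are hypothetical):

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa for two annotators labelling the same items.

    kappa = (p_observed - p_expected) / (1 - p_expected), where
    p_expected is chance agreement from each annotator's label frequencies.
    """
    assert len(labels_a) == len(labels_b) and labels_a
    n = len(labels_a)
    p_observed = sum(x == y for x, y in zip(labels_a, labels_b)) / n
    freq_a, freq_b = Counter(labels_a), Counter(labels_b)
    p_expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / (n * n)
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical head-gesture annotations from two annotators
annotator_1 = ["nod", "nod", "shake", "tilt", "nod", "shake"]
annotator_2 = ["nod", "shake", "shake", "tilt", "nod", "nod"]
print(round(cohens_kappa(annotator_1, annotator_2), 3))  # → 0.455
```

Chance-corrected measures such as kappa are standard for two-annotator categorical schemes, though studies of this kind often also report per-category confusion to locate systematic label mismatches.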