Quantification of prosodic entrainment in affective spontaneous spoken interactions of married couples

Interaction synchrony among interlocutors arises naturally as people gradually adapt their speaking styles to promote efficient communication. In this work, we quantify one aspect of interaction synchrony, prosodic entrainment in pitch and energy, in married couples' problem-solving interactions using measures derived directly from the speech signal. Statistical tests show that several of these measures capture useful information: they take higher values in interactions of couples rated as highly positive in attitude than in those rated as highly negative. Further, by quantizing the entrainment measures and applying statistical symbol sequence matching in a maximum likelihood framework, we obtain 76% accuracy in discriminating positive from negative affect.
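
The abstract does not spell out the specific entrainment measures or sequence models, so the sketch below is only a minimal, hypothetical illustration of the described pipeline, assuming turn-level pitch and energy statistics have already been extracted (for example with Praat). It computes a simple adjacent-turn similarity value, quantizes it into discrete symbols, and uses add-alpha smoothed bigram models per affect class with a maximum likelihood decision as a stand-in for the statistical symbol sequence matching mentioned above. All function and variable names are illustrative, not the authors' code.

    # Hypothetical sketch: prosodic entrainment -> quantized symbols -> ML classification.
    from collections import defaultdict
    import math

    def entrainment_measures(turns):
        """turns: list of (mean_pitch_hz, mean_energy_db), one entry per speaker turn,
        alternating between the two spouses. Returns one similarity value per
        adjacent-turn pair (smaller pitch/energy gap -> higher value, in (0, 1])."""
        values = []
        for (p1, e1), (p2, e2) in zip(turns, turns[1:]):
            pitch_sim = 1.0 / (1.0 + abs(p1 - p2) / max(p1, p2))
            energy_sim = 1.0 / (1.0 + abs(e1 - e2) / max(abs(e1), abs(e2), 1e-6))
            values.append(0.5 * (pitch_sim + energy_sim))
        return values

    def quantize(values, n_bins=4, lo=0.0, hi=1.0):
        """Map each continuous entrainment value to a discrete symbol 0..n_bins-1."""
        width = (hi - lo) / n_bins
        return [min(int((v - lo) / width), n_bins - 1) for v in values]

    def train_bigram(sequences, n_symbols, alpha=1.0):
        """Add-alpha smoothed bigram model over symbol sequences for one affect class."""
        counts = defaultdict(lambda: [alpha] * n_symbols)
        for seq in sequences:
            for a, b in zip(seq, seq[1:]):
                counts[a][b] += 1
        model = {a: [c / sum(row) for c in row] for a, row in counts.items()}
        model[None] = [1.0 / n_symbols] * n_symbols  # uniform back-off for unseen contexts
        return model

    def log_likelihood(seq, model):
        """Log-likelihood of a symbol sequence under a bigram model."""
        ll = 0.0
        for a, b in zip(seq, seq[1:]):
            probs = model.get(a, model[None])
            ll += math.log(probs[b])
        return ll

    def classify(seq, pos_model, neg_model):
        """Maximum likelihood decision between positive- and negative-affect models."""
        return ("positive" if log_likelihood(seq, pos_model)
                >= log_likelihood(seq, neg_model) else "negative")

    # Toy usage with made-up turn statistics (Hz, dB) and made-up training sequences:
    # pos_model = train_bigram(pos_training_sequences, n_symbols=4)
    # neg_model = train_bigram(neg_training_sequences, n_symbols=4)
    # session = [(210.0, 62.0), (190.0, 60.5), (205.0, 61.0), (180.0, 58.0)]
    # label = classify(quantize(entrainment_measures(session)), pos_model, neg_model)

In practice the number of quantization bins and the n-gram order would be tuned on held-out sessions; the bigram model here is chosen only for concreteness.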
