Predicting Behavior in Cancer-Afflicted Patient and Spouse Interactions using Speech and Language

Cancer impacts the quality of life of those diagnosed as well as that of their spouse caregivers, and can also influence their day-to-day behaviors. There is evidence that effective communication between spouses can improve cancer-related well-being, but manual annotation frameworks make it difficult to efficiently evaluate the quality of daily-life interactions. Automated recognition of behaviors based on speakers' interaction cues can help analyze interactions in such couples and identify behaviors that are beneficial for effective communication. In this paper, we present and detail a dataset of dyadic interactions from 85 real-life cancer-afflicted couples, together with a set of observational behavior codes pertaining to interpersonal communication attributes. We describe and employ neural network-based systems for classifying these behaviors from turn-level acoustic and lexical speech patterns. Furthermore, we investigate the effect of controlling for factors such as gender, patient/caregiver role, and conversation content on behavior classification. Analysis of our preliminary results highlights the challenges of this task, which stem from the nature of the targeted behaviors, and suggests that techniques incorporating contextual processing may be better suited to this problem.
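
The abstract does not specify the model architecture. As a minimal sketch, assuming turn-level openSMILE-style acoustic functionals and an averaged word-embedding vector as lexical features, a PyTorch classifier for a single behavior code might look like the following; all dimensions, names, and the feed-forward design are illustrative assumptions, not the paper's actual setup:

```python
import torch
import torch.nn as nn

class TurnBehaviorClassifier(nn.Module):
    """Hypothetical feed-forward classifier over concatenated turn-level features.

    `acoustic_dim` is assumed to hold openSMILE-style functionals for a turn
    (88 matches the eGeMAPS functional set), and `lexical_dim` an averaged
    word embedding of the turn's transcript; neither is the paper's
    documented configuration.
    """
    def __init__(self, acoustic_dim=88, lexical_dim=300, hidden_dim=64, num_classes=2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(acoustic_dim + lexical_dim, hidden_dim),
            nn.ReLU(),
            nn.Dropout(0.3),
            # Two classes: presence vs. absence of the targeted behavior code.
            nn.Linear(hidden_dim, num_classes),
        )

    def forward(self, acoustic, lexical):
        # Fuse the two modalities per turn by concatenation, then score classes.
        return self.net(torch.cat([acoustic, lexical], dim=-1))

# Example usage on a batch of 4 synthetic turns (random features for illustration).
model = TurnBehaviorClassifier()
acoustic = torch.randn(4, 88)    # assumed acoustic functionals per turn
lexical = torch.randn(4, 300)    # assumed lexical embedding per turn
logits = model(acoustic, lexical)
print(logits.shape)  # torch.Size([4, 2])
```

Note that the abstract's closing remark about contextual processing would point toward a sequence model over successive turns (e.g., an LSTM) rather than the independent per-turn classifier sketched here.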
